C++ Remote - Having files local and on the host

I have the following problem. I am using NetBeans 8.2 on Windows 7. I have configured a C/C++ build host, which is a Linux system, and on this Linux system I have a C/C++ project. Then I create a new project by right-clicking on this host -> New C/C++ Project and then --> C/C++ Project with Existing Source.
This works fine. From my Windows machine I can change the code, compile, and run the project; all of this happens on the Linux machine.
But the project and the source files exist only on the host machine. Is there a way to copy the data (source files and the project) to the local machine on each build, so that the files exist both locally and on the host?
Greetings

I'd discourage that way of keeping code in sync. What you actually want is to have the code (and I count the project files as code here) in sync between two machines; that you work on one and compile on the other is not that relevant here.
The - in my opinion by far - best way to keep things in sync is to use a version control system with a shared repository. I prefer Git in almost every case. You can - and should - commit your changes regularly, and then you can simply push/pull the changes to keep all your machines in sync.
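A minimal sketch of that workflow, assuming the Linux build host is reachable over SSH; the host name (linux-host), user name, and paths are placeholders:

    # On the Linux host (e.g. in an SSH session): create a bare "central" repository
    # next to the existing project and publish the current sources into it.
    git init --bare ~/myproject.git
    cd ~/myproject
    git init
    git add -A
    git commit -m "Initial import"
    git remote add origin ~/myproject.git
    git push -u origin master          # the branch may be called "main" on newer Git

    # On the Windows machine: clone the same repository over SSH and work as usual.
    git clone ssh://user@linux-host/home/user/myproject.git
    cd myproject
    # ...edit locally, then publish and fetch changes:
    git add -A
    git commit -m "Describe the change"
    git push
    git pull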

Related

DLL locked - Visual Studio 2010 C++

I’m logged onto a machine, and my current problem involves a custom build step that has trouble copying a .dll to the Bin directory: Windows says it cannot access the file because it’s currently being used by another process.
I’m able to reproduce this on several other projects. The sequence of events is that I build a release successfully, run some tests, check out another SHA while doing a git bisect, and attempt to build a release from that SHA without doing a git clean -xfd (intentionally, because I’m trying to cache as much reusable data as possible).
The weird thing is that I tried using Process Explorer (procexp) and tasklist /m <locked_dll.dll> to search for whatever is holding onto this dll, and I’m unable to find anything holding onto it. I’m on a non-admin account, and I’m not sure if that is causing Windows to hide certain processes from me. Rebooting the machine helps, but that’s not an acceptable solution since I’m trying to automate things. I’m able to delete the .dll, but when I try to build the project in VS, it still complains that it can’t access the dll when trying to copy it over to the Bin folder. Any ideas? I’m going to keep researching the issue, but as of right now, I’m sort of stumped.
EDIT:
This seems to be a duplicate question (Error: Cannot access file bin/Debug/... because it is being used by another process), but I'll leave this open to see if anyone has found anything new related to the topic.
I've seen this problem in VS 2010 with a large .NET solution containing multiple projects. Every case I've seen so far involves one project with dependency DLLs that another project also uses; that other project references the first project and uses the same dependency DLLs, but at a different version than the first project.
To describe it a different way:
Project A depends on v1 of DLL A
Project B depends on project A and v2 of DLL A
Both project A and B are in the same solution
The solution is to use the same version of DLL A. I usually run into this when upgrading to a new version of SQLite, and I forget to update the dependency in all of my projects.
After talking with a few coworkers, I found the solution to my problem. procexp and tasklist did not see which process was locking the dll because there was no process locking the dll on that particular machine.
I have a hardware configuration where machine A (a host PC) is connected to machine B (which acts as a client retrieving instructions from machine A) through a network switch. Machine B runs the same binaries, which link to the same DLLs. So, obviously, running procexp or tasklist on machine A will not show anything locking the DLLs, because machine B is the culprit.

RTC with Eclipse: is it desirable for code to be stored in a fully configured Eclipse project?

Recently my project group bought a C/C++ codebase from a contractor which does not use Eclipse. Basically a big /src tree organized for building with Autotools, with a few top-level build scripts masking some of the Autotools complexity.
Developers on our project team have managed to set up the code in Eclipse (Luna) as an Autotools project... but what is currently causing grief is that, as we begin to work with this code, project CM is also moving from ClearCase/ClearQuest to Jazz / RTC 5 (Formal process, not Agile).
None of us are clear about whether the code should go into the RTC repository in the form of a fully configured Eclipse project ready for developers to use.
My reading as a developer is that it must: if it doesn't, when I download the code to my repository workspace, I have to begin by bringing in new .project, .cproject, and .autotools files "behind the scenes" to get to a project that specifies the include paths I need, allows for C/C++ code analysis, and (hopefully) can be re-tweaked for Autotools building from within Eclipse. It also means when I deliver change sets back, it is likely to take a variety of error-prone workarounds to avoid delivering project-specific settings that aren't part of the codebase as conceived by CM. Right now, that's being held as close as possible to the contractor's delivered (non-Eclipse) package.
What I'm hoping is that someone can tell me whether it is standard practice, when using RTC with Eclipse, to set up one's code in RTC in the form of fully configured, ready-to-use Eclipse projects. The language used in the articles I'm finding suggests it, e.g., talking about "Find and load Eclipse projects", but nothing I'm seeing makes this explicit.
"... whether it is standard practice, when using RTC with Eclipse, to set up one's code in RTC in the form of fully configured, ready-to-use Eclipse projects."
That is standard practice with any source control tool.
See "Shoul I keep my project files under version control?" or ".classpath and .project - check into version control or not?".
RTC simply suggests creating a .project just to reference the files of the component in the Eclipse workspace (as a convenience, to facilitate exploring the files of a given RTC component).
But that is separate from having a full-fledged .project, with many additional settings configured there.
I do not keep IDE specific files under version control.
You basically have an Autotools project, so what I do in that case is put all the Autotools source files (autogen.sh, configure.ac, Makefile.am) under version control.
I also have a couple of scripts to set up Autotools under different basic configurations (configure-debug.sh, configure-release.sh).
Then each developer simply runs the scripts, which produce Makefiles.
Now they can use any IDE they wish, based on the Makefiles. Each developer should be capable of working from a Makefile at least.
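A minimal sketch of what such a wrapper script could look like; the exact ./configure flags are placeholders and would depend on the project:

    #!/bin/sh
    # configure-debug.sh - regenerate the build system and configure a debug build.
    # Assumes the usual Autotools inputs (autogen.sh, configure.ac, Makefile.am).
    set -e
    ./autogen.sh                                    # produces ./configure
    mkdir -p build-debug
    cd build-debug
    ../configure CFLAGS="-g -O0" CXXFLAGS="-g -O0"  # debug flags; adjust per project
    # A plain "make" inside build-debug now builds the debug configuration.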
In Eclipse, I create an unmanaged "Makefile"-style project and plug in the Makefiles that Autotools produces.
But the project is not bound to Eclipse; it works in any environment that runs Autotools. Developers can use whatever IDE they prefer.
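To keep the IDE-specific files out of version control, an ignore list does the job. A sketch assuming a Git repository (RTC has its own equivalent mechanism, .jazzignore files):

    # Ignore the Eclipse metadata so only the Autotools sources stay tracked.
    cat >> .gitignore <<'EOF'
    .project
    .cproject
    .autotools
    .settings/
    EOF
    git add .gitignore
    git commit -m "Ignore IDE-specific Eclipse files"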

Seamless (RSE) Remote Projects in Eclipse

I've been trying - without much success - to make Eclipse (for C/C++, but that should be irrelevant) play nicely with remote projects. It would make my life at work much easier if I could set things up in the following way:
Run Eclipse from my local Windows machine
Connect (through Eclipse) to the remote Linux development box
Create an Eclipse project from files and directories already created on the remote box
Configure project dependencies and symbols using files and directories from the remote box
Building and running the project in Eclipse is not needed - since this is done with a million makefiles, it's easier for me to just SSH into the box and build from command line. I just need Eclipse to recognize included resources
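For reference, a sketch of that command-line build over SSH; the host name and project path are placeholders:

    # The sources already live on the Linux box, so just build remotely.
    ssh user@devbox 'cd /home/user/project && make -j"$(nproc)"'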
I tried setting this up with Remote System Explorer (RSE), but couldn't quite get it to work. I can create a connection to the remote box, browse its files, and even convert certain directories to Remote Projects. Once the remote project is created, however, it's useless to me - Eclipse underlines everything that's not a C/C++ keyword, saying it doesn't recognize it (even #include statements of system libraries); equally important, it doesn't allow me to add remote resources to the Paths & Symbols of the project.
Am I missing something here, or is RSE just not capable of doing what I need it to do?
No, I don't think you're missing something here. I have faced a similar problem when creating/editing remote projects. As far as I can judge, this must be due to the C/C++ indexer not working correctly for remote projects. One action that makes Eclipse recognize the #includes is to close and reopen the project through the Project Explorer view. If this doesn't help, try highlighting the #include statement and pressing F3. Opening the included file seems to trigger the indexer to update the index (although this should also be possible by right-clicking the project and selecting the Rebuild Index function; that didn't work for me). But even after performing these steps, indexing isn't fully functional; e.g., the Call Hierarchy is not working at all (it tells me "File XY is currently not part of the index").
Btw, which protocol do you use for your connection (ssh, ftp, or dstore)? I read some posts that RSE only works seamlessly if the dstore protocol is used. Unfortunately, this wasn't the case for me...

How to keep a cross-platform library in sync across XCode/Visual Studio

I'm developing a system which will have a PC (Windows) component and an iPad component. I'd like to share some C++ code between the iPad and the PC. Is there a way to automatically sync the source files between the projects? In other words, if I'm working on the PC and add a new .h/.cpp pair, can I somehow get the Xcode project to recognize the new files and add them to the Xcode project? The same goes for getting Visual Studio to recognize new files on the PC end.
If this isn't possible, would it make sense to use Eclipse on both the Mac and the PC for this shared library? Is there any other option I should look into for maintaining a project on both Apple and Windows development environments?
First, you need one common build configuration for all your target platforms. Of course, this means that you can't use the build configurations tied to your IDEs (Visual Studio, XCode, etc.). You need a cross-platform build-system. The best candidate for that, IMO, is CMake. With that system, the CMakeLists.txt files are the primary configuration files for your project. Any new source files / headers will have to be added to that configuration file (or one of them). It might be a little bit less convenient than using the in-IDE facilities to add a header/source pair, but the advantage is that you only have to add the source file once to the build configuration (CMakeLists.txt) and it will apply to all operating systems and IDEs that you are using. CMake can be used to generate project files for most IDEs so that they can be used easily, and some of the better IDEs also support CMake build-configurations directly (which makes it even more convenient). Personally, I don't know of any serious cross-platform project that does not employ an independent cross-platform build-system (like CMake or others with similar capabilities), so this is not really much of a debate anymore.
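A minimal sketch of that setup, with made-up target and file names; the CMakeLists.txt content is written out via a heredoc just to keep the example self-contained:

    # Checked-in, IDE-independent build configuration (names are placeholders).
    cat > CMakeLists.txt <<'EOF'
    cmake_minimum_required(VERSION 3.10)
    project(SharedLib CXX)

    # Every new .h/.cpp pair is added here once, for all platforms and IDEs.
    add_library(sharedlib
        src/foo.cpp
        src/bar.cpp
    )
    target_include_directories(sharedlib PUBLIC include)
    EOF

    # Generate native IDE projects from the same configuration
    # (generator names depend on the installed CMake/IDE versions):
    cmake -S . -B build-vs -G "Visual Studio 16 2019"   # on the Windows machine
    cmake -S . -B build-xcode -G Xcode                  # on the Mac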
Second, you need a means to synchronize your files between the two systems, which I presume are physically separated (i.e., not in a virtual box or whatever). There are simple programs like rsync, and other more GUI-ish programs, to synchronize folders and all their underlying files. However, for source code it is much more convenient to use a version-control system. Personally, I recommend Git, especially for personal projects. There are many features to a version control system, but the basic thing is that it gives you a simple way to keep source folders synchronized and to keep track of the changes that have been made to the code (e.g., allowing you to back-track if a bug suddenly appears out of the latest changes). Even if you are working alone, it is still totally worth it to use such a system (and even if you don't really need it, it gives you experience working with one).
Git is a decentralized system, meaning that you don't need a central server for the version control; it is all local to each copy of the repository. This allows you to have, as I do for some simple projects, a completely local set of repositories. For instance, I have two computers I work with, with a copy of the repository on each of them, plus a copy of the repository on an external hard drive, so all the synchronization is done locally between the computers and the external drive (with the added bonus of a constantly up-to-date triple backup of everything). You can also use a central server, such as GitHub, which is even more convenient.
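A sketch of that decentralized setup, with placeholder paths for the project and the external drive:

    # One-time: create a bare repository on the external drive.
    git init --bare /mnt/external/myproject.git    # e.g. E:/repos/myproject.git on Windows

    # On each computer: register the drive as a remote of the existing working copy.
    cd ~/myproject
    git remote add usbdrive /mnt/external/myproject.git

    # Synchronize whenever the drive is plugged in.
    git push usbdrive master    # or "main", depending on the default branch
    git pull usbdrive master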

Allowing developer-specific settings in VS2008 Native C++ projects

Is it possible to combine the following properties, and if so, how?
Store in our version control system some Visual Studio 2008 native C++ (VCPROJ) project files for the developers in our team that use this IDE.
Allow some of those developers to tweak their projects (e.g. using debug version of third-party libraries instead of the usual ones).
Make sure these modifications are done in files that are not versioned.
In other words, I would like to allow developers to tweak some settings in their projects without risking that these changes are committed.
An 'optional .vsprops file' approach seems doomed to fail, as VS2008 refuses to load projects that refer to non-existent .vsprops files...
Any other suggestion? Is this possible with VS2010?
You may not be able to do this directly, but a solution that generates the vcproj files, like CMake for example, would let you do it. Script your whole project with CMake and conditionally include a config file (if present, for example) that developers can change in their own setup.
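A sketch of that idea with placeholder project, source, and library names (shown with current CMake syntax); the key piece is CMake's OPTIONAL keyword on include(), which silently skips the file when it does not exist:

    # CMakeLists.txt (checked in).
    cat > CMakeLists.txt <<'EOF'
    cmake_minimum_required(VERSION 3.13)
    project(MyApp CXX)

    # Default: link against the release build of the third-party libraries.
    set(THIRDPARTY_LIB_DIR "${CMAKE_CURRENT_SOURCE_DIR}/thirdparty/release/lib")

    # Per-developer overrides, kept OUT of version control; OPTIONAL means CMake
    # simply skips the include when the file does not exist.
    include("${CMAKE_CURRENT_SOURCE_DIR}/local_settings.cmake" OPTIONAL)

    add_executable(myapp src/main.cpp)
    target_link_directories(myapp PRIVATE "${THIRDPARTY_LIB_DIR}")
    EOF

    # local_settings.cmake (NOT checked in) - a developer who wants the debug libs:
    cat > local_settings.cmake <<'EOF'
    set(THIRDPARTY_LIB_DIR "${CMAKE_CURRENT_SOURCE_DIR}/thirdparty/debug/lib")
    EOF

The developer-specific local_settings.cmake then just needs to be added to the version-control ignore list so it is never committed.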
Branches could solve this problem: you create a branch, play with different versions of the third-party libraries, and merge the changes to trunk if the results are good.
Well, as a preliminary solution you could put the project file into something like .hgignore or .gitignore after its initial commit.
This way, changes to it are less likely to be committed accidentally.
At least that's how I handle .hgignore itself.
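One caveat: ignore files only affect files that are not yet tracked, so for a project file that is already committed, Git additionally offers a per-working-copy switch that stops local modifications from being picked up. A sketch with a placeholder project file name:

    # Keep the committed baseline, but stop staging local tweaks to the project file.
    git update-index --skip-worktree myproject.vcproj

    # Undo later, when a deliberate change to the shared project file is needed.
    git update-index --no-skip-worktree myproject.vcproj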
We use a versioned "common_configuration" folder, and a script which copies project files from this "common_configuration" folder to the "project" folder.
We have another script to copy the configuration back the other way, so the developers need to take a conscious action to commit their local changes to the global version control system.
It partly answers your needs:
The upside: we have a way to keep a common configuration for everyone, and no accidental committing of local configuration.
The downside: blindly copying the files clobbers local changes. We live with it. We could write a more clever merge tool (using diff, or XML-specific manipulations), but we don't want to spend too much time supporting the deployment tools.
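A sketch of what such a pair of copy scripts could look like, with folder names taken from the description above (the .vcproj/.vsprops patterns are placeholders):

    #!/bin/sh
    # sync-config.sh - copy project files between the versioned "common_configuration"
    # folder and the working "project" folder.
    # Usage: ./sync-config.sh apply     common_configuration -> project (clobbers local edits)
    #        ./sync-config.sh promote   project -> common_configuration (ready to commit)
    set -e
    case "$1" in
        apply)   cp -v common_configuration/*.vcproj common_configuration/*.vsprops project/ ;;
        promote) cp -v project/*.vcproj project/*.vsprops common_configuration/ ;;
        *)       echo "usage: $0 apply|promote" >&2; exit 1 ;;
    esac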