Eclipse CDT: update/sync project list automatically (to easily "refresh" a related project set) - c++

Historical context:
We have a project consisting of the following parts:
Host application (C++)
Scripting Engine library (also written in C++)
A lot of C++ plugins (around 30+)
A lot of scripts that tie all the stuff together...
From version to version some plugins are added and some are removed.
Until now we used a Visual Studio solution (*.sln) to contain all the projects (*.vcxproj) for the Host application, the Scripting Engine library and the plugins (one *.vcxproj per plugin!).
To share sources/projects we use a proprietary source control system, and until now, once we merged updates from the server (some plugin projects added, some removed), the whole project tree in VS was refreshed thanks to the "reload" feature (no action was required from the developer to see and build the updated source tree).
The problem:
Now our senior management has decided to switch to the Eclipse CDT/MinGW pair, and we have hit the issue that an Eclipse Workspace is not the same thing as a Visual Studio *.sln ...
Now when a plugin project folder appears or disappears, the corresponding workspace items do not update accordingly.
Thus from now on every developer has to use File > Import... > General > "Existing Projects into Workspace" or File > "Open Projects from File System" to manually add new projects to their own Workspace once another developer has added them to source control.
One also has to manually remove from one's own Workspace those plugin projects that were deleted from source control...
This is in stark contrast to what we previously had with Visual Studio, where the "reload" feature automatically updated the project/source tree (simply because all the information arrived with the *.sln/*.vcxproj files from the server).
Our first option was to put the Workspace\.metadata etc. under source control (as we previously did for the *.sln files), but "that is not the way the Eclipse Workspace is designed to be used" (it is not even possible, simply because the paths in .metadata\* are absolute and tons of the Workspace\* content is not mergeable at all).
Question:
Is there some way to automatically synchronize the Eclipse CDT Workspace with the project set obtained from source control? Something like pressing a (hidden?) magic "refresh" button (in a special plugin to install, or something like that) so that all new projects are automatically added to the source tree in the Workspace and deleted projects disappear automatically, without the need to run all those "Import" wizards and without the need to remove deleted projects manually?
Maybe there is a special "Container" project type in Eclipse that plays the same role as *.sln did in Visual Studio, or something like that?
Maybe other options are available?... The overall intention is not to replace *.sln with some Eclipse equivalent, but to support a similar workflow where a bunch of plugin projects is managed as a whole and a project-set "refresh" is a simple operation that does not require each person on the team to manually track which projects appeared or disappeared in that set.

Have you looked at using CMake to generate the Eclipse project files? You can then import those into an Eclipse workspace.
It's not automatic, but if you create separate CMakeLists.txt files for each part, you can easily comment out the include of that part in the main CMakeLists.txt file and regenerate the project files when you only want to load a subset of the project.
https://cmake.org/Wiki/Eclipse_CDT4_Generator
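To make that concrete, here is a minimal sketch of what the top-level CMakeLists.txt might look like for a layout like the one you describe (the directory names host, engine and plugins/* are placeholders, not taken from your actual tree):

    # Top-level CMakeLists.txt -- sketch only; directory names are placeholders
    cmake_minimum_required(VERSION 3.0)
    project(WholeProduct CXX)
    add_subdirectory(host)     # Host application
    add_subdirectory(engine)   # Scripting Engine library
    # add_subdirectory(tools)  # comment a part out to drop it from the generated projects
    # Pick up every plugin folder present in the current checkout, so plugins
    # added or removed in source control only require re-running CMake.
    file(GLOB plugin_dirs RELATIVE "${CMAKE_CURRENT_SOURCE_DIR}/plugins"
         "${CMAKE_CURRENT_SOURCE_DIR}/plugins/*")
    foreach(plugin ${plugin_dirs})
      if(EXISTS "${CMAKE_CURRENT_SOURCE_DIR}/plugins/${plugin}/CMakeLists.txt")
        add_subdirectory("plugins/${plugin}")
      endif()
    endforeach()

Regenerating the Eclipse projects after a merge is then a single command from a build directory, e.g. cmake -G "Eclipse CDT4 - MinGW Makefiles" path/to/source (the generator described on the wiki page above).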
Should you ever want to switch back to VS or to another IDE, CMake can generate project files for that too.
I've personally only used CMake to generate VS solutions and Unix makefiles, so I can't vouch for how well this works.
HTH.
On a side note, why did management decide that Eclipse should be used instead of Visual Studio? It sounds like a poor decision, made without factual grounds or any impact research beforehand.
Was it because Eclipse is free? Did they consider what reduced developer productivity costs?

Related

RTC with Eclipse: is it desirable for code to be stored in a fully configured Eclipse project?

Recently my project group bought a C/C++ codebase from a contractor which does not use Eclipse. Basically a big /src tree organized for building with Autotools, with a few top-level build scripts masking some of the Autotools complexity.
Developers on our project team have managed to set up the code in Eclipse (Luna) as an Autotools project... but what is currently causing grief is that, as we begin to work with this code, project CM is also moving to Jazz/RTC 5 (Formal process, not Agile) from ClearCase/ClearQuest.
None of us are clear about whether the code should go into the RTC repository in the form of a fully configured Eclipse project ready for developers to use.
My reading as a developer is that it must: if it doesn't, when I download the code to my repository workspace, I have to begin by bringing in new .project, .cproject, and .autotools files "behind the scenes" to get to a project that specifies the include paths I need, allows for C/C++ code analysis, and (hopefully) can be re-tweaked for Autotools building from within Eclipse. It also means when I deliver change sets back, it is likely to take a variety of error-prone workarounds to avoid delivering project-specific settings that aren't part of the codebase as conceived by CM. Right now, that's being held as close as possible to the contractor's delivered (non-Eclipse) package.
What I'm hoping is that someone can tell me whether it is standard practice, when using RTC with Eclipse, to set up one's code in RTC in the form of fully configured, ready-to-use Eclipse projects. The language used in the articles I'm finding suggests it, e.g. talking about "Find and load Eclipse projects", but nothing I'm seeing makes this explicit.
"... is it standard practice when using RTC with Eclipse, to set up one's code in RTC in the form of fully configured, ready-to-use Eclipse projects?"
That is standard with any source control tool.
See "Should I keep my project files under version control?" or ".classpath and .project - check into version control or not?".
RTC simply suggests creating a .project just to reference the files of the component in the Eclipse workspace (as a convenience, to make exploring the files of a given RTC component easier).
But that is separate from having a full-fledged .project with many additional settings configured in it.
I do not keep IDE specific files under version control.
You basically have an Autotools project, so what I do in that case is put all the Autotools source files (autogen.sh, configure.ac, Makefile.am) under version control.
I also have a couple of scripts to set up Autotools under different basic configurations (configure-debug.sh, configure-release.sh).
Then each developer simply runs the scripts, which produce Makefiles.
Now they can use any IDE they wish, based on the Makefiles. Each developer should be able to work from a Makefile at least.
In Eclipse I create an unmanaged "Makefile"-style project and plug in the Makefiles that Autotools produces.
But the project is not bound to Eclipse; it is bound to any environment that runs Autotools. Developers can use whatever IDE they prefer.

What is the SVN best practice for storing source when developing and testing with IDEs?

I do a fair amount of personal development on my computer and have used TortoiseSVN (I'm on Windows) for web projects, but haven't used any version control for other languages. Anyway, soon I will be starting a decent-sized C++ project and was going to try using SVN for it.
For web development I normally just used Notepad++, and it was really easy to manage with SVN (just commit the whole source folder). However, for this project I will be using an IDE (most likely Eclipse CDT or Visual Studio) and was wondering what the best practice is for managing all of the IDE, project, and binary files. My guess was to create the IDE project outside of version control and just point all of the source files into SVN, so that the build and project files aren't committed. That way the only files in SVN would be the .cpp and .h files.
However, if I wanted to switch to a new branch, I would need to update the location of all of the sources and headers to the new folder, which seems like it would be a huge hassle.
What's the best way to handle this?
Thanks
OK, it seems I misread the aim of the question the first time around. I'm now assuming that what is really being asked is what to put under source control and what not.
Well, naturally everything but temporary/transient files.
If you install GitExtensions, it has a built-in feature to populate the .gitignore file. You will certainly adjust it depending on the language. Solution, project and make files belong under version control; .user files storing personal IDE preferences do not. As both IDEs and source control have been in ubiquitous use for many years, the content has been fairly well separated for a long time, and it should be pretty obvious as you go.
External dependencies should normally also live in a repository, though a choice has to be made as to which one. Some teams store everything together, others keep a single dependency repo, others use separate repos per component -- it all depends on the actual components and workflow. You can also replace physical storage of dependencies with an info file containing stable links to the versions used. This can also be dealt with later, at the first change in dependencies.
For Visual Studio, there is a plugin that manages your files for you. As long as the files are part of the project, they will be put into source control by the plugin. See AnkhSVN for plugin info. Note that the Express editions of Visual Studio are not supported.
I am sure Eclipse has a plugin for SVN as well.

How to organize dependencies in Visual Studio 2010 C++ best?

I am new to Visual Studio and I want to know how best to organize the dependencies of a project in Visual Studio 2010 C++ (Express Edition).
The main problem is the following:
A project P requires lib L, so I add a dependency on L to P. L is located somewhere on my system. I submit P to our version control. My colleague checks P out, but the configuration of P does not fit his system (L is located somewhere else on his system). So he adjusts the configuration and submits the changes to P. I check P out, and now it no longer fits my system.
I come from Java and Eclipse. In Eclipse you can set a variable globally for the whole IDE, e.g. PATH_TO_L. In the project configuration the dependency is then the variable PATH_TO_L and no longer a concrete path. So my colleague and I can use exactly the same sources, including the project configuration. All each of us has to do is set the variable PATH_TO_L. Nice.
Even nicer is Maven. There you do not have to care about copying the right dependencies to the right locations, because Maven does all the work for you.
I searched a little bit for a solution. One possible solution would be to use Property Sheets and to add a template Property Sheet to our version control. But templates in version control are not comfortable to use, and I would have to adjust the settings for every project. Not a global setting.
I tried to use system environment variables, but I don't think Visual Studio 2010 uses them.
So here is the question: how do you organize your projects in Visual Studio 2010? Is there an ideal way? Is there something like Maven, or is there a possibility to use a repository manager like Nexus with Visual Studio?
You are on the right track, with the property sheets.
You could use a property sheet to reference an environment variable. An example is here.
I would add the path of the library to the user-specific property file named Microsoft.Cpp.<platform>.user, as this is included by default. More information is here.
As soon as you get familiar with property sheets, it's not as bad as it seems. I've actually started to like the MSBuild system. But I am not aware of anything like Maven for MSBuild.
Quite a lot of people are using meta-build systems these days, such as CMake, SCons...
Amongst other useful features, you can set up variables that you can later reuse, for example for paths. This way, your colleague and you share the same CMake configuration, but with individualised paths.
And, as these scripts are simple text files, they play nicely with version control (much better than MSVC XML configuration).
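As a rough sketch of that idea, reusing the PATH_TO_L name from the question (the project name P, library L, and file names are just placeholders):

    # CMakeLists.txt -- sketch only; P, L, PATH_TO_L and main.cpp are placeholders
    cmake_minimum_required(VERSION 3.0)
    project(P CXX)
    # Each developer sets this once for their machine, e.g. cmake -DPATH_TO_L=C:/libs/L <source-dir>
    set(PATH_TO_L "" CACHE PATH "Root directory of library L on this machine")
    add_executable(P main.cpp)
    target_include_directories(P PRIVATE "${PATH_TO_L}/include")
    # Locate the actual library file under the per-developer root directory.
    find_library(L_LIBRARY NAMES L PATHS "${PATH_TO_L}/lib")
    target_link_libraries(P "${L_LIBRARY}")

The CMakeLists.txt goes into version control; the generated project files and each developer's cached value of PATH_TO_L do not.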

Import Existing C++ Source Code into Visual Studio

I am trying to import an existing C++ application's source into Visual Studio to take advantage of some specific MS tools. However, after searching online and playing with Visual Studio, I cannot seem to find an easy way to import existing C++ source code into Visual Studio and keep it structurally intact.
The import capability I did find flattens out the directories and puts them all into one project. Am I missing something?
(This is all unmanaged C++, and contains specific builds for win/unix)
With no project/solution loaded, in Visual Studio 2005 I see this menu item:
File > New Project From Existing Code...
After following the wizard, my problem is solved!
Switching the "Show All Files" button shows the complete hierarchy with all directories and files within.
If the New Project From Existing Code... option isn't available, you'll need to add it in Tools > Customize...
I am not aware of any general solution under the given constraints - specifically, having to create many projects from a source tree.
The best option I see is actually creating the project files by some script.
- Create a single project manually (create an empty project, then add the files).
- Configure it as closely as possible to what you want (i.e. with precompiled headers, build configurations, etc.).
- Use the resulting .vcproj as a skeleton for the project files to be generated.
A very simple method would be to replace the file list, project name etc. with "strange tokens" (placeholders) and have your generator fill them in. If you want to be the good guy, you can of course use a proper XML handling library.
Our experience: we actually don't store the .vcproj and .sln files in the repository (git) anymore, but rather a Python script that regenerates them from the source tree, together with VS 2008 "property sheet templates" (or whatever they are called). This helps a lot with making general adjustments.
The project generation script contains information about each project's specialties (e.g. whether it uses MFC/ATL, whether it produces a DLL or an EXE, which files to exclude).
In addition, this script also contains the dependencies, which feed the actual build script.
This works quite well; the problems are minor: Python is required on the build machines, you must not forget to regenerate the project files, and I had to learn some Python to make adjustments to certain projects.
@Michael Burr: "How complex are the Python scripts and whatever supporting 'templates' you might need?"
I honestly can't tell, since I gave the task to another dev (who picked Python). The original task was to provide a build script, as the VS2008 solution build was not good enough for our needs and the old batch file didn't support parallelization. The .vcproj generation was added later. As I understand it, his script generates the .vcproj and .sln files from scratch, but pulls in all the settings from separate property sheets.
Pros:
Adding new configurations on the fly. Some of the projects already had six configurations, and planning for Unicode support meant considering doubling them for a while. Some awkward tools still build as MBCS, so some libs now have 8 configurations. Configuring that by hand is a pain; now it just doesn't bother me anymore.
Global changes, e.g. moving around relative project paths, the folder for temp files and the one for final binaries, until we found a solution we were happy with.
Build stability. Merging VC6 project files was a notable source of errors for various reasons, and VC9 project files didn't look any better. Now things seem better isolated: compile/link settings in the property sheets, file handling in the script. Also, the script mostly lists deviations from our defaults, so it ends up easier to read than a project file.
Generally: I don't see a big benefit when your projects are already set up, they are rather stable, and you don't have real issues. However, when moving into the unknown (for us: mostly VC6 -> VC9 and Unicode builds), the flexibility greatly reduced the risk of the experiments.
Create a new empty solution and add your source code to it.
For example,
File>New>Project...
Visual C++>Win32>Win32 Console Application
Application Settings>
- Uncheck "Precompiled Header"
- Check "Empty Project"
Project is then created. To add existing code:
Project>Add Existing Item...>
- Select file(s) to add
Recompile, done!
In the "Solution Explorer" you can click on the "Show All Files" button to have Visual Studio display the files as they exist on the file system (directories and all).
In my opinion this is an imperfect workaround, but I believe it's the best available. I'm unaware of a plug-in, macro or other tool that'll import a directory into an actual project with folders that mirror the file system's.
I know this question is already marked correct, but I was able to import existing code into a project with Visual Studio 2008 by doing "File" -> "New Project from existing code". The directory structure of my code was retained.
You can always switch the view from the Project menu,
e.g. Project > Show All Files.
This will display the files in raw, unformatted file-system order.
Not sure about older versions, but it works in VS 2010.
I understand you; I have the same problem: many .cpp and .h files organized in many folders and subfolders, with include paths written for this folder structure. The only way to import this folder structure together with the source files is to use "Show All Files" and then right-click on the folders and select "Include In Project". This works for me with C# projects, but it does not work for my C++ projects. I am still searching for a solution...

Complex builds in Visual Studio

I have a few things that I cannot find a good way to perform in Visual Studio:
A pre-build step invokes a code generator that generates some source files which are later compiled. This can be solved to a limited extent by adding blank files to the project (which are later replaced with the real generated files), but it does not work if I don't know the names and/or the number of auto-generated source files. I can easily solve it in GNU make using $(wildcard generated/*.c). How can I do something similar with Visual Studio?
Can I prevent the pre-build/post-build event from running if the files do not need to be modified ("make" behaviour)? The current workaround is to write a wrapper script that checks timestamps for me, which works but is a bit clunky.
What is a good way to locate external libraries and headers installed outside of VS? On *nix they would normally be installed in the system paths, or located with autoconf. I suppose I can specify paths with user-defined macros in the project settings, but where is a good place to put these macros so they can be easily found and adjusted?
Just to be clear, I am aware that better Windows build systems exist (CMake, SCons), but they usually generate VS project files themselves, and I need to integrate this project into existing VS build system, so it is desirable that I have just plain VS project files, not generated ones.
If you need make behaviour and are used to it, you can create Visual Studio makefile projects and include them in your solution.
If you want something less clunky, you can write Visual Studio macros and custom build events and tie them to specific build callbacks/hooks.
You can try something like Workspace Whiz, which will let you set up environment variables for your project in a file format that can be checked in. Users can then alter them locally.
I've gone through this exact problem and I did get it working using Custom Build Rules.
But it was always a pain and worked poorly. I abandoned Visual Studio and went with a Makefile-based system using Cygwin. Much better now.
cl.exe is the name of the VS compiler.
Update: I recently switched to using CMake, which comes with its own problems, but CMake can generate a Visual Studio solution. This seems to work well.
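For what it's worth, here is a rough sketch of how the code-generation part (point 1 in the question) could be expressed if you did go the CMake route; gen_tool, the generated/ directory and main.c are made-up names for the example:

    # Sketch only -- gen_tool, generated/ and main.c are hypothetical names
    cmake_minimum_required(VERSION 3.0)
    project(ComplexBuild C)
    # Run the code generator at configure time so the generated .c files exist
    # before we glob for them (roughly the $(wildcard generated/*.c) idea).
    execute_process(
        COMMAND gen_tool --out "${CMAKE_CURRENT_BINARY_DIR}/generated"
        WORKING_DIRECTORY "${CMAKE_CURRENT_SOURCE_DIR}")
    file(GLOB GENERATED_SOURCES "${CMAKE_CURRENT_BINARY_DIR}/generated/*.c")
    add_executable(app main.c ${GENERATED_SOURCES})

You still have to re-run CMake when the generator's set of output files changes, so this does not fully solve point 2, but CMake will at least regenerate the VS solution for you.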
Specifically for #3, I use property pages to designate 3rd party library location settings (include paths, link paths, etc.). You can use User Macros from a parent or higher level property sheet to designate the starting point for the libraries themselves (if they are in a common root location), and then define individual sheets for each library using the base path macro. It's not automatic, but it is easy to maintain, and every developer can have a different root directory if necessary (it is in our environment).
One downside of this approach is that the include paths constructed this way are not included in the search paths for Visual Studio (unless you duplicate the definitions in the Projects and Directories settings for VS). I spoke to some MS people at PDC08 about getting this fixed for VS2010, and improving the interface in general, but no solid promises from them.
(1). I don't know a simple answer to this, but there are workarounds:
1a. If the content of the generated files does not clash (i.e. there are no common static identifiers, etc.), you can add a single file, such as AllGeneratedFiles.c, to the project and modify your generator to append a #include "generated/file.c" to it whenever it produces generated/file.c.
1b. Or you can create a separate makefile-based project for generated files and build them using nmake.
(2). Use a custom build rule instead of post-build event. You can add a custom build rule by right-clicking on the project name in the Solution Explorer and selecting Custom Build Rules.
(3). There is no standard way of doing this; it has to be defined on a per-project basis. One approach is to use environment variables to locate external dependencies. You can then use those environment variables in project properties. Add a readme.txt describing required tools and libraries and corresponding environment variables which the user has to set, and it should be easy enough for anyone to set up.
Depending on exactly what you are trying to do, you can sometimes have some luck with using a custom build step and setting your dependencies properly. It may be helpful to put all the generated code into its own project and then have your main project depend on it.