How can I build multiple products with different dependency versions in Visual Studio?

I'm building a Ruby C extension and I have to build it for both Ruby 1.8 and Ruby 2.0. The source code is the same, but I have to link against different Ruby libs and set different build paths.
In Xcode I was able to set up a project where the project contained the common configuration. Then I set up two targets that configured different include paths and libs as well as different output (Debug/Release) paths.
When I build I will get two products built, one for each targeted Ruby version. When I build Release I get:
build/Release/1.8/Example.bundle
build/Release/2.0/Example.bundle
Now I am trying to do the same thing in Visual Studio 2010. I've been looking at the Configuration Manager and property sheets, but I'm struggling to find out how to get the same kind of inheritance between targets that I have in Xcode.
Is it possible to have multiple targets in VS, as you can in Xcode, that produce two products from the same project when you build?
Do I have to create a Debug and Release configuration for each Ruby core I want to target?

I think I know what you are looking for (but I can't be sure, as I have no experience with Xcode.)
If you are using Visual Studio, one way you can do something like this is to set up your project properties for one build configuration (e.g. Release), then copy that configuration into a new one (e.g. call it "Release20") and go in and change the properties that you want for the new build. You can do this in the Configuration Manager. (When you add a new configuration, it lets you copy all properties from an existing one.)
However, it seems that you are already familiar with this, so I'm not sure what the problem is.
Another thing you should know is that the two build configurations don't stay linked together (this is not proper inheritance and overriding of properties, as you get with property sheets): if you change something in one, it doesn't affect the other.
Precisely for these reasons (among others) I have given up on trying to wrangle the VS build system myself. I now use Premake and CMake for all my projects. But of course it's quite a chore to set up a new build system for someone else's large code base.
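For what it's worth, a rough CMake sketch of the two-Ruby setup might look like the following. The variable names and paths are placeholders I made up, not something taken from your project:
cmake_minimum_required(VERSION 2.8.12)
project(Example C)
# One MODULE (plugin-style) library per Ruby version, built from the same source.
# RUBY18_INCLUDE_DIR, RUBY18_LIBRARY, etc. are placeholders you would point at
# your actual Ruby 1.8 / 2.0 installations.
add_library(example_18 MODULE example.c)
target_include_directories(example_18 PRIVATE ${RUBY18_INCLUDE_DIR})
target_link_libraries(example_18 ${RUBY18_LIBRARY})
set_target_properties(example_18 PROPERTIES
    OUTPUT_NAME Example
    LIBRARY_OUTPUT_DIRECTORY "${CMAKE_BINARY_DIR}/1.8")
add_library(example_20 MODULE example.c)
target_include_directories(example_20 PRIVATE ${RUBY20_INCLUDE_DIR})
target_link_libraries(example_20 ${RUBY20_LIBRARY})
set_target_properties(example_20 PROPERTIES
    OUTPUT_NAME Example
    LIBRARY_OUTPUT_DIRECTORY "${CMAKE_BINARY_DIR}/2.0")
With the Visual Studio generator, CMake appends the configuration name to those output directories, so you end up with something like build/1.8/Release/Example.dll and build/2.0/Release/Example.dll.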


Eclipse CDT: update/sync project list automatically (to easily "refresh" a related project set)

Historical context:
We have a project consisting of following parts:
Host application (C++)
Scripting Engine library (also written in C++)
A lot of C++ plugins (around 30+)
A lot of scripts that tie all the stuff together...
From version to version some plugins are added and some are removed.
Until now we used a Visual Studio solution (*.sln) to contain all the projects (*.vcxproj) for the host application, the scripting engine library and the plugins (one *.vcxproj per plugin!).
To share sources/projects we use a proprietary source control system, and until now, once we merged updates from the server (some plugin projects added, some removed), the whole project tree in VS was refreshed thanks to the "reload" feature (no action was required from the developer to see and build the updated source tree).
The problem:
Now our senior management has decided to switch to the Eclipse CDT/MinGW pair, and we have run into the issue that an Eclipse workspace is not the same thing as a Visual Studio *.sln ...
Now when a plugin project folder appears or disappears, the corresponding workspace items do not update accordingly.
Thus every developer now has to use File > Import... > General > "Existing Projects into Workspace" (or File > "Open Projects from File System") to manually add new projects to their own workspace once another developer has added them to source control.
They also have to manually remove from their own workspace those plugin projects that were deleted from source control...
This is a great contrast with what we previously had in Visual Studio, where the "reload" feature automatically updated the project/source tree (simply because all the information arrived with the *.sln/*.vcxproj files from the server).
Our first thought was to place the Workspace\.metadata etc. under source control (as we previously did for the *.sln files), but "that is not how an Eclipse workspace is designed to be used" (it is not even possible, because the paths in .metadata\* are absolute and tons of the Workspace\* content are not mergeable at all).
Question:
Is there some way to automatically synchronize an Eclipse CDT workspace with the project set obtained from source control? Something like pressing a (hidden?) magic "refresh" button (in a special plugin to install, or something like that) so that all new projects are automatically added to the source tree in the workspace and deleted projects disappear automatically, without having to go through all those "Import" wizards and without having to remove deleted projects manually.
Maybe there is a special "container" project type in Eclipse that plays the same role as the *.sln did in Visual Studio, or something like that?
Maybe other options are available?... The overall intention is not to replace the *.sln with some Eclipse equivalent, but to support a similar workflow in which a bunch of plugin projects is managed as a whole and refreshing the project set is a simple operation that does not require each person on the team to manually track which projects appeared or disappeared in that set.
Have you looked at using CMake to generate the Eclipse project files? You can then import those into an Eclipse workspace.
It's not automatic, but if you create separate CMakeLists.txt files for each part, you can simply comment out the include of that part in the main CMakeLists.txt file and regenerate the project files when you only want to load a subset of the project.
https://cmake.org/Wiki/Eclipse_CDT4_Generator
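A minimal top-level CMakeLists.txt along those lines could look like this (the project and directory names are just placeholders):
cmake_minimum_required(VERSION 2.8)
project(WholeProduct CXX)
add_subdirectory(host_app)
add_subdirectory(scripting_engine)
add_subdirectory(plugins/plugin_a)
# add_subdirectory(plugins/plugin_b)   # comment out the parts you don't want loaded
# Regenerate the Eclipse projects with something like:
#   cmake -G "Eclipse CDT4 - MinGW Makefiles" <path-to-source>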
Should you ever want to change back to VS or to another IDE, CMake can generate project files for that too.
I've personally only used CMake to generate VS-solutions and Unix make files so I can't vouch for how well this works.
HTH.
On a side note, why did management decide that Eclipse should be used instead of Visual Studio? It sounds like a poor decision made without factual grounds or impact research prior to the decision being made.
Was it because Eclipse is free? Did they consider what reduced developer productivity costs?

How to create two mains in an eclipse C++ project

We've got a program which runs separately, executed with an execvp call. So it needs its own main function, but I believe that poses a problem for Eclipse with a managed make. Do we have to keep this code segregated in a separate project, or is there a way to incorporate it into the same Eclipse project?
Create a project for each executable that has a main() function, and create an additional project to represent the software as a whole (a "container" project of sorts). Eclipse allows you to specify projects as dependencies of other projects, and in this case you will want to set up the container project to list the other projects as "Referenced Projects".
To do this, create the container project, then right-click on the project in the left-hand column (project explorer) and click "Properties". A dialog box will appear. Select the "Project References" item in the list on the left-hand side and you will see a list of all projects that Eclipse is currently working with. Check the boxes next to the projects for your individual executables, then click OK. Now, when you perform a build on the container project, Eclipse should automatically perform a build on these dependent projects as well.
When using sub-projects in this manner, I have (personally) found it useful to create a working set that includes the container project and all of the sub-projects (this can make searching the entire software project easier).
Keep it in the same project and use preprocessor defines which you define differently depending on what kind of main you want to include in the current project. Here the mains are in the same file, but they can of course reside in different files.
#if defined(MAIN_ONE)
int main()
{
// Do stuff
}
#elif defined(MAIN_TWO)
int main()
{
// Do some other stuff
}
#endif
If the makefile being invoked doesn't compile the two main() functions into the same executable, it won't cause a problem. I don't know how Eclipse projects are handled - if it's like VS, where "project" means a single executable or library, and "solution" is a group of "projects", then it would seem you'd need more than one project. If, OTOH, a "project" can contain different "subprojects" where a "subproject" is an executable or library, you should be able to handle that easily.
I am not aware of any easy way to build two mains using the Eclipse build system. The smallest change might be to move to makefiles and use makefile targets for the build.
Instead, I'd advise you to move to CMake. CMake can generate makefiles that work with Eclipse. The advantage of CMake is that you can easily state how to build the libraries and how to link them to form the executables. CMake can generate builds for Eclipse, Visual Studio, Code::Blocks, or plain makefiles (so you can use the command prompt).
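As a rough sketch (the file and target names are invented for illustration), the CMakeLists.txt for such a layout could be as simple as:
cmake_minimum_required(VERSION 2.8)
project(TwoMains CXX)
# Code shared by both programs goes into a library.
add_library(common common.cpp)
# The main application.
add_executable(app app_main.cpp)
target_link_libraries(app common)
# The helper program that is launched via execvp().
add_executable(helper helper_main.cpp)
target_link_libraries(helper common)
Each add_executable gets its own main(), so the two never end up in the same link.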
This is built in the C++ language. You would have to modify it to get your result. There is something to do 2 things at once if that is what you want.

Building both DLL and static libs from the same project

I have a number of native C++ libraries (Win32, without MFC) compiling under Visual Studio 2005, and used in a number of solutions.
I'd like to be able to choose to compile and link them as either static libraries or DLLs, depending on the needs of the particular solution in which I'm using them.
What's the best way to do this? I've considered these approaches:
1. Multiple project files
Example: "foo_static.vcproj" vs "foo_dll.vcproj"
Pro: easy to generate for new libraries, not too much manual vcproj munging.
Con: settings, file lists, etc. in two places get out of sync too easily.
2. Single project file, multiple configurations
Example: "Debug | Win32" vs "Debug DLL | Win32", etc.
Pro: file lists are easier to keep in sync; compilation options are somewhat easier to keep in sync
Con: I build for both Win32 and Smart Device targets, so I already have multiple configurations; I don't want to make my combinatorial explosion worse ("Static library for FooPhone | WinMobile 6", "Dynamic library for FooPhone | WinMobile 6", "Static library for BarPda | WinMobile 6", etc.).
Worse Con: VS 2005 has a bad habit of assuming that if you have a configuration defined for platform "Foo", then you really need it for all other platforms in your solution, and haphazardly inserts all permutations of configuration/platform configurations all over the affected vcproj files, whether valid or not. (Bug filed with MS; closed as WONTFIX.)
3. Single project file, selecting static or dynamic via vsprops files
Example: store the appropriate vcproj fragments in property sheet files, then apply the "FooApp Static Library" property sheet to config/platform combinations when you want static libs, and apply the "FooApp DLL" property sheet when you want DLLs.
Pros: This is what I really want to do!
Cons: It doesn't seem possible. It seems that the .vcproj attribute that switches between static and dynamic libraries (the ConfigurationType attribute of the Configuration element) isn't overrideable by the .vsprops file. Microsoft's published schema for these files lists only <Tool> and <UserMacro> elements.
EDIT: In case someone suggests it, I've also tried a more "clever" version of #3, in which I define a .vsprops containing a UserMacro called "ModuleConfigurationType" with a value of either "2" (DLL) or "4" (static library), and changed the configuration in the .vcproj to have ConfigurationType="$(ModuleConfigurationType)". Visual Studio silently and without warning removes the attribute and replaces it with ConfigurationType="1". So helpful!
Am I missing a better solution?
I may have missed something, but why can't you define the DLL project with no files, and just have it link the lib created by the other project?
And, with respect to settings, you can factor them out in vsprop files...
There is an easy way to create both static and dll lib versions in one project.
Create your dll project. Then do the following to it:
Simply create an nmake makefile or .bat file that runs the lib tool.
Basically, the nmake rule body is just this (the @<< ... << pair is nmake's inline response-file syntax; the placeholders are yours to fill in):
lib /NOLOGO /OUT:<your_lib_pathname> @<<
<list_all_of_your_obj_paths_here>
<<
Then, in your project, add a post-build event whose command just runs the .bat file (or nmake or perl). That way you will always get both a DLL and a static lib.
I'll refrain from denigrating Visual Studio for not allowing this tool to exist in a project just before the linker (in the tool flow).
I think the typical way this is done is choice 2 above. It is what I use and what I have seen done by a number of libraries and companies.
If you find it does not work for you then by all means use something else.
Good luck.
I prefer the two-configurations way.
Set up all the common settings via the 'All Configurations' item in the project properties window, then set the configuration-specific settings separately. And it's done. Let's go coding.
There is also a very good feature named 'Batch Build', which builds the specified configurations in turn.
Multiple projects are the best way to go - this is the configuration I have most widely seen in the umpteen projects I have come across.
That said, it might also be possible to implement the third option by modifying your vcproj files on the fly from external tools (like a custom VBScript) that you could invoke from a makefile. You can use shell variables to control the behavior of the tool.
Note that you should still use Visual Studio to make the build; the makefile should only launch your external tool, if required, to make the mods, and then follow that with the actual build command.
I still use Visual Studio 6.0 due to issues that are preventing us from migrating to VS2005 or newer. Rebuilding causes severe issues (everything breaks)... so many of us are considering lobbying for a structured migration to GNU C++, to eventually get us off licensed Visual Studio products and onto Eclipse and Linux.
In Unix/Linux it is easy to build for all configurations... so I can't believe what a time and productivity sink it is to try to accomplish the same task in Visual Studio. For VS6.0 I have so far found that only having two separate projects seems to be workable. I haven't yet tried the multiple-configuration technique, but will see if it works in the older VS6.0.
Why not go for option 1 and generate the second set of project files from the first using a script or something? That way you know that the differences are JUST the pieces required to build a DLL or static lib.

Complex builds in Visual Studio

I have a few things that I cannot find a good way to perform in Visual Studio:
Pre-build step invokes a code generator that generates some source files which are later compiled. This can be solved to a limited extent by adding blank files to the project (which are later replaced with real generated files), but it does not work if I don't know names and/or the number of auto-generated source files. I can easily solve it in GNU make using $(wildcard generated/*.c). How can I do something similar with Visual Studio?
Can I prevent the pre-build/post-build events from running if the files they operate on have not changed ("make" behaviour)? The current workaround is to write a wrapper script that checks timestamps for me, which works, but is a bit clunky.
What is a good way to locate external libraries and headers installed outside of VS? In *nix case, they would normally be installed in the system paths, or located with autoconf. I suppose I can specify paths with user-defined macros in project settings, but where is a good place to put these macros so they can be easily found and adjusted?
Just to be clear, I am aware that better Windows build systems exist (CMake, SCons), but they usually generate VS project files themselves, and I need to integrate this project into existing VS build system, so it is desirable that I have just plain VS project files, not generated ones.
If you need make behaviour and are used to it, you can create Visual Studio makefile projects and include them in your solution.
If you want something less clunky, you can write Visual Studio macros and custom build events and tie them to specific build callbacks/hooks.
You can try something like Workspace Whiz, which will let you set up environment variables for your project in a file format that can be checked in. Then users can alter them locally.
I've gone through this exact problem and I did get it working using Custom Build Rules.
But it was always a pain and worked poorly. I abandoned Visual Studio and went with a makefile system using Cygwin. Much better now.
cl.exe is the name of the VS compiler.
Update: I recently switched to using CMake, which comes with its own problems, and CMake can generate a Visual Studio solution. This seems to work well.
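To illustrate the CMake route for the code-generation part of the question, here is a rough sketch; mygen.exe, spec.txt and the other file names are hypothetical, and it covers the case where the generated file names are known up front:
# Run the (hypothetical) generator whenever its input changes,
# then compile its output together with the normal sources.
add_custom_command(
    OUTPUT ${CMAKE_CURRENT_BINARY_DIR}/generated.c
    COMMAND mygen.exe ${CMAKE_CURRENT_SOURCE_DIR}/spec.txt
            -o ${CMAKE_CURRENT_BINARY_DIR}/generated.c
    DEPENDS ${CMAKE_CURRENT_SOURCE_DIR}/spec.txt
    COMMENT "Generating generated.c")
add_executable(myapp main.c ${CMAKE_CURRENT_BINARY_DIR}/generated.c)
Because the command declares its OUTPUT and DEPENDS, it is only re-run when spec.txt changes, which is essentially the "make" behaviour asked about in point 2.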
Specifically for #3, I use property pages to designate 3rd party library location settings (include paths, link paths, etc.). You can use User Macros from a parent or higher level property sheet to designate the starting point for the libraries themselves (if they are in a common root location), and then define individual sheets for each library using the base path macro. It's not automatic, but it is easy to maintain, and every developer can have a different root directory if necessary (it is in our environment).
One downside of this approach is that the include paths constructed this way are not included in the search paths for Visual Studio (unless you duplicate the definitions in the Projects and Directories settings for VS). I spoke to some MS people at PDC08 about getting this fixed for VS2010, and improving the interface in general, but no solid promises from them.
(1). I don't know a simple answer to this, but there are workarounds:
1a. If the content of the generated files does not clash (i.e. there are no common static identifiers, etc.), you can add a single file to the project, such as AllGeneratedFiles.c, and modify your generator to append a #include "generated/file.c" line to this file whenever it produces generated/file.c.
1b. Or you can create a separate makefile-based project for generated files and build them using nmake.
(2). Use a custom build rule instead of a post-build event. You can add a custom build rule by right-clicking on the project name in the Solution Explorer and selecting Custom Build Rules.
(3). There is no standard way of doing this; it has to be defined on a per-project basis. One approach is to use environment variables to locate external dependencies. You can then use those environment variables in the project properties. Add a readme.txt describing the required tools and libraries and the corresponding environment variables the user has to set, and it should be easy enough for anyone to get set up.
Depending on exactly what you are trying to do, you can sometimes have some luck with using a custom build step and setting your dependencies properly. It may be helpful to put all the generated code into its own project and then have your main project depend on it.

Using Makefile instead of Solution/Project files under Visual Studio (2005)

Does anyone have experience using makefiles for Visual Studio C++ builds (under VS 2005) as opposed to using the project/solution setup? For us, the way projects/solutions work is not intuitive and leads to configuration explosion when you are trying to tweak builds with specific compile-time flags.
Under Unix, it's pretty easy to set up a makefile that has its default options overridden by user settings (or other configuration setting). But doing these types of things seems difficult in Visual Studio.
By way of example, we have a project that needs to get built for 3 different platforms. Each platform might have several configurations (for example debug, release, and several others). One of my goals on a newly formed project is to have a solution that can have all platform builds living together, which makes building and testing code changes easier since you aren't having to open 3 different solutions just to test your code. But Visual Studio will require 3 * (number of base configurations) configurations, i.e. PC Debug, X360 Debug, PS3 Debug, etc.
It seems like a makefile solution is much better here. Wrapped with some basic batch files or scripts, it would be easy to keep the configuration explosion to a minimum and only maintain a small set of files for all of the different builds that we have to do.
However, I have no experience with makefiles under visual studio and would like to know if others have experiences or issues that they can share.
Thanks.
(post edited to mention that these are C++ builds)
I've found some benefits to makefiles with large projects, mainly related to unifying the location of the project settings. It's somewhat easier to manage the list of source files, include paths, preprocessor defines and so on, if they're all in a makefile or other build config file. With multiple configurations, adding an include path means you need to make sure you update every config manually through Visual Studio's fiddly project properties, which can get pretty tedious as a project grows in size.
Projects which use a lot of custom build tools can be easier to manage too, such as if you need to compile pixel / vertex shaders, or code in other languages without native VS support.
You'll still need to have various different project configurations however, since you'll need to differentiate the invocation of the build tool for each config (e.g. passing in different command line options to make).
Immediate downsides that spring to mind:
Slower builds: VS isn't particularly quick at invoking external tools, or even working out whether it needs to build a project in the first place.
Awkward inter-project dependencies: It's fiddly to set things up so that building a dependent project causes the base project to build first, and fiddlier to make sure they get built in the right order. I've had some success getting SCons to do this, but it's always a challenge to get working well.
Loss of some useful IDE features: Edit & Continue being the main one!
In short, you'll spend less time managing your project configurations, but more time coaxing Visual Studio to work properly with it.
Visual Studio is built on top of MSBuild configuration files. You can consider the *proj and *.sln files as makefiles. They allow you to fully customize the build process.
While it's technically possible, it's not a very friendly solution within Visual Studio. It will be fighting you the entire time.
I recommend you take a look at NAnt. It's a very robust build system where you can do basically anything you need to.
Our NAnt script does this on every build:
Migrate the database to the latest version
Generate C# entities off of the database
Compile every project in our "master" solution
Run all unit tests
Run all integration tests
Additionally, our build server leverages this and adds 1 more task, which is generating Sandcastle documentation.
If you don't like XML, you might also take a look at Rake (ruby), Bake/BooBuildSystem (Boo), or Psake (PowerShell)
You can use NAnt to build the projects individually, thus replacing the solution, and have one coding solution and no build solutions.
One thing to keep in mind is that the solution and csproj files from VS 2005 and up are MSBuild scripts. So if you get acquainted with MSBuild, you might be able to wield the existing files, to make VS easier and to make your deployment easier.
We have a similar setup to the one you are describing. We support at least 3 different platforms, and we found that using CMake to manage the different Visual Studio solutions works well. Setup can be a bit painful, but it pretty much boils down to reading the docs and a couple of tutorials. You should be able to do virtually everything you can do from the properties of the projects and the solution.
Not sure if you can have all three platform builds living together in the same solution, but you can use CruiseControl to take care of your builds and run your testing scripts as often as needed.