What is a good way to set preprocessor values for an imported library - C++

I apologize if this is covered elsewhere, but I was unable to find the information readily. I am working with an extant library for my company that uses preprocessor directives to add and remove specialized capabilities. For example, we might have an IMPORT_OPENBLAS and an IMPORT_SPEEX to indicate that the build needs to support use of the OpenBLAS and Speex libraries. We also have unit tests based on the Google Test framework, which statically link in our library; some of them need said preprocessor directives enabled to run. The two places where we typically run the unit tests are through Visual Studio (2008, if that makes a difference) and through Ant, which invokes vsbuild.exe to do the build.
So, long story short, I have been tasked with adding additional capabilities such as the above libraries. We have other projects that use our library and specifically don't want those capabilities turned on, partly due to issues with dependencies and partly because they don't want the additional complexity. My first impulse was to put the preprocessor definitions into the unit test project, since it builds our library as a dependency anyhow, but that doesn't seem to work. Is there any way to flag that a given preprocessor definition needs to be turned on when compiling a project that is built as a dependency?
Another alternative is to create new build targets for the unit tests which specifically set the right preprocessor flags, but I want to avoid that if possible. We already have 10 different build targets covering different linking methods, processor sizes, and debug versus release modes, and one of my earlier tasks involved getting them all to work again: no one had compiled some of them for months, since our primary release is based on just two of those targets.
Thank you for any help you can provide.

You could simply have a header file that contains those defines and include it in all the files in the project through the project properties. See Project Properties -> Configuration Properties -> C/C++ -> Advanced -> Force Includes (this is the /FI compiler switch).
In other words, this file would be included in every project that needs the defines.
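For example, the forced-include header could be as small as the sketch below (the file name is an invented example; the defines are the ones from the question):

    // build_features.h - forced into every translation unit via
    // C/C++ -> Advanced -> Force Includes (/FI).
    #pragma once

    #define IMPORT_OPENBLAS 1
    #define IMPORT_SPEEX 1

The unit-test build can then force-include a variant of this header that turns the capabilities on, while other consumers force-include one that leaves them off (or force-include nothing at all), without creating any new build targets.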


How to make Visual Studio 2017 C++ project more portable between computers?

I am developing a C++ project which relies on many third-party libraries (*.lib files and *.h files). I store these libraries in a folder that is not tied to the project, like C:/thirdpartylib. Relative paths are not an option, since they become way too long. I have defined the connections to the libraries in the linker settings and in the general C++ settings.
But when I pass the project to my supervisor, he has to reset all the paths to the libraries to match his environment. We use Git, and the project file is being tracked. He stores the third-party libraries differently than I do.
Is there any way to make a project more portable? Maybe it is possible to store paths in some sort of config file?
As @gaurav says, the way to deal with this in Visual Studio is with property sheets. It's unfortunate that this term is used for two different things in VS, but I guess they just ran out of names (spoiler alert).
These are very powerful, once you learn how they work, and they're just what you need here because they let you define macros, and these macros can in turn be used in the rest of your project to refer to the (volatile) location of your various libraries. This is a trick that everyone who uses VS should know, but it seems that a lot of people don't.
I don't think it's worth me trying to walk you through the mechanics of setting one up here, because Microsoft already document it in the Visual Studio help file. Suffice it to say, you do it in the Property Manager; that should help you track down the relevant information.
There is also an excellent blog post here which I recommend you read before you do anything else:
http://www.dorodnic.com/blog/2014/03/20/visual-studio-macros/
It's also on Wayback Machine here:
https://web.archive.org/web/20171203113027/http://www.dorodnic.com/blog/2014/03/20/visual-studio-macros/
OK, so now we know how to define a macro, what can we do with it?
Well, that's actually the easy part. If we have a macro called, say, FOO, then wherever we want to expand that macro in some project setting or other we can just use $(FOO). There's also a bunch of macros built into the IDE as listed here:
https://msdn.microsoft.com/en-us/library/c02as0cs.aspx
So, you, I imagine, will want to define macros for the include and lib directories for each of your external libraries and you can then use these to replace the hard-coded paths you are currently using in your project.
And that, I reckon, should sort you out, because the definitions of the macros themselves are stored in a separate file, external to your project file, and different users / build machines can use different files. IIRC, these have extension .props.
Also, you can define a macro in terms of another macro or macros, and that makes the job easier still.
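For illustration, a user-macro property sheet is just a small XML file; a minimal sketch might look like this (the macro names and paths are invented, so adapt them to your layout):

    <?xml version="1.0" encoding="utf-8"?>
    <!-- ThirdParty.props: example only; names and paths are assumptions -->
    <Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
      <PropertyGroup Label="UserMacros">
        <ThirdPartyRoot>C:\thirdpartylib</ThirdPartyRoot>
        <!-- A macro defined in terms of another macro -->
        <FooIncludeDir>$(ThirdPartyRoot)\foo\include</FooIncludeDir>
        <FooLibDir>$(ThirdPartyRoot)\foo\lib</FooLibDir>
      </PropertyGroup>
      <ItemGroup>
        <!-- Makes the macro show up in the IDE's macro list -->
        <BuildMacro Include="ThirdPartyRoot">
          <Value>$(ThirdPartyRoot)</Value>
        </BuildMacro>
      </ItemGroup>
    </Project>

Project settings can then refer to $(FooIncludeDir) and $(FooLibDir) instead of hard-coded paths, and each machine keeps its own copy of the .props file (for example, left untracked in Git) pointing at its own locations.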
So, who still thinks that Microsoft don't know how to create a build system? Visual Studio is a fantastic piece of software once you get used to it; there's just a bit of a learning curve.
The way to go for a large project is to use a package manager. There are some good options out there; on Windows with Visual Studio you can use vcpkg or NuGet (the unmanaged/native packages).
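With vcpkg, for instance, pulling in a dependency and wiring it into Visual Studio is typically just a couple of commands (the package name below is only an example):

    vcpkg install openblas:x64-windows
    vcpkg integrate install

The integrate step hooks the installed headers and libs into MSBuild user-wide, so nothing machine-specific needs to live in the tracked project file.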
If you cannot use a package manager for some reason, the next best thing is to commit all the dependencies to the Git repo. If you only target Windows platforms like Windows 8 or 10 and only want to support VS2017, then committing the compiled dependencies is not a problem. The downside is that the repo will become huge.
For a tiny school project the latter option is viable.

Visual Studio Solution Dependencies

I'm working at an organization with a product suite based on several hundred Visual Studio solutions (mostly C++). Some of these solutions generate libraries that are used by other solutions, and there's also a common "include" folder containing headers that are shared by multiple modules.
The issue is that the dependencies are not explicitly stated anywhere, and the build system resolves dependencies by specifying a linear build order that makes sure the dependent modules get built at the right time. This works well for the build system but leaves developers at a disadvantage when trying to work on components with many direct and indirect external dependencies. For example, I might want to edit one of the library projects or shared headers and then build all the affected modules without necessarily knowing ahead of time which ones are affected. Another use case involves building a module after doing a fresh pull from TFS and having the modules it depends on built first without having to build the entire system.
I am wondering if there are any tools available that can automate dependency generation for building large projects. I have considered creating a few really big solutions that encapsulate the other solutions, but that seems really awkward and clumsy. Also, I don't like the idea of having developers manually specify dependencies, as it can be error prone, especially with such a large code base. I worked with SCons a few years ago and really liked the way it could parse source files and automatically discover all the dependencies. Is there anything available today that can do the same thing with Visual Studio solutions?
This is not a duplicate of Visual Studio: how to handle project dependencies right?
I need to emphasize the magnitude of the problem I am trying to solve. This is a very large existing code base. In the main directory there are several hundred sub-folders, each one containing one or more VS solutions (not projects). Each solution, in turn, contains one or more projects. As I said before, I'm not trying to establish dependencies among a few projects in a solution. The problem is much bigger than that. I'm trying to find a way to establish dependencies among the solutions themselves (several hundred of them). For example, one solution may contain some projects that generate libraries for security, others for communications, etc. There may be, for example, dozens of solutions that use the communications libraries. So essentially I'm trying to create a directed acyclic graph with hundreds of nodes and potentially tens of thousands of edges.
You could use CMake (https://cmake.org/). With it, you can specify several libraries and apps to be built. Once configured, you can modify a project and the build will just update the dependent projects. CMake also provides a Visual Studio generator, so you can continue using that IDE.
A possible disadvantage for you is that, to configure the build, you must explicitly specify, for each project (library or executable), which projects it must be linked against and which folders it must include. There are ways to define some global includes and links, but whether they are useful will depend on your problem.
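As an illustrative sketch (target names and paths here are invented, not taken from the question), the explicit dependency edges look like this:

    # CMakeLists.txt -- illustrative only; names and paths are assumptions
    cmake_minimum_required(VERSION 3.10)
    project(ProductSuite CXX)

    add_library(security STATIC security/src/crypto.cpp)
    target_include_directories(security PUBLIC include)

    add_library(communications STATIC comms/src/channel.cpp)
    # Explicit edge in the dependency graph: communications uses security.
    target_link_libraries(communications PUBLIC security)

    add_executable(server apps/server/main.cpp)
    # Linking pulls in the whole chain (communications -> security),
    # and editing security triggers rebuilds of both dependents.
    target_link_libraries(server PRIVATE communications)

Running the Visual Studio generator over this produces a solution in which the graph is explicit, so building any target builds exactly its direct and indirect dependencies first.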
VS does track dependencies (by parsing source files). It doesn't make sense that something could automatically set the dependencies of your VS projects; in any other build tool you'd still have to specify in some way that, for linking project A.exe, you need to use B.lib.
If you use newer VS versions, you should simply add references to the lib projects in your exe/dll projects. If you manually added project dependencies, most likely you should remove them all; in particular, make sure you don't make static lib projects dependent on each other. VS allows you to do that (for example, if the build of one library generates some source files that another static lib uses), but in general static libs shouldn't have any dependencies on each other, and this allows VS to optimize builds by building them in parallel.
For example, commonly you could have some kind of Base.lib, then System.lib and Graphics.lib. All of these are used by your App.exe. System.lib uses code from Base.lib; Graphics.lib uses code from System.lib and Base.lib. So the dependency chain seems clear, and you go and set it up in VS, and that's a mistake! In cases like this you should make these three libs independent in VS, and only App.exe should be dependent on all of them (e.g. it should have references to all of them). VS will figure out the correct build order for these projects.
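In project-file terms (VS2010-and-later .vcxproj format; the paths are invented), the recommended setup is references on the exe only:

    <!-- Inside App.vcxproj: App.exe references all three libs directly.
         The lib projects themselves reference nothing. -->
    <ItemGroup>
      <ProjectReference Include="..\Base\Base.vcxproj" />
      <ProjectReference Include="..\System\System.vcxproj" />
      <ProjectReference Include="..\Graphics\Graphics.vcxproj" />
    </ItemGroup>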
Regarding the CMake case: it simply generates VS projects and solutions, so if you use VS then CMake cannot do more than VS itself can.

What is the correct way to import a type library in Visual Studio?

Background
Our build uses Ant and a custom task to build Visual Studio projects/solutions as well as some Java projects. The structure is basically a large tree, and artifacts from the projects are typically copied upwards to a common build directory.
This was previously a complete mess and I've greatly simplified the Ant scripts, and now I'm most of the way through the Visual Studio projects/solutions. These projects are extremely old and have been upgraded through every version of Visual Studio up to 2013. Part of the changes I made was to use as many of the default project properties and macros as possible. Most of these used to be hard-coded.
Although I have modified the projects to use the $(Configuration) macro to separate the artifacts from different configurations, these still get copied to a common location for other dependent projects in other solutions. So, to avoid confusion and actually make sure our debug builds link to all the debug libraries (which wasn't happening previously), I have been adding suffixes to the Target Name. E.g. the Target Name of a Debug Unicode build will be $(ProjectName)DU.
Problem
This has been great so far, but now I'm not sure how to complete these changes for one of our COM libraries. This library has an IDL file, and the MIDL compiler generates a TLB file. Maybe this isn't a good way to do it, but for now I wanted the TLB file to also have a different suffix depending on the build. The problem is that when I change the Type Library property in the MIDL configuration, it breaks the compile-time directives in the RC file. I figured it might be possible to use #ifdefs in the TEXTINCLUDE block depending on whether _UNICODE or _DEBUG is set (provided I do it via the Resource Includes dialog so I don't break the RC file). It also means that there are other importlib attributes that will also need #ifdef checks.
At the moment it kinda works without renaming the TLB files, but that's only because they are currently only being used within this solution.
Has anyone ever done anything like this, or does anyone know a better solution?
Update
I guess what I really need to know here is: what is the best way to use types from one COM DLL in another? Should I even be using importlib? The MSDN documentation says that in most cases you should be using import instead. I tried this but broke a whole bunch of stuff.

How to set up a separate Boost Test project in Visual Studio 2010

I want to use Boost Test to unit test my code in Visual Studio 2010. I've downloaded and built the latest version of the library.
I've read a lot on the subject here and elsewhere on the internet, and people seem to suggest having a second project within your solution exclusively for your tests.
Fine, sounds good. I'm having trouble actually setting this up, however. I've yet to find a clear explanation of the best way to do it.
Do I need to use a Project Reference to make my unit test project reference my main project?
If so, do I still need to add the Include & Source directories of my main project in the properties of my unit test project? If so, what's the advantage of using the Project Reference in the first place?
Do I have to have my main project output a library for my unit test project to link in? Again, I thought that Project References would make this unnecessary, but it seems I don't really understand Project References.
If at all possible, could I get a very idiot-proof, step-by-step procedure for setting up a Boost Test unit test project alongside a main project in VS2010?
Would I be better off going with the method laid out here (one project, different configurations to build tests or actual project exe):
http://blog.yastrebkov.com/2010/07/boost-test-setup-and-usage.html
Many thanks,
There is no magic behind setting up a Boost.Test project; maybe that's because it's a regular C++ (executable) project, in no way different from a "normal" application. This is what I do:
Create a new C++ project. I always choose Win32 Executable with precompiled headers. I have a naming convention that all test projects using Boost.Test start with "tests.boost.testee_name..."
In "stdafx.h", add the include for <boost/unit_test.hpp> and define the BOOST_TEST_MODULE (I always choose the project name). Also, add all other includes for external components this project requires, e.g. other boost libraries, stl headers etc. This results in considerably faster compilation times.
The testee must be a library (dynamic or static), so "add reference" to all required dependencies. You can of course test header-only libraries; in that case, do not add references.
Add source files to your test project, according to the Boost.Test manual. The convention I enforce is one BOOST_FIXTURE_TEST_SUITE per file (see the sketch after this list).
For convenience, I have a custom property sheet tailored for Boost unit tests, which I add to each Boost test project. Among other things, it contains a post-build event which runs the tests.
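A single-file sketch of what the pieces above add up to (module, fixture and suite names are all invented; to keep it self-contained this uses the header-only variant of Boost.Test, whereas the steps above include <boost/test/unit_test.hpp> via "stdafx.h" and link the compiled library):

    // tests_main.cpp - minimal, self-contained Boost.Test sketch.
    // BOOST_TEST_MODULE must be defined in exactly one translation unit.
    #define BOOST_TEST_MODULE tests.boost.mylib
    #include <boost/test/included/unit_test.hpp>

    // Fixture: constructed before and destroyed after every test case
    // in the suite below.
    struct WidgetFixture
    {
        WidgetFixture() : value(42) {}   // common set-up
        ~WidgetFixture() {}              // common tear-down
        int value;
    };

    // Convention from the answer: one BOOST_FIXTURE_TEST_SUITE per file.
    BOOST_FIXTURE_TEST_SUITE(WidgetSuite, WidgetFixture)

    BOOST_AUTO_TEST_CASE(ValueIsInitialized)
    {
        BOOST_CHECK_EQUAL(value, 42);
    }

    BOOST_AUTO_TEST_SUITE_END()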
I have to add that, lately, I switched to MSTest with Visual Studio 2012, which allows a more comfortable way to manage tests and test results. Nevertheless, for the most important parts of the software, I am still writing Boost tests in order to ensure correctness with older toolsets and potentially other platforms.
Cheers,
Paul

Complex builds in Visual Studio

I have a few things that I cannot find a good way to perform in Visual Studio:
Pre-build step invokes a code generator that generates some source files which are later compiled. This can be solved to a limited extent by adding blank files to the project (which are later replaced with real generated files), but it does not work if I don't know the names and/or the number of auto-generated source files. I can easily solve this in GNU make using $(wildcard generated/*.c). How can I do something similar with Visual Studio?
Can I prevent the pre-build/post-build events from running if the files do not need to be modified ("make" behaviour)? The current workaround is a wrapper script that checks timestamps for me, which works, but is a bit clunky.
What is a good way to locate external libraries and headers installed outside of VS? On *nix they would normally be installed in the system paths, or located with autoconf. I suppose I can specify paths with user-defined macros in the project settings, but where is a good place to put these macros so they can be easily found and adjusted?
Just to be clear, I am aware that better Windows build systems exist (CMake, SCons), but they usually generate VS project files themselves, and I need to integrate this project into existing VS build system, so it is desirable that I have just plain VS project files, not generated ones.
If you need make behaviour and are used to it, you can create Visual Studio makefile projects and include them in your solution.
If you want something less clunky, you can write Visual Studio macros and custom build events and tie them to specific build callbacks/hooks.
You can try something like workspacewhiz, which will let you set up environment variables for your project in a file format that can be checked in. Then users can alter them locally.
I've gone through this exact problem and I did get it working using Custom Build Rules. But it was always a pain and worked poorly, so I abandoned Visual Studio and went with a Makefile system using Cygwin. Much better now. (cl.exe, the VS compiler, can still be invoked from the Makefiles.)
Update: I recently switched to using CMake, which comes with its own problems, but CMake can generate a Visual Studio solution. This seems to work well.
Specifically for #3, I use property pages to designate 3rd party library location settings (include paths, link paths, etc.). You can use User Macros from a parent or higher level property sheet to designate the starting point for the libraries themselves (if they are in a common root location), and then define individual sheets for each library using the base path macro. It's not automatic, but it is easy to maintain, and every developer can have a different root directory if necessary (it is in our environment).
One downside of this approach is that the include paths constructed this way are not included in the search paths for Visual Studio (unless you duplicate the definitions in the Projects and Directories settings for VS). I spoke to some MS people at PDC08 about getting this fixed for VS2010, and improving the interface in general, but no solid promises from them.
(1). I don't know a simple answer to this, but there are workarounds:
1a. If the content of the generated files does not clash (i.e. there are no conflicting static identifiers etc.), you can add to the project a single file, such as AllGeneratedFiles.c, and modify your generator to append a #include "generated/file.c" to this file whenever it produces generated/file.c (see the sketch below, after 1b).
1b. Or you can create a separate makefile-based project for the generated files and build them using nmake.
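A sketch of the 1a hub file (the file names are invented):

    /* AllGeneratedFiles.c - added to the VS project once, by hand.
       The code generator appends one #include line per file it emits,
       so the project never needs to know how many generated files
       exist or what they are called. */
    #include "generated/parser.c"
    #include "generated/lexer.c"
    /* ...the generator appends further lines here... */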
(2). Use a custom build rule instead of a post-build event. You can add a custom build rule by right-clicking on the project name in the Solution Explorer and selecting Custom Build Rules.
(3). There is no standard way of doing this; it has to be defined on a per-project basis. One approach is to use environment variables to locate external dependencies. You can then use those environment variables in the project properties. Add a readme.txt describing the required tools and libraries and the corresponding environment variables the user has to set, and it should be easy enough for anyone to get set up.
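For example (the variable name FOOLIB_ROOT is invented), if every developer sets FOOLIB_ROOT=C:\libs\foo, the project properties can reference it like any built-in macro:

    Additional Include Directories:  $(FOOLIB_ROOT)\include
    Additional Library Directories:  $(FOOLIB_ROOT)\lib

Environment variables are expanded with the same $(NAME) syntax as the IDE's own macros, so nothing machine-specific needs to be stored in the project file.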
Depending on exactly what you are trying to do, you can sometimes have some luck with using a custom build step and setting your dependencies properly. It may be helpful to put all the generated code into its own project and then have your main project depend on it.