Good Practice: How to define paths to external libraries for compilation - C++

I am fairly new to compiling and building projects, so pardon me if my approach seems a bit odd. Any tip is welcome.
I am currently working on a 3D geometry C++ project (which is a DLL). This project uses external libraries such as Boost, so when building the project I have to define the directories in which the .dll, .lib, and .h/.hpp files are located.
Currently I am using SCons to build the project and define those paths directly in the SConstruct file.
However, those paths are later reused for other build operations (in the present case, compiling the C++ code as a MEX file for MATLAB, but that's not really the point here).
Thus I currently have to define the same paths in different places, which is inefficient. In addition, the project has to be easy to set up for another user, so having to change and update paths in many different files is something I would like to avoid.
From where I stand I see two alternatives:
First, I could ask the user to define environment variables and read them from inside my various build scripts. However, I am not really satisfied with this solution, for it asks the user for additional manipulation and, as far as I understand, I lose the cross-platform portability that SCons offers. (I know it might still be possible, but it requires some extra steps, and I'd like to keep things as simple as possible.)
Second, I could define all my paths in a single .txt (or similar) file at the root of my project and read it from my various build scripts. However, this makes the process sensitive to typos and parsing errors, which is not really to my taste.
So my question is the following:
Is there a better way or good practice to have the user input the paths necessary for compilation that satisfies the following:
Has the user input every path only once
Is done within the project folder through a file or something else
Is as foolproof as possible
Does not require too much additional downloading/installing (I don't really want the user to have to install brand new software for this. However, I'm fine with something like a simple, light .exe that I can add to my project files.)

SCons's Variables are likely your best choice here.
See: https://scons.org/doc/production/HTML/scons-user.html#sect-command-line-variables
It allows reading defaults from a file:
vars = Variables('custom.py')
You'd have to craft some logic to save any variables specified on the command line.
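For example, here is a minimal sketch of an SConstruct that reads defaults from custom.py, lets the command line override them, and writes the values back. BOOST_DIR and its default path are hypothetical placeholders:
# SConstruct: read defaults from custom.py; `scons BOOST_DIR=...` overrides them.
vars = Variables('custom.py')
# PathAccept skips the existence check, so the hypothetical default doesn't error out.
vars.Add(PathVariable('BOOST_DIR', 'Root of the Boost installation', 'C:/boost', PathVariable.PathAccept))
env = Environment(variables=vars)
# Persist whatever was specified on the command line back into custom.py.
vars.Save('custom.py', env)
Help(vars.GenerateHelpText(env))
# The variable is now a normal construction variable:
env.Append(CPPPATH=[env['BOOST_DIR']])
Since custom.py lives in the project folder, another user only has to edit (or pass on the command line) each path once, and every build script that reads the file picks it up.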

Related

Set output path for CMake generated files

My question is the following:
Is there a way to tell CMake where to generate its files, such as cmake_install.cmake, CMakeCache.txt, etc.?
More specifically, is there a way to set some commands in the CMake files that specify where to output these generated files? I have searched around the web for answers, and most people say there's no explicit way of doing this, while others say it might be possible using custom commands. Sadly, I'm not very strong in CMake, so I couldn't figure this out.
I'm currently using the CLion IDE, and there you can set the output path through the settings, but for flexibility I would like as much as possible to be done through the CMake files, so that compiling from different computers isn't that big of a hassle.
I would also like to avoid explicitly adding additional command line arguments etc.
I hope someone might have an answer for me, thanks in advance!
You can't (easily) do this and you shouldn't try to do it.
The build tree is CMake's territory. It allows you some tiny amount of customization there (for instance, you can specify where the final build artifacts are placed through the *_OUTPUT_DIRECTORY target properties), but it does not give you any direct control over where intermediate files, like object files or internal make scripts used for bookkeeping, are placed.
This is a feature. You have no idea how all the build systems supported by CMake work internally. Maybe you can move that internal file to a different location in your build process, which is based on Unix Makefiles. But maybe that will also horribly break my build process, which is using Visual Studio. The bottom line is: You shouldn't have to care about this. CMake should take care of it, and by taking some freedom away from you, it ensures that it can actually do that job on all supported build toolchains.
But this might still be an unsatisfactory answer to you. You're the developer, shouldn't you be in full control of the results produced by your build? Of course you should, which is why CMake again grants you full control over what goes into the install tree. That is, whatever ends up in the install directory when you call make install (or whatever is the equivalent of installing in your build toolchain) is again under your control.
So you do control everything that matters: The source tree, the install tree, and that tiny portion of the build tree where the final build artifacts go. The rest of the build tree is off-limits for you and for good reasons.

Best practices for porting a visual studio solution file to scons

I'm new to SCons and trying to port over an existing Visual Studio solution (.sln) which internally references many VS project files (.vcxproj). There are multiple outputs, including a variety of libraries and different executables.
From a conceptual point of view I'm unsure if I'm going down the right path and would appreciate any advice on how to do it better.
Here is my setup:
I have a top-level SConstruct file at the root of the code depot. Additionally, I have one SConscript file for each of my old VS project files. The SConstruct file calls the SConscript() function once for each of these SConscript files, passing the source directory and the output location as parameters.
Additionally, the SConstruct file creates and passes to each SConscript file an array of SCons environment instances. For example, there is one for compiling libraries, one for compiling executables, one for debug config, one for release, etc., and each SConscript file then chooses the one it wants, based on what it's trying to accomplish.
There are a couple things which I was wondering about:
1) Is there a better approach than creating multiple different environments, one for each configuration variation? Is that the expected usage pattern?
2) In Visual Studio, I could right-click on a specific project and select build to build only that project and the projects it depends on, ignoring the rest of the dependency graph in the .sln. With SCons, is it true that it will recompute the entire dependency graph every time I trigger a build of a specific library, even though in theory it would only need to compute a small portion of it?
Thanks for any advice.
Mark
Your approach of having a SConstruct call several subsidiary SConscript files is indeed a good way to organize your projects, and is called a hierarchical SCons build.
Regarding your questions, here are some things to consider:
Several different environments: Unless you have different compilers or compiler flags per builder or target (library, executable, etc.), I would say that the approach you are using is a bit of overkill. You could most likely achieve the same with just one environment. If you do need additional flags per subdirectory/builder, then you could consider passing the "main" environment to the subdirectories and, in the respective SConscripts, cloning the env and adding/appending what you need. This way the entire solution will be more modular, by avoiding repetition and keeping everything common in one central place.
Building certain projects/targets: You can do the same with SCons by selecting the target on the command line, like this: scons yourTarget. You can make the target names more manageable using the env.Alias() function. SCons does indeed analyze everything before building, but depending on the size of the project, it shouldn't be a problem; it's still quite fast. If build performance does become an issue, the SCons documentation has some pointers for improving it.
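As a rough sketch of that clone-and-alias pattern (the directory, define, and target names here are made up):
# SConstruct -- one shared environment, exported to the subdirectories.
env = Environment(CPPPATH=['#include'])
Export('env')
SConscript('libfoo/SConscript', variant_dir='build/libfoo', duplicate=0)

# libfoo/SConscript -- clone and extend rather than creating a new environment.
Import('env')
env = env.Clone()
env.Append(CPPDEFINES=['FOO_STATIC'])
lib = env.StaticLibrary('foo', Glob('*.cpp'))
env.Alias('foo', lib)  # allows building just this library with: scons foo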
Here are a few extra good things to know:
The SCons documentation is not bad compared to that of other open-source tools. At the bottom of the user guide, there are several appendices with lots of extra information. The SCons man page is quite complete too.
Paths can be confusing in SCons if you're not using '#' (which refers to the directory containing the SConstruct).
If you need to deal with MSVS projects, you can use the MSVSProject() and MSVSSolution() builders.

Cleaning up a VC++ 6 project

I'm working with a very old and large VC++ 6 project, and it's all messed up. There are unused files and folders everywhere, and copies of folders; it's just a mess to clean up by hand in its current state.
It will be done eventually, but is there any simple way to check which files and folders are actually used during a clean build?
The project settings don't help me at all, because the project simply uses copies of folders and additional include directories.
Any suggestions?
Well, if you parse the compiler output you can see which files are actually used. I also found this when googling around, which you might want to try (I haven't tried it myself). My way would be: clean the build, list all source files, build, and for each source file find its corresponding .obj. The ones without a .obj are not used. Note that this only works for source files; unused header files stay undetected.
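If you want to automate that .obj comparison, a small script along these lines could do it (a sketch; the directory names are guesses and need adjusting for your tree):
# List source files that have no matching .obj after a full build.
import os

SRC_DIR = 'src'        # assumed location of the sources
OBJ_DIR = 'Release'    # assumed intermediate directory of the build

for root, _, files in os.walk(SRC_DIR):
    for name in files:
        if name.endswith(('.c', '.cpp')):
            obj = os.path.splitext(name)[0] + '.obj'
            if not os.path.exists(os.path.join(OBJ_DIR, obj)):
                print('possibly unused:', os.path.join(root, name))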
VC6 will produce a makefile for you:
http://msdn.microsoft.com/en-us/library/aa233950%28v=vs.60%29.aspx
You can use the generated makefile (and the associated .dep file) as a starting point and edit it down to the list of files that get used in a build.
This will let you see the header files that the project depends on in addition to the .c/.cpp/.lib files that might show in the build log. One thing to keep in mind is that you'll probably also want to make sure you track the .dsw and .dsp workspace and project files.
If you're a bit adventurous, you might be able to convince the makefile to actually copy the source files to some other location for you, with an appropriate override of certain macros and/or dependencies. But that would probably be more trouble than it's worth for a one-time effort.
Finally, there's a commercial product, CopyWiz by Kinook Software, that seems to have features that might do what you're looking for (and it supports VC++ 6). Note: I'm not sure if it will do what you want, but it may be worth a look.
Yes. Run Process Monitor from SysInternals. It can capture all file system events and filter them based on the path and other factors.
So, set the filter to the root of your source tree and to only successful file reads (VC looks for headers in many places), and build your project. You'll probably still see several thousand events, so save them to a file, sort by path, and remove duplicate paths (headers especially will have many duplicate entries).
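Deduplicating those paths is easy to script once you export the events to CSV. A sketch, assuming the export file is named procmon_build.csv and the column is called 'Path' (verify both against your actual export):
# Reduce a Process Monitor CSV export to the set of unique file paths.
import csv

paths = set()
with open('procmon_build.csv', newline='') as f:
    for row in csv.DictReader(f):
        paths.add(row['Path'])

for p in sorted(paths):
    print(p)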

Separate "include" and "src" folders for application-level code? [closed]

This question concerns mostly Unix/Linux-style C++ development. I see that many C++ libraries store their header files in an "include" folder and source files in a "src" folder. For the sake of conformance I adopted this in my own code. But it is not clear to me whether this should be done for application code as well. I've seen a few cases where a flat directory structure is used instead. What would be the recommended approach?
I also separate them, but not strictly by extension; rather, by the visibility of the file.
Suppose you have a module that manages customer information and uses two classes to do this: Customer and CustomerValidityChecker.
Also suppose that other parts of your application only need to know about the Customer class, and that CustomerValidityChecker is only used by the Customer class to perform some checking.
Based on these assumptions I store the files like this:
Public folder (or include folder):
customer.h
Private folder (or source folder):
customer.cpp
customervaliditychecker.h
customervaliditychecker.cpp
That way, it becomes immediately clear for callers of your module which parts are accessible (public) and which parts aren't.
We have a build system that auto-generates our makefiles. One thing it does is recursively descend any subdirectories and build them as libraries, linking them together with the main directory's objects to make the application. (In practice, these "subdirectories" are usually symbolic links.) Libraries are static unless the directory name ends in ".so". One thing that's nice about this is that a full build of our system, which has many executables, doesn't have to repeatedly compile the common libraries.
However, as a result of this, there's no separation of headers and sources. And it has never been a problem. Honestly, I think it's better this way because headers and source files have commonality of location, and you can grab a directory and know you got everything you need to use it. It also works great with Subversion's "externals" feature, and similar features in other VCSs.
One last place where an include/src separation fails is if you use any code generators, such as flex, bison, or gengetopt. Figuring out where these tools should put their outputs so they get built is tricky if you've spread things out.
It makes sense to separate them for shared libraries because they may be distributed in a compiled form without the source. I've seen projects that separate out "public" headers (headers that may be accessed from code outside your project or library) while leaving "private" headers and source files in the same directory. I think it's good to use a consistent approach whether you're writing shared library or application level code because you never know when you may want to turn something that you've written at the application level into a lower level library that is shared by multiple projects.
A lot depends on the size of project involved. Up to a few dozen files or so, keeping them in one directory tends to be more convenient. For a bigger application that includes hundreds or thousands of files, you start to look for ways to separate them (though in the projects I've worked on, it was done more on functional lines than src/include). In between those, it's probably open to question.
I don't do this; there seems little advantage in it. Since headers tend to have a different extension from source files, you can have your editor show them separately if you really feel the need. Visual Studio does this by default, but I disable it since I prefer seeing them together.
Bottom Line: sources and headers that are still changing go in /src. Code that has crystallised should go in /lib & /include (actually you could keep all .libs and their .hs in /lib).
Keep own sources and headers together, provided they are (a) specific to this project or (b) have not yet been factored out as a shared library.
Once certain sources in the main project have been factored out as a (relatively stable) library, place the .a or .lib into /lib, and its public interface header into /include.
All third party libraries and their public interface headers also go into /lib & /include.
As others note, it is often more compatible for tools / IDEs to access .h/.c from one folder. But from an organisational view it can be useful to separate changing local code from stable lib code.
There is no clear advantage to either in my view. I finally decided to keep program and header files together because my editor (Visual SlickEdit) happens to provide additional referential features when they are not separated.
I almost always create include and src folders to split up my source code. I think it makes the folder less cluttered and files are easier to find in my IDE. But I think this is just a matter of taste.
Either method is valid. How you do this depends on the coding style you want to follow.
I place include (header) and source files in the same directory (folder), and I create different folders for different themes. I get frustrated when trying to find header files (while debugging and also when researching the code). In some shops, there are only two folders: source and includes. Those directories tend to grow exponentially, and reusing code becomes a nightmare at best.
IMHO, organizing by theme is better. Each theme folder should build into at least one library. Different projects can easily include the themes by searching or including the folders; the projects only need to include the libraries. Smart build engines can list the theme folders as dependencies, which speeds up the build process.
The theme organization also adds a bit of safety to the project. Accidental damage to files (such as removing the wrong ones or replacing with different versions) is reduced since files are located in different directories. Deletion of files in the "Person" folder will not affect files in the "Shape" folder.
This is just my opinion, Your Mileage May Vary.
We have a build system which uses this rule. This build system is sconspiracy, a set of scripts to configure SCons, dedicated to the C++ world. You can see an example which uses this tool: fw4spl

Complex builds in Visual Studio

I have a few things that I cannot find a good way to perform in Visual Studio:
Pre-build step invokes a code generator that generates some source files which are later compiled. This can be solved to a limited extent by adding blank files to the project (which are later replaced with real generated files), but it does not work if I don't know the names and/or the number of auto-generated source files. I can easily solve this in GNU make using $(wildcard generated/*.c). How can I do something similar with Visual Studio?
Can I prevent the pre-build/post-build event from running if the files do not need to be modified ("make" behaviour)? The current workaround is to write a wrapper script that checks timestamps for me (see the sketch at the end of this question), which works but is a bit clunky.
What is a good way to locate external libraries and headers installed outside of VS? In *nix case, they would normally be installed in the system paths, or located with autoconf. I suppose I can specify paths with user-defined macros in project settings, but where is a good place to put these macros so they can be easily found and adjusted?
Just to be clear, I am aware that better Windows build systems exist (CMake, SCons), but they usually generate VS project files themselves, and I need to integrate this project into existing VS build system, so it is desirable that I have just plain VS project files, not generated ones.
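For reference, the timestamp-checking wrapper from point 2 amounts to roughly this (the file and script names are placeholders):
# Run the code generator only when its output is missing or out of date.
import os
import subprocess
import sys

src, dst = 'model.def', 'generated/model.c'   # hypothetical generator input/output

if not os.path.exists(dst) or os.path.getmtime(src) > os.path.getmtime(dst):
    subprocess.check_call([sys.executable, 'codegen.py', src, dst])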
If you need make behavior and are used to it, you can create Visual Studio makefile projects and include them in your project.
If you want something less clunky, you can write Visual Studio macros and custom build events and tie them to specific build callbacks/hooks.
You can try something like Workspace Whiz, which will let you set up environment variables for your project in a file format that can be checked in. Then users can alter them locally.
I've gone through this exact problem, and I did get it working using Custom Build Rules.
But it was always a pain and worked poorly. I abandoned Visual Studio and went with a Makefile-based system using Cygwin. Much better now.
(cl.exe is the name of the VS compiler.)
Update: I recently switched to using CMake, which comes with its own problems, and CMake can generate a Visual Studio solution. This seems to work well.
Specifically for #3, I use property pages to designate 3rd party library location settings (include paths, link paths, etc.). You can use User Macros from a parent or higher level property sheet to designate the starting point for the libraries themselves (if they are in a common root location), and then define individual sheets for each library using the base path macro. It's not automatic, but it is easy to maintain, and every developer can have a different root directory if necessary (it is in our environment).
One downside of this approach is that the include paths constructed this way are not included in the search paths for Visual Studio (unless you duplicate the definitions in the Projects and Directories settings for VS). I spoke to some MS people at PDC08 about getting this fixed for VS2010 and improving the interface in general, but got no solid promises from them.
(1). I don't know a simple answer to this, but there are workarounds:
1a. If the content of the generated files does not clash (i.e., there are no conflicting static identifiers, etc.), you can add a single file to the project, such as AllGeneratedFiles.c, and modify your generator to append a #include "generated/file.c" line to this file whenever it produces generated/file.c (a sketch of this appears at the end of this answer).
1b. Or you can create a separate makefile-based project for generated files and build them using nmake.
(2). Use a custom build rule instead of post-build event. You can add a custom build rule by right-clicking on the project name in the Solution Explorer and selecting Custom Build Rules.
(3). There is no standard way of doing this; it has to be defined on a per-project basis. One approach is to use environment variables to locate external dependencies. You can then use those environment variables in project properties. Add a readme.txt describing required tools and libraries and corresponding environment variables which the user has to set, and it should be easy enough for anyone to set up.
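To illustrate workaround 1a above: instead of appending, the generator could simply rewrite AllGeneratedFiles.c from scratch on every run, which also copes with deleted files. A sketch, with the file and directory names assumed:
# Rebuild AllGeneratedFiles.c so it #includes every generated translation unit.
import glob

with open('AllGeneratedFiles.c', 'w') as out:
    for path in sorted(glob.glob('generated/*.c')):
        out.write('#include "%s"\n' % path.replace('\\', '/'))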
Depending on exactly what you are trying to do, you can sometimes have some luck with using a custom build step and setting your dependencies properly. It may be helpful to put all the generated code into its own project and then have your main project depend on it.