Building basic Boost applications with bjam

I can find tons of general purpose documentation on Boost.Build, but surprisingly nothing on how to use it to build simple Boost applications.
I compiled Boost for MinGW with bjam and got all the libraries and includes in C:\Boost. Now what would a basic Jamroot file look like to use all this?
The libs all have complicated names like 'libboost_filesystem-mgw34-mt-s.lib'; I'm sure there is some kind of magic switch somewhere to just say 'link against libboost_filesystem'!

Of course there are shortcuts! An example project could look like:
#jamfile - an example Boost.Build project
exe my_exe : [ glob *.cpp ] /boost//filesystem ;
This makes an executable from all .cpp files in the project's directory and uses Boost.Filesystem.
With this approach you don't need to build any of the Boost libraries manually; bjam will take care of that itself as necessary. It will also ensure your app always links against Boost libraries compiled with the right options.
There is some initial configuration effort needed to prepare an environment for Boost.Build, such as creating a user-config.jam, a boost-build.jam and, for convenience, a Jamroot (containing e.g. a use-project /boost : /path/to/boost statement). But that's a one-time effort, and after that things are much easier than before.
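For illustration, that one-time setup might look roughly like this (the paths are placeholders for your own Boost source and Boost.Build locations, not taken from the question):
# boost-build.jam - tells bjam where the Boost.Build system lives
boost-build /path/to/boost/tools/build/v2 ;
# user-config.jam - declares which toolset to use
using gcc ;
# Jamroot - makes the Boost source tree available as project /boost
use-project /boost : /path/to/boost ;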
http://www.boost.org/boost-build2/doc/html/index.html

It's not necessary in any way to use bjam in order to build code that uses Boost.
So use gmake, or batch scripts, or an IDE, or whatever you like. bjam is one of the harder choices since, as you've found, the documentation is poor.
There is a very basic tutorial at:
http://www.boost.org/doc/libs/1_39_0/more/getting_started/unix-variants.html
or
http://www.boost.org/doc/libs/1_39_0/more/getting_started/windows.html

Related

How to use cmake to get rapidcheck (property based testing) working in C++?

I'd like to do some property-based testing in a C++ library I'm working on, and was thinking of going with RapidCheck unless somebody has a better idea. (I will need, for example, to generate arbitrary std::set<int>, and if I can place bounds on the range of int in the sets and the size of the sets, all the better.)
All this being said, I'm still a bit of a cmake newb. There appear to be no instructions in RapidCheck except to include it as part of the source code (although downloading it would be better). I have gotten to the point where I can include the headers for RapidCheck in my code, but when I try to build any app using RapidCheck, I'm told that there are symbols from RapidCheck missing or that the rapidcheck library is missing.
I'm assuming that I have to build RapidCheck itself as part of the project to generate the library, but I'm not entirely sure how to do this and it seems difficult to find any examples where this is done.
Does anyone have any suggestions of examples where such things are done so that I can see the string of commands necessary to build a 3rd party API and include the library when building the executables, or - even better - an example of a project using RapidCheck that does exactly this? The lack of documentation on how to set this up is discouraging.
I hope this is not overly vague. To summarize, what I'd like to do from CMake:
Preferably download RapidCheck (although including the files directly from the RapidCheck project would be fine as well).
Run the required commands and set up the necessary variables to have my test code (in ${PROJECT_SOURCE_DIR}/test) be able to access the RapidCheck headers.
Generate (if necessary) the RapidCheck library and make it so that I can link it to the tests I'm running.
Thanks in advance for any help you might be able to offer!
This is probably not the right way to do this, but maybe it will help:
I was able to get this working by doing the following:
# from within the root of the rapidcheck repo:
$ cmake -DBUILD_SHARED_LIBS=true -G "Unix Makefiles" -DCMAKE_BUILD_TYPE=Debug .
# Leave off the BUILD_SHARED_LIBS flag if you don't need an SO.
$ make
That built: librapidcheck.so and librapidcheck.a, which you can then copy / install as needed.
You'll also need the include directory with the headers for rapidcheck, but that's just in the source tree.
Add the include path to your compile commands using whatever build tool you want, and link with the compiled libraries (the .so and .a)
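For example, if your own project uses CMake, a fragment along these lines would wire it up (the paths and the my_tests target name are placeholders, not anything RapidCheck prescribes):
# hypothetical CMakeLists.txt fragment; adjust the paths to wherever you copied rapidcheck
include_directories(/path/to/rapidcheck/include)
link_directories(/path/to/rapidcheck/lib)
add_executable(my_tests test/my_tests.cpp)
target_link_libraries(my_tests rapidcheck)  # resolves to librapidcheck.so or librapidcheck.a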

Writing a makefile for a C++ program

Can someone help me write my first makefile?
I have helloworld.cpp in /path/to/main/code/helloworld.cpp.
In it I include a single header: #include "SFML/network.hpp".
The SFML library is installed in /path/to/SFML/Library.
It would be great if someone could help me get started with this. In the meantime, I am trying to read through the literature, but it's very overwhelming.
Although you may need to learn the traditional GNU Makefile syntax in order to work with other projects, I recommend using CMake for your own projects because it is simpler and more intuitive.
A CMake file for your example would go in a file called CMakeLists.txt in the root directory of your project (same as the source in this case) and look like this:
project(HelloWorld) # Name your project.
link_directories(/path/to/SFML) # Tell it where your libraries are (this must come before add_executable).
add_executable(helloworld helloworld.cpp) # Specify an executable to build.
target_link_libraries(helloworld sfml-network) # Tell it which library to link (network.hpp comes from sfml-network).
If you want to be able to #include the SFML headers without spelling out the full path every time, you can also add the header directory to the include path:
include_directories(/path/to/SFML/include) # Tell it where your headers are.
Further documentation on writing CMake files and running CMake is available on the website.
Lastly, CMake often gives a noisy but harmless message requesting that you put a line like this at the top of your CMakeLists.txt file:
cmake_minimum_required(VERSION 2.8) # Quell warnings.
Cheers.
This may be obvious on its own, but I just wanted to recommend genmake.pl; I used it to great effect for my school project. It is quite easy to use: it generates a skeleton Makefile for you, and you fill in things like the path to SFML.
If you're using ACE (the Adaptive Communication Environment), it comes with a very powerful makefile creation utility called MPC (Makefile Project Creator).
To compile a simple program and auto-generate GNU makefiles, a minimal example could be:
project(*my_project) : baseWorkspace {
  exename = my_project
  exeout  = ./
  Source_Files {
    main.cpp
  }
}
I've found that the beauty of using this tool is that it can inherit "base" workspaces, almost like polymorphism in C++. The libraries defined in the "base" are used in the current project, allowing the programmer to utilize various building blocks to easily manipulate and create projects rapidly.
I'm fairly confident that just the MPC portion of ACE can be downloaded and used via an RPM; I'll search around and see if I can find an appropriate example for you.

How to manage growing C++ project

I am wondering how I should manage a growing C++ project. Right now I am developing the project with NetBeans, which does the dirty work of generating makefiles. The project has become too big and I have decided to split it up into a few parts. What is the best way of doing this?
I am trying to use SCons as my build system. I have had some success with it, but I have to edit the build scripts every time I add or delete files. It's too dull.
So I need your advice.
P.S. By the way, how does a large project like Google Chrome manage this? Does everybody use some kind of IDE, with build scripts generated only for software distribution?
I also use NetBeans for C++ and compile with SCons. I use the jVi NetBeans plugin, which really works well.
For some reason the NetBeans Python plugin is no longer official, which I don't understand at all. You can still get it, though, and it really makes editing the SCons build scripts a nice experience. Even though NetBeans doesn't have a SCons plugin (yet?), you can still configure its build command to execute SCons.
As for having the IDE maintain the SCons scripts automatically, I don't do that either; I do it by hand. But it's not like I have to deal with this on a daily basis, so I don't see that it's that important, especially considering how easy to read the scripts are.
Here's a SCons build script that does the same as the CMake example in the other answer below:
env = Environment()
env.EnsurePythonVersion(2, 5)  # require at least Python 2.5
env.EnsureSConsVersion(2, 1)   # require at least SCons 2.1
# Build a shared library from the three sources, then a program that links against it.
libTarget = env.SharedLibrary(target = 'foo', source = ['a.cpp', 'b.cpp', 'c.cpp'])
env.Program(target = 'bar', source = ['bar.cpp', libTarget])
The SCons Glob() function is a nice option, but I tend to shy away from automatically building all the files in a directory. The same goes for listing sub-directories to be built. I've been burned enough times by this, and prefer explicitly specifying the files/dirs to be built.
In case you hear those rumors that SCons is slower than other alternatives, the SCons GoFastButton has some pointers that can help out.
Most large projects stick with a build system that automatically handles all the messy details for them. I'm a huge fan of CMake (which is what KDE uses for all their components), but SCons is another popular choice. My editor (KDevelop) supposedly handles CMake projects itself, but I still edit the build scripts myself because it's not that hard.
I'd recommend learning one tool really well and sticking with it (plenty of documentation is available for any tool you'll be interested in). Be sure you also look into version control if you haven't already (I have a soft spot for git, but Mercurial and Subversion are also very popular choices).
A simple CMake example:
cmake_minimum_required(VERSION 2.8)
project("My Awesome Project" CXX)
add_library(foo SHARED a.cpp b.cpp c.cpp) #we'll build an so file
add_executable(bar bar.cpp)
target_link_libraries(bar foo) #link bar to foo
This is obviously a trivial case, but it's very easy to manage and expand as needed.
"I am trying to use SCons as my build system. I have had some success with it, but I have to edit the build scripts every time I add or delete a file. It's too dull."
Depending on how your files are organized, you can use SCons' Glob() function to gather the source files into a list without having to name each file individually. For example, to build all C++ source files into an executable, you can do:
Program('program', Glob('*.cpp'))
You can do the same in CMake using its file(GLOB ...) command.
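A rough CMake equivalent (assuming the sources sit next to the CMakeLists.txt) might be:
file(GLOB PROGRAM_SOURCES "*.cpp")
add_executable(program ${PROGRAM_SOURCES})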
And, if you're using SCons, since it's Python you can write arbitrary Python code to make your source file lists.
You can also organize files into multiple folders and have subsidiary SConscript (or CMakeLists.txt) build files that the master build script can call.

How to compile a VC project using g++?

I have the source code of a VC++ project. Now I am using Linux.
I know how to compile a single .cpp file, but not a whole project. So how do I compile a VC project using g++?
A slight advantage of Makefiles would be possible integration with the autotools (cough); that might prove handy as a starting point for feature macros.[2]
The winemaker tool that ships with Wine is exceedingly helpful for fixing up a source tree that assumed case-insensitive file names so that it builds on a case-sensitive filesystem (it was intended mainly for building against winelib, but that is not required).
If you want to keep using Windows APIs for some parts of the code, you can consider compiling with winelib (and using winegcc, producing Win32 executables; I'm not sure whether this is what you want).
[2]: SCons is a very nice tool, though.
The first step would be to generate a Makefile from the .vcproj file.
There are (obviously) some tools for that:
http://www.codeproject.com/KB/cross-platform/sln2mak.aspx
There is no easy way to do it. As others have suggested, you can figure out how the build process works for this project (maybe by reading the build output in VS) and recreate that using your favorite Linux build tool (SCons, CMake, autotools, etc.). The alternative is to use a converter tool. Aside from the sln2mak tool mentioned in the other answer, there is also winemaker. The docs for winemaker have a lot of old information, like most Linux tool docs, but it can convert a .sln to a makefile. I am not sure about newer VS .sln files.

C++ Buildsystem with ability to compile dependencies beforehand

I'm in the middle of setting up a build environment for a C++ game project. Our main requirement is the ability to build not just our game code, but also its dependencies (Ogre3D, CEGUI, Boost, etc.). Furthermore, we would like to be able to build on Linux as well as on Windows, as our development team consists of members using different operating systems.
Ogre3D uses CMake as its build tool. This is why we based our project on CMake too so far. We can compile perfectly fine once all dependencies are set up manually on each team members system as CMake is able to find the libraries.
The question is whether there is a feasible way to get the dependencies set up automatically. As a Java developer I know of Maven, but what tools exist in the world of C++?
Update: Thanks for the nice answers and links. Over the next few days I will be trying out some of the tools to see what meets our requirements, starting with CMake. I've indeed had my share with autotools so far and as much as I like the documentation (the autobook is a very good read), I fear autotools are not meant to be used on Windows natively.
Some of you suggested letting an IDE handle the dependency management. We are individuals using every possible technology to code, from pure Vim to fully blown Eclipse CDT or Visual Studio. This is where CMake gives us some flexibility with its ability to generate native project files.
The latest CMake release, 2.8, includes the new ExternalProject module.
It lets you download or check out code, then configure and build it as part of your main build tree.
It also lets you declare dependencies between the external projects.
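A minimal sketch of how that might look for this project (the download URL, project names and install prefix are placeholders, not taken from the question):
cmake_minimum_required(VERSION 2.8)
project(GameSuperbuild)
include(ExternalProject)
# Build the dependency first, installing it into a local prefix inside the build tree.
ExternalProject_Add(ogre3d
    URL http://example.com/ogre3d-src.tar.gz   # placeholder download location
    CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${CMAKE_BINARY_DIR}/deps
)
# Then build the game itself against that prefix, after the dependency is done.
ExternalProject_Add(game
    SOURCE_DIR ${CMAKE_SOURCE_DIR}/game
    CMAKE_ARGS -DCMAKE_PREFIX_PATH=${CMAKE_BINARY_DIR}/deps
    DEPENDS ogre3d
)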
At my work (a medical image processing group) we use CMake to build all our own libraries and applications. We have an in-house tool to track all the dependencies between projects (defined in an XML database). Most of the third-party libraries (like Boost, Qt, VTK, ITK, etc.) are built once for each system we support (MSWin32, MSWin64, Linux32, etc.) and are committed as zip files to the version control system. CMake will then extract and configure the correct zip file depending on which system the developer is working on.
I have been using GNU Autotools (Autoconf, Automake, Libtool) for the past couple of months in several projects that I have been involved in and I think it works beautifully. Truth be told it does take a little bit to get used to the syntax, but I have used it successfully on a project that requires the distribution of python scripts, C libraries, and a C++ application. I'll give you some links that helped me out when I first asked a similar question on here.
The GNU Autotools Page provides the best documentation on the system as a whole but it is quite verbose.
Wikipedia has a page which explains how everything works. Autoconf configures the project based upon the platform that you are about to compile on, Automake builds the Makefiles for your project, and Libtool handles libraries.
A Makefile.am example and a configure.ac example should help you get started.
Some more links:
http://www.lrde.epita.fr/~adl/autotools.html
http://www.developingprogrammers.com/index.php/2006/01/05/autotools-tutorial/
http://sources.redhat.com/autobook/
One thing that I am not certain about is Windows support for GNU Autotools. I know you are able to use it inside of Cygwin, but as for actually distributing files and dependencies on Windows platforms, you are probably better off using a Windows MSI installer (or something that can package your project inside of Visual Studio).
If you want to distribute dependencies you can set them up under a different subdirectory, for example, libzip, with a specific Makefile.am entry which will build that library. When you perform a make install the library will be installed to the lib folder that the configure script determined it should use.
Good luck!
There are several interesting make replacements that automatically track implicit dependencies (from header files), are cross-platform and can cope with generated files (e.g. shader definitions). Two examples I used to work with are SCons and Jam/BJam.
I don't know of a cross-platform way of getting *make to automatically track dependencies.
The best you can do is use some script that scans source files (or have the C++ compiler do that) and finds the #includes (conditional compilation makes this tricky), then generates part of the makefile from that.
But you'd need to call this script whenever something might have changed.
"The question is whether there is a feasible way to get the dependencies set up automatically."
What do you mean by "set up"?
As you said, CMake will compile everything once the dependencies are on the machines. Are you just looking for a way to package up the dependency source? Once all the source is there, CMake and a build tool (gcc, nmake, MSVS, etc.) are all you need.
Edit: As a side note, CMake has the file command, which can be used to download files if they are needed: file(DOWNLOAD url file [TIMEOUT timeout] [STATUS status] [LOG log])
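A hypothetical use of it (the URL and file names are placeholders) would be:
file(DOWNLOAD http://example.com/boost_1_45_0.tar.gz
     ${CMAKE_BINARY_DIR}/boost_1_45_0.tar.gz
     STATUS download_status)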
Edit 2: CPack is another tool from the CMake folks that can be used to package up files and such for distribution on various platforms. It can create NSIS installers for Windows and .deb or .tgz files for *nix.
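As a rough sketch of how that is typically wired in (the mygame target name and the generator choice are illustrative only):
install(TARGETS mygame DESTINATION bin)  # assumes your project defines a target named mygame
set(CPACK_GENERATOR "TGZ")               # e.g. "NSIS" on Windows, "DEB" for Debian packages
include(CPack)                           # after this, run 'cpack' in the build directory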
At my place of work (we build embedded systems for power protection) we used CMake to solve the problem. Our setup allows cmake to be run from various locations.
/
    CMakeLists.txt              "install precompiled dependencies and build project"
    project/
        CMakeLists.txt          "build the project, managing dependencies of subsystems"
        subsystem1/
            CMakeLists.txt      "build subsystem 1, assuming dependencies are already met"
        subsystem2/
            CMakeLists.txt      "build subsystem 2, assuming dependencies are already met"
The trick is to make sure that each CMakeLists.txt file can be called in isolation but that the top level file can still build everything correctly. Technically we don't need the sub CMakeLists.txt files but it makes the developers happy. It would be an absolute pain if we all had to edit one monolithic build file at the root of the project.
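A sketch of what the middle-level file might look like under that layout (the project name here is illustrative, not the poster's actual file):
# project/CMakeLists.txt - builds the whole project out of its subsystems
cmake_minimum_required(VERSION 2.8)
project(WholeSystem)
add_subdirectory(subsystem1)
add_subdirectory(subsystem2)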
I did not set up the system (I helped, but it is not my baby). The author said that the Boost CMake build system had some really good stuff in it that helped him get the whole thing building smoothly.
On many *nix systems, some kind of package manager or build system is used for this. The most common one for building from source is GNU Autotools, which I've heard is a source of extreme grief. However, with a few scripts and an online repository for your deps, you can set up something similar, like so:
In your project Makefile, create a target (optionally with subtargets) that covers your dependencies.
Within the target for each dependency, first check whether the dep's source is already in the project (on *nix you can use touch for this, but you could be more thorough).
If the dep is not there, you can use curl, etc., to download it.
In all cases, have the dep targets make a recursive make call (make; make install; make clean; etc) to the Makefile (or other configure script/build file) of the dependency. If the dep is already built and installed, make will return fairly promptly.
There are going to be lots of corner cases that will cause this to break though, depending on the installers for each dep (perhaps the installer is interactive?), but this approach should cover the general idea.
Right now I'm working on a tool that can automatically install all dependencies of a C/C++ app, with exact version requirements:
compiler
libs
tools (cmake, autotools)
Right now it works for my app (installing UnitTest++, Boost, Wt, SQLite, and CMake, all in the correct order).
The tool, named "C++ Version Manager" (inspired by the excellent Ruby Version Manager), is coded in bash and hosted on GitHub: https://github.com/Offirmo/cvm
Any advice and suggestions are welcome.