Using Boost on Windows (Visual Studio)

I want to get started using Boost. I'm programming a C++ program in Visual Studio (obviously on a Windows machine).
Boost's Getting Started Guide says:
The easiest way to get a copy of Boost is to use an installer. The Boost website version of this Getting Started guide will have updated information on installers as they become available, or see Boost downloads or the installer provided by BoostPro Computing. We especially recommend using an installer if you use Microsoft Visual Studio, because the installer can download and install precompiled library binaries, saving you the trouble of building them yourself.
I'm a little unsure if I want to follow this advice, or just download and build everything myself. Potential problems that I see with an installer are:
Things are no longer self-contained (i.e. every team member has to install Boost, then configure Visual Studio to recognize it).
I can't keep Boost under source control (I would ideally like it to be source files in my source control like everything else). (Edit: Judging by the comments, it looks like Boost is kinda large (as in 5 GB!), so obviously I'd need to keep only parts of it under source control).
So my question is, am I just being paranoid and should go the installer route, or am I correct and should build it myself? If anyone has experience working with Boost and Visual Studio, I'd appreciate it if they could share their views (and if the answer is to build it myself, any tips would also be appreciated; for example, should I only copy the files that I actually use?).
Note:
A few similar questions on Stack Overflow that didn't ask this explicitly make me think that I shouldn't use the installer, which is why I'm asking it explicitly here. For reference, these are the questions:
Boost linking, Visual Studio & version control
Including Relevant Boost Libraries with C++ Source (Using Visual Studio)

A good way to make sure everyone has everything configured properly is to use svn externals. You can create something like /trunk/boost1.35 and then you can point to that with an svn external.
That way, as new versions of Boost come out, you can just repoint your svn external to /trunk/boost1.40.
Within your own repository, the svn external then points at that folder, for example /depends/boost.
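For example, assuming your repository lives at https://svn.example.com/repo (a made-up URL), you could pin the external like this:

svn propset svn:externals "boost https://svn.example.com/repo/trunk/boost1.35" depends
svn commit depends -m "pin Boost 1.35 as an external"

After that, a plain svn update pulls the pinned Boost into depends/boost for everyone.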
We personally keep the Boost header files under source control as described, but keep the libs as a zip that we require everyone to download. We have an environment variable, something like BOOST_LIB, that we point at the current Boost library directory.

I recommend using the installer.
Building it yourself is not hard. Here is the procedure:
Download boost into C:\Program Files\boost\boost_1_40_0
Open the command prompt and change your current directory to the Boost root directory
bootstrap
.\bjam
The library binaries are now sprinkled through the folders under
C:\Program Files\boost\boost_1_40_0\bin.v2
Find the required libraries and copy them to
C:\Program Files\boost\boost_1_40_0\lib
(Do not confuse the folders called lib and libs!)
However, this is slow enough and just complicated enough, especially the last step, that you and everyone else will probably screw something up once in a while, resulting in a waste of many hours sorting out mysterious build errors - this is my experience anyway.

I have built Boost under Windows. Its "bjam" installation tool auto-detects MSVC and uses it for compiling; I wouldn't have any reservations about building it yourself. It's only marginally more difficult than "./configure && make && make install", really.
Building it yourself could even be necessary, because the Boost libs available online do not include ICU (Unicode) support, e.g. for the boost_regex lib.
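If you do need ICU support, you point bjam at your ICU installation when building regex; something like this (the ICU path here is just an example):

bjam toolset=msvc --with-regex -sICU_PATH="C:\icu" stage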

Things are no longer self-contained (i.e. every team member has to install Boost, then configure Visual Studio to recognize it).
I can't keep Boost under source control (I would ideally like it to be source files in my source control like everything else).
Before you put Boost under source control, keep in mind that the compiled libraries take up several gigabytes (my Boost folder is around 5 GB). It might be worth just letting everyone install Boost for themselves.
Apart from this, the installer should work fine, but compiling it yourself is really trivial as well.
Boost installs into version-specific folders by default (both if you compile it yourself and if you use the installer), so it's easy enough to have multiple versions installed side by side. So if your team upgrades Boost to a new version, you could simply change the include path in the .sln or .vsprops files to make the compiler search for the new version; if a coworker hasn't installed the right version, he just won't be able to build (which might be preferable to silently building with an old version).
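One way to keep that path in a single place is a shared property sheet that every project in the solution inherits. A minimal sketch for VS2005/2008 (the paths and sheet name are just examples):

<?xml version="1.0" encoding="Windows-1252"?>
<VisualStudioPropertySheet
	ProjectType="Visual C++"
	Version="8.00"
	Name="boost"
	>
	<Tool
		Name="VCCLCompilerTool"
		AdditionalIncludeDirectories="C:\Program Files\boost\boost_1_40_0"
	/>
	<Tool
		Name="VCLinkerTool"
		AdditionalLibraryDirectories="C:\Program Files\boost\boost_1_40_0\lib"
	/>
</VisualStudioPropertySheet>

Upgrading Boost then means editing this one file instead of every project.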

One other thing to consider is whether you need all of Boost or only part of it. What we do here is put the source in version control and create a single wrapper project for the libraries that we actually want to use. The individual libraries are written cleanly enough that it's a matter of just dropping all the .cpp files into a new Visual Studio project. You might need to set up the top-level configuration header (I think I set this as a forced include), and the whole thing built very easily. Add this project as a dependency in your solution, and it means you can keep all the binaries out of the SCM and also ensure that everyone is always up to date.
Much of Boost is header-only anyway, so you may find that there's only a handful of libraries that you'd want to build. This approach makes it easier for you to match your VS project settings too.
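The forced-include configuration header mentioned above can be tiny. A sketch of what it might contain (my assumption of the contents, but BOOST_ALL_NO_LIB is the documented Boost macro for this):

// boost_config.hpp - forced into every translation unit (e.g. via /FI in MSVC).
// Disables Boost's MSVC auto-linking pragmas, since the wrapper project
// compiles the library sources itself instead of linking prebuilt boost_*.lib files.
#define BOOST_ALL_NO_LIB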

We actually create our own installer, with just the parts of Boost that we use in our jobs, and give that out to the IT folks to install on developer machines. We also keep that copy of boost in revision control, so we can track dependencies between it and the rest of our system properly, and build it ourselves.
I suppose work-wise this is the worst of both worlds. But it does give us the maximum control.

Several reasons NOT to keep it under source control:
Boost is huge.
Compilation is non-trivial (esp. for several configurations)
Compilation is long (you don't want every developer to do it)
I personally would not bother building it myself; on Linux, for instance, I always use the distribution-provided package.

I'd use the installer, unless you need to customize the build flags. It's much easier, and building boost (at least the last time I did it) wasn't the clearest process. There's nothing stopping you from downloading the source that matches the version of boost that the installer gives you, and putting that in version control. This is the approach I've used in the past for other libraries (nss, iplanet sdk) and it's worked well.

I would recommend running bootstrap.bat first (it builds bjam.exe), and then:
bjam --stagedir="c:\Program Files\Boost" --build-type=complete --toolset=msvc-9.0 --with-regex --with-date_time --with-thread --with-signals --with-system --with-filesystem --with-program_options stage
bjam --stagedir="c:\Program Files\Boost" --build-type=complete --toolset=msvc-10.0 --with-regex --with-date_time --with-thread --with-signals --with-system --with-filesystem --with-program_options stage
..
You just need to specify the correct toolset. It will put all the binaries into the lib folder under the stage directory (c:\Program Files\Boost\lib in this example).

I'd say, just make the Boost installation a prerequisite for your project. The manual installation takes just a few minutes, with a small number of one-time steps. Most large, complex projects eventually end up depending on Boost, so it's not unusual to make it a prerequisite. Of course, it's trivial to automate. The advantages are:
You don't add the giant Boost distribution to your repo
You don't have to cherry-pick what you use
Other projects can share the installation
One-time setup also takes care of building header+cpp libraries
For Visual Studio 2015 and the latest Boost version, here are the step-by-step instructions we follow for our team:
https://stackoverflow.com/a/39628306/207661

Related

How do you package GCC for distribution?

I am making a modified C++ compiler and I have it built and tested locally. However, I would like to be able to package my build for Windows, Linux (Debian), and Mac OS X.
All of the instructions I can find online deal with building GCC but have no regard for making something distributable (or perhaps I am missing something?). I know for Windows I will need to bundle MinGW somehow, but that only further confuses me, and I have no idea how well Mac works with GCC these days.
Can anyone layout a set of discrete high-level steps I could try on each system so I can allow people to install my modified compiler easily?
First make sure your project installs cleanly, including executables, headers, and runtime dependencies. If you're using something like cmake, this is a matter of installing things to CMAKE_INSTALL_PREFIX, possibly using GNUInstallDirs for the directory layout. If you are using make, then you need to ensure that make install PREFIX=... works well.
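For example, with cmake the install rules might look like this (the project and file names are hypothetical):

cmake_minimum_required(VERSION 3.0)
project(mycc CXX)
include(GNUInstallDirs)

add_executable(mycc src/main.cpp)

# Install into the platform's conventional bin/ and include/ locations
install(TARGETS mycc RUNTIME DESTINATION ${CMAKE_INSTALL_BINDIR})
install(DIRECTORY include/ DESTINATION ${CMAKE_INSTALL_INCLUDEDIR})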
From here, you can target each platform independently. Treat the packaging independently from your project. Like Chipster mentioned, making rpm files isn't so tough. .deb files for Debian-based OSs and tar.xz files for Arch-based OSs are similar. The rules for creating these packages can use your install rules to create the package. You mentioned MinGW. If you're targeting an msys distribution of MinGW for Windows deployment, then the Arch-based packaging of pacman will work on msys as well. You can slowly work on supporting one platform at a time with almost no changes to your actual project.
Typically in the open-source world, people will release a tar.gz file supporting ./configure && make && make install or similar. Then someone associated with the platform (like a Debian developer) will find your project, write some packaging rules for it, and release it into their distribution. That means your project can be totally agnostic to where it's being released. It also means you don't really need to worry about MacOS yet; you can wait until you have someone who wants it there, or some hardware to test it on.
If you really want to be in control of how things are packaged for each platform from inside of your project, and you are already using cmake, cpack is a great tool which helps out. After writing cpack rules for your project, you can simply type cpack to generate many types of deployable archives. You won't get the resulting *.deb file into the Debian or Ubuntu official archives, but at least people using those formats can install your package.
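A sketch of such rules, appended to the CMakeLists.txt above (all values are placeholders):

set(CPACK_PACKAGE_NAME "mycc")
set(CPACK_PACKAGE_VERSION "1.0.0")
set(CPACK_GENERATOR "TGZ;DEB")
set(CPACK_DEBIAN_PACKAGE_MAINTAINER "you@example.com")  # the DEB generator requires a maintainer
include(CPack)

Running cpack in the build directory then produces both a .tar.gz and a .deb.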
Also, consider releasing one package with the runtime libraries, and one with the development content (headers, compiler, static libraries). This way, if someone uses your compiler, they can re-distribute the runtime libraries which is probably going to be a much simpler install.

How to build and install google-url as a shared library on Mac OS/Linux

I want to use google-url in my project as a shared library on Linux/Mac OS, but I cannot figure out the right way to build it...
Question: what is the way you suggest to build it from scratch from the official sources?
Requirements: be able to stay in sync with the official repo and use standard (make) tools.
As far as I can see, right now there are a few ways to build it:
the official repo itself includes only Visual Studio 2005 build files
it is used in Chromium, so there is a .gyp file available for it, but that looks tightly integrated with the Chromium build structure, so there is no easy way to generate a Makefile for a standalone library build.
Although it has a comment inside "TODO(mark): Upstream this file to googleurl."
So at least this is considered to be possible.
Googleurl is also integrated with the PageSpeed project in .gyp form (though not the same one as above), and so it is somehow built there
third-party bindings for Python are available and also contain some build instructions, but with SCons this time, and AFAIK that is a rather obsolete system to rely on.
It looks like I'm not the only one with this trouble; the other people I found both just implemented their own build files using autotools:
https://github.com/artemg/Googleurl-separate-library
https://github.com/commoncrawl/commoncrawl-crawler/blob/master/src/native/src/libGoogleURL/googleurl/README.google
That could work, but the filesystem layout is not the same as in the official repo, and they have local modifications, so there is no easy way to downstream changes and stay in sync.
The most tempting way would be to use GYP to generate platform-specific build files (make/Xcode/Visual Studio) for the official repo once, then just save and use them later as needed... but I have no idea how to approach this or where to start.

How to generate vcproj files?

Suppose I've got a cross-platform C++ library, let's call it ylib. My primary development platform is Linux. I'd like ylib to be buildable by MSVC (2008, 2010, 11, etc).
What's the best way to do this? Do I generate .vcproj files? If so, how? Ideally I'd be able to generate them without using Windows myself. No extra dependencies should be required for Windows.
Multiple variants should be built: debug DLL, debug lib, release DLL, release lib with dynamic runtime, and release lib with static runtime.
You could use cmake for your build scripts. CMake has the ability to generate Visual Studio project files from the cmake build scripts, so you'd just need to distribute your cmake files; individual people using Windows could then generate MSVC project files from them.
Though as pointed out in the comments, it'd be difficult to guarantee that you could actually build your project under Visual Studio without trying it out yourself.
EDIT: Though I've just realized that you requested no extra dependencies for Windows, which this would not solve. Unless you generated the vcproj files yourself using cmake, and distributed them. But personally I think it'd be better to just have the cmake dependency. It's freely available, and easy to install.
This also has the advantage of supporting whatever version of Visual Studio your end user happens to have, without the need for distributing several different formats.
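Concretely, a Windows user would just run something like this in a checkout (the exact generator name depends on their CMake and VS versions):

cmake -G "Visual Studio 10" path\to\ylib
cmake --build . --config Release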
You just need to understand the format of vcproj files and then write them - they are simply XML.
I don't know how well MSFT documents the settings (not very well, if history is a guide), so for simple projects I would just create a project in MSVC and look at what it writes.
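For reference, a stripped-down .vcproj (VS2008 format) looks roughly like this; the names and GUID are placeholders, and IDE-generated files carry many more attributes:

<?xml version="1.0" encoding="Windows-1252"?>
<VisualStudioProject
	ProjectType="Visual C++"
	Version="9.00"
	Name="ylib"
	ProjectGUID="{00000000-0000-0000-0000-000000000000}"
	>
	<Platforms>
		<Platform Name="Win32"/>
	</Platforms>
	<Configurations>
		<!-- ConfigurationType: 1 = .exe, 2 = .dll, 4 = static .lib -->
		<Configuration
			Name="Debug|Win32"
			OutputDirectory="Debug"
			IntermediateDirectory="Debug"
			ConfigurationType="4"
			>
			<Tool Name="VCCLCompilerTool" Optimization="0" RuntimeLibrary="3"/>
		</Configuration>
	</Configurations>
	<Files>
		<File RelativePath=".\src\ylib.cpp"/>
	</Files>
</VisualStudioProject>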
Alternatively, you could just use cmake, which already does this.

What's the quickest and easiest way of getting libpng available for development in Visual Studio 2008?

I have some C++ code which uses Boost's GIL image library and wants to write files using boost::gil::png_write_view from boost/gil/extension/io/png_io.hpp. That header itself includes png.h, and of course results in a link dependency.
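For concreteness, the kind of code in question looks something like this (the image size and filename are invented):

#include <boost/gil/image.hpp>
#include <boost/gil/typedefs.hpp>
#include <boost/gil/extension/io/png_io.hpp>  // includes png.h, hence the libpng dependency

int main() {
    boost::gil::rgb8_image_t img(64, 64);  // a blank RGB image
    boost::gil::png_write_view("out.png", boost::gil::const_view(img));  // needs libpng (and zlib) at link time
    return 0;
}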
On Debian it compiles and links fine. If it did complain about anything missing, the necessary headers and libs would be a few seconds away via an aptitude install libpng-dev.
On Windows (Visual Studio 2008 on XP64), I'm facing the fact that it looks like I'll have to build libpng from source (and so also its zlib dependency) myself. If there's an obvious packaging already out there, I'm not seeing it. Can anyone enlighten me if there is such a useful resource anywhere?
Libpng's own packaging seems to supply project files for VC6 and VC7.1 (VS2003), and more recent releases also come with VC10 project files, but nothing for VC8 (VS2005) or VC9 (VS2008). However, there are instructions here (which I've yet to try) which describe building for 2008 after running the 7.1 project files through the conversion wizard.
I did initially try the GnuWin32 build of libpng, but (apart from being 32-bit only) it crashed in a libpng call to fwrite when passed a FILE* from VS2008's CRT. libpng's own documentation has something to say about this and the perils of mixing different versions of MSVC, but their suggested workaround is only relevant to direct libpng users (and I'm using it via Boost GIL).
Update: the converted project files do work pretty well (maybe with some minor obvious fixups). By far the biggest part of the job was creating the 64-bit builds; the original and converted project files don't include any such configuration, and while Visual Studio will have a go at creating them, there was a fair amount of config-dialog editing to get consistent folder/file names etc.

C++ Buildsystem with ability to compile dependencies beforehand

I'm in the middle of setting up a build environment for a C++ game project. Our main requirement is the ability to build not just our game code, but also its dependencies (Ogre3D, CEGUI, Boost, etc.). Furthermore, we would like to be able to build on Linux as well as on Windows, as our development team consists of members using different operating systems.
Ogre3D uses CMake as its build tool. This is why we have based our project on CMake too so far. We can compile perfectly fine once all dependencies are set up manually on each team member's system, as CMake is able to find the libraries.
The question is whether there is a feasible way to get the dependencies set up automatically. As a Java developer I know of Maven, but what tools exist in the world of C++?
Update: Thanks for the nice answers and links. Over the next few days I will be trying out some of the tools to see what meets our requirements, starting with CMake. I've indeed had my share with autotools so far and as much as I like the documentation (the autobook is a very good read), I fear autotools are not meant to be used on Windows natively.
Some of you suggested letting an IDE handle the dependency management. We consist of individuals using all possible technologies to code, from pure Vim to fully blown Eclipse CDT or Visual Studio. This is where CMake allows us some flexibility, with its ability to generate native project files.
In the latest CMake 2.8 version there is the new ExternalProject module.
This allows you to download/check out code and to configure and build it as part of your main build tree. It should also allow you to set dependencies.
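A minimal sketch (the dependency name, URL, and game target are invented):

include(ExternalProject)
ExternalProject_Add(ogre3d
	URL http://example.com/ogre3d-src.tar.gz                         # fetch the dependency's sources
	PREFIX ${CMAKE_BINARY_DIR}/deps/ogre3d                           # unpack and build it here
	CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${CMAKE_BINARY_DIR}/deps/install
)
add_dependencies(game ogre3d)  # assuming a 'game' target: build the dependency first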
At my work (medical image processing group) we use CMake to build all our own libraries and applications. We have an in-house tool to track all the dependencies between projects (defined in an XML database). Most of the third-party libraries (like Boost, Qt, VTK, ITK, etc.) are built once for each system we support (MSWin32, MSWin64, Linux32, etc.) and are committed as zip files to the version control system. CMake will then extract and configure the correct zip file depending on which system the developer is working on.
I have been using GNU Autotools (Autoconf, Automake, Libtool) for the past couple of months in several projects that I have been involved in and I think it works beautifully. Truth be told it does take a little bit to get used to the syntax, but I have used it successfully on a project that requires the distribution of python scripts, C libraries, and a C++ application. I'll give you some links that helped me out when I first asked a similar question on here.
The GNU Autotools Page provides the best documentation on the system as a whole but it is quite verbose.
Wikipedia has a page which explains how everything works. Autoconf configures the project based upon the platform that you are about to compile on, Automake builds the Makefiles for your project, and Libtool handles libraries.
A Makefile.am example and a configure.ac example should help you get started.
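To give a flavor, a minimal pair for a hypothetical C++ application looks like this:

# configure.ac
AC_INIT([mygame], [0.1])
AM_INIT_AUTOMAKE([foreign])
AC_PROG_CXX
AC_CONFIG_FILES([Makefile])
AC_OUTPUT

# Makefile.am
bin_PROGRAMS = mygame
mygame_SOURCES = main.cpp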
Some more links:
http://www.lrde.epita.fr/~adl/autotools.html
http://www.developingprogrammers.com/index.php/2006/01/05/autotools-tutorial/
http://sources.redhat.com/autobook/
One thing that I am not certain on is any type of Windows wrapper for GNU Autotools. I know you are able to use it inside of Cygwin, but as for actually distributing files and dependencies on Windows platforms you are probably better off using a Windows MSI installer (or something that can package your project inside of Visual Studio).
If you want to distribute dependencies you can set them up under a different subdirectory, for example, libzip, with a specific Makefile.am entry which will build that library. When you perform a make install the library will be installed to the lib folder that the configure script determined it should use.
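In Automake terms that is just a SUBDIRS entry in the top-level Makefile.am, so the bundled library (libzip in the example above) is built before your own code:

# top-level Makefile.am
SUBDIRS = libzip src

(Each listed directory then needs its own Makefile.am, registered in configure.ac via AC_CONFIG_FILES.)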
Good luck!
There are several interesting make replacements that automatically track implicit dependencies (from header files), are cross-platform and can cope with generated files (e.g. shader definitions). Two examples I used to work with are SCons and Jam/BJam.
I don't know of a cross-platform way of getting *make to automatically track dependencies.
The best you can do is use some script that scans source files (or have the C++ compiler do that) and finds #includes (conditional compilation makes this tricky) and generates part of the makefile.
But you'd need to call this script whenever something might have changed.
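With GCC you can have the compiler itself do the scanning. A minimal GNU make sketch (the file names are mine; recipe lines must be tab-indented):

SRCS := main.cpp util.cpp

# Ask the compiler to emit a .d fragment listing each file's #include dependencies
%.d: %.cpp
	$(CXX) -MM -MT '$(@:.d=.o)' $< > $@

# Pull the generated fragments into the makefile; make regenerates them as needed
-include $(SRCS:.cpp=.d)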
The question is whether there is a feasible way to get the dependencies set up automatically.
What do you mean set up?
As you said, CMake will compile everything once the dependencies are on the machines. Are you just looking for a way to package up the dependency source? Once all the source is there, CMake and a build tool (gcc, nmake, MSVS, etc.) are all you need.
Edit: Side note, CMake has the file command which can be used to download files if they are needed: file(DOWNLOAD url file [TIMEOUT timeout] [STATUS status] [LOG log])
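For example (the URL is invented):

file(DOWNLOAD
	http://example.com/deps/boost.tar.gz
	${CMAKE_BINARY_DIR}/boost.tar.gz
	TIMEOUT 600
	STATUS dl_status)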
Edit 2: CPack is another tool by the CMake guys that can be used to package up files and such for distribution on various platforms. It can create NSIS installers for Windows and .deb or .tgz files for *nix.
At my place of work (we build embedded systems for power protection) we used CMake to solve the problem. Our setup allows cmake to be run from various locations.
/
    CMakeLists.txt              "install precompiled dependencies and build project"
    project/
        CMakeLists.txt          "build the project, managing dependencies of subsystems"
        subsystem1/
            CMakeLists.txt      "build subsystem 1, assume dependencies are already met"
        subsystem2/
            CMakeLists.txt      "build subsystem 2, assume dependencies are already met"
The trick is to make sure that each CMakeLists.txt file can be called in isolation but that the top level file can still build everything correctly. Technically we don't need the sub CMakeLists.txt files but it makes the developers happy. It would be an absolute pain if we all had to edit one monolithic build file at the root of the project.
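The middle layer is then little more than this (names invented):

# project/CMakeLists.txt
cmake_minimum_required(VERSION 2.8)
project(mygame CXX)
add_subdirectory(subsystem1)  # each subdirectory also builds in isolation
add_subdirectory(subsystem2)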
I did not set up the system (I helped, but it is not my baby). The author said that the Boost CMake build system had some really good stuff in it that helped him get the whole thing building smoothly.
On many *nix systems, some kind of package manager or build system is used for this. The most common one for source stuff is GNU Autotools, which I've heard is a source of extreme grief. However, with a few scripts and an online repository for your deps you can set up something similar, like so:
In your project Makefile, create a target (optionally with subtargets) that covers your dependencies.
Within the target for each dependency, first check to see if the dep source is in the project (on *nix you can use touch for this, but you could be more thorough)
If the dep is not there, you can use curl, etc to download the dep
In all cases, have the dep targets make a recursive make call (make; make install; make clean; etc) to the Makefile (or other configure script/build file) of the dependency. If the dep is already built and installed, make will return fairly promptly.
There are going to be lots of corner cases that will cause this to break though, depending on the installers for each dep (perhaps the installer is interactive?), but this approach should cover the general idea.
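A rough sketch of such a dependency target (the library name, URL, and install steps are all invented; recipe lines are tab-indented):

DEPS_DIR := third_party

.PHONY: deps
deps: $(DEPS_DIR)/libfoo
	cd $(DEPS_DIR)/libfoo && ./configure && $(MAKE) && $(MAKE) install

$(DEPS_DIR)/libfoo:
	mkdir -p $(DEPS_DIR)
	curl -L http://example.com/libfoo-1.0.tar.gz | tar xz -C $(DEPS_DIR)
	mv $(DEPS_DIR)/libfoo-1.0 $(DEPS_DIR)/libfoo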
Right now I'm working on a tool able to automatically install all dependencies of a C/C++ app, with exact version requirements:
compiler
libs
tools (cmake, autotools)
Right now it works for my app (installing UnitTest++, Boost, Wt, sqlite, and cmake, all in the correct order).
The tool, named «C++ Version Manager» (inspired by the excellent Ruby Version Manager), is coded in bash and hosted on GitHub: https://github.com/Offirmo/cvm
Any advice and suggestions are welcome.