Using a pre-built Boost package - C++

I want to use Boost.Thread library in a C++ software that I'm developing.
Since I'm a complete newbie in C++, I would like to know if there are any differences between:
Downloading and building Boost manually for MSVC9-x64
Using a pre-built package: http://boost.teeks99.com/
Option 1 seems so painful...

As far as I can tell, these are the default builds and not from patched or tweaked sources or anything.
If you take care to use the correct version of binaries for your application target and version of VC (including service packs) and link against the correct libraries (shared, static CRT, debug, etc.) you should be absolutely fine.
Also, since it seems that these packages don't contain Boost headers, you must take care to get and use the correct version of headers.
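As a small illustration of that header/binary version check (a sketch; the 105500 threshold below is only a placeholder for whatever release your binaries were built from): Boost's <boost/version.hpp> defines BOOST_VERSION and BOOST_LIB_VERSION, so you can fail the build early if the headers don't match.

// version_check.cpp -- compile-time sanity check that the expected Boost headers are in use
#include <boost/version.hpp>
#include <iostream>

#if BOOST_VERSION < 105500   // 105500 == 1.55.0; placeholder value
#error "These sources expect the Boost headers matching the pre-built 1.55 binaries"
#endif

int main() {
    // BOOST_LIB_VERSION is a short string such as "1_55"; printing it confirms things at run time too.
    std::cout << "Built against Boost " << BOOST_LIB_VERSION << "\n";
    return 0;
}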
Concerning the "differences": Boost is a large and complex collection of libraries. For software of that size it has a quite simple and straightforward build process, but there are still many options and customizations available when building Boost (e.g. you can build Boost.Iostreams with or without zlib and bzip2 support, build Boost.Regex with or without Unicode support, build Boost.Python against different versions of Python, and much more). When you build Boost yourself, you have control over these options.
The defaults work for most people, but some people may need certain customizations. You might want a specific version of a specific optional dependency, or a certain library built a certain way. For that, you probably will need to build Boost yourself and maintain the build throughout your project. This is not a scary task!
If you don't have any special requirements, then a generic build would most probably be fine for you.
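To make the optional-dependency point concrete, here is a hedged sketch of the Boost.Iostreams case mentioned above: the snippet compiles against the headers either way, but it only links and runs if the Boost.Iostreams binary was built with zlib support, which is exactly the kind of option a generic pre-built package may or may not have enabled (the data.gz file name is just an example).

// gzip_example.cpp -- requires a zlib-enabled build of Boost.Iostreams
#include <boost/iostreams/filtering_stream.hpp>
#include <boost/iostreams/filter/gzip.hpp>
#include <fstream>
#include <iostream>
#include <string>

int main() {
    std::ifstream file("data.gz", std::ios_base::binary);    // example input file
    boost::iostreams::filtering_istream in;
    in.push(boost::iostreams::gzip_decompressor());          // only available when zlib support was built in
    in.push(file);
    std::string line;
    while (std::getline(in, line))
        std::cout << line << "\n";
    return 0;
}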

How do I properly integrate a modular boost build into my CMake project using FetchContent?

CMake FetchContent is a great way to manage build dependencies, by integrating the dependency into your build and building it from source along with your own sources.
I would like to do this with Boost, too. I am encouraged by the fact that CMake support for Boost is steadily improving.
Of course, since Boost is a large package and one rarely uses all Boost libraries in a project, it would be rather wasteful to pull the entire Boost sources into one's own build. Given that the Boost project is modular and organized as git submodules, it would be much more intelligent and efficient to fetch only the sources for the libraries actually used, and FetchContent supports this via its GIT_SUBMODULES option.
However, this doesn't seem to cater for dependencies between the Boost libraries. Do I need to handle this manually, or is there a more intelligent solution?
Furthermore, how do I control installation in this scenario? Many Boost libraries are header-only, and I don't want the headers included in my installation, since I'm only using them for my build. Is there a way to tell CMake that I don't want anything installed from what I fetch with FetchContent?
I have read Craig Scott's book, and of course the CMake documentation, but there's not much information about this sort of problem.
But maybe I'm trying something that I shouldn't be doing. Have others tried this, and can show me how it's done properly?

What is the theoretical reason for C++ dependency production not being automated?

C++ Buildsystem with ability to compile dependencies beforehand
Java has Maven, which is a pleasure to work with: you simply specify dependencies, which are already compiled and deposited in Maven's standard directory. In other words, the location of dependencies is standardized, as opposed to the usual C/C++ situation of multiple possible locations (give me a break, like anyone remembers the default install directories for particular deps).
It is massively unproductive for every individual developer to have to find, read about, get familiar with the configure/build options of, and finally compile every dependency just to make a build of a project.
What is the theoretical reason this has not been implemented?
Why would it be difficult to provide packages of the following options with a maven-like declaration format?
version
platform (windows, linux)
src/dev/bin
shared/static
equivalent set of Boost ABI options when applicable
Having to manually go to websites and search out dependencies in the year 2013 for the oldest major programming language is absurd.
There aren't any theoretical reasons. There are a great many practical reasons. There are just too many different ways of handling things in the C++ world to easily standardize on a dependency system:
Implementation differences - C++ is a complicated language, and different implementations have historically varied in how well they support it (how well they can correctly handle various moderate to advanced C++ code). So there's no guarantee that a library could be built in a particular implementation.
Platform differences - Some platforms may not support exceptions. There are different implementations of the standard library, with various pros and cons. Unlike Java's standardized library, Windows and POSIX APIs can be quite different. The filesystem isn't even a part of Standard C++.
Compilation differences - Static or shared? Debug or production build? Enable optional dependencies or not? Unlike Java, which has very stable bytecode, C++'s lack of a standard ABI means that code may not link properly, even if built for the same platform by the same compiler.
Build system differences - Makefiles? (If so, GNU Make, or something else?) Autotools? CMake? Visual Studio project files? Something else?
Historical concerns - Because of C's and C++'s age, popular libraries like zlib predate build systems like Maven by quite a bit. Why should zlib switch to some hypothetical C++ build system when what it's doing works? How can a newer, higher-level library switch to some hypothetical build system if it depends on libraries like zlib?
An additional factor complicates things:
In Linux, the distro packaging systems do provide standardized repositories of development library headers and binaries, with (generally) standardized ABIs and an easy way of specifying a project's build dependencies. The existence of these (platform-specific) solutions reduces the impetus for a cross-platform solution.
With all of these complicating factors and pre-existing approaches, any attempt to establish a standard build system is going to run into the problem described in XKCD's "Standards":
Situation: There are 14 competing standards.
"14? Riculous! We need to develop one universal standard that covers everyone's use cases."
Soon: There are 15 competing standards.
With all of that said:
There is some hope for the future. For example, CMake seems to be gradually replacing other build systems. Some of the Boost developers have started Ryppl, an attempt to do what you're describing.
(also posted in linked question)
Right now I'm working on a tool that can automatically install all dependencies of a C/C++ app with exact version requirements:
compiler
libs
tools (cmake, autotools)
Right now it works for my app (installing UnitTest++, Boost, Wt, sqlite and cmake, all in the correct order).
The tool, named "C++ Version Manager" (inspired by the excellent Ruby Version Manager), is coded in bash and hosted on GitHub: https://github.com/Offirmo/cvm
Any advice and suggestions are welcome.
Well, first off, a system that resolves all the dependencies doesn't make you productive by default; potentially it can make you even less productive.
Regarding the differences between languages, I would say that in Java you have packages, which are handy when you have to organize your code and give it a limited horizon; in C++ you don't have an equivalent concept.
In C++, any library that resolves a symbol is good enough for the compiler; the only real requirements for a library are to have a certain ABI and to resolve the required symbols. There is no automated way to pick the "right" library, and resolving a symbol is just a matter of linking your call to some implementation, so even a correct linking phase doesn't guarantee that your app will work.
To this you can add important variables such as the library version, different implementations of the same library, and different libraries with the same method names.
An example is the Mesa library vs. the OpenGL lib from the official drivers, or any library that offers multiple releases: each release can resolve all the symbols, but one is probably more mature than the others, and you can't ask the compiler to pick the "right" one, because as far as it is concerned they are all the same.
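To make the symbol-resolution point concrete, here is a minimal sketch: the linker matches names (and, for C++, mangled signatures), not meanings, so any library that exports a matching symbol will satisfy it. zlib's crc32 is used here purely as a familiar example; building this requires linking against zlib (or anything else that happens to export a compatible crc32).

// link_example.cpp -- the declaration below could be satisfied by zlib (link with -lz)
// or by any other library that exports a compatible crc32 symbol; the linker cannot
// tell whether the implementation actually does what we expect.
#include <iostream>

extern "C" unsigned long crc32(unsigned long crc, const unsigned char* buf, unsigned int len);

int main() {
    unsigned long c = crc32(0, nullptr, 0);   // with zlib, this returns the initial CRC value
    std::cout << "initial crc: " << c << "\n";
    return 0;
}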

What is the preferred way to structure and build OCaml projects?

It is unclear to newcomers to the ecosystem what is the canonically preferred way to structure and manage building small to medium sized OCaml projects. I understand the basics of ocamlc, &c.--they mirror conventional UNIX C compilers enough to seem straightforward. But, above the level of one-off compilation of individual files, it is unclear how best to manage compilation simply and cleanly. The issue is not searching for potential tools, but seeing one or a few Right (enough) Ways--as validated by the experience of the community--for structuring and building standard OCaml projects.
My model use case is a modest but nontrivial project, of pure OCaml or OCaml plus a C dependency. Such a project:
contains a number of source files
links to a number of standard libraries
links to one or more 3rd party libraries
optionally includes a C library and OCaml wrapper as a subproject (though this could also be managed separately and included as a 3rd party library, as in (3))
Several alternative tools stand out:
Custom Makefiles seem to be the common standard in most open source OCaml packages, but appear frustratingly verbose and complex--even more so than for modest C/C++ projects. Worse, many even seemingly simple OCaml libraries layer autoconf/automake on top for even greater complexity.
ocamlbuild appears to offer a modern, streamlined mechanism for automating builds with minimal configuration, but it is not well documented for newcomers, nor represented by example in the introductory materials in the OCaml ecosystem, nor visibly used by any of the various published OCaml projects which I have browsed for inspiration.
OASIS seems to be a layer of convention and library code atop other build systems to support building a package manager and library, like Cabal.
(I have also seen OMake, which appears to be a self-styled "make++" which also includes a suite of standard rules for common languages, including OCaml, and ocaml-make née OCamlMakefile, providing a template of standard rules for GNU make.)
Are any of these a preferred, modern way of managing OCaml builds?
How are project files best structured?
How are 3rd party library dependencies included and managed? Is it preferred to install them at the system level, or is there a standard and straightforward way of managing them locally to a project? I would much prefer a model in which projects remain as self-contained as possible.
You've got a thorough listing of the options available, but this question will not have a clear answer. My personal recommendation is also to use ocamlbuild. The myocamlbuild.ml file provided here is a good start. It will allow you to easily compile projects that depend on various libraries. I don't think it handles the case of binding to C libraries, but there are additional examples on the wiki that may help.
Some people object to ocamlbuild because it is yet another build tool, complicating package managers' jobs. However, its ease of use and the fact that it is included in the official distribution are making it more and more widely used.
You can also skip all this and use oasis directly. It is very new, and a stable release has not yet been announced, but it is very usable. It will generate the myocamlbuild.ml automatically for you. This is probably the way to go in the very near future, if not already. Furthermore, by using oasis, you will immediately have the benefit of oasis-db, a CPAN-like system for OCaml that is under development.
Regarding managing libraries, the answer is ocamlfind. If you have multiple instances of OCaml installed, calling the appropriate copy of ocamlfind will automatically cause all references to libraries to be those for that particular instance, assuming you use ocamlfind systematically for all libraries. I currently use godi to install OCaml and libraries. It uses ocamlfind, and I have no problem having multiple instances of OCaml installed.
Personally I'd give +1 for ocamlbuild. Its default rules are good enough to compile small to medium projects with one command and little to no configuration. It also enforces some very reasonable conventions (not mixing sources with build results). And for larger projects it can be customized to one's desire, with extra rules & plugins. At the company where I work we are using it for a large project (OCaml + some C + some preprocessing + ...) and it works like a charm (and gives us far fewer headaches than Makefiles would).
As for manuals, I'd think the user guide (available from the author's webpage) should be enough to get you started. More funky stuff may require a bit more digging.
+1 for OMake.
We revamped our build infrastructure a few years ago and chose OMake for the following reasons:
our products are composed of a mixture of C, C++, Managed C++, Ruby and OCaml.
we target both Linux and Windows.
we interact with databases at build time.
for some of our products we had to use OCaml 3.10.
our original build system used autoconf/automake.
we require out-of-source builds*.
To be honest I don't know if we could have done it with ocamlbuild, I haven't tested it. The tool is in use for sure since there is some activity around it in OCaml's bugtracker. If you opt for ocamlbuild, make sure you have an up-to-date version of OCaml.
*OMake supports out-of-source builds in a somewhat non-obvious way. It also has some issues when sources are read-only. We had to patch and rebuild our Windows version of OMake.
The current recommendation is to use Dune, a composable build system that supports OCaml and Reason compilation. It's being actively developed.
The Quickstart page gives a variety of project templates to start with.
Dune can also handle the following:
FFI with C and C++
Compile to JavaScript via js_of_ocaml
Cross compilation support via opam-cross
Opam file generation
Opam is the de-facto package manager for OCaml. Besides finding and installing packages, it can also handle multiple OCaml installations.
Esy is a newer, package.json-driven package manager born from the Reason community. Its pitch is that it brings out-of-the-box project sandboxing, and provides an easy way to pull existing opam packages.
Good question. I would tend to say:
1) ocamlbuild
It is likely to be the standard way to compile, because it is efficient, fast, and the default tool shipped with the official distribution. The fact that it is in the official distribution is a good point, because it is more likely to remain around over time. Furthermore, it is ocamlfind-enabled, so it can manage packages installed with ocamlfind, another standard for installing packages (ocamlfind is a bit like pkg-config for C).
2) But it won't be enough for your project. The integration with C is basic with ocamlbuild. So here I would maybe advise you to use oasis to finally answer your question. I also tried OMake, but did not like it.
3) However, your build scripts are not likely to work out of the box if you want other people to be able to download and build your project on their own machines. Furthermore, oasis does not handle pkg-config. For those reasons, I would tend to advise you to use ocaml-autoconf (OCaml macros for autotools), because autotools are the standard for managing C libraries and are well known by package maintainers. It can also handle cross-compilation...
=> ocaml-autoconf with ocamlbuild

Boost dependency for a C++ open source project?

Boost is meant to be the standard non-standard C++ library that every C++ user can use. Is it reasonable to assume it's available for an open source C++ project, or is it a large dependency too far?
Basically your question boils down to “is it reasonable to have [free library xyz] as a dependency for a C++ open source project.”
Now consider the following quote from Stroustrup and the answer is really a no-brainer:
Without a good library, most interesting tasks are hard to do in C++; but given a good library, almost any task can be made easy.
Assuming that this is correct (and in my experience, it is) then writing a reasonably-sized C++ project without dependencies is downright unreasonable.
Developing this argument further, the one C++ dependency (apart from system libraries) that can reasonably be expected on a (developer's) client system is the Boost libraries.
I know that they aren't but it's not an unreasonable presumption for a software to make.
If a software can't even rely on Boost, it can't rely on any library.
Take a look at http://www.boost.org/doc/tools.html. Specifically, the bcp utility would come in handy if you would like to embed your Boost dependencies into your project. An excerpt from the web site:
"The bcp utility is a tool for extracting subsets of Boost, it's useful for Boost authors who want to distribute their library separately from Boost, and for Boost users who want to distribute a subset of Boost with their application.bcp can also report on which parts of Boost your code is dependent on, and what licences are used by those dependencies."
Of course this could have some drawbacks - but at least you should be aware of the possibility to do so.
I used to be extremely wary of introducing dependencies to systems, but now I find that dependencies are not a big deal. Modern operating systems come with package managers that can often automatically resolve dependencies or, at least, make it very easy for administrators to install what is needed. For instance, Boost is available under Gentoo Portage as dev-libs/boost and under FreeBSD ports as devel/boost.
Modern open source software builds a lot on other systems. In a recent study, by tracking the dependencies of the FreeBSD packages, we established that the 12,357 ports packages in our FreeBSD 4.11 system had in total 21,135 library dependencies; i.e., they required a library, other than the 52 libraries that are part of the base system, in order to compile. The library dependencies comprised 688 different libraries, while the number of different external libraries used by a single project varied between 1 and 38, with a mode value of 2. Furthermore, 5,117 projects used at least one external library and 405 projects used 10 or more.
In the end the answer to your question will come from a cost versus benefit analysis. Is the benefit of re-using a mature, widely used, reviewed, and tested library like Boost larger than the low and falling cost of a dependency? For any non-trivial use of Boost's facilities the answer is that you should go ahead and use Boost.
It depends. If you're using a header-only class template from Boost, then yes, go ahead and use it, because it doesn't pull in any Boost shared library; all the code is generated at compile time with no external dependencies. Versioning problems are a pain for any shared C++ library, and Boost is not immune from this, so if you can avoid the problem altogether it's a good thing.
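A hedged sketch of that header-only case: boost::lexical_cast, for example, lives entirely in headers, so the snippet below builds with just the Boost headers on the include path and there is nothing to link against.

// header_only.cpp -- needs only the Boost headers, no Boost library to link
#include <boost/lexical_cast.hpp>
#include <iostream>
#include <string>

int main() {
    int n = boost::lexical_cast<int>("42");               // string -> int, done entirely in headers
    std::string s = boost::lexical_cast<std::string>(n);  // and back again
    std::cout << s << "\n";
    return 0;
}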
The benefits of using Boost when writing C++ code significantly outweigh the extra complexity of distributing the open source code.
I work on Programmer's Notepad and the code takes a dependency on boost for test, smart pointers, and python integration. There have been a couple of complaints due to the requirement, but most will just get on with it if they want to work on the code. Taking the boost dependency was a decision I have never regretted.
To make the complexity slightly less for others, I include versioned pre-built libraries for boost python so that all they need to do is provide boost in their include directories.
KDE also depends on Boost.
However, it mostly depends on your goals, and even more so on your target audience, rather than the scope of your project. For example, TinyJSON (a very small project) is almost 100% Boost, but that's fine because the API it provides is Boost-like and targeted at Boost programmers who need JSON bindings. However, many other JSON libraries don't use Boost because they target other audiences.
On the other hand, I can't use Boost at work, and I know lots of other developers (in their day jobs) are in the same boat. So I guess you could say: if your target is open source, and a group that uses Boost, go ahead. If you target enterprise you might want to think it over and copy just the necessary parts from Boost (and commit to supporting them) for your project to work.
Edit: The reason we can't use it at work is that our software has to be portable to about 7 different platforms and across 4 compilers. We can't use Boost because it hasn't been proven to be compatible with all our targets; the reason is a technical one. (We're fine with the open source and Boost License part, as we use Boost for other things at times.)
I would say yes. Both Mandriva (Red Hat based) and Ubuntu (Debian based) have packages for the Boost libraries.
I think the extensive functionality that Boost provides, and the fact that it is, as you say, the standard non-standard C++ library, justifies it as a dependency.
Unfortunately yes. For Ubuntu they're readily available, but for RHEL 4 & 5 I've almost always ended up building them from tarballs. They're great libraries, just really big... like using a rail spike when sometimes all you really need is a thumbtack.
It all depends on the way you're going to use Boost. As Diomidis said, if you're going to use some non-trivial facilities from Boost, just go ahead. Using libraries is not a crime.
Of course, there are many people who prefer not to use Boost, because introducing new dependencies always has some cons and extra worries, but in an open source project... in my opinion it's even alright to use them if you just want to learn them or improve your skills with them.

Include only certain libraries on an operating system

When writing an app that one wants to compile on Mac, Linux and Windows, what is the best way of managing the different libraries that will need to be included on the various operating systems? For example, using the glut OpenGL toolkit requires different includes on each operating system.
Your question is actually two questions in one:
1) How do I write my C++ code to include the right include files on the right platform?
2) How do I write my Makefile to work on different platforms?
The C++ code question is already answered - find the platform-specific defines and use them to figure out what platform you're on.
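A minimal sketch of that approach, using commonly predefined compiler macros (treat the names below as the usual suspects rather than an exhaustive or compiler-guaranteed list):

// platform.cpp -- choose headers/behaviour based on predefined platform macros
#if defined(_WIN32)
#  define PLATFORM_NAME "Windows"
#elif defined(__APPLE__)
#  define PLATFORM_NAME "Mac OS X"
#elif defined(__linux__)
#  define PLATFORM_NAME "Linux"
#else
#  define PLATFORM_NAME "unknown"
#endif

#include <iostream>

int main() {
    std::cout << "Building for " << PLATFORM_NAME << "\n";
    return 0;
}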
Automake or scons are quite complex, and are worth your time only if you intend to release your code to a wide audience. In the case of in-house code, a "generic" makefile with per-platform includes is usually sufficient. For Windows, you can get GNU Make for Windows (available from here), or use nmake and limit yourself to the subset of syntax common between all platforms.
If you just need to worry about header files, then the preprocessor will do everything you need. If you want to handle differing source files, and possibly different libraries you'll need a tool to handle it.
Some options include:
The Autotools
Scons
CMake
My personal favorite is CMake. The Autotools use a multi-stage process that's relatively easy to break, and scons just feels weird to me. CMake will also generate project files for a variety of IDEs, in addition to makefiles.
There is a good article on macros. One of the answers describes how to use conditional compilation based on OS/compiler (it's near the top).
The use of autoconfiguration tools is a nice addition on top of this, but it is not needed for small projects, where it may be easier to detect the OS explicitly. For larger projects that may need to run on many different types of OS, you should also explore the available autoconfiguration tools mentioned by Branan.
Several projects I've worked on use an autoconf-based configure script which builds a Makefile, hence the reason you can build all of them from source with a simple:
./configure
make
make install
Scons has a configuring mechanism that will do a lot of what autotools do without as much complexity, and is pretty darn portable (although not as portable as autotools).
The compiler should provide a set of preprocessor symbols you can use. For example, linux for gcc on a Linux system, or _WIN32 for VC++. If you need something more complex then look at autoconf, but that works best for Unix based code.
I'd recommend checking out how some of the larger OpenSource projects handle this. See AutoSense.hpp from (an old release of) Apache Xerces.
If the libraries offer the same API on the different platforms, I would create a "proxy" include file containing all the necessary #ifdefs. That 'platform-independent' include file is then included in your client code instead of cluttering it with numerous ugly preprocessor commands; all of that clutter stays contained in the proxy header.
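For the glut case from the question, such a proxy header might look roughly like this (a sketch; on Mac OS X the GLUT framework header lives at <GLUT/glut.h>, while Linux and Windows typically use <GL/glut.h>):

// my_glut.h -- "proxy" include that hides the per-platform difference
#ifndef MY_GLUT_H
#define MY_GLUT_H

#if defined(__APPLE__)
#  include <GLUT/glut.h>   // GLUT framework header location on Mac OS X
#else
#  include <GL/glut.h>     // typical location on Linux and Windows
#endif

#endif // MY_GLUT_H

Client code then just includes "my_glut.h" everywhere and never mentions the platform.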
If the API differs across platforms, you will need to create your own abstraction.
Perhaps this is a cop-out answer, but have you looked at how Boost handles this? They build on quite a few platforms without autoconf, although they do have their own build system, bjam, which probably handles some of the same situations. They also do a nice auto-linking trick on Windows that automatically selects the right version of libraries for linking depending on the version of the MSVC compiler. Based on your initial description, it sounds like just macro defs checking for various platforms/compilers might do the trick, but perhaps there is more to your problem that would prevent this.
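For what it's worth, that auto-linking trick boils down to the MSVC-specific #pragma comment(lib, ...) combined with the compiler's predefined macros. A rough, hypothetical sketch of the idea (the library names below are made up):

// autolink_sketch.h -- hypothetical auto-link helper in the style Boost uses on Windows
#if defined(_MSC_VER)
#  if defined(_DEBUG)
#    pragma comment(lib, "mylib-vc90-mt-gd.lib")   // hypothetical debug library name
#  else
#    pragma comment(lib, "mylib-vc90-mt.lib")      // hypothetical release library name
#  endif
#endif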