When writing an app that you want to compile on Mac, Linux, and Windows, what is the best way of managing the different libraries that need to be included on the various operating systems? For example, using the GLUT OpenGL toolkit requires different includes on each operating system.
Your question is actually two questions in one:
1) How do I write my C++ code to include the right include files on the right platform?
2) How do I write my Makefile to work on different platforms?
The C++ code question is already answered - find the platform-specific defines and use them to figure out what platform you're on.
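For example, a minimal sketch of that pattern (the macros are the common predefined ones; the headers pulled in here are just placeholders):

    // Sketch: pick headers via the compiler's predefined platform macros.
    // _WIN32 comes from MSVC/MinGW, __APPLE__ from Clang/GCC on macOS,
    // __linux__ from GCC/Clang on Linux.
    #if defined(_WIN32)
        #include <windows.h>
    #elif defined(__APPLE__)
        #include <TargetConditionals.h>
    #elif defined(__linux__)
        #include <unistd.h>
    #endif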
Automake and SCons are quite complex, and are worth your time only if you intend to release your code to a wide audience. In the case of in-house code, a "generic" Makefile with per-platform includes is usually sufficient. For Windows, you can get GNU Make for Windows (available from here), or use nmake and limit yourself to the subset of syntax common between all platforms.
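A rough sketch of the per-platform include idea (the fragment names are hypothetical; GNU Make syntax):

    # Generic Makefile pulling in a per-platform fragment.
    UNAME := $(shell uname -s)
    ifeq ($(UNAME),Linux)
        include platform-linux.mk      # sets CXX, LDLIBS, etc.
    else ifeq ($(UNAME),Darwin)
        include platform-macos.mk
    else
        include platform-windows.mk    # e.g. GNU Make for Windows
    endif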
If you just need to worry about header files, then the preprocessor will do everything you need. If you want to handle differing source files, and possibly different libraries, you'll need a tool to handle it.
Some options include:
The Autotools
SCons
CMake
My personal favorite is CMake. The Autotools use a multi-stage process that's relatively easy to break, and SCons just feels weird to me. CMake will also generate project files for a variety of IDEs, in addition to Makefiles.
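To give a feel for it, a minimal hypothetical CMakeLists.txt might look like this (target and library names made up):

    cmake_minimum_required(VERSION 3.10)
    project(myapp CXX)
    add_executable(myapp main.cpp)
    # Built-in platform checks cover the common per-OS tweaks:
    if(WIN32)
        target_link_libraries(myapp PRIVATE opengl32)
    elseif(APPLE)
        target_link_libraries(myapp PRIVATE "-framework OpenGL")
    endif()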
There is a good article on macros. One of the answers explains how to use conditional compilation based on OS/compiler (it's near the top).
Using the autoconfiguration tools is a nice addition on top of this, but it is not needed for small projects, where it may be easier to detect the OS explicitly. For larger projects that need to run on many different types of OS, you should also explore the available autoconfiguration tools mentioned by Branan.
Several projects I've worked on use an autoconf-based configure script which builds a Makefile, hence the reason you can build all of them from source with a simple:
./configure
make
make install
SCons has a configuring mechanism that will do a lot of what the Autotools do without as much complexity, and is pretty darn portable (although not as portable as the Autotools).
The compiler should have a set of preprocessor symbols it provides that you can use, for example linux for GCC on a Linux system, or _WIN32 for VC++. If you need something more complex, then look at autoconf, but that works best for Unix-based code.
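If you're unsure which symbols your compiler predefines, GCC can dump the full list; a quick sketch (output abridged, and it varies by version and flags):

    $ echo | g++ -dM -E - | grep -i linux
    #define __linux 1
    #define __linux__ 1
    #define linux 1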
I'd recommend checking out how some of the larger OpenSource projects handle this. See AutoSense.hpp from (an old release of) Apache Xerces.
If the libraries offer the same API on the different platforms, I would create a "proxy" include file containing all the necessary #ifdefs. That 'platform-independent' include file is then included in your client code instead of cluttering it with numerous, ugly-reading preprocessor commands; all of those stay contained in the admittedly ugly and cluttered proxy include.
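For the GLUT case from the original question, such a proxy header might look like this (a sketch; the file name is made up):

    // my_glut.h - hypothetical proxy header for the GLUT example.
    #ifndef MY_GLUT_H
    #define MY_GLUT_H
    #ifdef __APPLE__
        #include <GLUT/glut.h>   // macOS ships GLUT as a framework
    #else
        #include <GL/glut.h>     // usual location on Linux and Windows
    #endif
    #endif // MY_GLUT_H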
If the API differs across platforms, you will need to create your own abstraction.
Perhaps this is a cop-out answer, but have you looked at how Boost handles this? They build on quite a few platforms without autoconf, although they do have their own build system - bjam - that probably handles some of the same situations. They also do a nice auto-linking trick on Windows that automatically selects the right version of libraries for linking depending on the version of the MSVC compiler. Based on your initial description, it sounds like just macro defs checking for various platforms/compilers might do the trick, but perhaps there is more to your problem that would prevent this.
Related
I am pretty new to C++. I was wondering what is generally considered a neat way to provide paths for various files/libraries while compiling or executing C++ code.
For example:
I have Boost libraries installed in some location on my system. Let's call it X.
In order to execute anything I have to type in
c++ -I LongpathWhichisX/to/boost_1_60_0 example.cpp -o example
Similarly, I also need a long path for the input file when executing the code.
Is there a better way to address this? Is it possible to create an environment variable, let's say Y, which refers to path X? Then we could use the following command to compile the code:
c++ -I Y/to/boost_1_60_0 example.cpp -o example
The best way is to use build tools. For example, you can use Make. You define all your include paths (and other options) in the Makefile. In the console you just have to call make to build your project, or something like make run to run your project.
The usual way is to make a Makefile where you can specify all needed paths and compile options in proper variables.
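For instance, a small sketch of such a Makefile (paths and target names hypothetical; the recipe line must start with a tab):

    # The long path lives in one variable and can be overridden from the
    # command line: make BOOST_ROOT=/elsewhere
    BOOST_ROOT ?= /LongpathWhichisX/to/boost_1_60_0
    CXXFLAGS   += -I$(BOOST_ROOT)

    example: example.cpp
    	$(CXX) $(CXXFLAGS) example.cpp -o example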
If you don't want/need a Makefile and rather want to run compiler from command-line, then you may use the CPATH environment variable to specify a colon-separated list of paths to include files.
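A quick sketch of that approach (path made up):

    export CPATH=/LongpathWhichisX/to/boost_1_60_0
    c++ example.cpp -o example    # no -I needed; gcc reads CPATH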
This is a broad question but the other answers highlight the most important step. It is essential to learn build tools like make because they will make it easier to build your projects during development and for others to build it later. In the modern programming age though this is not enough. If you are using something like Boost (which targets many platforms) you will probably want to make your build cross-platform as well. For this you would use either cmake or autotools which both have scripts that make it much easier to locate the Boost libraries (and others).
Any other build systems, in my opinion, are a pain and are the bane of maintainers of Linux distributions. CMake used to be in that category, but it has gained wide acceptance now. CMake targets building cross-platform projects across operating systems (Windows and Unixes) better (again, in my opinion) because it attempts to provide the native build system on each platform (for example: Visual Studio on Windows, Make on all Unices, Xcode on Mac). The autotools instead target the Unix environment with much greater depth (you have a bit of a harder time on Windows, but you can target anything from embedded Unix systems to high-end Unix server systems with much more flexibility).
Note: Autotools support for cross-compiling is superior in almost every way to other solutions. I always cringe when I download something that needs to be cross-compiled for ARM Linux and it uses some weird build system. Funnily enough, Boost is one of these.
This is a bit of a long-winded answer. In summary, it is essential that you learn a build system for native development. It is part of your skill set, and until you have that skill you can't really contribute to open-source projects or even help your employer develop closed-source projects.
I am currently developing a C++ command line utility to be distributed as an open-source utility on Github. However, I want people who download the program to be able to easily compile and run the program on any platform (specifically Mac, Linux, and Windows) in as few steps as possible. Assuming only small changes have to be made to the code to make it compatible with the various platform-independent C++ compilers (g++ and win32), how can I do this? Are makefiles relevant?
My advice is: do not use make files. Maintaining the files for big enough projects is tedious, and errors sometimes happen which you don't catch immediately (because the *.o file is still there).
See this question here
Makefiles are indeed highly relevant. You may find that you need (at least) two different makefiles to compensate for the fact that you have different compilers.
It's hard to be specific about how you solve this, since it depends on how complex the project is. It may be easiest to write a script/batch file and just document it ("Use the command build.sh on Linux/Unix, and build.bat on Windows") - and then let the respective files deal with, for example, setting up the name of the compiler and flags, etc.
Or you can have an include in the makefile, which is determined by the architecture. Or different makefiles.
If the project is REALLY simple, it may be enough to just provide a basic Makefile - but it's unlikely, as compiling x.cpp on Linux/MacOS produces an object file called x.o, while on Windows the object file is called x.obj. Libraries have different names, DLLs have different names, and on Linux/MacOS the final executable (typically) has no extension, so it's called "myprog", whereas the executable under Windows is called "myprog.exe".
These sorts of differences mean that the makefile needs to be different.
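One way to absorb at least the suffix differences in a single Makefile is sketched below (variable names illustrative; the recipe line must start with a tab):

    # On Windows, cmd/MinGW environments set OS=Windows_NT.
    ifeq ($(OS),Windows_NT)
        EXEEXT := .exe
    else
        EXEEXT :=
    endif

    myprog$(EXEEXT): main.o
    	$(CXX) main.o -o myprog$(EXEEXT)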
C++ build system with the ability to compile dependencies beforehand
Java has Maven, which is a pleasure to work with: you simply specify dependencies that are already compiled and deposited in Maven's standard directory, meaning that the location of dependencies is standardized, as opposed to the often-used C/C++ way of having multiple locations (give me a break, like anyone remembers the default install directories for particular deps).
It is massively unproductive for every individual developer to have to, more often than not, find, read about, get familiar with the configure/build options, and finally compile every dependency simply to make a build of a project.
What is the theoretical reason this has not been implemented?
Why would it be difficult to provide packages with the following options in a Maven-like declaration format?
version
platform (windows, linux)
src/dev/bin
shared/static
equivalent set of Boost ABI options when applicable
Having to manually go to websites and search out dependencies in the year 2013 for the oldest major programming language is absurd.
There aren't any theoretical reasons. There are a great many practical reasons. There are just too many different ways of handling things in the C++ world to easily standardize on a dependency system:
Implementation differences - C++ is a complicated language, and different implementations have historically varied in how well they support it (how well they can correctly handle various moderate to advanced C++ code). So there's no guarantee that a library could be built in a particular implementation.
Platform differences - Some platforms may not support exceptions. There are different implementations of the standard library, with various pros and cons. Unlike Java's standardized library, Windows and POSIX APIs can be quite different. The filesystem isn't even a part of Standard C++.
Compilation differences - Static or shared? Debug or production build? Enable optional dependencies or not? Unlike Java, which has very stable bytecode, C++'s lack of a standard ABI means that code may not link properly, even if built for the same platform by the same compiler.
Build system differences - Makefiles? (If so, GNU Make, or something else?) Autotools? CMake? Visual Studio project files? Something else?
Historical concerns - Because of C's and C++'s age, popular libraries like zlib predate build systems like Maven by quite a bit. Why should zlib switch to some hypothetical C++ build system when what it's doing works? How can a newer, higher-level library switch to some hypothetical build system if it depends on libraries like zlib?
An additional factor complicates things:
In Linux, the distro packaging systems do provide standardized repositories of development library headers and binaries, with (generally) standardized ABIs and an easy way of specifying a project's build dependencies. The existence of these (platform-specific) solutions reduces the impetus for a cross-platform solution.
With all of these complicating factors and pre-existing approaches, any attempt to establish a standard build system is going to run into the problem described in XKCD's "Standards":
Situation: There are 14 competing standards.
"14? Riculous! We need to develop one universal standard that covers everyone's use cases."
Soon: There are 15 competing standards.
With all of that said:
There is some hope for the future. For example, CMake seems to be gradually replacing other build systems. Some of the Boost developers have started Ryppl, an attempt to do what you're describing.
(also posted in linked question)
Right now I'm working on a tool able to automatically install all dependencies of a C/C++ app, with exact version requirements:
compiler
libs
tools (cmake, autotools)
Right now it works for my app (installing UnitTest++, Boost, Wt, SQLite, and CMake, all in the correct order).
The tool, named "C++ Version Manager" (inspired by the excellent Ruby Version Manager), is coded in bash and hosted on GitHub: https://github.com/Offirmo/cvm
Any advice and suggestions are welcome.
Well, first off, a system that resolves all the dependencies doesn't make you productive by default; potentially it can make you even less productive.
Regarding the differences between languages, I would say that in Java you have packages, which are handy when you have to organize your code and give it a limited horizon; in C++ you don't have an equivalent concept.
In C++, any library that can resolve a symbol is good enough for the compiler; the only real requirements for a library are to have a certain ABI and to resolve the required symbols. There is no automated way to pick the right library. Also, resolving a symbol is just a matter of linking your function to the actual implementation, and even a correct linking phase doesn't guarantee that your app will work.
To this you can add important variables such as the library version, different implementations of the same library, and different libraries with the same method names.
An example is the Mesa library vs. the OpenGL lib from the official drivers, or any lib that offers multiple releases: each release can resolve all the symbols, but one is probably more mature than the others, and you can't ask the compiler to pick the right one, because for its purposes they are all the same.
I've seen lots of methods used to resolve dependencies in a Makefile, such as using gcc -MM and the sed command, or using the include directive (plus a little Perl magic), or qmake, or automake, or info make, etc.
Faced with so many options, I am confused about which I should choose. So, I want to know: what's the common way to resolve dependencies in a Makefile nowadays? What's the best way to cope with this problem?
PS: C/CPP project.
Generally, if all you care about is systems that support GNU Make and gcc (such as all Linux variants and most Unix-like systems these days), you just use gcc's various -M flags to generate the dependencies and then -include them in your Makefile. There's some good info in this question -- generally there's no need to use sed or any more complex tools.
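A minimal sketch of that pattern (file names hypothetical; recipe lines must start with a tab):

    # -MMD writes a .d dependency fragment next to each .o; -MP adds
    # phony targets so deleted headers don't break the build.
    SRCS := main.cpp util.cpp
    OBJS := $(SRCS:.cpp=.o)
    CXXFLAGS += -MMD -MP

    app: $(OBJS)
    	$(CXX) $(OBJS) -o app

    -include $(OBJS:.o=.d)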
If you only need to support lots of Linux distributions (as you noted in a comment), then I'd recommend the automake/autoconf suite.
This answer assumes you are only asking generally and you do not yet know what specific issues you will have to resolve as you go.
Edit:
GNU make alone can handle dependency generation within your own project.
autoconf handles optional or alternative dependencies on third-party libraries, tools, or system features. automake provides macros, some of which are occasionally useful even if you are otherwise using autoconf without automake.
A side benefit of starting with automake outright is that your makefiles will behave completely predictably (in terms of conventions and portability) with less investment of attention.
Hence my humble recommendation.
There are several ways to generate make-compatible dependencies for C/C++ projects:
gcc -M, which comes in several flavors and is sort of "the gold standard" in terms of accuracy, since it uses the actual compiler to generate the dependencies - and who would know better how to process #include statements than the compiler itself? (See the example after this list.)
makedepend, which is generally discouraged in favor of compiler-generated dependencies.
fastdep, another third-party dependency generator which purports to be faster than gcc -M.
ElectricAccelerator has a built-in feature called autodep which uses the filesystem usage activity of the commands invoked in the build to generate dependency information. The advantage of autodep over the alternatives is that it is extremely fast and completely tool and programming language independent -- while the others are all tied to C/C++ or require the use of a specific compiler, autodep works with all types of build tools.
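As promised above, here is roughly what the compiler-generated flavor (the first option) produces (file names hypothetical):

    $ g++ -MM main.cpp
    main.o: main.cpp util.h config.h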
I did a performance comparison of several of these options a while back.
Disclaimer: I am the architect and lead developer of ElectricAccelerator.
I've got a C++ project where we have loads and loads of dependencies. The project should work on Linux and Windows, so we've ported it to CMake. Most dependencies are now included right into the source tree and build alongside the project, so there are no problems with those.
However, we have one binary which depends on Fortran code etc. and is really complicated to build. For Linux, it's also not available as a package, but only as precompiled binaries or as full source (which needs a BLAS library installed, plus several other dependencies). For Windows, the same library is available as a binary; building it for Windows seems even more complicated.
The question is, how do you handle such dependencies? Just check in the binaries for the supported platforms and require the user to set up his build environment otherwise (that is, manually point to the binary location), or would you really try to get them compiled alongside (even if it requires installing something like 10 libraries - BLAS libraries are the biggest pain here), or is there some other recommended way to handle that?
If the binary is independent of the other parts of your build process, you should definitely check it in. But as you cannot include every version of the binary (I mean for every platform and set of compile flags the user might use), the build from source seems mandatory as well.
I have done something similar. I checked in the source code archives of the libraries/binaries I needed. Then I wrote makefiles/scripts to build them according to the targeted platform/flags in a specific location (not a standard OS location) and made my main build process point to the right location. I did that to be able to handle the correct versions and options of the libraries/binaries I needed. It's quite hard work to make things work on different platforms, but it's worth the time!
Oh, and of course it's easier if you use cross-platform build tools :)
One question for you: do the users need to modify this binary, or are they just happy it's there so they can use/access it? If they don't need to modify it, check in the binaries.
I would agree: check in the binaries for each platform if they are not going to be modified very often. Not only will this reduce build times, but it will also reduce frustration from unnecessary compilations.