Imagine you're free to choose a tool like GNU make for a new C++ project. What would you choose? Are there any usable substitutes out there?
It shall have/be:
a command line interface
"easy" to understand
easy to set up for a default C++ project
may support src/bin separation, as is common for Java
may not add too many dependencies on other software/libs
platform independent (new)
features:
build rules/templates like make, but in a human-readable way
recursive crawling of directories, applying the rules unless a directory has its own "Makefile"
configuration by exception
Note:
Nothing's wrong with GNU make. I just don't like its grammar, all the cruft that has accumulated over the years, and the silly recursive make problems. I've been using gmake for years now.
I use cmake, and I'm very glad I made the switch.
EDIT
Feature list as found in the Wikipedia article:
Configuration files are CMake scripts, which use a programming language specialized to software builds
Automatic dependency analysis built-in for C, C++, Fortran and Java
Support of SWIG, Qt, via the CMake scripting language
Built-in support for many versions of Microsoft Visual Studio including versions 6, 7, 7.1, 8.0, and 9.0
Generates build files for Eclipse CDT (C/C++ Development Tools)
Detection of file content changes using traditional timestamps
Support for parallel builds
Cross-compilation
Global view of all dependencies, using CMake to output a graphviz diagram
Support for cross-platform builds, and known to work on
Linux and other POSIX systems (including AIX, *BSD systems, HP-UX, IRIX/SGI, and Solaris)
Mac OS X
Windows 95/98/NT/2000/XP, Windows Vista and MinGW/MSYS
Integrated with DART, CDash, CTest and CPack, a collection of tools for software testing and release
But to be honest: just about anything is better than the GNU toolchain.
SCons and waf are good (well, IMHO anyway) build systems, and depend only on Python.
How about "gnu make"?
You asked for something like it without giving any indication of what features you want that aren't supported by gnu make.
Boost.Jam
It has the features you named:
command line interface;
easy;
it comes from the C++ library collection Boost, so it has good support for C++ (and it's not limited to C++, either);
it stores executables in places under the bin directory, depending on the build request you've issued. If you use gcc 4.3.2, then you get the executables under
bin/gcc-4.3.2/debug -- when executing bjam
bin/gcc-4.3.2/release -- when executing bjam release
bin/gcc-4.3.2/release/inlining-off/threading-multi -- when executing bjam release inlining=off threading=multi
bin/gcc-4.3.2/release/threading-multi -- for bjam release inlining=full threading=multi, because inlining=full is the default for release.
it doesn't need the full Boost library collection, only Boost.Build and Boost.Jam are necessary;
platform independent;
the Jamfile syntax is easy, but powerful;
you can divide the build config into many Jamfiles in subdirectories.
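For a flavor of the syntax, a minimal Jamroot for a single executable might look like this (a sketch; the target and file names are hypothetical):

    # Jamroot -- build the 'hello' executable from hello.cpp
    exe hello : hello.cpp ;

Running bjam in that directory then places the executable under bin/<toolset>/<variant>, as described above.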
CMake should answer most, if not all, of your requirements.
It will generate the Makefiles for you.
It has good domain-specific primitives, plus a simple language for the times you need to do something special.
It solves most of the problems with recursive make (see the "Recursive Make Considered Harmful" paper).
It uses an out-of-source build, so you have your bin / src separation.
I found it easy to write, easy to maintain, and fast to build.
... Plus:
It is cross platform.
With CTest and CDash it has what you need to set up a continuous integration service.
See also this answer to Recursive Make - friend or foe?
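To give a concrete feel for it, a minimal CMakeLists.txt and an out-of-source build might look like this (a sketch; the project and file names are hypothetical):

    # CMakeLists.txt
    cmake_minimum_required(VERSION 2.8)
    project(hello CXX)
    add_executable(hello src/main.cpp)

    # out-of-source build: all generated files stay under build/
    mkdir build && cd build
    cmake ..
    make

The build directory can be deleted at any time without touching the sources, which is what gives you the bin/src separation for free.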
SCons + swtoolkit
G'day,
I'd agree with the couple of answers so far that recommend sticking with gmake.
Maybe have another look after reading the first few chapters of Robert Mecklenburg's excellent book "Managing Projects with GNU Make" (sanitised Amazon link).
Or, even better, is to search out a copy of the previous edition called "Managing Projects With make" by Andrew Oram and Steve Talbott. The first few chapters give an excellent description of (g)make and [Mm]akefile basics.
And I see you can buy a second hand copy of the 2nd ed. from Amazon for the princely sum of $0.01! (sanitised Amazon link)
After reading that intro you'll even understand how make is backward chaining, which is a bit non-intuitive when just looking at make's behaviour.
HTH
cheers,
Autotools -- autoconf/automake/libtool -- are very powerful build tools.
It takes some time to get started with them, but afterwards they serve you very well.
And, more importantly, they are significantly more powerful than their replacements (CMake, BJam, SCons etc.).
How are they more powerful?
Transparent support for building both static and shared libraries via libtool.
Support for uninstall (not in CMake).
Support for standard installation paths -- very important for packaging.
Full support and integration of gettext.
Transparent and straightforward support for cross compilation.
Many things. CMake can do most of these, but for each one you have to write long scripts or specify many things manually.
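For comparison, a minimal autotools setup is only a few lines (a sketch; the project and file names are hypothetical):

    # configure.ac
    AC_INIT([myapp], [1.0])
    AM_INIT_AUTOMAKE([foreign])
    AC_PROG_CXX
    AC_CONFIG_FILES([Makefile])
    AC_OUTPUT

    # Makefile.am
    bin_PROGRAMS = myapp
    myapp_SOURCES = main.cpp

Running autoreconf --install once, then the usual ./configure && make, builds the project; make install, make uninstall and make dist come for free.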
What's wrong with gmake?
What issues does it have that make you want to avoid it? There's no point in recommending another tool if it has the same perceived issues as gmake.
We're using gmake in our build system and we're extremely happy with its performance, flexibility and features.
C++ Buildsystem with ability to compile dependencies beforehand
Java has Maven, which is a pleasure to work with: you simply specify dependencies that are already compiled and deposited in Maven's standard directory. This means the location of dependencies is standardized, as opposed to the commonly used approach of having multiple locations for C/C++ dependencies (give me a break, like anyone remembers the default install directories for particular deps).
It is massively unproductive that every individual developer has to, more often than not, find each dependency, read about it, get familiar with its configure/build options, and finally compile it, simply to make a build of a project.
What is the theoretical reason this has not been implemented?
Why would it be difficult to provide packages with the following options in a Maven-like declaration format?
version
platform (windows, linux)
src/dev/bin
shared/static
equivalent set of Boost ABI options when applicable
Having to manually go to websites and search out dependencies in the year 2013 for the oldest major programming language is absurd.
There aren't any theoretical reasons. There are a great many practical reasons. There are just too many different ways of handling things in the C++ world to easily standardize on a dependency system:
Implementation differences - C++ is a complicated language, and different implementations have historically varied in how well they support it (how well they can correctly handle various moderate to advanced C++ code). So there's no guarantee that a library could be built in a particular implementation.
Platform differences - Some platforms may not support exceptions. There are different implementations of the standard library, with various pros and cons. Unlike Java's standardized library, Windows and POSIX APIs can be quite different. The filesystem isn't even a part of Standard C++.
Compilation differences - Static or shared? Debug or production build? Enable optional dependencies or not? Unlike Java, which has very stable bytecode, C++'s lack of a standard ABI means that code may not link properly, even if built for the same platform by the same compiler.
Build system differences - Makefiles? (If so, GNU Make, or something else?) Autotools? CMake? Visual Studio project files? Something else?
Historical concerns - Because of C's and C++'s age, popular libraries like zlib predate build systems like Maven by quite a bit. Why should zlib switch to some hypothetical C++ build system when what it's doing works? How can a newer, higher-level library switch to some hypothetical build system if it depends on libraries like zlib?
Additional factors complicate things:
On Linux, the distro packaging systems do provide standardized repositories of development library headers and binaries, with (generally) standardized ABIs and an easy way of specifying a project's build dependencies. The existence of these (platform-specific) solutions reduces the impetus for a cross-platform solution.
With all of these complicating factors and pre-existing approaches, any attempt to establish a standard build system is going to run into the problem described in XKCD's "Standards":
Situation: There are 14 competing standards.
"14? Riculous! We need to develop one universal standard that covers everyone's use cases."
Soon: There are 15 competing standards.
With all of that said:
There is some hope for the future. For example, CMake seems to be gradually replacing other build systems. Some of the Boost developers have started Ryppl, an attempt to do what you're describing.
(also posted in linked question)
Right now I'm working on a tool able to automatically install all dependencies of a C/C++ app with exact version requirements:
compiler
libs
tools (cmake, autotools)
Right now it works for my app (installing UnitTest++, Boost, Wt, sqlite and cmake, all in the correct order).
The tool, named "C++ Version Manager" (inspired by the excellent ruby version manager), is coded in bash and hosted on github: https://github.com/Offirmo/cvm
Any advice and suggestions are welcome.
Well, first off, a system that resolves all the dependencies doesn't make you productive by default; potentially it can make you even less productive.
Regarding the differences between languages, I would say that in Java you have packages, which are handy when you have to organize your code and give it a limited horizon; in C++ you don't have an equivalent concept.
In C++, any library that can resolve a symbol is good enough for the compiler; the only real requirements for a library are to have a certain ABI and to resolve the required symbols. There is no automated way to pick the right library, and resolving a symbol is just a matter of linking your function to the actual implementation; even a correct linking phase doesn't guarantee that your app will work.
To this you can add important variables such as the library version, different implementations of the same library, and different libraries with the same method names.
An example is the Mesa library vs. the OpenGL lib from the official drivers, or any lib that offers multiple releases where each one can resolve all the symbols: probably one release is more mature than the others, and you can't ask the compiler to pick the right one, because for its purposes they are all the same.
I've seen lots of methods used to resolve dependencies in Makefiles, such as using gcc -MM and the sed command, using the include directive (plus a little Perl magic), or using qmake, automake, info make, etc.
Facing so many options, I am confused about which I should choose. So I want to know: what is the common way to resolve dependencies in Makefiles nowadays? What is the best way to cope with this problem?
PS: C/CPP project.
Generally if all you care about is systems that support GNU make and gcc (such as all linux variants and most unix like systems these days), you just use gcc's various -M flags to generate the dependencies and then -include them in your Makefile. There's some good info in this question -- generally there's no need to use sed or any more complex tools.
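For example, a common GNU make pattern (a sketch assuming gcc/g++; file names are hypothetical, and recipe lines must be indented with a real tab) has the compiler emit a .d dependency file as a side effect of each compilation and pulls those files in with -include:

    SRCS := $(wildcard *.cpp)
    OBJS := $(SRCS:.cpp=.o)
    DEPS := $(OBJS:.o=.d)

    app: $(OBJS)
            $(CXX) -o $@ $^

    # -MMD writes a .d file listing the headers each object file depends on;
    # -MP adds phony targets so deleted headers don't break the build
    %.o: %.cpp
            $(CXX) -MMD -MP -c $< -o $@

    -include $(DEPS)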
If you only need to support lots of Linux distributions (as you noted in a comment), then I'd recommend the automake/autoconf suite.
This answer assumes you are only asking generally and you do not yet know what specific issues you will have to resolve as you go.
Edit:
GNU make alone can handle dependency generation within your own project.
autoconf handles optional or alternative dependencies on third-party libraries, tools or system features; automake provides macros, some of which are occasionally useful even if you are otherwise using autoconf without automake.
A side benefit of starting with automake outright is that your makefiles will behave completely predictably (in terms of conventions and portability) with less investment of attention.
Hence my humble recommendation.
There are several ways to generate make-compatible dependencies for C/C++ projects:
gcc -M, which comes in several flavors and is sort of "the gold standard" in terms of accuracy, since it uses the actual compiler to generate the dependencies, and who would know better how to process #include statements than the compiler itself?
makedepend, which is generally discouraged in favor of compiler-generated dependencies.
fastdep, another third-party dependency generator which purports to be faster than gcc -M.
ElectricAccelerator has a built-in feature called autodep which uses the filesystem usage activity of the commands invoked in the build to generate dependency information. The advantage of autodep over the alternatives is that it is extremely fast and completely tool and programming language independent -- while the others are all tied to C/C++ or require the use of a specific compiler, autodep works with all types of build tools.
I did a performance comparison of several of these options a while back.
Disclaimer: I am the architect and lead developer of ElectricAccelerator.
I've recently installed Ubuntu 11.10 and along with it the CodeBlocks IDE and I am aware that I have gcc and the std libraries by default.
My questions are:
Do you have any tips for a new C++ programmer on Ubuntu?
Any libraries I should get from the start?
A really good IDE I'm missing? (YMMV but I prefer to work in IDE's)
Any programming boons or traps I should be aware of from the start?
You don't need an IDE to code in C or C++ on Ubuntu. You can use a good editor (like emacs, which you can configure to suit your needs).
A few tips for a newbie:
Always compile with -Wall -Wextra and perhaps even with -Werror -pedantic-errors
The order of arguments to the compiler (gcc or g++) is really important; I recommend the following (a full example command line follows this list):
general warnings and optimization flags (e.g. -Wall, -g to get debug info, -O, -flto etc., or -c to avoid linking, ...)
preprocessor options like -I include-dir and -D defined-symbol (or -H to understand which headers get included), etc.
source file[s] to compile, like hello.c or world.cc
if you want to link existing object files else.o, add them after the source files
linker options (if relevant), notably -L library-dir (and probably -rdynamic if your program uses plugins with dlopen(3) ...)
libraries (like -lfoo -lbar), ordered from higher-level libraries (like libfoo.so) to lower-level ones
the output file (i.e. the produced executable), e.g. -o yourexec.
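Putting that ordering together, a typical compile-then-link pair of commands might look like this (a sketch; the file and library names are hypothetical):

    # compile: warning/debug flags, then preprocessor options, then the source file
    g++ -Wall -Wextra -g -I include-dir -c hello.cc -o hello.o
    # link: object files, then linker options and libraries, then the output file
    g++ -Wall -Wextra -g hello.o else.o -L library-dir -lfoo -lbar -o yourexec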
Always correct your source code until you get no warnings at all. Trust the compiler's warnings and error messages.
Learn how to use make and to write simple Makefiles; see this example, and the minimal sketch below.
There are other builders too, e.g. http://omake.metaprl.org/ etc.
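For illustration, a minimal Makefile for a two-file project might look like this (a sketch; the file names are hypothetical, and recipe lines must start with a real tab):

    CXX = g++
    CXXFLAGS = -Wall -Wextra -g

    # link the final executable from the object files
    yourexec: hello.o world.o
            $(CXX) $(CXXFLAGS) -o $@ $^

    # compile each .cc source file into an object file
    %.o: %.cc
            $(CXX) $(CXXFLAGS) -c $< -o $@

    clean:
            rm -f yourexec *.o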
Compile your code with the -g flag to have the compiler produce debugging information; only when you have debugged your program, ask the compiler to optimize (e.g. with -O1 or -O2), especially before benchmarking.
Learn how to use gdb
Use a version control system like svn or git (even for a homework assignment). In 2015, I recommend git over svn.
Backup your work.
Learn to use valgrind to hunt memory leaks.
NB: The advice above is not specific to Ubuntu 11.10; it applies to other Linux distributions and other Ubuntu versions as well.
Qt Creator is a good IDE that also works well with simple Makefile-based projects. Also, as a C++ programmer you should check out Dia and Dia2Code for automatic generation of stubs from UML diagrams.
Since you ask more than one question I will answer each separately.
Do you have any tips for a new C++ programmer on Ubuntu?
Learn a build system such as CMake or SCons. Although understanding how make and Makefiles work is useful, there is a tendency to move away from make to higher-level tools which also provide configure-like functionality. Make is often used for the command-line build itself; for example, with CMake you can generate Makefiles and then build your projects using make.
Use a version control system such as git or Mercurial. I also recommend keeping the projects you care about on an external service like github, at least for backup purposes.
Pay attention to compiler warnings but keep in mind that warnings only catch a fraction of possible errors. A more complete picture can be obtained using static analysis tools and dynamic analysis tools like Valgrind.
Any libraries I should get from the start?
You've already got the main one which is called the C++ Standard Library. Make sure that you know what it provides.
Boost will cover most of the remaining needs except GUI.
Gtkmm and Qt are two major C++ GUI frameworks.
A really good IDE I'm missing? (YMMV but I prefer to work in IDE's)
Eclipse - for a long time I've been thinking of it as a Java only IDE, but in fact it is an excellent IDE for almost anything (I even wrote my PhD thesis in it using TeXlipse plugin) and C/C++ support is improving all the time. Also CMake can generate Eclipse CDT project files.
Qt Creator - another excellent C++ IDE. It is very fast and has native CMake support.
Any programming boons or traps I should be aware of from the start?
From my experience the most common sources of errors in C++ are pointers and resource management in case of exceptions. Make sure you understand and use the RAII idiom and smart pointers.
For a more complete list of traps and recommendations see the answers to this question.
Some tips besides those which are already mentioned:
Valgrind is your friend for finding memory leaks. You may also use valgrind --tool=callgrind and KCacheGrind to see where your program spends its time during execution. For example:
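A quick sketch of the commands ('yourprogram' stands in for your own binary):

    # hunt for memory leaks
    valgrind --leak-check=full ./yourprogram
    # profile where time is spent, then browse the result in KCacheGrind
    valgrind --tool=callgrind ./yourprogram
    kcachegrind callgrind.out.*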
If you are going to distribute your program, you should learn autotools or cmake. The first is a classical tool, if a bit bloated; the second is more modern.
Geany is a nice IDE if you are looking for something lightweight. Otherwise, take a look at Code::Blocks, Eclipse/CDT and NetBeans.
Since I am not sure what you meant by "std libraries", I should mention that besides the standard C library, there are a lot of POSIX functions and interfaces, which are common to most *nix systems, including Mac OS X.
Eclipse/CDT runs really well on Ubuntu.
Boost provides a whole bunch of libraries that are commonly used and can come in handy. Anyway, I'm not really sure this question fits in too well on a Q&A board.
EDIT: As suggested by Basile, Makefiles and learning to use gdb are great ideas. There are plenty of neat flags to use with gcc also, for helping to debug your code, optimize it, produce assembly instructions, etc.
For your first steps in programming you should not use an IDE, because you will better understand what happens behind the scenes :) GCC or G++ and the stdlib will be sufficient. You should also read about Makefiles, SVN (CVS, Git) and Autotools or CMake to manage your projects. If you want to make GUI applications, you should learn GTK+ or Qt. If you want a real IDE for your needs, try Eclipse with the C/C++ plugins. Good luck :)
If you are familiar with the command line, you can use an editor like vim and gcc/g++ to compile your code; learning make, svn and git is also recommended.
In case you are not familiar with the command line, or you prefer using a UI: NetBeans is also a good IDE you can use to develop in C/C++ and Java.
To install netbeans: open firefox and point to apt://netbeans
I hope this will help you.
I think NetBeans is good. It has the same UI on Microsoft Windows and Linux, a built-in version controller, and Git installed by default.
No extra libraries added (as opposed to Qt).
Library: I recommend using Boost. You can find many libraries in it.
IDE: Eclipse and Qt Creator are good IDEs, but I think it is also very important to be able to use a text editor plus a Makefile. Vim, Emacs and Sublime Text are good choices.
Always remember to backup your code.
It is unclear to newcomers to the ecosystem what is the canonically preferred way to structure and manage building small to medium sized OCaml projects. I understand the basics of ocamlc, &c.--they mirror conventional UNIX C compilers enough to seem straightforward. But, above the level of one-off compilation of individual files, it is unclear how best to manage compilation simply and cleanly. The issue is not searching for potential tools, but seeing one or a few Right (enough) Ways--as validated by the experience of the community--for structuring and building standard OCaml projects.
My model use case is a modest but nontrivial project, of pure OCaml or OCaml plus a C dependency. Such a project:
contains a number of source files
links to a number of standard libraries
links to one or more 3rd party libraries
optionally includes a C library and OCaml wrapper as a subproject (though this could also be managed separately and included as a 3rd party library, as in (3))
Several alternative tools stand out:
Custom Makefiles seem to be the common standard in most open source OCaml packages, but appear frustratingly verbose and complex--even more so than for modest C/C++ projects. Worse, many even seemingly simple OCaml libraries layer autoconf/automake on top for even greater complexity.
ocamlbuild appears to offer a modern, streamlined mechanism for automating builds with minimal configuration, but it is not well documented for newcomers, nor represented by example in the introductory materials in the OCaml ecosystem, nor visibly used by any of the various published OCaml projects which I have browsed for inspiration.
OASIS seems to be a layer of convention and library code atop other build systems to support building a package manager and library, like Cabal.
(I have also seen OMake, which appears to be a self-styled "make++" which also includes a suite of standard rules for common languages, including OCaml, and ocaml-make née OCamlMakefile, providing a template of standard rules for GNU make.)
Are any of these a preferred, modern way of managing OCaml builds?
How are project files best structured?
How are 3rd party library dependencies included and managed? Is it preferred to install them at the system level, or is there a standard and straightforward way of managing them locally to a project? I would much prefer a model in which projects remain as self-contained as possible.
You've got a thorough listing of the options available, but this question will not have a clear answer. My personal recommendation is also to use ocamlbuild. The myocamlbuild.ml file provided here is a good start. It will allow you to easily compile projects that depend on various libraries. I don't think it handles the case of binding to C libraries, but there are additional examples on the wiki that may help.
Some people object to ocamlbuild because it is yet another build tool, complicating package managers' jobs. However, its ease of use and the fact that it is included in the official distribution are making it more and more widely used.
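For a simple project, the invocation can be as short as this (a sketch; main.ml and the unix package stand in for your own sources and dependencies):

    # build a native-code executable from main.ml, pulling the unix package via ocamlfind
    ocamlbuild -use-ocamlfind -pkg unix main.native
    # or build a bytecode executable instead
    ocamlbuild main.byte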
You can also skip all this and use oasis directly. It is very new, and a stable release has not yet been announced, but it is very usable. It will generate the myocamlbuild.ml automatically for you. This is probably the way to go in the very near future, if not already. Furthermore, by using oasis, you will immediately have the benefit of oasis-db, a CPAN like system for OCaml that is under development.
Regarding managing libraries, the answer is ocamlfind. If you have multiple instances of OCaml installed, calling the appropriate copy of ocamlfind will automatically cause all references to libraries to be those for that particular instance, assuming you use ocamlfind systematically for all libraries. I currently use godi to install OCaml and libraries. It uses ocamlfind, and I have no problem having multiple instances of OCaml installed.
Personally, I'd give +1 for ocamlbuild. Its default rules are good enough to compile small to medium projects with one command and no, or very minimal, configuration. It also enforces some very reasonable conventions (not mixing sources with build results). And for larger projects it can be customized to one's desire, with extra rules and plugins. At the company where I work we are using it for a large project (OCaml + some C + some preprocessing + ...) and it works like a charm (and gives us far fewer headaches than Makefiles would).
As for manuals, I'd think the user guide (available from the author's webpage) should be enough to get you started. More funky stuff may require a bit more digging.
+1 for OMake.
We revamped our build infrastructure a few years ago and chose OMake for the following reasons:
our products are composed of a mixture of C, C++, Managed C++, Ruby and OCaml.
we target both Linux and Windows.
we interact with databases at build time.
for some productions we had to use OCaml 3.10.
our original build system used autoconf/automake.
we require out-of-source builds*.
To be honest I don't know if we could have done it with ocamlbuild, I haven't tested it. The tool is in use for sure since there is some activity around it in OCaml's bugtracker. If you opt for ocamlbuild, make sure you have an up-to-date version of OCaml.
*OMake supports out-of-source builds in a bit non-obvious way. It also has some issues when sources are read-only. We had to patch and rebuild our Windows version of OMake.
The current recommendation is to use Dune, a composable build system that supports OCaml and Reason compilation. It's being actively developed.
The Quickstart page gives a variety of project templates to start with.
Dune can also handle the following:
FFI with C and C++
Compile to JavaScript via js_of_ocaml
Cross compilation support via opam-cross
Opam file generation
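To give a feel for the configuration, a minimal dune setup for a single executable might look like this (a sketch; the module name is hypothetical):

    ; dune-project
    (lang dune 2.0)

    ; dune
    (executable
     (name main))

Then dune build compiles the project into _build/, and dune exec ./main.exe runs it.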
Opam is the de-facto package manager for OCaml. Besides finding and installing packages, it can also handle multiple OCaml installations.
Esy is a newer, package.json-driven package manager born from the Reason community. Its pitch is that it brings out-of-the-box project sandboxing, and provides an easy way to pull existing opam packages.
Good question. I would tend to say:
1) ocamlbuild
It is likely to be the standard way to compile, because it is efficient, fast, and the default tool provided by the official distribution. The fact that it is in the official distribution is a good point, because it is more likely to remain supported over time. Furthermore, it is ocamlfind-enabled, so it can manage packages installed with ocamlfind, another standard for installing packages (ocamlfind is a bit like pkg-config for C).
2) But it won't be enough for your project. The integration with C is basic with ocamlbuild. So here I would maybe advise you to use oasis to finally answer your question. I also tried OMake, but did not like it.
3) However, your build scripts are not likely to work straight away if you want other people to be able to download and build your project on their own machines. Furthermore, oasis does not handle pkg-config. For those reasons, I would tend to advise you to use ocaml-autoconf (OCaml macros for autotools), because autotools are the standard for managing C libraries and are well known by package maintainers. It can also handle cross-compilation...
=> ocaml-autoconf with ocamlbuild
When writing an app that one wants to compile on Mac, Linux and Windows, what is the best way of managing the different libraries that will need to be included on the various operating systems? For example, using the GLUT OpenGL toolkit requires different includes on each operating system.
Your question is actually two questions in one:
1) How do I write my C++ code to include the right include files on the right platform?
2) How do I write my Makefile to work on different platforms?
The C++ code question is already answered - find the platform-specific defines and use them to figure out what platform you're on.
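For the GLUT example from the question, a sketch of that approach (these macros are predefined by the major compilers; the include paths are the conventional ones on each platform):

    // _WIN32, __APPLE__ and __linux__ are predefined platform macros
    #if defined(_WIN32)
      #include <windows.h>    // must precede the GL headers on Windows
      #include <GL/glut.h>
    #elif defined(__APPLE__)
      #include <GLUT/glut.h>  // Apple ships GLUT as a framework
    #else
      #include <GL/glut.h>    // Linux and other POSIX systems
    #endif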
Automake or SCons are quite complex, and are worth your time only if you intend to release your code to a wide audience. In the case of in-house code, a "generic" Makefile with per-platform includes is usually sufficient. For Windows, you can get GNU Make for Windows (available from here), or use nmake and limit yourself to the subset of syntax common between all platforms.
If you just need to worry about header files, then the preprocessor will do everything you need. If you want to handle differing source files, and possibly different libraries you'll need a tool to handle it.
Some options include:
The Autotools
SCons
CMake
My personal favorite is CMake. The Autotools use a multi-stage process that's relatively easy to break, and SCons just feels weird to me. CMake will also generate project files for a variety of IDEs, in addition to Makefiles.
There is a good article on macros; one of the answers shows how to use conditional compilation based on the OS/compiler (it's near the top).
The use of autoconfiguration tools is a nice addition on top of this, but it is not needed for small projects, where it may be easier to detect the OS explicitly. For larger projects that may need to run on many different types of OS, you should also explore the available autoconfiguration tools mentioned by Branan.
Several projects I've worked on use an autoconf-based configure script which builds a Makefile, hence the reason you can build all of them from source with a simple:
    ./configure
    make
    make install
SCons has a configuring mechanism that will do a lot of what autotools do without as much complexity, and is pretty darn portable (although not as portable as autotools).
The compiler should have a set of preprocessor symbols it will provide that you can use, for example linux for gcc on a Linux system, or _WIN32 for VC++. If you need something more complex, then look at autoconf, though that works best for Unix-based code.
I'd recommend checking out how some of the larger OpenSource projects handle this. See AutoSense.hpp from (an old release of) Apache Xerces.
If the libraries offer the same API on the different platforms, I would create a "proxy" include file containing all the necessary #ifdefs. That "platform-independent" include file is then included in your client code instead of cluttering it with numerous ugly preprocessor commands; those all stay contained in the (admittedly ugly and cluttered) proxy include.
If the API differs across platforms, you will need to create your own abstraction.
Perhaps this is a cop-out answer, but have you looked at how boost handles this? They build on quite a few platforms without autoconf, although they do have their own build system - bjam - that probably handles some of the same situations. They also do a nice auto-linking trick on windows that automatically selects the right version of libraries for linking depending on the version of the MSVC compiler. Based on your initial description, it sounds like just macro defs checking for various platforms/compilers might do the trick, but perhaps there is more to your problem that would prevent this.