Merits of bmake - build

Apart from the fact that bmake is a BSD equivalent of GNU make, I could not clearly understand its advantages over GNU make. Can anyone help me? I was able to find only one resource that was a bit helpful. More help or pointers are appreciated.

BSD make and GNU make are both free replacements for the original AT&T make. The major difference between them is the syntax for the advanced features. Here is how to put the output of a shell command into a variable in BSD make:
# BSD make
TODAY != date +%Y-%m-%d
And in GNU make:
# GNU make
TODAY = $(shell date +%Y-%m-%d)
As soon as someone writes $(shell ...) in a Makefile, it requires GNU make. Because of the different syntax, some packages require GNU make for the build, and some require BSD make.
BSD make began its life as PMake, short for parallel make. Its author, Adam de Boor, described PMake in PMake -- A Tutorial. Its merit was the new ability to run jobs in parallel, as in make -j 3. This parallel mode broke compatibility by running all commands for each target in a single shell, not in one shell per line. GNU make has a parallel mode, also -j, that keeps one shell per line. NetBSD make(1) now has make -B -j 3 to do parallel mode with one shell per line. OpenBSD make(1) now always does parallel mode with one shell per line.
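To see the compatibility break, consider this made-up rule (recipe lines must be indented with a tab):

target:
	cd /tmp
	pwd

With one shell per line, the cd is forgotten before pwd runs, so pwd prints the directory make was started in. With one shell per target, as in PMake's parallel mode, the cd persists and pwd prints /tmp.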
4.3BSD-Reno included PMake as make and the bsd.*.mk include files. These include files are the best feature of BSD make. src/bin/sed/Makefile in 4.3BSD-Tahoe (the release before 4.3BSD-Reno) defines several targets like clean, depend, install, and so on. src/usr.bin/sed/Makefile in 4.3BSD-Reno has only four non-empty lines:
#	@(#)Makefile	4.6 (Berkeley) 5/11/90
PROG= sed
SRCS= sed0.c sed1.c
.include <bsd.prog.mk>
Here bsd.prog.mk automatically sets OBJS to sed0.o sed1.o, defines a sed target to link sed from those objects, defines other targets like clean, depend, install, and causes make install to install both sed and its manual page sed.1. There is also bsd.lib.mk for building libraries.
When using bsd.*.mk, each Makefile can build only one program or library. To build another one, there must be a second Makefile in another directory. So src/usr.sbin/smtpd/ in OpenBSD has six subdirectories, where each subdirectory only contains a Makefile, because smtpd builds six programs.
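The parent directory then typically holds a Makefile that just lists the subdirectories and recurses into them via bsd.subdir.mk. A hedged sketch, with illustrative directory names:

SUBDIR= smtpd smtpctl makemap
.include <bsd.subdir.mk>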
It is also rare to use bsd.*.mk to build anything except BSD itself. Many portable packages from BSD developers, like OpenSSH (from OpenBSD) or mksh (from MirBSD), do not require BSD make and do not use bsd.*.mk files.
The file bsd.port.mk is at the center of FreeBSD Ports, the system that builds software packages for FreeBSD. (NetBSD pkgsrc calls this file bsd.pkg.mk.) This system has rivals in other scripting languages. Homebrew uses Ruby. MacPorts uses Tcl.
In the past, GNU make was more portable than BSD make. Because BSD make was part of BSD, it was rare to find it on other systems. Now there is portable bmake, a portable version of NetBSD make for other systems. The most common use for portable bmake is to run pkgsrc on non-NetBSD systems. I run pkgsrc on OS X, with bmake bootstrapped by pkgsrc.

I am writing BSD Owl, a portable build system based on BSD Make. I started to write Makefiles in 2000 with GNU Make and quickly switched to BSD Make. Let me outline the reasons which guided my choice.
Documentation and available literature
This is the main point, really. While the GNU project has provided awesome and important pieces of software to the free software world, its documentation does not cleanly distinguish between a user manual and a reference manual, a consistent characteristic of GNU documentation. As a result, the GNU Make documentation is a huge document (well over 200 pages) interleaving a careful description of each aspect of the system with toy examples. When I used it, I never found these examples useful, because they are nowhere near actual use cases, and I always needed several minutes to locate or relocate information I had to look up. The corpus of literature available for GNU Make is really huge; however, it is very hard to find interesting examples to guide oneself in writing a comprehensive portable build system like BSD Owl. Maybe it is because projects rely on automake instead of writing their Makefiles directly. Maybe it is because I did not search hard enough.
The state of things was much better on the FreeBSD side. Indeed,
There is a canonical tutorial or user manual by Adam de Boor.
The manual page serves well as a reference manual and is well under 30 pages.
The FreeBSD build system and the FreeBSD ports system are a corpus of hand-written makefiles.
Of course the manual page of BSD Make is not as detailed as the manual of GNU Make, but it is much denser and easier to use. The FreeBSD build systems are at the same time (a) a proof of concept that BSD Make can orchestrate the build of large, complex projects and (b) a showcase of advanced techniques in the art of Makefile writing.
Programming features
A funny thing is that the programming features of GNU Make seem much more powerful than what BSD Make provides. Nevertheless, I found them incredibly hard to use, mostly because of expansion order, and because of the lack of useful examples. BSD Make has far fewer constructs and functions, but they are very solid and easy to use: most of the code I need to write consists of conditionals and for-loops to define lists of targets, and BSD Make does this very easily and reliably.
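As a hedged sketch of what this looks like (all names are made up), here is a conditional plus a .for loop that defines one link target per program:

PROGRAMS= client server

.if defined(DEBUG)
CFLAGS+= -g -O0
.endif

all: ${PROGRAMS}

.for p in ${PROGRAMS}
${p}: ${p}.c
	${CC} ${CFLAGS} -o ${p} ${p}.c
.endfor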
Advanced features
BSD Make has an advanced feature that, as far as I know, GNU Make does not have: meta mode. The most difficult part of writing Makefiles is correctly specifying the prerequisite lists in a package. Some software, like mkdep(1), tries to automate the process by analysing sources. BSD Make takes a different approach: we write a Makefile with possibly buggy (incomplete) dependencies, but precise enough to successfully build the package. During the build, BSD Make monitors I/O using filemon(4) to determine the actual list of prerequisites.
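A minimal sketch of switching it on, assuming a make built with filemon(4) support (the target is made up; .MAKE.MODE is described in the bmake manual):

.MAKE.MODE= meta

prog: prog.c
	${CC} -o prog prog.c

After the build, make leaves a prog.meta file recording every file the commands actually read or wrote, and uses it on later runs to catch prerequisites missing from the Makefile.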

BSD make can generate targets based on conditionals or within loops, which comes in quite handy sometimes.

Whenever comparing GNU utilities to BSD utilities, keep in mind that most people making comparisons will treat licensing as a feature or a bug in and of itself -- and that something like half the developers making a comparison favor a permissive license and half favor a GPL-like viral license. This is not to say that there aren't real differences (sometimes there are, but often it's a wash).
bmake isn't quite as bloated as GNU make, and both are compatible with most hand-written makefiles (and if you generate makefiles with autotools, autotools is capable of generating makefiles compatible with either or both).
The only clear benefit to bmake, so far as I can tell, is that using it will please people who are pathologically afraid of the GPL. But bmake's author claims that it is the result of trying to patch pmake (the normal BSD make) to accept the output of the autotools. Given the trouble I've had trying to get it to understand autotools-generated makefiles, I am suspicious of how successful this was. Other people make other claims, some of which aren't meaningful.
In the end, if you stick to the standard elements of makefiles and don't stray too far from POSIX, it shouldn't matter too much which make you use.

I'm not using it, but my understanding is that BSD make is a make program plus a standard library of templates (i.e. more or less the equivalent of GNU make + automake and perhaps autoconf).

Related

Better way to provide paths of libraries while compiling in C++

I'm pretty new to C++. I was wondering what is generally considered a neat way to provide paths for various files/libraries while compiling or executing C++ code.
For example:
I have the Boost libraries installed in some location on my system. Let's call it X.
In order to build anything I have to type:
c++ -I LongpathWhichisX/to/boost_1_60_0 example.cpp -o example
Similarly, there is a long path for the input file while executing the code.
Is there a better way to address this? Is it possible to create an environment variable, say Y, which refers to path X, so that we can use the following command to compile the code:
c++ -I Y/to/boost_1_60_0 example.cpp -o example
The best way is to use a build tool, for example Make. You define all your include paths (and other options) in the Makefile. In the console you then just call make to build your project, or something like make run to run it.
The usual way is to write a Makefile where you specify all needed paths and compile options in proper variables.
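A minimal sketch of such a Makefile for the Boost example above (BOOST_ROOT is a hypothetical variable standing in for the long path X):

BOOST_ROOT= /LongpathWhichisX/to/boost_1_60_0
CXXFLAGS= -I$(BOOST_ROOT)

example: example.cpp
	c++ $(CXXFLAGS) example.cpp -o example

run: example
	./example

With this in place, make builds the program and make run executes it.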
If you don't want or need a Makefile and would rather run the compiler from the command line, then you may use the CPATH environment variable to specify a colon-separated list of paths to include files.
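For example, assuming the same hypothetical Boost path as above:

export CPATH=/LongpathWhichisX/to/boost_1_60_0
c++ example.cpp -o example

The compiler now searches the directories in CPATH as if each had been passed with -I.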
This is a broad question, but the other answers highlight the most important step. It is essential to learn build tools like make, because they make it easier to build your projects during development and for others to build them later. In the modern programming age, though, this is not enough. If you are using something like Boost (which targets many platforms), you will probably want to make your build cross-platform as well. For this you would use either cmake or the autotools, both of which have scripts that make it much easier to locate the Boost libraries (and others).
Any other build systems are, in my opinion, a pain and the bane of maintainers of Linux distributions. CMake used to be in that category, but it has gained wide acceptance now. CMake targets building cross-platform projects across operating systems (Windows and Unixes) better (again, in my opinion) because it attempts to provide the native build system on each platform (for example: Visual Studio on Windows, Make on all Unices, Xcode on Mac). The autotools instead target the Unix environment with much greater depth (you have a bit of a harder time on Windows, but you can target anything from embedded Unix systems to high-end Unix servers with much more flexibility).
Note: autotools support for cross-compiling is superior in almost every way to the other solutions. I always cringe when I download something that needs to be cross-compiled for ARM Linux and it uses some weird build system. Funnily enough, Boost is one of these.
This is a bit of a long-winded answer. In summary, it is essential that you learn a build system for native development. It is part of your skill set, and until you have that skill you can't really contribute to open-source projects, or even help your employer develop closed-source projects.

What's the common way to resolve dependencies in Makefile?

I've seen lots of methods used to resolve dependencies in Makefiles, such as using gcc -MM and the sed command, or using the include directive (plus a little Perl magic), or qmake, or automake, or info make, etc.
Facing so many options, I am confused about which I should choose. So I want to know: what's the common way to resolve dependencies in Makefiles nowadays? What's the best way to cope with this problem?
PS: C/CPP project.
Generally, if all you care about is systems that support GNU make and gcc (such as all Linux variants and most Unix-like systems these days), you just use gcc's various -M flags to generate the dependencies and then -include the results in your Makefile. There's some good info in this question -- generally there's no need to use sed or any more complex tools.
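A hedged sketch of that pattern (file names are made up):

SRCS= main.c util.c
OBJS= $(SRCS:.c=.o)

prog: $(OBJS)
	$(CC) -o $@ $(OBJS)

# -MMD writes a .d fragment per object listing the headers it included;
# -MP adds phony targets so a deleted header does not break the build.
%.o: %.c
	$(CC) -MMD -MP -c $< -o $@

# Pull the fragments in; the leading '-' ignores ones not yet generated.
-include $(SRCS:.c=.d)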
If you only need to support lots of Linux distributions (as you noted in a comment), then I'd recommend the automake/autoconf suite.
This answer assumes you are only asking generally and you do not yet know what specific issues you will have to resolve as you go.
Edit:
GNU make alone can handle dependency generation within your own project.
autoconf handles optional or alternative dependencies on third-party libraries, tools, or system features. automake provides macros, some of which are occasionally useful even if you are otherwise using autoconf without automake.
A side benefit of starting with automake outright is that your makefiles will behave completely predictably (in terms of conventions and portability) with less investment of attention.
Hence my humble recommendation.
There are several ways to generate make-compatible dependencies for C/C++ projects:
gcc -M, which comes in several flavors and is sort of "the gold standard" in terms of accuracy, since it uses the actual compiler to generate the dependencies, and who would know better how to process #include statements than the compiler itself?
makedepend, which is generally discouraged in favor of compiler-generated dependencies.
fastdep, another third-party dependency generator which purports to be faster than gcc -M.
ElectricAccelerator has a built-in feature called autodep which uses the filesystem usage activity of the commands invoked in the build to generate dependency information. The advantage of autodep over the alternatives is that it is extremely fast and completely tool and programming language independent -- while the others are all tied to C/C++ or require the use of a specific compiler, autodep works with all types of build tools.
I did a performance comparison of several of these options a while back.
Disclaimer: I am the architect and lead developer of ElectricAccelerator.

What is the preferred way to structure and build OCaml projects?

It is unclear to newcomers to the ecosystem what is the canonically preferred way to structure and manage building small to medium sized OCaml projects. I understand the basics of ocamlc, &c.--they mirror conventional UNIX C compilers enough to seem straightforward. But, above the level of one-off compilation of individual files, it is unclear how best to manage compilation simply and cleanly. The issue is not searching for potential tools, but seeing one or a few Right (enough) Ways--as validated by the experience of the community--for structuring and building standard OCaml projects.
My model use case is a modest but nontrivial project, of pure OCaml or OCaml plus a C dependency. Such a project:
contains a number of source files
links to a number of standard libraries
links to one or more 3rd party libraries
optionally includes a C library and OCaml wrapper as a subproject (though this could also be managed separately and included as a 3rd party library, as in (3))
Several alternative tools stand out:
Custom Makefiles seem to be the common standard in most open source OCaml packages, but appear frustratingly verbose and complex--even more so than for modest C/C++ projects. Worse, many even seemingly simple OCaml libraries layer autoconf/automake on top for even greater complexity.
ocamlbuild appears to offer a modern, streamlined mechanism for automating builds with minimal configuration, but it is not well documented for newcomers, nor represented by example in the introductory materials in the OCaml ecosystem, nor visibly used by any of the various published OCaml projects which I have browsed for inspiration.
OASIS seems to be a layer of convention and library code atop other build systems to support building a package manager and library, like Cabal.
(I have also seen OMake, which appears to be a self-styled "make++" which also includes a suite of standard rules for common languages, including OCaml, and ocaml-make née OCamlMakefile, providing a template of standard rules for GNU make.)
Are any of these a preferred, modern way of managing OCaml builds?
How are project files best structured?
How are 3rd party library dependencies included and managed? Is it preferred to install them at the system level, or is there a standard and straightforward way of managing them locally to a project? I would much prefer a model in which projects remain as self-contained as possible.
You've got a thorough listing of the options available, but this question will not have a clear answer. My personal recommendation is also to use ocamlbuild. The myocamlbuild.ml file provided here is a good start. It will allow you to easily compile projects that depend on various libraries. I don't think it handles the case of binding to C libraries, but there are additional examples on the wiki that may help.
Some people object to ocamlbuild because it is yet another build tool, complicating package managers' jobs. However, its ease of use and the fact that it is included in the official distribution are making it more and more widely used.
You can also skip all this and use oasis directly. It is very new, and a stable release has not yet been announced, but it is very usable. It will generate the myocamlbuild.ml automatically for you. This is probably the way to go in the very near future, if not already. Furthermore, by using oasis, you will immediately have the benefit of oasis-db, a CPAN like system for OCaml that is under development.
Regarding managing libraries, the answer is ocamlfind. If you have multiple instances of OCaml installed, calling the appropriate copy of ocamlfind will automatically cause all references to libraries to be those for that particular instance, assuming you use ocamlfind systematically for all libraries. I currently use godi to install OCaml and libraries. It uses ocamlfind, and I have no problem having multiple instances of OCaml installed.
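For example, a hedged one-liner (package and file names made up) that compiles a program against a findlib package:

ocamlfind ocamlopt -package str -linkpkg main.ml -o main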
Personally, I'd give +1 for ocamlbuild. Its default rules are good enough to compile small to medium projects with one command and little to no configuration. It also enforces some very reasonable conventions (not mixing sources with build results). And for larger projects it can be customized to one's desire, with extra rules and plugins. At the company where I work we are using it for a large project (OCaml + some C + some preprocessing + ...) and it works like a charm (and gives us far fewer headaches than Makefiles would).
As for manuals, I'd think the user guide (available from the author's webpage) should be enough to get you started. More funky stuff may require a bit more digging.
+1 for OMake.
We revamped our build infrastructure a few years ago and chose OMake for the following reasons:
our products are composed of a mixture of C, C++, Managed C++, Ruby and OCaml.
we target both Linux and Windows.
we interact with databases at build time.
for some products we had to use OCaml 3.10.
our original build system used autoconf/automake.
we require out-of-source builds*.
To be honest, I don't know if we could have done it with ocamlbuild; I haven't tested it. The tool is in use for sure, since there is some activity around it in OCaml's bug tracker. If you opt for ocamlbuild, make sure you have an up-to-date version of OCaml.
*OMake supports out-of-source builds in a somewhat non-obvious way. It also has some issues when sources are read-only. We had to patch and rebuild our Windows version of OMake.
The current recommendation is to use Dune, a composable build system that supports OCaml and Reason compilation. It's being actively developed.
The Quickstart page gives a variety of project templates to start with.
Dune can also handle the following (a minimal dune file sketch follows the list):
FFI with C and C++
Compile to JavaScript via js_of_ocaml
Cross compilation support via opam-cross
Opam file generation
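As a flavor of the configuration, here is a hedged sketch of a minimal dune file for one executable (names are hypothetical; it sits next to a dune-project file declaring the dune language version):

(executable
 (name main))

Then dune build ./main.exe compiles it, with all build artifacts kept in the _build directory.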
Opam is the de facto package manager for OCaml. Besides finding and installing packages, it can also handle multiple OCaml installations.
Esy is a newer, package.json-driven package manager born from the Reason community. Its pitch is that it brings out-of-the-box project sandboxing and provides an easy way to pull in existing opam packages.
Good question. I would tend to say:
1) ocamlbuild
It is likely to be the standard way to compile, because it is efficient, fast, and the default tool shipped with the official distribution. The fact that it is in the official distribution is a good point, because it is more likely to remain over time. Furthermore, it is ocamlfind-enabled, so it can manage packages installed with ocamlfind, another standard for installing packages (ocamlfind is a bit like pkg-config for C).
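A typical hedged invocation (module and package names made up):

ocamlbuild -use-ocamlfind -pkg str main.native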
2) But it won't be enough for your project. The integration with C is basic in ocamlbuild. So here I would maybe advise you to use oasis, to finally answer your question. I also tried OMake, but did not like it.
3) However, your build scripts are not likely to work straight away if you want other people to be able to download and build your project on their own machines. Furthermore, oasis does not handle pkg-config. For those reasons, I would tend to advise you to use ocaml-autoconf (OCaml macros for the autotools), because the autotools are the standard for managing C libraries and are well known by package maintainers. They can also handle cross-compilation...
=> ocaml-autoconf with ocamlbuild

What GNU make substitute do you recommend? [closed]

Imagine you're free to choose a tool like GNU make for a new C++ project. What would you choose? Are any usable substitutes out there?
It shall have/be
a command line interface
"easy" to understand
easy to set up for a default c++ project
may support src/bin separation as is common for Java
may not add too much dependencies to other software/libs
platform independent (new)
features:
build rules / templates like make, but in a human-readable way
recursively crawling directories and applying the rules if there is no other "Makefile"
configuration by exception
Note:
Nothing's wrong with GNU make. I just don't like its grammar, all the stuff that has grown in it over the years, and the silly recursive-make problems. I've been using gmake for years now.
I use cmake, and I'm very glad I made the switch.
EDIT
Feature list as found in the Wikipedia article:
Configuration files are CMake scripts, which use a programming language specialized to software builds
Automatic dependency analysis built-in for C, C++, Fortran and Java
Support of SWIG, Qt, via the CMake scripting language
Built-in support for many versions of Microsoft Visual Studio including versions 6, 7, 7.1, 8.0, and 9.0
Generates build files for Eclipse CDT (C/C++ Development Tools)
Detection of file content changes using traditional timestamps
Support for parallel builds
Cross-compilation
Global view of all dependencies, using CMake to output a graphviz diagram
Support for cross-platform builds, and known to work on
Linux and other POSIX systems (including AIX, *BSD systems, HP-UX, IRIX/SGI, and Solaris)
Mac OS X
Windows 95/98/NT/2000/XP, Windows Vista and MinGW/MSYS
Integrated with DART, CDash, CTest and CPack, a collection of tools for software testing and release
But to be honest: Just about anything is better than the gnu toolchain.
SCons and waf are good (well, IMHO anyway) build systems, and depend only on Python.
How about "gnu make"?
You asked for something like it without giving any indication of what features you want that aren't supported by gnu make.
Boost.Jam
It has the features you named
command line interface;
easy;
it comes from the C++ library collection Boost, so it has good support for C++ (and it's not limited to C++, either);
it stores executables in places under the bin directory, depending on what build request you've commanded. If you use gcc 4.3.2, then you get the executables under
bin/gcc-4.3.2/debug -- when executing bjam
bin/gcc-4.3.2/release -- when executing bjam release
bin/gcc-4.3.2/release/inlining-off/threading-multi -- when executing bjam release inlining=off threading=multi
bin/gcc-4.3.2/release/threading-multi -- for bjam release inlining=full threading=multi, because inlining=full is the default for release.
it doesn't need the full Boost library collection, only Boost.Build and Boost.Jam are necessary;
platform independent;
the Jamfile syntax is easy, but powerful (see the sketch after this list);
you can divide the build config into many Jamfiles in subdirectories.
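As a taste of the syntax, here is a hedged sketch of a minimal Jamfile building one executable (names made up):

exe example : example.cpp ;

Running bjam then places the binary under the bin/<toolset>/<variant> directories described above.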
CMake should answer most, if not all, of your requirements.
It will generate the Makefiles for you.
It has good domain-specific primitives, plus a simple language for the times you need to do something special.
It solves most of the problems with recursive make (see the "Recursive Make Considered Harmful" paper).
It uses out-of-source builds, so you have your bin/src separation.
I found it easy to write, easy to maintain, and fast to build.
... Plus:
It is cross platform.
With CTest and CDash it has what you need to set up a continuous integration service.
See also this answer to Recursive Make - friend or foe?
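A hedged sketch of a minimal CMakeLists.txt and an out-of-source build (names made up):

cmake_minimum_required(VERSION 3.10)
project(example CXX)
add_executable(example example.cpp)

mkdir build && cd build
cmake .. && make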
SCons + swtoolkit
G'day,
I'd agree with the couple of answers, so far, that recommend sticking with gmake.
Maybe have another look after reading the first few chapters of Robert Mecklenburg's excellent book "Managing Projects with GNU Make" (sanitised Amazon link).
Or, even better, search out a copy of the previous edition called "Managing Projects With make" by Andrew Oram and Steve Talbott. The first few chapters give an excellent description of (g)make and [Mm]akefile basics.
And I see you can buy a second hand copy of the 2nd ed. from Amazon for the princely sum of $0.01! (sanitised Amazon link)
After reading that intro you'll even understand how make is backward chaining, which is a bit non-intuitive when just looking at make's behaviour.
HTH
cheers,
Autotools -- autoconf/automake/libtool -- are very powerful build instruments.
It takes some time to get started with them, but afterwards they serve you very well.
And, what is more important, they are significantly more powerful than their replacements (CMake, BJam, SCons, etc.).
How are they more powerful?
Transparent support for building both static and shared libraries via libtool.
Support for uninstall (not in CMake).
Support for standard installation paths -- very important in packaging.
Full support and integration of gettext.
Transparent and straightforward support for cross-compilation.
Many things. CMake can do most of these, but for each one you have to write long scripts or specify many things manually.
What's wrong with gmake?
What issues does it have that make you want to replace it? There's no point in recommending another tool if it has the same perceived issues as gmake.
We're using gmake in our build system and we're extremely happy with its performance, flexibility and features.

Include only certain libraries on an operating system

When writing an app that one wants to compile on Mac, Linux, and Windows, what is the best way of managing the different libraries that will need to be included on the various operating systems? For example, using the GLUT OpenGL toolkit requires different includes on each operating system.
Your question is actually two questions in one:
1) How do I write my C++ code to include the right include files on the right platform?
2) How do I write my Makefile to work on different platforms?
The C++ code question is already answered: find the platform-specific defines and use them to figure out what platform you're on.
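A hedged sketch for the GLUT case from the question, using the compilers' predefined platform macros:

// Pick the right GLUT header per platform.
#if defined(__APPLE__)
#include <GLUT/glut.h>      // macOS ships GLUT as a framework
#elif defined(_WIN32)
#include <windows.h>        // must precede the GL headers on Windows
#include <GL/glut.h>
#else
#include <GL/glut.h>        // Linux and other Unix-like systems
#endif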
Automake and SCons are quite complex, and are worth your time only if you intend to release your code to a wide audience. In the case of in-house code, a "generic" makefile with per-platform includes is usually sufficient. For Windows, you can get GNU Make for Windows, or use nmake and limit yourself to the subset of syntax common to all platforms.
If you just need to worry about header files, then the preprocessor will do everything you need. If you want to handle differing source files, and possibly different libraries you'll need a tool to handle it.
Some options include:
The Autotools
Scons
CMake
My personal favorite is CMake. The Autotools use a multi-stage process that's relatively easy to break, and SCons just feels weird to me. CMake will also generate project files for a variety of IDEs, in addition to makefiles.
There is a good article on macros. One of the answers shows how to use conditional compilation based on OS/compiler (it's near the top).
The use of the autoconfiguration tools is a nice addition on top of this, but it is not needed for small projects, where it may be easier to detect the OS explicitly. For larger projects that may need to run on many different types of OS, you should also explore the available autoconfiguration tools mentioned by Branan.
Several projects I've worked on use an autoconf-based configure script which builds a Makefile, hence the reason you can build all of them from source with a simple:
./configure
make
make install
SCons has a configuring mechanism that will do a lot of what the autotools do without as much complexity, and is pretty darn portable (although not as portable as the autotools).
The compiler should have a set of preprocessor symbols it provides that you can use, for example linux for gcc on a Linux system, or _WIN32 for VC++. If you need something more complex, then look at autoconf, but that works best for Unix-based code.
I'd recommend checking out how some of the larger OpenSource projects handle this. See AutoSense.hpp from (an old release of) Apache Xerces.
If the libraries offer the same API on the different platforms, I would create a "proxy" include file containing all the necessary #ifdefs. That platform-independent include file is then included in your client code instead of cluttering the code with numerous ugly preprocessor commands; those stay confined to the single proxy include.
If the API differs across platforms, you will need to create your own abstraction.
Perhaps this is a cop-out answer, but have you looked at how Boost handles this? They build on quite a few platforms without autoconf, although they do have their own build system, bjam, which probably handles some of the same situations. They also do a nice auto-linking trick on Windows that automatically selects the right version of the libraries for linking depending on the version of the MSVC compiler. Based on your initial description, it sounds like just macro defs checking for various platforms/compilers might do the trick, but perhaps there is more to your problem that would prevent this.