Does "make" know how to search sub-dirs for include files? - c++

This is a question for experienced C/C++ developers.
I have zero knowledge of compiling C programs with "make", and I need to modify an existing application, i.e. change its "config" and "makefile" files.
The .h files that the application needs are not located in a single-level directory, but rather, they are spread in multiple sub-directories.
In order for cc to find all the required include files, can I just add a single "-I" switch pointing cc at the top-level directory and expect it to search all sub-dirs recursively, or must I add several "-I" switches to list all the sub-dirs explicitly, e.g. -I/usr/src/myapp/includes/1 -I/usr/src/myapp/includes/2, etc.?
Thank you.

This question appears to be about the C compiler driver rather than make. Assuming you are using GCC, you need to list each directory you want searched:
gcc -I/foo -I/foo/bar myprog.c

This is actually a compiler switch, unrelated to make itself.
The compiler will search for include files in the directories you give with the -I switch and in the built-in system directories, but it does not traverse sub-directories automatically.
For example, if you have
#include "my/path/to/file.h"
and you give -I a/directory as a parameter, the compiler will look for a/directory/my/path/to/file.h.
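If you have many sub-directories, one common workaround with GNU make is to let make generate the list of -I switches for you. A minimal sketch, assuming GNU make and a shell with find; the include-root path here is purely illustrative:

# Turn every directory under the include root into a -I switch.
INCLUDE_ROOT := /usr/src/myapp/includes
INCLUDE_DIRS := $(shell find $(INCLUDE_ROOT) -type d)
CPPFLAGS     += $(addprefix -I,$(INCLUDE_DIRS))

myprog: myprog.c
	$(CC) $(CPPFLAGS) $(CFLAGS) -o $@ $<

Note that this makes every sub-directory a search root, which can hide header-name collisions; listing the directories explicitly is safer.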

If the makefiles are written in the usual way, the line that invokes the compiler will use a couple of variables that allow you to customize the details, e.g. not
gcc (...)
but
$(CC) $(CFLAGS) (...)
and if this is the case, and you're lucky, you don't even need to edit any of the makefiles; instead you can invoke make like this
make CFLAGS='-I /absolute-path/to/wherever'
to incorporate your special options into the compiler invocation.
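For reference, a conventionally written generic rule looks something like this (a minimal sketch; CC and CFLAGS are the usual variable names, but your makefiles may differ):

CC     ?= cc
CFLAGS ?= -O2

%.o: %.c
	$(CC) $(CPPFLAGS) $(CFLAGS) -c $< -o $@

Because the recipe expands $(CFLAGS) when it runs, a value given on the make command line wins; just be aware that it replaces the makefile's own CFLAGS entirely, so include any flags the build already needs.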
Also check whether the makefiles are generated by something else, usually a script in the top directory called configure, which will have options of its own to control what goes into them.

Everyone answered your question correctly, but here is something to consider when you get to set up your own source tree: a leaf node should only look in two places for headers, its own directory or up the tree. Once people start reaching across to peers and down the tree, the build system gets gnarly; what also happens is that people start using private interfaces when they should be using public ones.

Related

Bazel: Compile a single file without linking

Question
In ninja, I can compile a single C++ file by running ninja path/to/my/object.file.o.
Is there a way to achieve the same in bazel?
Use case / Background
During refactoring, in particular when changing interfaces in .hpp files, I usually want to focus on one single complex user of the interface first. I want to iterate on that one user until my refactoring works as expected on complex_user.cpp and I am happy with the new interfaces. Only afterwards do I want to adjust all the other users. I therefore want to get the compiler errors/warnings only for my complex_user.cpp file, while ignoring all other places where the .hpp files might be included.
Try --save_temps. bazel build --save_temps //my:library will give you the .o, .s, and similar files for only the targets listed on the command line.
--compile_one_dependency is designed for a similar use case, if you want to specify the target to build by the .cpp file instead of specifying a particular cc_library.
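For example, something along these lines should build just the target that contains the file (the path is illustrative):

bazel build --compile_one_dependency path/to/complex_user.cpp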
You need to implement a custom rule, e.g. cc_object_file. Since the Bazel cc_rules are open source, you can use them as a starting point.

Use autotools installation prefix

I am writing a C++ program using gtkmm as the window library and autotools as my build system. In my Makefile.am, I install the icon as follows:
icondir = $(datadir)/icons/hicolor/scalable/apps
icon_DATA = $(top_srcdir)/appname.svg
EDIT: changed from prefix to datadir
This results in appname.svg being copied to $(datadir)/icons/hicolor/scalable/apps when the program is installed. In my C++ code, I would like to access the icon at runtime for a window decoration:
string iconPath = DATADIR + "/icons/hicolor/scalable/apps/appname.svg";
// do stuff with the icon
I am unsure how to go about obtaining DATADIR for this purpose. I could use relative paths, but then moving the binary would break the icon, which feels like a hack. I figure that there should be a special way to handle icons, separate from general data, since people can install 3rd-party icon packs. So, I have two questions:
What is the standard way of installing and using icons with autotools/C++/gtkmm?
Edit: gtkmm has an IconTheme class that is the standard way to use icons in gtkmm. It appears that I call add_resource_path() (for which I still need the installation prefix), and then I can use the library to obtain the icon by name.
What is the general method with autotools/C++ to access the autotools installation prefix?
To convey data determined by configure to your source files, the primary methods available are to write them in a header that your sources #include or to define them as macros on the compiler command line. These are handled most conveniently via the AC_DEFINE Autoconf macro. Under some circumstances, you might also consider converting source files to templates for configure to process, but except inasmuch as Autoconf itself uses an internal version of that technique to build config.h (when that is requested), I wouldn't normally recommend it.
HOWEVER, the installation prefix and other installation directories are special cases. They are not finally set until you actually run make. Even if you set them via configure's command-line options, you can still override that by specifying different values on the make command line. Thus, it is not safe to rely on AC_DEFINE for this particular purpose, and in fact, doing so may not work at all (it will not work for prefix itself).
Instead, you should specify the appropriate macro definition in a command-line option that is evaluated at make time. You can do this for all targets being built by setting the AM_CPPFLAGS variable in your Makefile.am files, as demonstrated in another answer. That particular example sets the specified symbol to be a macro that expands to a C string literal containing the prefix. Alternatively, you could consider defining the whole icon directory as a symbol. If you need it only for one target out of several then you might prefer setting the appropriate onetarget_CPPFLAGS variable.
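For instance, a minimal sketch of defining the whole icon directory as a symbol (the name APPICONDIR is chosen here just for illustration):

# Makefile.am: evaluated at make time, so a datadir overridden on the
# make command line is still honored.
AM_CPPFLAGS = -DAPPICONDIR='"$(datadir)/icons/hicolor/scalable/apps"'

In the C++ code, APPICONDIR then expands to a string literal naming the installed icon directory.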
As an aside, do note that $(prefix)/icons/hicolor/scalable/apps is a nonstandard choice for the installation directory for your icon. That will typically resolve to something like /usr/local/icons/hicolor/scalable/apps. The conventional choice would be $(datadir)/icons/hicolor/scalable/apps, which will resolve to something like /usr/local/share/icons/hicolor/scalable/apps.
In your Makefile.am, use the following
AM_CPPFLAGS = -DPREFIX='"$(prefix)"'
See Defining Directories in autoconf's manual.

How to create an EDE project for C++

I have been trying to set up an EDE project for C++ (emacs24 + builtin CEDET) and I'm starting to get desperate because I can't seem to find the way I want the makefiles to be generated. I'm relatively new to Emacs.
I'll try to describe what I'm doing:
I have a toy project set like so:
main.cpp
other/
Utils.cpp
Utils.h
CGrabBuffer.cpp
CGrabBuffer.h
main.cpp includes both .h's inside the "other/" directory. These are the steps I follow to set up an EDE project with this simple directory setup:
Open main.cpp in emacs and do M-x ede-new ; type: Make ; name: main-proj.
Open one of the files in the "other" directory and do M-x ede-new ; type: Make ; name: aux-proj.
Now it's time to create the targets (which I believe are three in this case):
On the main.cpp buffer: M-x ede-new-target ; name: main ; type: program. When prompted, I add the main.cpp to this target.
I repeat the same for the other two targets (Utils which has Utils.cpp and Utils.h and CGrabBuffer which has CGrabBuffer.cpp and CGrabBuffer.h). Here I find the first problem. What type do these two targets have to be? I only want them to generate .o files.
Once this is done, I type M-x ede-customize-current-target to all three targets and I add some include paths, some libraries, etc.
After this, if I call M-x ede-compile-project it doesn't compile because:
It tries to compile main.cpp first; I have no idea how to specify (using EDE) that both Utils.o and CGrabBuffer.o are needed before attempting to build main.cpp.
If I manually change the order (editing the Makefile), it's not able to link main.cpp because it can't find Utils.o and CGrabBuffer.o.
As you can see, I am in the middle of a great mess. Maybe I'm not even understanding what "target" means in EDE. I have also read about the existence of ede-cpp-root-project which has to be specified inside the .emacs file. I haven't tried it because what I think it does is just help with the semantics. It doesn't generate Makefiles, does it? Can I have (or do I need) an EDE project built with Project.el's and the same thing using ede-cpp-root-project for the semantics? Or is it redundant?
Sorry If I misunderstood a lot of things but I'm very confused and being new to emacs makes things worse. Thanks for your patience!
EDIT: with some tinkering and the responses I received I have been able to figure out a lot of stuff, so thanks a lot. What I still don't understand is the use of the ede-cpp-root-project which has to be specified inside the .emacs file. Is it just for c++ semantics? Is it redundant to have the project with Project.el's AND also the elisp lines in .emacs?
EDE is designed to handle many different kinds of projects, usually of a type where the build system was written outside of Emacs in some other tool.
The EDE project type that creates Makefiles for you can do quite a few things, but you need to have some basic understanding of build systems for it to be helpful, and you really do need to customize the projects to get anything of any complexity working.
I've recently added a section to the EDE manual to help with basic project setups that autogenerate Automake files. You can check out the tutorial here:
http://www.randomsample.de/cedetdocs/ede/ede/Quick-Start.html
The same steps will apply for projects that just use Make instead, but Make based projects often have trouble with shared libraries due to the extra complexity.
Mike's answer is quite good, but I think it is ok to just add .h files to the same target as your .cpp sources. It will keep track of them separately.
Another useful trick is to use the whole project compile keystroke (C-c . C) which uses a capital C whenever you change something big. That will regenerate the Makefiles, rerun any needed Automake features, and start at the top.
EDIT: You only need one EDE project for a given project area. The ede-cpp-root project is useful when no other automatic project type works. That's when you create it in your .emacs file, so that the other tools that need a project definition, such as semantic's smart completion and tag lookup, will work.
Well, I think I actually have it figured out this time, but it's ugly. Utils.cpp and CGrabBuffer.cpp should not get their own individual targets, because there doesn't seem to be an appropriate target type. Instead, you'll need to create an archive or library, which will automatically compile Utils.cpp and CGrabBuffer.cpp for you. Below, I'll assume you want static, but it's easy to change.
[For anyone to whom archives or libraries are not familiar, they basically just gather up .o files into a separate unit. It doesn't actually make the compilation harder. Read more here.]
1) Follow the first two and a half steps above (including making the main target, but not the other targets).
2) Switch to Utils.cpp and do M-x ede-new-target ; name: aux ; type: archive. When prompted, add Utils.cpp to this target.
3) Switch to CGrabBuffer.cpp and do C-c . a ; Target: aux .
4) Regenerate the Makefile with M-x ede-proj-regenerate. At this point, if you run make in the other subdirectory, you should get the archive libaux.a.
5) Switch back to main.cpp and do M-x ede-customize-current-target. This brings up an interactive emacs customization buffer, which allows you to edit details of the ede configuration. Under the Ldflags section, click [INS]. This pops out a new line that says Link Flag: and has some different-colored box for you to type in (mine is grey). Type -Lother -laux, so that other/libaux.a is included when compiling main. Then, at the top of the buffer, press [Accept], which should save that change and switch back to main.cpp.
6) Regenerate the Makefile with M-x ede-proj-regenerate.
Now the Makefile makes the main target first, then descends into the other directory and makes that. Unfortunately, this means that a make from the top-level directory will not work on a clean tree. I don't know why this is, because it seems like that would never be what you want in any project ever made with EDE. I can't find any way to change it, except for this hack:
7) Do M-x customize-project; under Inference-Rules click [INS]. Then enter Target: all ; Dependencies: aux main ; Rules: [INS] ; String #: . (This last one is just to prevent an error on an empty rule with a tab; presumably an EDE bug.) Click [Accept], and regenerate the Makefiles.
So now, in your top directory, you can just run make, and main should be a working executable.
I'm quickly becoming convinced that EDE is not yet ready to be used by people other than its authors. Despite its size and the amount of effort they've clearly put into it, it is too buggy, too counterintuitive, and just not smart enough. That's a shame. Emacs needs something like this.

Best practice for dependencies on #defines?

Is there a best practice for supporting dependencies on C/C++ preprocessor flags like -DCOMPILE_WITHOUT_FOO? Here's my problem:
> setenv COMPILE_WITHOUT_FOO
> make <Make system reads environment, sets -DCOMPILE_WITHOUT_FOO>
<Compiles nothing, since no source file has changed>
What I would like to do is have all files that rely on #ifdef statements get recompiled:
> setenv COMPILE_WITHOUT_FOO
> make
g++ FileWithIfdefFoo.cpp
What I do not want is to have to recompile everything if the value of COMPILE_WITHOUT_FOO has not changed.
I have a primitive Python script working (see below) that basically writes a header file FooDefines.h and then diffs it to see if anything is different. If it is, it replaces FooDefines.h and then the conventional source file dependency takes over. The define is not passed on the command line with -D. The disadvantage is that I now have to include FooDefines.h in any source file that uses the #ifdef, and also I have a new, dynamically generated header file for every #ifdef. If there's a tool to do this, or a way to avoid using the preprocessor, I'm all ears.
import os, sys

def makeDefineFile(filename, text):
    tmpDefineFile = "/tmp/%s%s" % (os.getenv("USER"), filename)  #Use os.tempnam?
    existingDefineFile = filename

    output = open(tmpDefineFile, 'w')
    output.write(text)
    output.close()

    status = os.system("diff -q %s %s" % (tmpDefineFile, existingDefineFile))

    def checkStatus(status):
        failed = False
        if os.WIFEXITED(status):
            #Check return code
            returnCode = os.WEXITSTATUS(status)
            failed = returnCode != 0
        else:
            #Caught a signal, coredump, etc.
            failed = True
        return failed, status

    #If we failed for any reason (file didn't exist, different, etc.)
    if checkStatus(status)[0]:
        #Copy our tmp into the new file
        status = os.system("cp %s %s" % (tmpDefineFile, existingDefineFile))
        failed, status = checkStatus(status)
        print failed, status
        if failed:
            print "ERROR: Could not update define in makeDefine.py"
            sys.exit(status)
This is certainly not the nicest approach, but it would work:
find . \( -name '*.cpp' -o -name '*.h' \) -exec grep -l COMPILE_WITHOUT_FOO {} \; | xargs touch
That will look through your source code for the macro COMPILE_WITHOUT_FOO, and "touch" each file, which will update the timestamp. Then when you run make, those files will recompile.
If you have ack installed, you can simplify this command:
ack -l --cpp COMPILE_WITHOUT_FOO | xargs touch
I don't believe that it is possible to determine automagically. Preprocessor directives don't get compiled into anything. Generally speaking, I expect to do a full recompile if I depend on a define. DEBUG being a familiar example.
I don't think there is a right way to do it, and if you can't do it the right way, then the dumbest way possible is probably your best option: a text search for COMPILE_WITHOUT_FOO, creating the dependencies from that. I would classify this as a shenanigan, and if you are writing shared code I would recommend seeking pretty significant buy-in from your coworkers.
CMake has some facilities that can make this easier. You would create a custom target to do this. You may trade problems here though, maintaining a list of files that depend on your symbol. Your text search could generate that file if it changed though. I've used similar techniques checking whether I needed to rebuild static data repositories based on wget timestamps.
Cheetah is another tool which may be useful.
If it were me, I think I'd do full rebuilds.
Your problem seems tailor-made for autoconf and autoheader, writing the values of the variables into a config.h file. If that's not possible, consider reading the "-D" directives from a file and writing the flags into that file.
Under all circumstances, you have to avoid builds that depend only on environment variables; you have no way of telling when the environment has changed. You definitely need to store the variables in a file. The cleanest way is autoconf, autoheader, and one source tree with multiple build trees; the second-cleanest is re-running configure for each switch of compile context; and the third-cleanest is a file containing all mutable compiler switches, on which every object that depends on those switches also depends.
If you choose the third way, remember not to update that file unnecessarily, e.g. by constructing it in a temporary location and copying it over only when diff shows a change; then your make rules will be able to rebuild files conditionally, depending on the flags.
One way to do this is to store each #define's previous value in a file, and use conditionals in your makefile to force update that file whenever the current value doesn't match the previous. Any files which depend on that macro would include the file as a dependency.
Here is an example. It will update file.o if either file.c has changed or the variable COMPILE_WITHOUT_FOO is different from last time. It uses $(shell ) to compare the current value with the value stored in the file envvars/COMPILE_WITHOUT_FOO. If they are different, it creates a rule for that file which depends on force, which is always updated.
file.o: file.c envvars/COMPILE_WITHOUT_FOO
	gcc -DCOMPILE_WITHOUT_FOO=$(COMPILE_WITHOUT_FOO) -c $< -o $@

ifneq ($(strip $(shell cat envvars/COMPILE_WITHOUT_FOO 2> /dev/null)), $(strip $(COMPILE_WITHOUT_FOO)))
force: ;
envvars/COMPILE_WITHOUT_FOO: force
	echo "$(COMPILE_WITHOUT_FOO)" > envvars/COMPILE_WITHOUT_FOO
endif
If you want to support having macros undefined, you will need to use the ifdef or ifndef conditionals, and have some indication in the file that the value was undefined the last time it was run.
Jay pointed out that "make triggers on date time stamps on files".
Theoretically, you could have your main makefile, call it m1, include variables from a second makefile called m2. m2 would contain a list of all the preprocessor flags.
You could have a make rule for your program depend on m2 being up to date.
The rule for making m2 would be to import all the environment variables (and thus the preprocessor flags).
The trick would be that the rule for making m2 detects whether there is a diff from the previous version. If so, it would set a variable forcing a "make all" and/or a make clean for the main target; otherwise, it would just update the timestamp on m2 and not trigger a full remake.
Finally, the rule for the normal target (make all) would source the preprocessor directives from m2 and apply them as required.
This sounds easy/possible in theory, but in practice GNU Make makes this kind of thing much harder to get working. I'm sure it can be done, though.
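A rough sketch of that idea in GNU make might look like this (m2.mk and the way the flag is derived from the environment are purely illustrative):

# m1, the main makefile
-include m2.mk                       # preprocessor flags generated below

file.o: file.c m2.mk
	$(CC) $(CPPFLAGS) -c $< -o $@

# Regenerate m2.mk on every run, but only replace it when its contents
# change, so dependants are rebuilt only after a real change.
m2.mk: FORCE
	@echo 'CPPFLAGS += $(if $(COMPILE_WITHOUT_FOO),-DCOMPILE_WITHOUT_FOO)' > m2.mk.tmp
	@cmp -s m2.mk.tmp m2.mk || mv m2.mk.tmp m2.mk
	@rm -f m2.mk.tmp

FORCE: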
make triggers on date time stamps on files. A dependent file being newer than what depends on it triggers it to recompile. You'll have to put your definition for each option in a separate .h file and ensure that those dependencies are represented in the makefile. Then if you change an option the files dependent on it would be recompiled automatically.
If it takes into account include files that include files you won't have to change the structure of the source. You could include a "BuildSettings.h" file that included all the individual settings files.
The only tough problem would be if you made it smart enough to parse the include guards. I've seen problems with compilation because of include file name collisions and order of include directory searches.
Now that you mention it I should check and see if my IDE is smart enough to automatically create those dependencies for me. Sounds like an excellent thing to add to an IDE.

Can I have one makefile to build a hierarchical project?

I have several hundred files in a non-flat directory structure. My Makefile lists each source file, which, given the size of the project and the fact that there are multiple developers on it, creates annoyances when we forget to add a new file or remove old ones. I'd like to generalize my Makefile so that make can simply build all .cpp and .h files without me having to specify all the filenames, given some generic rules for different types of files.
My question: given a large number of files in a directory with lots of subfolders, how do I tell make to build them all without having to specify each and every subfolder as part of the path? And how do I make it so that I can do this with only one Makefile in the root directory?
EDIT: this almost answers my question, but it requires that you specify all filenames :\
I'm sure a pure-gmake solution is possible, but using an external command to modify the makefile, or generate an external one (which you include in your makefile) is probably much simpler.
Something along the lines of:
all: myprog

find_sources:
	zsh -c 'for x in **/*.cpp; echo "myprog: ${x/.cpp/.o}" >> deps.mk'

include deps.mk
and run
make find_sources && make
Note: the exact zsh line probably needs some escaping to work in a makefile, e.g. $$ instead of $. It can also be replaced with bash + find.
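With the escaping applied, and -include so that the first run does not fail before deps.mk exists, the fragment might look like this (clearing deps.mk first avoids accumulating duplicate lines; this is only a sketch):

all: myprog

find_sources:
	rm -f deps.mk
	zsh -c 'for x in **/*.cpp; do echo "myprog: $${x/.cpp/.o}" >> deps.mk; done'

-include deps.mk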
One way that is platform independent (that is, independent of whether the shell is on Windows or Linux) is this:
DIRS = relative/path1 \
       relative/path2

dd = absolute/path/to/subdirectories/

all:
	@$(foreach dir, $(DIRS), $(MAKE) -C $(dd)$(dir) build -f ../../Makefile ;)

build:
	... build here
Note that the spaces and the semicolon are important here. It is also important to specify absolute paths and to point to the appropriate Makefile at the end (in this case I am using only one Makefile, in the grandparent folder).
But there is a better approach, which involves PHONY targets; it shows progress and errors better, and it stops the build if one folder has a problem instead of proceeding to the other targets:
.PHONY: subdirs $(DIRS)

subdirs: $(DIRS)

$(DIRS):
	$(MAKE) -C $@ build -f ../../Makefile

all: prepare subdirs
	...

build:
	... build here
Again, I am using only one Makefile here, which is supposed to be applicable to all sub-projects. For each sub-project in the grandchild folder, the target "build" is created using the one Makefile in the root.
I would start by using a combination of the wildcard function:
http://www.gnu.org/software/make/manual/make.html#Wildcard-Function
VPATH/vpath
http://www.gnu.org/software/make/manual/make.html#Selective-Search
and the file functions
http://www.gnu.org/software/make/manual/make.html#File-Name-Functions
For exclusion (i.e. backups, as Jonathan Leffler mentioned), use a separate folder for backups that is not in the vpath, and use good implicit rules.
You will still need to define which folders to look in, but not each file in them.
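A minimal sketch along those lines with GNU make; the depth of the wildcard patterns is just illustrative, since $(wildcard) does not recurse by itself (you could substitute $(shell find . -name '*.cpp') for the source list):

SRCS := $(wildcard *.cpp */*.cpp */*/*.cpp)
OBJS := $(SRCS:.cpp=.o)

myprog: $(OBJS)
	$(CXX) $(LDFLAGS) -o $@ $(OBJS) $(LDLIBS)

%.o: %.cpp
	$(CXX) $(CPPFLAGS) $(CXXFLAGS) -c $< -o $@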
I'm of two minds on this one. On one hand, if your Make system compiles and links everything it finds, you'll find out in a hurry if someone has left conflicting junk in the source directories. On the other hand, non-conflicting junk will proliferate and you'll have no easy way of distinguishing it from the live code...
I think it depends on a lot of things specific to your shop, such as your source control system and whether you plan ever to have another project with an overlapping code base. That said, if you really want to compile every source file below a given directory and then link them all, I'd suggest simple recursion: to make the objects, compile all source files in the current directory, add the resulting objects (with full paths) to a list in the top source directory, and recurse into every directory below. To link, use the list.