I'm struggling to write Makefiles that properly build my unit tests. As an example, suppose the file structure looks like this
src/foo.cpp
src/foo.hpp
src/main.cpp
tests/test_foo.cpp
tests/test_all.cpp
So, to build the executable test_all, I'd need to build test_foo.o which in turn depends on test_foo.cpp but also on src/foo.o.
What is the best practice in this case? One Makefile in the parent folder? One Makefile per folder? If so, how do I manage the dependencies across folders?
Common practice is a Makefile per directory. That's what I would have suggested before I read "Recursive Make Considered Harmful" (http://miller.emu.id.au/pmiller/books/rmch/). Now I'd recommend a single Makefile. Also check out automatic dependency generation - then you don't even need to work out what your tests depend on; all you need is a few targets.
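For example, here is a minimal sketch of a single non-recursive Makefile for the layout in the question (GNU Make and g++ assumed; the app target name and the flags are placeholders, not anything prescribed by the question):
SRC_OBJS  := src/foo.o src/main.o
TEST_OBJS := src/foo.o tests/test_foo.o tests/test_all.o
CXX       := g++
CXXFLAGS  := -Wall -Isrc -MMD -MP   # -MMD -MP emit a .d file next to every .o

all: app test_all

app: $(SRC_OBJS)
	$(CXX) -o $@ $^

test_all: $(TEST_OBJS)
	$(CXX) -o $@ $^

%.o: %.cpp
	$(CXX) $(CXXFLAGS) -c $< -o $@

# Pull in the generated dependency files so header changes trigger rebuilds.
-include $(SRC_OBJS:.o=.d) $(TEST_OBJS:.o=.d)

clean:
	rm -f app test_all $(SRC_OBJS) $(TEST_OBJS) $(SRC_OBJS:.o=.d) $(TEST_OBJS:.o=.d)

.PHONY: all clean
Because everything lives in one Makefile, test_all can simply list src/foo.o as a prerequisite; no cross-directory coordination is needed.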
The common practice is one Makefile for each folder. Here is a simple Makefile.am script for the root folder:
#SUBDIRS = src tests
all:
	make -C ./src
	make -C ./tests
install:
	make -C ./src install
uninstall:
	make -C ./src uninstall
clean:
	make -C ./src clean
test:
	make -C ./tests test
The corresponding Makefile.am for the src folder will look like this:
AM_CPPFLAGS = -I./
bin_PROGRAMS = progName
progName_SOURCES = foo.cpp main.cpp
LDADD = lib-to-link
progName_LDADD = ../libs/
Makefile.am for tests will look similar:
AM_CPPFLAGS = -I../src
bin_PROGRAMS = tests
tests_SOURCES = test_foo.cpp test_all.cpp
Use automake to generate Makefile.in files from the .am files. The configure script will use the .in files to produce the Makefiles. (For small projects you may prefer to hand-code the Makefiles directly.)
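If you do go the automake route, the tests can also be wired into `make check` rather than being installed. Here is a hedged sketch of such a tests/Makefile.am, reusing the file names from the question (the subdir-objects option is assumed because a source from a sibling directory is compiled):
AUTOMAKE_OPTIONS = subdir-objects
AM_CPPFLAGS      = -I$(top_srcdir)/src
check_PROGRAMS   = test_all
test_all_SOURCES = test_foo.cpp test_all.cpp ../src/foo.cpp
TESTS            = test_all
With this, `make check` builds and runs test_all without it ending up in `make install`.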
Related
I have a project directory structure of:
Root
Source
Common
MyFolder
++ My 3 source files and header
When I build my project it generates 3 to 4 shared libraries. Lib1 is compiled using C++98 and the others using C++11. The flags are added in the CMakeLists.txt at the root.
I need my 3 source files to be compiled for Lib1 and for the other libs as well. But what happens is that the compiler first compiles my source files for a lib using C++11, and then tries to use the same .o files for Lib1 as well. So a .o file that was generated with C++11 throws an exception when the same file is used for the C++98-compiled library.
How do I write this in CMakeLists.txt so that, rather than trying to reuse the same .o file, the compiler compiles the source files again for Lib1 (the C++98-compiled library)?
Is there any flag I can specify so that it won't take the precompiled .o file and will compile it again?
The flags are not being overridden for the different shared libraries; rather, the same object file is being used by the makefile under different flags.
This is sort of counter to how makefiles and cmake usually work.
Most users consider it really important that make performs an incremental build.
The usual way with makefiles is to do make clean which is supposed to remove any binaries and object files that were created.
However, sometimes I write cmake scripts that use globbing over the source directory to assemble the project. (That means it says "just grab all *.cpp files in the /src folder and make an executable from them".) A makefile cannot check what files are in a directory, so the make build will be broken after I add a new file, and make clean won't fix it -- the whole makefile will need to be regenerated by cmake.
Usually what I do is, I write a simple bash script, named rebuild.sh or something,
#!/bin/bash
rm -rf build
mkdir build
cd build
cmake ..
make -j3
./tests
And I put that in the root of my repository and add /build to my .gitignore. I call that when I want to do a full rebuild -- it nukes the build directory, so it's foolproof. When I want an incremental rebuild, I just type make again in the /build directory.
The rebuild.sh script can also serve a double purpose if you use travis-ci for continuous integration.
Most build systems assume the compiled objects remain the same within a single pass. To avoid shooting yourself in the foot, I would suggest telling the build system that these are actually different objects, even though they are compiled from the same source files.
I'm not familiar with cmake, but this is how you would do it with make:
For example, say you have an a.cpp that you want to compile twice with different compiler options:
#include <stdio.h>
int main(int argc, char* argv[]) {
printf ("Hello %d\n", TOKEN);
return 0;
}
And the Makefile would look like this:
SRC := $(wildcard *.cpp)
OBJ_1 := $(patsubst %.cpp,%_1.o,$(SRC))
OBJ_2 := $(patsubst %.cpp,%_2.o,$(SRC))
all: pass1 pass2
pass1: $(OBJ_1)
	gcc -o $@ $(OBJ_1) -lstdc++
pass2: $(OBJ_2)
	gcc -o $@ $(OBJ_2) -lstdc++
%_1.o: %.cpp
	gcc -DTOKEN=1 -c $< -o $@
%_2.o: %.cpp
	gcc -DTOKEN=2 -c $< -o $@
clean:
	rm -f $(OBJ_1) $(OBJ_2)
What I do here is generate two different lists of object files from the same source files; you can do the same for the dependency files (the -MMD -MP flags).
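A hedged sketch of that dependency handling, extending the pattern rules above so each variant also records which headers it depends on:
%_1.o: %.cpp
	gcc -DTOKEN=1 -MMD -MP -c $< -o $@
%_2.o: %.cpp
	gcc -DTOKEN=2 -MMD -MP -c $< -o $@
# Each compile writes a matching .d file (a_1.d, a_2.d, ...); include them if present.
-include $(OBJ_1:.o=.d) $(OBJ_2:.o=.d)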
I'm a newbie to Google Test, and here is what I am trying to do (on a Linux server, from the console):
1) Create a small project in C++ (with a header file containing a function prototype, a cpp file with the function's definition, and another cpp file with main calling the function declared in the header file)
2) Configure Google Test to write unit tests and test the function created in step 1
3) Create another small project with a couple of unit tests (different scenarios to test the function created under the project in step 1)
Can anyone please explain how to configure Google Test and the projects created, with an example?
Thanks in advance
First of all, get the latest version of GoogleTest from its Subversion repository (you need Subversion installed):
cd ~
svn checkout http://googletest.googlecode.com/svn/trunk/ googletest-read-only
Then, build the library (you need cmake installed):
mv googletest-read-only googletest
mkdir googletest/lib
cd googletest/lib
cmake ..
make
At this point:
compiled libraries are in the ~/googletest/lib directory
include files are in the ~/googletest/include directory
To use googletest:
Include the header in your files:
#include "gtest/gtest.h"
Export the library path:
export GOOGLETESTDIR=~/googletest
Compile with
g++ ... -I$GOOGLETESTDIR/include -L$GOOGLETESTDIR/lib -lgtest -lpthread
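If you would rather drive this from a Makefile, here is a minimal sketch along the same lines (the source and test file names are made up for illustration; adjust GOOGLETESTDIR to wherever you built the library):
GOOGLETESTDIR ?= $(HOME)/googletest
CXX      := g++
CXXFLAGS := -g -Wall -I$(GOOGLETESTDIR)/include
LDFLAGS  := -L$(GOOGLETESTDIR)/lib
LDLIBS   := -lgtest -lgtest_main -lpthread

# gtest_main supplies main(), so only the code under test and the tests are listed.
test_runner: foo.o test_foo.o
	$(CXX) $(CXXFLAGS) -o $@ $^ $(LDFLAGS) $(LDLIBS)

%.o: %.cpp
	$(CXX) $(CXXFLAGS) -c $< -o $@

.PHONY: check clean
check: test_runner
	./test_runner

clean:
	rm -f test_runner *.o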
Please see the tutorial at http://www.yolinux.com/TUTORIALS/Cpp-GoogleTest.html
Caution: one correction to the makefile (test/src/Makefile). The order of the library flags is not correct!
It should look like this:
CXX = g++
CXXFLAGS = -g -L/opt/gtest/lib -lgtest -lgtest_main -lpthread
INCS = -I./ -I../../src -I/opt/gtest/include
OBJS = ../../src/Addition.o Addition_Test.o ../../src/Multiply.o Multiply_Test.o
testAll: $(OBJS)
	$(CXX) $(INCS) -o testAll Main_TestAll.cpp $(OBJS) $(CXXFLAGS)
.cpp.o:
	$(CXX) $(CXXFLAGS) -c $< -o $@ $(INCS)
clean:
	rm testAll *.o testAll.xml
After a little research, here is what I found out:
If your project library contains files like:
1) callMain.cpp which calls the function to do some operations
2) reverse.cpp which contains the logic of reversing a number and
3) header.h containing the declaration of function prototypes
And if you have unit test files like unitTest1.cpp and unitTest2.cpp to be run via gtest, then this can be achieved as follows:
g++ -I<gtest include directory location> -L<gtest directory location> <gtest_main.cc location> reverse.cpp unitTest1.cpp unitTest2.cpp -lgtest -lpthread -o test_try
This compiles and produces an executable like test_try which when executed gives the desired result. Please correct me if I'm wrong anywhere. Happy coding :)
New answer
Today I read the Google Test FAQ. It is not recommended to install a pre-compiled copy of Google Test (for example, into /usr/local); you can find the reasoning in the FAQ.
So I recommend this answer and this blog article instead.
Old answer
Following the CMake documentation for FindGTest, the code below works for me.
cmake_minimum_required(VERSION 2.8)
################################
# Add gtest environment
################################
enable_testing()
find_package(GTest REQUIRED)
# add gtest include directory: way 1
include_directories(${GTEST_INCLUDE_DIRS})
# add gtest include directory: way 2
#include_directories(${GTest_SOURCE_DIRS}/include ${GTest_SOURCE_DIR})
################################
# Build tests
################################
aux_source_directory(. DIR_SRCS)
add_executable(fooTest ${DIR_SRCS})
# note: `gtest` must come before `pthread`
target_link_libraries(fooTest gtest pthread)
# register all gtest cases as one CMake test
add_test(AllFooTest fooTest)
Then you can use the following commands:
cmake . to generate the Makefile
make to build the gtest routine
./fooTest to run the gtest routine
make test to run the CMake test, which is another way to run the gtest
I am trying to compile my project which has the following structure
Project:
MakeFile
Executable
Source1
.cxx
.h
Source2
.cxx
.h
Build
*.o
And I'm having difficulty writing a Makefile to compile it. I currently have commands like:
Src1 = $(wildcard $(SRCDIR1)/*.cxx)
Obj1 = $(patsubst $(SRCDIR1)/%.cxx, $(OBJDIR)/%.o, $(Src1))
But then I have difficulty making the compile rules for the object files, a) because I can no longer do:
$(Obj1): %.cxx
	$(CXX) $(CFLAGS) -c $(@:.o=.cxx) -o $@
since the '$@' variable now includes the path of the build directory, and b) because the prerequisites now include the build path when they should use the source path. I have read large bits of the make manual to try and find a solution but no luck.
Any help towards a solution appreciated!
Jack
From personal experience, after playing around a bit with "raw" Makefiles, I'd really recommend using a tool that builds the Makefiles for you, like automake or cmake.
You'll still have to specify all the source files manually - but at least I prefer that to fiddling around with the Makefiles by hand.
One option I prefer is building an isomorphic directory structure in the build directory. That is, a source file ${src_dir}/project_x/main.cxx builds into ${build_dir}/project_x/main.o. This way you are protected from name clashes when there are source files with the same name in different source directories. The compiler rule would look something like:
${obj_dir}/%.o : ${src_dir}/%.cxx  # % includes the directory part, e.g. project_x/main
	@-mkdir -p ${@D}
	${CXX} -c -o $@ ${CPPFLAGS} ${CXXFLAGS} $<
Notice how the rule above creates the target directory on the fly. This is a bit simplistic; in a real-world build system the object files depend (via an order-only dependency) on their directory, so that make creates the destination directory only when it does not exist, instead of speculatively trying to create it for every target object file even when it already exists.
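A hedged sketch of that refinement, assuming an OBJECTS variable that lists every target object file (GNU Make's secondary expansion lets each object name its own directory as an order-only prerequisite, i.e. the part after the '|'):
OBJ_DIRS := $(patsubst %/,%,$(sort $(dir $(OBJECTS))))

.SECONDEXPANSION:
${obj_dir}/%.o : ${src_dir}/%.cxx | $$(@D)
	${CXX} -c -o $@ ${CPPFLAGS} ${CXXFLAGS} $<

# Directory targets: created once when missing; their timestamps never make the
# objects look out of date because the dependency is order-only.
$(OBJ_DIRS):
	mkdir -p $@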
My knowledge of make and autotools (which I'm not yet using for this project) is rudimentary at best, despite plenty of googling and experimenting over a long period of time. I have a source hierarchy like the one below that I'm trying to find a way to build as seamlessly as possible.
The application is made up of a main application with source in various subfolders under app/src. These are built with the respective Makefile in the root of that folder.
Then I have multiple other utilities that reside in different folders under app/tools, each with its own Makefile.
app/src/module1/file1.cpp
app/src/module1/file1.hpp
app/src/module2/file2.cpp
app/src/module2/file2.hpp
app/src/module3/file3.cpp
app/src/module3/file3.hpp
app/src/main.cpp
app/src/main.hpp
app/src/Makefile
app/tools/util1/file1.cpp
app/tools/util1/file1.hpp
app/tools/util1/Makefile
app/tools/util2/file2.cpp
app/tools/util2/file2.hpp
app/tools/util2/Makefile
The problem for me is that some of these tools depend on source files inside the app/src source folder, but with a preprocessor macro EXTERNAL_TOOL enabled. So the object files generated from compiling the main app and the various utilities are not compatible.
Currently, to build each portion of the project I have to clean the source tree in between. This is painful and certainly not what I want in the end. What would be the best way to go about solving this? Ideas I've had but have not been able to put into practice are:
Separate build directory for each portion of the project
When building the external tools, tagging their object files in the main app source tree somehow (util.file1.o?)
I'm not too sure I have the time and patience needed to master make / autotools. Might one of the other build tools (scons? cmake?) make this kind of task easier to accomplish? If so which one?
UPDATE: This is what I've got now
SOURCES := util1.cpp util2.cpp util3.cpp \
../../src/module1/file1.cpp \
../../src/module1/file2.cpp \
../../src/module1/file3.cpp \
../../src/module2/file4.cpp \
../../src/module3/file5.cpp \
../../src/module3/file6.cpp \
../../src/module4/file7.cpp \
../../src/module4/file8.cpp \
../../src/module3/file9.cpp \
../../src/module4/file10.cpp \
../../src/module5/file11.cpp \
../../src/module3/file12.cpp \
../../src/module1/file13.cpp \
../../src/module3/file14.cpp \
../../src/module3/file15.cpp
OBJECTS = $(join $(addsuffix .util/, $(dir $(SOURCES))), $(notdir $(SOURCES:.cpp=.o)))
.PHONY: all mkdir
all: util
util: $(OBJECTS)
	$(CXX) $(CXXFLAGS) $(OBJECTS) $(LIBS) -o util
$(OBJECTS): | mkdir
	$(CXX) -c $(CXXFLAGS) -o $@ $(patsubst %.o,%.cpp,$(subst .util/,,$@))
mkdir:
	@mkdir -p $(sort $(dir $(OBJECTS)))
clean:
	-@rm -f $(OBJECTS) util
	-@rmdir $(sort $(dir $(OBJECTS))) 2>/dev/null
I came up with this after extensive googling and SO browsing. It seems to work, but this part doesn't seem particularly nice (it feels like a bit of a hack):
$(OBJECTS): | mkdir
	$(CXX) -c $(CXXFLAGS) -o $@ $(patsubst %.o,%.cpp,$(subst .util/,,$@))
In particular I'm not too keen on the fact that I'm creating the list of objects from the sources earlier on and adding the suffix, only to do the reverse down here. I couldn't get it working any other way.
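One possible way to avoid that round trip (a sketch only; it relies on GNU Make's secondary expansion) is a static pattern rule that derives each source path from the object path, so the transformation happens once in the prerequisite list and the recipe can use the ordinary $< variable:
.SECONDEXPANSION:

# For an object such as ../../src/module1/.util/file1.o the stem is
# ../../src/module1/.util/file1; stripping ".util/" recovers the source path.
$(OBJECTS): %.o: $$(subst .util/,,$$*).cpp | mkdir
	$(CXX) -c $(CXXFLAGS) -o $@ $<
As a bonus, each object now lists its .cpp file as a real prerequisite, so editing a source file rebuilds just that object.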
CMake has add_definitions and remove_definitions commands. You can use them to define macros for different parts of your project:
# building tools #
add_definitions(-DEXTERNAL_TOOL)
add_subdirectory($TOOL1$ $BUILD_DIR$)
add_subdirectory($TOOL2$ $BUILD_DIR$)
...
# building main app #
remove_definitions(-DEXTERNAL_TOOL)
add_executable(...)
This can be done with SCons rather painlessly. You will definitely need a build directory hierarchy for the objects built with different preprocessor macros. In SCons terms, creating build directories like this is called variant_dir. I would recommend the following SCons hierarchical build structure:
app/SConstruct
app/src/module1/file1.cpp
app/src/module1/file1.hpp
app/src/module2/file2.cpp
app/src/module2/file2.hpp
app/src/module3/file3.cpp
app/src/module3/file3.hpp
app/src/main.cpp
app/src/main.hpp
app/src/SConscript_modules
app/src/SConscript_main
app/tools/util1/file1.cpp
app/tools/util1/file1.hpp
app/tools/util2/file2.cpp
app/tools/util2/file2.hpp
app/tools/SConscript
app/build/main/
app/build/target1/modules/
app/build/target2/modules/
app/build/tools/utils/
To be able to build the same source files with different preprocessor macros, you will need to build the same file with several different Environments. These env's could be set up in the src/module SConscript scripts, or from the root SConstruct and passed down. I prefer the second option, since it makes the src/module SCons scripts modular and unaware (agnostic) of the preprocessor macros.
Here is the root build script, which creates the different env's and orchestrates the sub-directory build scripts:
app/SConstruct
defines1 = ['MACRO1']
defines2 = ['MACRO2']
env1 = Environment(CPPDEFINES = defines1)
env2 = Environment(CPPDEFINES = defines2)
includePaths = [
'src/module1',
'src/module2',
'src/module3',
]
env1.Append(CPPPATH = includePaths)
env2.Append(CPPPATH = includePaths)
# Build different versions of the module libs
SConscript('src/SConscript_modules',
variant_dir = '#build/target1/modules',
exports = {'env':env1},
duplicate=0)
SConscript('src/SConscript_modules',
variant_dir = '#build/target2/modules',
exports = {'env':env2},
duplicate=0)
# Build main with env1
SConscript('src/SConscript_main',
variant_dir = '#build/main',
exports = {'env':env1},
duplicate=0)
# Build tools with env2
SConscript('tools/SConscript',
variant_dir = '#build/utils',
exports = {'env':env2},
duplicate=0)
This is the build script for main
app/src/SConscript_main
Import('env')
sourceFiles = ['main.cpp']
# If you want to modify the env here, Clone() it first, otherwise
# the changes will be visible to all other SConscripts
env.Program(target = 'main', source = sourceFiles)
This is the build script for the module libs; it will be called twice, each time with a different env
app/src/SConscript_modules
Import('env')
module1SourceFiles = ['file1.cpp']
module2SourceFiles = ['file2.cpp']
module3SourceFiles = ['file3.cpp']
# If you want to modify the env here, Clone() it first, otherwise
# the changes will be visible to all other SConscripts
env.Library(target = 'module1', source = module1SourceFiles)
env.Library(target = 'module2', source = module2SourceFiles)
env.Library(target = 'module3', source = module3SourceFiles)
Is there a clean/portable way to descend recursively from a given directory, compiling all found .cpp files into a single output file? I'm not sure if makefiles are capable of this sort of thing, or if it's a job for some kind of build script, but I'd like to avoid maintaining various IDEs' project files along with my code.
There are different things that you can do here. I would suggest that you use a multiplatform build system, and follow the documentation for it. I have used CMake in the past, but I wouldn't know how to tell it to compile all files in a directory.
The advantage is that the user can use CMake to generate project files for most common IDEs, so it would allow VisualStudio users to generate VS solutions, MacOSX users to generate Xcode projects, Eclipse CDK projects in pretty much any environment, Makefiles...
There's the wildcard function which can be used to match a pattern like so:
CXX_FILES = $(wildcard src/*.cpp) # All .cpp files in the directory
This is not recursive, but will at least save you from having to manually specify the files in a certain directory. The rule for building them would look something like this:
CXX_FILES = $(wildcard src/*.cpp) # All .cpp files in the directory
OBJ_FILES = $(CXX_FILES:src/%.cpp=$(OBJ_DIR)/%.o) # Corresponding .o files
# Rules
all: $(OBJ_FILES)
	g++ $(OBJ_FILES) -o output_filename
$(OBJ_DIR)/%.o: src/%.cpp
	g++ -c $< -o $@
Oh, and to answer your question, this method is portable to any system where GNU Make runs.
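If you do want the search to descend into subdirectories, a commonly used helper function (again GNU Make only, so "portable" in the same sense) can be sketched like this:
# Recursive wildcard: $(call rwildcard,src/,*.cpp) lists every .cpp under src/, at any depth.
rwildcard = $(foreach d,$(wildcard $(1)*),$(call rwildcard,$(d)/,$(2)) $(filter $(subst *,%,$(2)),$(d)))

CXX_FILES = $(call rwildcard,src/,*.cpp)
With nested paths you would also need the object rule to mirror the directory layout, for instance by creating the output directories on the fly as shown in an earlier answer.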