I am trying to think through how to design a new project build system.
I want to use CMake to compile for Windows and Linux. I want to have Release and Debug builds, which are pretty straightforward.
The problem is that I will want to deploy this software for multiple hardware projects over the course of the next few years. This means that I will have preprocessor definitions to turn functionality on/off or alter it in my C++ program. I will also want to link configuration-specific libraries. A current legacy project uses a header with a list of defines and then links all possible project dependencies.
An example header:
// projects.h
//#define project1
//#define project2
#define project3
Which means that to change hardware/project configurations you have to edit this file to make sure the correct hardware configuration is selected.
What I want to do is to have a configuration for each project which can also be configured for Release or Debug build.
Is there any advice on how to deal with this more cleanly in CMake?
For multiple hardware projects, this got messy for me, and it got messy real quick.
My goals were:
I don't like a single big config.h file: I don't want to have to recompile 100% of the files just because I added a space in a central config.h.
All boards expose an API that the user application can use. Boards don't mix together.
I can easily add new boards and have a board to test the application on.
What I discovered is that:
cmake is used to configure your project for different configurations. So cmake does not store a configuration; it is used to choose a configuration.
So you need an "external" place to store the configuration for the current project.
I have multiple applications and multiple boards; the chosen application configuration options are stored inside a makefile. This makefile then configures cmake, which in turn configures the build system, which can then be used to build the application. It works this way because cmake supports only one toolchain per build tree, so for architectures with different compilers/compiler options I have to re-run cmake.
I basically have a directory structure like this:
- app
  - CMakeLists.txt
  - main.c
  - lib.c
- boards                   # this is actually git submodule shared between multiple projects
  - boardapi               # generic board api library
    - CMakeLists.txt
    - uart_api.h           # exposes an api to access uart
    - spi_api.h
    - timer_api.h
    - board_api.h
    - some_defines.h       # includes some_defines_impl.h
  - BOARD1
    - toolchain.cmake
    - CMakeLists.txt
    - implementation1.c
    - implementation2.c
    - some_defines_impl.h
  - BOARD2
    - toolchain.cmake
    - CMakeLists.txt
    - implementation1.c
    - implementation2.c
    - some_defines_impl.h
  - linux
    - ... as above ...
  - armv5te
    - ... as above ...
  - CMakeLists.txt
- CMakeLists.txt
- Makefile
I have a boards directory with one folder per board.
Each folder has its own toolchain.cmake file and CMakeLists.txt file.
In each CMakeLists.txt in a board folder, a library with the name of the folder is added.
I have a makefile that iterates over all the boards I want to compile this application for and runs cmake + make for each of them. Basically BOARDS=BOARD1 BOARD2 and then all: $(foreach board,$(BOARDS),cmake;make;)
cmake is configured with cmake -DCMAKE_TOOLCHAIN_FILE=boards/$(BOARD)/toolchain.cmake -DBOARD=$(BOARD)
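For context, each board's toolchain.cmake is just a small cross-compilation description. A minimal sketch, where the processor and compiler names are placeholders rather than values from any real board:

# boards/BOARD1/toolchain.cmake -- sketch; compiler/processor names are placeholders
set(CMAKE_SYSTEM_NAME Generic)              # bare-metal target, no OS
set(CMAKE_SYSTEM_PROCESSOR arm)             # placeholder processor

# Placeholder cross compilers; substitute the real toolchain for the board
set(CMAKE_C_COMPILER   arm-none-eabi-gcc)
set(CMAKE_CXX_COMPILER arm-none-eabi-g++)

# Search only the target environment for libraries and headers, not the host
set(CMAKE_FIND_ROOT_PATH_MODE_PROGRAM NEVER)
set(CMAKE_FIND_ROOT_PATH_MODE_LIBRARY ONLY)
set(CMAKE_FIND_ROOT_PATH_MODE_INCLUDE ONLY)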
All the boards/CMakeLists.txt file does is add_subdirectory(${BOARD}) and create an interface library named board that links against ${BOARD}.
The main CMakeLists.txt does add_subdirectory(boards) and after that link_libraries(board). That links all subsequent targets against the board library.
Each board directory in turn controls the specific board configuration.
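To make that concrete, here is a minimal sketch of the CMake files involved; target and file names are assumptions, not copied from a real project:

# boards/CMakeLists.txt -- sketch; BOARD comes from the -DBOARD=... cache entry
add_subdirectory(${BOARD})

# Thin interface target so the application always links against "board",
# regardless of which concrete board library was selected.
add_library(board INTERFACE)
target_link_libraries(board INTERFACE ${BOARD})

# boards/BOARD1/CMakeLists.txt -- sketch
add_library(BOARD1 STATIC implementation1.c implementation2.c)
target_include_directories(BOARD1 PUBLIC ${CMAKE_CURRENT_SOURCE_DIR})

# top-level CMakeLists.txt -- sketch
add_subdirectory(boards)
link_libraries(board)      # all targets defined after this link against the board library
add_subdirectory(app)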
If you have multiple project configurations, just treat them as normal projects and don't worry about the board. For example, iterate over them inside app/CMakeLists.txt, like so:
foreach(i IN ITEMS project1 project2 project3)
  add_executable(main_${i} main.c app.c)
  target_compile_definitions(main_${i} PRIVATE ${i})
endforeach()
Alternatively, if, for example, there are many macros, you could create another structure projects/{project1,project2,project3} and recreate the same structure as for boards, but without the toolchain.cmake file. Then just iterate over each project and do target_link_libraries(main_${i} ${i}), which will effectively bring all the macro definitions into the main application, as sketched below.
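A sketch of that alternative, with placeholder project names and macros: each project directory defines an INTERFACE library that only carries compile definitions, and linking it brings those macros into the application.

# projects/project1/CMakeLists.txt -- sketch; macro names are placeholders
add_library(project1 INTERFACE)
target_compile_definitions(project1 INTERFACE project1 PROJECT1_FEATURE_X=1)

# app/CMakeLists.txt -- sketch
foreach(i IN ITEMS project1 project2 project3)
  add_executable(main_${i} main.c app.c)
  target_link_libraries(main_${i} PRIVATE ${i})   # pulls in the project's macro definitions
endforeach()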
I am confused about the right way to get an external library integrated into my own CMake project. (This external project needs to be built along with my project; it is not installed separately, so we can't use find_library, or so I think.)
Let's assume we have a project structure like this (simplified for this post):
my_proj/
  CMakeLists.txt
  src/
    CMakeLists.txt
    my_server.cpp
That is, we have a master CMakeLists.txt that sits at the root and invokes the CMakeLists of the subdirectories. Obviously, because this example is simplified, I'm not showing all the other files/directories.
I now want to include another C++ GitHub project in my build, which happens to be this C++ bcrypt implementation: https://github.com/trusch/libbcrypt
My goal:
While building my_server.cpp via its make process, I'd like to include the header files for bcrypt and link with its library.
What I've done so far:
- I added a git module for this external library at my project root:
[submodule "third_party/bcrypt"]
path = third_party/bcrypt
url = https://github.com/trusch/libbcrypt
So now, when I check out my project and do a submodule update, it pulls down bcrypt to ${PROJ_ROOT}/third_party
Next up, I added this to my ROOT CMakeLists.txt
# Process subdirectories
add_subdirectory(third_party/bcrypt)
add_subdirectory(src/)
Great. I now see that when I invoke cmake from the root, it builds bcrypt inside third_party, and then it builds my src/ directory. The reason I do this is that I assume it is the best way to make sure the bcrypt library is ready before my src directory is built.
Questions:
a) Now how do I correctly get the include header path and the location of this built library into the CMakeLists.txt file inside src/? Should I be hardcoding #include "../third_party/bcrypt/include/bcrypt/bcrypt.h" into my_server.cpp and -L ../third_party/libbcrypt.so into src/CMakeLists.txt, or is there a better way? This is what I've done today and it works, but it looks odd.
I have, in src/CMakeLists.txt
set(BCRYPT_LIB "../third_party/bcrypt/libbcrypt.so")
target_link_libraries(my_app ${MY_OTHERLIBS} ${BCRYPT_LIB})
b) Is my approach of relying on the sequence of add_subdirectory calls correct?
Thank you.
The best approach depends on what the bcrypt CMake files provide, but it sounds like you want to use find_package rather than hard-coding the paths. Check out this answer; note that find_package has two different modes: MODULE and CONFIG.
If bcrypt builds, and one of the following files gets created for you:
FindBcrypt.cmake
bcrypt-config.cmake
BcryptConfig.cmake
that might give you an idea of which find_package mode to use. I suggest you check out the documentation for find_package and look closely at how the search procedure is set up, to understand how CMake searches for bcrypt.
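Alternatively, since the submodule is already built via add_subdirectory(), you may not need find_package at all: if libbcrypt's CMakeLists.txt defines a library target, you can link against that target directly. A sketch, where the target name bcrypt and the include path are assumptions you would have to check against the submodule:

# src/CMakeLists.txt -- sketch; the "bcrypt" target name is an assumption
add_executable(my_app my_server.cpp)

# If the bcrypt target does not export its include directories itself,
# point at the submodule's headers explicitly.
target_include_directories(my_app PRIVATE
    ${CMAKE_SOURCE_DIR}/third_party/bcrypt/include)

# Linking the target also creates a build-order dependency, so you no longer
# need to rely on the order of the add_subdirectory() calls.
target_link_libraries(my_app PRIVATE bcrypt ${MY_OTHERLIBS})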
I would like to have the following structure A -> B -> C, where:
C is boilerplate code, wrappers for third-party libraries, very basic code, etc.
B is the common classes, functions and data structures specific to the project's domain.
A is the project itself.
I would like to make it easy to reuse C or B(+C) in my other projects in the future. In addition, I have the following requirements:
As all three projects are in-progress, I would like to have an ability to build C, C+B and C+B+A in one shot.
I would prefer static linkage over dynamic, so that C and C+B would be static libraries and C+B+A would be the executable.
I would like to keep cmake lists and config files simple and clean. Examples which I found in the official wiki and over the internet are pretty big and monstrous.
It would be great if it didn't require changing more than a couple of lines when I change the locations of A, B or C in the filesystem.
All these three components are using google-test, but I'm not sure if it is important for the project layout.
I am pretty new to CMake and I don't even understand whether it is better to write XXXConfig.cmake or FindXXX.cmake files. Also, I am not sure how I should pass relative paths from a subcomponent to the parent component using X_INCLUDE_DIRS.
First I have to admit that I agree with @Tsyvarev: your CMake environment should fit your processes/workflow and should take project sizes and team structure into account, or, generally speaking, the environment CMake will be used in. And this tends to be - in a positive way - very alive.
So this part of your question is difficult to answer and I'll concentrate on the technical part:
CMake has to know the location of the dependencies - relative or absolute - by:

- having a monolithic source tree (the one you don't want anymore)
  - CMake share library with multiple executables
  - CMake: How to setup Source, Library and CMakeLists.txt dependencies?
- a common directory location for includes/libraries/binaries
  - Custom Directory for CMake Library Output
  - cmake install not installing libraries on windows
- getting the paths via config files/variable definitions
  - How can I get cmake to find my alternative boost installation?
  - How to add_custom_command() for the CMake build process itself?
- using registration in or installation from a database provided on the host
  - Making cmake library accessible by other cmake packages automatically
  - cmake wont run build_command in ExternalProject_Add correctly
To keep your CMake files as simple as possible, I would recommend grouping your CMake code into separate dedicated files:
Prefer toolchain files over if(SomeCompiler) statements
Move common/repeating code parts as function() bodies into a shared CMake include file
Move complex non-target specific code parts into their own (CMake) script files
Example Code
Since you have specifically asked for the find_package() variant, and taking Use CMake-enabled libraries in your CMake project plus the points listed above:
MyCommonCode.cmake
cmake_policy(SET CMP0022 NEW)

function(my_export_target _target _include_dir)
    file(
        WRITE "${CMAKE_CURRENT_BINARY_DIR}/${_target}Config.cmake"
        "
            include(\"\${CMAKE_CURRENT_LIST_DIR}/${_target}Targets.cmake\")
            set_property(
                TARGET ${_target}
                APPEND PROPERTY
                    INTERFACE_INCLUDE_DIRECTORIES \"${_include_dir}\"
            )
        "
    )
    export(
        TARGETS ${_target}
        FILE "${CMAKE_CURRENT_BINARY_DIR}/${_target}Targets.cmake"
        EXPORT_LINK_INTERFACE_LIBRARIES
    )
    export(PACKAGE ${_target})
endfunction(my_export_target)
C/CMakeLists.txt
include(MyCommonCode.cmake)
...
my_export_target(C "${CMAKE_CURRENT_SOURCE_DIR}/include")
B/CMakeLists.txt
include(MyCommonCode.cmake)
find_package(C REQUIRED)
...
target_link_libraries(B C)
my_export_target(B "${CMAKE_CURRENT_SOURCE_DIR}/include")
A/CMakeLists.txt
include(MyCommonCode.cmake)
find_package(B REQUIRED)
...
target_link_libraries(A B)
This keeps all three build environments separate, sharing only the relatively static MyCommonCode.cmake file. This approach does not yet cover your first point; for that I would recommend the use of an external script to chain/trigger your build steps for A/B/C.
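For illustration, one way such an external driver could look, kept in CMake itself: a small superbuild list file placed next to the A/, B/ and C/ directories (the layout and target names here are assumptions):

# CMakeLists.txt next to A/, B/ and C/ -- superbuild sketch
cmake_minimum_required(VERSION 3.5)
project(Superbuild NONE)

include(ExternalProject)

ExternalProject_Add(C_build
    SOURCE_DIR ${CMAKE_CURRENT_LIST_DIR}/C
    INSTALL_COMMAND "")

ExternalProject_Add(B_build
    SOURCE_DIR ${CMAKE_CURRENT_LIST_DIR}/B
    DEPENDS C_build
    INSTALL_COMMAND "")

ExternalProject_Add(A_build
    SOURCE_DIR ${CMAKE_CURRENT_LIST_DIR}/A
    DEPENDS B_build
    INSTALL_COMMAND "")

Building this one project then builds C, B and A in order, while each sub-build should still resolve its dependencies through find_package() and the package registry populated by export(PACKAGE) in MyCommonCode.cmake.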
I have a complex bunch of C/C++ applications that I'm working on, which is also supposed to be platform independent. So far it is UNIX/Windows compatible and runs fine. However, maintaining this monster in VS2010 is a nightmare. I have the following file structure:
/
  sources
    lib1
      include
        ...
      src
        ...
    lib2
      include
        ...
      src
        ...
    app3
      include
        ...
      src
        ...
  builders
    cmake
      ...
    make
      ...
    VS2010
      vs2010.sln
      lib1
        lib1.vcxproj
        lib1.vcxproj.filters
      lib2
        lib2.vcxproj
        lib2.vcxproj.filters
      app3
        app3.vcxproj
        app3.vcxproj.filters
As we can see, because everything is platform independent, I had to completely separate the builders from sources. IMHO, that itself is a very good practice and it should be enforced by everyone :)
Here is the problem now... VS2010 is completely unusable when it comes to organizing the include/source files into filters. You have to do that manually by repeatedly doing "Add -> New Filter" followed by "Add -> Existing Item". I have a VERY complex folder structure and files in each and every include folder. The task of creating the filters becomes a full-day job. On the other hand, I could just drag a whole folder from Explorer onto the project inside VS2010, but it will put all header/source files in there without any filters, rendering it worthless: you can't possibly search through 100 files for the right one without some sort of hierarchy.
Question is:
Does VS2010 have some obscure way of importing a folder AND preserving the folder structure as filters? It looks to me like the M$FT people who created VS2010 think that M$FT is the only animal in the jungle and that you MUST pollute the sources folder with builder projects so you can leverage "show hidden files" to include them in the project along with the folder structure. That is absurd, IMHO...
You are using CMake, so I advise you to stick with only that. You can generate makefiles and VS2010 project files with it (at least). For VS, the generated files are a .sln and a bunch of .vcxproj files (one for each project in the CMake script).
In your CMake files you can group files using the source_group command. The filters are then generated automatically for VS according to the source groups. I don't know about other IDEs like Code::Blocks or NetBeans.
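For example, a couple of hand-written groups could look like this (the file names are placeholders):

# Shown in Visual Studio as the filters "Header Files\API" and "Source Files"
source_group("Header Files\\API" FILES include/foo_api.h include/bar_api.h)
source_group("Source Files"      FILES src/foo.cpp src/bar.cpp)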
If you want automatic grouping based on file path [Comment request]:
# Glob all source files inside directory ${DIRECTORY}
file(GLOB_RECURSE TMP_FILES
    ${DIRECTORY}/*.h
    ${DIRECTORY}/*.cpp
    ${DIRECTORY}/*.c
)

foreach(f ${TMP_FILES})
    # Get the path of the file relative to ${DIRECTORY},
    # then alter it (not compulsory)
    file(RELATIVE_PATH SRCGR ${DIRECTORY} ${f})
    set(SRCGR "Something/${SRCGR}")

    # Extract the folder, i.e. remove the filename part
    string(REGEX REPLACE "(.*)(/[^/]*)$" "\\1" SRCGR ${SRCGR})

    # source_group expects \\ (double backslash), not / (slash)
    string(REPLACE / \\ SRCGR ${SRCGR})

    source_group("${SRCGR}" FILES ${f})
endforeach()
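As a usage sketch (the directory and target names are assumed), you would set ${DIRECTORY} before the snippet and then hand the globbed file list to your target:

set(DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}/sources/lib1)
# ... the glob / source_group loop from above goes here ...
add_library(lib1 STATIC ${TMP_FILES})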
I'm currently working to upgrade a set of C++ binaries that each use their own set of Makefiles to something more modern based on Autotools. However, I can't figure out how to include a third-party library (e.g. the Oracle Instant Client) in the build/packaging process.
Is this something really simple that I've missed?
Edit to add more detail
My current build environment looks like the following:
/src
  /lib
    /libfoo
      ... source and header files
      Makefile
    /oci            # Oracle Instant Client
      ... header and shared libraries
      Makefile
  /bin
    /bar
      ... source and header files
      Makefile
  Makefile
/build
  /bin
  /lib
build.sh
Today the top level build.sh does the following steps:
Runs each lib's Makefile and copies the output to /build/lib
Runs each binary's Makefile and copies the output to /build/bin
Each Makefile has a set of hardcoded paths to the various sibling directories. Needless to say, this has become a nightmare to maintain. I have started testing out Autotools, but where I am stuck is figuring out the equivalent of copying /src/lib/oci/*.so to /build/lib for compile-time linking and bundling into a distribution.
I figured out how to make this happen.
First I switched to a non recursive make.
Next I made the following changes to configure.ac, as per this page: http://www.openismus.com/documents/linux/using_libraries/using_libraries
AC_ARG_WITH([oci-include-path],
  [AS_HELP_STRING([--with-oci-include-path],
    [location of the oci headers, defaults to lib/oci])],
  [OCI_CFLAGS="-I$withval"],
  [OCI_CFLAGS="-Ilib/oci"])
AC_SUBST([OCI_CFLAGS])

AC_ARG_WITH([oci-lib-path],
  [AS_HELP_STRING([--with-oci-lib-path],
    [location of the oci libraries, defaults to lib/oci])],
  [OCI_LIBS="-L$withval -lclntsh -lnnz11"],
  [OCI_LIBS='-L./lib/oci -lclntsh -lnnz11'])
AC_SUBST([OCI_LIBS])
In the Makefile.am you then use the following lines (assuming a binary named foo)
foo_CPPFLAGS = $(OCI_CFLAGS)
foo_LDADD = libnavycommon.la $(OCI_LIBS)
ocidir = $(libdir)
oci_DATA = lib/oci/libclntsh.so.11.1 \
           lib/oci/libnnz11.so \
           lib/oci/libocci.so.11.1 \
           lib/oci/libociicus.so \
           lib/oci/libocijdbc11.so
The autotools are not a package management system, and attempting to put that type of functionality in is a bad idea. Rather than incorporating the third party library into your distribution, you should simply have the configure script check for its existence and abort if the required library is not available. The onus is on the user to satisfy the dependency. You can then release a binary package that will allow the user to use the package management system to simplify dependency resolution.
I've become the maintainer of a shared library project. The library is split into a few modules, each of them compiled as a static library and then linked together. Eclipse is used as the IDE, and the code is stored on an SVN server. So far the building process was handled by hand: building the libraries, moving all the .a and .h files into a shared folder, then building the shared library. The code needs to be compiled for Linux, ARM and Windows.
The problem is that I need to split the current modules a little bit more, for better testing (multiple test and example programs, each just one .cpp file with main) and inter-module code sharing (both module A and module B use C, but I don't want to connect A and B). This results in a more complex dependency tree which is going to be difficult to handle by hand. I also need to be able to build more configurations of one project, possibly linking to different versions of dependent projects.
How would you organise the code and set up the development environment?
EDIT: the concrete things I need from the development environment:
IDE with GUI (I like vim and shell, but the others don't)
Separate projects, each producing a static library, a set of public headers, and example programs
Different configurations for each project, linking/including different versions and/or configurations of dependencies
Code completion and SVN support
make and Makefiles are the established and very well-thought-out method for such building and linking jobs, especially in combination with automake and libtool. These tools integrate excellently with SVN, and probably also with Eclipse.
So I solved it for now. I created a folder called Pool. Directory tree:
Pool
  - inc
    - arm
      - proj1 public headers directory
      - proj2 public headers directory
      - proj3 public headers directory
    - lin
      - proj1 public headers directory
      - proj2 public headers directory
      - proj3 public headers directory
    - win
      - proj1 public headers directory
      - proj2 public headers directory
      - proj3 public headers directory
  - lib
    - arm
      - libproj1.a
      - libproj2.a
      - libproj3.a
    - lin
      - libproj1.a
      - libproj2.a
      - libproj3.a
    - win
      - libproj1.a
      - libproj2.a
      - libproj3.a
The libraries are copied here automatically by the makefile. Including a header:
#include "proj1/someheader.h"
Linking it:
-L${POOL}/lib/arm -lproj1
Note: beware of the order of the -l library parameters; a library must come after the objects and libraries that depend on it.