I have a C++ project with its source files (.cpp and .h) in a directory called src and its subdirectories. Once the project is compiled, I want to copy all the header files from this source folder into another directory, maintaining the folder structure of src.
I have tried to copy these files with post-build commands:
postbuildcommands
{
"{COPY} src/*.h include"
}
and
postbuildcommands
{
"{COPY} src/**.h include"
}
But these only copy the .h files directly in src and not those in subdirectories. For example, this
src
+-- a.h
+-- a.cpp
+-- sub
| +-- b.h
| +-- b.cpp
becomes
include
+-- a.h
instead of
include
+-- a.h
+-- sub
| +-- b.h
Are you using Windows, Linux, or Mac? Or does it need to be cross-platform?
It looks like the {COPY} token doesn't pass the /s flag to xcopy on Windows:
https://github.com/premake/premake-core/wiki/Tokens
https://learn.microsoft.com/en-us/windows-server/administration/windows-commands/xcopy
One possible solution is to find all the files (or get them from the project somehow) and create a post-build command for each header file, which might be a lot slower.
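That per-file approach could be sketched like this in the premake script itself (a sketch assuming Premake 5; it relies on the os.matchfiles, path.getrelative, and path.getdirectory functions and the {MKDIR}/{COPY} tokens):

```lua
-- Sketch: emit one post-build command pair per header found under src/,
-- preserving the directory layout. The {MKDIR} and {COPY} tokens are
-- expanded by Premake into the platform's mkdir/copy commands.
for _, header in ipairs(os.matchfiles("src/**.h")) do
    local rel = path.getrelative("src", header)         -- e.g. "sub/b.h"
    local destdir = "include/" .. path.getdirectory(rel)
    postbuildcommands {
        "{MKDIR} " .. destdir,
        "{COPY} " .. header .. " " .. destdir,
    }
end
```

This runs at project-generation time, so headers added later require regenerating the project files.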
Related
I have a project structured in this way:
project
|\_subdir
| | Some .cxx files
| | Some .h files (included with #include "foo.h")
| \_subdir
| | Some .h files (included with #include <subdir/foo.h>)
\_subdir2
| Some .cxx files
| Some .h files (included with #include "foo.h")
\_subdir2
| Some .h files (included with #include <subdir2/foo.h>)
I need to do a few things here:
The subdir/subdir directories should be publicly available for use in programs that use this library with #include <subdir/foo.h>.
The subdir directories should produce their own statically linked libraries so that I can run tests making sure that the dependency relationships between the different subdir directories are maintained.
The whole thing should produce either a shared or static library that has everything in all the subdir directories and can be linked to by programs using this library.
Currently I have a build system made using autotools (but not automake) plus some custom-made Perl scripts that process the dependency output from the -M flags and create custom makefiles that express dependency information. Additionally, they create an include directory that's a linkfarm, so everybody can use #include <subdir/foo.h> and just -I the linkfarm directory.
I want to convert this to CMake, but I'm at a loss as to how to handle the subdir structure and includes. I can re-arrange the directory structure somewhat if absolutely necessary, but I would very much like to preserve the pseudo-independence of the various subdir directories, partly because it's very important to me to be able to make sure the sub-libraries have a specific dependency relationship as a way to maintain modularity and cleanliness.
How would I do this in CMake? Do I have to move the subdir/subdir directories?
Because it looks like if I add the subdir directories as a public include, things will go really awry: people will see the private .h files at the subdir level, and will also be able to include the .cxx files that are in them. Even worse, they will see them as top-level includes (i.e. #include <bar.cxx>, where bar.cxx is a private implementation file whose object code should already be in a library).
But if I add subdir/subdir as an include directory, the .h files won't appear in subdirectories like they should (i.e. #include <subdir/public.h> won't work; instead I'll have to use #include <public.h>, which isn't my intent at all).
If you care about users of your library, note that they will see the installed files, not the ones in the source/build directories. If you don't install the private headers (from subdir/), a user won't see them.
But if you do not intend to install the library, you may separate the public and private include trees. E.g., each project may have private headers directly in its source directory, and public headers under its include/ subdirectory:
project
|\_subdir
| | Some .cxx files
| | Private .h files (included with #include "foo.h")
| \_include (public include directory)
| \_subdir
| | Public .h files (included with #include <subdir/foo.h>)
\_subdir2
| some .cxx files
| Private .h files (included with #include "foo.h")
\_include (public include directory)
\_subdir2
| Public .h files (included with #include <subdir2/foo.h>)
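With that layout, each sub-library's CMakeLists.txt can expose only its include/ tree. A minimal sketch (the target name and file names here are hypothetical, matching the tree above):

```cmake
# Sketch for subdir/CMakeLists.txt: private headers stay next to the
# sources; only include/ (which contains subdir/foo.h) is made public.
add_library(subdir STATIC foo.cxx)
target_include_directories(subdir
    PUBLIC  ${CMAKE_CURRENT_SOURCE_DIR}/include  # users get <subdir/foo.h>
    PRIVATE ${CMAKE_CURRENT_SOURCE_DIR}          # sources get "foo.h"
)
```

Consumers then just link the target (target_link_libraries(app PRIVATE subdir)) and the PUBLIC include path propagates automatically, which also preserves the dependency relationships between the sub-libraries.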
I've recently started watching Stanford's CS106B lectures on YouTube, and I've downloaded the "Stanford C++ Libraries" that they've made. I right-clicked my project and added the whole folder (named "cs106lib-0.3.1") to the "Include Directories" and "Include Headers" sections, but when I import one of the headers, "vector.h", and create an object using it, I get "unable to resolve identifier vector", and the compiler says the folder doesn't exist, although it's definitely on my desktop. Sorry if this question has been asked before; I can't find it, and I have been stuck looking for the past day.
Don't mix up include directories and include headers:
Include directories: adds directories where your header files are
Include headers: adds single header files
Also make sure your paths are ok. Let's assume a structure like this:
cs106lib-0.3.1
|
+-- include
| |
| +-- Example1.h
| |
| +-- subdir/Example2.h
|
+-- ...
In this case you have to add the directory cs106lib-0.3.1/include to the include directories.
Now you can use it like this:
#include "Example1.h"
#include "subdir/Example2.h"
// ...
Also don't forget to add the binaries (if you have any) to the linker flags.
TIP: Use code completion to see where you are; e.g. type #include "../" <Ctrl+Space> to see the files and directories available to include.
I'm browsing a project which has a src directory similar to:
src/
|-- A.cpp
|-- dir/
|-- |-- B.h
|-- |-- B.cpp
In A.cpp they include B.h as such:
// In A.cpp:
#include "B.h"
In the Visual Studio 2010 solution generated with CMake, this compiles just fine, and there is no conflict. However, if I make a new project (not from CMake, but manually) and do the include as above, VS will not find B.h. I.e.:
// In A.cpp (non-CMake project version)
#include "B.h"     // Error: File not found (VS does not search the
                   // sub-directory when the name is surrounded by "").
#include "dir/B.h" // Works: sub-directory specified.
The CMake file uses GLOB_RECURSE (which I assume is the reason the above works; please correct me if I'm wrong), and simplified it looks something like this:
cmake_minimum_required (VERSION 2.6)
project (cmake_test)
file(GLOB_RECURSE lib_SRCS RELATIVE ${CMAKE_CURRENT_SOURCE_DIR} *.cpp *.h)
include_directories(
${CMAKE_CURRENT_SOURCE_DIR}
${CMAKE_CURRENT_SOURCE_DIR}/dir
)
add_library(lib STATIC
${lib_SRCS}
)
target_link_libraries(lib)
I've been examining the differences between the solution generated by CMake and the one I created manually, but I can't seem to find the relevant one.
My question, in other words, is how I can get VS to find B.h without specifying the relative path, as the CMake-generated solution does, but without actually using CMake (for example, via some option or configuration in my Visual Studio project).
The part that makes it work is
include_directories(
${CMAKE_CURRENT_SOURCE_DIR}
${CMAKE_CURRENT_SOURCE_DIR}/dir
)
Your dir/ directory is added to the include directories there. In Visual Studio, you need to go to the project properties at C/C++->General->Additional Include Directories and add dir/.
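If you'd rather edit the project file directly, the same setting lives in the .vcxproj. A sketch of the relevant fragment (the path is illustrative; the surrounding project file is assumed to be an otherwise default VS 2010 project):

```xml
<!-- Inside an ItemDefinitionGroup of the .vcxproj -->
<ClCompile>
  <AdditionalIncludeDirectories>$(ProjectDir)dir;%(AdditionalIncludeDirectories)</AdditionalIncludeDirectories>
</ClCompile>
```

The %(AdditionalIncludeDirectories) part keeps any inherited include directories intact.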
I have the following directory structure:
src
+-- lib1
|   +-- lib1.h
+-- lib2
    +-- lib2.h
Both lib1 and lib2 are going to be distributed (installed). lib2 makes use of lib1, so it needs some includes:
#include "../lib1/lib1.h" // 1
#include "lib1/lib1.h" // 2
#include <lib1/lib1.h> // 3
(1) is the straightforward way, but is very inflexible. (2) is the way I use at the moment, but the build system needs to know that src has to be added to the include path. (3) seems best to me from the distribution aspect, because then it can be assumed that the headers reside in a standard location, but it's not obvious to me how a build system handles that (in this case, lib1 needs to be installed before lib2 can be compiled).
What's the recommended way?
The only difference between the "" and <> forms of include is that the "" form first searches in some additional places and then falls back to the same places as <>. The set of additional places is implementation-dependent, and the only common one is the directory of the file containing the include directive. The compiler options which add to the include path usually add it for the <> form, so those directories get searched for both forms.
So the choice between the two forms is mostly a style one. Using the "" form for the current project and the <> one for system libraries is common. For things in between, make a choice and stick to it in your project.
I vote for version 2.
#include "../lib1/lib1.h" // 1
This assumes the tree will always stay the same. So when you change your structure, you'll need to modify this everywhere.
#include "lib1/lib1.h" // 2
I don't see what the problem of adding src to the include path is. Actually, you don't even need to add src to the include path, you can directly add src/lib1 and just have #include "lib1.h"
#include <lib1/lib1.h> // 3
This style of include is used for system headers. You should avoid it, as most programmers are used to seeing windows.h or string or vector inside <>. You're also telling the compiler to first look for those headers in the default directories rather than your own. I'd avoid this one.
Side note:
You should think about a structure like this:
src
+-- lib1
|   +-- lib1.h
+-- lib2
    +-- lib2.h
include
where the include directory contains all publicly visible headers. If lib1.h is public, move it there. If not, the structure you currently have should be ok.
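The build-system side of option (2) is small. For example, in a hypothetical CMake setup (target and file names assumed):

```cmake
# Sketch: make "lib1/lib1.h"-style includes work by putting src on the
# include path of the consuming target.
add_library(lib2 STATIC src/lib2/lib2.cpp)
target_include_directories(lib2 PRIVATE ${CMAKE_SOURCE_DIR}/src)
```

The equivalent in a plain Makefile would just be adding -Isrc to the compile flags.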
I have a C++ project that is built using Boost.Build. The project consists of 3 subprojects.
. [root]
\-- source
\-- common
\-- config
\-- config.cpp
\-- project_1
\-- Jamfile.jam
\-- project_2
\-- Jamfile.jam
\-- project_3
\-- Jamfile.jam
\-- Jamroot.jam
Jamroot.jam:
project my_project
    : requirements
      <threading>multi
      <variant>debug:<define>DEBUG
    : default-build
      <link>static
    : build-dir bin
    ;
alias project_1 : source/project_1 ;
alias project_2 : source/project_2 ;
alias project_3 : source/project_3 ;
install dist : project_1 project_2 project_3
    : <install-dependencies>on <install-type>EXE
    ;
Each project has Jamfile.jam according to this template:
project project_N
    : requirements
      <define>CONFIG_DEFINE_1=
      <define>CONFIG_DEFINE_2=
    ;
lib config : [ glob ../common/config/*.cpp ] ;
exe project_N
    : [ glob *.cpp ] config
    :
    ;
config.cpp uses defines CONFIG_DEFINE_1 and CONFIG_DEFINE_2 for conditional compilation (actually they are simply constants), so there's a separate version of config library per project.
The problem is that this approach causes the config library to be rebuilt each time the whole project is built, regardless of whether the files changed or not. I.e., building the first time, everything is compiled and linked; building the second time without any modifications, only the config library is rebuilt for each project_N. How should I properly set up the build so that no redundant compilation occurs?
As I understand it, your config library is shared across different projects and uses different defines for each project.
It's not possible to avoid recompilation in that case, irrespective of Boost.Build or any other build system: between compiles of the .cpp files, the preprocessed output has changed.
If you want to avoid recompilation, one option is to split the config library into different libraries, one for each project, but depending on what config looks like, having a lot of code duplication is rarely desirable either...
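One way such a split might look in Boost.Build, without duplicating the sources, is to declare a separately named config target per project (a hedged sketch; the target names are hypothetical and the define values are left as in the question):

```jam
# Each project links against its own config_N target; each is compiled
# once with its own defines into its own variant directory, so a
# no-change rebuild should find both up to date.
lib config_1 : [ glob common/config/*.cpp ] : <define>CONFIG_DEFINE_1= ;
lib config_2 : [ glob common/config/*.cpp ] : <define>CONFIG_DEFINE_2= ;
```

Whether this removes the rebuild depends on how the projects currently re-declare the shared config target.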
The only other option I can think of is reducing the amount of code that needs to be recompiled every time.
E.g., you have a source file LargeFunction.cpp with:
#if CONFIG_DEFINE_1
void VeryLargeFunction() {
...
}
#elif CONFIG_DEFINE_2
void VeryLargeFunction() {
...
}
#endif
Split it into three files: one containing VeryLargeFunction as defined for CONFIG_DEFINE_1, one as defined for CONFIG_DEFINE_2, and one which simply includes one of them based on the values of the defines:
#if CONFIG_DEFINE_1
#include "definitionFileFor1"
#elif CONFIG_DEFINE_2
#include "definitionFileFor2"
#endif
This file still needs to be recompiled every time, but the object files which contain the 'real' code will not.
You'll effectively only relink the existing object files on every build, instead of recompiling everything.
The disadvantage is more maintenance, though, and the different function definitions reside in different files, so the code becomes a bit harder to read.