I'm trying to create a pybind11 project with multiple subdirectories. I managed to get the example.cpp file working with the following command (docs link below):
c++ -O3 -Wall -shared -std=c++11 -undefined dynamic_lookup `python3 -m pybind11 --includes` example.cpp -o example`python3-config --extension-suffix`
https://pybind11.readthedocs.io/en/stable/compiling.html#building-manually
It works right now by importing "example" and calling example.method. Moving forward, I want to create a Makefile independent of CMake, with the following directory structure, and import classes as needed.
Top Level Folder
├── Makefile
├── Folder 1
│ ├── Makefile?
│ ├── example1.cpp
│ ├── example2.cpp
│ ├── example3.cpp
│ ├── ...
├── Folder 2
│ ├── Makefile?
│ ├── example1.cpp
│ ├── example2.cpp
│ ├── example3.cpp
│ ├── ...
I want to make the Makefile generic so that I can create a new folder with new .cpp files and have everything work cleanly in Python. Some classes may use information from classes in other folders, and I'm having some difficulty with that. Does anyone know how to do this? Thanks!
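A minimal sketch of such a generic top-level Makefile follows. It assumes one pybind11 module per subdirectory, named after the directory (so the PYBIND11_MODULE name in each folder's sources must match the folder name) and built from all the .cpp files in that folder; the folder names and flags are placeholders to adapt.

```make
# Sketch of a generic top-level Makefile (assumption: each listed
# directory builds one pybind11 module named after the directory).
MODULES  := Folder1 Folder2
PYINC    := $(shell python3 -m pybind11 --includes)
EXT      := $(shell python3-config --extension-suffix)
CXX      ?= c++
# -I. lets sources include headers from sibling folders;
# on macOS also add: -undefined dynamic_lookup
CXXFLAGS := -O3 -Wall -std=c++11 -fPIC -I. $(PYINC)

all: $(foreach m,$(MODULES),$(m)/$(m)$(EXT))

# Generate one link rule per module directory.
define MODULE_RULE
$(1)/$(1)$(EXT): $(wildcard $(1)/*.cpp)
	$(CXX) $(CXXFLAGS) -shared $$^ -o $$@
endef
$(foreach m,$(MODULES),$(eval $(call MODULE_RULE,$(m))))

clean:
	rm -f $(foreach m,$(MODULES),$(m)/$(m)$(EXT))

.PHONY: all clean
```

With this layout, adding a new folder only requires appending its name to MODULES; Python can then import the module once the folder is on sys.path.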
I am new to Makefiles and have been trying to get the following working with little success. In my project each subdirectory has:
a Release/ dir with a makefile that builds object files into ./Release/Src/ and outputs a dir.so to ../Bin/ (one per dir)
a Debug/ dir with a makefile that builds object files into ./Debug/Src/ and outputs a dir.so to ../Bin/ (one per dir)
a Src/ dir with the source code from which Release or Debug is built
Currently the Makefile below always builds Release/ and never Debug/. I would like a way (or some equivalent) to run something like:
make release - which would recursively run make in the release folders OR
make debug - which would recursively run make in the debug folders
I would like to retain the recursive nature of the makefiles if possible here (I appreciate there is some dispute as to whether this is best practice).
Project structure:
├── Makefile
├── Bin
│ ├── *.so
├── dir1
│ ├── Src
│ │ ├── prog1.cpp
│ │ ├── prog2.cpp
│ ├── Debug
│ │ ├── makefile
│ │ ├── Src
│ │ │ ├── *.o
│ │ │ ├── *.d
│ ├── Release
│ │ ├── makefile
│ │ ├── Src
│ │ │ ├── *.o
│ │ │ ├── *.d
├── dir2
│ ├── Src
│ │ ├── prog1.cpp
│ │ ├── prog2.cpp
│ ├── Debug
│ │ ├── makefile
│ │ ├── Src
│ │ │ ├── *.o
│ │ │ ├── *.d
│ ├── Release
│ │ ├── makefile
│ │ ├── Src
│ │ │ ├── *.o
│ │ │ ├── *.d
Top level Makefile:
SUBDIRS := dir1 dir2 dir3 #...
DEBUG_DIRS := $(addsuffix /Debug,$(SUBDIRS))
RELEASE_DIRS := $(addsuffix /Release,$(SUBDIRS))
$(info $(RELEASE_DIRS))
.PHONY: debug release $(DEBUG_DIRS) $(RELEASE_DIRS) $(SUBDIRS)
debug: $(DEBUG_DIRS)
$(DEBUG_DIRS):
	$(MAKE) -C $@
release: $(RELEASE_DIRS)
$(RELEASE_DIRS):
	$(MAKE) -C $@
If you're compiling with different options, then what you need to do...
Produce different .so files such as Dir-debug.so and Dir-release.so.
Based on .o files found in different directories such as obj-debug/foo.o and obj-release/foo.o
Then you can create generic, nearly identical rules for building both obj-debug/%.o and obj-release/%.o
And after that, nearly identical rules for creating the shared libs.
Using GNU make's patsubst function (pattern substitution), you can fairly readily do something like this:
ALL_MY_SRC = $(shell find ${SRCDIR} -name "*.cpp" | sort)
DEBUG_OBJ = $(patsubst %.cpp,${DEBUG_OBJDIR}/%.o,${ALL_MY_SRC})
RELEASE_OBJ = $(patsubst %.cpp,${RELEASE_OBJDIR}/%.o,${ALL_MY_SRC})
Those might not be perfect and you might have to play with them a bit, but that should give you one variable with all the sources and one each for the two object lists. Then you can make a rule like:
${DEBUG_OBJDIR}/%.o: ${SRCDIR}/%.cpp
g++ ...
Again, play with it.
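Putting those pieces together, one subdirectory's makefile might be sketched like this; the directory names, flags, and the dir.so output names are assumptions to adapt, not the poster's actual layout:

```make
# Sketch: one non-recursive makefile producing both debug and release
# objects and shared libraries for a single subdirectory.
SRCDIR         := Src
DEBUG_OBJDIR   := Debug/Src
RELEASE_OBJDIR := Release/Src

ALL_MY_SRC  := $(shell find $(SRCDIR) -name "*.cpp" | sort)
DEBUG_OBJ   := $(patsubst $(SRCDIR)/%.cpp,$(DEBUG_OBJDIR)/%.o,$(ALL_MY_SRC))
RELEASE_OBJ := $(patsubst $(SRCDIR)/%.cpp,$(RELEASE_OBJDIR)/%.o,$(ALL_MY_SRC))

CXXFLAGS         := -Wall -fPIC
DEBUG_CXXFLAGS   := $(CXXFLAGS) -g -O0
RELEASE_CXXFLAGS := $(CXXFLAGS) -O2

debug: ../Bin/dir-debug.so
release: ../Bin/dir-release.so

../Bin/dir-debug.so: $(DEBUG_OBJ)
	$(CXX) -shared $^ -o $@
../Bin/dir-release.so: $(RELEASE_OBJ)
	$(CXX) -shared $^ -o $@

# Nearly identical pattern rules, one per object directory.
$(DEBUG_OBJDIR)/%.o: $(SRCDIR)/%.cpp
	@mkdir -p $(dir $@)
	$(CXX) $(DEBUG_CXXFLAGS) -c $< -o $@
$(RELEASE_OBJDIR)/%.o: $(SRCDIR)/%.cpp
	@mkdir -p $(dir $@)
	$(CXX) $(RELEASE_CXXFLAGS) -c $< -o $@

.PHONY: debug release
```

The top-level Makefile from the question can then invoke this one with `make -C dirN/Debug` or `make -C dirN/Release` as before, or pass a goal (`make -C dirN debug`) if the makefile lives in the subdirectory root.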
I am currently working on a project which requires C++20 (g++ v11) features and CMake. The project tree is similar to the following one:
- Top level
- src
- IO
- IO.cpp
- CMakeLists.txt
- main.cpp
- CMakeLists.txt
CMake compiles the IO module without any problem, but it generates the gcm.cache folder in the following way:
- build
- Some other CMake files and folders
- bin
- lib
- src
- IO
- gcm.cache
- IO.gcm
Therefore, g++ cannot find the gcm.cache folder and gives me this error:
IO: error: failed to read compiled module: No such file or directory
IO: note: compiled module file is 'gcm.cache/IO.gcm'
IO: note: imports must be built before being imported
IO: fatal error: returning to the gate for a mechanical issue
I would be grateful if anyone could tell me whether there is a way to specify the gcm.cache location using CMake, to force CMake to search for gcm files recursively, or to tell it to create a top-level gcm.cache and store everything inside it. I cannot find any answer anywhere, since the documentation on C++20 modules is terrible. Thanks in advance...
I have experienced the exact same issue and, without actually discovering a solution, have found a workaround. Complete code found here.
In short, I create symbolic links so that the subprojects all use the gcm.cache/ directory located in the root of the project. From within each subproject directory, create a symlink like so:
ln -fs ../gcm.cache gcm.cache
This is the directory tree of the project:
.
├── engine
│ ├── core
│ │ └── types.cpp
│ ├── engine.cpp
│ ├── gcm.cache -> ../gcm.cache
│ ├── Makefile
│ └── memory
├── gcm.cache
├── init.sh
├── Makefile
└── testgame
├── gamelib.cpp
├── gcm.cache -> ../gcm.cache
├── Makefile
└── test.cpp
So when GCC builds the engine and testgame projects, it actually uses the gcm.cache/ from the root directory. Until something better comes along, this is my go-to method.
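To avoid creating each link by hand, an init script (like the init.sh in the tree above) could set them up; this is a sketch assuming subprojects named engine and testgame:

```shell
#!/bin/sh
# Sketch: create the shared gcm.cache in the project root and symlink it
# into each subproject (subproject names here are assumptions).
set -e
mkdir -p gcm.cache
for sub in engine testgame; do
    mkdir -p "$sub"
    # -n replaces an existing link instead of descending into it
    ln -fns ../gcm.cache "$sub/gcm.cache"
done
```

Running it is idempotent, so it can safely be re-run after a clean checkout.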
I have the following directories:
$ cd dir1
$ tree
[...]
├── autogen.sh
├── CMakeLists.txt
├── common.gmake.in
├── configure.ac
├── Makefile.am
├── src
│ ├── Makefile.am
│ ├── Makefile.in
│ ├── [...]
│ └── dir2
│ ├── CMakeLists.txt
│ ├── Makefile.am
│ ├── [...]
And in the Makefile.am of dir2, I have:
[...]
include #abs_top_builddir#/common.gmake
[...]
The first time I compile, everything works just fine. But when I compile a second time, because a config.status file was produced, the Makefile wants to execute a make distclean. It executes it in dir1 and removes common.gmake. But since it is recursive, when it enters the dir2 directory I get the following error:
Making distclean in dir2
gmake[2]: Entering directory `/dir1/src/dir2'
Makefile:1760: /dir1/common.gmake: No such file or directory
gmake[2]: *** No rule to make target `/dir1/common.gmake'. Stop.
It looks to me like the recursive distclean target generated by automake is wrong, as it depends on a file it has already removed. What do you think?
I am new to CMake. I have a project which uses dlib and OpenCV; they are defined as submodules in the third_party folder. I want to link them to my main target, 'node', with CMake, but I could not achieve it. I am sharing my project tree. I tried the find_package(OpenCV) and target_link_libraries(recognition-node ${OpenCV_LIBS}) way, but I need to compile from source without installing anything. In the end, I just want to run 'cmake . && make'.
.
├── CMakeLists.txt
├── node
│ ├── build.sh
│ ├── CMakeLists.txt
│ ├── configure.sh
│ ├── findfacestask.cpp
│ ├── findfacestask.h
│ ├── main.cpp
│ ├── matrixwrapper.h
│ ├── poolcontext.cpp
│ ├── poolcontext.h
│ ├── recognition.dat
│ ├── recognizefacetask.cpp
│ ├── recognizefacetask.h
│ ├── runscript
│ ├── sp.dat
│ ├── task.cpp
│ ├── task.h
│ ├── unhandledexception.cpp
│ ├── unhandledexception.h
│ ├── webcamfeed.cpp
│ ├── webcamfeed.h
│ ├── wrapper.cpp
│ └── wrapper.h
└── third_party
├── dlib
│ ├── appveyor.yml
│ ├── CMakeLists.txt
│ ├── dlib
│ ├── docs
│ ├── examples
│ ├── MANIFEST.in
│ ├── python_examples
│ ├── README.md
│ ├── setup.py
│ └── tools
└── opencv
├── 3rdparty
├── apps
├── cmake
├── CMakeLists.txt
├── CONTRIBUTING.md
├── data
├── doc
├── include
├── LICENSE
├── modules
├── platforms
├── README.md
└── samples
Content of my top CMakeLists.txt
cmake_minimum_required(VERSION 2.8.12)
set (CMAKE_CXX_STANDARD 11)
add_subdirectory(node)
add_subdirectory(third_party/dlib)
add_subdirectory(third_party/opencv)
Content of node/CMakeLists.txt
cmake_minimum_required(VERSION 2.8.12)
project(recognition-node)
set(CMAKE_AUTOMOC ON)
find_package(Qt5Widgets REQUIRED)
add_executable(recognition-node main.cpp
webcamfeed.cpp
poolcontext.cpp
unhandledexception.cpp
task.cpp
findfacestask.cpp
wrapper.cpp
recognizefacetask.cpp)
target_link_libraries(recognition-node Qt5::Widgets)
target_link_libraries(recognition-node dlib::dlib)
target_link_libraries(recognition-node opencv::core)
It gives error in 'make' stage which is :
/home/arnes/workspace/recognition-node/node/poolcontext.h:10:28: fatal error:
opencv2/core.hpp: No such file or directory
Since you insist on keeping OpenCV in your project tree:
(That would be the easier way, but I just want to do it this way.)
Here is a solution that works with the project tree you posted in the question and with opencv-3.4.1. For simplicity I will neglect the dlib library and the Qt dependency, since you didn't have any problems with them.
Root CMakeLists.txt should have the following content:
cmake_minimum_required(VERSION 2.8.11) # or anything higher, if you wish
project(recognition-node CXX)
add_subdirectory(node)
The CMakeLists.txt under the node directory should have the following content:
add_subdirectory(third_party)
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -std=c++11 -g") # or any other additional flags
# at this point you can add find_package(Qt5Widgets REQUIRED) and later link your binary against Qt5::widgets as well
add_executable(myExec main.cpp
# and put here all the other source files of your project ...
)
# for linking libs I have put additionally highgui and imgproc to check the solution against OpenCV official sample
target_link_libraries(myExec opencv_core opencv_highgui opencv_imgproc)
target_include_directories(myExec PUBLIC
third_party/opencv/modules/calib3d/include
third_party/opencv/modules/core/include
third_party/opencv/modules/cudaarithm/include
third_party/opencv/modules/cudabgsegm/include
third_party/opencv/modules/cudacodec/include
third_party/opencv/modules/cudafeatures2d/include
third_party/opencv/modules/cudafilters/include
third_party/opencv/modules/cudaimgproc/include
third_party/opencv/modules/cudalegacy/include
third_party/opencv/modules/cudaobjdetect/include
third_party/opencv/modules/cudaoptflow/include
third_party/opencv/modules/cudastereo/include
third_party/opencv/modules/cudawarping/include
third_party/opencv/modules/cudev/include
third_party/opencv/modules/dnn/include
third_party/opencv/modules/features2d/include
third_party/opencv/modules/flann/include
third_party/opencv/modules/highgui/include
third_party/opencv/modules/imgcodecs/include
third_party/opencv/modules/imgproc/include
third_party/opencv/modules/ml/include
third_party/opencv/modules/objdetect/include
third_party/opencv/modules/photo/include
third_party/opencv/modules/shape/include
third_party/opencv/modules/stitching/include
third_party/opencv/modules/superres/include
third_party/opencv/modules/ts/include
third_party/opencv/modules/video/include
third_party/opencv/modules/videoio/include
third_party/opencv/modules/videostab/include
third_party/opencv/modules/viz/include
third_party/opencv/modules/world/include
)
The CMakeLists.txt under third_party should contain only:
add_subdirectory(opencv)
# add_subdirectory(dlib) # if you will use dlib, of course also add dlib
The sample I used to verify the build is contours2.cpp (just copy pasted the content into main.cpp).
However, I still think that using this solution is a terrible idea:
OpenCV takes a really long time to compile
you have to add the include dirs manually (you can use some macro generator, but that usually looks even uglier)
your build system ends up with a lot of targets (over 300) that you don't really need, including an install target
So my recommendation is: use this solution for experimentation if you want, but just compile and install OpenCV system-wide (or locally, if you are not the admin) when you really need to use it.
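For comparison, the conventional route once OpenCV is installed (system-wide or locally) is just find_package; a sketch, where the target name and source list are placeholders:

```cmake
cmake_minimum_required(VERSION 3.1)
project(recognition-node CXX)
set(CMAKE_CXX_STANDARD 11)

# Finds an installed OpenCV; for a local install, pass -DOpenCV_DIR=<path>.
find_package(OpenCV REQUIRED COMPONENTS core highgui imgproc)

add_executable(recognition-node main.cpp) # plus the other sources
target_include_directories(recognition-node PRIVATE ${OpenCV_INCLUDE_DIRS})
target_link_libraries(recognition-node ${OpenCV_LIBS})
```

Note the variable casing: OpenCV's config file defines OpenCV_LIBS and OpenCV_INCLUDE_DIRS, not OPENCV_LIBS.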
I'm trying to unit test my Qt application with QTestLib. I saw that the new Visual Studio 2012 has a built-in C++ test framework, and googling for it I found this page, which talks about different methods of testing a native project. I would have two separate projects: one for the normal program and one for the tests. My application is not a DLL but a simple C++ exe. Is the best way to test it to have the test project link against the .obj files, or against libs? I wouldn't export anything from the source code, as mine is not a DLL.
This is a typical QtTest project with three code units: unit1, unit2 and unit3
project/
├── project.pro
├── src
│ ├── main.cpp
│ ├── src.pro
│ ├── unit1.cpp
│ ├── unit1.h
│ ├── unit2.cpp
│ ├── unit2.h
│ ├── unit3.cpp
│ └── unit3.h
└── tests
├── stubs
│ ├── stubs.pro
│ ├── unit1_stub.cpp
│ ├── unit2_stub.cpp
│ └── unit3_stub.cpp
├── test1
│ ├── test1.cpp
│ ├── test1.h
│ └── test1.pro
├── test2
│ ├── test2.cpp
│ ├── test2.h
│ └── test2.pro
├── test3
│ ├── test3.cpp
│ ├── test3.h
│ └── test3.pro
└── tests.pro
This project results in 4 binaries: the application itself and three test binaries, one for testing each of the units. For example, test1 should include src/unit1.cpp and src/unit1.h, plus stubbed unit2 and unit3: src/unit2.h with tests/stubs/unit2_stub.cpp, and src/unit3.h with tests/stubs/unit3_stub.cpp. With this kind of setup src/unit1.cpp and tests/stubs/unit1_stub.cpp will each be compiled twice, and this number grows as the number of units grows. This is not a problem for small projects, but for huge projects it may lead to a significant increase in build time.
Splitting each unitX.cpp/unitX.h pair into a separate static library, linked into the main application and into each of the tests, then eliminates the need to build them multiple times.
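In qmake terms, that could look roughly like the following sketch (paths and file names are assumptions, not the poster's actual .pro files): the units are built once into a static library, and each test links against it instead of recompiling the sources.

```
# src/units.pro -- sketch: build the units once as a static library
TEMPLATE  = lib
CONFIG   += staticlib
TARGET    = units
SOURCES  += unit1.cpp unit2.cpp unit3.cpp
HEADERS  += unit1.h unit2.h unit3.h

# tests/test1/test1.pro -- sketch: link the prebuilt library
QT          += testlib
TEMPLATE     = app
SOURCES     += test1.cpp
HEADERS     += test1.h
INCLUDEPATH += ../../src
LIBS        += -L../../src -lunits
```

The stubs still need compiling per test, since each test replaces different units, but the real units are now compiled exactly once.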