Fast Haskell rebuild+test with file watch using cabal + GHCID?

My question, in short: how do I get a fast save-retest workflow in a cabal-managed, multi-library Haskell project repository?
I have already tried a few things and done some research. Before getting into the details, please have a look at the typical project repo structure; the question is then broken down further below.
Repository Development Structure
I work on multiple Haskell projects that usually have the following form:
.
├── foo
│   ├── foo.cabal
│   ├── src
│   ├── unit-test
│   └── ...
├── bar
│   ├── bar.cabal
│   ├── src
│   ├── unit-test
│   └── ...
├── baz
│   ├── baz.cabal
│   ├── src
│   ├── unit-test
│   └── ...
├── stack.yaml
├── cabal.project
├── nix
│   └── ...
└── ...
The cabal.project file looks like this:
packages:
    foo
    bar
    baz
    ...

tests: True
run-tests: True
The stack file contains basically the same package list plus an LTS ID, so I can just use the stackProject nix function from IOHK's haskell.nix to provide myself a nix shell that has cabal etc. in place. (This question is more about cabal handling, so I consider this paragraph only a background note that is probably not relevant to this Stack Overflow question.)
This setup allows me to just run cabal test all anywhere in the project, which is great. This is my simple method to check that I haven't broken anything before making the next git commit.
Rapid Save-Retest Workflow
Before I moved to nix, I used stack build/test --watch, which was nice: I could keep a shell open that always rebuilt and retested the whole project after I changed anything anywhere.
This can be simulated with inotify:
while true; do
    inotifywait -e modify -r ./
    cabal test all
done
This is not really fast, but it does the job.
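One caveat (an assumption about this setup, not something stated in the question): a recursive watch on ./ also fires on cabal's own build output under dist-newstyle, which can cause spurious re-runs; inotifywait's --exclude flag can filter that out:
while true; do
    inotifywait -e modify -r --exclude 'dist-newstyle' ./
    cabal test all
done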
After I got to know about GHCID, I was amazed by how blazingly fast it is.
It is also easy to use with cabal repl.
Unfortunately (this problem was also mentioned, but left unanswered, in a comment here: How to run test suite in ghcid with cabal?), GHCID can only be run on one specific unit test suite, and it will not detect changes to the library that the unit tests are supposed to check. (Putting all the library modules into the test-suite description in the cabal file is something I consider an ugly hack that I would rather avoid.)
Also, it seems I can't run GHCID on the whole repository the way cabal test all or stack test --watch do.
The extreme speed of GHCID is something that I really want in my workflow.
cabal existed long before stack, so how do people work on their multi-library repositories to get a fast overview of all the test cases they broke after editing multiple files in multiple libraries? If the GHCID way does not work well, what is the way to do it?

I use a script with stack as follows:
ghcid -c="stack ghci <test-suite>.hs" -T="main" --warnings "$@"
This means:
Run ghcid
Use stack ghci instead of vanilla ghci
Also load the test suite module into ghci
Run main (from the test suite) upon loading ghci
Run the tests even if the compile generates GHC warnings
You could easily adapt this to use, for example, cabal repl instead of stack ghci.
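For example, a cabal-flavoured variant of the script might look like this (the test-suite name is a placeholder you would substitute):
ghcid -c="cabal repl <test-suite>" -T="main" --warnings "$@"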
This has the following drawbacks:
The name of the test module needs to be hard-coded, and is not extracted from package.yaml/the cabal file.
It does not support multiple test suites. These could all be passed to the script, but you'd need a custom main to call them all.
I previously got around these problems by using :!stack test as the command, which runs all the test suites, but since this is issued via the command line rather than loaded into ghci, the tests ran much more slowly. Additionally, it did not hot-reload on modifications to the tests.
The bottom line is: to get the benefits of ghcid, ghcid needs to be informed of which source files to look at. Any reloading of ghcid via a shell command will require a full reload and recompile by ghci, not exploiting ghci's fast hot-reload capability. If this information is stored in a build system config file, your build system needs to integrate with ghcid (absent a custom script). My guess is this is too low-level for cabal, but I have opened a feature request for stack. Add a comment or reaction there if you want to see this happen!

My current workaround is to nest ghcid calls:
ghcid --target=$LIBRARY_NAME --run=":! ghcid --target=$TEST_SUITE --run"
The innermost ghcid recompiles and reruns the tests when they change, the outermost recompiles the library and restarts the innermost one when the library source changes.
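With concrete (hypothetical) names filled in, for a local library foo with a test suite named unit-test, this might look like the following (the exact target syntax depends on whether ghcid drives cabal or stack underneath):
ghcid --target=foo --run=":! ghcid --target=foo:test:unit-test --run"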

There is ongoing work to support multiple home units in GHC, but as of March 2022, not enough has been implemented yet to support them in ghcid:
Barely any normal GHCi features are supported (#20889). It would be good to support enough for ghcid to work correctly.
In the meantime, the ugly hack mentioned by the OP is the best workaround:
during development, copy some settings from the library stanzas into the test-suite stanzas (usually hs-source-dirs, build-depends, and default-extensions), and remove those local libraries from the build-depends of the test-suite stanzas. import-ing common stanzas can help reduce this boilerplate.
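A sketch of what that hack looks like in a cabal file (all names and field values are illustrative; foo is the local library being inlined):
test-suite unit-test
  type:               exitcode-stdio-1.0
  main-is:            Main.hs
  -- dev-time hack: compile the library sources directly into the test-suite
  hs-source-dirs:     unit-test, src
  -- the local library foo is deliberately NOT listed here; its own
  -- dependencies (base, etc.) must be listed instead
  build-depends:      base, hspec
  default-extensions: OverloadedStrings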

Related

In QMake, how do I add a subdir only if the target supports C++20?

I want to run my unit tests in all available C++ versions, so I have a directory structure like
tests/
  component/
    tst_component.cpp
  cxx11/
  cxx14/
  cxx17/
  cxx20/
And I compile tst_component.cpp in each of the cxxNN subdirs in C++NN using CONFIG += c++NN. This works well if the compiler supports all these (with 1z for 17 and 2a for 20, for older qmakes). But on our CI, I have one compiler that's too old to understand the -std=c++2a option that CONFIG+=c++2a adds to the command line, so I'd like to conditionally exclude the cxx20 subdir when the compiler doesn't support (even a subset of) c++20.
I know of
CONFIG(QT_CONFIG, c++2a):SUBDIRS += cxx20
but that's testing whether Qt was built in C++20 mode, which is not what I want. It's what I'm using at the moment to get the change through the CI, but it means, of course, that the C++20 mode isn't actually tested on the CI until someone installs a Qt built with C++20.
Is there a way that's independent of how Qt is built and doesn't involve maintaining a compiler version whitelist in the build system by hand?
EDIT 2021-06-09: Something like a config test: I try to compile something with CONFIG+=c++2a, and if that works, some flag is set (like have_cxx20, enabling me to say have_cxx20:SUBDIRS+=cxx20)?
qmake has a compilation test feature, which you can use to try compiling a simple source file with a defined set of build flags. See https://doc.qt.io/qt-5/qmake-test-function-reference.html#qtcompiletest-test for the reference; here is a skeleton for such a project:
mainproject/
├── config.tests
│   └── test
│       ├── test.cpp
│       └── test.pro
├── main.cpp
└── mainproject.pro
mainproject/config.tests/test/test.pro:
SOURCES = test.cpp ## can be just a simple main() function, or even testing some c++20 specific features
QMAKE_CXXFLAGS+= -std=c++20
mainproject/mainproject.pro
load(configure)
qtCompileTest(test) {
    message("CPP20 test passed")
} else {
    message("CPP20 test failed")
}
TARGET = mainproject
SOURCES += main.cpp
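The result of a passing test can presumably then guard the subdir directly, which is what the original question asked for; an untested sketch:
load(configure)
qtCompileTest(test): SUBDIRS += cxx20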

c++ CMake project structure for lib and executable [closed]

I have a question regarding how to structure a C++ project and build it using CMake (inside CLion). I am very new to C++ (I started to learn the language only two weeks ago) and to CMake (I haven't ever configured a CMake file). I do, however, have a bunch of experience with other programming languages and their ecosystems, like Java, JavaScript, PHP, and .NET; perhaps analogies with those ecosystems could help?
We're building our own (small, 2D) game and game engine for a school project. The game engine should be shippable on its own, and the game should build on our own engine. The engine will have its own dependencies (it is built using SDL2; that's a requirement).
I imagine the engine will become a (static) library, while the actual game will compile to an executable that depends on the engine library.
For the purposes of simplifying project management we're looking to host both the engine and the game code in the same git repository.
Given that we might want to put a math module into our engine (probably not the case, but for the purpose of this question), I imagine a folder structure like the following:
our-project/
├── README.md
├── engine
│   ├── src
│   │   ├── math.cpp
│   │   └── math.h
│   └── test
│       └── ...
└── game
    ├── src
    │   └── main.cpp
    └── test
        └── ...
The engine would then contain the following code:
// in math.h
namespace engine::math {
    int add(int a, int b);
}

// in math.cpp
#include "math.h"

namespace engine::math {
    int add(int a, int b) {
        return a + b;
    }
}
Let's say we wanted to make use of this engine code from our game; I imagine the following code:
// in game/main.cpp
#include <iostream>
#include "engine/math.h"

using namespace engine;

int main() {
    std::cout << math::add(10, 1) << std::endl;
}
Questions
How do I set up CMake so that I can both build the engine (static library) individually and build the game (executable) that depends on that same engine?
How should I manage the engine's header files? How do I make them accessible to the game code? Should I use a different structure for the header files?
Is this folder structure manageable? Should I consider a different approach?
You declare separate CMake targets, e.g., in engine/src/CMakeLists.txt:
add_library(engine
    math.cpp)
while in game/src/CMakeLists.txt, you have
add_executable(my-game
    main.cpp)
target_link_libraries(my-game
    PRIVATE
        engine)
When using e.g. make, you can build the targets individually by
make engine # build only the engine, not the executable
make my-game # build engine if necessary, then build executable
Separate test targets make sense, too.
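A test target can be declared the same way; a minimal sketch, e.g. in engine/test/CMakeLists.txt (math_test.cpp is a hypothetical file name):
add_executable(engine-tests
    math_test.cpp)
target_link_libraries(engine-tests
    PRIVATE
        engine)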
Include flags in CMake are propagated as follows.
target_include_directories(engine
    INTERFACE
        ${CMAKE_CURRENT_SOURCE_DIR})
The above setup propagates an include directory to all targets that link against engine. This is simple, but has a drawback: you can't distinguish between public headers and implementation details. As an alternative, you could have a directory engine/src/public/engine with all public headers that shall be used by the game:
target_include_directories(engine
    INTERFACE
        ${CMAKE_CURRENT_SOURCE_DIR}/public)
target_include_directories(engine
    PRIVATE
        ${CMAKE_CURRENT_SOURCE_DIR}/public/engine)
This way, client code in game uses #include "engine/math.h", while inside engine you can just write #include "math.h". I like such a setup, because it is easy to see what is the interface of a library and what is its implementation. But it's also somewhat a matter of taste.
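For completeness, the two directories still need to be pulled in from a top-level CMakeLists.txt; a minimal sketch (the CMake version and project name are illustrative):
cmake_minimum_required(VERSION 3.10)
project(our-project)

add_subdirectory(engine/src)
add_subdirectory(game/src)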
Again, this is opinionated, but I think it is a good directory structure. Stick to it.

no output generated to make auto documentation by sphinx on fortran code

We want to use the autodoc tool Sphinx to create documentation for Fortran programs.
Using the two extensions developed by vacumm, vacumm.sphinxext.fortran_domain and vacumm.sphinxext.fortran_autodoc, we are supposed to be able to parse Fortran code with Sphinx.
We can successfully run make html to generate the HTML document files, but the output is empty. That is, it seems the Fortran program has not been parsed and documented, yet there is no error.
My program tree is like this:
├── doc
│   ├── _build
│   ├── conf.py
│   ├── index.rst
│   ├── make.bat
│   ├── Makefile
│   ├── _static
│   └── _templates
└── project
    └── caf10.f
conf.py includes:
sys.path.insert(0, os.path.abspath("../project/caf10.f"))

extensions = [
    'sphinx.ext.autodoc',
    'sphinx.ext.doctest',
    'sphinx.ext.intersphinx',
    'sphinx.ext.todo',
    'sphinx.ext.coverage',
    'vacumm.sphinxext.fortran_domain',
    'vacumm.sphinxext.fortran_autodoc'
]

fortran_src = [
    '/home/Documents/fortran-sphinx/project/caf10.f'
]
index.rst contains:
Documentation for the Code
**************************

.. f:autoprogram:: caf10

Contents:

.. toctree::
   :maxdepth: 2

Indices and tables
==================

* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`
Fortran program is something like:
C     Handling system components
      PROGRAM caf10
      IMPLICIT NONE
      INTEGER NOERR, NSCOM, I
      CHARACTER FNAME*12, SCNAME*24
C     Initialise program
      CALL TQINI(NOERR)
      FNAME = 'cosi.dat'
C     Open cosi.dat (system C-O-Si)
C     for reading
      CALL TQOPNA(FNAME, 10, NOERR)
C     Read data-file
      CALL TQRFIL(NOERR)
C     Close data-file
      CALL TQCLOS(10, NOERR)
C     Print the names of all system components
      WRITE(*,FMT='(A)') 'Names of system components:'
      DO I=1, NSCOM
        CALL TQGNSC(I, SCNAME, NOERR)
        WRITE(*,FMT='(I2,A)') I, ': ' // SCNAME
      ENDDO
      END
We don't have module or function definitions in these Fortran programs. We want to highlight the code and comments, use cross references, and print the output of each method into the documentation.
The generated document contains just the program name; nothing else is included:
program caf10
Running make html logs no error:
$ make html
sphinx-build -b html -d _build/doctrees . _build/html
Running Sphinx v1.2.3
loading pickled environment... not yet created
loading intersphinx inventory from http://docs.python.org/objects.inv...
parsing fortran sources... done
building [html]: targets for 1 source files that are out of date
updating environment: 1 added, 0 changed, 0 removed
reading sources... [100%] index
/home/masood/Documents/fortran-sphinx-2/doc/index.rst:4: WARNING: test2
looking for now-outdated files... none found
pickling environment... done
checking consistency... done
preparing documents... done
writing output... [100%] index
writing additional files... genindex search
copying static files... done
copying extra files... done
dumping search index... done
dumping object inventory... done
build succeeded, 1 warning.
Do you have any idea how we can produce a document that parses the program, highlights the code, includes the comments, and adds cross references?

How do I avoid multiple definition errors when purposefully linking the same object more than once?

I'm using GNU Make and GCC for compiling, on Linux (Ubuntu).
Here is a diagram for the game blackjack, where the nodes represent eventual object files.
I have organized the code so that each class is in its own folder.
tree -d
.
└── classes
    ├── dealer
    │   └── hand
    ├── deck
    │   └── card
    │       ├── card_color
    │       ├── card_suit
    │       ├── card_value
    │       └── colorizer
    ├── discard_pile
    │   └── card
    ├── user
    │   └── hand
    │       └── card
    └── user_choice
In order to make the file-system behave like a graph, some directories are symbolic links.
There is no repeated code.
For example, if the actual card code is in the directory deck, then the hand and discard_pile folders have a symbolic link to that directory in them.
I have run into multiple definition errors, because my program links against all the sub directory object files.
It will follow symbolic links and end up gathering some object files more than once.
I can't tell it to ignore symbolically linked folders, because sometimes they need to be followed.
For example: if I were to make a driver program in the dealer folder, it would need to gather the hand,card,card_color,card_suit,card_value,colorizer objects (even though the hand folder is a symbolic link).
Is there a way to continue gathering duplicate object files, but to tell gcc or make to ignore the duplicates before linking?
Because they are symbolic links, the path names are different.
Also, there may be instances where objects are named the same, but are different implementations.
I'm not sure how to identify that duplicate object files are indeed the same file and should not be linked more than once.
I don't want to make this question too general by asking for advice on how to manage big, expanding projects (where entity relationships are complicated), but if you think my structure is nonsensical or problematic, please point out the problems and other ways you would go about it. I was trying to avoid having all my source files in one directory, because without class diagrams the relationships would be difficult to derive.
Just because you USE the same function multiple times in different contexts doesn't mean you need to link it more than once.
Your card.cpp needs to go into a common location that can be used by all your different components.
Typically, a project will have a set of header files and a set of source files (.cpp or .c, for example). When I work on small projects (less than about a dozen source files), I just keep all the files in one directory. But there's nothing wrong with keeping it a bit more split up. However, a file should only be in one place in the directory structure [!!NO LINKS!!].
Header files are either with their respective source files, or in a separate "include" directory.
Typically, source files don't "care" where the header files are, they are just "somewhere". Instead you tell the compiler to look for headers in "../some-component" with -I ../some-component.
And each source file is only compiled and linked once, to form the final binary.
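As a sketch, the suggested layout could look like this in a Makefile (file and target names are hypothetical):
# every source file lives exactly once under src/, headers next to sources
CXXFLAGS += -I src

objects := src/card.o src/hand.o src/deck.o src/dealer.o src/main.o

# each object is compiled once (via the built-in %.o rule) and linked once
blackjack: $(objects)
	$(CXX) $(CXXFLAGS) -o $@ $^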

C++ project organisation (with gtest, cmake and doxygen)

I am new to programming in general so I decided that I would start by making a simple vector class in C++. However I would like to get in to good habits from the start rather than trying to modify my workflow later on.
I currently have only two files vector3.hpp and vector3.cpp. This project will slowly start to grow (making it much more of a general linear algebra library) as I become more familiar with everything, so I would like to adopt a "standard" project layout to make life easier later on. So after looking around I have found two ways to go about organizing hpp and cpp files, the first being:
project
└── src
    ├── vector3.hpp
    └── vector3.cpp
and the second being:
project
├── inc
│   └── project
│       └── vector3.hpp
└── src
    └── vector3.cpp
Which would you recommend and why?
Secondly I would like to use the Google C++ Testing Framework for unit testing my code as it seems fairly easy to use. Do you suggest bundling this with my code, for example in an inc/gtest or contrib/gtest folder? If bundled, do you suggest using the fuse_gtest_files.py script to reduce the number of files, or leaving it as is? If not bundled, how is this dependency handled?
When it comes to writing tests, how are these generally organized? I was thinking of having one cpp file for each class (test_vector3.cpp for example) but all compiled into one binary so that they can all be run together easily?
Since the gtest library is generally built using cmake and make, I was thinking that it would make sense for my project to also be built like this. If I decided to use the following project layout:
├── CMakeLists.txt
├── contrib
│   └── gtest
│       ├── gtest-all.cc
│       └── gtest.h
├── docs
│   └── Doxyfile
├── inc
│   └── project
│       └── vector3.hpp
├── src
│   └── vector3.cpp
└── test
    └── test_vector3.cpp
How would the CMakeLists.txt have to look so that it can either build just the library, or the library and the tests? Also I have seen quite a few projects that have a build and a bin directory. Does the build happen in the build directory and then the binaries are moved out into the bin directory? Would the binaries for the tests and the library live in the same place? Or would it make more sense to structure it as follows:
test
├── bin
├── build
└── src
    └── test_vector3.cpp
I would also like to use doxygen to document my code. Is it possible to get this to automatically run with cmake and make?
Sorry for so many questions, but I have not found a book on C++ that satisfactorily answers these types of questions.
C++ build systems are a bit of a black art, and the older the project the more weird stuff you can find, so it is not surprising that a lot of questions come up. I'll try to walk through the questions one by one and mention some general things regarding building C++ libraries.
Separating headers and cpp files in directories. This is only essential if you are building a component that is supposed to be used as a library, as opposed to an actual application. Your headers are the basis for users to interact with what you offer, and they must be installed. This means they have to be in a subdirectory (no-one wants lots of headers ending up in the top-level /usr/include/), and your headers must be able to include themselves with such a setup.
└── prj
    ├── include
    │   └── prj
    │       ├── header2.h
    │       └── header.h
    └── src
        └── x.cpp
works well, because include paths work out and you can use easy globbing for install targets.
Bundling dependencies: I think this largely depends on the ability of the build system to locate and configure dependencies, and on how dependent your code is on a single version. It also depends on how able your users are and how easy the dependency is to install on their platform. CMake comes with a find_package script for Google Test, which makes things a lot easier. I would go with bundling only when necessary and avoid it otherwise.
How to build: Avoid in-source builds. CMake makes out-of-source builds easy, and they make life a lot easier.
I suppose you also want to use CTest to run tests for your system (it also comes with built-in support for GTest). An important decision for directory layout and test organization will be: Do you end up with subprojects? If so, you need some more work when setting up CMakeLists and should split your subprojects into subdirectories, each with its own include and src files. Maybe even their own doxygen runs and outputs (combining multiple doxygen projects is possible, but not easy or pretty).
You will end up with something like this:
└── prj
    ├── CMakeLists.txt       <-- (1)
    ├── include
    │   └── prj
    │       ├── header2.hpp
    │       └── header.hpp
    ├── src
    │   ├── CMakeLists.txt   <-- (2)
    │   └── x.cpp
    └── test
        ├── CMakeLists.txt   <-- (3)
        ├── data
        │   └── testdata.yyy
        └── testcase.cpp
where
(1) configures dependencies, platform specifics and output paths
(2) configures the library you are going to build
(3) configures the test executables and test-cases
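A minimal sketch of what those three CMakeLists could contain (target names and commands are illustrative, one of several reasonable setups):
# (1) prj/CMakeLists.txt
cmake_minimum_required(VERSION 3.5)
project(prj)
enable_testing()
add_subdirectory(src)
add_subdirectory(test)

# (2) prj/src/CMakeLists.txt
add_library(prj x.cpp)
target_include_directories(prj PUBLIC ${CMAKE_SOURCE_DIR}/include)

# (3) prj/test/CMakeLists.txt
add_executable(prj-tests testcase.cpp)
target_link_libraries(prj-tests PRIVATE prj)
add_test(NAME prj-tests COMMAND prj-tests)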
In case you have sub-components, I would suggest adding another hierarchy and using the tree above for each sub-project. Then things get tricky, because you need to decide whether sub-components search and configure their dependencies or whether you do that at the top level. This should be decided on a case-by-case basis.
Doxygen: After you have managed to go through the configuration dance of doxygen, it is trivial to use CMake's add_custom_command to add a doc target.
This is how my projects end up, and I have seen some very similar projects, but of course this is no cure-all.
Addendum: At some point you will want to generate a config.hpp file that contains a version define and maybe a define for some version control identifier (a Git hash or SVN revision number). CMake has modules to automate finding that information and generating such files. You can use CMake's configure_file to replace variables in a template file with variables defined inside the CMakeLists.txt.
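For example, a version header could be generated roughly like this (file and macro names are made up for illustration):
# CMakeLists.txt
project(prj VERSION 1.2.3)
configure_file(config.hpp.in ${CMAKE_BINARY_DIR}/config.hpp)

// config.hpp.in
#define PRJ_VERSION "@PROJECT_VERSION@"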
If you are building libraries, you will also need an export define to get the difference between compilers right, e.g. __declspec on MSVC and visibility attributes on GCC/clang.
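A simplified, hand-rolled sketch of such an export define (real projects usually also handle the dllimport side, and CMake's GenerateExportHeader module can generate all of this for you; the names are illustrative):
// prj_export.hpp
#if defined(_MSC_VER)
    #define PRJ_EXPORT __declspec(dllexport)
#else
    #define PRJ_EXPORT __attribute__((visibility("default")))
#endif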
As a starter, there are some conventional names for directories that you cannot ignore; these are based on a long tradition with the Unix file system. They are:
trunk
├── bin : for all executables (applications)
├── lib : for all other binaries (static and shared libraries (.so or .dll))
├── include : for all header files
├── src : for source files
└── doc : for documentation
It is probably a good idea to stick to this basic layout, at least at the top-level.
About splitting the header files and source files (cpp): both schemes are fairly common. However, I tend to prefer keeping them together; it is just more practical on day-to-day tasks to have the files together. Also, when all the code is under one top-level folder, i.e., the trunk/src/ folder, you can notice that all the other folders at the top level (bin, lib, include, doc, and maybe some test folder), in addition to the "build" directory for an out-of-source build, contain nothing more than files generated in the build process. Thus, only the src folder needs to be backed up, or much better, kept under a version control system / server (like Git or SVN).
And when it comes to installing your header files on the destination system (if you want to eventually distribute your library), CMake has a command for installing files (it implicitly creates an "install" target, to do "make install") which you can use to put all the headers into the /usr/include/ directory. I just use the following cmake macro for this purpose:
# custom macro to register some headers as target for installation:
# setup_headers("/path/to/header/something.h" "/relative/install/path")
macro(setup_headers HEADER_FILES HEADER_PATH)
    foreach(CURRENT_HEADER_FILE ${HEADER_FILES})
        install(FILES "${SRCROOT}${CURRENT_HEADER_FILE}" DESTINATION "${INCLUDEROOT}${HEADER_PATH}")
    endforeach(CURRENT_HEADER_FILE)
endmacro(setup_headers)
Where SRCROOT is a cmake variable that I set to the src folder, and INCLUDEROOT is a cmake variable that I configure to wherever the headers need to go. Of course, there are many other ways to do this, and I'm sure my way is not the best. The point is, there is no reason to split the headers and sources just because only the headers need to be installed on the target system; it is very easy, especially with CMake (or CPack), to pick out and configure the headers to be installed without having to keep them in a separate directory. And this is what I have seen in most libraries.
Quote: Secondly I would like to use the Google C++ Testing Framework for unit testing my code as it seems fairly easy to use. Do you suggest bundling this with my code, for example in an "inc/gtest" or "contrib/gtest" folder? If bundled, do you suggest using the fuse_gtest_files.py script to reduce the number of files, or leaving it as is? If not bundled, how is this dependency handled?
Don't bundle dependencies with your library. This is generally a pretty horrible idea, and I always hate it when I'm stuck trying to build a library that did that. It should be your last resort, and beware of the pitfalls. Often, people bundle dependencies with their library either because they target a terrible development environment (e.g., Windows), or because they only support an old (deprecated) version of the library (dependency) in question. The main pitfall is that your bundled dependency might clash with already installed versions of the same library / application (e.g., you bundled gtest, but the person trying to build your library already has a newer (or older) version of gtest already installed, then the two might clash and give that person a very nasty headache). So, as I said, do it at your own risk, and I would say only as a last resort. Asking the people to install a few dependencies before being able to compile your library is a much lesser evil than trying to resolve clashes between your bundled dependencies and existing installations.
Quote: When it comes to writing tests, how are these generally organised? I was thinking of having one cpp file for each class (test_vector3.cpp for example) but all compiled into one binary so that they can all be run together easily?
One cpp file per class (or small cohesive group of classes and functions) is more usual and practical, in my opinion. However, definitely don't compile them all into one binary just so that "they can all be run together". That's a really bad idea. Generally, when it comes to coding, you want to split things up as much as it is reasonable to do so. In the case of unit-tests, you don't want one binary to run all the tests, because then any little change you make to anything in your library is likely to cause a near-total recompilation of that unit-test program, and that's just minutes / hours lost waiting for recompilation. Just stick to a simple scheme: 1 unit = 1 unit-test program. Then use either a script or a unit-test framework (such as gtest and/or CTest) to run all the test programs and report the failure/success rates.
Quote: Since the gtest library is generally built using cmake and make, I was thinking that it would make sense for my project to also be built like this. If I decided to use the following project layout:
I would rather suggest this layout:
trunk
├── bin
├── lib
│   └── project
│       ├── libvector3.so
│       └── libvector3.a          products of installation / building
├── docs
│   └── Doxyfile
├── include
│   └── project
│       └── vector3.hpp
│ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
│
├── src
│   ├── CMakeLists.txt
│   ├── Doxyfile.in
│   └── project                   part of version-control / source-distribution
│       ├── CMakeLists.txt
│       ├── vector3.hpp
│       ├── vector3.cpp
│       └── test
│           └── test_vector3.cpp
│ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
│
├── build
└── test                          working directories for building / testing
    └── test_vector3
A few things to notice here. First, the sub-directories of your src directory should mirror the sub-directories of your include directory; this is just to keep things intuitive (also, try to keep your sub-directory structure reasonably flat (shallow), because deep nesting of folders is often more of a hassle than anything else). Second, the "include" directory is just an installation directory; its contents are just whatever headers are picked out of the src directory.
Third, the CMake system is intended to be distributed over the source sub-directories, not as one CMakeLists.txt file at the top-level. This keeps things local, and that's good (in the spirit of splitting things up into independent pieces). If you add a new source, a new header, or a new test program, all you need is to edit one small and simple CMakeLists.txt file in the sub-directory in question, without affecting anything else. This also allows you to restructure the directories with ease (CMakeLists are local and contained in the sub-directories being moved). The top-level CMakeLists should contain most of the top-level configurations, such as setting up destination directories, custom commands (or macros), and finding packages installed on the system. The lower-level CMakeLists should contain only simple lists of headers, sources, and unit-test sources, and the cmake commands that register them to compilation targets.
Quote: How would the CMakeLists.txt have to look so that it can either build just the library or the library and the tests?
The basic answer is that CMake allows you to specifically exclude certain targets from "all" (which is what is built when you type "make"), and you can also create specific bundles of targets. I can't do a CMake tutorial here, but it is fairly straightforward to find out by yourself. In this specific case, however, the recommended solution is, of course, to use CTest, which is just an additional set of commands that you can use in the CMakeLists files to register a number of targets (programs) that are marked as unit-tests. So, CMake will put all the tests in a special category of builds, and that is exactly what you asked for, so, problem solved.
Quote: Also I have seen quite a few projects that have a build and a bin directory. Does the build happen in the build directory and then the binaries are moved out into the bin directory? Would the binaries for the tests and the library live in the same place? Or would it make more sense to structure it as follows:
Having a build directory outside the source ("out-of-source" build) is really the only sane thing to do; it is the de facto standard these days. So, definitely, have a separate "build" directory, outside the source directory, just as the CMake people recommend, and as every programmer I have ever met does. As for the bin directory, well, that is a convention, and it is probably a good idea to stick to it, as I said at the beginning of this post.
Quote: I would also like to use doxygen to document my code. Is it possible to get this to automatically run with cmake and make?
Yes. It is more than possible, it is awesome. Depending on how fancy you want to get, there are several possibilities. CMake does have a module for Doxygen (i.e., find_package(Doxygen)) which allows you to register targets that will run Doxygen on some files. If you want to do fancier things, like updating the version number in the Doxyfile, or automatically entering date / author stamps for source files and so on, it is all possible with a bit of CMake kung-fu. Generally, doing this involves keeping a source Doxyfile (e.g., the "Doxyfile.in" that I put in the folder layout above) which has tokens to be found and replaced by CMake's parsing commands. In my top-level CMakeLists file, you will find one such piece of CMake kung-fu that does a few fancy things with cmake and doxygen together.
Structuring the project
I would generally favour the following:
├── CMakeLists.txt
├── docs/
│   └── Doxyfile
├── include/
│   └── project/
│       └── vector3.hpp
└── src/
    └── project/
        ├── vector3.cpp
        └── test/
            └── test_vector3.cpp
This means that you have a very clearly defined set of API files for your library, and the structure means that clients of your library would do
#include "project/vector3.hpp"
rather than the less explicit
#include "vector3.hpp"
I like the structure of the /src tree to match that of the /include tree, but that's personal preference really. However, if your project expands to contain subdirectories within /include/project, it would generally help to match those inside the /src tree.
For the tests, I favour keeping them "close" to the files they test, and if you do end up with subdirectories within /src, it's a pretty easy paradigm for others to follow if they want to find a given file's test code.
Testing
Secondly I would like to use the Google C++ Testing Framework for unit testing my code as it seems fairly easy to use.
Gtest is indeed simple to use and is fairly comprehensive in terms of its capabilities. It can be used alongside gmock very easily to extend its capabilities, but my own experiences with gmock have been less favourable. I'm quite prepared to accept that this may well be down to my own shortcomings, but gmock tests tend to be more difficult to create and much more fragile / difficult to maintain. A big nail in the gmock coffin is that it really doesn't play nice with smart pointers.
This is a very trivial and subjective answer to a huge question (which probably doesn't really belong on S.O.)
Do you suggest bundling this with my code, for example in a "inc/gtest" or "contrib/gtest" folder? If bundled, do you suggest using the fuse_gtest_files.py script to reduce the number or files, or leaving it as is? If not bundled how is this dependency handled?
I prefer using CMake's ExternalProject_Add module. This avoids you having to keep gtest source code in your repository, or installing it anywhere. It is downloaded and built in your build tree automatically.
See my answer dealing with the specifics here.
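In outline, that approach looks something like the following (repository URL and tag are illustrative, and the wiring of the resulting include and link paths to your test target is omitted):
include(ExternalProject)
ExternalProject_Add(googletest
    GIT_REPOSITORY https://github.com/google/googletest.git
    GIT_TAG        release-1.8.0
    INSTALL_COMMAND ""  # keep gtest inside the build tree, skip installation
)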
When it comes to writing tests, how are these generally organised? I was thinking of having one cpp file for each class (test_vector3.cpp for example) but all compiled into one binary so that they can all be run together easily?
Good plan.
Building
I'm a fan of CMake, but as with your test-related questions, S.O. is probably not the best place to ask for opinions on such a subjective issue.
How would the CMakeLists.txt have to look so that it can either build just the library or the library and the tests?
add_library(ProjectLibrary <All library sources and headers>)
add_executable(ProjectTest <All test files>)
target_link_libraries(ProjectTest ProjectLibrary)
The library will appear as a target "ProjectLibrary", and the test suite as a target "ProjectTest". By specifying the library as a dependency of the test exe, building the test exe will automatically cause the library to be rebuilt if it is out of date.
Also I have seen quite a few projects that have a build and a bin directory. Does the build happen in the build directory and then the binaries are moved out into the bin directory? Would the binaries for the tests and the library live in the same place?
CMake recommends "out-of-source" builds, i.e. you create your own build directory outside the project and run CMake from there. This avoids "polluting" your source tree with build files, and is highly desirable if you're using a vcs.
You can specify that the binaries are moved or copied to a different directory once built, or that they are created by default in another directory, but there's generally no need. CMake provides comprehensive ways to install your project if desired, or make it easy for other CMake projects to "find" the relevant files of your project.
With regards to CMake's own support for finding and executing gtest tests, this would largely be inappropriate if you build gtest as part of your project. The FindGtest module is really designed to be used in the case where gtest has been built separately outside of your project.
CMake provides its own test framework (CTest), and ideally, every gtest case would be added as a CTest case.
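Assuming the target names from the snippet above, registering the whole gtest binary as a single CTest case is a one-liner:
enable_testing()
add_test(NAME ProjectTest COMMAND ProjectTest)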
However, the GTEST_ADD_TESTS macro provided by FindGtest to allow easy addition of gtest cases as individual ctest cases is somewhat lacking in that it doesn't work for gtest's macros other than TEST and TEST_F. Value- or Type-parameterised tests using TEST_P, TYPED_TEST_P, etc. aren't handled at all.
The problem doesn't have an easy solution that I know of. The most robust way to get a list of gtest cases is to execute the test exe with the flag --gtest_list_tests. However, this can only be done once the exe is built, so CMake can't make use of it. Which leaves you with two choices: CMake must try to parse C++ code to deduce the names of the tests (non-trivial in the extreme if you want to take into account all gtest macros, commented-out tests, and disabled tests), or test cases are added by hand to the CMakeLists.txt file.
I would also like to use doxygen to document my code. Is it possible to get this to automatically run with cmake and make?
Yes - although I have no experience on this front. CMake provides FindDoxygen for this purpose.
In addition to the other (excellent) answers, I am going to describe a structure I've been using for relatively large-scale projects.
I am not going to address the subquestion about Doxygen, since I would just repeat what is said in the other answers.
Rationale
For modularity and maintainability, the project is organized as a set of small units.
For clarity, let's name them UnitX, with X = A, B, C, ... (but they can have any general name).
The directory structure is then organized to reflect this choice, with the possibility to group units if necessary.
Solution
The basic directory layout is the following (content of units is detailed later on):
project
├── CMakeLists.txt
├── UnitA
├── UnitB
└── GroupA
    ├── CMakeLists.txt
    ├── GroupB
    │   ├── CMakeLists.txt
    │   ├── UnitC
    │   └── UnitD
    └── UnitE
project/CMakeLists.txt could contain the following:
cmake_minimum_required(VERSION 3.0.2)
project(project)
enable_testing() # This will be necessary for testing (details below)
add_subdirectory(UnitA)
add_subdirectory(UnitB)
add_subdirectory(GroupA)
and project/GroupA/CMakeLists.txt:
add_subdirectory(GroupB)
add_subdirectory(UnitE)
and project/GroupB/CMakeLists.txt:
add_subdirectory(UnitC)
add_subdirectory(UnitD)
Now to the structure of the different units (let's take, as an example, UnitD)
project/GroupA/GroupB/UnitD
├── README.md
├── CMakeLists.txt
├── lib
│   ├── CMakeLists.txt
│   └── UnitD
│       ├── ClassA.h
│       ├── ClassA.cpp
│       ├── ClassB.h
│       └── ClassB.cpp
└── test
    ├── CMakeLists.txt
    ├── ClassATest.cpp
    ├── ClassBTest.cpp
    └── [main.cpp]
To the different components:
I like having source (.cpp) and headers (.h) in the same folder. This avoids a duplicate directory hierarchy and makes maintenance easier. For installation, it is no problem (especially with CMake) to just filter the header files.
The role of the directory UnitD is to later on allow including files with #include <UnitD/ClassA.h>. Also, when installing this unit, you can just copy the directory structure as is. Note that you can also organize your source files in subdirectories.
I like a README file to summarize what the unit is about and specify useful information about it.
CMakeLists.txt could simply contain:
add_subdirectory(lib)
add_subdirectory(test)
lib/CMakeLists.txt:
project(UnitD)

set(headers
    UnitD/ClassA.h
    UnitD/ClassB.h
)

set(sources
    UnitD/ClassA.cpp
    UnitD/ClassB.cpp
)

add_library(UnitD STATIC ${headers} ${sources})

# INSTALL_INTERFACE: folder to which you will install a directory UnitD containing the headers
target_include_directories(UnitD
    PUBLIC $<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}>
    PUBLIC $<INSTALL_INTERFACE:include/SomeDir>
)

target_link_libraries(UnitD
    PUBLIC UnitA
    PRIVATE UnitC
)
Here, note that it is not necessary to tell CMake that we want the include directories for UnitA and UnitC, as this was already specified when configuring those units. Also, PUBLIC will tell all targets that depend on UnitD that they should automatically include the UnitA dependency, while UnitC won't be required then (PRIVATE).
test/CMakeLists.txt (see further below if you want to use GTest for it):
project(UnitDTests)

add_executable(UnitDTests
    ClassATest.cpp
    ClassBTest.cpp
    [main.cpp]
)

target_link_libraries(UnitDTests
    PUBLIC UnitD
)

add_test(
    NAME UnitDTests
    COMMAND UnitDTests
)
Using GoogleTest
For Google Test, the easiest is if its source is present somewhere in your source directory, but you don't have to actually add it there yourself.
I've been using this project to download it automatically, and I wrap its usage in a function to make sure that it is downloaded only once, even though we have several test targets.
This CMake function is the following:
function(import_gtest)
    if (NOT TARGET gmock_main)
        include(DownloadProject)
        download_project(PROJ googletest
            GIT_REPOSITORY https://github.com/google/googletest.git
            GIT_TAG release-1.8.0
            UPDATE_DISCONNECTED 1
        )
        # prevent GoogleTest from overriding our compiler/linker options
        # when building with Visual Studio
        set(gtest_force_shared_crt ON CACHE BOOL "" FORCE)
        add_subdirectory(${googletest_SOURCE_DIR} ${googletest_BINARY_DIR} EXCLUDE_FROM_ALL)
    endif()
endfunction()
and then, when I want to use it inside one of my test targets, I will add the following lines to the CMakeLists.txt (this is for the example above, test/CMakeLists.txt):
import_gtest()
target_link_libraries(UnitDTests gtest_main gmock_main)