How to handle semi-relative include paths with cppcheck - c++

Our C++ project is organized in several modules (== subfolders) and headers are placed next to the .cpp files:
CMakeLists.txt
src
│
└───folder1
│ │
│ └───subfolder1
│ │ MyClass1.h
│ │ MyClass1.cpp
│ │ ...
│
└───folder2
│ │
│ └───subfolder2
│ │ MyClass2.h
│ │ MyClass2.cpp
│ │ ...
Include directives are always defined relative to the folder src and not relative to the code file e.g. in MyClass1.cpp:
#include "folder1/subfolder1/MyClass1.h" // even the own header is defined semi-relatively
#include "folder2/subfolder2/MyClass2.h"
MyClass1::MyClass1() {
// some code
}
I recently noticed that cppcheck (version 1.89) has problems with this and does not correctly
- resolve macros defined in a header file -> False complaints about correct code
- find problems with class member initialization (e.g. MyClass::MyClass() : _foo(_foo) {}) -> No complaints about incorrect code
When providing -I src to the cppcheck CLI, macros are correctly identified and actual issues like the above are found, but analysis time skyrockets from 2 to 20 minutes.
I suspect that by providing the whole source tree again via -I, all files are re-parsed as header files. Unfortunately, I don't have a dedicated include/ subfolder which I could use here. What is advised here? I am already using multiple jobs: -j 4.
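For reference, the invocation discussed here looks roughly like this (a sketch; anything beyond -I src and -j 4 is an assumption about the rest of the command line):
cppcheck -j 4 -I src src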

After some further investigation, this might actually not be a valid question:
I manually copied all header files to a separate folder: find . -type f \( -iname \*.h -o -iname \*.inl -o -iname \*.hpp \) -exec cp --parents \{\} ./../__cppcheckWorkaroundInclude \; and then provided that folder via -I to cppcheck.
The check still took 20 minutes, so my assumption that cppcheck wrongfully scans too much is invalid - it actually takes this long because it is a lot of code.
So providing -I __cppcheckWorkaroundInclude or -I src to cppcheck did not make a difference - at least I did not observe one. Meaning that -I works as expected.
To solve the performance issue (I run the job inside a CI pipeline, and 20 minutes is a lot of time which I would rather spend on executing automated tests), I did the following:
Reduced the number of checked configurations with --max-configs=5
Used incremental checking via --cppcheck-build-dir=CppcheckBuildDir
To integrate this in our Gitlab CI, I had to cache this directory:
cache:
  paths:
    - CppcheckBuildDir
and before the cppcheck call, the directory has to be created (but with the -p flag because the dir might already be there, taken from the cache!):
mkdir -p CppcheckBuildDir
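Putting it together, the GitLab CI job might look roughly like this (a sketch; the job name and the exact flag set beyond the options mentioned above are assumptions):
cppcheck:
  cache:
    paths:
      - CppcheckBuildDir
  script:
    - mkdir -p CppcheckBuildDir
    - cppcheck -j 4 -I src --max-configs=5 --cppcheck-build-dir=CppcheckBuildDir src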

Related

Qt How to add dynamic resource files to project without using .qrc?

My Qt project looks like this. While the program is running, "config.xml" will be edited, then the "run.bat" file will be called and will produce a lot of data under this folder.
By the way, the size of the output data may be over 100 GB. I don't need to use those files in the program itself.
│ mainWindow.cpp
│ mainWindow.ui
│ mainWindow.h
│
└─Resource
├─bin
│ │ core.exe
│ │ gencase.exe
│
└─work
├─task1
│ │ config.xml
│ │ run.bat
│ └─ output
│ │ datafiles
│
├─task2
└─...
I want to easily use the relative paths of these files, so that "run.bat" can call "core.exe" and show the output data.
But the files under the work folder are big and need to stay editable, so I don't think adding all of them to a .qrc is a good idea.
I don't know how to handle this situation.
Qt Resource files (.qrc) are for static resources (i.e. available at compile time).
Files created during runtime are often stored inside one of the following locations:
QStandardPaths::TempLocation
QStandardPaths::AppLocalDataLocation
QStandardPaths::CacheLocation
QStandardPaths::AppDataLocation
QStandardPaths::GenericConfigLocation
See QStandardPaths for more information.
Storing those files relative to your program executable is often undesirable, as this location is often not writable for the end user (but may be handy in some cases).
If the user should be able to run or edit those files with an external editor, storing these files in a subfolder of the QStandardPaths::HomeLocation may be a good idea too. In that case, it may be desirable to allow the user to optionally change this location.
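A minimal sketch of resolving such a writable location at runtime (the work/task1 subfolder name is taken from the layout above; the application name and chosen location are assumptions):
#include <QCoreApplication>
#include <QDir>
#include <QStandardPaths>

int main(int argc, char *argv[])
{
    QCoreApplication app(argc, argv);
    QCoreApplication::setApplicationName("MyApp");   // influences the AppDataLocation path

    // Resolve a writable, per-user data directory and create the task folder inside it.
    const QString base = QStandardPaths::writableLocation(QStandardPaths::AppDataLocation);
    QDir dir(base);
    dir.mkpath("work/task1");                                          // creates the folders if missing
    const QString configPath = dir.filePath("work/task1/config.xml");  // absolute path that run.bat can use
    return 0;
}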

Can't manage to link compiler to files from other folders

I'm working in VSCode with C++. I want to use and compile files from different folders. I made an example of what I want to do: https://github.com/ChrisvdHoorn/Cplusplus_project (I renamed .vscode to vscode so that I could upload it to GitHub; it is actually called .vscode!)
Cplusplus_project
└───.vscode
│ │ c_cpp_properties.json
│ │ launch.json
│ │ settings.json
| | tasks.json
│
└───include
| │ TestLib.h
|
└───src
| main.cpp
| TestLib.cpp
main.cpp uses TestLib.h and TestLib.cpp.
This compiles fine when main.cpp, TestLib.h and TestLib.cpp are placed directly in the workspace, but not with the file structure above.
I came across similar questions on the internet, and the general answer was that you need to add something like "-I${workspaceFolder}/include" to the args section of the compiler task. I tried this, but no luck.
Any help is very welcome :)
Sub question: I use the playbutton to compile and run the code. Is this actually running the task from tasks.json? I couldn't really find a definitive answer on it online.
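For reference, the args section discussed above lives in .vscode/tasks.json; a sketch of such a task entry for the layout above (the label, compiler and output path are assumptions, only the -I entry comes from the suggested answers):
{
    "type": "shell",
    "label": "build project",
    "command": "g++",
    "args": [
        "-g",
        "${workspaceFolder}/src/main.cpp",
        "${workspaceFolder}/src/TestLib.cpp",
        "-I${workspaceFolder}/include",
        "-o",
        "${workspaceFolder}/main"
    ]
}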

How to run tests in a multimodule Go project

I'm learning Go with the specialization Programming with Google Go. It has several courses, each of which has several modules.
My project looks like the following (made with https://ascii-tree-generator.com/):
google-golang/
├─ .github/
│ ├─ workflows/
│ │ ├─ ci.yml
├─ golang-getting-started/
│ ├─ module1/
│ │ ├─ main.go
│ │ ├─ main_test.go
│ ├─ module2/
│ │ ├─ trunc/
│ │ │ ├─ main.go
│ │ │ ├─ main_test.go
├─ .gitignore
├─ README.md
I'd like to run all the tests in the *_test.go files for every commit, and it's not clear to me how to do that from the root directory (google-golang). I don't see a need for one module to import another, as the exercises can be done independently. This question has two answers: one suggests using submodules, the other recommends using a Go workspace, but neither provides specific instructions that someone new like me can learn from.
I'm not asking for help on GitHub Actions, I know how to write those. I'm looking for one or more commands that'll find and run the tests.
You seem confused about what a module is. The rule of thumb is, one go.mod file equals one module. Go Wiki, gomod:
A module is defined by a tree of Go source files with a go.mod file in the tree's root directory.
Based on the directory tree shown in your question, there is no go.mod file in sight, hence nothing there is a Go module. As a matter of fact, if you attempt running a module-aware command from google-golang or golang-getting-started you'll get:
go: go.mod file not found in current directory or any parent directory; see 'go help modules'
If you want to run all tests from the root of a multi-module repo, as the title of your question says, with Go 1.17 you can:
init the sub-modules:
$ cd google-golang/golang-getting-started/module1
$ go mod init example.com/module1
$ cd ../module2
$ go mod init example.com/module2
use the trick suggested by Cerise Limón from the root dir of the multi-module project
$ cd google-golang
$ find . -name go.mod -execdir go test ./... \;
If you don't actually care about keeping the sub-repos as separate modules, you can init a module in the root repo and run all tests from there:
$ cd google-golang # or google-golang/golang-getting-started/
$ go mod init example.com/mainmod
$ go test ./...
...however, even though this hack works right now, it doesn't make much sense, because your directories named moduleN each contain main.go files, whose packages are presumably named main. So they really are organized as separate sub-modules.

Linking SDL-bgi in Code::Blocks on windows using a custom install dir

I have a program that works on Linux using SDL-bgi. I have downloaded the SDL-bgi binaries from http://libxbgi.sourceforge.net/ and can get it to compile, but I can't get it to link (giving me "undefined reference to ..." errors). The download provides a DLL, but according to the answer to this question, I need a .lib. I can't find any .lib anywhere in the download.
Here is the output of tree /f in the folder extracted:
│ AUTHORS
│ BUGS
│ build.sh
│ ChangeLog
│ CMakeLists.txt
│ INSTALL.md
│ LICENSE
│ README.md
│ sdl_bgi.spec
│ TODO
│ VERSION
│
├───bin
│ ├───CodeBlocks
│ │ SDL_bgi.dll
│ │
│ ├───Dev-Cpp
│ │ SDL_bgi.dll
│ │
│ └───Mingw64
│ SDL_bgi.dll
│
├───doc
│ functions.md
│ functions.pdf
│ howto_CodeBlocks.md
│ howto_CodeBlocks.pdf
│ howto_Dev-Cpp.md
│ howto_Dev-Cpp.pdf
│ sdl_bgi-quickref.pdf
│ sdl_bgi-quickref.tex
│ SDL_bgi_logo.png
│ SDL_bgi_logo.svg
│ turtlegraphics.pdf
│ turtlegraphics.tex
│ using.md
│ using.pdf
│
├───src
│ graphics.h
│ Makefile
│ Makefile.CodeBlocks
│ Makefile.DevCpp
│ SDL_bgi.c
│ SDL_bgi.h
│
└───test
[... a bunch of c files]
I am trying to link to the library from where it is, instead of C:\CodeBlocks\MinGW\bin as recommended by the docs, because I don't have the right privileges.
First method: use this if you have not installed the old Borland Graphics Interface (BGI) in Code::Blocks; otherwise you will get errors such as multiple declarations, because both libraries use the same function declarations.
Step 1: Create a new C console project and name it whatever you want. Copy the files from the include folder of the zip below into
C:\Program Files (x86)\CodeBlocks\MinGW\include
then copy libsdlbgi.a from the zip file to
C:\Program Files (x86)\CodeBlocks\MinGW\lib
SDLBGI.zip
Step 2: In Code::Blocks go to Settings, select Compiler, open the Linker settings, click Add and browse to the previously copied libsdlbgi.a to add it; then go to Search directories > Linker and do the same for libsdlbgi.a again.
Step 3: Write any simple program, then build and run it.
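As a quick check of the setup, a minimal test program might look like the following (a sketch assuming SDL_bgi's BGI-compatible API; the specific drawing calls are arbitrary):
#include <graphics.h>   /* provided by SDL_bgi as a BGI-compatible header */

int main(void)
{
    int gd = DETECT, gm = 0;
    initgraph(&gd, &gm, "");   /* open the SDL_bgi graphics window */
    circle(100, 100, 50);      /* draw something to verify linking works */
    getch();                   /* wait for a key press */
    closegraph();              /* close the window and clean up */
    return 0;
}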
Second method: use this if you want both BGI and SDL_bgi. Follow all the steps carefully to get it working.
Step 1: Create a new project in Code::Blocks using SDL2 from the project category, give the project a title, then click Next and Finish to create the new project.
Step 2: Go to the directory where you created your project and copy the files SDL_bgi.c, SDL_bgi.h, sdlbgidemo.c and logo.bmp from the SDL_bgi-2.2.4 src and test folders into the project folder.
Step 3: Delete the default main.cpp from the Code::Blocks project, then add sdlbgidemo.c to the project and click OK in the next window, which asks whether to add the file to the Debug and Release targets.
Step 4: Open sdlbgidemo.c in the Code::Blocks editor and change the #include <graphics.h> header to #include "SDL_bgi.c".
Step 5: Press the Build and Run button; it will build and run the project, and you should see the previously copied logo.bmp on the screen, followed by a demo of all the SDL_bgi library functions.
Here is a Code::Blocks project file [zip] with all the necessary modifications, so you can easily run it using Code::Blocks.
[Code-Blocks Project File.zip]
If you don't want to repeat Step 2 and Step 3 every time you create a new project, follow Step 6 below.
Step 6: Go to the directory where Code::Blocks is installed and browse to the MinGW include folder. You can simply copy SDL_bgi.c and SDL_bgi.h there. If you have already installed the old Borland Graphics library (BGI), create a separate folder for SDL_bgi instead and paste the files mentioned earlier into that new folder. To use these files in a project, simply write #include "SDL_bgi.c" or #include "[previously created folder]/SDL_bgi.c" in your program.
Here is a link to a zip file with screenshots of all the steps taken so far:
[ScreenShot.zip]
I hope this solves your problem and saves you time.
That's all, folks.

What's a good directory structure for larger C++ projects using Makefile?

What's a good directory structure for larger C++ projects using Makefile ?
This is how my directory structure looks at the moment:
lib/ (class implementations *.cpp)
include/ (class definitions *.h)
tests/ (main.cpp for quick tests)
Now, I'm not sure what my Makefile should look like... it doesn't seem to work when the .cpp files and .h files aren't in the same directory. Could anyone point me to a common directory structure with an accompanying Makefile so that I don't reinvent the wheel?
Separating the .cpp from the .h files is not always a good solution. Generally I separate them when the code is used as a library (public headers in include and private headers with the source code).
If it is a library, this structure is ok.
lib/ (class implementations *.cpp .h)
include/ (class definitions *.h) <- Only those to be installed in your system
tests/ (main.cpp for quick tests)
doc/ (doxygen or any kind of documentation)
If it is an application
src/ (source for the application)
lib/ (source for the application library *.cpp *.hpp)
include/ (interface for the library *.h)
tests/ (main.cpp for quick tests) <- use cppunit for this part
doc/ (doxygen or any kind of documentation)
Use the flag -I$(PROJECT_BASE)/include to specify the include path for the compilation
If it is a big project, it can be good to use tool like autoconf/automake or cmake to build everything. It will ease the development.
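A minimal Makefile sketch for the application layout above (the program name, compiler and flags are assumptions; recipe lines must be indented with a tab):
PROJECT_BASE = .
CXX      = g++
CXXFLAGS = -Wall -I$(PROJECT_BASE)/include

SRC = $(wildcard src/*.cpp) $(wildcard lib/*.cpp)
OBJ = $(SRC:.cpp=.o)

app: $(OBJ)
	$(CXX) -o $@ $^

%.o: %.cpp
	$(CXX) $(CXXFLAGS) -c $< -o $@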
For those who find this question after 2020, an alternative modern and reasoned vision of "Canonical Project Structure" for C++ has been presented by Boris Kolpackov: http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2018/p1204r0.html
Briefly: no include/ and src/ split. All headers, sources, modules and unit tests go into one directory. Implementation details may be separated from the public API by moving them to a <name>/<name>/details/ subdirectory.
<name>/
├── <name>/
│ ├── headers...
│ ├── sources...
│ ├── modules...
│ └── unit tests...
└── tests/
├── functional_test1/
├── functional_test2/
├── integration_test1/
├── integration_test2/
└── ...
For example:
bestlib/
├── bestlib/
│ ├── foo.h
│ ├── foo.cpp
│ ├── foo.test.cpp
│ ├── bar.h
│ ├── bar.cpp
│ └── bar.test.cpp
└── tests/
├── functional_test1/
└── integration_test1/
If you have many source files, it may also be a good idea to further subdivide your source directory. For instance, one subdirectory for the core functionality of your application, one for the GUI, etc.
src/core
src/database
src/effects
src/gui
...
Doing so also forces you to avoid unneeded relationships between your "modules", which is a prerequisite to nice and reusable code.
There is no single specific or required directory structure.
You can set it up any way you like. Your problem is simple to solve: just instruct the Makefile to look into subdirectories, or to put compiled objects into subdirectories instead of using only the current directory.
In the Makefile you would replace the pattern rule
%.o : %.cpp
with
bin/%.o : %.cpp
so make checks for the object file in the bin directory, and so on; you can apply the same idea to the locations where files are compiled.
There are ways to add/remove/modify paths of source and object files.
Have a look at gnu make manual, specifically section 8.3 Functions for File Names,and the one before that 8.2 Functions for String Substitution and Analysis.
You can do stuff like:
get a list of objects from the list of source files in the current directory:
OBJ = $(patsubst %.cpp, %.o, $(wildcard *.cpp))
Output:
Application.o Market.o ordermatch.o
If the object files go into the bin subdirectory but the source code is in the current directory, you can apply the bin/ prefix to the generated object files:
OBJ = $(addprefix bin/,$(patsubst %.cpp, %.o, $(wildcard *.cpp)))
Output:
bin/Application.o bin/Market.o bin/ordermatch.o
And so on.
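Putting these pieces together, a sketch of such a Makefile might look like this (the program name, compiler and flags are assumptions; recipe lines must be indented with a tab):
CXX      = g++
CXXFLAGS = -Wall

OBJ = $(addprefix bin/,$(patsubst %.cpp,%.o,$(wildcard *.cpp)))

app: $(OBJ)
	$(CXX) -o $@ $^

bin/%.o : %.cpp
	@mkdir -p bin
	$(CXX) $(CXXFLAGS) -c $< -o $@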
This is an old question, but you can consider the Pitchfork project as a general guide:
https://github.com/vector-of-bool/pitchfork
Some documentation is available in that repository.
There is no "good directory structure". Pick a structure you're comfortable with and stick to it. Some like placing source files (headers and implementation files) in a src/ directory, so the root directory of the project has nothing but a makefile, a readme and little else. Some like placing helper libraries under a lib/ directory, unittests under test/ or src/test/, documentation under doc/ etc.
I have yet to hear of anyone splitting header files and implementation files into two distinct directories though. Personally I don't like splitting files into directories much. I usually place all my source in a single directory and all the documentation in another directory. If I rely on good search tools anyway, there's no need for a complex directory structure.
make can deal with the sort of structure where the makefile resides in a different directory than the source. The only thing is that it will invoke the rules from the directory of the makefile -- compilers usually have no problem compiling source that is in some subdirectory. You don't have to specify relative paths in your #includes; just specify the include path with compiler flags (gcc's -I flag etc).
If you haven't seen it before read Recursive Make Considered Harmful.
Short, short version: Though very common the recursive make idiom is less than optimal and gets ever worse as projects grow larger and more complicated. An alternative is presented.
Related link: What is your experience with non-recursive make?