Can cppcheck do a dry run to list all files without actually checking them? - cppcheck

I am running cppcheck on a big project and I'm trying to exclude a couple of files and folders that are 3rd-party and/or generated. Is it possible to have cppcheck do a dry run that shows the files it would normally try to check?

No, such functionality is not (yet) implemented.
The best solution that comes to my mind is to use options like this:
cppcheck --check-config ./ 2> /dev/null
Explanation:
The --check-config option makes Cppcheck only check whether any includes are missing. No further analysis is done and no bugs are reported.
Replace ./ with the path to the source files you want the output for.
2> /dev/null suppresses Cppcheck's messages about missing includes and other errors.
Running this on the Cppcheck sources I get an output like this:
$ ./cppcheck --check-config ./ 2> /dev/null
Checking addons/test/cert-test.c ...
1/291 files checked 0% done
Checking addons/test/cert-test.cpp ...
2/291 files checked 0% done
Checking addons/test/misc-test.cpp ...
3/291 files checked 0% done
Checking addons/test/misra/misra-suppressions1-test.c ...
4/291 files checked 0% done
Checking addons/test/misra/misra-suppressions2-test.c ...
5/291 files checked 0% done
Checking addons/test/misra/misra-test.c ...
6/291 files checked 0% done
Checking addons/test/misra/misra-test.cpp ...
7/291 files checked 0% done
Checking addons/test/naming_test.c ...
8/291 files checked 0% done
Checking addons/test/naming_test.cpp ...
9/291 files checked 0% done
Checking addons/test/namingng_test.c ...
10/291 files checked 0% done
Checking addons/test/path1/misra-suppressions1-test.c ...
11/291 files checked 0% done
Checking addons/test/path1/misra-suppressions2-test.c ...
...
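If all you want is a clean list of file paths, the progress lines above can be filtered with standard tools. A minimal sketch (the printf below simulates cppcheck's "Checking … ..." lines; in practice you would pipe the real cppcheck --check-config ./ 2> /dev/null output into the sed instead):

```shell
# Extract just the file paths from cppcheck's progress output.
# The printf stands in for: cppcheck --check-config ./ 2> /dev/null
printf 'Checking addons/test/cert-test.c ...\n1/291 files checked 0%% done\nChecking addons/test/cert-test.cpp ...\n' |
  sed -n 's/^Checking \(.*\) \.\.\.$/\1/p'
# -> addons/test/cert-test.c
# -> addons/test/cert-test.cpp
```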

Related

Testing generated Go code without co-locating tests

I have some auto-generated Go code for protobuf messages and I'm looking to add some additional testing without locating the test files under the same directory path. This is to allow easy removal of the existing generated code, to be sure that if a file stops being generated it isn't accidentally left in the codebase.
The current layout of these files is controlled by prototool, so I have something like the following:
/pkg/<other1>
/pkg/<other2>
/pkg/<name-generated>/v1/component_api.pb.go
/pkg/<name-generated>/v1/component_api.pb.gw.go
/pkg/<name-generated>/v1/component_api.pb.validate.go
The *.validate.go files come from envoyproxy/protoc-gen-validate, and *.pb.go & *.pb.gw.go come from the protobuf and grpc libraries. other1 and other2 are two helper libraries that we include along with the generated code to make things easier for client-side apps. The server side is in a separate repo and imports as needed.
Because it's useful to be able to delete /pkg/<name> before re-running prototool, I've placed some tests of component_api (mostly to exercise the automatically generated validate rules) under the path:
/internal/pkg/<name>/v1/component_api_test.go
While this works for go test -v ./..., it appears not to work too well when generating coverage with -coverpkg.
go test -coverpkg=./... -coverprofile=coverage/go/coverage.out -v ./...
go build <pkgname>/internal/pkg/<name>/v1: no non-test Go files in ....
<output from the tests in /internal/pkg/<name>/v1/component_api_test.go>
....
....
coverage: 10.5% of statements in ./...
ok <pkgname>/internal/pkg/<name>/v1 0.014s coverage: 10.5% of statements in ./...
FAIL <pkgname>/pkg/other1 [build failed]
FAIL <pkgname>/pkg/other2 [build failed]
? <pkgname>/pkg/<name>/v1 [no test files]
FAIL
Coverage tests failed
Generated coverage/go/html/main.html
The reason for using -coverpkg is that without it nothing seems to notice that any of the code under <pkgname>/pkg/<name>/v1 is covered, and we've previously seen issues with the reported numbers not reflecting the real level of coverage, which are solved by using -coverpkg:
go test -cover -coverprofile=coverage/go/coverage.out ./...
ok <pkgname>/internal/pkg/<name>/v1 0.007s coverage: [no statements]
ok <pkgname>/pkg/other1 0.005s coverage: 100.0% of statements
ok <pkgname>/pkg/other2 0.177s coverage: 100.0% of statements
? <pkgname>/pkg/<name>/v1 [no test files]
Looking at the resulting coverage/go/coverage.out, it includes no mention of anything under <pkgname>/pkg/<name>/v1 being exercised.
I'm not attached to the current layout, beyond <pkgname>/pkg/<name>/v1 being automatically managed by prototool and its rules around naming the generated files. I would like to ensure the other modules we have can remain exported for use as helper libraries, and I would like to be able to add tests for <pkgname>/pkg/<name>/v1 without locating them in the same directory, to allow for easy delete + recreate of the generated files, while still getting sensible coverage reports.
I've tried fiddling with the packages passed to -coverpkg and replacing ./... on the command line, and haven't been able to come up with something that works. Perhaps I'm just not familiar with the right invocation?
Other than that is there a different layout that will take care of this for me?
To handle this scenario, simply create a doc.go file in the same directory as the dislocated tests, containing just the package clause and a comment. This allows the standard arguments to work, and Go appears to be reasonably happy with an otherwise empty file.
Once it is in place, the following works as expected.
go test -coverpkg=./... -coverprofile=coverage/go/coverage.out -v ./...
Idea based on suggestion in https://stackoverflow.com/a/47025370/1597808
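As a concrete sketch of that placeholder file: the path and the package name examplepb below are made-up stand-ins for the <name> placeholder used in the question, so adjust both to your layout:

```shell
# Create a minimal doc.go so the test-only directory counts as a
# real Go package (and therefore builds under -coverpkg=./...).
mkdir -p internal/pkg/example/v1
cat > internal/pkg/example/v1/doc.go <<'EOF'
// Package examplepb contains tests for the generated example API.
package examplepb
EOF
```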

gcov -r generates no gcov file

I want to generate a coverage report for my GitHub repo, a big C++ project built with CMake.
I added the --coverage option (equivalent to -fprofile-arcs -ftest-coverage) to the g++ compile and link flags in my CMakeLists.txt, and I use gcov to generate the code coverage data and upload it to codecov.io.
These automated builds and tests run in my Jenkins CI. The relevant part of the Jenkins execute-shell step is:
cmake -DCODE_COVERAGE=ON -Dasan=OFF -DNEBULA_ASAN_PRELOAD= ..
make -j8
ctest -j8
#generate code coverage report and upload it to codecov.io
TEST_FILES=$(find . -name '*.cpp.o')
#gcov -abcr $TEST_FILES
for test_file in $TEST_FILES; do gcov -abcr $test_file; done
curl -s https://codecov.io/bash | bash -s - -t c85ebdca-ec7c-4301-a6ed-7967cf175db5
When Jenkins executes make -j8, ctest -j8, and gcov -abcr $test_file, I do get the related .gcno, .gcda, and .gcov files. But when Jenkins executes curl -s https://codecov.io/bash | bash -s - -t c85ebdca-ec7c-4301-a6ed-7967cf175db5 to upload the data files to codecov, the output seems strange. I see a lot of similar errors like:
ExecutionPlan.gcno:'_ZNK6nebula4cpp29ValueType8get_typeEv' has arcs to entry block
ExecutionPlan.gcno:'_ZNK6nebula4cpp29ValueType8get_typeEv' has arcs from exit block
Also, the -r option of gcov doesn't seem to work, because I also get a lot of coverage reports for system header files.
In the end Jenkins tells me the build failed, so I can't see the coverage report on codecov.
PS:
The source .cpp files are not in the same directory as the .cpp.o files (CMake's default C++ object suffix is .cpp.o, not .o). I run the shell code above in the build directory, and the .gcno, .gcda, and .gcov files are generated in that same directory.
Because CMake's default object suffix is .cpp.o, the generated files are named .cpp.gcno and .cpp.gcda. So I use gcov filename.cpp.o instead of gcov filename.cpp, which would give me "couldn't open filename.gcno" errors.
I did some experiments to find the cause of the errors. When I give gcov a single input file, like gcov 1.cpp.o, I get no "has arcs from exit block" errors, but when I give gcov more than one input file, like gcov 1.cpp.o 2.cpp.o, I do get them. A probable reason is something to do with combining multiple gcov files.

Annoying error message: cannot merge previous GCDA file

Problem:
I'm using the following flags to generate the code coverage of my Qt application (.pro file):
QMAKE_CXXFLAGS += --coverage
QMAKE_LFLAGS += --coverage
The code coverage is correctly generated, the problem is that if I want to run only one test function/class (and the GCDA files were already created) I get the following error message:
profiling: /Users/user/.../build-myapp/myclass.gcda: cannot merge previous GCDA file: corrupt arc tag (0x00000000)
Note that the error message is shown for each GCDA file. Also, note that it doesn't seem to affect the test cases.
Workaround:
As explained here, it "is a result of the build tools failing to merge current results into the existing .gcda coverage files". As answered in that question, one option is to delete the GCDA files before running the tests, for example by adding the following command in the build phase:
find . -name "*.gcda" -print0 | xargs -0 rm
Question:
My problem is that I don't want to delete the old GCDA files every time I run the test cases. As I'm running only one test function/class, I want to keep the old GCDA files as they are and only merge the GCDA file related to the current class. As far as I can tell by manually checking, this already happens: only the coverage of my current class is updated, and the old coverage remains the same.
So, is there a command to just ignore (don't show) the error messages related to the GCDA merging problems? Or even better, a command to only update the GCDA files related to the current test class?
Note: I'm using Qt 5.3.2 on macOS Sierra with Clang.
Related questions:
Code coverage warnings spam output
How to merge multiple versions of gcda files?
.gcda files don't merge on multiple runs
When you compile with profiling, the results of each run are stored in a file that ends with .gcda.
The error appears when an existing .gcda file is in a format that doesn't match the program that just finished running.
This happened to me when I had run a Linux build of my executable, then recompiled for macOS and ran it. The macOS program saw the existing .gcda files and generated pages of errors.
Eliminate the errors by removing the existing .gcda files.
I came across the same problem. My solution is:
There is a general Test.pro project which uses SUBDIRS to include every test project as a subproject.
In each test subproject I have the following line:
QMAKE_POST_LINK = rm -f *.gcda
This deletes the *.gcda files only for the subproject which was just relinked. Unmodified projects keep their *.gcda files.

GCovr does not generate a valid report for c++ project build by scons

I am trying to generate a coverage report with gcovr, but I get an invalid report, or a report showing 0% coverage.
My project structure looks like:
Myproject:
src --- where all my source files are stored
testCases --- where all my test cases are stored
RunTestCases --- from where I run my test cases
First I build src with scons from the src directory.
Then I build the test cases from the testCases directory with scons.
Then I run the test cases from the RunTestCases directory.
Last, I generate the coverage report for the source files from the src directory.
Everything seems OK: my test cases pass, but the coverage report shows 0%.
I used the command below to generate the coverage report:
gcovr -r . -e "include.*" --html --html-details -o $result_dir/result.html
Please help me.

How to ignore header files in gcov output?

I am using gcov and gcovr to generate my code's test coverage (the tests are written with Google Test).
So, I compile with the -ftest-coverage -fprofile-arcs options, and then I run gcovr (which itself runs gcov).
However, in my output I have the .cpp files with a coverage of 100%, but also the .h files, even though they contain no executable code, and hence show 0% coverage.
This 0% doesn't mean anything, and hence I would like to remove the .h files from the coverage output. I can't find anything about that...
I already tried adding -e "*.h" to the gcovr options to exclude files with the .h extension, but it doesn't work (it actually excludes everything...).
Does anybody have an idea?
Thank you!
I was also struggling with this, and have now found the right solution. Here is an example and a way of working:
When you run gcovr, use the option -o coverage.xml, then open the file. Find the filename you want to exclude and copy it.
Open the following link in your browser and paste the entire filename into the part that says TEST STRING:
https://regex101.com/
Now, where it says REGULAR EXPRESSION, create a regular expression which makes the entire filename turn blue. Make sure this expression does not also match other files which do need to show up in your coverage report.
Some basic rules: usually, for "some characters" you can use ".*", which has nothing to do with shell globs like *.cpp, where you do want to see the cpp files! So if you want to exclude anything like "<some characters>include<more characters>", you can use ".*include.*". Note that this will also exclude a filename like include.cpp.
Because the . has a meaning in regular expressions, escape it as \. So if you want to exclude "<something>.h" files, use ".*\.h".
Example of what works for me (exclude include files, the test framework, the Linux include directory, and Jenkins files):
I also want to exclude any cpp file containing the word test, which I can do with ".*test[_-|A-Z|a-z]*\.cpp".
Real life example:
gcovr --xml -e ".*unit_test\.cpp" -e ".*\.h" -e ".*usr/include/.*" -e ".*jenkins.*" -e ".*test[_-|A-Z|a-z]*\.cpp" -o coverage.xml
have fun!
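One way to sanity-check such exclude regexes before passing them to gcovr is to try them against a list of sample paths with grep -E; grep's extended regex syntax is close enough to the Python regexes gcovr uses for simple patterns like these (the file names below are made up):

```shell
# Lines matching an exclude regex are the ones gcovr would drop
# from the report; grep -Ev keeps only the survivors.
printf 'src/foo.cpp\nsrc/foo.h\ntest/unit_test.cpp\n' |
  grep -Ev '\.h$|unit_test\.cpp$'
# -> src/foo.cpp
```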
Just use -e to exclude your files.
Your command will look like:
gcovr -r . -e "include.*" --html --html-details -o $result_dir/result.html
It will exclude all include files of your project.