Codecov: gcov code coverage for a C++ library

I am writing a small C++ static library. Within GitHub Actions I have a "unit test" workflow which compiles and runs a test program, and the code coverage is submitted to the Codecov service afterwards. It runs g++ test.cpp library/library.cpp --coverage -o test, then ./test, followed by gcov -o . test.cpp. In the next step the results are submitted to my Codecov account with the standard bash <(curl -s https://codecov.io/bash) (with CODECOV_TOKEN set as an environment variable). Everything works fine up to this point.
However, in the coverage I see reports for the .cpp and .h files inside the library directory, but also coverage of test.cpp itself. I am not interested in the coverage of my test code, and it is skewing the statistics. Is there any way to submit coverage reports for the library files only?

There is no easy way to exclude files at the gcov level, but I solved the issue by restricting the codecov submission: https://docs.codecov.io/docs/ignoring-paths
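For reference, the relevant part of codecov.yml might look like the sketch below; the paths are placeholders for wherever the test sources live in your repository:

ignore:
  - "test.cpp"
  - "tests/**"

With that in place, Codecov drops the test files from the report, so the statistics only reflect the library sources.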

Related

How to generate code coverage for an entire C++ codebase with gcov/lcov

We have a large body of C/C++ code that is cross-compiled for an embedded Linux target. We've recently begun implementing unit tests (using gmock/gtest) that are run on our development server (which is Linux as well). The unit tests are executed automatically when check-ins are detected (we're using Microsoft Azure pipeline).
We're using gcov and lcov to analyze & report code coverage during those unit tests, which has worked out fairly well. However, given that we didn't start out unit testing, a large portion of our codebase is not covered by unit tests. An interesting metric beyond "what is the unit test coverage for those files being unit tested" is "how much of our entire codebase is being covered by unit tests", which includes those files not currently being unit tested. With gcov, you need to actually compile & link a given source file and then execute the resulting program to get the possible coverage data for that file.
I have used the following script, but it only generates a report for the classes that have unit tests:
# Get the path to the current folder
SCRIPT_DIR=$(pwd)
# SRC_DIR is the directory containing the .gcno files (%{buildDir} in Qt Creator)
SRC_DIR="$SCRIPT_DIR/../../build-KEBPLCComService-Desktop-Debug/"
# COV_DIR is the directory where the coverage results will be stored
COV_DIR="$SCRIPT_DIR/../coverage"
############################################################################################################
# Path where the HTML files should be saved
HTML_RESULTS="${COV_DIR}/html"
# Create the html folder if it does not exist
mkdir -p "${HTML_RESULTS}"
# Generate our initial info
lcov -d "${SRC_DIR}" -c -o "${COV_DIR}/coverage.info"
# Remove paths/files we don't want in the coverage report (e.g. third-party libraries) and write a filtered coverage file (edit the patterns as needed)
lcov -r "${COV_DIR}/coverage.info" "*Qt*.framework*" "*.h" "*/tests/*" "*Xcode.app*" "*.moc" "*moc_*.cpp" "*/test/*" "*/build*/*" -o "${COV_DIR}/coverage-filtered.info"
# Generate the HTML files
genhtml -o "${HTML_RESULTS}" "${COV_DIR}/coverage-filtered.info"
# Reset the coverage counters (the .gcda files live in the build directory)
lcov -d "${SRC_DIR}" -z
# Open the index.html
firefox "${HTML_RESULTS}/index.html"
How can I get code coverage for the entire codebase?
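One commonly suggested lcov pattern for this (a sketch, reusing the SRC_DIR/COV_DIR variables from the script above) is to capture a zero-count baseline right after the build and merge it with the post-test capture, so compiled-but-untested files still show up at 0%:

# Capture an --initial baseline with zero counts for every compiled file
lcov -d "${SRC_DIR}" -c -i -o "${COV_DIR}/coverage-base.info"
# ... run the unit tests here ...
# Capture the real counts produced by the tests
lcov -d "${SRC_DIR}" -c -o "${COV_DIR}/coverage-test.info"
# Merge both captures; untested files keep their zero counts
lcov -a "${COV_DIR}/coverage-base.info" -a "${COV_DIR}/coverage-test.info" -o "${COV_DIR}/coverage.info"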

How do I produce a graphical code profile report for C++ code compiled with Clang LLVM?

How do I produce a graphical code profile report for C++ code compiled with Clang LLVM?
What command-line options do I pass to clang++ to instruct it to gather profiling data when the code is executed?
Into which file(s) is the gathered profiling data stored?
What are the post-processing steps to convert the collected profile data into a graphical report that shows how often each function is called, what percentage of time is spent in each function, and from which functions each function is called (similar to https://s3-us-west-2.amazonaws.com/brunorijsman-public/example-rift-python-code-profile.png)?
I have full control over the C++ source code and Makefile.
It has to be LLVM clang++ (GNU g++ is not an option for me). Xcode is also not an option for me.
Clang supports a couple of different code coverage implementations (which also report how often a line has been executed), such as Source-Based Code Coverage and a gcov-compatible one. Open-source tools seem to have better support for gcov output in general, so I would recommend that route.
What command-line options do I pass to clang++ to instruct it to gather profiling data when the code is executed?
For Source-Based Code Coverage:
According to the llvm-cov documentation, the flags for gathering profiling data are -fprofile-instr-generate -fcoverage-mapping when compiling and -fprofile-instr-generate when linking.
For the gcov compatible output: -fprofile-arcs -ftest-coverage
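For illustration, a minimal build in each mode might look like this (main.cpp and your_program are placeholder names):

# Source-Based Code Coverage: instrument while compiling, keep the flag when linking
clang++ -fprofile-instr-generate -fcoverage-mapping -c main.cpp -o main.o
clang++ -fprofile-instr-generate main.o -o your_program
# gcov-compatible instrumentation, compile and link in one step
clang++ -fprofile-arcs -ftest-coverage main.cpp -o your_program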
Into which file(s) is the gathered profiling data stored?
For Source-Based Code Coverage:
After you run the program compiled and linked with the flags above, the coverage data is stored in default.profraw in your current working directory. The profiling data file name can be changed by recompiling with -fprofile-instr-generate=filename or by setting the LLVM_PROFILE_FILE environment variable before running the executable.
For gcov compatible output: After you run the program you will get *.gcda and *.gcno files.
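For example, to give each run its own profile file (the %p pattern in LLVM_PROFILE_FILE expands to the process ID):

LLVM_PROFILE_FILE="your_program-%p.profraw" ./your_program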
What are the post-processing steps to convert the collected profile data into a graphical report that shows how often each function is called, what percentage of time is spent in each function
For Source-Based Code Coverage:
Index your .profraw file into a .profdata file: llvm-profdata merge -o default.profdata -sparse=true default.profraw
Either use llvm-cov show --instr-profile default.profdata ./your_program to view the coverage in the terminal or use llvm-cov export ./your_program --instr-profile default.profdata > out.json to convert your profiling data to JSON and find/create a program to generate a report for you.
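llvm-cov can also write an HTML report directly, which covers the per-function execution counts part of the question; a sketch using the same placeholder names as above:

llvm-profdata merge -sparse default.profraw -o default.profdata
# Per-function and per-file summary in the terminal
llvm-cov report ./your_program -instr-profile=default.profdata
# Static HTML report with per-line execution counts
llvm-cov show ./your_program -instr-profile=default.profdata -format=html -output-dir=coverage-html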
For gcov-compatible output:
Use lcov or gcovr to generate HTML output. This lets you easily view line and branch coverage for each file. I tend to use gcovr since it is a simple pip install gcovr away if you don't have it installed. Then the usage would be gcovr --gcov-executable "llvm-cov gcov" -r . --html --html-details -o out.html.
and from which functions each function is called (similar to https://s3-us-west-2.amazonaws.com/brunorijsman-public/example-rift-python-code-profile.png)?
For this type of information I would try to have a look at Callgrind and KCacheGrind. I have not found any tool which can generate this type of information given *.profdata or *.gcda files.
gprof is another profiling tool that you can use. There's a problem, however: it only counts CPU time spent in the process, so it basically can't see I/O calls, and it is also confused by recursion. Callgrind shares similar problems, and KCacheGrind uses Valgrind, which actually interprets all the code. At the end of the day, though, these are the options; personally, for small-scale work I would go with gprof, simply due to its usability and documentation.
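If you do go the Callgrind/KCacheGrind route, the workflow is roughly the following (your_program is a placeholder; no recompilation is required):

# Record call counts and caller/callee relations; writes callgrind.out.<pid>
valgrind --tool=callgrind ./your_program
# Browse the resulting call graph and cost breakdown graphically
kcachegrind callgrind.out.*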

gcov -r generates no gcov file

I want to generate a coverage report for my GitHub repo; it's a big C++ project built with CMake.
I add the --coverage option (equivalent to -fprofile-arcs -ftest-coverage) to the g++ compile and link arguments in my CMakeLists.txt, and I use gcov to generate code coverage data and upload it to codecov.io.
These automated builds and tests are done in my Jenkins CI. The relevant part of the Jenkins execute-shell step is here:
cmake -DCODE_COVERAGE=ON -Dasan=OFF -DNEBULA_ASAN_PRELOAD= ..
make -j8
ctest -j8
#generate code coverage report and upload it to codecov.io
TEST_FILES=$(find . -name '*.cpp.o')
#gcov -abcr $TEST_FILES
for test_file in $TEST_FILES; do gcov -abcr $test_file; done
curl -s https://codecov.io/bash | bash -s - -t c85ebdca-ec7c-4301-a6ed-7967cf175db5
When Jenkins executes make -j8, ctest -j8, and gcov -abcr $test_file, I do get the related .gcno, .gcda, and .gcov files. But when Jenkins executes curl -s https://codecov.io/bash | bash -s - -t c85ebdca-ec7c-4301-a6ed-7967cf175db5 to upload the data files to Codecov, the output seems strange. I see a lot of similar errors like:
ExecutionPlan.gcno:'_ZNK6nebula4cpp29ValueType8get_typeEv' has arcs to entry block
ExecutionPlan.gcno:'_ZNK6nebula4cpp29ValueType8get_typeEv' has arcs from exit block
And the -r option of gcov doesn't seem to work, because I also get a lot of coverage reports for system header files.
And at the end Jenkins tells me the build failed, so I can't see the coverage report in Codecov.
PS:
The source .cpp files are not in the same directory as the .cpp.o files (CMake's default C++ object suffix is .cpp.o, not .o). I run the execute-shell code above in the build directory, and the .gcno, .gcda, and .gcov files are generated in that same directory.
Because CMake's default object suffix is .cpp.o, the generated files are named .cpp.gcno and .cpp.gcda. So I use gcov filename.cpp.o instead of gcov filename.cpp, which would give me "couldn't open filename.gcno" errors.
I did some experiments to find the reason for the errors. When I give gcov a single input file, like gcov 1.cpp.o, I get no "has arcs from exit block" errors, but when I give gcov more than one input file, like gcov 1.cpp.o 2.cpp.o, I do get them. I think a probable reason is something about combining multiple gcov files.
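Not an answer to the "has arcs" warnings themselves, but one way to sidestep the per-file gcov invocations entirely is to let gcovr (mentioned in the previous question) walk the build tree and hand Codecov a single Cobertura XML file; a sketch, where /path/to/source and ${CODECOV_TOKEN} are placeholders:

# Run from the CMake build directory; -r points at the source tree and the
# trailing "." is where gcovr searches for the .gcno/.gcda files
gcovr -r /path/to/source --xml -o coverage.xml .
# Upload only that report instead of the individual .gcov files
curl -s https://codecov.io/bash | bash -s - -t "${CODECOV_TOKEN}" -f coverage.xml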

How to get correct code coverage using gcov? It keeps showing 0% covered

I am trying to get code coverage on a subset of a massive codebase and am currently experimenting with gcov.
I compiled the subset of the codebase with the compile flags -fprofile-arcs -ftest-coverage and the -lgcov link flag. All the .gcno files are created.
Now when I exercise a file, say A.cpp (not directly from the terminal; its functions are called from some other file), I see inconsistent behavior with the .gcda file.
It doesn't get updated on every run (I checked the stat of the .gcda file), but it does get updated on some runs.
When I run gcov, it reports 0% coverage for A.cpp as well as all the other .cpp files.
I have verified that the code does run, since the results obtained are only possible if the functions provided in A.cpp were executed.
I have checked the versions of gcov and gcc (I am using gcc 8.3.1) and don't see any mismatch. I have tried running gcov on a small test file, factorial.cpp, and it shows correct coverage there.
I am expecting some kind of code coverage for the subset of the codebase.
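For reference, a minimal gcov sanity check looks like the sketch below; the key detail is that -o must point at the directory holding the matching .gcno/.gcda pair, and the paths here are placeholders:

# 1. Confirm the .gcda was actually rewritten by the run you care about
ls -l path/to/objects/A.gcda
# 2. Invoke gcov on the source file, pointing -o at the object directory
gcov -o path/to/objects A.cpp
# Note: counters are only flushed when the instrumented process exits normally;
# a process that is killed or never returns from main leaves the .gcda stale.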

Temporarily get jest coverage to show only files in a specific folder

For my React app, I'm adding integration tests to a view and want to keep an eye on coverage along the way. I'm currently getting coverage for all files on every run.
Question
While adding tests to increase coverage, how can I get jest coverage to show only files in a specific folder?
Examples tried
$ yarn test --collectCoverageFrom=src/app/components/Tools
Tests run, but no coverage is shown here.
$ yarn test Tools --coverage --collectCoverageFrom=src/app/components/Tools
I get Ran all test suites matching "Tools".
$ yarn test src/app/components/Tools --coverage
Here I see the coverage percentage is smaller but still lists all files.
$ yarn test -o --coverage
Again as the previous, I see the coverage percentage is smaller but still lists all files.
It would be best to have an argument in the CLI, but you can add a temporary script in your package.json:
"tools-coverage": "react-app-rewired test --env=jsdom src/app/components/Tools --coverage --collectCoverageFrom=src/app/components/Tools/**/*.js"
Running the full command in the terminal just doesn't work.
Now you can instead run $ yarn tools-coverage to test just the Tools folder you targeted in your package.json.
An example shown here in the create-react-app issue:
https://github.com/facebook/create-react-app/issues/1455#issuecomment-277010725