I'm in the process of generating coverage information for a C++ application, using gtest for the tests; the application is part of a Buildroot build. I got correct coverage information for almost all files except the header files. I googled the issue and found the link below useful, but it didn't give me the full answer.
gcov is not generating coverage information for header files
From the link, I understood that to obtain "mymoneyaccount.cpp.gcov" you execute "gcov mymoneyaccount.cpp", but to obtain "mymoneyaccount.h.gcov" you need to execute "gcov mymoneyaccounttest.cpp". My first doubt: does anyone have an idea why we need to run gcov on the test file to generate the .gcov of the header file? Both files include "mymoneyaccount.h".
My exact scenario: I'm trying to get the code coverage of source files located in the "Source" folder, while the test application is located in the "Test" folder.
Here is the folder structure:
Source
  a
    logic.cpp
    logic.h
Test
  a
    logictest.cpp
Both Source and Test are part of the Buildroot build system; I cross-compiled them and ran the tests on a Raspberry Pi 3. As part of the compilation process, logic.cpp.gcno and logictest.cpp.gcno were generated on the build PC, and as part of the execution process on the Raspberry Pi 3 the corresponding .gcda files were generated. I copied the .gcda files from the RPi to the corresponding locations on the build PC, executed gcov on the .gcno files and got the coverage details.
The coverage in logic.cpp.gcov is valid, and the .h.gcov in the Test directory looks satisfying, whereas the .h.gcov in the Source directory is not. Opening the .h.gcov in the Source dir shows line counts with the exact source code (but not valid coverage data), while the .h.gcov in the Test dir shows the line counts with the code as /EOF/ (which I hope is the valid coverage data). I hope it is clear now. Is there any solution to view the content of the .h.gcov in the Test directory with the actual code? Only then can we be sure that we have valid coverage data and implement more test cases for the remaining code. Thank you.
Related
Background
I have a large C++ application with a complex directory structure. The structure is so deep that the code repository can't be stored in the Jenkins workspace but has to live in some root directory, otherwise the build fails because the path length limit is exceeded.
Since the application is tested in different environments, the test application is run on a different machine. The application and all resources are compressed and copied to the test machine, where the tests are run using OpenCppCoverage, producing a Cobertura XML as a result.
Since the source code is needed to show the coverage result, the XML is copied back to the build machine and then fed to the Jenkins Cobertura plugin.
Problem
The coverage report shows only percentage results per module or source file. The code content is not shown; instead, this error message is displayed:
Source
Source code is unavailable. Some possible reasons are:
This is not the most recent build (to save on disk space, this plugin only keeps the most recent build’s source code).
Cobertura found the source code but did not provide enough information to locate the source code.
Cobertura could not find the source code, so this plugin has no hope of finding it.
You do not have sufficient permissions to view this file.
Now I've found this SO answer, which is promising:
The output xml file has to be in the same folder as where coverage
is run, so:
coverage xml -o coverage.xml
The reference to the source folder is put into coverage.xml and if
the output file is put into another folder, the reference to the
source folder will be incorrect.
Problem is that:
I've run the tests on a different machine (this can be overcome by a script which modifies the paths in the XML).
My source code can't be inside the workspace at build time.
Placing the XML in the respective directory of the source code is not accepted by the Cobertura plugin. It ends with this error:
[Cobertura] Publishing Cobertura coverage report...
FATAL: Unable to find coverage results
java.io.IOException: Expecting Ant GLOB pattern, but saw 'C:/build_coverage/Products/MyMagicProduct/Src/test/*Coverage.xml'. See http://ant.apache.org/manual/Types/fileset.html for syntax
This is part of the XML result (before modifications):
<?xml version="1.0" encoding="utf-8"?>
<coverage line-rate="0.63669186741173223" branch-rate="0" complexity="0" branches-covered="0" branches-valid="0" timestamp="0" lines-covered="122029" lines-valid="191661" version="0">
<sources>
<source>c:</source>
<source>C:</source>
</sources>
<packages>
<package name="C:\jenkins\workspace\MMP_coverage\MyMagicProduct\src\x64\Debug\MMPServer.exe" line-rate="0.63040511358728513" branch-rate="0" complexity="0">
<classes>
<class name="AuditHandler.cpp" filename="build_coverage\Products\MyMagicProduct\Src\Common\AuditHandler.cpp" line-rate="0.92682926829268297" branch-rate="0" complexity="0">
<methods/>
<lines>
<line number="18" hits="1"/>
<line number="19" hits="1"/>
<line number="23" hits="1"/>
<line number="25" hits="1"/>
<line number="27" hits="1"/>
....
</lines>
</class>
....
The biggest issue is that I'm not sure whether the location of the XML is actually the problem, since the plugin doesn't report details of the issues it encounters when trying to fetch/find the respective source code. The second bullet from Cobertura, which might explain the problem, is totally confusing:
Cobertura found the source code but did not provide enough information to locate the source code.
What else I've tried
I've ensured that anyone can read the source code (to rule out an access problem)
I've modified the XML so that filename contains a path relative to the Jenkins workspace, or to the path where the XML file with the coverage report is located
I've copied my source code to various locations, even ones containing a "cobertura" directory, since I found something like this in the plugin source code
I've tried to understand the issue by inspecting the plugin's source code
I've found some (a bit old) GitHub project which may be a hint how to fix it - currently I'm trying to understand what exactly it does (I don't want to import this project into my build structure).
So far no luck.
Update:
Suddenly (I'm not sure what I have done) it works for my account. The problem is that it works only for me; all other users have the same issue. This clearly indicates that the issue must be security-related.
I encountered a very similar issue when I had to develop a CI pipeline for a very large C++ client. I had the best results when I avoided the Cobertura plugin and instead used the HTML Publisher plugin. The main issue I had was also finding the source files.
Convert OpenCppCoverage result to HTML
This step is quite easy. You have to add the parameter --export_type=html:<outputPath> (see Commandline-reference) to the OpenCppCoverage call.
mkdir CodeCoverage
OpenCppCoverage.exe --export_type=html:CodeCoverage <GoogleTest.exe>
The commands above should result in an HTML file at <jenkins_workspace>/CodeCoverage/index.html.
Publish the OpenCppCoverage result
To do this we use the HTML Publisher plugin mentioned above. reportDir is the directory created in step one, which contains our HTML file. Its path is relative to the Jenkins workspace.
publishHTML target: [
allowMissing: false,
alwaysLinkToLastBuild: true,
keepAll: true,
reportDir: 'CodeCoverage',
reportFiles: 'index.html',
reportName: 'Code Coverage'
]
and to be sure that everyone can download and check the result locally, we archive the result of OpenCppCoverage:
archiveArtifacts artifacts: 'CodeCoverage/*.*'
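Putting the steps together, one possible shape for the whole pipeline stage looks roughly like this (the `bat` steps for a Windows node and the test executable name GoogleTest.exe are placeholders for your own build):

```groovy
stage('Code Coverage') {
    steps {
        // Run the instrumented tests and write the HTML report
        bat 'mkdir CodeCoverage'
        bat 'OpenCppCoverage.exe --export_type=html:CodeCoverage GoogleTest.exe'
        // Publish the report in the Jenkins sidebar
        publishHTML target: [
            allowMissing: false,
            alwaysLinkToLastBuild: true,
            keepAll: true,
            reportDir: 'CodeCoverage',
            reportFiles: 'index.html',
            reportName: 'Code Coverage'
        ]
        // Keep the raw report downloadable as a build artifact
        archiveArtifacts artifacts: 'CodeCoverage/*.*'
    }
}
```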
You can see the result now in the sidebar of your pipeline under Code Coverage and the result will look like the following:
This is the solution that worked for me.
I hope this helps at least a bit. I can only advise avoiding the Cobertura plugin; I wasted so much time trying to fix it and make it recognize my sources...
OK, I've found the reasons why I had problems with this plugin.
The XML from OpenCppCoverage is just correct. No changes are needed to make it work (as long as the sources are where the PDB file points to). Sources outside the Jenkins workspace are not the problem here. When I copied the executable from the build machine to the test machine, ran the tests with OpenCppCoverage and copied the result back to the build machine, everything was just fine.
In the job configuration, any user who is supposed to view the code coverage has to have access to Job/Workspace in the security section. In my case I've enabled this for all logged-in users. This covers the last bullet point of the error message.
Most important thing: the build must be successful, from beginning to end. It doesn't matter whether the step containing the call to the Cobertura plugin was successful: if any step (even a later one) fails, Cobertura will not show the code for this coverage run. In my case the build job was failing since one of the tests was timing out. This was caused by the OpenCppCoverage overhead, which slows the tests down by a factor of 3. My script was detecting the timeout and killing one of the tests.
I discovered by accident that the unsuccessful build was the problem. During my experiments I noticed two cases in which Cobertura showed the source code:
I reran the job with all steps removed except the one responsible for publishing the Cobertura results
I ran the whole job in such a way that it executed a single test case, which passed
Not showing coverage if the build is unsuccessful is reasonable (if a test failed, then most probably a wrong branch of the code was taken), but the UI should indicate that differently.
Conclusion
This is a great example of how important it is to report errors to the user with precise details of what went wrong and why. I wasted at least a whole week figuring out what was actually wrong and which bullet point of the error message applied to my case. In fact, the error message from the plugin doesn't cover all the reasons for not showing the code.
I will file a report that the plugin should give a better explanation of what went wrong.
I'm in the middle of documenting my C++ GUI library and I just started using Doxygen. I've got two test files that are documented now, but I have problems when trying to generate the CHM help files. Doxygen runs without error, and dot appears to be functioning correctly to generate images.
However, it appears the resulting .hhc, .hhk, and .hhp files are broken in some way. index.hhc and index.hhk are exactly the same, and running 'hhc index.hhp' does not work. It returns an error:
HHC6000: Error: An internal file could not be created. Make certain there is enough disk space on the drive where you are compiling your file.
HHC5007: Error: Fatal navigational compilation error. This is likely the result of an invalid contents (.hhc) file.
I have uploaded a zip file of my two test sources, the Doxyfile generated by the Doxy Wizard, and the .hh* files created by doxygen.
http://members.allegro.cc/EdgarReynaldo/temp/test1.zip
Both HTML Help Workshop and GraphViz are on my path.
Do I need to change a setting in the doxyfile? How do I fix this?
Regards, bugsquasher
EDIT
After taking albert's advice, everything seemed to magically work. Nothing was really different, though.
I cannot get coverage reporting to work in SonarQube. I have a C++ project for which I am using build-wrapper-linux-x86-64 along with sonar-scanner. The basic static analysis of the source code seems to work, but nothing about test code coverage is reported in SonarQube.
As part of the same workflow I am using lcov and genhtml to produce a unit test coverage report, so I am confident that most of the code coverage steps are executed correctly. When I manually view the .gcov files I can see run counts in the first column, so the data is there.
I have my code organised into modules. The sonar-project.properties file includes the following:
# List of the module identifiers
sonar.modules=Module1,Module2
# Path is relative to the sonar-project.properties file. Replace "\" by "/" on Windows.
# This property is optional if sonar.modules is set.
sonar.sources=./Sources,./Tests
HeliosEmulator.sonar.sources=./Application,./Sources,./Tests
sonar.cfamily.build-wrapper-output=build_output
# Existing reports
sonar.cfamily.build-wrapper-output=build_output
#sonar.cfamily.cppunit.reportsPath=junit
sonar.cfamily.gcov.reportsPath=.
#sonar.cxx.cppcheck.reportPath=cppcheck-result-1.xml
#sonar.cxx.xunit.reportPath=cpputest_*.xml
sonar.junit.reportPaths=junit
I would also like to get the unit test results displayed in the Sonar tools. As I am using the CppUTest framework, I do not have xunit or junit test output at present. This can be dealt with as a separate issue, but since I have been unable to find much documentation on how to use the cfamily scanner online, I do not know whether the tests not being listed is relevant.
I had forgotten to set up my CI system correctly. The .gcov files did not exist for the job that was running sonar-scanner; they only existed in the testing job that generated the coverage report. With no files in the scanner job, it cannot produce a coverage report.
When I set up the GitLab CI system I am using to keep the .gcov files as artefacts, the coverage reporting suddenly started working.
The .gcov files were generated by a test job and need to be transferred to the sonar-scanner job via the artefact store. This is because GitLab CI does not share a work area between dependent jobs, and you have to say explicitly which files must be copied.
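The hand-off described above might be wired up roughly like this in .gitlab-ci.yml (job and stage names, the test script, and the .gcov locations are assumptions for illustration):

```yaml
stages:
  - test
  - analyse

unit_tests:
  stage: test
  script:
    - ./run_tests.sh            # hypothetical: runs the suite and leaves *.gcov behind
  artifacts:
    paths:
      - "**/*.gcov"             # keep the coverage files for downstream jobs

sonar_scan:
  stage: analyse
  dependencies: ["unit_tests"]  # pulls the .gcov artifacts into this job's workspace
  script:
    - sonar-scanner
```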
Problem:
I'm using the following flags to generate the code coverage of my Qt application (.pro file):
QMAKE_CXXFLAGS += --coverage
QMAKE_LFLAGS += --coverage
The code coverage is correctly generated; the problem is that if I want to run only one test function/class (and the GCDA files were already created), I get the following error message:
profiling: /Users/user/.../build-myapp/myclass.gcda: cannot merge previous GCDA file: corrupt arc tag (0x00000000)
Note that the error message is shown for each GCDA file. Also note that it doesn't seem to affect the test cases.
Workaround:
As explained here, it "is a result of the build tools failing to merge current results into the existing .gcda coverage files". As answered in that question, one option is to delete the GCDA files before running the tests, for example by adding the following command in the build phase:
find . -name "*.gcda" -print0 | xargs -0 rm
Question:
My problem is that I don't want to delete the old GCDA files every time I run the test cases. As I'm running only one test function/class, I want to keep the old GCDA files as they are and only merge the GCDA file related to the current class. As I manually checked, this is already happening: only the coverage of my current class is updated, and the old coverage remains the same.
So, is there a command to just ignore (don't show) the error messages related to the GCDA merging problems? Or even better, a command to only update the GCDA files related to the current test class?
Note: I'm using Qt 5.3.2 on macOS Sierra with Clang.
Related questions:
Code coverage warnings spam output
How to merge multiple versions of gcda files?
.gcda files don't merge on multiple runs
When you compile with profiling, the results of each run are stored in a file that ends with .gcda.
The error appears when your existing .gcda file is in a format different from the one expected by the program that just finished running.
This happened to me when I had run a Linux version of my executable, then recompiled for MacOS and ran it. The MacOS program saw the existing .gcda file and generated pages of errors.
Eliminate the errors by removing the existing .gcda file.
I came across the same problem. My solution:
There is a general Test.pro project which uses SUBDIRS to include every test project as a subproject.
In each test subproject I have the following line:
QMAKE_POST_LINK = rm -f "*.gcda"
This deletes the *.gcda files only for the subproject that was just relinked. Unmodified projects keep their *.gcda files.
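A hedged sketch of that layout (the project and directory names are placeholders, not from the original setup):

```
# Test.pro - umbrella project that pulls in every test subproject
TEMPLATE = subdirs
SUBDIRS += logictest othertest

# logictest/logictest.pro - each test subproject
QMAKE_CXXFLAGS  += --coverage
QMAKE_LFLAGS    += --coverage
QMAKE_POST_LINK  = rm -f "*.gcda"   # runs only when this subproject is relinked
```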
I am using HWUT for unit testing and want to put my tests under version control. Adding the test code and the GOOD folder is obvious, but what about other files, e.g. the ADM folder?
NEEDED
GOOD/*
hwut-info.dat: if you specify it.
Makefile: if you specify it.
Your test scripts and the source files that implement the tests.
ADM/cache.fly: optional; only check it in if queries on past tests are to be answered without rerunning the tests.
NOT TO BE CHECKED IN
OUT/*
Any result produced by 'make'
Any temporary log files
Note: SCMs usually have an 'ignore' property or an 'ignore' file. You may adapt it according to the information above.
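For git, for example, the lists above translate into a small ignore file along these lines (adjust the paths to your layout):

```
# HWUT outputs that should not be versioned
OUT/
# Temporary log files and anything produced by 'make'
*.log
# Uncomment if you choose not to version the query cache either:
# ADM/cache.fly
```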