Which HWUT files to put under version control

I am using HWUT for unit testing and want to put my tests under version control. Adding the test code and the GOOD folder is obvious. But what about other files e.g. the ADM folder?

NEEDED
GOOD/*
hwut-info.dat: if you specify it.
Makefile: if you specify it.
Your test scripts and the source files that implement the tests.
ADM/cache.fly: optional; check it in only if queries on past test runs should work without re-running the tests.
NOT TO BE CHECKED IN
OUT/*
Anything produced by 'make'
Temporary log files
Note: most SCMs have an 'ignore' property (e.g. Subversion's svn:ignore) or an ignore file (e.g. Git's .gitignore). You may adapt it according to the information above.
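For Git, a minimal sketch of such an ignore file might look like the following (the log-file pattern is an assumption; adjust it to whatever your tests actually produce):

```
# .gitignore for an HWUT test directory (sketch)
OUT/
ADM/        # drop this line if you decide to check in ADM/cache.fly
*.log       # temporary log files -- adjust to your setup
```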


How to determine the tree of files which are imported during a test case?

When I run a test in Go, is there any way for me to get the list of files that the code imports, directly or indirectly? For example, this could help me rule out changes from certain parts of the codebase when debugging a failing test.
Alternatively, with Git, can we find out what the lowest common ancestor git tree node is for the files exercised in a given test?
Context: I'm looking into automated flakiness detection for my test suite, and I want to be able to know the dependency tree for every test so that I can detect flaky tests better.
For example, if TestX fails for version x of the code, and later on some files in the same codebase which are not used at all by TestX are changed, and then TestX passes, I want to be able to detect that this is a flaky test, even though the overall codebase that the test suite ran on has changed.
You are probably looking for go list -test -deps [packages].
For an explanation of what the flags do, you can check Go command List packages or modules:
-deps:
The -deps flag causes list to iterate over not just the named packages but also all their dependencies. It visits them in a depth-first post-order traversal, so that a package is listed only after all its dependencies. [...]
-test:
The -test flag causes list to report not only the named packages but also their test binaries (for packages with tests), to convey to source code analysis tools exactly how test binaries are constructed. The reported import path for a test binary is the import path of the package followed by a ".test" suffix, as in "math/rand.test". [...]
Maybe I'll state the obvious, but remember that list works on packages, not single files, so the command above will include dependencies of the non-test sources (which should be what you want anyway).
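Putting the two flags together, a sketch of the kind of invocation meant here (the package path `./mypkg` is a placeholder for your own package):

```
# List the package, its test binary, and every transitive dependency,
# one import path per line, dependencies listed before dependents.
go list -test -deps ./mypkg

# For the flakiness use case: save the list per test package, then
# check whether a changed file's package appears in it (sketch only).
go list -test -deps ./mypkg > testx_deps.txt
```

Note that the output for a package with tests also contains the synthetic ".test" entry described above.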

Why does test.h.gcov show the content as /EOF/?

I'm in the process of generating coverage information for a C++ application, using gtest for the tests; the application is part of a Buildroot build. I got correct coverage information for almost all files except the header files. So I googled and found the link below useful; having said that, I didn't get the correct answer from it.
gcov is not generating coverage information for header files
From the link, I understood that to obtain "mymoneyaccount.cpp.gcov" we execute "gcov mymoneyaccount.cpp", but to obtain "mymoneyaccount.h.gcov" we need to execute "gcov mymoneyaccounttest.cpp". My first question: does anyone have an idea why we need to run gcov on the test source to generate the .gcov of the header file? Both files include "mymoneyaccount.h".
My exact scenario is I'm trying to get the code coverage of source files which is located in "Source" folder and the test application is located in the "Test" folder.
Please find the folder structure.
Source
  a
    logic.cpp
    logic.h
Test
  a
    logictest.cpp
Both Source and Test are part of the Buildroot build system; I cross-compiled them and ran the test on a Raspberry Pi 3. As part of the compilation process, logic.cpp.gcno and logictest.cpp.gcno files were generated on the build PC, and as part of the execution process on the Raspberry Pi 3 the corresponding .gcda files were generated. I copied the .gcda files from the RPi to the corresponding locations on the build PC, executed gcov on the .gcno files, and got the coverage details.
The coverage in logic.cpp.gcov is valid, and the .h.gcov in the Test directory is satisfying, whereas the .h.gcov in the Source directory is not. Opening the .h.gcov in the Source dir, it shows line counts with the exact source code (not valid coverage data), but the .h.gcov in the Test dir shows the line counts with the code as /EOF/ (which I hope holds the valid coverage data). I hope it is clear now. Is there any solution to view the content of the .h.gcov in the Test directory as the exact code? Only then can we ensure that we got valid coverage data and implement more test cases covering the remaining code. Thank you.

Testing generated Go code without co-locating tests

I have some auto-generated Go code for protobuf messages and I'm looking to add some additional testing without locating the test files under the same directory path. This is to allow easy removal of the existing generated code, to be sure that if a file is dropped from generation it is not accidentally left in the codebase.
The current layout of these files are controlled by prototool so I have something like the following:
/pkg/<other1>
/pkg/<other2>
/pkg/<name-generated>/v1/component_api.pb.go
/pkg/<name-generated>/v1/component_api.pb.gw.go
/pkg/<name-generated>/v1/component_api.pb.validate.go
The *.validate.go comes from envoyproxy/protoc-gen-validate, and *.pb.go & *.pb.gw.go are coming from protobuf and grpc libraries. The other1 and other2 are two helper libraries that we have included along with the generated code to make it easier for client side apps. The server side is in a separate repo and imports as needed.
Because it's useful to be able to delete /pkg/<name> before re-running prototool I've placed some tests of component_api (mostly to exercise the validate rules generated automatically) under the path:
/internal/pkg/<name>/v1/component_api_test.go
While this works for go test -v ./..., it appears not to work too well when generating coverage with -coverpkg:
go test -coverpkg=./... -coverprofile=coverage/go/coverage.out -v ./...
go build <pkgname>/internal/pkg/<name>/v1: no non-test Go files in ....
<output from the tests in /internal/pkg/<name>/v1/component_api_test.go>
....
....
coverage: 10.5% of statements in ./...
ok <pkgname>/internal/pkg/<name>/v1 0.014s coverage: 10.5% of statements in ./...
FAIL <pkgname>/pkg/other1 [build failed]
FAIL <pkgname>/pkg/other2 [build failed]
? <pkgname>/pkg/<name>/v1 [no test files]
FAIL
Coverage tests failed
Generated coverage/go/html/main.html
The reason for using -coverpkg is that without it nothing seems to notice that any of the code under <pkgname>/pkg/<name>/v1 is covered, and we've previously seen issues with the reported numbers not showing the real level of coverage, which are solved by use of -coverpkg:
go test -cover -coverprofile=coverage/go/coverage.out ./...
ok <pkgname>/internal/pkg/<name>/v1 0.007s coverage: [no statements]
ok <pkgname>/pkg/other1 0.005s coverage: 100.0% of statements
ok <pkgname>/pkg/other2 0.177s coverage: 100.0% of statements
? <pkgname>/pkg/<name>/v1 [no test files]
The resulting coverage/go/coverage.out includes no mention of anything under <pkgname>/pkg/<name>/v1 being exercised.
I'm not attached to the current layout, beyond <pkgname>/pkg/<name>/v1 being automatically managed by prototool and its rules around naming the generated files. I would like to ensure the other modules we have can remain exported for use as helper libraries, and I would like to be able to add tests for <pkgname>/pkg/<name>/v1 without locating them in the same directory, to allow for easy delete + recreate of the generated files, while still getting sensible coverage reports.
I've tried fiddling with the packages passed to -coverpkg and replacing ./... on the command line, and haven't been able to come up with something that works. Perhaps I'm just not familiar with the right invocation?
Other than that, is there a different layout that will take care of this for me?
To handle this scenario, simply create a doc.go file in the same directory as the dis-located tests, containing just the package clause and a comment. This allows the standard arguments to work, and Go appears to be reasonably happy with an otherwise empty file.
Once it is in place, the following works as expected:
go test -coverpkg=./... -coverprofile=coverage/go/coverage.out -v ./...
Idea based on suggestion in https://stackoverflow.com/a/47025370/1597808
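A minimal sketch of such a doc.go, assuming the test directory's package is named v1 to match the layout above:

```go
// Package v1 holds tests for the generated component_api package.
// This file exists only so that the directory contains a non-test
// Go file, which keeps go build and -coverpkg happy.
package v1
```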

How Do I Setup SonarQube cfamil.gcov Correctly?

I cannot get coverage reporting to work within SonarQube. I have a C++ project for which I am using the build-wrapper-linux-x86-64 along with the sonar-scanner. The basic static analysis for the source code seems to work but there is nothing about test code coverage reported within SonarQube.
As part of the same workflow I am using lcov and genhtml to make a unit test coverage report, so I am confident that most of the code coverage steps are being correctly executed. When I manually view the .gcov files I can see run counts in the first column, so there is data there.
I have my code organised into modules. The sonar-project.properties file includes the following:
# List of the module identifiers
sonar.modules=Module1,Module2
# Path is relative to the sonar-project.properties file. Replace "\" by "/" on Windows.
# This property is optional if sonar.modules is set.
sonar.sources=./Sources,./Tests
HeliosEmulator.sonar.sources=./Application,./Sources,./Tests
sonar.cfamily.build-wrapper-output=build_output
# Existing reports
sonar.cfamily.build-wrapper-output=build_output
#sonar.cfamily.cppunit.reportsPath=junit
sonar.cfamily.gcov.reportsPath=.
#sonar.cxx.cppcheck.reportPath=cppcheck-result-1.xml
#sonar.cxx.xunit.reportPath=cpputest_*.xml
sonar.junit.reportPaths=junit
I would also like to get the unit test results displayed in the Sonar tools. As I am using the CppUTest framework I do not have xunit or junit test output at present, though. This can be dealt with as a separate issue, but as I have been unable to find much documentation on how to use the cfamily scanner online, I do not know if the tests not being listed is relevant.
I had forgotten to set up my CI system correctly. The .gcov files did not exist for the job that ran the sonar-scanner; they only existed in the testing job that generated the coverage report. With no files in the scanner job, it cannot produce a coverage report.
When I configured the GitLab CI system to keep the .gcov files as artefacts, the coverage reporting suddenly started working.
The .gcov files were generated by a test job and need to be transferred to the sonar-scanner job via the artefact store, because GitLab CI does not share a work area between dependent jobs and you have to say explicitly which files must be copied.
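A sketch of the relevant .gitlab-ci.yml fragment (job names, stages, and the `make coverage` script line are assumptions; only the artefact hand-off is the point):

```
test:
  stage: test
  script:
    - make coverage            # runs the tests and gcov (assumption)
  artifacts:
    paths:
      - "**/*.gcov"

sonar:
  stage: analyze
  needs: [test]                # pulls in the test job's artefacts
  script:
    - sonar-scanner
```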

Using AsConfigured and still be able to get UnitTest results in TFS

So I am running into an issue when I build my projects using the TFS build controller: with the Output location set to "AsConfigured" it will not detect my unit tests. Let me give a little info on my setup.
TFS 2013 Update 2, Default Process Template
Here are a few screenshots that can hopefully fill in what I can't in typing. I am copying my build out to a file share on our network so that other utilities can use the output. I don't want to use "PerProject" or "SingleFolder" because they mess up the file structure we have configured (both of these will run the tests). So I have the files copied to a folder named "SingleOutputFolder" which is a child of the DropLocation. I would like to be able to run the tests from the drop folder or from the bin folder (I don't care which). However it doesn't seem to detect/run ANY of the tests. Any help would be greatly appreciated. Please let me know if you need any additional information.
I have tried using **\*test*.dll, Install\SingleFolderOutput\**\*test*.dll, and $(TF_BUILD_DROPLOCATION)\Install\SingleFolderOutput\*test*.dll, but I am not sure what variables are available, or where the scope of its execution is.
Given that you're using Build Output location set to AsConfigured you have to change the default values of the Test sources spec setting to allow build to find the test libraries in the bin folders. Here's an example.
If the full path to the unit test libraries is:
E:\Builds\7\<TFS Team Project>\<Build Definition>\src\<Unit Test Project>\bin\Release\*test*.dll
use
..\src\*UnitTest*\bin\*\*test*.dll;
This question was asked on MSDN forums here.
MSDN Forums Suggested Workaround
The suggested workaround in the accepted answer (as of 8 a.m. on June 20) is to specify the full path to the test projects' binary folders. For example:
C:\Builds\{agentId}\{teamProjectName}\{buildDefinitionName}\src\{solutionName}\{testProjectName}\bin*\Debug\*test*.dll*
which really should have been shown as
{agentWorkingFolder}\src\{relativePathToTestProjectBinariesFolder}\*test*.dll
However this approach is very brittle, for the following reasons:
1. Any new test projects you add to the solution will not be executed until you add them to the build definition's list of test sources.
2. It will break under any of the following circumstances:
   - the build definition is renamed
   - the working folder in build agent properties is modified
   - you have multiple build agents, and a different agent than the one you specified in {id} runs the build
Improved Workaround
My workaround mitigates the issues listed in #2 (can't do anything about #1).
In the path specified above, replace the initial part:
{agentWorkingFolder}
with
..
so you have
..\src\{relativePathToTestProjectBinariesFolder}\*test*.dll
This works because the internal working directory is apparently the \binaries\ folder that is a sibling of the \src\ folder. Navigating up to the parent folder (whatever it is named, we don't care) and back in to \src\ before specifying the path to the test projects binaries does the trick.
Note: If you have multiple test projects, you add additional entries, separated with semicolons:
..\src\{relativePathToTestProjectONEBinariesFolder}\*test*.dll;..\src\{relativePathToTestProjectTWOBinariesFolder}\*test*.dll;..\src\{relativePathToTestProjectTHREEBinariesFolder}\*test*.dll;
What I ended up doing was adding a post-build event to each test project that copies all of the test DLLs into the staging-location folder for the specific build, which is basically equivalent to where they would go in a SingleFolder build.
if "$(TeamBuildOutDir)" == "" (
    echo "Building Interactively not in TFS"
) else (
    echo "Building in TFS"
    xcopy "$(TargetDir)*.*" "$(TeamBuildBinaries)\" /Y /E /S
)
Then I added an MSBuild parameter to the build definition that told it to drop the binaries into the folder where TFS looks for them:
/p:TeamBuildBinaries="$(TF_BUILD_BINARIESDIRECTORY)"
Kept the default Test assembly file specification:
**\*test*.dll
View this link for information on the variable that I used and the relative path at which it exists.
Another solution is to do the reverse. Leave all of the files in the root so that all of the built-in functionality works; there is more in there than just test execution. What about static code analysis and impact analysis, among others? You would have to do something custom for them all.
Instead, use a pre-drop PowerShell script to create your Install arrangement from the root files.
If it is an application, you can use the _ApplicationFolder NuGet package to create a _PublishApplications folder, the same as you get for web applications.