When using Devel::Cover, what is the difference between ./Build test and cover -test?

I'm looking to determine the coverage of some unit tests. The project requires a second directory (essentially a checkout of a GitHub repository), and when running everything I usually add the second lib directory with
use lib "/path/to/second/dir/lib";
and currently I have this in the Build.PL file (I'm using Module::Build). Running ./Build test gives a nice summary of all of the tests (about 10 files' worth); however, running cover -test gives me errors, specifically a
Can't use an undefined value as a symbol reference at XXX
where XXX is a file in that second directory.
Is this because Module::Build copies files to the blib directory? If so, does it run the tests against the files there? What are the other differences between running cover -test and ./Build test?
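One thing worth checking (a minimal sketch; the path is the placeholder from the question) is whether putting the second lib directory on PERL5LIB, so that every invocation shares the same @INC, makes the two commands behave the same:
# Sketch: export the second lib dir so both runs see the same modules.
export PERL5LIB=/path/to/second/dir/lib:$PERL5LIB
./Build test   # plain test run
cover -test    # run the suite under Devel::Cover and report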

Related

How to set the path at test discovery in Visual Studio?

How can I set the path to my external binaries during test discovery in Visual Studio's Test Explorer? And after that, how can I make sure it uses the correct paths?
I use Windows 10 and VS 2019. I have a solution that builds some binaries and some tests into different folders. Also, I have some 3rd-party dependencies, each in its own folder.
Something like:
solutionDir/
-ownBinaries/
-testBinaries/
-externalBinaries/
I'd like to use the Test Explorer to run my tests. For this purpose, I use a .runsettings file. I installed the Google Test adapter via NuGet (later it will run on CI, so this is the only option). The automatic runsettings discovery is disabled, and this file is selected as the runsettings file. It overrides the workingDir to my ownBinaries folder, and extends the PATH environment variable with the externalBinaries. The relevant parts are:
<SolutionSettings>
  <Settings>
    <AdditionalTestExecutionParam>-testdirectory=$(SolutionDir)</AdditionalTestExecutionParam>
    <WorkingDir>$(SolutionDir)ownBinaries</WorkingDir>
    <PathExtension>$(SolutionDir)externalBinaries</PathExtension>
  </Settings>
</SolutionSettings>
This works fine once my tests are discovered, but I have problems when it tries to discover them.
I use Google Test and C++, so test discovery tries to run the tests with the --gtest_list_tests argument and then populates the view with the test names, cases, etc. The binaries are fine: they build without error, I can run them from the debugger, and they produce the output I want.
But Test Explorer won't show them, because it doesn't set the externalBinaries path.
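As a manual check (a sketch using the paths that appear in the log further down), the same discovery step can be reproduced in a command prompt with the external binaries on PATH:
cd /d D:\MySolution\ownBinaries
set PATH=D:\MySolution\externalBinaries;%PATH%
rem Same listing that the adapter performs during discovery:
D:\MySolution\testBinaries\SBCUnitTest.exe --gtest_list_tests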
This is what led me to this situation.
First I copied every binary next to my test exe, namely into the testBinaries folder. Then I could run it from the cmd with the --gtest_list_tests argument. Everything was fine; all my test names showed up. I started VS, and Test Explorer discovered all my tests and was able to run them.
Then I did a clean build, so the external stuff was deleted from the testBinaries folder. Test Explorer had cached the test names, so it was still able to run them.
After restarting VS, Test Explorer tries to discover my tests, but it fails with this helpful message (date and time removed):
Google Test Adapter: Test discovery starting...
Failed to run test executable 'D:\MySolution\testBinaries\SBCUnitTest.exe': One or more errors occurred.
Check out Google Test Adapter's trouble shooting section at https://github.com/csoltenborn/GoogleTestAdapter#trouble_shooting
In particular: launch command prompt, change into directory '..\ownBinaries', and execute the following command to make sure your tests can be run in general.
D:\MySolution\testBinaries\SBCUnitTest.exe --gtest_list_tests -testdirectory=
Found 0 tests in executable D:\MySolution\testBinaries\SBCUnitTest.exe
Test discovery completed, overall duration: 00:00:00.3022924
Have you noticed that -testdirectory= is empty even though it is set in the runsettings file?
I'm completely lost as to how to proceed. The workaround of copying all the files, then deleting everything but the test binaries each time I start VS, is quite heavy.
Here is the link for the troubleshooting section mentioned in the error message.
I've read through the readme file on GitHub, and also the runsettings docs on Microsoft's website.
Edit
I made progress with vstest.console.exe; I can successfully run all my tests with the proper arguments, as below:
& "VSTest.console.exe" *_uTest.exe /Settings:..\MySolution.gta.runsettings /TestAdapterPath:"..\packages\GoogleTestAdapter.0.18.0\build\_common\"
I use the same *.runsettings and *.gta_settings_helper files. Those files are used to get absolute paths for the dependencies. I could run this from different folders, but then I had to adjust the arguments (test discovery pattern, relative path to the runsettings, and relative path to the GTA).
Great news: it successfully runs on Azure (which uses vstest.console).
Edit 2
I tried to merge the workingDir and pathExtension nodes so that only one (the pathExtension) is needed. No success.
I tried to install Test Adapter for Google Test in the VS installer, delete the runsettings file, and set the properties in VS -> Tools -> Options -> Test Adapter for Google Test. Even the example pathExtension didn't work for me.
I found the extended logs under %AppData%/Local/Temp/TestAdapter/someNumber/*.txt, and in that log I found one line containing the runsettings actually used. I paste the formatted version of the log here:
<RunSettings>
  <GoogleTestAdapterSettings>
    <SolutionSettings>
      <Settings>
        <WorkingDir>$(SolutionDir)</WorkingDir>
        <PathExtension>$(SolutionDir)externalBinaries</PathExtension>
      </Settings>
    </SolutionSettings>
    <ProjectSettings>
    </ProjectSettings>
    <GoogleTestAdapterSettings>
      <SolutionSettings>
        <Settings>
        </Settings>
      </SolutionSettings>
      <ProjectSettings>
      </ProjectSettings>
    </GoogleTestAdapterSettings>
  </GoogleTestAdapterSettings>
</RunSettings>
Does anybody know why there is an empty Google Test adapter settings block? Where does it come from? I think this is what overwrites my settings.
It turned out that, before the first run, the relative paths are not known.
Trivial solution
Add the full path to the PATH extension under Visual Studio -> Options -> Test Adapter for Google Test settings, while the custom *.runsettings file is not selected.
Using this method all my tests are discoverable, but it is a manual setting for each cloned repo.

OCaml dune: get absolute path to source directory

I have a project and there is a ./tests directory at its root containing several hundred MB of data that is used by the tests of several libraries:
./tests
./src/lib1/dune
./src/lib1/tests/dune
./src/lib1/tests/tests.ml
./src/lib2/dune
./src/lib2/tests/dune
./src/lib2/tests/tests.ml
...
I also defined tests that use the data in ./tests for each library like this:
(rule
 (alias runtest)
 (action (run ./tests/tests.exe)))
I now have to somehow communicate the location of the test data to each of my tests.exe. What is the most elegant way of doing this using dune?
It seems that dune copies my test data into _build, which is unnecessary because the data never changes, and it doesn't make sense to waste several hundred MB of space that way. From the documentation it seems that %{project_root} should contain the path to my source files, but unfortunately the variable evaluates to ., which is useless for the tests: they are run after a cd _build/default/src/libX, so . no longer points to the project root. Is there a dune way to specify the path to the original source directory without ugly hacks?
Right now I'm using an environment variable containing the full path before I run dune runtest, but is there a more integrated way?
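For reference, the environment-variable approach looks roughly like this on the OCaml side (a sketch; TEST_DATA_DIR is a name chosen here for illustration):
(* Sketch: resolve the data directory from an environment variable
   set before running [dune runtest]. *)
let data_dir =
  match Sys.getenv_opt "TEST_DATA_DIR" with
  | Some dir -> dir
  | None -> failwith "TEST_DATA_DIR not set"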
I have not tried it myself but it sounds like the data_only_dirs stanza is what you are looking for: https://dune.readthedocs.io/en/stable/dune-files.html#data-only-dirs-since-1-6
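If that is the right stanza, a minimal sketch of its use (assuming the data lives in ./tests, as above) is a one-liner in the enclosing dune file:
; Sketch: mark ./tests as a data-only directory so dune does not
; try to interpret build files inside it.
(data_only_dirs tests)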

How to run multiple test files in a Go package

I have a project structure like this:
pkg
|
--pkg.go
--pkg_test.go
--a.go
--a_test.go
--b.go
--b_test.go
--c.go
--c_test.go
I wish to get the coverage for all the source files belonging to the package, i.e. pkg.go, a.go, b.go and c.go. However, when I run:
go test -v pkg
tests are run for only one of the four Go files.
Is there any way I can test my package without moving all the test code into one file, keeping the file structure intact?
If your working directory is that of your package, you can test all of the files by running:
go test ./...
If you want test coverage as well, you can run:
go test ./... -cover
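If you want a per-file coverage report rather than just the percentage, the stock go toolchain can also write a profile and render it as HTML (both flags are standard):
go test ./... -coverprofile=coverage.out
go tool cover -html=coverage.out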

Annoying error message: cannot merge previous GCDA file

Problem:
I'm using the following flags to generate the code coverage of my Qt application (.pro file):
QMAKE_CXXFLAGS += --coverage
QMAKE_LFLAGS += --coverage
The code coverage is correctly generated; the problem is that if I want to run only one test function/class (and the GCDA files were already created) I get the following error message:
profiling: /Users/user/.../build-myapp/myclass.gcda: cannot merge previous GCDA file: corrupt arc tag (0x00000000)
Note that the error message is shown for each GCDA file. Also, note that it doesn't seem to affect the test cases.
Workaround:
As explained here, it "is a result of the build tools failing to merge current results into the existing .gcda coverage files". As answered in that question, one option is to delete the GCDA files before running the tests, for example by adding the following command to the build phase:
find . -name "*.gcda" -print0 | xargs -0 rm
Question:
My problem is that I don't want to delete the old GCDA files every time I run the test cases. As I'm running only one test function/class, I want to keep the old GCDA files as they are and only merge the GCDA file related to the current class. As I checked manually, that is already what happens: only the coverage of my current class is updated, and the old coverage remains the same.
So, is there a command to simply ignore (not show) the error messages related to the GCDA merging problems? Or, even better, a command to only update the GCDA files related to the current test class?
Note: I'm using Qt 5.3.2 on macOS Sierra with Clang.
Related questions:
Code coverage warnings spam output
How to merge multiple versions of gcda files?
.gcda files don't merge on multiple runs
When you compile with profiling, the results of each run are stored in a file that ends with .gcda.
The error appears when your existing .gcda file is in a different format than the current program that just finished running.
This happened to me when I had run a Linux version of my executable, then recompiled for macOS and ran it. The macOS program saw the existing .gcda file and generated pages of errors.
Eliminate the errors by removing the existing .gcda files.
I came across the same problem. My solution is:
There is a general Test.pro project which uses SUBDIRS to include every test project as a subproject.
In each test subproject I have the following line
QMAKE_POST_LINK = rm -f "*.gcda"
This deletes the *.gcda files only for the subproject which was just relinked. Unmodified projects keep their *.gcda files.
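If the goal is only to hide the merge messages for runs where the stale files don't matter, a filter over the output is a minimal sketch (the test binary name is a placeholder; note this merges stderr into stdout):
# Sketch: drop the libgcov merge warnings from the test output.
./mytests 2>&1 | grep -v 'cannot merge previous GCDA file'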

Using AsConfigured and still being able to get unit test results in TFS

So I am running into an issue when I build my projects using the TFS build controller: with the output location set to "AsConfigured" it will not detect my unit tests. Let me give a little info on my setup.
TFS 2013 Update 2, Default Process Template
Here are a few screenshots that can hopefully fill in what I can't in typing. I am copying my build output to a file share on our network so that other utilities can use it. I don't want to use "PerProject" or "SingleFolder" because they mess up the file structure we have configured (both of these will run the tests). So I have the files copied to a folder named "SingleOutputFolder", which is a child of the DropLocation. I would like to be able to run the tests from the drop folder or from the bin folder (I don't care which). However, it doesn't seem to detect/run ANY of the tests. Any help would be greatly appreciated. Please let me know if you need any additional information.
I have tried using **\*test*.dll, Install\SingleFolderOutput\**\*test*.dll, and $(TF_BUILD_DROPLOCATION)\Install\SingleFolderOutput\*test*.dll
But I am not sure what variables are available, and I don't understand what the scope of its execution is.
Given that you're using the build output location set to AsConfigured, you have to change the default values of the Test sources spec setting to allow the build to find the test libraries in the bin folders. Here's an example.
If the full path to the unit test libraries is:
E:\Builds\7\<TFS Team Project>\<Build Definition>\src\<Unit Test Project>\bin\Release\*test*.dll
use
..\src\*UnitTest*\bin\*\*test*.dll;
This question was asked on MSDN forums here.
MSDN Forums Suggested Workaround
The suggested workaround in the accepted answer (as of 8 a.m. on June 20) is to specify the full path to the test projects' binary folders, for example:
C:\Builds\{agentId}\{teamProjectName}\{buildDefinitionName}\src\{solutionName}\{testProjectName}\bin\Debug\*test*.dll
which really should have been shown as
{agentWorkingFolder}\src\{relativePathToTestProjectBinariesFolder}\*test*.dll
However this approach is very brittle, for the following reasons:
Any new test projects you add to the solution will not be executed until you add them to the build definition's list of test sources.
It will break under any of the following circumstances:
the build definition is renamed
the working folder in build agent properties is modified
you have multiple build agents, and a different agent than the one you specified in {id} runs the build
Improved Workaround
My workaround mitigates the issues listed in #2 (can't do anything about #1).
In the path specified above, replace the initial part:
{agentWorkingFolder}
with
..
so you have
..\src\{relativePathToTestProjectBinariesFolder}\*test*.dll
This works because the internal working directory is apparently the \binaries\ folder, which is a sibling of the \src\ folder. Navigating up to the parent folder (whatever it is named; we don't care) and back into \src\ before specifying the path to the test projects' binaries does the trick.
Note: If you have multiple test projects, you add additional entries, separated with semicolons:
..\src\{relativePathToTestProjectONEBinariesFolder}\*test*.dll;..\src\{relativePathToTestProjectTWOBinariesFolder}\*test*.dll;..\src\{relativePathToTestProjectTHREEBinariesFolder}\*test*.dll;
What I ended up doing was adding a post-build event to each test project that copies all of the test .dlls into a staging folder inside the build, basically equivalent to where they would go in a SingleFolder build.
if "$(TeamBuildOutDir)" == "" (
echo "Building Interactively not in TFS"
) else (
echo "Building in TFS"
xcopy "$(TargetDir)*.*" "$(TeamBuildBinaries)\" /Y /E /S
)
Then I added an MSBuild parameter to the build definition that told it to drop the files into the folder TFS looks in for them:
/p:TeamBuildBinaries="$(TF_BUILD_BINARIESDIRECTORY)"
I kept the default test assembly file specification:
**\*test*.dll
See this link for information on the variable I used and the relative path at which it exists.
Another solution is to do the reverse.
Leave all of the files in the root so that all of the built-in functionality works. There is more than just test execution in there: what about static code analysis, impact analysis, among others? You would have to do something custom for them all.
Instead, use a pre-drop PowerShell script to create your Install arrangement from the root files.
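A sketch of such a pre-drop script (it uses the TF_BUILD_BINARIESDIRECTORY variable mentioned above; the folder names and file pattern are examples, not a definitive layout):
# Sketch: rearrange the flat build output into the desired Install layout.
$binaries = $env:TF_BUILD_BINARIESDIRECTORY          # root of the built files
$install  = Join-Path $binaries 'Install\SingleFolderOutput'
New-Item -ItemType Directory -Force -Path $install | Out-Null
# Copy the test assemblies (pattern is an example) into the Install folder.
Get-ChildItem $binaries -Filter '*test*.dll' | Copy-Item -Destination $install -Force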
If it is an application, then you can use the _ApplicationFolder NuGet package to create a _PublishApplications folder, the same as you get for web applications.