How to make MATLAB xUnit work on MATLAB R2008b (7.7)? - unit-testing

I copied the matlab_xunit folder to C:\Program Files, and included it (and its subfolders) on the MATLAB path. Now MATLAB recognizes new commands such as
runtests
But this command does not find any tests in the current folder. What have I done wrong? What else can I do?
>> runtests
Starting test run with 0 test cases.
PASSED in 0.000 seconds.

I am the creator of MATLAB xUnit. The most likely explanation for what you are seeing is some problem in the test files. Can you post a sample test file so I can look at it?
If you are writing subfunction-style test files, do any files in your current directory start with "test" or "Test"? Does the file contain any subfunctions that begin with "test" or "Test"? When you call one of those files with no input arguments and a single output argument, does it return a TestSuite object? If not, then double-check the documentation about creating subfunction tests.
Are you instead writing test files that subclass TestCase? Do they contain methods that begin with "test" or "Test"?
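For reference, a minimal subfunction-style test file is sketched below (myAdd is a hypothetical function under test; the sketch assumes the file is saved as testMyAdd.m in the working directory):
function test_suite = testMyAdd
% Subfunction-style MATLAB xUnit test file. Called with no inputs and one
% output, initTestSuite makes it return a TestSuite object.
initTestSuite;

function testAddTwoPositives
assertEqual(myAdd(1, 2), 3);   % myAdd is the hypothetical function under test

function testAddNegativeAndPositive
assertEqual(myAdd(-1, 1), 0);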

The documentation on the File Exchange page for the MATLAB xUnit Test Framework submission should help. It says that you have to put your test-case M-files in a folder and then make that folder your working directory using cd.
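In other words, something along these lines at the MATLAB prompt (the folder path is illustrative):
cd('C:\work\my_tests')   % folder containing your test*.m or Test*.m files
runtests                 % discovers and runs the tests in the current folder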

Related

When using Devel::Cover, what is the difference between ./Build test and cover -t?

I'm looking to determine the coverage of some unit tests. The project requires a second directory (basically from a GitHub repository), and when running everything I usually add the second lib directory with
use lib "/path/to/second/dir/lib";
and currently I have this in the Build.PL file (I'm using Module::Build). Running ./Build test gives a nice summary of all of the tests (about 10 files' worth); however, running cover -test gives me errors, specifically a
Can't use an undefined value as a symbol reference at XXX
which refers to a file in that second directory.
Is this because Module::Build appears to copy files to the blib directory? If so, does it use the files there? What are the other differences between running cover -test and ./Build test?

Why does test.h.gcov show the content as /EOF/?

I'm in the process of generating coverage information for a C++ application, using gtest for the tests; the application is part of a Buildroot build. I got correct coverage information for almost all files except the header files, so I searched and found the link below useful, though it did not fully answer my question.
gcov is not generating coverage information for header files
From the link, I understood that to obtain "mymoneyaccount.cpp.gcov" you execute "gcov mymoneyaccount.cpp", but to obtain "mymoneyaccount.h.gcov" you need to execute "gcov mymoneyaccounttest.cpp". My first question: does anyone have an idea why we need to run gcov on the test file to generate the .gcov of the header file? Both files include "mymoneyaccount.h".
My exact scenario is that I'm trying to get the code coverage of source files located in the "Source" folder, while the test application is located in the "Test" folder.
The folder structure is as follows.
Source
  a
    logic.cpp
    logic.h
Test
  a
    logictest.cpp
Both Source and Test are part of the Buildroot build system; I cross-compiled them and ran the tests on a Raspberry Pi 3. As part of the compilation, logic.cpp.gcno and logictest.cpp.gcno files were generated on the build PC, and as part of execution on the Raspberry Pi 3 the corresponding .gcda files were generated. I copied the .gcda files from the Pi to the corresponding locations on the build PC, executed gcov on the .gcno files and got the coverage details.
The coverage in logic.cpp.gcov is valid, and the .h.gcov in the Test directory looks right, whereas the .h.gcov in the Source directory does not. The .h.gcov in the Source directory shows line counts alongside the exact source code (but not valid coverage data), while the .h.gcov in the Test directory shows line counts but the code as /EOF/ (which I hope is the valid coverage data). I hope it is clear now. Is there any way to view the content of the .h.gcov in the Test directory with the exact code? Only then can we be sure that we have valid coverage data and implement more test cases for the remaining code. Thank you.
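For reference, the invocations described above amount to roughly the following sketch (paths are illustrative; it assumes the .gcno/.gcda pairs sit next to the sources, otherwise point gcov at the object directory with its -o option):
(cd Source/a && gcov logic.cpp)        # produces logic.cpp.gcov and logic.h.gcov in Source/a
(cd Test/a && gcov logictest.cpp)      # header lines exercised by the test are attributed to the .gcov files written in Test/a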

gtest unit tests target configuration file path

In my C++ application, I have a text file (dataFile.txt) that is installed on the Linux target machine in the following path:
/SoftwareHomeDir/Configuration/Application/dataFile.txt
This file exists on my Rational ClearCase source code environment under the path:
/ProjectName/config/Application/dataFile.txt
I am developing a unit test in gtest that does the following:
Read specific data from dataFile.txt; if the data does not exist, write it into the file.
1) I want to avoid creating an environment variable to check whether I am in the compilation environment or on the target machine, and then adding extra test code to the final release. I really want to keep test code separate from final code.
2) I am not using any IDE (no Visual Studio, no Qt, etc.), just Notepad++.
3) The compilation server is shared (access is per username, but the root folder "/" is shared), which means that if I create the path "/SoftwareHomeDir/Configuration/Application/dataFile.txt" it will be visible to all users, and if another user runs his gtest unit test he may overwrite my file.
4) In the final code the path to the dataFile is hard-coded, and it would be very costly (a few seconds per run) to implement a filesearch(filename) method that looks for the file across the entire hard drive before reading it.
Question:
I am looking for a solution to unit-test my code in the compilation environment using /ProjectName/config/Application/dataFile.txt instead.
The solution to my problem was to combine gmock with gtest, as described at the link below:
https://github.com/google/googletest/blob/master/googlemock/docs/CookBook.md#delegating-calls-to-a-fake
The only modification I made to my code is that instead of defining the path to the configuration data with a #define, I created a function getConfigFilePath() that returns the hard-coded path of the configuration file in the installed application. From there, I mocked the class, and in my mock I call a fake getConfigFilePath() that returns, while the real code is executing under test, the hard-coded path of the config file in the ClearCase project tree. This is precisely what I was looking for.
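A minimal sketch of that arrangement, following the "delegating calls to a fake" recipe from the linked cookbook (the class names ConfigSource, InstalledConfigSource, FakeConfigSource and the test itself are illustrative, not the poster's actual code):
#include <gmock/gmock.h>
#include <gtest/gtest.h>
#include <string>

// Illustrative interface wrapping the path lookup (replaces the old #define).
class ConfigSource {
 public:
  virtual ~ConfigSource() = default;
  virtual std::string getConfigFilePath() const = 0;
};

// Production implementation: the path used on the installed target machine.
class InstalledConfigSource : public ConfigSource {
 public:
  std::string getConfigFilePath() const override {
    return "/SoftwareHomeDir/Configuration/Application/dataFile.txt";
  }
};

// Fake used in the build environment: points at the ClearCase project tree.
class FakeConfigSource : public ConfigSource {
 public:
  std::string getConfigFilePath() const override {
    return "/ProjectName/config/Application/dataFile.txt";
  }
};

// Mock that delegates to the fake, as in the cookbook recipe.
class MockConfigSource : public ConfigSource {
 public:
  MOCK_METHOD(std::string, getConfigFilePath, (), (const, override));

  void DelegateToFake() {
    ON_CALL(*this, getConfigFilePath())
        .WillByDefault(::testing::Invoke(&fake_, &FakeConfigSource::getConfigFilePath));
  }

 private:
  FakeConfigSource fake_;
};

TEST(DataFileTest, ReadsConfigFromProjectTree) {
  ::testing::NiceMock<MockConfigSource> config;
  config.DelegateToFake();
  // The real code under test would take a ConfigSource& and call getConfigFilePath().
  EXPECT_NE(config.getConfigFilePath().find("/ProjectName/"), std::string::npos);
}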

Generate test results using xUnit in a VSO build task for an ASP.NET Core app

I have this build:
It works fine. The only issue is that the test results are overwritten, so I actually end up with the results of only the last test project executed.
This is what the build engine executes:
C:\Program Files\dotnet\dotnet.exe test C:/agent/_work/4/s/test/Services.UnitTests/project.json --configuration release -xml ./TEST-tle.xml
C:\Program Files\dotnet\dotnet.exe test C:/agent/_work/4/s/test/Web.UnitTests/project.json --configuration release -xml ./TEST-tle.xml
What could help:
1) Having "dotnet test" generate an XML output file with a unique name per project; I did not find a way to do that.
2) Using a variable for the -xml output file in the build task. That variable could be a random string/number, or just the name of the project being tested (i.e. what the build engine feeds to "dotnet.exe test"). I found no way to do that either.
Any ideas? Thanks.
I think that, although you're running the task against all of the projects in one go, because the .NET Core (Preview) task doesn't have a working directory setting, the test results are being generated at the solution root (or similar) and overwritten for each project in turn.
I set mine up using simple command line tasks...
Tool: dotnet
Arguments: test -xml testresults.xml
Working folder: {insert the folder for the project to test here}
These work fine, but I have one set up for each project. You could try creating a task for each test library and adding the full path to the test results argument (or naming them appropriately, as starain suggested).
This feels like a minor bug to me.
Based on my test, it doesn’t recognize the date variable as Build Number.
To deal with this issue, you can add another .NET Core (Test) step to run the xunit tests with a different result file.
For example:
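A sketch of the idea, with illustrative project paths based on the commands shown earlier, each step writing its own result file:
dotnet test test/Services.UnitTests/project.json --configuration release -xml ./TEST-Services.UnitTests.xml
dotnet test test/Web.UnitTests/project.json --configuration release -xml ./TEST-Web.UnitTests.xml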

Which HWUT files to put under version control

I am using HWUT for unit testing and want to put my tests under version control. Adding the test code and the GOOD folder is obvious. But what about other files, e.g. the ADM folder?
NEEDED
GOOD/*
hwut-info.dat: If you specify it.
Makefile: If you specify it.
Your test scripts and source files that implement the test.
ADM/cache.fly: Optional; only check it in if queries on past test runs should work without re-running the tests.
NOT TO BE CHECKED IN
OUT/*
Any result produced by 'make'
Any temporary log files
Note: SCMs usually have an ignore property or an 'ignore' file. You may adapt it according to the information above.
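For example, assuming Git is the SCM, an ignore file along these lines would match the lists above (adjust the paths to your layout):
# generated output and temporary logs
OUT/
*.log
# ADM cache: remove the '!' line if you do not check in cache.fly
ADM/*
!ADM/cache.fly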