Parsing Django test results

I recently wrote some tests for one of my Django projects. What I now want to do is call the test command from a script.
I am looking to parse the test results and save them. Is that at all possible with the Django testing framework?

The easiest way would be to use a standard test output format, such as JUnit XML, for which there are already libraries. Right now, I'm using django-jenkins, which provides a nice output that I can view in our CI tool.
If you'd like to roll your own solution, I'd recommend writing your own test runner and customizing its suite_result method.
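For example, here is a minimal sketch of such a runner, assuming a recent Django with DiscoverRunner; the class name, module path and output file name are my own choices, not anything prescribed by Django:

import json
from django.test.runner import DiscoverRunner

class JSONResultRunner(DiscoverRunner):
    """Runs the suite as usual, then dumps a summary to test_results.json."""

    def suite_result(self, suite, result, **kwargs):
        summary = {
            "total": result.testsRun,
            "failures": [str(test) for test, _ in result.failures],
            "errors": [str(test) for test, _ in result.errors],
            "skipped": [str(test) for test, _ in result.skipped],
        }
        with open("test_results.json", "w") as out:
            json.dump(summary, out, indent=2)
        # Preserve the default behaviour: return the number of failed/errored tests
        return super().suite_result(suite, result, **kwargs)

You would then point TEST_RUNNER at this class in settings.py (e.g. TEST_RUNNER = 'myproject.runners.JSONResultRunner', with whatever module path you actually use), and ./manage.py test will write the JSON summary alongside the normal console output.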

Related

What is the equivalent of autotest/guard for django

When I code in Ruby on Rails, I rely on Guard to listen for changes to the code base so when I'm writing tests, I don't need to manually run the tests in the file I'm working on each time.
https://github.com/guard/guard-rspec
What is the closest thing to this for Django so I can enjoy the same workflow?
Specifically, I want something that can:
- run tests based on the files I have changed, and not the whole suite
- know whether to run the test command based on whether a test run is currently taking place
- work with existing tests written with unittest
- work with something like factory boy so I can use factories instead of fixtures (see the sketch at the end of this question)
I've used nose and pytest before and I'm comfortable using both, but I haven't used many of pytest's extensive set of libraries.
What are my options for this?
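To make the factory requirement concrete, here is a minimal sketch of factories replacing fixtures with plain unittest and factory_boy; the User class and UserFactory are illustrative stand-ins, not part of my project:

import unittest
import factory

class User:
    def __init__(self, username, email):
        self.username = username
        self.email = email

class UserFactory(factory.Factory):
    class Meta:
        model = User

    username = factory.Sequence(lambda n: "user%d" % n)
    email = factory.LazyAttribute(lambda obj: "%s@example.com" % obj.username)

class UserTests(unittest.TestCase):
    def test_email_is_derived_from_username(self):
        # No fixture files: the factory builds a fresh object for each test
        user = UserFactory()
        self.assertTrue(user.email.startswith(user.username))

if __name__ == "__main__":
    unittest.main()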

Create a simple unit test framework from scratch in ColdFusion

I know there are existing tools for testing a ColdFusion application (MXUnit, MockBox), but I'm creating a custom tool, so that it will require less configuration.
When I run a unit test file, it's done via a generic 'model' which retrieves all functions from the unit test file. Within each test function, I have to call assertEquals -- but these functions are in the model, so I cannot access them.
I tried passing the model itself to the unit test file so it can call the model's functions directly, but it doesn't work, and it adds logic to the test file, which I don't like.
I can also extend the model in the test file, but then I have to call the test file directly, call super.init(this) so the model can fetch the test functions, and so on.
Is there a way to achieve this kind of process? What's the best option?
In answer to your question, it sounds like you want to inject variables / methods into the subject under test. You can do it like so:
myInstance["methodName"] = myFunction;
You can then call the injected method like so:
myInstance.methodName();
Both MXUnit and TestBox use this technique.
Having said that, I don't quite understand why you want to reinvent the wheel. TestBox is an excellent, proven testing framework with a wealth of features which would take you an incredible amount of time to replicate. I'm not quite sure what your configuration issue could be - it really doesn't require very much setup. It might be worth asking how to set up and use TestBox rather than how to build your own testing solution :)
There is a good book on TestBox (available in a free version) which you can read here: http://testbox.ortusbooks.com/
Good luck!

Generate python unit test document

My bosses want a list of all the unit tests with their descriptions. Since this will change frequently, I'd like to find a way to generate it instead of trying to keep it up to date manually. I am using Python for this project. Is there some way to make doxygen or some other tool do this?
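If doxygen turns out to be awkward for this, one low-tech alternative is a short script. Here is a minimal sketch that discovers unittest test cases and prints each test ID with the first line of its docstring; the "tests" start directory is an assumption, adjust it to your layout:

import unittest

def iter_tests(suite):
    # TestSuite objects can nest, so walk them recursively
    for item in suite:
        if isinstance(item, unittest.TestSuite):
            yield from iter_tests(item)
        else:
            yield item

if __name__ == "__main__":
    suite = unittest.defaultTestLoader.discover("tests")
    for test in iter_tests(suite):
        # shortDescription() returns the first line of the test's docstring, or None
        print("%s: %s" % (test.id(), test.shortDescription() or "(no description)"))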

Unit testing/continuous integration with Simulink/Stateflow

How can I perform unit testing in Simulink, or preferably, Stateflow?
I'm a fan of agile software methods, including test-driven development. I'm responsible for the development of safety-critical control software and we're using MATLAB/Simulink/Stateflow to develop it. This toolset was selected because of the link with plant (hardware) models (model-in-the-loop, hardware-in-the-loop).
I have found some links on Stack Overflow: Unit-testing framework for MATLAB: xunit, slunit and doctest.
Does anyone have experience in using those or different unit test frameworks?
How can I link this to continuous integration systems (e.g. Hudson)?
EDIT: This is now much easier and getting easier all the time with the Jenkins plugin for MATLAB
ORIGINAL ANSWER:
As Craig mentioned, there is indeed a unit testing framework in MATLAB, introduced in R2013a. Furthermore, this framework added a TAPPlugin in R2014a which outputs the Test Anything Protocol. Using that protocol you can set up your CI system (e.g. Jenkins, TeamCity) with a TAP plugin so that the CI system can fail the build if the tests fail.
Your CI build may look like a shell command to start MATLAB and run all your tests:
/your/path/to/matlab/bin/matlab -nosplash -nodisplay -nodesktop -r "runAllMyTests"
Then runAllMyTests creates the suite and runs it, with the TAP output redirected to a file. You'll need to tweak the specifics here, but perhaps this can help you get started:
function runAllMyTests
import matlab.unittest.TestSuite;
import matlab.unittest.TestRunner;
import matlab.unittest.plugins.TAPPlugin;
import matlab.unittest.plugins.ToFile;

try
    % Create the suite and runner
    suite = TestSuite.fromPackage('packageThatContainsTests', 'IncludingSubpackages', true);
    runner = TestRunner.withTextOutput;

    % Add the TAPPlugin directed to a file in the Jenkins workspace
    tapFile = fullfile(getenv('WORKSPACE'), 'testResults.tap');
    runner.addPlugin(TAPPlugin.producingOriginalFormat(ToFile(tapFile)));

    runner.run(suite);
catch e
    disp(e.getReport);
    exit(1);
end
exit force;
EDIT: I used this topic as the first two posts of a new developer oriented blog launched this year
Unit testing Simulink is not straightforward, unfortunately. MathWorks has SystemTest. Alternatively, you can roll your own Simulink testing framework, which is the approach that we've followed and is not too difficult, but you may need to build test harnesses programmatically.
In order to integrate with CI, you need to create a function/script that executes all the tests, then you can use the command-line parameters for MATLAB.exe to run a script on start-up. I'm not sure anyone has a good way of integrating the test reports with the CI software, though. Just look at the number of comments in Unit-testing framework for MATLAB.
With R2015a, MATLAB introduces a new product named "Simulink Test". Perhaps that'll simplify this mess.
http://www.mathworks.com/products/simulink-test/features.html#manage-test-plans-and-test-execution
R2016b introduces integration between Simulink Test and the MATLAB unit testing framework. Tests created with Simulink Test using the Test Manager (*.mldatx) are recognized by, and can be run natively with, the MATLAB unit test runner, so you can generate JUnit-style XML test results or TAP test results, facilitating continuous integration workflows.
See this reference for more information: https://www.mathworks.com/help/sltest/ug/run-test-files-using-matlab-unit-test.html?s_tid=gn_loc_drop
The documentation shows an example of producing TAP results using matlab.unittest.plugins.TAPPlugin but you can use XMLPlugin (https://www.mathworks.com/help/matlab/ref/matlab.unittest.plugins.xmlplugin-class.html) instead just as easily.
This also opens up better integration within just the MATLAB environment, even without CI in the picture, since MATLAB and Simulink tests can live in the same test suite and run together seamlessly. For example, if you have a directory MYDIR with both native MATLAB unit tests and Simulink tests, you can do something as simple as the following to execute both kinds of tests in one shot:
results = runtests(MYDIR)
If your system is complex, you should decompose it using Model Reference and test each of these independently.
Another solution (more "old school") is to put your main blocks in a library and to create small models.
To test these submodels, especially those containing a state machine (Stateflow), the best approach is to create temporal test cases with the Signal Builder block. There is a powerful function, signalbuilder, for interacting with this block and loading test cases. My method is to have, for each case of each submodel, an input file and an output file. The model's outputs are the "correct" outputs and the ones produced by the blocks. The model is run with sim (no external inputs) and the two outputs are compared by a script that indicates which signal differs (and when).
You could use an existing system, but I prefer to use my own to run each case (or just some of them).
I don't have any public code for this, but that's the approach I use. I don't use a CI system, so I can't answer the second part of your question.
MATLAB (since R2013a) has built-in support for xUnit-style testing, in the form of the unit testing framework.
I haven't used it, but since it's possible to run Simulink from MATLAB with sim(), this framework can be used to test your Simulink models. Your libraries, and possibly your models, will need a wrapper to execute them, as the other answerers have noted.
There are plenty of examples on the MathWorks site; unfortunately, none of them run Simulink models. I'd code an example for you, but I don't have R2013b :-(
To initiate your tests from a CI system (I use Jenkins), you can call MATLAB to run a .m file that runs your test suite. This example cmd script will call Run_Tests.m from MATLAB:
IF EXIST "C:\Program Files (x86)\MATLAB\R2013b\bin\win32\matlab.exe" (
REM WinXP
"C:\Program Files (x86)\MATLAB\R2013b\bin\win32\matlab.exe" -r "Run_Tests;exit" -logfile matlab.log
) ELSE (
REM Win7
"C:\Program Files\MATLAB\R2013b\bin\win32\matlab.exe" -r "Run_Tests;exit" -logfile matlab.log
)
Note that if a startup.m exists in the directory from which you call MATLAB, it will be executed automatically before Run_Tests.m.
I think you are searching for something like EZTEST. It is intended for exactly your purpose: test-driven development for Simulink and Stateflow at the unit level. For safety-critical software, a Safety Manual is also included, which describes what is covered with regard to ISO 26262.
Edit: I am a developer of this software, so my opinion may be biased, but I am not involved in the marketing or sale of the product. I am just posting this because I know that there are few, if any, unit test frameworks out there that meet the questioner's needs (as the other answers might suggest).
Testing units using SIL and PIL is also supported. Unfortunately, I am not familiar with Hudson, so I cannot address that part of the question.
I've seen different solutions to the problem of unit testing Simulink models: Simulink Verification & Validation, which did not support xUnit concepts such as test runners and test suites at the time I examined it, and TPT, which is overloaded with functionality, not easy to use, and very hard in terms of changeability and maintainability.
Furthermore, I've seen custom solutions with MATLAB scripts and Excel tables which were lightweight but also difficult in terms of understandability and maintainability. I still would not recommend any of these approaches, at least not for unit testing.
In the end, we used a C unit testing framework (CUnit) to test the generated code. While this definitely has the disadvantage that you have to generate code before testing, it also has a lot of advantages: easy integration into CI systems, high flexibility in writing unit tests, fast test execution and, last but not least, refactoring capabilities in terms of switching from Simulink to another model-based environment or to hand-written code. Especially the last point should not be underestimated, since I have seen many Simulink models that should have been hand-written modules in the first place. Nowadays, I'd recommend using GoogleTest instead of CUnit.

DbUnit for C++?

We're developing in C++ under Linux and are about to set up automated tests. We intend to use a testing framework like CppUnit or CxxTest. We're using Ant to build the software and we will also use it to run the tests.
As some tests are going to involve database access, we are looking for a tool or framework which facilitates the tasks of preparing and cleaning up test data in the database - just like DbUnit (a JUnit extension) in the Java world.
Another option may be to employ the actual DbUnit - a Java VM is available. Making use of DbUnit's Ant task seems to be most promising. Any related field reports are welcome!
I would recommend Boost unit testing (Boost.Test). You would probably have to use setup and teardown to manually clean up the database. Of course, you could build your own C++ DbUnit on top of ODBC. If you do, let me know, because I could use it as well!
I suppose you have your own C++ API for working with the database.
If that is true, you'd be better off doing all your DB preparation yourself. That way you test your DB API as well.
As there seems to be no DbUnit-like tool for C++ development, we've built a little framework of our own. Basically it's an adapter for calling actual DbUnit operations from within C/C++ test runners. It makes use of the Ant tasks provided by DbUnit.
We defined some macros like TS_DB_INSERT(filename) which call system("ant -Ddb.dataset=filename db.insert") and the like.
In this case, db.insert is an Ant target which executes a DbUnit task performing an INSERT operation on the database. The filename references an XML dataset containing the data to insert.
There's also an assertion macro which wraps a DbUnit compare.
The test case might look like this:
void testDatabaseStuff()
{
    TS_DB_INSERT("input.xml");
    TestedClass::doSomething();
    TS_DB_ASSERT("expected.xml");
}