I need to assign my test results to use cases.
Currently I have TestNG tests for my classes (unit tests).
Obviously those test classes exist because of use cases, but there is no obvious 1:1 mapping.
Is it possible to configure TestNG reports to include custom groups in reporting?
Like
F02UC01 parse input
for this use case I have test classes:
com.company.product.input.ParseTest
F03UC02 produce output
for this use case I have the test classes:
com.company.product.input.OutputTest
com.company.product.input.AnotherOutputTest
Ideally, I do not want to rerun or rewrite existing tests. I just want another test report, with different grouping criteria.
Usually you do this by creating a suite XML file, e.g.:
<!DOCTYPE suite SYSTEM "http://testng.org/testng-1.0.dtd" >
<suite name="Use Case Tests">
<test name="F02UC01 parse input">
<classes>
<class name="com.company.product.input.ParseTest"/>
</classes>
</test>
<test name="F03UC02 produce output">
<classes>
<class name="com.company.product.input.OutputTest"/>
<class name="com.company.product.input.AnotherOutputTest"/>
</classes>
</test>
</suite>
Maybe you can select them by packages instead of classes.
<packages>
<package name="com.company.product.input.*"/>
</packages>
You can also mix class and package selection; see the TestNG documentation.
EDIT
I am running all tests anyway. I just want another report, where the tests are grouped into use cases.
I guess in this case you have to implement your own IReporter.
I would create an annotation that I can add to test methods to logically group them. E.g.
@Test
@TestTag("F02UC01 parse input")
public void someTest() {
}
and then use the IAnnotationFinder in my custom reporter to report tests grouped by the annotation's value.
Related
I am using VisualPHPUnit and I am trying to organize my tests into suites (due to the fact that Selenium IDE does not export PHPUnit test suites).
I am currently using the configuration XML file option.
Yet it is very limiting, because I want to run test suites on demand and not upload a new XML file (i.e. a new test suite) to the server each time.
I know that I can create an XML file with many test suites in it, but I would like to run them individually.
As you can tell, I am striving for the Don't Repeat Yourself principle and code reuse. You know, just choose login.php, then the test case, then logout.php, and run them.
Is something like this possible?
Moreover, would it be difficult for VisualPHPUnit to parse a single XML file and create a dropdown box of test suites to choose from and run?
This is my XML file
<phpunit>
<!-- This is required for VPU to work correctly -->
<listeners>
<listener class="PHPUnit_Util_Log_JSON"></listener>
</listeners>
<testsuites>
<testsuite name="TestSuite1">
<file>/var/www/VisualPHPUnit/app/unitTests/Login/Login.php</file>
<file>/var/www/VisualPHPUnit/app/unitTests/CreateCourse/CreateCourse1.php</file>
<file>/var/www/VisualPHPUnit/app/unitTests/Logout.php</file>
</testsuite>
</testsuites>
</phpunit>
Actually, the PHP formatters are once again available for Selenium IDE. You may want to check this out:
https://addons.mozilla.org/en-us/firefox/addon/selenium-ide-php-formatters/
There are two PHP formatters: PHP_unit and PHP_Selenium.
I am using Unity for IoC. I would like to configure the creation of Moq objects using the app.config.
My config looks something like this:
<unity xmlns="http://schemas.microsoft.com/practices/2010/unity">
<container>
<register type="Namespace1.IFoo, FooInterface"
mapTo="Namespace2.FooImp, FooImplementation">
</register>
</container>
</unity>
I am looking for a technique to specify a configuration something like this:
<unity xmlns="http://schemas.microsoft.com/practices/2010/unity">
<container>
<register type="Namespace1.IFoo, FooInterface"
mapTo="Moq.Mock<IFoo>, Moq">
</register>
</container>
</unity>
I know that I have to access the .Object property of a Mock, but this sample is just meant to explain what I want to do.
In other words: I do not want to use code to configure Unity to use Moq. An option would of course be to create some general-purpose helpers.
After no one could help, I figured something out myself:
use a factory for Moq and add a factory resolution via Unity.
sample goes here
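A minimal sketch of that factory idea, assuming Unity 2.x (Microsoft.Practices.Unity with InjectionFactory) and Moq; the RegisterMock helper below is illustrative and not part of either library:

using Microsoft.Practices.Unity;
using Moq;

public static class MoqRegistrationExtensions
{
    // Registers T so that resolving it yields a Moq-generated fake,
    // while keeping the Mock<T> itself resolvable for Setup/Verify calls.
    public static IUnityContainer RegisterMock<T>(this IUnityContainer container)
        where T : class
    {
        container.RegisterInstance(new Mock<T>());
        container.RegisterType<T>(
            new InjectionFactory(c => c.Resolve<Mock<T>>().Object));
        return container;
    }
}

// Usage:
// var container = new UnityContainer().RegisterMock<IFoo>();
// IFoo foo = container.Resolve<IFoo>();              // the mock's .Object
// Mock<IFoo> mock = container.Resolve<Mock<IFoo>>(); // for Setup(...) / Verify(...)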
Hi, I would like to reach out to the community for insight and advice
on the approach to Test-Driven Development for the work I'm carrying out.
I am working on an ASP.NET MVC3 project that parses a physical XML file (containing chart and table data).
First the app generates a model representation of the XML nodes.
The controller is there to carry out the application logic,
which ultimately renders a specific HTML view with charts and tables.
I am thinking that I will build a model that represents the XML, i.e. classes like dataset, header, dimension, etc., with
appropriate interfaces. Is this the right approach? (Please see the sample XML below.)
What sort of unit tests would I write?
Would I start with unit tests that access the physical XML files (probably not)?
Should I stream fragments of XML strings into an XDocument? (Isn't that testing .NET code?)
Presuming I don't want to create concrete XDocument instances, how do I mock out the object?
The first test I want to write (I think) is to load the XML and check that the end_date is correct.
I have an XmlHelper class that loads the XML and returns a class representation of a header with an EndDate property.
So my concrete code would look roughly like:
var dataset = XmlHelper.Load(filePathOrXmlStream);
var header=dataset.Header;
Assert.AreEqual("5/06/2011",header.EndDate);
Presume that the XML below is used for the stream or file load.
XML Source
<dataset>
<header>
<end_date>5/06/2011</end_date>
<dimension id="mkt" desc="market">
<item mkt="0" desc="Company A" />
<item mkt="1" desc="Company B" />
</dimension>
<dimension id="prd" desc="product">
<item prd="0" desc="Product A " Groups_Total="Segment Totals" Total="Yes" Product="All" grp="Category" />
</dimension>
<dimension id="msr" desc="measure">
<item msr="0" tag="ACTIVE_1" desc="Active Products" />
</dimension>
<dimension id="tim" desc="time">
<item tim="0" tag="LAST WEEK -52" desc="06/06/10 " />
<item tim="1" tag="LAST WEEK -26" desc="05/12/10 " />
<item tim="2" tag="LAST WEEK 0" desc="05/06/11 " />
</dimension>
</header>
<data>
<dpGroup tim="0">
<dp mkt="0" prd="0" msr="0" tim="0">1031</dp>
<dp mkt="1" prd="0" msr="0" tim="0">986</dp>
</dpGroup>
<dpGroup tim="1">
<dp mkt="0" prd="0" msr="0" tim="1">970</dp>
<dp mkt="1" prd="0" msr="0" tim="1">937</dp>
</dpGroup>
<dpGroup tim="2">
<dp mkt="0" prd="0" msr="0" tim="2">982</dp>
<dp mkt="1" prd="0" msr="0" tim="2">955</dp>
</dpGroup>
</data>
</dataset>
I would do the most important test first:
Given a model representation of the XML,
when the user asks for HTML output,
the controller should produce the correct view with chart/table.
Making and passing that test will make you think about overall design too. After that, it will be branch & bound.
I think you are approaching the problem properly. There are really 2 separate steps in your process:
1) transform the XML document into a class representation, a Model
2) render a Model to a View
The part where TDD should work well is step 2, because you are dealing with objects. You can then follow the path outlined by Taesung Shin. You can define what the interface for your object is if needed, and have an IChartModel with, say, a StartDate property, which you can then mock, set the StartDate to whatever you want, and write assertions about what should be true of the View in that case. As Taesung said, this will help you drive your design.
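As a minimal sketch of such a test, assuming MSTest-style attributes and Moq; IChartModel and its StartDate property come from the discussion above, while ChartController and its Index action are placeholder names, not an existing API:

using System;
using System.Web.Mvc;
using Microsoft.VisualStudio.TestTools.UnitTesting;
using Moq;

[TestClass]
public class ChartControllerTests
{
    [TestMethod]
    public void Index_RendersViewForGivenModel()
    {
        // Arrange: a mocked model, no XML file involved at all.
        var chartModel = new Mock<IChartModel>();
        chartModel.SetupGet(m => m.StartDate).Returns(new DateTime(2011, 6, 5));

        var controller = new ChartController();

        // Act
        var result = controller.Index(chartModel.Object) as ViewResult;

        // Assert: a view is returned and the model is passed through untouched.
        Assert.IsNotNull(result);
        Assert.AreSame(chartModel.Object, result.ViewData.Model);
    }
}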
The part where TDD will not work that well is step 1. Unit tests shine when you can work entirely in memory, and by definition a file on disk does not fit that context. What I would do then, if you thought it was worth the effort, would be to have sample files and test your XmlReader against these files, to make sure that you are reading what you should and properly populating the inputs for step 2. These will not be "proper" unit tests, but rather integration tests. I would tend to create a "happy file" with proper inputs, and possibly files for potentially malformed cases. As you encounter bugs over time, you can start adding new files. These tests wouldn't be fun to write, though.
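A sketch of such a file-backed test, reusing the hypothetical XmlHelper/Header API from the question and MSTest-style assertions; the TestData folder and file name are examples only:

using System.IO;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class XmlHelperIntegrationTests
{
    [TestMethod]
    public void Load_HappyFile_PopulatesHeaderEndDate()
    {
        // The "happy" sample file lives in the test project and is copied to the output folder.
        var path = Path.Combine("TestData", "dataset-happy.xml");

        var dataset = XmlHelper.Load(path);

        Assert.AreEqual("5/06/2011", dataset.Header.EndDate);
    }
}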
If you are going to create that XML file in your application, you may consider having tests where you create these files and read them back; that might give you more "code control" over what is going on, as opposed to having to maintain fixed files over time as your design evolves.
The biggest benefit of this separation, in my opinion, is that by splitting how you want the data to be structured and used in your MVC app from how you get that data out of an XML file, you end up with two distinct layers. If you later decide to pull the data from SQL instead, or to change the structure of your XML file over time, you will have solid decoupling between data access and data utilization. You will have a domain model (what a chart should be) and can then plug various data sources into it.
I'd like to implement some before and after method advisors in ColdSpring 2.0, and I'd like to use the new schema for AOP and the new autoproxying feature. Unfortunately, the Narwhal documentation for AOP is currently a cliffhanger. Can anyone give me an example of a ColdSpring 2.0 configuration file that uses the AOP schema?
I just finished off one more section in the AOP documentation, but in the meantime, here are a few examples to get the ball rolling.
This is an example of setting up around advice. It calls the method timeMethod on the object timer for anything that matches the pointcut execution(public * *(..)), which translates to: a method execution, that is public, that returns anything, that is named anything, and takes any arguments, of any types. Essentially, it matches everything.
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.coldspringframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:aop="http://www.coldspringframework.org/schema/aop"
xsi:schemaLocation="http://www.coldspringframework.org/schema/beans http://coldspringframework.org/schema/coldspring-beans-2.0.xsd
http://www.coldspringframework.org/schema/aop http://www.coldspringframework.org/schema/coldspring-aop-2.0.xsd"
>
<!-- AOP configuration -->
<aop:config>
<aop:aspect ref="timer">
<aop:around method="timeMethod"
pointcut="execution(public * *(..))"/>
</aop:aspect>
</aop:config>
<bean name="timer" class="05_AOP.Timer" />
<bean name="longTime" class="05_AOP.LongTime" />
</beans>
The important piece to note is that while Timer.cfc is just a plain ol' CFC, for it to provide the around advice, the method being used has to take a MethodInvocation as an argument, like so:
public any function timeMethod(required MethodInvocation invocation)
{
...
}
But there you go, there is an example of using AOP in CS2.
You can still use MethodInterceptors and the like as well, but you will be using <aop:advisor> rather than <aop:aspect>.
But overall, I'm working on the CS2 AOP documentation right now, so it should get filled out in the next day or so.
DOC RELEASED! http://sourceforge.net/apps/trac/coldspring/
I have the following, very simple, XML config for PHPUnit:
<phpunit bootstrap="/_tests/TestAutoload.php">
<testsuites>
<testsuite name="Unit Tests">
<directory suffix=".php">_tests</directory>
</testsuite>
</testsuites>
</phpunit>
How do I exclude a certain file in this directory from the test suite? I tried <exclude> and <blacklist>, but they don't seem to work in this context. I also couldn't find any documentation other than the phpunit.de one, which doesn't mention anything about this. Other than that, this config works perfectly.
To exclude the file named TestCase.php,
add this to your phpunit.xml:
<testsuites>
<testsuite name="BLABLA">
<directory suffix=".php">./tests</directory>
<exclude>./tests/TestCase.php</exclude>
</testsuite>
</testsuites>
Here is an additional excerpt from a real-life test suite that I can confirm this works with:
...
<testsuites>
<testsuite name="n98-magerun-tests">
<directory>./tests</directory>
<exclude>tests/N98/Magento/Command/Installer/UninstallCommandTest.php</exclude>
</testsuite>
...
There are a number of ways to not run a particular test. Putting it into a blacklist so it's never run may not be the way, as changing that means editing the blacklist, and you'll often end up bouncing it in and out of version control.
There are several other ways that may be more appropriate:
If a test is not yet ready to run:
$this->markTestIncomplete('This test has not been implemented yet.');
If there's an outside reason it should not be run, skip it:
if (!extension_loaded('mysqli')) {
$this->markTestSkipped('The MySQLi extension is not available.');
}
You can also put that into the setUp() function, so it will skip all the tests in a test-class.
You can make a test dependent on a previous one succeeding:
public function testEmpty()
{
$stack = array();
$this->assertTrue(empty($stack));
return $stack; // also sends this variable to any following tests - if this worked
}
/**
* only runs if testEmpty() passed
*
* @depends testEmpty
*/
public function testPush(array $stack)
{
}
The @group annotation is one of the best ways to specifically skip, or run, one group of tests:
/**
* @group database
* @group remoteTasks
*/
public function testSomething()
{
}
testSomething() is now in two groups, and if either of them is passed to the --exclude-group parameter on the command line (or in the config XML), it won't be run. Likewise, you could run only the tests that belong to a particular group, say one named after a feature or a bug report.
2021 update
On top of some good individual answers above, here is an overall approach that lets you apply a consistent, clean, more architecture-driven test organization and a convenient, fast testing workflow. With it you manage your tests, directories and test suites from phpunit.xml and run tests in groups or one at a time as needed. Do the following:
Group your tests into directories (e.g. tests/Unit, tests/Feature, tests/Integration);
Make test suites grouping your directories with the <testsuite> element (note you have to wrap multiple <testsuite> tags in a <testsuites> tag in later versions of PHPUnit);
Make an all test suite that combines all the default tests you would run as a full test suite, and set it via the defaultTestSuite="all" attribute on the <phpunit> element, like this:
<phpunit .. some other keys
defaultTestSuite="all">
<testsuite name="all">
<directory suffix="Test.php">./tests/Feature/</directory>
<directory suffix="Test.php">./tests/Unit/</directory>
</testsuite>
</phpunit>
If you need to, make a dedicated Tinker test suite with tests that you use for tinkering, keeping example tests, etc., which you would exclude from the normal testing workflow. Do not include it in the all test suite.
So you will be able to:
use the phpunit CLI command to always run the default all tests;
use the CLI to filter at the test suite, test file or single test level for any of your test suites, e.g.:
phpunit --testsuite SuiteOne,
phpunit --filter SomeTest or
phpunit --filter SomeTest::test_some_test_method
combine --testsuite with --filter arguments
Couple this workflow with the ability to run the current test or test file from within your IDE (for Sublime Text there are Mac and Linux/Windows plugins) and you will be completely equipped to instantly choose which test to execute.
I have had very good experiences with this PHPUnit configuration file.
<?xml version="1.0" encoding="UTF-8"?>
<phpunit
convertErrorsToExceptions="true"
convertNoticesToExceptions="true"
convertWarningsToExceptions="true"
colors="true"
processIsolation="true"
stopOnFailure="true"
syntaxCheck="false"
backupGlobals="false"
bootstrap="test-bootstrap.php">
<testsuites>
<testsuite name="php-dba-cache">
<directory suffix="Test.php">tests</directory>
</testsuite>
</testsuites>
<logging>
<log type="coverage-html"
target="build/coverage"
charset="UTF-8"
yui="true"
highlight="true"
lowUpperBound="35"
highLowerBound="70"/>
</logging>
<filter>
<whitelist addUncoveredFilesFromWhitelist="true">
<directory suffix=".php">src</directory>
<exclude>
<file>test-bootstrap.php</file>
</exclude>
</whitelist>
</filter>
</phpunit>
https://github.com/gjerokrsteski/php-dba-cache
The PHPUnit documentation is a bit minimalistic when it comes to exclusion in a test suite. Apparently, only entire directories can be excluded, but not individual files. I would be very happy to be proven wrong. The workaround seems to be using the @group feature as posted above by Alister Bulman.
It's kind of a pain needing to tag every single test in those test suites I'd like to keep.
For PHPUnit 6.5, <exclude> goes under <whitelist>:
<filter>
<whitelist>
<directory suffix=".php">src</directory>
<exclude>
<directory>src/Migrations</directory>
<file>src/kernel.php</file>
</exclude>
</whitelist>
</filter>
I came across this with a test class that I wanted to extend for other tests. PHPUnit would issue warnings about the test file containing no tests. Simply declaring the class inside the file abstract caused PHPUnit to quiet down about that.
abstract class SetupSomeCommonThingsTestCase extends \PHPUnit\Framework\TestCase {
    protected function setUp(): void {
        parent::setUp();
        ...
    }
}
Make sure that you put your exclusions inside the whitelist. Example:
<phpunit>
<filter>
<blacklist>
<directory suffix=".php">/not/even/looked/at/</directory>
</blacklist>
<whitelist>
<directory suffix=".php">/path/to/test/dir/</directory>
<exclude>
<file suffix=".php">/path/to/fileToExclude.php</file>
</exclude>
</whitelist>
</filter>
</phpunit>
http://www.phpunit.de/manual/current/en/appendixes.configuration.html#appendixes.configuration.blacklist-whitelist