Create unit test methods dynamically at runtime in MSTest - unit-testing

Is there an equivalent of NUnit's SuiteBuilder in MSTest? I couldn't find one so far.
I have a bunch of XML files, each of which should map to a test method. There are hundreds of these, so manually writing a test for each one is not a good idea.
In NUnit you can implement ISuiteBuilder and have the test cases built dynamically at runtime, each showing up as its own test method.
I am looking for a way to do the same thing in MSTest.
I've looked at the DataSource attribute, but it caters to one data source XML/CSV file per test method, which would force me to write hundreds of test methods. I also want to keep each XML file separate rather than clubbing them all into one huge file, which would become unmaintainable.
Has anyone tried this, or does anyone have suggestions?

Not exactly what you asked for, but you can use Pex for automated, parameterized white-box tests. That way, you don't need to do all that work manually. Pex supports MSTest as well as NUnit. The generated tests use an extra file; you don't need any XML files.
But I think you can't easily take your existing .xml files from NUnit and share them with MSTest using Pex, if that is what you intended.

I have done this already. Here is what you would need to do:
The Test:
// MSTest sets this property so the test can read the current data row.
public TestContext TestContext { get; set; }

[TestMethod]
[DeploymentItem("MyTestData")]
[DataSource("Microsoft.VisualStudio.TestTools.DataSource.XML",
            "|DataDirectory|\\MyTestData.xml",
            "Test",
            DataAccessMethod.Sequential)]
public void MyTest()
{
    string file = TestContext.DataRow[0].ToString();
    string expectedResult = TestContext.DataRow[1].ToString();
    // TODO: Test something
}
MyTestData.xml:
<?xml version="1.0" encoding="utf-8" ?>
<Rows>
  <Test>
    <File>test1.xml</File>
    <Result>1</Result>
  </Test>
  <Test>
    <File>test2.xml</File>
    <Result>2</Result>
  </Test>
</Rows>
test1.xml and test2.xml must exist in the MyTestData directory.

Related

How to test XML files during the Maven (3) test phase

Is there any way to perform a basic test/unit test for an XML file included in a module?
E.g. I'd like to "lint" test my persistence.xml or check whether an XML file is valid against its schema.
What about using the xml-maven-plugin?
The links may become invalid because Codehaus is shutting down its services, so take a look at https://github.com/mojohaus/xml-maven-plugin.
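If you'd rather keep the check inside an ordinary unit test (instead of, or in addition to, the Maven plugin), plain JAXP schema validation is enough. A minimal sketch, with the file paths purely illustrative:
import java.io.File;
import javax.xml.XMLConstants;
import javax.xml.transform.stream.StreamSource;
import javax.xml.validation.Schema;
import javax.xml.validation.SchemaFactory;
import javax.xml.validation.Validator;
import org.junit.Test;

public class XmlValidationTest {
    @Test
    public void configIsValidAgainstItsSchema() throws Exception {
        // Illustrative paths - point these at the XML/XSD shipped with your module.
        Schema schema = SchemaFactory
                .newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI)
                .newSchema(new File("src/main/resources/config.xsd"));
        Validator validator = schema.newValidator();
        // validate() throws an exception with a descriptive message if the file is invalid.
        validator.validate(new StreamSource(new File("src/main/resources/config.xml")));
    }
}
Surefire then runs this like any other test during the Maven test phase.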

Is it possible to get a report of unit tests run in TFS builds, grouped by solution?

We have a few thousand native and .NET unit tests. In Visual Studio 2012, I can run and see the results, grouped by the C++/C# project.
I'd like to get something like this view to the business people, preferably grouped by solution (product) and then by project (.dll). At the bare minimum, I'd like to have the number of tests run and failed per solution.
Is there any proper way to do this with TFS?
I've looked everywhere and keep running into walls:
TFS build test results don't seem to store any information about the test categories, so I can't use those to group by solution
.vsmdi lists and .testsettings files have been phased out in VS 2012 and TFS 2012. We had separate lists for each solution before...now it's just *test*.dll
Test Plans and Custom SSRS reports seem to be completely useless for this much granularity of test results (why?). TfsTestWarehouse has barely anything - just enough for total tests passed/failed per build.
Parsing TRX files and writing HTML reports seems to work best using tools like trx2html, but I still can't run tests by solution.
TRX files are just XML, so there's no need to write a parser by hand. You can write an XSLT transformation to present the data in the format you need. A nice thing about XSLT is that it has built-in aggregation, grouping, sorting, etc.
If the TRX files themselves do not contain solution information (which is likely), you'll have to generate the report in two stages: prepare the data, then generate the report.
The preparation stage would be a relatively simple command-line tool that goes over your .sln files and builds a map of which projects belong to which solutions (search the web, I bet there are already a bunch of scripts for that).
The generation stage would then pass that mapping as an argument to the transformation so the report can aggregate the data properly.
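To illustrate the mechanics of that second stage, here is a minimal JAXP sketch; the stylesheet, mapping file, and output names are purely illustrative:
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class TrxReport {
    public static void main(String[] args) throws Exception {
        // Illustrative names: report.xsl is the stylesheet you write,
        // solution-map.xml is the project-to-solution mapping prepared in stage one.
        Transformer t = TransformerFactory.newInstance()
                .newTransformer(new StreamSource("report.xsl"));

        // Pass the mapping to the stylesheet, e.g. as a URI it loads with document().
        t.setParameter("solution-map", "solution-map.xml");

        t.transform(new StreamSource("results.trx"), new StreamResult("report.html"));
    }
}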
I know it's a bit of a generic response, but I hope it helps at least a bit.
I ended up solving this by adding the Project and Solution information in a custom Assembly Attribute (i.e. to the test .dll) at build time, through a custom MSBuild task. Here are roughly the steps I followed (from memory).
First, I created the custom Attribute:
[AttributeUsage(AttributeTargets.Assembly)]
public class ProjectAttribute : Attribute
{
    public string Project { get; set; }
    public string Solution { get; set; }

    public ProjectAttribute(string project, string solution)
    {
        this.Project = project;
        this.Solution = solution;
    }
}
This custom attribute was defined in an Assembly that was referenced by all unit test projects.
I then created a very simple/rudimentary inline MSBuild task, CreateProjectAttribCs, that would dynamically create an extra C# file with one line. Something like:
[assembly: ProjectAttribute("$(ProjectName)", "$(Solution)")]
I then added this file to the <Compile> item group, inside a custom MSBuild target that runs before Compile (again, just going from memory):
<Target Name="CreateProjectAttribCs" BeforeTargets="Compile">
<CreateProjectAttribCs File="ProjectAttribute.cs" />
<ItemGroup>
<Compile Include="ProjectAttribute.cs" />
</ItemGroup>
</Target>
<Target Name="CleanupProjectAttribCs" AfterTargets="Compile>
<Delete Files="ProjectAttribute.cs" />
</Target>
For C++ projects, I added the Project and Solution info to a String Table resource, "injected" in a similar way to the ProjectAttribute.cs file.
One major annoyance with all of this was that developers would have to add this custom MSBuild .targets file (which would contain the custom targets and the assembly reference) by editing the .csproj or .vcxproj.
To circumvent that a bit, I also created a custom Visual Studio Project Template for our team's unit tests so that everything was already added and my fellow devs would never have to see the innards of an MSBuild project.
The hardest part was adding the Project/Solution info. Once I had that it was easy to read the custom attributes on the test assemblies or String Table resource in a native .dll, and add the info to data parsed/transformed from the test results to a custom test result database and report.

Multiple XSLT files in a single pipeline with ant

I have multiple XSLT files that I'm using to process my source XML in a pipeline. I know about the trick with exsl:node-set but after having some issues with this workflow, I took the decision to split the various passes into separate XSL files. I'm much happier with the structure of the files now and the workflow works fine in Eclipse. Our release system works with ant. I can process the files like this:
<xslt basedir="src-xml" style="src-xml/preprocess_1.xsl" in="src-xml/original.xml" out="src-xml/temp_1.xml" />
<xslt basedir="src-xml" style="src-xml/preprocess_2.xsl" in="src-xml/temp_1.xml" out="src-xml/temp_2.xml" />
<xslt basedir="src-xml" style="src-xml/preprocess_3.xsl" in="src-xml/temp_2.xml" out="src-xml/temp_3.xml" />
<xslt basedir="src-xml" style="src-xml/finaloutput.xsl" in="src-xml/temp_3.xml" out="${finaloutput}" />
But this method, going via multiple files on disk, seems inefficient. Is there a better way of doing this with ant?
Update following Dimitre's suggestion
I've created myself a wrapper around the various other XSLs, as follows:
<xsl:stylesheet version='1.0' xmlns:xsl='http://www.w3.org/1999/XSL/Transform' xmlns:fn='http://www.w3.org/2005/xpath-functions' xmlns:exslt="http://exslt.org/common">
  <xsl:import href="preprocess_1.xsl"/>
  <xsl:import href="preprocess_2.xsl"/>
  <xsl:import href="preprocess_3.xsl"/>
  <xsl:import href="finaloutput.xsl"/>
  <xsl:output method="text" />
  <xsl:template match="/">
    <xsl:apply-imports />
  </xsl:template>
</xsl:stylesheet>
This has... not worked well. It looks like the document had not been preprocessed before the final output XSL ran. I should perhaps have been clearer here: the preprocess XSL files are modifying the document, adding attributes and the like. preprocess_3 is based on the output of ..._2 is based on ..._1. Is this import solution still appropriate? If so, what am I missing?
The more efficient method is to perform a single, multipass transformation.
The files can remain as they are -- they will be imported using xsl:import instructions.
The savings are obvious:
Just one initiation (loading of the XSLT processor).
Just one termination.
Eliminates the intermediate files and the cost of creating, writing, closing, and deleting them.
Hmm, you say "I know about the trick with exsl:node-set", but you don't use it in your attempt ("Update following Dimitre's suggestion"). In case you don't know it, or for others (like me) who don't know how to perform a multipass transformation, here is a nice article: Multipass processing.
The drawback of this approach is that it requires engine-specific XSL code. So if you know the engine, you could try this. If you don't know the engine, you could try the generic approaches for turning a result tree fragment into a node-set that work across XSL engines.
Looking at these sources, one conclusion is certain: your current solution is more readable. But you are seeking efficiency, so some readability may have to be sacrificed.
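If dropping out of pure Ant is an option, the separate stylesheets can also be chained in memory with JAXP SAX filters, which avoids the intermediate files entirely. A minimal Java sketch, reusing the stylesheet and input names from the Ant snippet above (the output file name is illustrative):
import javax.xml.parsers.SAXParserFactory;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.sax.SAXSource;
import javax.xml.transform.sax.SAXTransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;
import org.xml.sax.InputSource;
import org.xml.sax.XMLFilter;
import org.xml.sax.XMLReader;

public class XsltPipeline {
    public static void main(String[] args) throws Exception {
        SAXTransformerFactory stf =
                (SAXTransformerFactory) TransformerFactory.newInstance();

        // Each preprocessing stylesheet becomes a SAX filter; chaining them
        // streams the document through every pass without temp files on disk.
        XMLFilter pass1 = stf.newXMLFilter(new StreamSource("src-xml/preprocess_1.xsl"));
        XMLFilter pass2 = stf.newXMLFilter(new StreamSource("src-xml/preprocess_2.xsl"));
        XMLFilter pass3 = stf.newXMLFilter(new StreamSource("src-xml/preprocess_3.xsl"));

        SAXParserFactory spf = SAXParserFactory.newInstance();
        spf.setNamespaceAware(true);
        XMLReader reader = spf.newSAXParser().getXMLReader();
        pass1.setParent(reader);
        pass2.setParent(pass1);
        pass3.setParent(pass2);

        // The final stylesheet consumes the output of the last filter.
        Transformer finalPass = stf.newTransformer(new StreamSource("src-xml/finaloutput.xsl"));
        finalPass.transform(new SAXSource(pass3, new InputSource("src-xml/original.xml")),
                            new StreamResult("finaloutput.txt"));
    }
}
Ant could still drive this through a <java> task, so the release build itself would not have to change much.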

How would I produce JUnit test report for groovy tests, suitable for consumption by Jenkins/Hudson?

I've written several XMLUnit tests (that fit into the JUnit framework) in Groovy and can execute them easily on the command line as per the Groovy docs, but I don't quite understand what else I have to do for them to produce the XML output that Jenkins/Hudson (or other tools) need to display the pass/fail results (like this) and a detailed report of the errors etc. (like this). (Apologies to the image owners.)
Currently, my kickoff script is this:
def allSuite = new TestSuite('The XSL Tests')
//looking in package xsltests.rail.*
allSuite.addTest(AllTestSuite.suite("xsltests/rail", "*Tests.groovy"))
junit.textui.TestRunner.run(allSuite)
and this produces something like this:
Running all XSL Tests...
....
Time: 4.141
OK (4 tests)
How can I make this create a JUnit test report xml file suitable to be read by Jenkins/Hudson?
Do I need to kick off the tests with a different JUnit runner?
I have seen this answer but would like to avoid having to write my own test report output.
After a little hacking I have taken Eric Wendelin's suggestion and gone with Gradle.
To do this I have moved my groovy unit tests into the requisite directory structure src/test/groovy/, with the supporting resources (input and expected output XML files) going into the /src/test/resources/ directory.
All required libraries have been configured in the build.gradle file, as described (in its entirety) here:
apply plugin: 'groovy'

repositories {
    mavenCentral()
}

dependencies {
    testCompile group: 'junit', name: 'junit', version: '4.+'
    groovy module('org.codehaus.groovy:groovy:1.8.2') {
        dependency('asm:asm:3.3.1')
        dependency('antlr:antlr:2.7.7')
        dependency('xmlunit:xmlunit:1.3')
        dependency('xalan:serializer:2.7.1')
        dependency('xalan:xalan:2.7.1')
        dependency('org.bluestemsoftware.open.maven.tparty:xerces-impl:2.9.0')
        dependency('xml-apis:xml-apis:2.0.2')
    }
}

test {
    jvmArgs '-Xms64m', '-Xmx512m', '-XX:MaxPermSize=128m'
    testLogging.showStandardStreams = true // not sure about this one, was in the official user guide
    outputs.upToDateWhen { false } // makes it run every time, even when Gradle thinks it is "Up-To-Date"
}
This applies the Groovy plugin, points the build at Maven Central for the specified dependencies, and then adds some extra settings to the built-in test task.
One extra thing in there is the last line, which makes Gradle run all of my tests every time and not just the ones it thinks are new/changed; this makes Jenkins play nicely.
I also created a gradle.properties file to get through the corporate proxy/firewall etc:
systemProp.http.proxyHost=10.xxx.xxx.xxx
systemProp.http.proxyPort=8080
systemProp.http.proxyUser=username
systemProp.http.proxyPassword=passwd
With this, I've created a 'free-style' project in Jenkins that polls our Mercurial repo periodically; whenever anyone commits an updated XSL to the repo, all the tests are run.
One of my original goals was being able to produce the standard Jenkins/Hudson pass/fail graphics and the JUnit reports, which is a success: Pass/Fail with JUnit Reports.
I hope this helps someone else with similar requirements.
I find the fastest way to bootstrap this stuff is with Gradle:
// build.gradle
apply plugin: 'groovy'

task initProjectStructure() << {
    project.sourceSets.all*.allSource.sourceTrees.srcDirs.flatten().each { dir ->
        dir.mkdirs()
    }
}
Then run gradle initProjectStructure and move your sources into src/main/groovy and your tests into src/test/groovy.
It seems like a lot (really it's <5 minutes of work), but you get lots of stuff for free. Now you can run gradle test and it'll run your tests and produce JUnit XML you can use in build/test-reports in your project directory.
Since you're asking for the purposes of exposing the report to Jenkins/Hudson, I'm assuming you have a Maven/Ant/etc build that you're able to run. If that's true, the solution is simple.
First of all, there's practically no difference between Groovy and Java JUnit tests. So all you need to do is add the Ant/Maven junit task/plugin to your build and have it execute your Groovy JUnit tests (just as you'd do if they were written in Java). That execution will create the test reports. From there, you can simply configure your Hudson/Jenkins build to look at the directory where the test reports get created during the build.
You can write your own custom RunListener (or SuiteRunListener). It still requires you to write some code, but it's much cleaner than the script you've provided a link to. If you'd like, I can send you the code for a JUnit reporter I've written in JavaScript for Jasmine and you can 'translate' it into Groovy.
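For reference, here is a minimal JUnit 4 RunListener sketch (plain Java, but a Groovy class can extend RunListener the same way); the class name and output are illustrative, and a real reporter would accumulate results and write JUnit-style XML in testRunFinished():
import org.junit.runner.Description;
import org.junit.runner.Result;
import org.junit.runner.notification.Failure;
import org.junit.runner.notification.RunListener;

public class ReportingListener extends RunListener {
    @Override
    public void testStarted(Description description) {
        System.out.println("started: " + description.getDisplayName());
    }

    @Override
    public void testFailure(Failure failure) {
        System.out.println("FAILED: " + failure.getDescription() + " - " + failure.getMessage());
    }

    @Override
    public void testRunFinished(Result result) {
        // A real implementation would serialize the collected results as JUnit XML here.
        System.out.printf("ran %d tests, %d failed%n",
                result.getRunCount(), result.getFailureCount());
    }
}
Hook it in from the kickoff script via JUnitCore (core.addListener(new ReportingListener()); core.run(...)) instead of junit.textui.TestRunner.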

XSLT Unit testing

Does anyone know of a way to write unit tests for the XSLT transformation?
I have a lot of XSLT files and it's getting harder to test them manually. We have an example XML and can compare it to the resulting output XML from the XSL transformation. However, I'm looking for a better test method.
I am currently looking for some good options to do this as well, and in the process I came across this question and a few other potential candidate solutions. Admittedly, I haven't tried any of them yet, so I can't speak to their quality, but at least they are some other avenues worth researching.
Jeni Tennison's Unit Testing Package
UTF-X Unit Testing Framework
Juxy
XTC
Additionally, I found the following article to be informative in terms of a general methodology for unit testing XSLT.
Unit test XSL transformations
Try XSpec, a testing framework for XSLT. It allows you to write tests declaratively, and test templates and functions.
Looks like Oxygen editor has Unit Testing available as well. It "provides XSLT Unit Test support based on XSpec".
I haven't tried it myself, but will soon.
Here are a few simple solutions:
Use xsltproc with a mock XML file:
xsltproc test.xsl mock.xml
XSLT Cookbook - Chapter 13
Create a document() placeholder variable and comment/uncomment it manually:
<xsl:variable name="Data" select="descendant-or-self::node()"/>
<!--
<xsl:variable name="Data" select="document('foo.xml')" />
-->
<xsl:if test="$Data/pagename='foo'">
<p>hi</p>
</xsl:if>
Create a condition to swap the comment programmatically:
<xsl:variable name="Data">
<xsl:choose>
<!-- If source XML is inline -->
<xsl:when test="descendant-or-self::node()/pageName='foo'"/>
<xsl:value-of select="descendant-or-self::node()"/>
</xsl:when>
<!-- If source XML is external -->
<xsl:otherwise>
<xsl:value-of select="document('foo.xml')" />
</xsl:otherwise>
</xsl:choose>
</xsl:variable>
Use a shell script to inline the data programmatically in the build to automate the tests completely.
References
Transformiix Test Cases
Running XSLT at the Department: Command Line XSLT Processing
Building TransforMiiX standalone - Archive of obsolete content | MDN
OASIS XSLT Conformance TC Public Documents
Using XSLT to Assist Regression Testing
MicroHowTo: Process an XML document using an XSLT stylesheet
Tip: Debug stylesheets with xsl:message
Batch XSLT Processing
Embedded Stylesheet Modules: XSL Transformations (XSLT) Version 3.0
Multi layer conditional wrap HTML with XSLT
XPath 1.0: Axes
CentOS 7.0 - man page for xsltproc
XMLStarlet command line XML toolkit download | SourceForge.net
We have been using Java-based unit tests in which we provide the input XML string that needs to be transformed with some XSL and the expected XML string after transformation.
Refer to the following XMLUnit classes if you want to explore more:
org.custommonkey.xmlunit.Transform
org.custommonkey.xmlunit.Diff
org.custommonkey.xmlunit.DetailedDiff
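A minimal sketch of that approach (JUnit 4 plus XMLUnit 1.x; the input, expected output, and stylesheet path are purely illustrative):
import java.io.File;
import org.custommonkey.xmlunit.DetailedDiff;
import org.custommonkey.xmlunit.Diff;
import org.custommonkey.xmlunit.Transform;
import org.custommonkey.xmlunit.XMLUnit;
import org.junit.Test;
import static org.junit.Assert.assertTrue;

public class StylesheetTest {
    @Test
    public void transformsOrderIntoReceipt() throws Exception {
        // Illustrative input and expected output
        String input = "<order><item>book</item></order>";
        String expected = "<receipt><line>book</line></receipt>";

        XMLUnit.setIgnoreWhitespace(true);

        // Run the stylesheet, then diff the result against the expected XML.
        Transform transform = new Transform(input, new File("src/test/resources/order-to-receipt.xsl"));
        Diff diff = new Diff(expected, transform.getResultString());
        assertTrue(new DetailedDiff(diff).toString(), diff.similar());
    }
}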
I'm using this tool: jxsltunit.
The test is defined by an XML file which is then passed to the tool. This is an example of the test configuration:
<xsltTestsuite xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
               xsi:schemaLocation="jxsltunit jxslttestsuite.xsd"
               xmlns="jxsltunit"
               description="Testsuite Test"
               xml="min-test.xml"
               xslt="min-test.xslt"
               path="pa > ch">
  <xsltTestcase match_number="0">
    <![CDATA[<ch>child 1</ch>]]>
  </xsltTestcase>
  <xsltTestcase match_number="1">
    <![CDATA[<ch>child 2</ch>]]>
  </xsltTestcase>
</xsltTestsuite>
It takes the XML, the XSLT, and a path into the transformed XML that gets tested. The path can point to a list whose elements are identified by their index.
One benefit of this tool is that it can output the results as a JUnit XML file. That file can be picked up by Jenkins to show the XSLT tests in your test results. Just add the call to the tool as a build step.
Try Jeni Tennison's Unit Testing Package (XSpec), which is a unit testing and behaviour-driven development (BDD) framework for XSLT, XQuery, and Schematron. It is based on the Spec framework of RSpec, a BDD framework for Ruby.
With XSpec you can test XSLT per template or per XPath, depending on your needs.
For an overview of how to use it (installation, execution, writing tests), see https://github.com/xspec/xspec/wiki/What-is-XSpec