How can I access the TestCategory attribute in the TFS API or TFS Report?

We have a rather complicated library of automated tests. We are using TestCategory and TestProperty to help us keep things clean. Now I want the ability to report on the categories after they run.
Does anyone know if the Test Category or Test Property in MSTest is accessible from the TFS Warehouse or better yet, through the TFS API?

The TestCategory is not available in the test results and thus isn't available via the warehouse or the API. TestCategory can be used to filter the tests that are run during a build, but that's it. So you could create different test builds based on test category and then report on each build.
You could alternatively associate automation with your test cases, then organize your test cases into suites and organize your reports based on suites.
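To make the "report per build" workaround concrete, here is a rough sketch (not a drop-in solution) of pulling the test runs attached to one category-specific build via the TFS/Azure DevOps REST API, assuming a TFS version recent enough to expose the REST test endpoints. The collection URL, project name, build URI and personal access token are placeholders.

```python
# Rough sketch: list the test runs attached to one category-specific build via the
# TFS/Azure DevOps REST API. All values below are placeholders.
import base64
import requests

COLLECTION = "https://tfs.example.com/tfs/DefaultCollection"  # placeholder
PROJECT = "MyProject"                                          # placeholder
BUILD_URI = "vstfs:///Build/Build/1234"                        # placeholder
PAT = "personal-access-token"                                  # placeholder

auth = base64.b64encode(f":{PAT}".encode()).decode()
resp = requests.get(
    f"{COLLECTION}/{PROJECT}/_apis/test/runs",
    params={"buildUri": BUILD_URI, "api-version": "5.0"},
    headers={"Authorization": f"Basic {auth}"},
)
resp.raise_for_status()

# Because each build only ran one TestCategory, every run listed here is in effect
# a per-category result you can aggregate into your own report.
for run in resp.json().get("value", []):
    print(run.get("name"), run.get("passedTests"), run.get("totalTests"))
```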

Related

Importing test reports from an external system to Azure DevOps?

I know that in Azure DevOps I can create test cases for my features and I can collect those test cases in a test suite to keep track of test results for a particular release or iteration of my software.
I also know that on a code level I can integrate a test framework to run unit or integration tests. Depending on the technology stack and language that I use, the frameworks differ (e.g., Mocha, JUnit, NUnit, PyTest), but usually they produce a common format for test results, such as an XML test report using the JUnit schema.
Now, I run my unit tests in different tools (e.g., GitLab CI/CD or Jenkins) and I would like to link the test results from those unit and integration tests to the test cases that I am collecting in test suites on Azure DevOps.
If I understand this correctly, then the link to bring this together is the test case ID on Azure DevOps which I somehow need to correlate with the individual unit tests in my test framework.
Once my test suite has run in the external tool, how can I publish the test report (e.g., a JUnit XML file) to Azure DevOps to correlate the report with my test suite?
Is there a specific Azure DevOps API that I can use (e.g., https://learn.microsoft.com/en-us/rest/api/azure/devops/test/) to publish a JUnit test report?
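For reference, one possible shape of that flow against the Test REST API linked above is sketched below: create a test run against your test plan, add results parsed from the JUnit XML, then complete the run. The organization, project, plan ID and the mapping from JUnit test names to test point IDs are assumptions you would replace with your own.

```python
# Hedged sketch of publishing an external JUnit report via the Azure DevOps Test REST API.
# Organization/project/plan/point IDs and the name-to-test-point mapping are assumptions.
import base64
import xml.etree.ElementTree as ET
import requests

ORG_URL = "https://dev.azure.com/myorg"    # assumption
PROJECT = "MyProject"                       # assumption
PAT = "personal-access-token"               # assumption
PLAN_ID = "1"                               # assumption: the test plan holding your suite
POINT_IDS = {"test_login": 42}              # assumption: JUnit test name -> test point id

headers = {
    "Authorization": "Basic " + base64.b64encode(f":{PAT}".encode()).decode(),
    "Content-Type": "application/json",
}
base = f"{ORG_URL}/{PROJECT}/_apis/test"

# 1. Create a run covering the test points you want to correlate with.
run = requests.post(
    f"{base}/runs?api-version=5.0",
    headers=headers,
    json={"name": "External JUnit run", "automated": True,
          "plan": {"id": PLAN_ID}, "pointIds": list(POINT_IDS.values())},
).json()

# 2. Translate JUnit <testcase> elements into test results.
results = []
for case in ET.parse("junit-report.xml").getroot().iter("testcase"):
    name = case.get("name")
    if name not in POINT_IDS:
        continue
    passed = case.find("failure") is None and case.find("error") is None
    results.append({"testCaseTitle": name, "automatedTestName": name,
                    "outcome": "Passed" if passed else "Failed",
                    "state": "Completed",
                    "testPoint": {"id": str(POINT_IDS[name])}})

requests.post(f"{base}/Runs/{run['id']}/results?api-version=5.0",
              headers=headers, json=results).raise_for_status()

# 3. Mark the run as completed so it shows up against the suite.
requests.patch(f"{base}/runs/{run['id']}?api-version=5.0",
               headers=headers, json={"state": "Completed"}).raise_for_status()
```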

Is there a way to macro unit test in kdb+?

I have seen a few different unit testing approaches online using qunit and k4unit, but I can only get them testing single functions. I was hoping that I could run a unit test that covers the daily checks I execute each day, such as "have the night jobs run correctly?", "are the dashboards on the WebUI up?", and "did the deploy script run with no errors?". Is there built-in kdb+ functionality for this kind of test, or a clean way to adapt the qunit or k4unit unit tests? Or will it require a script written from scratch?
Thanks
I don't think a unit test is what you're looking for here. Some kind of reporting mechanism for jobs would be more appropriate. Within your existing jobs you could generate some kind of alert to indicate the job's success/failure. This qmail library may be useful for that.
I'm not sure what kind of system you're using, but AquaQ Analytics' TorQ system has a reporter process which can (amongst other things) email alerts for specific processes.
(Disclaimer: I'm an employee of AquaQ Analytics)
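Either way, the underlying idea is just "run the daily checks, alert on failure". Illustrated outside q, a minimal sketch of such a checker is shown below; every URL, hostname, file path and email address is a placeholder, and none of this is kdb+-specific tooling.

```python
# Minimal illustration of "run the daily checks, alert on failure", scripted outside q.
# The checks are stand-ins for "is the WebUI dashboard up?", "did the deploy run cleanly?".
import smtplib
from email.message import EmailMessage

import requests

CHECKS = {
    # name -> zero-argument callable returning True when the check passes
    "dashboard up": lambda: requests.get("http://webui.example.com/health", timeout=5).ok,
    "deploy log clean": lambda: "ERROR" not in open("/var/log/deploy.log").read(),
}

def send_alert(failed):
    msg = EmailMessage()
    msg["Subject"] = "Daily checks failed: " + ", ".join(failed)
    msg["From"] = "alerts@example.com"        # placeholder
    msg["To"] = "kdb-team@example.com"        # placeholder
    msg.set_content("The following daily checks failed: " + ", ".join(failed))
    with smtplib.SMTP("smtp.example.com") as smtp:   # placeholder SMTP relay
        smtp.send_message(msg)

failed = []
for name, check in CHECKS.items():
    try:
        ok = check()
    except Exception:
        ok = False      # an exception during the check counts as a failure
    if not ok:
        failed.append(name)

if failed:
    send_alert(failed)
```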

Do I need to regression test existing methods on a Web service if new methods are added

We have a SOAP web service that is in production and contains a large number of methods. As part of a project we are adding new methods to that web service; note that we are not amending the existing methods.
What I am trying to determine is whether I need to regression test the existing methods to check whether they have been impacted by adding the new methods.
Yes. If you change your web service, the only proper way to make sure none of the changes have impacted existing operations is a regression test.
If you use a testing tool like SoapUI you can automate this for every build you make. (Regression) testing should be a standard step after any new build to ensure software quality.
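If you prefer scripted checks alongside (or instead of) SoapUI, a minimal regression sketch could look like the following; the WSDL URL, operation names and arguments are assumptions for illustration, and a real suite would assert on the response payloads too.

```python
# Illustrative sketch only: a small pytest-based regression check that calls the
# existing (unchanged) operations and asserts on their responses after each deployment.
import pytest
from zeep import Client  # pip install zeep

WSDL = "https://example.com/LegacyService.svc?wsdl"  # assumption

@pytest.fixture(scope="module")
def client():
    return Client(WSDL)

# One entry per existing operation you want to guard against regressions.
EXISTING_OPERATIONS = [
    ("GetCustomer", {"customerId": 42}),   # hypothetical operation + arguments
    ("ListOrders", {"customerId": 42}),    # hypothetical operation + arguments
]

@pytest.mark.parametrize("operation,args", EXISTING_OPERATIONS)
def test_existing_operation_still_responds(client, operation, args):
    # Smoke-level regression: the operation is still exposed in the WSDL and still
    # returns a response after the new methods were added.
    result = getattr(client.service, operation)(**args)
    assert result is not None
```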

Webmethods mocking in flow services

In webMethods (Software AG), is there a way to mock objects during unit testing, or any tool available to test flow services?
You could have a look at the open-source http://www.wmaop.org test framework, which allows general mocking and unit testing along with a host of other functionality. The framework allows you to:
Create mocks of IS services
Apply conditions to mocks so that they only execute when the pipeline contents meet that condition
Raise an exception based on a condition or in place of a service
Capture the pipeline to file before or after a service is called
Modify or insert content into the pipeline
Have a series of conditions for a mocked service with a default if none of the conditions match
Create assertions that can apply before or after a service so that it's possible to prove a service has been executed. Assertions can also have conditions to verify that the pipeline had the expected content.
Return either random or sequenced content from a mock to vary its output every time it's called
Create mocks using RESTful calls so you can use alternative test tools, such as SOAPui, to create them as part of your integrations test
Use the JBehave functionality for Behaviour Driven Unit Testing within Designer and execute tests with the in-built JUnit.
WmTestSuite could be a good tool for you (why reinvent the wheel?). Your company chose webMethods to speed up development, so I advise you to keep going with it.
What WmTestSuite does:
Create unit tests graphically for your flows in Designer
Generate the related unit test class (you can complete it to add some asserts)
Add a hook to the Integration Server to "register" data for creating test data
Mock endpoints to ease tests (db, ws...)
I got this slide from a Software AG guy. From version 9.10 (April 2016) you should be able to download it from Empower.
You cannot define mocks in webMethods directly, as doing so requires you to hook into the invoke chain. This is a set of methods that are called around every flow or Java service invocation. They take care of things like access control, input/output validation, updating statistics, auditing, etc.
There are various tools and products available that leverage this internal mechanism and let you create mocks (or stubs) for your unit or system test cases:
IwTest, commercial, from IntegrationWise
WmTestSuite, commercial, from SoftwareAG
CATE, commercial, from Cloudgensys
WmAOP, open source, www.wmaop.org
With all four you can create test cases for webMethods flow/java services and define mocks for services that access external systems. All four provide ways to define assertions that the results should satisfy.
By far the easiest to work with is IwTest, as it lets you generate test suites, including mocks (or stubs), based on input/output pipelines that it records for you. In addition to this it also supports pub/sub (asynchronous) scenarios.
Ask your Software AG liaison about webMethods Test Suite (WmTestSuite), which plugs into the Eclipse-based Designer and provides basic Unit testing capabilities.
Mocks per se are lightweight services that can be configured in the WmTestSuite dialog alongside the (test) input and (expected) output pipelines.
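Not a replacement for the tools above, but worth noting: flow services can also be exercised black-box over the Integration Server's /invoke HTTP directive, which is enough for simple smoke assertions when no mocking is required. A minimal sketch, assuming a local IS on the default port with default development credentials and the built-in pub.math:addInts service:

```python
# Black-box check of a flow service via the Integration Server /invoke directive.
# Host, port, credentials and the example service are assumptions for illustration.
import requests

IS_BASE = "http://localhost:5555"          # assumption: local Integration Server
AUTH = ("Administrator", "manage")         # assumption: default dev credentials

def test_add_ints_flow_service():
    # pub.math:addInts takes num1/num2 and returns their sum in the output pipeline.
    resp = requests.get(
        f"{IS_BASE}/invoke/pub.math/addInts",
        params={"num1": "2", "num2": "3"},
        auth=AUTH,
    )
    assert resp.status_code == 200
    # The response rendering (HTML, XML or JSON) depends on the IS version and the
    # request headers, so this sketch only asserts loosely on the body.
    assert "5" in resp.text
```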

Test Results and Documentation Site

Currently I am using these tools to run my tests, code coverage and documentation:
Unit testing:
jasmine
xUnit
Code Coverage:
Istanbul
dotCover
Documentation:
Typedoc
As I'm trying to do everything modularly for both frontend and backend, we have multiple Bower components and NuGet packages, where of course each component runs different types of tests and documentation.
Now what I want is a dedicated site which gathers all test results and documentation, so that all developers can use it as a point of reference.
Is there any plugin available that can help me achieve this?
If not, do you have any idea where I can start? I tried googling a bit but with no luck.
I'm using roughly the same technologies.
As a build server I use TeamCity.
In a nutshell: your build is composed of steps, e.g. (simplified):
build .sln
gulp build
xUnit tests (*A: publishing coverage)
karma run
remap coverage from JavaScript to TypeScript (*B: publish coverage)
The only problem I have had so far is with the coverage (*A + *B): the last data will overwrite the first (it does not combine them). So in that case I use a custom report page to display the Istanbul-generated HTML report and only use the xUnit coverage report.
You could have the coverage.json from Istanbul as an artifact of your build, and have a second build pick up and report that coverage through TeamCity. It would simply be a coverage-report build (only one step: report code coverage). The trigger is a successful build generating the coverage.
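As a sketch of what that second build's single step could do, the script below reads Istanbul's json-summary output from the upstream artifact and reports line coverage to TeamCity via service messages; the artifact path is an assumption, and the statistic keys should be checked against your TeamCity version.

```python
# Hedged sketch: feed Istanbul line coverage to TeamCity as build statistics.
# Assumes the upstream build published coverage/coverage-summary.json (json-summary reporter).
import json

with open("coverage/coverage-summary.json") as fh:   # assumption: artifact path
    totals = json.load(fh)["total"]["lines"]

covered, total = totals["covered"], totals["total"]
pct = (100.0 * covered / total) if total else 0.0

# TeamCity picks these statistics up and shows line coverage on the build overview
# and in the statistics graphs.
print(f"##teamcity[buildStatisticValue key='CodeCoverageAbsLCovered' value='{covered}']")
print(f"##teamcity[buildStatisticValue key='CodeCoverageAbsLTotal' value='{total}']")
print(f"##teamcity[buildStatisticValue key='CodeCoverageL' value='{pct:.2f}']")
```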
For your generated documentation you can also use a custom report page.
As for the unit test execution (both Jasmine (Karma?) and xUnit), each reports its numbers and the final test report shows them combined.