Run TestNG tests in random order - unit-testing

Similar to How can I make my JUnit tests run in random order?, I'd like TestNG to run my tests in random order, so that unintended dependencies cannot creep in.
The TestNG manual states:
By default, TestNG will run the tests found in your testng.xml file in
a random order.
However, I created a small test project, with a simple testng.xml:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "http://testng.org/testng-1.0.dtd" >
<suite name="My suite">
<test name="Simple test">
<packages>
<package name="testngtests"></package>
</packages>
</test>
</suite>
The package testngtests contains two test classes (MyTest1, MyTest2), and these contain a few empty methods like this:
@Test
public void testOne() {
}
The test methods are all empty and differ only by name.
When I run them (using the Eclipse TestNG runner, or on the command line), the tests consistently run in the same order (namely sorted alphabetically, first by class and then by method name).
So is the documentation wrong?
Or does "in random order" simply mean "there is no guaranteed order"? Then how can I make TestNG actively randomize test order?

Yes, 'random' should really be 'non-predictable'.
If you want true randomization, look up IMethodInterceptor, where TestNG will give you an opportunity to change the ordering to anything you like.

To expand on Cedric Beust's answer, here's my code for doing this, with some help from this example from GitHub:
import org.testng.IMethodInstance;
import org.testng.IMethodInterceptor;
import org.testng.ITestContext;

import java.util.Collections;
import java.util.List;
import java.util.Random;

public class TestOrderRandomizer implements IMethodInterceptor {

    @Override
    public List<IMethodInstance> intercept(List<IMethodInstance> methods, ITestContext context) {
        long seed = System.nanoTime();
        System.out.println(String.format("Randomizing order of tests with seed: %s", seed));
        Collections.shuffle(methods, new Random(seed));
        return methods;
    }
}
And to use it, add this before your test class:
import org.testng.annotations.Listeners;

@Listeners(TestOrderRandomizer.class)
public class TesterClass {
    ...
Printing the seed allows you to reproduce a run order by plugging in the same seed again.
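If you want to be able to replay a particular order, a variant of the interceptor can read the seed from a system property instead of always generating a fresh one. This is only a sketch of that idea; the property name test.order.seed is made up, not a TestNG setting:
import org.testng.IMethodInstance;
import org.testng.IMethodInterceptor;
import org.testng.ITestContext;

import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Random;

public class SeededOrderRandomizer implements IMethodInterceptor {

    @Override
    public List<IMethodInstance> intercept(List<IMethodInstance> methods, ITestContext context) {
        // Reuse a seed passed via -Dtest.order.seed=... to replay a failing order;
        // otherwise fall back to a fresh seed.
        long seed = Long.getLong("test.order.seed", System.nanoTime());
        System.out.println("Shuffling tests with seed: " + seed);
        List<IMethodInstance> shuffled = new ArrayList<>(methods);
        Collections.shuffle(shuffled, new Random(seed));
        return shuffled;
    }
}
A failing order can then be reproduced by rerunning with -Dtest.order.seed=<the seed printed by the failing run>.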

Related

How can I unit test my log messages using testng

We use TestNG as our testing framework. We also use Lombok @Log4j2 to instantiate our log objects. I need to test that some code logs certain messages under certain conditions.
I have seen examples using JUnit and Mockito, but I cannot find how to do it in TestNG. Switching to JUnit is not an option.
Edit
I have implemented a class (CaptureLogger) which extends AbstractLogger
import org.apache.logging.log4j.spi.AbstractLogger;
public class CaptureLogger extends AbstractLogger {
...
}
I am unable to hook it up to the logger for the class under test.
CaptureLogger customLogger = (CaptureLogger) LogManager.getLogger(MyClassUnderTest.class);
generates an error message:
java.lang.ClassCastException: org.apache.logging.log4j.core.Logger cannot be cast to CaptureLogger
I have found out that LogManager.getLogger is declared to return the Logger interface, and the actual object behind it is Log4j's own core Logger, not my CaptureLogger.
How can I create an instance of my CaptureLogger?
You can define your own appender like this:
package com.xyz;
import static java.util.Collections.synchronizedList;
import java.util.ArrayList;
import java.util.List;
import org.apache.logging.log4j.core.Appender;
import org.apache.logging.log4j.core.Filter;
import org.apache.logging.log4j.core.LogEvent;
import org.apache.logging.log4j.core.appender.AbstractAppender;
import org.apache.logging.log4j.core.config.plugins.Plugin;
import org.apache.logging.log4j.core.config.plugins.PluginAttribute;
import org.apache.logging.log4j.core.config.plugins.PluginElement;
import org.apache.logging.log4j.core.config.plugins.PluginFactory;
@Plugin(name = "LogsToListAppender", category = "Core", elementType = Appender.ELEMENT_TYPE)
public class LogsToListAppender extends AbstractAppender {

    private static final List<LogEvent> events = synchronizedList(new ArrayList<>());

    protected LogsToListAppender(String name, Filter filter) {
        super(name, filter, null);
    }

    @PluginFactory
    public static LogsToListAppender createAppender(@PluginAttribute("name") String name,
            @PluginElement("Filter") Filter filter) {
        return new LogsToListAppender(name, filter);
    }

    @Override
    public void append(LogEvent event) {
        events.add(event);
    }

    public static List<LogEvent> getEvents() {
        return events;
    }
}
Then create a file called log4j2-logstolist.xml in the root of the classpath where the appender will be referenced:
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="WARN" packages="com.xyz">
  <Appenders>
    <LogsToListAppender name="LogsToListAppender" />
  </Appenders>
  <Loggers>
    <Root level="TRACE">
      <AppenderRef ref="LogsToListAppender" />
    </Root>
  </Loggers>
</Configuration>
Take special care to set the packages="com.xyz" attribute (the package of your appender) properly, or the appender won't be available. For more information, check https://www.baeldung.com/log4j2-custom-appender
And finally, create the TestNG test:
package com.xyz;
import static org.testng.Assert.assertTrue;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.core.config.Configurator;
import org.testng.annotations.Test;
@Test
public class LogsTest {

    static {
        Configurator.initialize(null, "classpath:log4j2-logstolist.xml");
    }

    @Test
    public void testLogs() {
        // call your code that produces log, e.g.
        LogManager.getLogger(LogsTest.class).trace("Hello");
        assertTrue(LogsToListAppender.getEvents().size() > 0);
    }
}
As you can see, we force Log4j2 to use the custom configuration with Configurator.initialize(null, "classpath:log4j2-logstolist.xml"); when the class is initialized (in the static {} block).
Keep in mind that it is also useful to check the logger name, e.g. LogsToListAppender.getEvents().stream().filter(a -> CLASS_THAT_PRODUCES_LOG.class.getName().equals(a.getLoggerName())).collect(toList());
You can access the actual message using the LogEvent::getMessage() method.
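Combining those two hints, a small helper along these lines (the class and method names here are mine, not part of the appender above) returns the formatted messages logged by a given class:
import java.util.List;
import java.util.stream.Collectors;

import org.apache.logging.log4j.core.LogEvent;

public final class CapturedLogs {

    private CapturedLogs() {
    }

    // Returns the formatted messages that the given class logged, in order.
    public static List<String> messagesFor(Class<?> loggingClass) {
        return LogsToListAppender.getEvents().stream()
                .filter(event -> loggingClass.getName().equals(event.getLoggerName()))
                .map(event -> event.getMessage().getFormattedMessage())
                .collect(Collectors.toList());
    }
}
In the test above, an assertion like assertTrue(CapturedLogs.messagesFor(LogsTest.class).contains("Hello")); would then check the message text itself.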
As long as you're using Lombok for logger generation, you can't do much at the level of the source code itself with the given tools. For example, if you place the @Log4j2 annotation, it generates:
private static final org.apache.logging.log4j.Logger log = org.apache.logging.log4j.LogManager.getLogger(LogExample.class);
The compiled code already comes with this line.
You could try to mock the LogManager.getLogger method with PowerMockito, but I don't really like that kind of tool. I mention it, though, since it can be a viable direction.
There are a couple of ways to work with the framework itself.
One way (I'm not familiar with Log4j2 specifically, but it should offer this capability; I did something similar with Log4j 1.x many years ago) is to provide your own logger implementation and associate it with the logger factory at the level of the Log4j2 configuration.
If you do this, the code generated by Lombok will return your logger instance, which can memorize the messages logged at the different levels (that's the custom logic you'll have to implement at the level of the Logger).
The logger will then have a method public List<String> getResults(), and you'll call the following code during the verification phase:
public void test() {
    UnderTest objectUnderTest = ...
    // test test test

    // verification
    MyCustomLogger logger = (MyCustomLogger) LogManager.getLogger(UnderTest.class);
    List<String> results = logger.getResults();
    assertThat(results, contains("My Log Message with Params I expect or whatever"));
}
Another, somewhat similar way I can think of is to create a custom appender that will memorize all the messages that were sent during the test. Then you could (declaratively or programmatically) bind that appender to the Logger obtained by LogManager.getLogger for the class under test (or also for other classes, depending on your actual needs).
Then let the test run, and when it comes to verification, get a reference to the appender from the Log4j2 system and ask for the results with some public List<String> getResults() method, which must exist on the appender in addition to the methods it must implement in order to obey the Appender contract.
So the test could look something like this:
public void test() {
    MyTestAppender app = createMemorizingAppender();
    associateAppenderWithLoggerUnderTest(app, UnderTest.class);
    UnderTest underTest = ...
    // do your tests that involve logging operations

    // now the verification phase:
    List<String> results = app.getResults();
    assertThat(results, contains("My Log Message with Params I expect or whatever"));
}
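For the programmatic binding, here is a rough sketch of what the memorizing appender and the helper used in the pseudo-test above could look like with the Log4j2 core API; the class name MemorizingAppender and the attachTo method are invented for this sketch:
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.core.LogEvent;
import org.apache.logging.log4j.core.appender.AbstractAppender;

public class MemorizingAppender extends AbstractAppender {

    private final List<String> results = Collections.synchronizedList(new ArrayList<>());

    public MemorizingAppender() {
        super("MemorizingAppender", null, null);
    }

    @Override
    public void append(LogEvent event) {
        results.add(event.getMessage().getFormattedMessage());
    }

    public List<String> getResults() {
        return results;
    }

    // Attaches this appender to the core Logger behind the class under test.
    public static MemorizingAppender attachTo(Class<?> underTest) {
        MemorizingAppender appender = new MemorizingAppender();
        appender.start();
        org.apache.logging.log4j.core.Logger logger =
                (org.apache.logging.log4j.core.Logger) LogManager.getLogger(underTest);
        logger.addAppender(appender);
        return appender;
    }
}
A test would then call MemorizingAppender app = MemorizingAppender.attachTo(UnderTest.class); before exercising the code and read app.getResults() in the verification phase; keep in mind that the logger's configured level still decides which events reach the appender.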

Grails : Spock : Unit testing GORM domain class hooks

Usually I ended up writing test cases for a domain class only for its constraints and any custom methods (created by us in the application), since we know we shouldn't test the obvious.
But once we started using the coverage plugin, we found that our domains' lines of code were not fully covered, which was due to the GORM hooks (beforeInsert, beforeUpdate) that we never wrote test cases for.
Is there a way we can test these? One possible way that seems obvious but not suitable is to call another method (containing all the code which was earlier in the hooks) from within these hooks, test only that method, and not worry about the hooks themselves.
Any solutions...
Edit
Sample code in domain that I want to unit-test:
class TestDomain {
    String activationDate

    def beforeInsert() {
        this.activationDate = (this.activationDate) ?: new Date() // first login date would come here though
        encodePassword()
    }
}
How can I unit-test beforeInsert, or would I end up writing an integration test case?
Perhaps a unit test like:
import grails.test.mixin.TestFor
import spock.lang.Specification

@TestFor(TestDomain)
class TestDomainSpec extends Specification {

    def "test beforeInsert"() {
        given:
        mockForConstraintsTests(TestDomain)

        when:
        def testDomain = new TestDomain().save(flush: true)

        then:
        testDomain.activationDate != null
    }
}

How to run only selected test cases of a class in Selenium Automation testing project?

I have prepared a testng.xml file in which I put a number of test classes to run, for example:
<!DOCTYPE suite SYSTEM "http://testng.org/testng-1.0.dtd">
<!-- REMOTE PROJECT -->
<suite name="Suite1" preserve-order="true">
  <test name="Test1">
    <parameter name="browsers" value="Chrome"></parameter>
    <classes>
      <class name="com.project.live.Class1" />
      <class name="com.project.live.Class2" />
      <class name="com.project.live.Class3" />
      ...
      ...
      ...
      <class name="com.project.live.Class...Nth" />
    </classes>
  </test>
  <!-- Test -->
</suite> <!-- Suite -->
There are test cases in these classes with the @Test annotation. I want to run selected test cases only, i.e. I will skip some tests of these classes.
1. One way to do this is to put the @Ignore annotation on, and remove the @Test annotation from, the tests which I don't want to run (but that's lengthy, very time-consuming work).
2. Another way is to use groups, but again it is lengthy to select tests and put them in groups.
Query: Is there any optimal way to achieve this, maybe some customized config file?
One way can be to use IAnnotationTransformer.
Put the list of cases to exclude (or include, whichever list is shorter) in another file. Implement the transform method to check whether the current method falls in this exclude list; if it does, set the enabled property of the annotation to false and the test will be excluded.
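A minimal sketch of that idea, assuming the exclude list is passed in a system property named excluded.tests (an invented name; reading it from a file would work the same way). Note that an IAnnotationTransformer has to be registered in testng.xml or on the command line, not via @Listeners:
import java.lang.reflect.Constructor;
import java.lang.reflect.Method;
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

import org.testng.IAnnotationTransformer;
import org.testng.annotations.ITestAnnotation;

public class ExcludeListTransformer implements IAnnotationTransformer {

    // Comma-separated method names to skip, e.g. -Dexcluded.tests=testCase1,testCase7
    private static final Set<String> EXCLUDED = new HashSet<>(
            Arrays.asList(System.getProperty("excluded.tests", "").split(",")));

    @Override
    public void transform(ITestAnnotation annotation, Class testClass,
                          Constructor testConstructor, Method testMethod) {
        if (testMethod != null && EXCLUDED.contains(testMethod.getName())) {
            // Disabling the annotation makes TestNG skip this method.
            annotation.setEnabled(false);
        }
    }
}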
Could you try IMethodInterceptor for this purpose?
Configure your test class with a listener implementing IMethodInterceptor and decide the tests to be run dynamically.
A sample listener is below.
import java.util.ArrayList;
import java.util.List;

import org.testng.IMethodInstance;
import org.testng.IMethodInterceptor;
import org.testng.ITestContext;

public class MyTestListener implements IMethodInterceptor {

    @Override
    public List<IMethodInstance> intercept(List<IMethodInstance> methods, ITestContext context) {
        List<IMethodInstance> methodlist = new ArrayList<IMethodInstance>();
        // Read these flags from somewhere: system var / test context / file or
        // wherever
        Boolean shouldRunTest1 = Boolean.valueOf(context.getAttribute("SHOULD_RUN_TEST1").toString());
        Boolean shouldRunTest2 = Boolean.valueOf(context.getAttribute("SHOULD_RUN_TEST2").toString());
        for (IMethodInstance iMethodInstance : methods) {
            // decide based on method name / group / priority / description or
            // whatever
            String methodName = iMethodInstance.getMethod().getMethodName();
            if (iMethodInstance.getMethod().isTest()) {
                if (shouldRunTest1 && methodName.equals("testCase1")) {
                    methodlist.add(iMethodInstance);
                } else if (shouldRunTest2 && methodName.equals("testCase2")) {
                    methodlist.add(iMethodInstance);
                }
            }
        }
        // Here we return the test cases to be run
        return methodlist;
    }
}
See the example below. When you want to run only "Sanity3", you can comment out the other includes (Sanity0 ... Sanity2) and run using the XML file (right-click and Run As -> TestNG).
E.g.:
<run>
  <!-- <include name="Sanity0"/>
  <include name="Sanity1"/>
  <include name="Sanity2"/> -->
  <include name="Sanity3"/>
</run>

TestNG #Factory and group-by-instances and preserve-order

I'm trying to use TestNG with @Factory for the first time, and it doesn't work for me. I'll tell you why.
I have a class called Extend in which I have some tests: "launch site", "login", "check if the user is in his own dashboard", and so on. I want the order of these tests to always be the same for every set of data passed from the factory: "launch site" >> "login" >> "check user is in his dashboard" >> "logout", OK? So I have the following extend.xml file and classes:
<suite name="ExtendFactory" group-by-instances="true">
<test name="Factory" preserve-order="true" group-by-instances="true">
<classes>
<class name="net.whaooo.ExtendFactory">
<methods>
<include name="launchSite"></include>
<include name="loginTest" />
<include name="userIsInHisOwnDashboardTest" />
<include name="logoutTest" />
</methods>
</class>
</classes>
</test>
</suite>
Extend class:
public class Extend extends BaseTest {

    protected static FirefoxDriver driver;
    private String a_driver;
    private String password;

    public Extend(String a_driver, String pwd) {
        this.a_driver = a_driver;
        this.password = pwd;
    }

    @BeforeTest
    public void stDriver() {
        DesiredCapabilities caps = DesiredCapabilities.firefox();
        caps.setCapability(CapabilityType.ForSeleniumServer.ENSURING_CLEAN_SESSION, true);
        driver = new FirefoxDriver(caps);
        driver.manage().timeouts().implicitlyWait(30, TimeUnit.SECONDS);
    }

    @AfterTest
    public void stopDriver() {
        driver.close();
    }

    @Test
    public void launch() {
        launchSite(driver);
    }

    @Test(description = "Enter a valid login as driver")
    public void loginTest() {
        login(driver, a_driver, password);
    }

    @Test(description = "Check the driver is in his own dashboard")
    public void userIsInHisOwnDashboardTest() {
        userIsInHisOwnDashboardTest(driver, a_driver, password);
    }

    @Test(description = "logout")
    public void logout() {
        logoutTest(driver);
    }
}
Simplified Factory:
public class ExtendFactory {

    @Factory
    public Object[] createInstances() {
        Object[] result = new Object[2];
        result[0] = new Extend("test1@test.com", "tester");
        result[1] = new Extend("test2@test.com", "tester");
        return result;
    }
}
But my problem is that the order in which the tests are launched doesn't follow the one specified in the XML file, even though I added preserve-order="true" and group-by-instances="true"; I also tried order-by-instances="true". Can anyone help me?
I see many issues... First of all, @Factory with group-by-instances="true" messes up the whole test (it executes just one instance and only the non-dependent methods).
@Factory works without group-by-instances, but it executes all non-dependent methods first, irrespective of the number of instances. E.g. class A { @Test public void a() {} @Test(dependsOnMethods = "a") public void b() {} } together with an @Factory that returns two instances: the execution order is ref1.a, ref2.a, ref1.b, ref2.b. This has a serious issue: say class A uses a large amount of memory, then it will surely run out before executing everything.
PS: not sure if it is an Eclipse issue. I am using TestNG 6.8.1.
PS2: it seems like TestNG aims at this kind of regression setup, but it is still not there, nor is its own mechanism (@Factory) supported by its other features (like @Listeners, which will only read @Parameters, which @Factory cannot set) or by third parties.
I think what you need to use is dependsOnMethods in your test cases, because with the flow that you mention, if the first method doesn't execute, there is no point in executing the second test case; i.e. if "launch site" fails, there's no need to execute "login". This would also ensure the order of execution. Check out Dependent Methods.
I've been using the
@Test(dependsOnMethods = "TestName")
annotation, where "TestName" is the prerequisite test to run. So your login test should have the following annotation:
@Test(dependsOnMethods = "launchSite")
I'm running 9 tests in a row, and since adding dependsOnMethods, all have run in order with no issue.
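Applied to the class from the question, the dependency chain could look like the bare sketch below (Selenium calls omitted, class name invented). Combined with group-by-instances="true" in the suite, each factory-created instance should then run its methods in this order, though behaviour reportedly varies between TestNG versions (see the earlier answer about group-by-instances):
import org.testng.annotations.Test;

public class ExtendOrderSketch {

    @Test
    public void launchSite() {
        // launch the site
    }

    @Test(dependsOnMethods = "launchSite")
    public void loginTest() {
        // log in
    }

    @Test(dependsOnMethods = "loginTest")
    public void userIsInHisOwnDashboardTest() {
        // check the dashboard
    }

    @Test(dependsOnMethods = "userIsInHisOwnDashboardTest")
    public void logoutTest() {
        // log out
    }
}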
Thank you for your answer. I ended up using @Factory, specifying order-by-instances="true", and then inserting the dependencies in the dynamically created objects!
Using depends in the test class file is not a solution, as the functions which are not dependent on any other functions are still executed randomly.
I need to execute the test cases in the order which I have mentioned. This can be achieved using "preserve-order" when executing with TestNG, but it fails when grouping is used.
If anyone can help with this concern, please reply.

How to run scala tests with junit 4?

I can't run the following code with IDEA:
@Test
class CompanyURLTest extends Assert {
    @Test
    def test = assert(false);
}
It runs, but JUnit says that there are no tests to run.
I generally use ScalaTest in combination with the JUnit 4 runner so that Maven sees and executes my tests. I like the Spec/FlatSpec/WordSpec semantics for organizing tests. I'm experimenting with the ShouldMatchers, but I have used JUnit for so long that asserts just seem a bit more natural to me.
Here's an example:
import org.junit.runner.RunWith
import org.scalatest.junit.JUnitRunner
import org.scalatest.FlatSpec
import org.scalatest.matchers.ShouldMatchers
@RunWith(classOf[JUnitRunner])
class BlogFeedHandlerTest extends FlatSpec with ShouldMatchers with Logging {

  "the thingy" should "do what I expect it to do" in {
    val someValue = false;
    assert(someValue === false)
  }
}
The ScalaTest docs are at http://www.scalatest.org/
The following works for me:
import org.junit._
import Assert._

class MyTest {
  @Test
  def test = assert(false)
}
The @org.junit.Test annotation is only applicable to methods: @Target({ElementType.METHOD}).
Also keep in mind that test methods must return Unit.
import org.junit.Assert._
import org.junit.Test

class CompanyURLTest {
  @Test def test = assertFalse(false)
}
Even though it's late: the Gradle site surprisingly has a lot of good tutorials showing how to support Scala tests.
https://guides.gradle.org/building-scala-libraries/#review_the_generated_project_files