How to update test case results in bulk in TestLink - testlink

I've recently started using the TestLink test management tool in my project, and I'm facing a new problem with bulk updating of test cases in TestLink.
This is not a problem for manual test cases, but for automation it is tedious to update the result (Pass or Fail) of every single test case that you executed.
I have around 5000+ test cases and 50% of them are automated.
So, when the automation script has finished executing 2500+ test cases for a particular release, I need to manually update the result of all these test cases in TestLink as Pass or Fail.
I also tried to link the automation tool with TestLink, but that didn't work out.
So, I just want to know if there's any easy way to bulk update the test cases in TestLink. Maybe using some DB queries?

I think it's difficult because you need to know several pieces of data in order to insert the correct information for your project, test plan, build, test suite, test case and test case version.
Anyway, the table to insert the information into is "executions"; with the following query you can check the needed information:
select * from executions;
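For illustration only, an insert would look something like the sketch below. The column names are taken from a TestLink 1.9-era schema and the literal values are placeholders; verify everything against your own install before trying this, and prefer the API where possible.

```sql
-- Hypothetical sketch: mark one test case version as passed ('p') in one build.
-- Every ID here must first be looked up from the builds, testplans and
-- tcversions tables of your own database.
INSERT INTO executions
    (build_id, tester_id, execution_ts, status, testplan_id, tcversion_id, tcversion_number, notes)
VALUES
    (1, 1, NOW(), 'p', 1, 1, 1, 'bulk update from automation');
```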
Regards, David.

1) You can use the XML-RPC API.
2) Generate an XML file in the "Format for importing results", then upload it manually.
Avoid using direct access to the DB via SQL.
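For option 2, a rough sketch of what such a results file can look like is below. The exact format depends on your TestLink version, so check "Format for importing results" in your install; the external IDs and notes here are placeholders.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<results>
  <testcase external_id="PROJ-1">
    <result>p</result>
    <notes>Passed by automation run</notes>
  </testcase>
  <testcase external_id="PROJ-2">
    <result>f</result>
    <notes>Failed by automation run</notes>
  </testcase>
</results>
```

TestLink uses single-letter result codes: p (passed), f (failed), b (blocked).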

I am also updating TestLink using Selenium WebDriver.
The code is below:
public class appFunctions extends Keywords {
    // Substitute your Dev Key here
    public static String DEV_KEY = "1eab09b6158d9df31e76142b85253243";
    public static String SERVER_URL = "https://testlink.fondsdepotbank.de/testlink/lib/api/xmlrpc/v1/xmlrpc.php";

    public static void clearXLResults() throws IOException {
        try {
            int testStepsRow = DriverScript.xlObj.getRowCount("TestSteps");
            int controllerRow = DriverScript.xlObj.getRowCount("Controller");
            // Clear previous results
            for (int i = 2; i <= testStepsRow; i++) {
                DriverScript.xlObj.setCellData("TestSteps", DriverScript.testStepsStatusCol, i, "");
            }
            for (int j = 2; j <= controllerRow; j++) {
                DriverScript.xlObj.setCellData("Controller", DriverScript.controllerStatusCol, j, "");
            }
        } catch (Exception e) {
            e.printStackTrace();
            log.writeLog("Unable to clear previous test results in excel");
        }
    }

    public static void updateResultsTestLink() throws IOException {
        try {
            TestLinkAPIClient api = new TestLinkAPIClient(DEV_KEY, SERVER_URL);
            String result;
            // Read controller status
            int controllerRow = DriverScript.xlObj.getRowCount("Controller");
            for (int k = 2; k <= controllerRow; k++) {
                String currentRowStatus = DriverScript.xlObj.getCellData("Controller", DriverScript.controllerStatusCol, k);
                String currentRowProject = DriverScript.xlObj.getCellData("Controller", DriverScript.controllerProjectCol, k);
                String currentRowPlan = DriverScript.xlObj.getCellData("Controller", DriverScript.controllerPlanCol, k);
                String currentRowBuild = DriverScript.xlObj.getCellData("Controller", DriverScript.controllerBuildCol, k);
                String currentRowTCID = DriverScript.xlObj.getCellData("Controller", DriverScript.controllerTCIDCol, k);
                if (currentRowStatus.equalsIgnoreCase("pass")) {
                    result = TestLinkAPIResults.TEST_PASSED;
                    api.reportTestCaseResult(currentRowProject, currentRowPlan, currentRowTCID, currentRowBuild, null, result);
                }
                if (currentRowStatus.equalsIgnoreCase("fail")) {
                    result = TestLinkAPIResults.TEST_FAILED;
                    api.reportTestCaseResult(currentRowProject, currentRowPlan, currentRowTCID, currentRowBuild, null, result);
                }
            }
        } catch (Exception e) {
            e.printStackTrace();
            log.writeLog("Unable to update results in Testlink");
        }
    }
}

Related

Unable to import Google Test metrics in Sonarqube

I am using TFS 2015 (update 2), C++, Google Test and SonarQube 5.6 (with the Cxx community plugin).
I am able to import the coverage, compute duplication, and create issues using cppcheck, but the number of tests is not imported into SonarQube.
I need to generate a JUnit-like XML file using <test executable> --gtest_output=xml:<filename>, but in TFS (vNext) I use the VSTestTask, which uses vstest.console.exe to run my *Test.exe, and there seems to be no way to output XML (it defaults to .trx).
Has anyone managed to correctly import GTest test metrics into SonarQube? Is an XSLT to transform from trx to xunit the only way...?
Maybe I need to properly fill in sonar.cxx.vstest.reportsPaths, but the filename of the trx is dynamically set by vstest.console.exe...
Thanks,
Jon
I know this topic is a little bit old, but I had the same problem as you importing a gtest test report into SonarQube.
I've finally found a converter that transforms a gtest test report to the Generic Format. It also supports the JUnit XML format, but I did not test that myself. The original script was made by another guy, bloodle, but I forked his repository to migrate it to Python 3. All the thanks go to him.
The simplest way is converting the test result to XML format. After that you can just use the default import functionality. To achieve this, use CoverageConverter.exe with the code below.
class Program
{
    static int Main(string[] args)
    {
        if (args.Length != 2)
        {
            Console.WriteLine("CoverageConverter - reads VSTest binary code coverage data, and outputs it in XML format.");
            Console.WriteLine("Usage: CoverageConverter <sourcefile> <destinationfile>");
            return 1;
        }
        CoverageInfo info;
        string path;
        try
        {
            path = System.IO.Path.GetDirectoryName(args[0]);
            info = CoverageInfo.CreateFromFile(args[0], new string[] { path }, new string[] { });
        }
        catch (Exception e)
        {
            Console.WriteLine("Error opening coverage data: {0}", e.Message);
            return 1;
        }
        CoverageDS data = info.BuildDataSet();
        try
        {
            data.WriteXml(args[1]);
        }
        catch (Exception e)
        {
            Console.WriteLine("Error writing to output file: {0}", e.Message);
            return 1;
        }
        return 0;
    }
}
For more detailed info and other approaches, please refer to "Publishing vstest results?" & the MSTest Plugin.
I just put **/TestResults/*.trx in Visual Studio Test Reports Paths (sonar.cxx.vstest.reportsPaths) and now it is being loaded correctly... go figure.

Owin TestServer logs multiple times while testing - how can I fix this?

I'm trying to write unit tests for an OWIN service, but any log statements in my tests start duplicating once I run the tests all at once and really make the log output on the build server useless due to all the noise. I've distilled the problem down to a very simple repro:
[TestFixture]
public class ServerTest
{
    [Test]
    public void LogOnce()
    {
        using (TestServer.Create(app => { }))
        {
            Debug.WriteLine("Log once");
        }
    }

    [Test]
    public void LogTwice()
    {
        using (TestServer.Create(app => { }))
        {
            Debug.WriteLine("Log twice");
        }
    }
}
If I run one test at a time I get the expected output:
=> ServerTest.LogOnce
Log once
=> ServerTest.LogTwice
Log twice
If I run the tests all at once I get:
=> ServerTest.LogOnce
Log once
=> ServerTest.LogTwice
Log twice
Log twice
Initializing the TestServer once will solve the problem, but I am looking for a solution that allows me to continue instantiating as many TestServer instances as I choose.
This post points out how HostingEngine is defaulting on the TraceListener and ways to disable this:
TraceListener in OWIN Self Hosting
With that insight, I traced through the source code of TestServer.Create and confirmed that it is internally creating a HostingEngine which turns on a TraceListener that ultimately outputs results to the console. I have confirmed the highest voted (at the time of this writing) fix on that page works for the TestServer and believe the other solutions there are also excellent choices.
It was very time consuming and annoying having to figure this out. It is difficult to discover and non-obvious to wire up an opt-out. An opt-in solution would be better.

How can I unit test a method with database access?

I have already had some difficulties with unit tests, and I am trying to learn them using a small project I am currently working on. I ran into these two problems that I hope you can help me with:
1- My project is an MVC project. At which level should my unit tests start? Should they focus only on the business layer? Should they also test actions on my controllers?
2- I have a method that verifies a username's format and then accesses the DB to check if it is available for use. The return is a boolean indicating whether the username is available or not.
Would one create a unit test for such a method?
I would be interested in testing the format verification, but how would I check it without querying the DB? Also, if the format is correct but the username is already in use, I will get a false value, even though the validation worked. I could decouple this method, but the DB verification should only happen if the format is correct, so they should somehow be tied together.
How would someone with unit testing knowledge solve this issue? Or how would someone refactor this method to be able to test it?
I could create a stub for the DB access, but how would I attach it to my project when testing and detach it when running locally?
Thanks!
In your specific case, one easy thing you could do is decompose your verification method into 3 different methods: one to check formatting, one to check DB availability, and one to tie them both together. This would allow you to test each of the sub-functions in isolation.
In more complex scenarios, other techniques may be useful. In essence, this is where dependency injection and inversion of control come in handy (unfortunately, those phrases mean different things to different people, but getting the basic ideas is usually a good start).
Your goal should be to decouple the concept of "Check if this username is available" from the implementation of checking the DB for it.
So, instead of this:
public class Validation
{
    public bool CheckUsername(string username)
    {
        bool isFormatValid = IsFormatValid(username);
        return isFormatValid && DB.CheckUsernameAvailability(username);
    }
}
You could do something like this:
public class Validation
{
    public bool CheckUsername(string username,
                              IUsernameAvailabilityChecker checker)
    {
        bool isFormatValid = IsFormatValid(username);
        return isFormatValid && checker.CheckUsernameAvailability(username);
    }
}
And then, from your unit test code, you can create a custom IUsernameAvailabilityChecker which does whatever you want for testing purposes. On the other hand, the actual production code can use a different implementation of IUsernameAvailabilityChecker to actually query the database.
Keep in mind that there are many, many techniques to solve this kind of testing problem, and the examples I gave are simple and contrived.
Testing against outside services can be done using mocking. If you've done a good job using interfaces, it's very easy to mock various parts of your application. These mocks can be injected into the unit under test and used as the real implementation would be.
You should start unit testing as soon as possible. If your application is not complete, or code needed for testing is absent, you can still test against some interface you can mock.
On a side note: unit testing is about testing behavior and is not an effective way to find bugs. You will find bugs while testing, but that should not be your goal.
For instance:
interface UserService {
    public void setUserRepository(UserRepository userRepository);
    public boolean isUsernameAvailable(String username);
}

class MyUserService implements UserService {
    private UserRepository userRepository;

    public void setUserRepository(UserRepository userRepository) {
        this.userRepository = userRepository;
    }

    public boolean isUsernameAvailable(String username) {
        return userRepository.checkUsernameAvailability(username);
    }
}

interface UserRepository {
    public boolean checkUsernameAvailability(String username);
}

// The mock used for testing
class MockUserRepository implements UserRepository {
    public boolean checkUsernameAvailability(String username) {
        if ("john".equals(username)) {
            return false;
        }
        return true;
    }
}

class MyUnitTest {
    public void testIfUserNotAvailable() {
        UserService service = new MyUserService();
        service.setUserRepository(new MockUserRepository());
        assertFalse(service.isUsernameAvailable("john")); // yep, it's in use
    }

    public void testIfUserAvailable() {
        UserService service = new MyUserService();
        service.setUserRepository(new MockUserRepository());
        assertTrue(service.isUsernameAvailable("mary")); // yep, it's available
    }
}

EJB repository testing with OpenEJB - how to rollback changes

I'm trying to test my EJB-based repositories using OpenEJB. Every time a new unit test is run, I'd like to have my DB in an "initial" state. After the test, all changes should be rolled back (no matter whether the test succeeded or not). How can I accomplish this in a simple way? I tried using UserTransaction - beginning it when the test starts and rolling back the changes when it finishes (as you can see below). I don't know why, but with this code all changes in the DB (which were done during the unit test) remain even after the line rolling the changes back has been executed.
As I wrote, I'd like to accomplish it in the simplest way, without any external DB schema and so on.
Thanks in advance for any hints!
Piotr
public class MyRepositoryTest {
    private Context initialContext;
    private UserTransaction tx;
    private MyRepository repository; // class under test

    @Before
    public void setUp() throws Exception {
        this.initialContext = OpenEjbContextFactory.getInitialContext();
        this.repository = (MyRepository) initialContext.lookup(
                "MyRepositoryLocal");
        TransactionManager tm = (TransactionManager) initialContext.lookup(
                "java:comp/TransactionManager");
        tx = new CoreUserTransaction(tm);
        tx.begin();
    }

    @After
    public void tearDown() throws Exception {
        tx.rollback();
        this.initialContext = null;
    }

    @Test
    public void test() throws Exception {
        // do some test stuff
    }
}
There's an example called 'transaction-rollback' in the examples zip for 3.1.4.
Check it out, as it shows several ways to roll back in a unit test. One of the techniques includes a trick to get a new in-memory database for each test.
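A sketch of the in-memory trick, assuming you use OpenEJB's "new://Resource?type=DataSource" configuration convention with an HSQLDB driver on the classpath (the resource name "testDb" is illustrative): point the DataSource at a uniquely named jdbc:hsqldb:mem: URL, so every test starts from an empty database.

```java
import java.util.Properties;

public class FreshDbConfig {
    // Builds OpenEJB container properties defining an in-memory DataSource.
    // A unique mem: database name per test means each test sees a brand-new,
    // empty database, so no cross-test state survives.
    public static Properties freshDbProperties(String dbName) {
        Properties p = new Properties();
        p.put("testDb", "new://Resource?type=DataSource");
        p.put("testDb.JdbcDriver", "org.hsqldb.jdbcDriver");
        p.put("testDb.JdbcUrl", "jdbc:hsqldb:mem:" + dbName);
        return p;
    }

    public static void main(String[] args) {
        // In a real test you would pass these to new InitialContext(p) in setUp()
        Properties p = freshDbProperties("test" + System.nanoTime());
        System.out.println(p.getProperty("testDb"));
        System.out.println(p.getProperty("testDb.JdbcUrl").startsWith("jdbc:hsqldb:mem:"));
    }
}
```

With this approach there is nothing to roll back at all; tearDown() can simply drop the context.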

SQLite database verification after nhibernate schema generation

What is the simplest, most effective way to verify that your SQLite db is actually out there after using NHibernate's schema generation tool?
Cheers,
Berryl
EDIT
I am hoping there is something tied to the ISession (like the Connection property) that can be tested; sometimes when running a series of tests the session seems fine (IsOpen & IsConnected are true), but the db is not there (a query against it fails with an error like 'no such table').
EDIT - WHAT I AM DOING NOW
Connection string & other cfg properties
public static Configuration GetSQLiteConfig()
{
return new Configuration()
.SetProperty(ENV.Dialect, typeof (SQLiteDialect).AssemblyQualifiedName)
.SetProperty(ENV.ConnectionDriver, typeof (SQLite20Driver).AssemblyQualifiedName)
.SetProperty(ENV.ConnectionString, "Data Source=:memory:;Version=3;New=True;Pooling=True;Max Pool Size=1")
.SetProperty(ENV.ProxyFactoryFactoryClass, typeof (ProxyFactoryFactory).AssemblyQualifiedName)
.SetProperty(ENV.ReleaseConnections, "on_close")
.SetProperty(ENV.CurrentSessionContextClass, typeof (ThreadStaticSessionContext).AssemblyQualifiedName);
}
How I test the db now, for lack of something 'better' (this tests the mappings)
public static void VerifyAllMappings(ISessionFactory sessionFactory, ISession session)
{
Check.RequireNotNull<ISessionFactory>(sessionFactory);
Check.Require(session.IsOpen && session.IsConnected);
_verifyMappings(sessionFactory, session);
}
private static void _verifyMappings(ISessionFactory sessionFactory, ISession session) {
try {
foreach (var entry in sessionFactory.GetAllClassMetadata())
{
session.CreateCriteria(entry.Value.GetMappedClass(EntityMode.Poco))
.SetMaxResults(0).List();
}
}
catch (Exception ex) {
Console.WriteLine(ex);
throw;
}
}
public static void VerifyAllMappings(ISessionFactory sessionFactory, ISession session)
{
Check.Require(!sessionFactory.IsClosed);
Check.Require(session.IsOpen && session.IsConnected);
try {
foreach (var entry in sessionFactory.GetAllClassMetadata())
{
session.CreateCriteria(entry.Value.GetMappedClass(EntityMode.Poco))
.SetMaxResults(0).List();
}
}
catch (Exception ex) {
Debug.WriteLine(ex);
throw;
}
}
I generate the schema in a session provider whenever a new session is opened:
public ISession Session
{
get
{
var session = (ISession)CallContext.GetData(_lookupSessionKey);
try
{
if (session == null)
{
_log.Debug("Opening new Session for this context.");
session = FactoryContext.Factory.OpenSession();
if(RunTypeBehaviorQualifier != RunType.Production)
SchemaManager.GenerateNewDb(FactoryContext.Cfg, session.Connection);
CallContext.SetData(_lookupSessionKey, session);
}
}
catch (HibernateException ex)
{
throw new InfrastructureException(ex);
}
return session;
}
}
Now this is all probably way over-engineered, but I need multiple database connections and I've been having trouble keeping it simple and working. It's also a lot of info for one question, but maybe someone else has actually got this all down to a science. The test below runs fine within its own test fixture, but not in conjunction with other tests.
[Test]
public void Schema_CanGenerateNewDbWithSchemaApplied()
{
DbMappingTestHelpers.VerifyAllMappings(_dbContext.FactoryContext.Factory, _dbContext.Session);
}
Berryl,
As far as I can see, you're struggling with mapped entities because you are using different connections. Is there any requirement that obligates you to use more than one "real" DB connection? I mean, can your tests share the same session (logically)? If not, you can simply configure your DB as:
<property name="connection.connection_string">Data Source=NonTransactionalDB.txt;Version=3;New=True;Pooling=True;Max Pool Size=1;</property>
The important part of it is the pooling options. As every session will always use the same connection, you won't have problems with recreating the schema every time.
It's important to remember, though, that this introduces some limitations regarding transactions. As SQLite can't handle more than one transaction per connection, running your tests in parallel can bring you problems (something like a "database file is locked" exception).
Cheers,
Filipe
Berryl, just to make it easier to visualize, I'll post this as another answer. Feel free to give me another up-vote if it helps you. :)
Below is the code that I use to check if my NH configuration object was properly configured.
// assert: verify some properties just to see if connection properties were applied and entities mapped
Assert.AreEqual<string>(cfg.Properties["connection.connection_string"], @"Server=localhost;Initial Catalog=MoveFrameworkDataNHibernate;User Id=sa;Password=sa");
Assert.AreEqual<string>(cfg.Properties["dialect"], "NHibernate.Dialect.MsSql2000Dialect");
Assert.IsNotNull(cfg.GetClassMapping(typeof(MappedEntity)));
Sincerely, I don't feel entirely safe either that the DB is available by checking the configuration object, but it's a way to know: yeah, my entities are there and I'm pointing to the right DB.
I understand that you are afraid of using a second SQLite connection when the DB was exposed in a previous one, so you would get undesired exceptions, but as far as I can see, the only other option to check whether your entities are there would be something like the code below. As it refers to the SessionFactory, though, it helps nothing more than the previous option.
tx.Session.SessionFactory.GetClassMetadata(typeof(MappedEntity)) != null
The last option I can think of, in this case, would be to execute SQL directly against your DB with an EXISTS check. I don't know how agnostic the EXISTS command is across all DB implementations, but for a simple check like we're talking about here it shouldn't be a big problem.
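For SQLite specifically, an existence check can query the built-in sqlite_master catalog (a sketch; 'MappedEntity' is a placeholder for one of your mapped table names):

```sql
-- Returns one row if the table exists, zero rows otherwise
SELECT name FROM sqlite_master WHERE type = 'table' AND name = 'MappedEntity';
```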
Hope this helps!
BTW: it's jfneis. Neis is a surname. Nothing to do with fries, French fries or anything like that. :)
Cheers.
Filipe