Quarkus tests with an equivalent of Spring's @Sql annotation - unit-testing

I'm trying to move a Spring Boot app to Quarkus. In our tests we have @Sql annotations that run before / after a particular @Test method. Is there anything that resembles this functionality in Quarkus? All I can find are examples of creating the whole database before the tests, but what I want is to be able to insert / delete records to / from the db before / after each test method.

At the moment, we don't have that feature in Quarkus.
I wrote a JUnit 5 extension which uses Flyway to do exactly that. Please check: https://github.com/radcortez/flyway-junit5-extensions and the Quarkus example: https://github.com/radcortez/flyway-junit5-extensions/tree/master/examples/quarkus
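Until something like that is built in, one hedged workaround sketch (this is not the extension above, just plain JUnit 5 callbacks with an injected datasource; the table, class and column names are made up for illustration):

    import javax.inject.Inject;
    import javax.sql.DataSource;
    import java.sql.Connection;
    import java.sql.Statement;
    import io.quarkus.test.junit.QuarkusTest;
    import org.junit.jupiter.api.AfterEach;
    import org.junit.jupiter.api.BeforeEach;
    import org.junit.jupiter.api.Test;

    @QuarkusTest
    public class UserRepositoryTest {

        @Inject
        DataSource dataSource; // the default Quarkus (Agroal) datasource

        @BeforeEach
        void insertFixtures() throws Exception {
            execute("INSERT INTO users(id, name) VALUES (1, 'alice')");
        }

        @AfterEach
        void cleanUp() throws Exception {
            execute("DELETE FROM users");
        }

        private void execute(String sql) throws Exception {
            // runs a single statement against the test database
            try (Connection c = dataSource.getConnection();
                 Statement s = c.createStatement()) {
                s.execute(sql);
            }
        }

        @Test
        void findsInsertedUser() {
            // call the repository under test here
        }
    }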

Related

Testing Spark: how to create a clean environment for each test

When testing my Apache Spark application, I want to do some integration tests. For that reason I create a local Spark application (with Hive support enabled), in which the tests are executed.
How can I achieve that, after each test, the Derby metastore is cleared, so that the next test has a clean environment again?
What I don't want to do is restart the Spark application after each test.
Are there any best practices to achieve what I want?
I think that introducing application-level logic for integration testing somewhat breaks the concept of integration testing.
From my point of view, the correct approach is to restart the application for each test.
Anyway, I believe another option is to start/stop the SparkContext for each test. It should clean up any relevant state; a sketch follows below.
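For illustration, a minimal per-test start/stop sketch (using SparkSession rather than a bare SparkContext, and assuming the spark-hive dependency is on the test classpath; names are placeholders):

    import org.apache.spark.sql.SparkSession;
    import org.junit.jupiter.api.AfterEach;
    import org.junit.jupiter.api.BeforeEach;

    public abstract class SparkIntegrationTestBase {

        protected SparkSession spark;

        @BeforeEach
        void startSpark() {
            // fresh local session with Hive support for every test
            spark = SparkSession.builder()
                    .master("local[2]")
                    .appName("integration-test")
                    .enableHiveSupport()
                    .getOrCreate();
        }

        @AfterEach
        void stopSpark() {
            // stopping the session also stops the underlying SparkContext
            spark.stop();
        }
    }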
UPDATE - answer to comments
Maybe it's possible to do a cleanup by deleting tables/files?
I would ask a more general question - what do you want to test with your test?
In software development there are unit tests and integration tests, and nothing in between. If you want to do something that is neither an integration test nor a unit test, then you're doing something wrong. Specifically, with your test you are trying to test something that is already tested.
For the difference and general idea of unit and integration tests you can read here.
I suggest you rethink your testing and, depending on what you want to test, write either an integration or a unit test. For example:
To test application logic - unit test
To test that your application works in its environment - integration test. But here you shouldn't test WHAT is stored in Hive, only THAT the storage happened, because WHAT is stored should be covered by a unit test.
So, the conclusion:
I believe you need integration tests to achieve your goals, and the best way to do that is to restart your application for each integration test, because:
In real life your application will be started and stopped
In addition to your Spark stuff, you need to make sure that all the objects in your code are correctly deleted/reused. Singletons, persistent objects, configurations - they can all interfere with your tests.
Finally, for the code that performs the integration tests - where is the guarantee that it will not break production logic at some point?

Do I need to regression test existing methods on a Web service if new methods are added

We have a SOAP web service that is in production and contains a large number of methods. As part of a project we are adding new methods to that web service, note we are not amending the existing methods.
What I am trying to determine is whether I need to regression test the existing methods to check whether they have been impacted by adding the new methods.
Yes - if you change your web service, the only proper way to make sure none of the changes have impacted existing operations is a regression test.
If you use a testing tool like SoapUI you can automate this for every build you make. (Regression) testing should be a standard step after any new build to ensure software quality.

How to write unit tests for openldap?

I'm using the OpenLDAP library from C++ to implement some authentication and queries against an LDAP database. I want to write unit tests for my code.
My question is: is it done the same way as with SQL databases? For instance with SQL, in each unit test you do something like this: drop the test DB, create a new one, add some users, assert your APIs, etc.
All in all, I want to know the convention for writing LDAP database unit tests.
If you're talking about unit tests, then you should mock your LDAP API and test only your code, not the LDAP API implementation. You can use Google Mock for your mocks.
But I think you're referring to integration tests; for those, the same strategy as with database integration tests applies. You set up the environment - bring up the server, populate the entries, assert that the code works against it - and then tear down that environment.
In Java I would use an in-memory LDAP server for integration tests; you could try to find one that you can embed and run purely in memory from C/C++.
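As an illustration of the Java variant, a minimal sketch assuming the UnboundID LDAP SDK's in-memory server (the base DN and fixture file are made up):

    import com.unboundid.ldap.listener.InMemoryDirectoryServer;
    import com.unboundid.ldap.listener.InMemoryDirectoryServerConfig;
    import com.unboundid.ldap.sdk.LDAPConnection;
    import org.junit.jupiter.api.AfterEach;
    import org.junit.jupiter.api.BeforeEach;
    import org.junit.jupiter.api.Test;

    public class LdapIntegrationTest {

        private InMemoryDirectoryServer server;

        @BeforeEach
        void startServer() throws Exception {
            InMemoryDirectoryServerConfig config =
                    new InMemoryDirectoryServerConfig("dc=example,dc=com");
            server = new InMemoryDirectoryServer(config);
            server.startListening();
            // populate fixture entries, e.g. from an LDIF file:
            // server.importFromLDIF(true, "fixtures.ldif");
        }

        @AfterEach
        void stopServer() {
            server.shutDown(true);
        }

        @Test
        void authenticatesKnownUser() throws Exception {
            LDAPConnection connection = server.getConnection();
            // run the code under test against this connection here
            connection.close();
        }
    }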
See What's the difference between unit, functional, acceptance, and integration tests?.

JUnit with Glassfish and JPA

I have a web application that runs in GlassFish v3. It's built with JSF 2 and JPA (so there's a persistence.xml in which a JTA data source is declared).
If I try to test my repositories with JUnit, the lookup fails and gives me this error:
javax.naming.NamingException: Lookup failed for 'java:comp/env/persistence/em' in SerialContext[myEnv=
java.naming.factory.initial=com.sun.enterprise.naming.impl.SerialInitContextFactory,
java.naming.factory.url.pkgs=com.sun.enterprise.naming,
java.naming.factory.state=com.sun.corba.ee.impl.presentation.rmi.JNDIStateFactoryImpl}
[Root exception is javax.naming.NamingException: Invocation exception: Got null ComponentInvocation ]
It seems to ask for transaction-type="RESOURCE_LOCAL", which I can't provide, since it would conflict with GlassFish's transaction-type="JTA".
So, what I'd like to ask is whether it's possible to find a way to run JUnit without [strongly] changing my web app's configuration.
Thanks,
AN
For real in-container tests you should have a look at Arquillian. It allows you to run your unit tests within the container.
You should have a look at the documentation at http://arquillian.org/guides/ and the showcases at GitHub at https://github.com/arquillian/arquillian-showcase/. There is also a JSF related showcase.
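For illustration, an Arquillian test class might look roughly like this (the repository class and resource names are placeholders, not taken from your project):

    import javax.inject.Inject;
    import org.jboss.arquillian.container.test.api.Deployment;
    import org.jboss.arquillian.junit.Arquillian;
    import org.jboss.shrinkwrap.api.ShrinkWrap;
    import org.jboss.shrinkwrap.api.asset.EmptyAsset;
    import org.jboss.shrinkwrap.api.spec.WebArchive;
    import org.junit.Test;
    import org.junit.runner.RunWith;

    @RunWith(Arquillian.class)
    public class UserRepositoryIT {

        @Deployment
        public static WebArchive createDeployment() {
            // package only what the test needs; names are placeholders
            return ShrinkWrap.create(WebArchive.class, "test.war")
                    .addClass(UserRepository.class)
                    .addAsResource("META-INF/persistence.xml")
                    .addAsWebInfResource(EmptyAsset.INSTANCE, "beans.xml");
        }

        @Inject
        UserRepository repository; // injected by the container

        @Test
        public void findsPersistedEntities() {
            // exercise the repository against the real JTA data source
        }
    }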
Regarding your configuration: I would strongly suggest configuring your project in such a way that you can use a different configuration than in production.
If you need only a working JPA environment for your tests, then you should do the following:
Create a second JPA configuration with transaction-type="RESOURCE_LOCAL".
Add a setter for the entity manager to your beans.
Create the entity manager within your test setup as you would in a standalone Java application.
Inject the entity manager manually into the beans.
Try to use a mocking framework like Mockito to mock all other parts of the application which are not a part of the current test but required for the test.
The second approach depends on your architecture and the possibilities it offers to you. It allows you to write very fine-grained unit tests. The first approach is very useful to test the real behaviour of your application in the container.
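For illustration, a minimal sketch of the second approach, assuming a separate persistence unit named "testPU" declared with transaction-type="RESOURCE_LOCAL" (the repository class and its setter are placeholders matching the steps above):

    import javax.persistence.EntityManager;
    import javax.persistence.EntityManagerFactory;
    import javax.persistence.Persistence;
    import org.junit.After;
    import org.junit.Before;
    import org.junit.Test;

    public class UserRepositoryTest {

        private EntityManagerFactory emf;
        private EntityManager em;
        private UserRepository repository; // the bean under test (placeholder)

        @Before
        public void setUp() {
            // "testPU" is the second, RESOURCE_LOCAL persistence unit
            emf = Persistence.createEntityManagerFactory("testPU");
            em = emf.createEntityManager();
            repository = new UserRepository();
            repository.setEntityManager(em); // setter added for tests
        }

        @After
        public void tearDown() {
            em.close();
            emf.close();
        }

        @Test
        public void persistsAndFindsAnEntity() {
            em.getTransaction().begin();
            // exercise the repository here
            em.getTransaction().rollback(); // keep the test database clean
        }
    }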

Is there a point Unit Testing a Repository? Entity Framework 4.1

I have been watching various videos and reading various blogs where they go about unit testing a repository.
The most common pattern is to create a Fake repository that implements the same interface as the real one. Then the fake one uses an internal Dictionary or something.
So in effect you are unit testing the logic of the fake repository, which will never go into production.
Now you may use dependency injection to inject a mock DbContext via some IDBContext interface. However, then you are just testing each repository method, which in effect just forwards to the DbContext (which is mocked).
So unless each repository method has a lot of logic before calling the DbContext, it seems a bit pointless?
I think it would be better to have the tests on the repository as integration tests and actually have them hit the database.
The new EF 4.1 makes this easy, as it can create the database on the fly based on a connection string in your test project; you can then delete it after the tests are run using the DbContext.Database methods.
Your objections are partially correct; their correctness depends on how the repository is defined.
First, faking or mocking the repository is not for testing the repository itself but for testing the layers that use the repository.
If the repository exposes IQueryable and the upper layer can build LINQ to Entities queries against it, then mocking the repository means testing non-existent logic. You need an integration test that runs the query against a real test database. You can either redeploy the database for each test, which will make it very slow, or you can run each test in a transaction and roll it back when the test completes.
If the repository doesn't expose IQueryable, you can still treat it as a black box and mock it. The query logic will be inside the repository, and it will be tested separately with integration tests.
I would refer you to a set of other answers about the repository pattern itself and testing.
The best approach I have seen is from Sharp Architecture, where they use an SQLite database, created in the TestFixtureSetUp based on the NHibernate mapping info.
The repository tests then use this in-memory database.
Technically this is still an integration test, since a database is involved, but practically it ticks all the boxes for a unit test since:
1) The database is transient - no connection string configs to worry about, nor do you need a complete db sitting on a server somewhere for the unit test to use.
2) The setup is fast, and the tests are equally fast, as everything is in memory.
3) As it uses the NHibernate mapping info to generate the schema, you don't have to worry about keeping the unit test setup synchronised with code changes.
http://wiki.sharparchitecture.net/default.aspx?AspxAutoDetectCookieSupport=1
It may be possible to use the same approach with EF.