I'm trying to write a simple integration test, but I'm having some trouble with domain objects. I've read up on unit testing but can't figure it out.
This is my simple test:
User user = User.get(1)
controller.params.userid = "1"
controller.session.user = user
controller.save();
The error message is:
groovy.lang.MissingMethodException: No
signature of method: static
com.baufest.insside.user.User.get() is
applicable for argument types:
(java.lang.Integer) values: 1
My guess is that I should mock the user object, but don't know how.
You say that you're integration testing, but it looks like you're unit testing. Is the test under test/integration or test/unit? Unit tests need mocking, but integration tests have an initialized Spring application context and Hibernate, and run against an in-memory database.
This is described in the user guide, which is at http://grails.org/doc/latest/ (you reference an older 1.1 version).
To mock the User class, just call mockDomain with one or more test instances either in setUp or in the test method:
def users = [new User(...), new User(...), ...]
mockDomain User, users
...
User user = User.get(1)
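Putting it together, a unit test using that mocking might look like this (a sketch; it assumes the Grails 1.x `mockDomain` provided by `GrailsUnitTestCase`, and the `User` properties are illustrative):

```groovy
void testSave() {
    // mockDomain makes User.get, save, etc. work against this in-memory list
    def users = [new User(username: 'alice'), new User(username: 'bob')]
    mockDomain(User, users)

    User user = User.get(1)          // now resolves to a mocked instance
    controller.params.userid = "1"
    controller.session.user = user
    controller.save()
}
```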
I've got some password-reset functionality that I'm rewriting, and this time I'm adding tests.
My test should:
Create a user entry in the DB
Create a hash token entry in the DB for that user
Verify that the hash token matches an entry in the DB.
The last one (3), I'm having trouble testing, because typically, I wouldn't return the hash token to the user (client) in production. Instead, I'd fire off an event to my taskworker with the hash token string being passed around in memory and then send them an email with the secret link.
My route for verification looks like:
server.get('/api/v1/users/password/reset/:token', userPasswordReset.fetch)
As you can see, I need to grab the token from the request parameters, which means I need it sent back to me somehow (but only in a test environment, NOT in production).
To solve this for now, in my controller I'm doing this:
return process.env.NODE_ENV === 'test'
? res.status(201).send({ token: record.token })
: res.status(201).end()
However, I'd like to know if there is a safer way to go about this. I don't love the idea of putting fragile code in my production code that is dependent upon an env variable.
I've thought about writing the token to a file on the system, but because these tests will run within different environments I'm not sure that's entirely reliable based upon virtual filesystems.
You should only be testing your public API, and you shouldn't test a scenario that is invalid in production. You can, however, test hash creation and storage of the token at the DB layer. Hope this helps.
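One way to avoid the environment-dependent branch entirely is to have the test read the token from the database rather than from the HTTP response. A minimal sketch of the idea (the in-memory `TokenStore` and `requestReset` below are hypothetical stand-ins for your tokens table and controller):

```javascript
// Hypothetical in-memory stand-in for the reset-tokens table.
class TokenStore {
  constructor() { this.rows = []; }
  insert(row) { this.rows.push(row); }
  findByUserId(userId) {
    return this.rows.find((r) => r.userId === userId) || null;
  }
}

// Hypothetical controller: the token is stored (and would be emailed),
// but it is never included in the HTTP response, even under test.
function requestReset(store, userId) {
  const token = 'tok-' + Math.random().toString(36).slice(2);
  store.insert({ userId, token });
  return { status: 201, body: {} };
}

// The test queries the store directly to obtain the token, then hits
// /api/v1/users/password/reset/:token with it.
const store = new TokenStore();
requestReset(store, 42);
const record = store.findByUserId(42);
console.log(record.token.startsWith('tok-'));
```

This keeps the production handler free of any test-only code path, at the cost of the test knowing the token table's schema.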
I'm developing a C++ web application and I'm using PostgreSQL with libpqxx, on Ubuntu 16.04. The issue I'm having is that I need a portable (over Linux systems) way to run my unit tests.
What I have now is a class that controls database calls. I'd like to test how this class acts under my unit tests. For that, for every time I run a unit test, I'd like to:
Create a temp user
Create a dummy db
run my tests on it
delete the db
delete the user
Now, doing these steps is fine with Google Test. I can create a fixture that will do them all reproducibly. BUT...
How can I create a user with a password in one call (without being prompted for the password), so that I can create that user on the go and run my tests?
The manual doesn't seem to offer a way to supply the password as an argument. I was hoping to do something like this:
system("createuser " + username + " -password abcdefg");
where system() runs a terminal command. Then I can connect to the DB server with that username and password to do my unit tests.
Another failed attempt for me was to pass an sql query that would create the user through the terminal:
system("psql -c \"CREATE ROLE joe PASSWORD 'aabbccdd';\"")
When I do this, I get the error:
psql: FATAL: database "user" does not exist
where user is my unix username.
Remember that I cannot connect to the server with libpqxx because I don't have credentials yet (as the temporary user). Right? Please correct me if I'm wrong.
Is there any way to do what I'm planning here and make unit-tests runnable without user intervention?
The problem with the system("psql -c \"CREATE ROLE joe PASSWORD 'aabbccdd';\"") call is that it doesn't specify a database to connect to, so psql defaults to a database with the same name as the user logging in. Try adding -d postgres and see if it works.
Please note that this only creates the user, it doesn't create any database or give privileges to anything.
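In the fixture, the fix can be wrapped in a small helper that builds the psql command with -d postgres included. A sketch (role name, password, and the use of std::system are illustrative, and it assumes the invoking OS user may connect to the postgres maintenance database without a password prompt):

```cpp
#include <cstdlib>
#include <string>

// Build a psql invocation that targets the maintenance database
// explicitly (-d postgres), so psql does not default to a database
// named after the OS user. Role name and password are illustrative.
std::string make_create_role_cmd(const std::string& role,
                                 const std::string& password) {
    return "psql -d postgres -c \"CREATE ROLE " + role +
           " LOGIN PASSWORD '" + password + "';\"";
}

// In the Google Test fixture's SetUp() one would then run, e.g.:
//   std::system(make_create_role_cmd("test_joe", "aabbccdd").c_str());
// and issue the matching DROP ROLE / DROP DATABASE in TearDown().
```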
After learning JUnit and experiencing its benefits for both the programmer and the project, I now want to unit test the service layer of each entity and check that each method works properly.
As of now, I have already created a unit test for all of my service classes, but the problem is that the datasource's data isn't suited for testing. Thus I have to create another database for service-layer testing and configure the datasource for the unit tests of the service layer. The thing is, I don't know how to configure another datasource that only the code under src/test/java could access and that couldn't be accessed in production. I'm still new to Spring Boot and Spring Data, so I'm asking how to configure such a setup.
As of now I have this application.properties configuration.
spring.datasource.url=<DatabaseURL>
spring.datasource.username=<DatabaseUsername>
spring.datasource.password=<DatabasePassword>
spring.datasource.driver-class-name=<DatabaseDriver>
# another datasource configuration
And here's a sample service class, which uses the dataSource configuration from application.properties.
@Service
public class FooService {

    @PersistenceContext
    private EntityManager entityManager;

    public List<Foo> findAllByFooForm(FooForm fooForm) {
        // JPA CriteriaBuilder query according to FooForm
        return entityManager.createQuery(query).getResultList();
    }
}
Finally, here's a sample unit test for a service class.
@RunWith(SpringJUnit4ClassRunner.class)
@SpringApplicationConfiguration(classes = Application.class)
public class FooServiceTest {

    @Autowired
    private FooService fooService;

    @Test
    public void testFindAllByFooForm() {
        // Test statements
    }
}
There are a few approaches which can be combined to give you good control over this.
First of all, if you create src/test/resources/application.properties, then it will only be available on the classpath during testing, and it will override any properties that you have defined in src/main/resources/application.properties.
If you are using an in-memory database to support those tests, then you can ensure that different import.sql files are loaded, through the use of the following property:
spring.jpa.properties.hibernate.hbm2ddl.import_files=import-test1.sql
That property takes a comma-separated list of import scripts, so you can have a base set of data loaded by one script and additional (perhaps test-specific) data loaded by others.
If you wish to connect to a different database in each test, or cause different import scripts to be used, then you can use profiles to trigger this. If you create a properties file application-test1.properties, then the test itself can cause it to be loaded with the annotation @ActiveProfiles({"test1"}).
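Putting the pieces together, a test-only properties file might look like this (a sketch; the H2 in-memory settings and script names are illustrative):

```properties
# src/test/resources/application-test1.properties
# On the test classpath only; activated via @ActiveProfiles({"test1"})
spring.datasource.url=jdbc:h2:mem:testdb;DB_CLOSE_DELAY=-1
spring.datasource.driver-class-name=org.h2.Driver
spring.jpa.properties.hibernate.hbm2ddl.import_files=import-base.sql,import-test1.sql
```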
In django-webtest, every test TestCase subclass comes with self.app, which is an instance of webtest.TestApp, then I could make it login as user A by self.app.get('/',user='A').
However, if I want to test the behavior for both user A and user B in a single test, how should I do it?
It seems that self.app is just DjangoTestApp() with extra_environ passed in. Is it appropriate to just create another instance of it?
I haven't tried setting up another instance of DjangoTestApp as you suggest, but I have written complex tests where, after making requests as user A, I then switch to making requests as user B with no issue, in each case passing the user or username in when making the request, e.g. self.app.get('/', user='A') as you have already written.
The only part which did not work as expected was making unauthenticated requests, e.g. self.app.get('/', user=None): these continued to use the user from the immediately preceding request.
To reset the app state (which should allow you to emulate most workflows with several users in a sequential manner) you can run self.renew_app() which will refresh your app state, effectively logging the current user out.
To test simultaneous access by more than one user (your question does not specify exactly what you are trying to test) then setting up another instance of DjangoTestApp would seem to be worth exploring.
How do I run a unit test against the production database instead of the test database?
I have a bug that's seems to occur on my production server but not on my development computer.
I don't care if the database gets trashed.
Is it feasible to make a copy of the database, or of the part of the database that causes the problem? If you keep a backup server, you might be able to copy the data from there instead (make sure you have another backup, in case you mess up the backup database).
Basically, you don't want to mess with live data and you don't want to be left with no backup in case you mess something up (and you will!).
Use manage.py dumpdata > mydata.json to get a copy of the data from your database.
Go to your local machine, copy mydata.json to a subdirectory of your app called fixtures e.g. myapp/fixtures/mydata.json and do:
manage.py syncdb # Set up an empty database
manage.py loaddata mydata.json
Your local database will be populated with data and you can test away.
Make a copy of the database... it's really good practice!
Just execute the test, but instead of calling commit, call rollback at the end.
The first thing to try should be manually executing the test code on the shell, on the production server.
python manage.py shell
If that doesn't work, you may need to dump the production data, copy it locally and use it as a fixture for the testcase you are using.
If there is a way to ask Django to use the standard database without creating a new one, then rather than creating a fixture, you can do an SQL dump, which will generally be a much smaller file.
Short answer: you don't.
Long answer: you don't, you make a copy of the production database and run it there
If you really don't care about trashing the db, then Marco's answer of rolling back the transaction is my preferred choice as well. You could also try NdbUnit but I personally don't think the extra baggage it brings is worth the gains.
How do you test against the test db now? By "test db", do you mean SQLite?
HTH,
Berryl
I have both a full-on-slow-django-test-db suite and a crazy-fast-runs-against-production test suite built from a common test module. I use the production suite for sanity checking my changes during development and as a commit validation step on my development machine. The django suite module looks like this:
import django.test
import my_test_module
...
class MyTests(django.test.TestCase):
    def test_XXX(self):
        my_test_module.XXX(self)
The production test suite module uses bare unittest and looks like this:
import unittest
import my_test_module
class MyTests(unittest.TestCase):
    def test_XXX(self):
        my_test_module.XXX(self)

suite = unittest.TestLoader().loadTestsFromTestCase(MyTests)
unittest.TextTestRunner(verbosity=2).run(suite)
The test module looks like this:
def XXX(testcase):
    testcase.assertEquals('foo', 'bar')
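The sharing pattern itself is easy to verify in isolation; a minimal runnable illustration (pure unittest, no django, names illustrative):

```python
import unittest

# Shared "test module": plain functions that take the testcase as an
# argument, so any TestCase subclass can drive them.
def check_equality(testcase):
    testcase.assertEqual('foo', 'foo')

# A bare-unittest suite driving the shared function, mirroring the
# production-side suite above.
class MyTests(unittest.TestCase):
    def test_equality(self):
        check_equality(self)

suite = unittest.TestLoader().loadTestsFromTestCase(MyTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```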
I run the bare unittest version like this, so my tests in either case have the django ORM available to them:
% python manage.py shell < run_unit_tests
where run_unit_tests consists of:
import path.to.production_module
The production module needs a slightly different setUp() and tearDown() from the django version, and you can put any required table cleaning in there. I also use the django test client in the common test module by mimicking the test client class:
class FakeDict(dict):
    """
    Wraps dict and provides a getlist member, used by the django view
    request-unpacking code when passing in a FakeRequest (see below);
    only needed for those api entrypoints that have list parameters.
    """
    def getlist(self, name):
        return [x for x in self.get(name)]

class FakeRequest(object):
    """
    An object mimicking the django request object passed in to views,
    so we can test the api entrypoints from the developer unit test
    framework.
    """
    user = get_test_user()
    GET = {}
    POST = {}
Here's an example of a test module function that tests via the client:
def XXX(testcase):
    if getattr(testcase, 'client', None) is None:
        req_dict = FakeDict()
    else:
        req_dict = {}
    req_dict['param'] = 'value'
    if getattr(testcase, 'client', None) is None:
        fake_req = FakeRequest()
        fake_req.POST = req_dict
        resp = view_function_to_test(fake_req)
    else:
        resp = testcase.client.post('/path/to/function_to_test/', req_dict)
    ...
I've found this structure works really well, and the super-speedy production version of the suite is a major time-saver.
If your database supports template databases, use the production database as a template database. Ensure that your Django database user has sufficient permissions.
If you are using PostgreSQL, you can do this easily by specifying the name of your production database as POSTGIS_TEMPLATE (and using the PostGIS backend).