Please help, I'm fairly new to Django and not sure of the best way to proceed with my unit tests.
I have a large Django app with dozens of view methods, and the PostgreSQL schemas get pretty complex. I've read that if I use "from django.test import TestCase" then the test database is flushed after each unit test. I wanted to prevent the db from being flushed between unit tests within the same class, so I switched to "from unittest import TestCase". That did the trick and the db is preserved between unit tests, but now the statement
self.assertTemplateUsed(response, 'samplepage.html')
fails with AttributeError: 'TestViews' object has no attribute 'assertTemplateUsed'.
What can I do? Is there an alternative to 'assertTemplateUsed' that can be used with unittest.TestCase? Many thanks in advance!
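One workaround, sketched here under the assumption that the view is wired up at '/samplepage/' (the URL and the TestViews name are placeholders taken from the question): Django's test Client records the templates each response rendered, so the check can be done by hand without assertTemplateUsed:

import unittest
from django.test import Client

class TestViews(unittest.TestCase):
    def setUp(self):
        # unittest.TestCase does not provide self.client, so create one
        self.client = Client()

    def test_samplepage_template(self):
        response = self.client.get('/samplepage/')  # placeholder URL
        # response.templates lists the Template objects the view rendered
        self.assertIn('samplepage.html', [t.name for t in response.templates])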
I am attempting to write test cases in Django using Selenium. I want to use existing fixtures so my test database (SQLite3) has test data for every test.
I have some model test cases (just using the TestCase class) as follows:
from django.test import TestCase
from django.test import LiveServerTestCase
from missions.models import Mission, MissionDataRecord

class MissionModelTests(TestCase):
    fixtures = ['myproject_common/fixtures/auth_initial_load.json', 'worklog/fixtures/worklogs_initial_load',
                'missions_initial_load.json']

    def test_object_name_is_mission_title(self):
        mission = Mission.objects.get(id=1)
        self.assertEqual(mission.title, str(mission))

    def test_object_name_is_mission_title_again(self):
        mission = Mission.objects.get(id=1)
        self.assertEqual(mission.title, str(mission))
This works as expected when run like this (I get two passing tests). However, for Selenium testing I need to use LiveServerTestCase instead of TestCase.
The simple example above is a model test, but to illustrate the issue I'm facing with Selenium: if I simply replace "TestCase" with "LiveServerTestCase", the first test passes but the second fails with the error
django.db.utils.IntegrityError: Problem installing fixture
'[...]/fixtures/auth_initial_load.json': Could not load
auth.User(pk=1): UNIQUE constraint failed: auth_user.username
This error occurs in the _fixture_setup of django/test/testcases.py. It seems to suggest that my fixtures (specifically the auth_initial_load fixture) are being loaded again ON TOP of existing data. However, from my reading of the Django docs, this should not happen, because each test should run in its own transaction (which I believe means the fixtures are loaded fresh for each transaction).
What is going on here, and more importantly, how can I use LiveServerTestCase with my existing fixtures (similar to how I am using TestCase currently)? In reality I need to use StaticLiveServerTestCase, but I'm guessing the code will be the same.
It turns out the way I was loading fixtures was correct after all. The issue was with the fixtures themselves, which used hard-coded primary (and foreign) keys. In my case, two users were created before the fixtures were loaded, so when the fixtures tried to load a user with the same primary key, a UNIQUE constraint violation occurred. The solution was to re-generate my fixtures using the --natural-primary and --natural-foreign flags, as suggested in this SO answer.
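For reference, re-generating a fixture with natural keys looks roughly like this (the flags exist in Django 1.7+; the app label and output path here are just examples):

python manage.py dumpdata auth --natural-foreign --natural-primary > myproject_common/fixtures/auth_initial_load.json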
I'm having trouble moving away from django_nose.FastFixtureTestCase to django.test.TestCase (or even the more conservative django.test.TransactionTestCase). I'm using Django 1.7.11 and testing against Postgres 9.2.
I have a TestCase class that loads three fixture files. The class contains two tests. If I run each test individually as a single run (manage test test_file:TestClass.test_name), each works. If I run them together (manage test test_file:TestClass), I get
IntegrityError: Problem installing fixture '<path>/data.json': Could not load <app>.<Model>(pk=1): duplicate key value violates unique constraint "<app_model_field>_49810fc21046d2e2_uniq"
To me it looks like the db isn't actually getting flushed or rolled back between tests, since the error only happens when I run the tests together in a single run.
I've stepped through the Django code and it looks like they are getting flushed or rolled back -- depending on whether I'm trying TestCase or TransactionTestCase.
(I'm moving away from FastFixtureTestCase because of https://github.com/django-nose/django-nose/issues/220)
What else should I be looking at? This seems like it should be a simple matter, squarely within what django.test.TestCase and django.test.TransactionTestCase are designed for.
Edit:
The test class more or less looks like this:
class MyTest(django.test.TransactionTestCase):  # or django.test.TestCase
    fixtures = ['data1.json', 'data2.json', 'data3.json']

    def test1(self):
        return  # I simplified it to just this for now.

    def test2(self):
        return  # I simplified it to just this for now.
Update:
I've managed to reproduce this a couple of times with a single test, so I suspect something in the fixture loading code.
One of my basic assumptions was that the db was clean for every TestCase. Tracing into the Django core code, I found instances where an object (in one case a django.contrib.auth.User) already existed.
I temporarily overrode _fixture_setup() to assert the db was clean prior to loading fixtures. The assertion failed.
I was able to narrow the problem down to code that was in a TestCase.setUpClass() instead of TestCase.setUp(), and so the object was leaking out of the test and conflicting with other TestCase fixtures.
What I don't completely understand is that I thought the db was dropped and recreated between TestCases, but perhaps that is not correct.
Update: Recent versions of Django include setUpTestData(), which should be used instead of setUpClass() for creating per-class test data.
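For illustration, a minimal sketch of setUpTestData() (available since Django 1.8). Unlike data created in setUpClass(), this data lives inside a class-wide transaction that Django rolls back when the class finishes, so nothing leaks into other TestCases:

from django.contrib.auth.models import User
from django.test import TestCase

class MyDataTest(TestCase):
    @classmethod
    def setUpTestData(cls):
        # Runs once per class; rolled back automatically afterwards
        cls.user = User.objects.create_user('alice', password='secret')

    def test_user_was_created(self):
        self.assertTrue(User.objects.filter(username='alice').exists())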
Is there an easy way to load fixture data that I usually use in automated test runs in the interactive Django shell?
It might be awkward to have a mixture of model data, some coming from the database and some from a fixture. In my case, I have some read-only tables and want to experiment with some data that I can discard afterwards.
I can probably load the fixture files as described here, but that's a bit cumbersome for repeated use...
ilardm's answer points in the right direction, specifically what you want is:
from django.core.management import call_command
call_command('loaddata', 'fixture_name.json')
Edit: But the correct way to include fixtures in test cases is like this:

class TestThis(TestCase):
    fixtures = ['myfixture.json']

    def setUp(self):
        # Ready to test
        pass
I expect ./manage.py loaddata fixture_name.json is what you want.
Perhaps this link: http://testedwebdev.blogspot.ru/2012/05/django-shell-testing.html might help.
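For the "data I can discard afterwards" part of the question, one option (a sketch, assuming Django 1.6+ for transaction.atomic) is to load the fixture inside an atomic block in the shell and roll it back when done:

from django.core.management import call_command
from django.db import transaction

try:
    with transaction.atomic():
        call_command('loaddata', 'myfixture.json')
        # ... experiment with the loaded objects here ...
        raise RuntimeError('done experimenting')  # any exception rolls back
except RuntimeError:
    pass  # the fixture data is gone again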
I have a project that uses a SOLR search engine through django-haystack. The search engine is on a different live server, and touching it during the test run is undesirable (actually, it's impossible, since access to that host is firewalled).
I'm using the standard Django test runner. Luckily, it gives me a test-settings object I can modify to my liking, but it turns out that's not the end of the story.
A lot of stuff in django-haystack is instantiated at import time, so by the time I change the test settings in my test runner it is too late, and despite the fact that I change SEARCH_BACKEND to dummy, the tests still make calls to SOLR. The problem is not specific to haystack; the same issue happens with mongoengine. Any class-level statements (e.g. CharField(default=Blah.objects.find(...))) are executed at instantiation time, before Django has a chance to change the settings.
Of course the root of the problem is that Django settings are a scary, globally mutable mess and that Django provides no centralized place for instantiation code. Given that, are there any suggestions on what testing solution would be easiest? At the moment I'm thinking about a shell script that sets the DJANGO_SETTINGS_MODULE environment variable to test_settings and runs ./manage.py test afterwards. It would be nicer if I could still do things via ./manage.py, though.
Any better ideas? People with similar problems?
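For what it's worth, the shell-script idea from the question can be as small as this (myproject.test_settings is a placeholder module name, and this assumes your manage.py respects an already-set DJANGO_SETTINGS_MODULE):

#!/bin/sh
# Select the test settings before manage.py imports anything,
# so haystack and friends see them at import time
DJANGO_SETTINGS_MODULE=myproject.test_settings exec ./manage.py test "$@"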
I took the answer from here and modified it slightly. This works great for me:
from contextlib import contextmanager

@contextmanager
def connection(**kwargs):
    from haystack import connections
    for key, new_value in kwargs.items():
        setattr(connections, key, new_value)
    connections['default'].options['URL'] = connections.connections_info['default']['URL']
    yield
My test, then, looks like:
def test_job_detail_by_title_slug_job_id(self):
    with connection(connections_info=solr_settings.HAYSTACK_CONNECTIONS):
        resp = self.client.get('/0/rts-crb-unix-production-engineer/27216666/job/')
        self.assertEqual(resp.status_code, 404)

        resp = self.client.get('/indianapolis/indiana/usa/jobs/')
        self.assertEqual(resp.status_code, 200)
I'm trying to use fixtures as a DB-agnostic way to get the data into my database, but this is much harder than it should be. I'm wondering what I'm doing wrong...
Specifically, when I do a syncdb followed by a migrate followed by a loaddata, I run into trouble, since syncdb already creates data that loaddata then tries to load again from the dump. This leads to duplicate entries and hence a crashing script.
This seems to be the same problem as described here: https://code.djangoproject.com/ticket/15926
But it's weird to me that this seems to be an ignored issue. Are fixtures not meant to actually put real (live) data in?
If so: is there any Django-format that is meant for this? Or is everyone just dumping data as SQL? And, if so, how would one migrate development data in SQLite to a production database?
syncdb will also load data from fixtures if you have the fixtures named correctly and in the correct location. See this link for more info.
https://docs.djangoproject.com/en/1.3/howto/initial-data/#automatically-loading-initial-data-fixtures
If you do not want the data to load on every syncdb then you will need to change the name of the fixture.
Fixtures are an OK way to load your data; I have used them on a number of projects. On some projects, when I have a ton of data, I instead write a special load script that takes the data from my data source and loads up my new Django models. The custom script is a little more work, but gives you more flexibility, as in the sketch below.
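A minimal sketch of such a load script (the app, model, and CSV columns here are invented for illustration):

import csv
from myapp.models import Book  # hypothetical app and model

def load_books(path):
    # Read rows from a CSV export and create model instances,
    # letting the ORM generate the DB-specific SQL
    with open(path) as f:
        for row in csv.DictReader(f):
            Book.objects.get_or_create(title=row['title'], author=row['author'])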
I tend to stay away from using SQL to load data if I can, since SQL is usually DB-specific. If you have to worry about loading into different databases or database versions, avoid it if you can.
"In general, using a fixture is a cleaner method since it’s database-agnostic, but initial SQL is also quite a bit more flexible."
OP here; this is what I came up with so far:
# some_app/management/commands/delete_all_objects.py
from django.core.management.base import BaseCommand, CommandError
from django.db.models import get_models

class Command(BaseCommand):
    help = 'Deletes all objects'

    def handle(self, *args, **options):
        for model in get_models():
            model.objects.all().delete()
And then just run delete_all_objects after syncdb & migrate and before loaddata. I'm not sure I like it, and I'm very surprised it's necessary, but it works.
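For completeness, the full sequence then looks something like this (the fixture name is just an example):

./manage.py syncdb --noinput
./manage.py migrate
./manage.py delete_all_objects
./manage.py loaddata my_data_dump.json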