In my Django projects I use an SQLite database to run tests. Since it runs entirely in memory, it's much faster than MySQL, but it's still not fast enough. During tests, only one of my four processor cores is used, and not much memory is consumed. So I'd like to have four SQLite databases in memory and run four tests in parallel.
Has anyone tried this?
Since Django 1.9 it has been possible to run tests in parallel using Django's built-in test runner.
Django Docs: https://docs.djangoproject.com/en/3.0/ref/django-admin/#cmdoption-test-parallel
According to the Django 3.0 documentation there is a --parallel option that you can use.
--parallel [N]
Runs tests in separate parallel processes. Since modern processors have multiple cores, this allows running tests significantly faster.
So you can use the following command to execute the tests in parallel.
python manage.py test --parallel
You can adjust the number of processes either by providing it as the option’s value, e.g. --parallel=4, or by setting the DJANGO_TEST_PROCESSES environment variable.
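For example, both of the following run the suite with four worker processes:
python manage.py test --parallel=4
DJANGO_TEST_PROCESSES=4 python manage.py test --parallel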
This can considerably reduce your test execution time if you have a large Django project with quite a few test cases.
You can use a parallel test runner for Django and Twisted described here: http://www.tomaz.me/2011/04/03/making-django-and-twisted-tests-faster.html (the source lives here https://github.com/Kami/parallel-django-and-twisted-test-runner - link at the end of the post). You can use it as described in Django docs on testing.
There is also a nose parallel test runner.
You can easily split the testing per app and run the apps in parallel on Linux:
$ python manage.py test cms & \
python manage.py test blog & \
python manage.py test store & \
python manage.py test registration
$
Could be helpful for big projects with a lot of apps; the best approach would be a bash script that runs the tests in batches of four apps.
We're a small team of developers working on a version-controlled Django 3 project. We're worried about how to use migrations without overwriting one another's migration files.
Options we've considered:
Using the --name / -n option of makemigrations to have a naming convention, but that seems cumbersome
The Django documentation mentions migrations and version control, but doesn't go into specifics.
How do other teams handle this?
Using version control (git, etc.) will solve your problem.
Just make sure the developers work on a branch rather than directly on master. This way, the migrations will be visible, and if there's a conflict, git will let you know when you try to merge to your master branch. It's even more obvious if you're using GitHub as you can see the conflicts in the UI for the pull request.
The only problem you will have here is if two migrations are generated in the same app in two PRs. The one that gets merged into master last will cause a problem, as there will be multiple leaf nodes. This can be solved easily with ./manage.py makemigrations --merge, and ideally you should run ./manage.py makemigrations --check to see if there are problems before merging stuff, for example in continuous integration (see the sketch below).
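A minimal sketch of such a CI step, assuming a shell-based pipeline:
# fail the build if any model changes are missing a migration
python manage.py makemigrations --check --dry-run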
I'm a hobby web developer looking to build my first product with the Django web framework, which I will host on Heroku. Cypress looks like a great modern tool for UI testing, and I certainly want to use it. In local development I run it from within Django's testing machinery, so that the Django StaticLiveServerTestCase runs first; this creates a separate development server process, and a test database if one hasn't been created already. Within this test I then call a bash script which executes a Cypress UI test. See the example below.
Bash Script -
#!/bin/bash
echo "$1" # the 1st command line argument: the base URL
echo "$2" # the 2nd command line argument: the spec file
./node_modules/.bin/cypress run --env baseUrl="$1" --spec "$2"
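Note that the script must be executable for subprocess to invoke it directly:
chmod +x cypress/run.sh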
Django Test Code -
from subprocess import run

from django.contrib.staticfiles.testing import StaticLiveServerTestCase

class ExampleTest(StaticLiveServerTestCase):
    def test_view_url(self):
        # launch the Cypress spec against the live test server
        result = run([
            './cypress/run.sh',
            f'http://{self.server_thread.host}:{self.server_thread.port}',
            'cypress/integration/sample_spec.js',
        ])
        self.assertEqual(0, result.returncode)
From my understanding this is the opposite of the standard Cypress approach. Normally Cypress runs outside of the app, independently, and simply runs through all the tests, which interact with the app. Here the Cypress tests are called by the Django process. Unless I am mistaken, there are obvious advantages to this approach: I can test the UI and check whether objects were created in the database after the UI test has run.
Is it possible, and if so how, to do this within Heroku? So Heroku would simply call the test script -
python manage.py test
And I get UI testing and unit testing all done in one go. It would also be nice to see the tests running in interactive mode, which may be a challenge because Heroku obviously runs everything in a Linux container. Incidentally, I don't plan on using Docker, because I think that limits a lot of the features Heroku offers.
Since upgrading my Django project to Django 1.8 using the default Django test runner, I'm finding that my tests take between 10 and 15 seconds to start up because of the new default behavior of running migrations before tests. It now takes longer for my test suite to start up than it does to run the actual tests themselves. I only have seven models, none of which are that complex, and over 180 tests. I'm running my tests against a PostgreSQL 9.4.4 database on a 1.86 GHz Mac laptop with 4 GB of RAM. I found this question which said to create a settings_test.py file that contains a MIGRATION_MODULES setting and then specify that settings_test file when running your tests. Could someone explain further how to implement this solution? Is the settings_test file supposed to look like this if you have two apps?
MIGRATION_MODULES = {
    'app1': 'app1.migrations_not_used_in_tests',
    'app2': 'app2.migrations_not_used_in_tests',
}
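I assume the file would also need to import the base settings first; here's my sketch, where myproject stands in for the actual project package:
# settings_test.py (sketch; myproject is a placeholder)
from myproject.settings import *  # start from the regular settings

# Point each app at a nonexistent migrations module so the test
# runner skips migrations when building the test database.
MIGRATION_MODULES = {
    'app1': 'app1.migrations_not_used_in_tests',
    'app2': 'app2.migrations_not_used_in_tests',
}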
If that's true, how would you run your tests as described if you have multiple apps?
DJANGO_SETTINGS_MODULE='???.settings_test' python manage.py test
I tried creating the exact settings_test.py file that specified 'myapp', even though I don't have an app named 'myapp', and while my tests ran, the migrations also ran prior to the tests, as I expected they would. I'm sorry if this seems like a dumb question, but I'm relatively new to Django.
Thanks.
I have an app that is deployed to Heroku, and I'd like to be able to run the test suite post-deployment on the target environment. I am using the Heroku Postgres add-on, which means that I have access to a single database only. I have no rights to create new databases, which in turn means that the standard Django test command fails, as it can't create the test_* database.
$ heroku run python manage.py test
Running `python manage.py test` attached to terminal... up, run.9362
Creating test database for alias 'default'...
Got an error creating the test database: permission denied to create database
Is there any way around this?
Turns out I was wrong; I was not testing what I thought was being tested. Since Heroku's Routing Mesh was sending requests to different servers, the LiveServerTestCase was starting a web server on one machine while Selenium was connecting to other machines altogether.
By updating the Heroku Procfile to:
web: python src/manage.py test --liveserver=0.0.0.0:$PORT
overriding the DATABASES setting to point to the test database, and customizing the test suite runner linked to below (the same idea still holds: override setup_databases so that it only drops/re-creates tables, not the entire database), I was able to run remote tests. But this is even more hacky/painful/inelegant. Still looking for something better! Sorry about the confusion.
(updated answer below)
Here are the steps that worked for me:
Create an additional, free Postgres database using the Heroku toolbelt
heroku addons:add heroku-postgresql:dev
Use the HerokuTestSuiteRunner class which you'll find here.
This custom test runner requires that you define a TEST_DATABASES setting which follows the typical DATABASES format. For instance:
import dj_database_url

TEST_DATABASES = {
    'default': dj_database_url.config(env='TEST_DATABASE_URL'),
}
Then, have the TEST_RUNNER setting be a Python path to wherever HerokuTestSuiteRunner can be found.
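For example (the dotted path is a placeholder for wherever you put the class):
TEST_RUNNER = 'myproject.test_runner.HerokuTestSuiteRunner'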
You should now be able to run Django tests on Heroku using the given database. This is very much a quick hack... Let me know how it could be improved / made less hackish. Enjoy!
(original answer below)
A few relevant solutions have been discussed here. As you can read in the Django docs, "[w]hen using the SQLite database engine, the tests will by default use an in-memory database".
Although this doesn't thoroughly test the database engine you're using on Heroku (I'm still on the lookout for a solution that does that), setting the database engine to SQLite will at least allow you to run your tests.
See the above-linked Stack Overflow question for some pointers. There are at least two ways out: testing if 'test' is in sys.argv before forcing SQLite as the database engine, or having a dedicated settings file used in testing, which you can then pass to manage.py test using the --settings option.
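A minimal sketch of the first approach, assuming the usual single settings.py layout:
# settings.py (sketch)
import sys

if 'test' in sys.argv:
    # force an in-memory SQLite database when running tests
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.sqlite3',
            'NAME': ':memory:',
        }
    }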
Starting with version 1.8, Django has the --keepdb option, which allows the same database to be reused during tests.
The --keepdb option can be used to preserve the test database between test runs.
This has the advantage of skipping both the create and destroy actions which can greatly decrease the time to run tests, especially those in a large test suite.
If the test database does not exist, it will be created on the first run and then preserved for each subsequent run.
Any unapplied migrations will also be applied to the test database before running the test suite.
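In other words, you simply run:
python manage.py test --keepdb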
Since it also allows the test database to exist prior to running tests, you can simply attach a new Heroku Postgres database to your app and configure the tests to use that particular database.
Bonus: you can also use the --failfast option, which exits as soon as your first test fails, so that you don't have to wait for all tests to complete.
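Both options can be combined:
python manage.py test --keepdb --failfast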
However, if you are deploying things to Heroku and you are using Heroku Pipelines, an even better option is available: Heroku CI.
I have a Django app running on Heroku. I would like to run my South migrations before any code that depends on them goes live. After a quick review of current recommended practices, I have found two suggested migration procedures.
Recommendation 1
Commit and push all changes
Run heroku run python manage.py migrate <APP_NAME> for each app
This suffers from having a period in between steps 1 and 2 where my code is assuming the latest schema is in place, but the db hasn't yet been updated.
Recommendation 2
Commit and push all database changes.
Migrate.
Push all code changes.
This solves the previous problem, but adds a lot more complexity to the deployment process, and some day I will mess this up.
Potential Solution?
It seems that I can avoid the problem in Recommendation 1 and keep my deployment to a single step by utilising a custom post_compile script that calls python $MANAGE_FILE migrate <APP_NAME> for each of my apps (in dependency order), as sketched below.
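A sketch of what that hook might look like (app names are placeholders; $MANAGE_FILE is assumed to be provided by the buildpack as described):
#!/usr/bin/env bash
# bin/post_compile - run South migrations for each app, in dependency order
python $MANAGE_FILE migrate app1
python $MANAGE_FILE migrate app2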
I have not seen this recommended anywhere, so my question is twofold. Can you see any potential problem with this approach, and do you have a better method?
If your application can afford some downtime, the easiest way seems to me to:
Pause your app using $ heroku maintenance:on
Migrate all the apps at once with heroku run python manage.py migrate
Restart your app: $ heroku maintenance:off
Is that enough, or do you have more complex needs?