I use a production database for tests (actually it's a test database running in Docker). The question is: how do I make tests run in transactions against this database? I need the same behavior as @pytest.mark.django_db(transaction=True), but with the production database.
Current setup:
conftest.py
@pytest.fixture(scope='session')
def django_db_setup():
    """Avoid creating/setting up the test database"""
    pass

@pytest.fixture
def db(request, django_db_setup, django_db_blocker):
    django_db_blocker.unblock()

@pytest.fixture
def myfixture(db):
    ...
    return SomeObject
test_example.py
def test_something(db, myfixture):
    assert ...
I finally found the solution.
Add the fixture-loading code to the db fixture:
conftest.py
from django.core.management import call_command

@pytest.fixture
def db(request, django_db_setup, django_db_blocker):
    django_db_blocker.unblock()
    call_command('loaddata', 'fixture.json')
And use @pytest.mark.django_db(transaction=True) on the tests:
test_example.py
@pytest.mark.django_db(transaction=True)
def test_something(db, myfixture):
    assert ...
After each test, pytest will flush your database and repopulate it with the fixture data.
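For reference, the whole conftest.py built from the snippets above would look roughly like the sketch below (fixture.json, myfixture and SomeObject are the placeholders used in the question; the context-manager form of unblock() shown further down this page works as well):
# conftest.py -- consolidated sketch of the setup above
import pytest

from django.core.management import call_command


@pytest.fixture(scope='session')
def django_db_setup():
    """Avoid creating/setting up a separate test database."""
    pass


@pytest.fixture
def db(request, django_db_setup, django_db_blocker):
    # Allow database access and reload the dump before every test.
    django_db_blocker.unblock()
    call_command('loaddata', 'fixture.json')


@pytest.fixture
def myfixture(db):
    ...
    return SomeObject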
I am new to Flask and have set up a simple Flask example with two tests using pytest (see here). When I run only one test it works, but if I run both tests it does not.
Does anyone know why? I think I am missing some basics of how Flask works.
code structure:
app/__init__.py
from flask import Flask

def create_app():
    app = Flask(__name__)
    with app.app_context():
        from app import views
    return app
app/views.py
from flask import current_app as app

@app.route('/')
def index():
    return 'Index Page'

@app.route('/hello')
def hello():
    return 'Hello World!'
tests/conftest.py
import pytest

from app import create_app

@pytest.fixture
def client():
    app = create_app()
    yield app.test_client()
tests/test_app.py
from app import create_app

def test_index(client):
    response = client.get("/")
    assert response.data == b"Index Page"

def test_hello(client):
    response = client.get("/hello")
    assert response.data == b"Hello World!"
The problem is with how you register the routes in app/views.py: you register them on current_app as app. Because Python caches module imports, the from app import views line inside create_app() executes views.py (and therefore registers the routes) only for the first app the fixture creates; every subsequent create_app() call returns an app with no routes registered, which is why a single test passes but running both fails. I'm not sure how you would apply the application factory pattern without using blueprints, as the pattern description in the documentation implies they are mandatory for the pattern:
If you are already using packages and blueprints for your application [...]
So I adjusted your code to use a blueprint instead:
app/main/__init__.py:
from flask import Blueprint

bp = Blueprint('main', __name__)

from app.main import views
app/views.py -> app/main/views.py:
from app.main import bp

@bp.route('/')
def index():
    return 'Index Page'

@bp.route('/hello')
def hello():
    return 'Hello World!'
app/__init__.py:
from flask import Flask

def create_app():
    app = Flask(__name__)

    # register routes with app instead of current_app:
    from app.main import bp as main_bp
    app.register_blueprint(main_bp)

    return app
Then your tests work as intended:
$ python -m pytest tests
============================== test session starts ==============================
platform darwin -- Python 3.6.5, pytest-6.1.0, py-1.9.0, pluggy-0.13.1
rootdir: /Users/oschlueter/github/simple-flask-example-with-pytest
collected 2 items
tests/test_app.py .. [100%]
=============================== 2 passed in 0.02s ===============================
I am trying to get a simple test to work against the real Django database, not the test database, using Django REST framework.
Basic test setup:
import pytest

from django.urls import reverse
from rest_framework import status
from rest_framework.test import APIClient

@pytest.mark.django_db
def test_airport_list_real():
    client = APIClient()
    response = client.get(reverse('query_flight:airports-list'))
    assert response.status_code == 200
    assert len(response.json()) > 0
Running this test I get:
___________________________ test_airport_list_real ____________________________

    @pytest.mark.django_db
    def test_airport_list_real():
        client = APIClient()
        response = client.get(reverse('query_flight:airports-list'))
        assert response.status_code == 200
>       assert len(response.json()) > 0
E       assert 0 > 0
E        +  where 0 = len([])
E        +  where [] = functools.partial(<bound method Client._parse_json of <rest_framework.test.APIClient object at 0x000001A0AB793908>>, <Response status_code=200, "application/json">)()
E        +  where functools.partial(<bound method Client._parse_json of <rest_framework.test.APIClient object at 0x000001A0AB793908>>, <Response status_code=200, "application/json">) = <Response status_code=200, "application/json">.json

query_flight\tests\query_flight\test_api.py:60: AssertionError
When I just run the shell with pipenv run python manage.py shell, I get the expected results:
In [1]: from django.urls import reverse
In [2]: from rest_framework.test import APIClient
In [3]: client = APIClient()
In [4]: response = client.get(reverse('query_flight:airports-list'))
In [5]: len(response.json())
Out[5]: 100
Using the following packages:
pytest-django==3.2.1
pytest [required: >=2.9, installed: 3.5.1]
djangorestframework==3.8.2
django [required: >=1.8, installed: 2.0.5]
Is there any way to get pytest to access the real database in this way?
The django_db marker is only responsible for providing a connection to the test database for the marked test. The Django settings passed to pytest-django are solely responsible for selecting the database used in the test run.
You can override which database pytest-django uses by redefining the django_db_setup fixture. Create a conftest.py file in the project root if you don't have one yet and override the database configuration:
# conftest.py
import pytest

from django.conf import settings


@pytest.fixture(scope='session')
def django_db_setup():
    settings.DATABASES['default'] = {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': 'path/to/dbfile.sqlite3',
    }
However, you shouldn't use the real database in tests. Make a dump of your current database to get a snapshot of the test data (python manage.py dumpdata > testdata.json) and load it into an empty test database before the test run:
# conftest.py
import pytest

from django.core.management import call_command


@pytest.fixture(scope='session')
def django_db_setup(django_db_setup, django_db_blocker):
    with django_db_blocker.unblock():
        call_command('loaddata', 'testdata.json')
Now you can't corrupt your real database when running tests, future changes to the real database (for example, deleted data) will not cause the tests to fail, and you always have a deterministic state on each test run. If you need additional test data, add it in JSON format to testdata.json and your tests are good to go.
Source: Examples in pytest-django docs.
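As a quick illustration (not taken from the docs): once the session-scoped fixture above has loaded testdata.json, an ordinary marked test can query the data directly. Both the module path query_flight.models and the Airport model below are assumptions based on the URL namespace in the question, not code from it:
import pytest

from query_flight.models import Airport  # hypothetical model; use one that exists in your dump


@pytest.mark.django_db
def test_airports_were_loaded():
    # django_db_setup has already run loaddata for the whole session,
    # so the test database is populated before this test starts.
    assert Airport.objects.count() > 0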
You've got a couple of options. Using Django's test client or DRF's APIClient will use the test database and the local version of your app by default. To connect to your live API, you could use a library like Requests to perform HTTP requests and then assert against those responses in your tests:
import pytest
import requests


@pytest.mark.django_db
def test_airport_list_real():
    response = requests.get('https://yourliveapi.biz')
    assert response.status_code == 200
    assert len(response.json()) > 0
Just be extra careful to perform exclusively read-only tests on that live database.
I am writing a Flask application that uses SQLAlchemy for its database backend.
The Flask application is created with an app factory called create_app.
from flask import Flask


def create_app(config_filename=None):
    app = Flask(__name__)
    if config_filename is None:
        app.config.from_pyfile('config.py', silent=True)
    else:
        app.config.from_mapping(config_filename)

    from .model import db
    db.init_app(app)
    db.create_all(app=app)

    return app
The database model consists of a single object called Document.
from flask_sqlalchemy import SQLAlchemy

db = SQLAlchemy()


class Document(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    document_uri = db.Column(db.String, nullable=False, unique=True)
I am using pytest for unit testing. I create a pytest fixture called app_with_documents that calls the application factory to create an application, adds some Document objects to the database before the test is run, and then empties the database after the unit test has completed.
import pytest

from model import Document, db
from myapplication import create_app


@pytest.fixture
def app():
    config = {
        'SQLALCHEMY_DATABASE_URI': f"sqlite:///:memory:",
        'TESTING': True,
        'SQLALCHEMY_TRACK_MODIFICATIONS': False
    }
    app = create_app(config)
    yield app
    with app.app_context():
        db.drop_all()


@pytest.fixture
def app_with_documents(app):
    with app.app_context():
        document_1 = Document(document_uri='Document 1')
        document_2 = Document(document_uri='Document 2')
        document_3 = Document(document_uri='Document 3')
        document_4 = Document(document_uri='Document 4')
        db.session.add_all([document_1, document_2, document_3, document_4])
        db.session.commit()
    return app
I have multiple unit tests that use this fixture.
def test_unit_test_1(app_with_documents):
    ...


def test_unit_test_2(app_with_documents):
    ...
If I run a single unit test, everything works. If I run more than one, the subsequent unit tests crash at the db.session.commit() line in the test fixture setup with "no such table: document".
    def do_execute(self, cursor, statement, parameters, context=None):
>       cursor.execute(statement, parameters)
E       sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) no such table: document [SQL: 'INSERT INTO document (document_uri) VALUES (?)'] [parameters: ('Document 1',)] (Background on this error at: http://sqlalche.me/e/e3q8)
What I expect is that each unit test gets its own brand-new, identically prepopulated database, so that all the tests succeed.
(This is an issue with the database tables, not the unit tests. I see the bug even if my unit tests consist of just pass.)
The fact that the error message mentions a missing table makes it look like the db.create_all(app=app) in create_app is not being called after the first unit test runs. However, I have verified in the debugger that this application factory function is called once for every unit test as expected.
It is possible that my call to db.drop_all() is an incorrect way to clear out the database. So instead of an in-memory database, I tried creating one on disk and then deleting it as part of the test fixture cleanup. (This is the technique recommended in the Flask documentation.)
@pytest.fixture
def app():
    db_fd, db_filename = tempfile.mkstemp(suffix='.sqlite')
    config = {
        'SQLALCHEMY_DATABASE_URI': f"sqlite:///{db_filename}",
        'TESTING': True,
        'SQLALCHEMY_TRACK_MODIFICATIONS': False
    }
    yield create_app(config)
    os.close(db_fd)
    os.unlink(db_filename)
This produces the same error.
Is this a bug in Flask and/or SQLAlchemy?
What is the correct way to write Flask test fixtures that prepopulate an application's database?
This is Flask 1.0.2, Flask-SQLAlchemy 2.3.2, and pytest 3.6.0, which are all the current latest versions.
In my conftest.py I was importing the contents of my application's model.py like so:
from model import Document, db
I was running the unit tests in PyCharm using PyCharm's pytest runner. If I instead run the tests from the command line with python -m pytest, I see the following error:
ModuleNotFoundError: No module named 'model'
ERROR: could not load /Users/wmcneill/src/FlaskRestPlus/test/conftest.py
I can get my tests running from the command line by fully-qualifying the import path in conftest.py.
from myapplication.model import Document, db
When I do this, all the unit tests pass. They also pass when I run them from inside PyCharm.
So it appears that I had incorrectly written an import statement in my unit tests. However, when I ran those unit tests via PyCharm, instead of an error message about the import, the tests launched but then failed with strange SQL errors.
I still haven't figured out why I saw those strange SQL errors; presumably it is something subtle about the way global state is handled. But changing the import line fixes my problem.
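For reference, here is a sketch of the corrected conftest.py; only the import line differs from the fixtures shown in the question. A plausible (though unverified) explanation for the SQL errors is that importing the module both as model and as myapplication.model creates two separate module objects, each with its own SQLAlchemy() instance, so the session used by the fixtures was not the one whose tables create_app had created:
# conftest.py -- sketch with the fully-qualified import
import pytest

from myapplication.model import Document, db  # fully-qualified, as described above
from myapplication import create_app


@pytest.fixture
def app():
    config = {
        'SQLALCHEMY_DATABASE_URI': "sqlite:///:memory:",
        'TESTING': True,
        'SQLALCHEMY_TRACK_MODIFICATIONS': False
    }
    app = create_app(config)
    yield app
    with app.app_context():
        db.drop_all()


@pytest.fixture
def app_with_documents(app):
    with app.app_context():
        db.session.add_all(
            [Document(document_uri=f'Document {i}') for i in range(1, 5)])
        db.session.commit()
    return app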
I am new to programming in general and this is my first web application in Python (Flask, SQLAlchemy, WTForms, etc.). I have been using the realpython.com course 2 as my study material on this subject. I have gotten to the point where I am learning about unit testing and I am having trouble getting it to work correctly. I have compared the course example to the examples I found online and I am not seeing the issue with my code.
The problem I am encountering is that the test.py script correctly creates my test.db database, but when it attempts to insert a test customer it puts it into my production database (madsenconcrete.db) instead of my test database (test.db). If I remove the production database from the script directory, it raises this error because it can't find the database; it's looking for madsenconcrete.db, not test.db:
OperationalError: (sqlite3.OperationalError) no such table: customer [SQL: u'INSERT INTO customer (name, email, telephone, created_date) VALUES (?, ?, ?, ?)'] [parameters: ('Acme Company', 'acme@domain.com', '6125551000', '2016-01-03')]
I am not sure how to troubleshoot this issue. I have done a lot of staring and comparing and I do not see the difference.
import os
import unittest
import datetime

import pytz

from views import app, db
from _config import basedir
from models import Customer

TEST_DB = 'test.db'


class AllTests(unittest.TestCase):

    ############################
    #### setup and teardown ####
    ############################

    # executed prior to each test
    def setUp(self):
        app.config['TESTING'] = True
        app.config['WTF_CSRF_ENABLED'] = False
        app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:///' + os.path.join(basedir, TEST_DB)
        app.config['SQLALCHEMY_ECHO'] = True
        self.app = app.test_client()
        db.create_all()

    # executed after each test
    def tearDown(self):
        db.session.remove()
        db.drop_all()

    # each test should start with 'test'
    def test_customer_setup(self):
        new_customer = Customer("Acme Company", "acme@domain.com", "6125551000",
                                datetime.datetime.now(pytz.timezone('US/Central')))
        db.session.add(new_customer)
        db.session.commit()


if __name__ == "__main__":
    unittest.main()
There would be an extensive amount of code I would have to paste to show all the dependencies. You can find the source code here:
https://github.com/ande0581/madsenconcrete
Thanks
Ultimately, the problem is that you are creating your db object from an already configured app:
# config
app = Flask(__name__)
app.config.from_object('_config')
db = SQLAlchemy(app)
If you use the create_app pattern (documented in more detail in this answer), you will be able to alter the configuration you load for your test application.
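A minimal sketch of what that could look like here, assuming the app/db setup is moved into a factory (the module layout, function and config names below are assumptions, not the repository's actual code):
# app_factory.py (hypothetical module)
from flask import Flask
from flask_sqlalchemy import SQLAlchemy

db = SQLAlchemy()  # created unbound; attached to an app inside create_app()


def create_app(config_object='_config'):
    app = Flask(__name__)
    app.config.from_object(config_object)
    db.init_app(app)
    return app
The test's setUp() could then call create_app() with test-only settings (for example, a config that points SQLALCHEMY_DATABASE_URI at test.db) before running db.create_all() inside that app's context, so the production database is never touched.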
I have a problem running Selenium tests with a separate Django command. The default "test" command looks in the "tests" folder and runs the unit tests fine. The problem is, I want to make a folder called "seleniumtests", place test files there, and run them with a "test_selenium" command. I want this command to do the same as the default Django "test" command, but in another directory.
The tests.py with Selenium:
from django_liveserver.testcases import LiveServerTestCase
from selenium.webdriver.firefox.webdriver import WebDriver


class MySeleniumTests(LiveServerTestCase):
    # fixtures = ['test-data.json']

    @classmethod
    def setUpClass(cls):
        cls.selenium = WebDriver()
        super(MySeleniumTests, cls).setUpClass()

    @classmethod
    def tearDownClass(cls):
        super(MySeleniumTests, cls).tearDownClass()
        cls.selenium.quit()

    def test_admin(self):
        self.selenium.get(self.live_server_url + '/admin/')
        self.assertIn("Django", self.selenium.title)
Follow this tutorial on how to put your tests into folders: http://www.pioverpi.net/2010/03/10/organizing-django-tests-into-folders/
In general:
from [Project Name].[App Name].tests.[filename] import *
from [Project Name].[App Name].seleniumtests.[selenium] import *

# starts the test suite
__test__ = {
    'your_django_tests': [filename],
    'selenium': [selenium],
}
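If you also want a dedicated manage.py command instead of passing a path to the default test command, one option is a small custom management command that delegates to the built-in test runner. This is only a sketch and assumes a modern Django layout where the default runner discovers tests by dotted module path; the app name myapp and the file location are hypothetical:
# myapp/management/commands/test_selenium.py (sketch; names are hypothetical)
from django.core.management import call_command
from django.core.management.base import BaseCommand


class Command(BaseCommand):
    help = "Run only the Selenium tests in the seleniumtests package."

    def handle(self, *args, **options):
        # Delegate to the built-in 'test' command, restricted to the
        # seleniumtests package of the app.
        call_command('test', 'myapp.seleniumtests')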