Importing multiple models in the create_app method - Flask

I have a base Flask app that I have put together to build future projects from. It is set up with an application factory function, "create_app".
https://github.com/thenetimp/flask_base_v2/blob/master/app/init.py#L1-L32
In the create_app function, I initialize the application object, then pass it to the previously initialized db object, eventually calling db.create_all() to create the database from my model(s).
For this to work I have to import every model I have into the create_app function. That isn't a problem for a small database with a few tables, but with a large number of tables it seems like there should be a better way. from app.models import * doesn't work inside functions, so I have to ask: is there another way to manage this?

In the code I was trying to do:
def create_app():
    from app.models import *
    ...
when all I needed to do was:
def create_app():
    import app.models
    ...
I got so used to doing it one way that I forgot about the other.
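For reference, here is a minimal sketch of the factory with that import, assuming db is a flask_sqlalchemy.SQLAlchemy() instance created in app/__init__.py (the config values and variable names are illustrative, not the exact repo layout):
from flask import Flask
from flask_sqlalchemy import SQLAlchemy

db = SQLAlchemy()

def create_app():
    # the Flask instance is deliberately not named "app" so the
    # "import app.models" statement below does not shadow it locally
    flask_app = Flask(__name__)
    flask_app.config["SQLALCHEMY_DATABASE_URI"] = "sqlite:///app.db"
    db.init_app(flask_app)

    # merely importing the module registers every model class on db.metadata,
    # without pulling all the model names into this function's namespace
    import app.models  # noqa: F401

    with flask_app.app_context():
        db.create_all()

    return flask_app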

Related

Services and flask_sqlalchemy inside a flask application

I'm building a large, complex Flask application. It has lots of route functions (which need to do database calls) and lots of services (which also need to do database calls).
I'm using flask_sqlalchemy to do the database calls in the route functions as normal with model syntax (User.query).
I would also like to make database calls in the services. However, I can't do this without setting an app context:
with app.app_context():
    User.query
which requires importing the app, which leads to circular imports (because the app imports the routes, which import the services, which have to import the app).
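Concretely, a service currently has to look something like this (just a sketch; get_user is a hypothetical service function):
from app import app            # this import is what creates the cycle
from app.models import User

def get_user(user_id):
    with app.app_context():
        return User.query.get(user_id)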
QUESTION 1: Do you know a way to get round this circular import problem?
Another approach could be to use flask_sqlalchemy in the routes and sqlalchemy in the services. This means nothing needs to be shared except for the database URL and... the models.
QUESTION 2: Do you know a way to use the same model files with both flask_sqlalchemy and normal sqlalchemy?
I've tried this: https://dev.to/nestedsoftware/flask-and-sqlalchemy-without-the-flask-sqlalchemy-extension-3cf8
...however, it breaks my relationships.
I am following the flask app pattern shown here: https://github.com/sloria/cookiecutter-flask/blob/master/%7B%7Bcookiecutter.app_name%7D%7D/README.rst (application factory)
Thanks,
Louise

Custom faker provider for usage with factory boy and pytest

I am attempting to add some custom Faker providers to use with factory_boy and pytest.
I put the provider in faker_providers/foo.py as a Provider class.
In my factories.py file, I have to import foo.py and then register it by running:
factory.Faker.add_provider(foo.Provider)
I am thinking of using pytest_sessionstart(session) to auto-register all the custom providers under faker_providers. Is there a way to do that?
Any suggestions for other ways to organize and register custom providers would also be appreciated.
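For concreteness, the pytest_sessionstart idea could look roughly like this (a sketch only; it assumes faker_providers is an importable package whose modules each expose a Provider class, as described above):
import importlib
import pkgutil

from factory import Faker

import faker_providers

def pytest_sessionstart(session):
    # walk every module in faker_providers/ and register its Provider class
    for module_info in pkgutil.iter_modules(faker_providers.__path__):
        module = importlib.import_module(f"faker_providers.{module_info.name}")
        provider = getattr(module, "Provider", None)
        if provider is not None:
            Faker.add_provider(provider)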
Instead of instantiating a Faker instance to import from conftest, as of 2022 you can do the following inside your conftest file:
from factory import Faker
Faker.add_provider(CustomProvider)
Now you can just use from factory import Faker wherever you need it. This works because the Faker class has a class-level dictionary attribute that stores your new providers.
The reason I put it in conftest is that this is where code runs first for all of pytest. You could also put this in a pytest plugin setup method, but I found this the easiest.
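Hypothetical usage in factories.py once conftest.py has registered the provider ("custom_name" stands in for whatever formatter CustomProvider supplies):
import factory
from factory import Faker

class UserFactory(factory.DictFactory):
    name = Faker("custom_name")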
The other way of doing this that I implemented was to have a utils folder with all my custom providers. Then, in a providers.py, I would add all my providers:
from factory import Faker
from .custom_provider import CustomProvider

for provider in (CustomProvider,):
    Faker.add_provider(provider)
Then in conftest.py, I would simply import this file:
import path.to.provider # noqa
This way, I don't clutter my conftest too much.
It seems like a design choice, and only you know the best answer to it.
However, I would recommend instantiating faker = Faker() once for all tests, adding all the providers in a configuration file, and importing faker from that place everywhere it's needed.
It seems like conftest.py is a good choice for that.
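A sketch of that setup, with assumed module and provider names (the provider path mirrors the question's faker_providers/foo.py):
from faker import Faker

from faker_providers.foo import Provider as FooProvider

faker = Faker()
faker.add_provider(FooProvider)

# elsewhere in a test:
#   from conftest import faker      # or wherever this configuration module lives
#   value = faker.foo_value()       # hypothetical formatter supplied by FooProvider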

Import Django models from outside the project

I'm working on an app which uses two tables from different databases. I managed to make the connection and build the table structures in models.py, but now whenever I change the models.py file, I copy one of the tables into another Python script and put the file elsewhere for other people to use. My question: is it possible in Django to import a model from outside the project, or the package?
The app is called banner_manager, and in views.py I want to import a model called user from another project called django_models.
When I try to import it like this:
from ....models_django import models
(the "user" class is defined in models.py), it says: ValueError: attempted relative import beyond top-level package
You can add this directory to PYTHONPATH, for example:
export PYTHONPATH=$PYTHONPATH:/var/python/your-libs
And then just import the package as normal:
import models_django
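If you'd rather not rely on the environment, the same thing can be done at runtime with plain sys.path manipulation (a sketch; the directory and module names mirror the example above):
# e.g. near the top of settings.py
import sys

sys.path.append("/var/python/your-libs")

import models_django  # noqa: E402  -- now importable like any other package
# and, assuming the layout from the question:
#   from models_django.models import user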

Where to initialize MongoDB connection in Django projects?

I wonder where I should initialize my MongoDB connection in my Django projects.
Currently I am initializing the client and db before my view functions in views.py:
import pymongo
from django.conf import settings
client = pymongo.MongoClient(settings.MONGO_URI)
db = client.get_default_database()
def some_view(request):
    pass
However, I also need to use MongoDB in my models.py in conjunction with Django signals. What do you suggest?
Maybe settings.py? Or even root __init__.py? Then you can import client and db everywhere you need it.
I've decided to use project/mongodb.py (same folder as settings.py):
import pymongo
from django.conf import settings
client = pymongo.MongoClient(settings.MONGO_URI)
mongodb = client.get_default_database()
I am using two different settings files for local and production. This approach makes it possible to use environment-dependent settings while letting me access the mongodb variable from anywhere in the project.
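For instance, a signals handler can then use the shared connection like this (a sketch; Article and its title field are hypothetical):
from django.db.models.signals import post_save
from django.dispatch import receiver

from project.mongodb import mongodb
from .models import Article

@receiver(post_save, sender=Article)
def mirror_article(sender, instance, **kwargs):
    # keep a copy of the row in MongoDB every time it is saved
    mongodb.articles.replace_one(
        {"_id": instance.pk},
        {"title": instance.title},
        upsert=True,
    )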

Using django-discover-runner without database

I'm trying to use django-discover-runner to test my app. It's basically a web service frontend, so it doesn't include a database, and apparently django-discover-runner doesn't like that.
Looking at other questions, I've seen that with plain Django I should inherit from DjangoTestSuiteRunner and set settings.TEST_RUNNER, and that works fine. But django-discover-runner uses its own discover_runner.DiscoverRunner class, so I tried this:
from discover_runner import DiscoverRunner

class DBLessTestRunner(DiscoverRunner):
    def setup_databases(self):
        pass

    def teardown_databases(self, *args):
        pass
But it doesn't work. I get this error message:
ImproperlyConfigured: settings.DATABASES is improperly configured. Please supply the ENGINE value. Check settings documentation for more details.
Any idea how to get django-discover-runner working without a database?
In Django 1.6 the standard Django TestCase inherits from TransactionTestCase, which attempts to access the database.
To fix the problem, make your test class inherit from SimpleTestCase rather than TestCase:
from django.test import SimpleTestCase

class TestViews(SimpleTestCase):
    ...
You should now be able to run your tests without setting up the database.
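For example, a view test that never touches the database might look like this (a sketch; the URL and assertion are placeholders):
from django.test import SimpleTestCase

class TestViews(SimpleTestCase):
    def test_homepage_responds(self):
        # SimpleTestCase still provides the test client; it just forbids DB queries
        response = self.client.get("/")
        self.assertEqual(response.status_code, 200)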