Flask: what happened to the script_info_option?

I wanted it to add common app-configuration options such as --config, --loglevel, --logfile, etc. to my flask.cli group and read them from the ScriptInfo object in my app factory function.
The @script_info_option decorator was apparently removed after 0.11 with a cryptic commit message like "implementing simplified interface".
So... how do I add app-factory-time configuration options now?

Found out how you do it now:
Decorate your app factory with @click.pass_context so it gets the context as the first argument.
In the app factory, use ctx.find_root().params to get what was passed to the group.
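A minimal sketch of that approach (the --config-mode option and the config module mirror the example below; exact FlaskGroup behaviour can differ between Flask versions):
import click
import config
from flask import Flask

@click.pass_context
def create_app(ctx):
    app = Flask(__name__)
    # options declared on the CLI group end up in the root context's params
    config_mode = ctx.find_root().params.get("config_mode")
    if config_mode:
        app.config.from_object(getattr(config, config_mode))
    return app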

You can create an app factory and pass arguments to it with the @pass_script_info decorator like so...
manage.py
#!/usr/bin/env python
import click
import config
from flask import Flask
from flask.cli import FlaskGroup, pass_script_info
def create_app(script_info):
    app = Flask(__name__)
    if script_info.config_mode:
        obj = getattr(config, script_info.config_mode)
        app.config.from_object(obj)
    ...
    return app

@click.group(cls=FlaskGroup, create_app=create_app)
@click.option('-m', '--config-mode', default="Development")
@pass_script_info
def manager(script_info, config_mode):
    script_info.config_mode = config_mode

if __name__ == "__main__":
    manager()
config.py
class Config(object):
    TESTING = False

class Production(Config):
    DATABASE_URI = 'mysql://user@localhost/foo'

class Development(Config):
    DATABASE_URI = 'sqlite:///app.db'

class Testing(Config):
    TESTING = True
    DATABASE_URI = 'sqlite:///:memory:'
Now on the command line (after pip-installing your package so that a manage entry point is available) you can do manage -m Production run.
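If the package is not installed as a console script, the __main__ guard in manage.py means you can also invoke the script directly:
python manage.py -m Production run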

Related

Using global variable with application context in flask

I defined a customized logger in mylogger.py, and the log dir is defined in config.py as LOG_DIR='/var/log'. In mylogger.py, the logger does some init work with LOG_DIR.
# mylogger.py
import logging
import os

from flask import current_app

log_path = os.path.join(current_app.config['LOG_DIR'], "all.log")
handler = logging.FileHandler(filename=log_path)
...
# config.py
LOG_DIR = "/var/log"
Flask gives this error message:
This typically means that you attempted to use functionality that needed
to interface with the current application object in some way. To solve
this, set up an application context with modules.app_context(). See the
documentation for more information.
How can I use a variable from config.py within the app context? Thank you.
import logging
import os
from flask import Flask

def register_logging(app):
    log_path = os.path.join(app.config['LOG_DIR'], "all.log")
    handler = logging.FileHandler(filename=log_path)

def create_app():
    app = Flask(__name__)
    register_logging(app)
    return app

app = create_app()
Or just import it directly from config.py:
from config import LOG_DIR
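A sketch of that second option, keeping the rest of mylogger.py unchanged:
# mylogger.py
import logging
import os

from config import LOG_DIR

log_path = os.path.join(LOG_DIR, "all.log")
handler = logging.FileHandler(filename=log_path)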

Moving Custom CLI commands to another file

I have a few custom cli commands for a flask app I'm writing. I'm following the instructions here:
Command Line Interface
The problem is that I do not want to put them all in my app.py file; it will get bloated. What I would like is to have this project structure:
project
|_ app.py
|_ cli.py
I thought about using a blueprint, but I get "Blueprint has no attribute 'cli'"
This is what I tried:
cli = Blueprint('cli', __name__)  # I knew this would not work but I had to try

@cli.cli.command()
@click.argument('name')
def create_user(name):
    print("hello")
Thanks
I would do something like this:
cli.py:
from flask import Flask
import click
def register_cli(app: Flask):
    @app.cli.command()
    @click.argument('name')
    def create_user(name):
        print("hello", name)
app.py:
from flask import Flask
from cli import register_cli
app = Flask(__name__)
register_cli(app)
It's common to create and configure (or just configure) the app in a factory function.
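For example, if you use a factory, you can call register_cli from it (create_app here is just an illustrative name):
# app.py, factory variant
from flask import Flask
from cli import register_cli

def create_app():
    app = Flask(__name__)
    register_cli(app)
    return app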

using flask-migrate with flask-script and application factory

I'm building a Flask application and decided to try the application factory approach this time, but I ran into trouble with flask-migrate and can't figure out a simple solution.
Please note that I want to pass the config location as an option to the script.
manage.py:
manager = Manager(create_app)
manager.add_option("-c", "--config", dest="config_module", required=False)
Then I need to create a migrate instance and add the command to the manager:
with manager.app.app_context():
    migrate = Migrate(current_app, db)
    manager.add_command('db', MigrateCommand)
But the app instance is not created yet, so this fails.
I know I can pass the config in an environment variable and create the application before creating the manager instance, but how do I do it using manager options?
When you use a --config option you have to use an application factory function, as you know: the factory receives the config as an argument, which allows you to create the app with the right configuration.
Since you have an app factory, you have to use the deferred initialization for all your extensions. You instantiate your extensions without passing any arguments, and then in your app factory function after you create the application, you call init_app on all your extensions, passing the app and db now that you have them.
The MigrateCommand is completely separate from this, you can add that to the Manager without having app and db instances created.
Example:
manage.py:
from app import create_app
from flask_script import Manager
from flask_migrate import MigrateCommand

manager = Manager(create_app)
manager.add_option("-c", "--config", dest="config_module", required=False)
manager.add_command('db', MigrateCommand)
app.py (or whatever your app factory module is called):
from flask import Flask
from flask_sqlalchemy import SQLAlchemy
from flask_migrate import Migrate
db = SQLAlchemy()
migrate = Migrate()
def create_app(config):
    app = Flask(__name__)
    # apply configuration
    # ...
    # initialize extensions
    db.init_app(app)
    migrate.init_app(app, db)
    return app
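With this layout the db commands run against a properly configured app. The invocation would look something like this (the config value is illustrative; options added with Flask-Script's add_option are given before the sub-command):
python manage.py -c production db upgrade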

Importing Flask app when using app factory and flask script

This is the Flask app context:
app = Flask(__name__)
with app.app_context():
    # insert code here
Most use cases of the app context involve having 'app' initialized in the same script or importing app from the base module.
My application is structured as the following:
# application/__init__.py
from flask import Flask

def create_app(config):
    app = Flask(__name__)
    return app

# manage.py
from application import create_app
from flask_script import Manager, Server

manager = Manager(create_app)
manager.add_command("debug", Server(host='0.0.0.0', port=7777))
This might be a really trivial issue, but how should I call 'with app.app_context()' if my application is structured like this?
Flask-Script calls everything inside the test context, so you can use current_app and other idioms:
The Manager runs the command inside a Flask test context. This means that you can access request-local proxies where appropriate, such as current_app, which may be used by extensions.
http://flask-script.readthedocs.org/en/latest/#accessing-local-proxies
So you don't need to use with app.app_context() with Manager scripts. If you're trying to do something else, then you'd have to create the app first:
from application import create_app
app = create_app()
with app.app_context():
    # stuff here

How to Unit test with different settings in Django?

Is there any simple mechanism for overriding Django settings for a unit test? I have a manager on one of my models that returns a specific number of the latest objects. The number of objects it returns is defined by a NUM_LATEST setting.
This has the potential to make my tests fail if someone were to change the setting. How can I override the settings on setUp() and subsequently restore them on tearDown()? If that isn't possible, is there some way I can monkey patch the method or mock the settings?
EDIT: Here is my manager code:
class LatestManager(models.Manager):
    """
    Returns a specific number of the most recent public Articles as defined by
    the NEWS_LATEST_MAX setting.
    """
    def get_query_set(self):
        num_latest = getattr(settings, 'NEWS_NUM_LATEST', 10)
        return super(LatestManager, self).get_query_set().filter(is_public=True)[:num_latest]
The manager uses settings.NEWS_NUM_LATEST to slice the queryset. The getattr() is simply used to provide a default should the setting not exist.
EDIT: This answer applies if you want to change settings for a small number of specific tests.
Since Django 1.4, there are ways to override settings during tests:
https://docs.djangoproject.com/en/stable/topics/testing/tools/#overriding-settings
TestCase will have a self.settings context manager, and there will also be an @override_settings decorator that can be applied to either a test method or a whole TestCase subclass.
These features did not exist yet in Django 1.3.
If you want to change settings for all your tests, you'll want to create a separate settings file for test, which can load and override settings from your main settings file. There are several good approaches to this in the other answers; I have seen successful variations on both hspander's and dmitrii's approaches.
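A minimal sketch of both mechanisms, reusing the NEWS_NUM_LATEST setting from the question (the test class and method names are just illustrative):
from django.test import TestCase, override_settings

class LatestArticlesTest(TestCase):
    @override_settings(NEWS_NUM_LATEST=5)
    def test_with_decorator(self):
        ...

    def test_with_context_manager(self):
        with self.settings(NEWS_NUM_LATEST=5):
            ...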
You can do anything you like to the UnitTest subclass, including setting and reading instance properties:
import unittest
from django.conf import settings

class MyTest(unittest.TestCase):
    def setUp(self):
        self.old_setting = settings.NUM_LATEST
        settings.NUM_LATEST = 5  # value tested against in the TestCase

    def tearDown(self):
        settings.NUM_LATEST = self.old_setting
Since the Django test cases run single-threaded, however, I'm curious about what else may be modifying the NUM_LATEST value. If that "something else" is triggered by your test routine, then I'm not sure any amount of monkey patching will save the test without invalidating the veracity of the test itself.
You can pass the --settings option when running tests:
python manage.py test --settings=mysite.settings_local
Although overriding the settings configuration at runtime might help, in my opinion you should create a separate file for testing. This saves a lot of configuration for testing and ensures that you never end up doing something irreversible (like cleaning the staging database).
Say your testing file exists in 'my_project/test_settings.py', add
settings = 'my_project.test_settings' if 'test' in sys.argv else 'my_project.settings'
in your manage.py. This will ensure that when you run python manage.py test you use test_settings only. If you are using some other test runner like pytest, you could just as easily add this to pytest.ini.
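For pytest-django, for example, the equivalent is a single line in pytest.ini (the module path is illustrative):
[pytest]
DJANGO_SETTINGS_MODULE = my_project.test_settings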
Update: the solution below is only needed on Django 1.3.x and earlier. For 1.4 and later, see slinkp's answer.
If you change settings frequently in your tests and use Python ≥2.5, this is also handy:
from contextlib import contextmanager

class SettingDoesNotExist:
    pass

@contextmanager
def patch_settings(**kwargs):
    from django.conf import settings
    old_settings = []
    for key, new_value in kwargs.items():
        old_value = getattr(settings, key, SettingDoesNotExist)
        old_settings.append((key, old_value))
        setattr(settings, key, new_value)
    yield
    for key, old_value in old_settings:
        if old_value is SettingDoesNotExist:
            delattr(settings, key)
        else:
            setattr(settings, key, old_value)
Then you can do:
with patch_settings(MY_SETTING='my value', OTHER_SETTING='other value'):
    do_my_tests()
You can override a setting even for a single test function.
from django.test import TestCase, override_settings

class SomeTestCase(TestCase):
    @override_settings(SOME_SETTING="some_value")
    def test_some_function(self):
        ...
Or you can override the setting for every function in the class.
@override_settings(SOME_SETTING="some_value")
class SomeTestCase(TestCase):
    def test_some_function(self):
        ...
@override_settings is great if you don't have many differences between your production and testing environment configurations.
Otherwise you'd better just have different settings files. In that case your project will look like this:
your_project
    your_app
        ...
    settings
        __init__.py
        base.py
        dev.py
        test.py
        production.py
    manage.py
So you need to have most of your settings in base.py, and then in the other files you import everything from there and override some options. Here's what your test.py file will look like:
from .base import *

DEBUG = False

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': 'app_db_test'
    }
}

PASSWORD_HASHERS = (
    'django.contrib.auth.hashers.MD5PasswordHasher',
)

LOGGING = {}
And then you either need to specify the --settings option as in @MicroPyramid's answer, or specify the DJANGO_SETTINGS_MODULE environment variable, and then you can run your tests:
export DJANGO_SETTINGS_MODULE=settings.test
python manage.py test
For pytest users.
The biggest issue is:
override_settings doesn't work with pytest.
Subclassing Django's TestCase will make it work but then you can't use pytest fixtures.
The solution is to use the settings fixture documented here.
Example
def test_with_specific_settings(settings):
    settings.DEBUG = False
    settings.MIDDLEWARE = []
    ...
And in case you need to update multiple fields
def override_settings(settings, kwargs):
    for k, v in kwargs.items():
        setattr(settings, k, v)

new_settings = dict(
    DEBUG=True,
    INSTALLED_APPS=[],
)

def test_with_specific_settings(settings):
    override_settings(settings, new_settings)
I created a new settings_test.py file which imports everything from the settings.py file and modifies whatever is different for testing purposes.
In my case I wanted to use a different cloud storage bucket when testing.
settings_test.py:
from project1.settings import *
import os
CLOUD_STORAGE_BUCKET = 'bucket_name_for_testing'
manage.py:
import os
import sys

def main():
    # use separate settings_test.py for tests
    if 'test' in sys.argv:
        print('using settings_test.py')
        os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project1.settings_test')
    else:
        os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project1.settings')
    try:
        from django.core.management import execute_from_command_line
    except ImportError as exc:
        raise ImportError(
            "Couldn't import Django. Are you sure it's installed and "
            "available on your PYTHONPATH environment variable? Did you "
            "forget to activate a virtual environment?"
        ) from exc
    execute_from_command_line(sys.argv)

if __name__ == '__main__':
    main()
Found this while trying to fix some doctests... For completeness I want to mention that if you're going to modify the settings when using doctests, you should do it before importing anything else...
>>> from django.conf import settings
>>> settings.SOME_SETTING = 20
>>> # Your other imports
>>> from django.core.paginator import Paginator
>>> # etc
I'm using pytest.
I managed to solve this the following way:
import importlib

import django

import app.settings
import modules.that.use.settings

# do some stuff with the default setting
app.settings.VALUE = "some value"
django.setup()

importlib.reload(app.settings)
importlib.reload(modules.that.use.settings)
# do some stuff with the setting's new value
You can override settings in tests this way:
from django.test import TestCase, override_settings

test_settings = override_settings(
    DEFAULT_FILE_STORAGE='django.core.files.storage.FileSystemStorage',
    PASSWORD_HASHERS=(
        'django.contrib.auth.hashers.UnsaltedMD5PasswordHasher',
    )
)

@test_settings
class SomeTestCase(TestCase):
    """Your test cases in this class"""
And if you need these same settings in another file you can just directly import test_settings.
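For example (the importing module path and class name here are hypothetical):
# another test module
from django.test import TestCase
from myapp.tests.test_storage import test_settings

@test_settings
class AnotherTestCase(TestCase):
    """More tests that need the same overridden settings"""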
If you have multiple test files placed in a subdirectory (a Python package), you can override settings for all of them based on whether the string 'test' is present in sys.argv.
app
    tests
        __init__.py
        test_forms.py
        test_models.py
__init__.py:
import sys

from project import settings

if 'test' in sys.argv:
    NEW_SETTINGS = {
        'setting_name': value,
        'another_setting_name': another_value
    }
    settings.__dict__.update(NEW_SETTINGS)
Not the best approach, but I used it to change the Celery broker from Redis to memory.
One setting for all tests in a TestCase:
class TestSomething(TestCase):
    def setUp(self):
        # self.settings() returns an override; enable it for the whole test
        settings_override = self.settings(SETTING_BAR={'ALLOW_FOO': True})
        settings_override.enable()
        self.addCleanup(settings_override.disable)
Override one setting in the TestCase:
from django.test import override_settings

@override_settings(SETTING_BAR={'ALLOW_FOO': False})
def i_need_other_setting(self):
    ...
Important
Even though you are overriding these settings, this will not apply to anything your server initialized with those settings, because that initialization has already happened; to change those you will need to start Django with a different settings module.