Django: alter settings for one app

Well, I have a Django project that currently works fine.
I'd like to add a new app to it, in which I need to access multiple databases.
I know Django supports multiple-database settings and I know how to configure them. That is not the problem.
The issue is that about 90% of my project's components don't need multiple databases at all; the second database is only used by the newly added app.
So I tried to alter the settings by calling:
django.conf.settings.configure(DATABASES = {....})
in the new app, and Django said:
RuntimeError: Settings already configured.
Which makes sense, since I already have the original settings file and DJANGO_SETTINGS_MODULE set up.
So my question is: what would be a good approach in this case?
I don't want to discard the DJANGO_SETTINGS_MODULE variable.
I'd prefer not to include the second database in the settings file from the start, since the new app is an independent module that should also work on its own outside this Django project. So I want the app to carry a similar config of its own for its database setup.
Does anyone have any ideas about this?
Thanks in advance!

Actually, I have the same issue in a current project. Like you, I have a totally independent app that uses another database, and I may have other apps with the same behaviour.
What I did was create an apps directory where I store my apps, and then add this at the end of my settings.py file:
DATABASE_ROUTERS = ['myproject.routers.MultiDBRouter']

import os

APPS_DIR = os.path.join(PROJECT_ROOT, 'apps')

for app_name in os.listdir(APPS_DIR):
    print('\nLooking for settings in apps/%s :' % app_name)
    if os.path.exists(os.path.join(APPS_DIR, app_name, 'settings.py')):
        print('  Settings file found...')
        app = __import__('%s.settings' % app_name)
        content = dir(app.settings)
        if 'DATABASES' in content:
            print('  Adding databases :')
            for key, value in app.settings.DATABASES.items():
                if key in DATABASES:
                    print('    Cannot add %s database config, because it already exists' % key)
                else:
                    DATABASES[key] = value
                    DATABASES[key]['APPS'] = [app_name]
                    print('    Added %s database config' % key)
It automatically looks for a settings.py file in every apps/myapp/ directory. If it finds a DATABASES variable in apps/myapp/settings.py, it adds those database configurations to your main DATABASES variable (the real one).
I have also created a router (the MultiDBRouter) so that I don't have to use .using() everywhere.
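The answer doesn't show the router itself, so here is a minimal sketch of what myproject/routers.py could look like, assuming it keys off the 'APPS' list that the loop above attaches to each database entry (the method bodies are my guess at the intent, not the author's actual code):

# myproject/routers.py -- hedged sketch, not the author's actual router
from django.conf import settings

class MultiDBRouter(object):
    def _db_for_app(self, app_label):
        # Return the alias of the database whose 'APPS' list contains this app.
        for alias, config in settings.DATABASES.items():
            if app_label in config.get('APPS', []):
                return alias
        return None

    def db_for_read(self, model, **hints):
        return self._db_for_app(model._meta.app_label)

    def db_for_write(self, model, **hints):
        return self._db_for_app(model._meta.app_label)

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        target = self._db_for_app(app_label)
        return (db == target) if target else None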
Then I add a settings.py file to every app that requires another database:
DATABASES = {
    'db': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'database',
        'USER': 'username',
        'PASSWORD': 'mysecretpassword',
    }
}


Is there a way to disable throttling while using Pytest in Django?

Problem:
I want to figure out a way to disable throttling when running my tests with pytest -vv
Details:
I have this default throttling policy in my settings.py file:
'DEFAULT_THROTTLE_RATES': {
    'anon': '100/day',
    'user': '1000/day'
}
I also have this fixture in my conftest.py, which returns an error whenever I exceed the request limit:
def get_token(user, client):
    response = client.post(
        "/email-login",
        {"email": user.email, "password": "B9vX95phJDi3C4"},
    )
    return {
        "HTTP_AUTHORIZATION": f"Bearer {response.json()['token']['access']}"
    }
What I have tried:
I have attempted to use the solution in this GitHub Issue: https://github.com/encode/django-rest-framework/issues/1336, but it doesn't work in my case.
First you need a way to differentiate between the test environment and everything else, like we do for PROD and DEV using the settings.DEBUG config.
My recommendation is to create an env variable test=True in your test environment, and then in your settings.py only apply the throttle configuration when it is not set:
import os

if not os.environ.get("test", False):
    REST_FRAMEWORK = {
        'DEFAULT_THROTTLE_CLASSES': [
            'rest_framework.throttling.AnonRateThrottle',
            'rest_framework.throttling.UserRateThrottle'
        ],
        'DEFAULT_THROTTLE_RATES': {
            'anon': '100/day',
            'user': '1000/day'
        }
    }
When the variable is set, the block is skipped and DRF will not throttle.
I was able to resolve this problem with the following steps:
I created a new settings file which inherits from the base settings file, i.e. from settings import *
Then I deleted the DEFAULT_THROTTLE_RATES key, i.e. del REST_FRAMEWORK["DEFAULT_THROTTLE_RATES"]
Next I pointed pytest.ini at the new settings file, i.e. DJANGO_SETTINGS_MODULE = new_settings (a module path, not a file name)
Now the tests will use the new settings file
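For concreteness, a minimal sketch of what those steps might produce; the module name test_settings and the pytest.ini layout are assumptions, not the poster's actual files:

# test_settings.py (hypothetical name) -- settings used only by pytest
from settings import *  # noqa: F401,F403 -- start from the base settings

# Drop the throttle configuration so DRF never throttles during tests.
REST_FRAMEWORK = dict(REST_FRAMEWORK)
REST_FRAMEWORK.pop("DEFAULT_THROTTLE_RATES", None)
REST_FRAMEWORK["DEFAULT_THROTTLE_CLASSES"] = []  # also clear classes to avoid missing-rate errors

# pytest.ini then points at this module (a module path, not a file name):
#
#   [pytest]
#   DJANGO_SETTINGS_MODULE = test_settings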
#ra123 has the right idea in general. As another approach, with all Django projects I add something like this to my settings/__init__.py (or just settings.py if you keep a single file). It looks at argv to see if it's in test mode:
import sys

IS_TESTING = bool(set(sys.argv[:2]) & {"pytest", "test", "jenkins"})

REST_FRAMEWORK = { "YOUR_CONFIG": "..." }

# at the very very end, AFTER your settings are loaded:
if IS_TESTING:
    # override your rest framework settings in test mode
    REST_FRAMEWORK["DEFAULT_THROTTLE_CLASSES"] = []

    # some other handy things, for making tests faster/easier
    PASSWORD_HASHERS = ("django.contrib.auth.hashers.MD5PasswordHasher",)
    EMAIL_BACKEND = "django.core.mail.backends.locmem.EmailBackend"
    DEFAULT_FILE_STORAGE = "inmemorystorage.InMemoryStorage"
I ended up with it this way so we don't have to worry about it ever getting the wrong settings. It also helps keep things centralized, so (for example) you don't call sentry.init in testing mode, even if there is a sentry_url in the environment.
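To illustrate that last point, a hedged sketch of how the same flag can gate error reporting (sentry-sdk and the SENTRY_DSN variable name are assumptions; the answer doesn't name them):

import os
import sentry_sdk

# Only initialise Sentry outside of test runs, even if a DSN is present.
if not IS_TESTING and os.environ.get("SENTRY_DSN"):
    sentry_sdk.init(dsn=os.environ["SENTRY_DSN"])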

Unit testing a project with multiple applications and databases - Circular dependency in TEST[DEPENDENCIES]

So I've got a Django project with several applications.
Each application uses its own database, and they share a common database containing Django's own tables (such as auth and sessions).
For this I've got several database routers, so my settings look like this:
DATABASES = {
    'default': {
        ..
    },
    'app1_db': {
        ..
    },
    'app2_db': {
        ..
    }
}

DATABASE_ROUTERS = ["site.db_router.App1Router", "site.db_router.App2Router"]
# no router for default database
Each application also has its own unit tests. To troubleshoot my problem I extracted one of the applications. It contains a tests module with four test files. Test file number one looks like this:
class ExcelTestCase(TransactionTestCase):
    databases = ["app1_db"]

    # some tests
Test case 1
Running python manage.py test app1.tests.testfile1 raises this error: django.core.exceptions.ImproperlyConfigured: Circular dependency in TEST[DEPENDENCIES]
Test case 2
I comment out the databases attribute: only the default test database is created (not the application one) and this error is raised:
AssertionError: Database queries to 'app1_db' are not allowed in this test. Add 'app1_db' to app1.tests.testfile1.ExcelTestCase.databases to ensure proper test isolation and silence this failure.
Test case 3
I uncomment the databases attribute and run python manage.py test app1.tests so every test is run.
Three of the four test files use TransactionTestCase and therefore have a databases attribute.
I need to comment out the databases attribute in at least the first or the third test file (the second one is not a TransactionTestCase) or I get the circular dependency error, and by doing so I get the AssertionError because I can't use the app's database.
I'm using Django 2.2 and have been struggling with this issue for a while; any help will be appreciated!
Maybe try setting your 'DEPENDENCIES'.
See "controlling creation order for test databases" in the Django docs.
Check that the databases receive a list of dependencies, and that the databases without dependencies receive an EMPTY list.
Basically you have to add something like:
'default': {
    ..
    'TEST': {
        'DEPENDENCIES': ['app1_db'],
    },
},
'app1_db': {
    # ... db settings
    'TEST': {
        'DEPENDENCIES': [],
    },
}
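Applied to the three-database setup from the question, the relevant part of settings.py might look like the sketch below. By default Django assumes every test database depends on 'default', so giving the extra databases an explicit empty list breaks the implicit cycle; adjust the lists if one test database genuinely has to be created after another:

DATABASES = {
    'default': {
        # ... db settings ...
        'TEST': {'DEPENDENCIES': []},
    },
    'app1_db': {
        # ... db settings ...
        'TEST': {'DEPENDENCIES': []},
    },
    'app2_db': {
        # ... db settings ...
        'TEST': {'DEPENDENCIES': []},
    },
}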

Does the django-constance admin support the database backend?

I'm trying to set up the admin to show settings meant to be stored in the database backend (Postgres 9.5.0). I manually created values in shell_plus as follows:
In [1]: from constance.backends.database.models import Constance
In [2]: first_record = Constance.objects.get(id=1)
In [3]: first_record
Out[3]:
pgAdmin properly shows the entry, although the Django admin doesn't show it at all. I ran the migrate command for both databases (I have default and product databases) but the record still isn't showing up. I can certainly make it work by forcing a registration with the admin as follows:
admin.site.register(Constance)
but my question is whether that's necessary.
Yes, it does.
You need to manage the dependencies, but you can just use the following command to install them:
pip install "django-constance[database]"
You also need to add some additional settings to your settings.py:
CONSTANCE_BACKEND = 'constance.backends.database.DatabaseBackend'
INSTALLED_APPS = (
    # other apps
    'constance.backends.database',
)

# optional - in case you want to specify a table prefix
CONSTANCE_DATABASE_PREFIX = 'constance:myproject:'
Then you need to apply the migrations by running python manage.py migrate database
To display settings inputs in the admin, you should specify them in your settings.py. There are various field types, and you can even add your own using the CONSTANCE_ADDITIONAL_FIELDS setting.
CONSTANCE_CONFIG = {
    'THE_ANSWER': (42, 'Answer to the Ultimate Question of Life, '
                       'The Universe, and Everything'),
}
You can read more on the documentation page.
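For completeness, reading and updating one of these values at runtime looks roughly like this (following the constance docs, using the THE_ANSWER key defined above):

from constance import config

# Read the current value; this falls back to the default from CONSTANCE_CONFIG
# until a row exists in the database backend.
print(config.THE_ANSWER)

# Update it; with the database backend this writes the same record the admin edits.
setattr(config, 'THE_ANSWER', 43)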

Django multiple and dynamic databases

I've been evaluating Django and wondered whether the following is possible. I've already looked at the regular multiple-database docs, so please don't point me to those, because this use case isn't mentioned as far as I can make out. If I'm wrong, I take it back :)
I want one main database in which most of my app's models will reside; however, one of the apps will need to dynamically create databases, and these will be customer-specific databases.
The database path (I plan to use SQLite) will be stored in the primary database, so the connection would need to be switched, but the model will remain the same.
I would welcome any thoughts on ways to achieve this.
I will open with "You should not edit settings at runtime".
Having said that, I have exactly this same issue, where I want to create a unique database for each user. The reason for doing this is that I am offering users the ability to save to and read from a database not stored on my server, which entails having multiple databases, one per user.
This answer is NOT the recommended way to achieve the desired goal. I would love to hear from a Django guru how best to approach this problem. However, this is a solution I have been using and it has worked well so far. I am using SQLite, but it can easily be modified for any of the databases.
In summary, this is the process:
Add the new database to settings (at runtime)
Create a file to store these settings for reloading when the server is restarted (at runtime)
Run a script which loads the saved settings files (whenever the server is restarted)
Now, how to achieve this:
1) Firstly, when a new user is created, I create a new database in the settings. This code lives in my view where new users are created.
from YOUR_PROJECT_NAME import settings
database_id = user.username #just something unique
newDatabase = {}
newDatabase["id"] = database_id
newDatabase['ENGINE'] = 'django.db.backends.sqlite3'
newDatabase['NAME'] = '/path/to/db_%s.sql' % database_id
newDatabase['USER'] = ''
newDatabase['PASSWORD'] = ''
newDatabase['HOST'] = ''
newDatabase['PORT'] = ''
settings.DATABASES[database_id] = newDatabase
save_db_settings_to_file(newDatabase) #this is for step 2)
This script loads the database settings 'at runtime' into the django project settings. However if the server is restarted, this database will no longer be in settings.
2) To facilitate reloading these settings automatically whenever the server is restarted, I create a file for each database which will be loaded whenever the server is started. Creating this file is performed by the function save_db_settings_to_file:
import os

def save_db_settings_to_file(db_settings):
    path_to_store_settings = "/path/to/your/project/YOUR_PROJECT_NAME/database_settings/"
    newDbString = """
DATABASES['%(id)s'] = {
    'ENGINE': '%(ENGINE)s',  # Add 'postgresql_psycopg2', 'mysql', 'sqlite3' or 'oracle'.
    'NAME': '%(NAME)s',      # Or path to database file if using sqlite3.
    'USER': '',              # Not used with sqlite3.
    'PASSWORD': '',          # Not used with sqlite3.
    'HOST': '',              # Set to empty string for localhost. Not used with sqlite3.
    'PORT': '',              # Set to empty string for default. Not used with sqlite3.
}
""" % db_settings
    file_to_store_settings = os.path.join(path_to_store_settings, db_settings['id'] + ".py")
    with open(file_to_store_settings, "w") as f:  # was write_file(...), pseudocode in the original
        f.write(newDbString)
3) To actually load these settings when the server is started, I add a single line to the very bottom of /path/to/your/project/YOUR_PROJECT_NAME/settings.py, which loads each file in the settings folder and runs it, having the effect of loading the database details into the settings.
import settings_manager
Then, import settings_manager will load the file at /path/to/your/project/YOUR_PROJECT_NAME/settings_manager.py, which contains the following code:
from settings import DATABASES

import os

path_to_store_settings = "/path/to/your/project/YOUR_PROJECT_NAME/database_settings/"

for fname in os.listdir(path_to_store_settings):
    full_path = os.path.join(path_to_store_settings, fname)
    with open(full_path) as f:
        content = f.read()
    exec(content)  # you'd better be sure that the file doesn't contain anything malicious
Note that you could put this code directly at the bottom of settings.py instead of the import statement, but using the import statement keeps the abstraction level of settings.py consistent.
This is a convenient way to load each database's settings: to remove a database from the settings, all you have to do is delete its settings file; the next time the server restarts it won't load those details, and the database will no longer be accessible.
As I said, this works and I have had success using it so far, but this is NOT the ideal solution. I would really appreciate if someone could post a better solution.
What's bad about it:
It explicitly defies the Django team's advice not to modify settings at runtime. I do not know the reason this advice is given.
It uses an exec statement to load the data into settings. This should be OK, but if you get some corrupt or malicious code in one of those files you will be a sad panda.
Note that I still use the default database for auth and sessions data, but all the data from my own apps is stored in the user-specific database.
To augment #thedawnrider's answer, in some cases editing settings.DATABASES may not be enough. It might be more reliable to edit django.db.connections.databases, which serves as a cache and wrapper around settings.DATABASES.
e.g.
from django.db import connections
database_id = user.username #just something unique
newDatabase = {}
newDatabase["id"] = database_id
newDatabase['ENGINE'] = 'django.db.backends.sqlite3'
newDatabase['NAME'] = '/path/to/db_%s.sql' % database_id
newDatabase['USER'] = ''
newDatabase['PASSWORD'] = ''
newDatabase['HOST'] = ''
newDatabase['PORT'] = ''
connections.databases[database_id] = newDatabase
This question is pretty outdated and should be updated.
Django supports multiple databases out of the box.
There are many good tutorials [1, 2] and answers on how to manage connections dynamically.
You may also simply use django-dynamic-db-router.
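As a rough, hedged illustration of the django-dynamic-db-router approach (based on that project's README; check the exact API of the version you install, and note that per its docs the package's DynamicDbRouter has to be added to DATABASE_ROUTERS). The model import is hypothetical, for illustration only:

from dynamic_db_router import in_database

from myapp.models import Record  # hypothetical model, for illustration only

# Connection info built on the fly, e.g. from a path stored in the primary DB.
customer_db = {
    'ENGINE': 'django.db.backends.sqlite3',
    'NAME': '/path/to/customer_42.sqlite3',
}

with in_database(customer_db):
    # Force evaluation inside the block so the query hits customer_db.
    rows = list(Record.objects.all())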
Database connection settings may be configured on the fly by using the configure method of the settings object:
from django.conf import settings
dbs = settings.DATABASES.copy()
dbs['some_new_db'] = {'ENGINE': 'dummy'}
settings.configure(DATABASES=dbs)

Changing Django settings at runtime

I'd like to expose some (app-specific) settings to the admin interface, so users can change them comfortably and also not have to restart Django.
How should I go about this?
I checked out the applications on http://djangopackages.com/grids/g/live-setting/ (btw django-constance was the most appealing) but really what all these apps are doing is storing values in a database, providing a web interface to change them, and caching. Aren't the first two features already built into Django?
The biggest drawbacks I see are that none of the apps are drop-in replacements for the old location of these settings (settings.py), and require me to migrate to their notation, and often add another context processor to access them in templates.
Couldn't I just do this?
Create a model for my settings (this gives me the various types and validation)
Instantiate one such object to hold my settings (this allows the users to edit them in the admin interface) - I could dump defaults as fixtures like for other models
Wrap settings.py so it makes a database query for my settings - http://www.loose-bits.com/2011/04/extending-django-settings-with-derived.html
From my current, naive point of view the only drawbacks I see would be:
Adding or changing the available settings requires a schema migration (south). - I can live with that.
I have a model with possibly multiple instances but really only need a singleton. - That could actually be a useful feature at some point.
Performance/Caching: Looking at http://code.djangoproject.com/svn/django/trunk/django/conf/ I'd have to put a little bit of cleverness into the settings wrapper and/or model, so that model changes clear or update cached values. - doesn't seem to be rocket science.
Doing the same in another project would require a similar effort again. - I think a single dictionary constant in settings.py, holding model name(s) and field names for the lookups is all that would differ.
Wouldn't this be the best of both worlds - runtime admin (with all its perks), database backend, caching, and none of my settings.USED_TO_BE_IN_SETTINGS_DOT_PY would need any changing. Am I missing something?
AFAIK, the Django settings are supposed to be immutable. There are multiple reasons for this, the most obvious being that Django is not aware of the server's execution model (prefork / multi-threaded).
Also, you can't load the settings themselves from a Django model because the settings need to be loaded before you can use anything in the ORM.
So, basically, you have two solutions:
you can bootstrap the settings from the database by using any lower-level database access mechanism to load them; or
you can just define your settings in some other model and fetch them directly when you need them.
The first is an incredible hack and I don't suggest it. The second is much more direct and cleaner, but requires you to change your usual habits (from django.conf import settings).
The second approach is probably what's implemented by the 3rd-party apps you linked to.
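A hedged sketch of the second approach, with caching bolted on so reads stay cheap; the model and field names are invented for illustration and aren't from any of the packages mentioned:

# myapp/models.py -- "settings in a model", the second option above
from django.core.cache import cache
from django.db import models

class SiteSetting(models.Model):
    key = models.CharField(max_length=100, unique=True)
    value = models.CharField(max_length=255)

    @classmethod
    def get(cls, key, default=None):
        # Cache lookups so most reads never hit the database.
        cache_key = 'sitesetting:%s' % key
        value = cache.get(cache_key)
        if value is None:
            try:
                value = cls.objects.get(key=key).value
            except cls.DoesNotExist:
                return default
            cache.set(cache_key, value, 60)  # cache for a minute
        return value

Registering SiteSetting with the admin then gives the editing UI the question asks about, and a save() override or post_save signal can invalidate the cached entry when a value changes.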
From Django 1.8 docs:
You shouldn’t alter settings in your applications at runtime.
DATABASES is a dict, so you can manipulate it like a dictionary:
import django.conf as conf
conf.settings.DATABASES['default']['NAME'] = 'novo_banco'
Take a look: https://bitbucket.org/bkroeze/django-livesettings
Django-Livesettings is a project split from the Satchmo Project. It provides the ability to configure settings via an admin interface, rather than by editing settings.py.
Maybe it can be helpful for you.
Honestly, I understand Django better when I analyze its code. In version 1.4.5 I did it by following the modules below:
myproject\manage.py
django\core\management\__init__.py  ## method - execute_manager
django\conf\__init__.py             ## class - LazySettings; attr - _wrapped
django\utils\functional.py          ## class - LazyObject; important method - new_method_proxy
It's a functional option, but it has its risks. In Python, a leading underscore marks the attribute as protected.
from django.conf import settings
settings._wrapped.INSTALLED_APPS = ()  ## it really works
In the following project: https://github.com/alexsilva/DJPlugins
you can see this variable being modified at runtime. The idea of the project is already working.
You can use the recommended configure() method of the settings module:
from django.conf import settings
settings.configure(DEBUG=True)
The settings module has additional handy features; check the docs.
You cannot directly modify the settings.py file.
For example:
If you want to change the database at runtime, you should separate the database configuration:
# Project_name/user_database.py
user_database = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'name',
        'USER': 'admin',
        'PASSWORD': '111111',
        'HOST': '127.0.0.1',
        'PORT': '3306'
    },
    'user_db': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'user1',
        'USER': 'admin',
        'PASSWORD': '22222',
        'HOST': '127.0.0.1',
        'PORT': '3306'
    }
}
# Project_name/settings.py
from .user_database import user_database
...
DATABASES = user_database
...
Then use it in your view logic:
# views.py
from Project_name.user_database import user_database

class Some(...):
    def some(self, request):
        user_database['user_db']['NAME'] = 'user2'
Then you can change any setting at runtime this way.
If you have a variable in settings.py and you want to be able to change it at any time, there are two ways.
The first is to create a table in the database, then a serializer and a viewset for it; any time you want the value, you send an HTTP request to that viewset and retrieve the data.
The second is to use caching, which is very fast and similar to localStorage in Angular (see the sketch below).
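A minimal sketch of the caching option using Django's own cache framework (the key name and the code that sets the value are assumptions for illustration):

from django.core.cache import cache

# Somewhere with write access (an admin action, a management command, a view):
cache.set('feature_banner_text', 'Maintenance tonight at 22:00', None)  # None = never expire

# Anywhere you previously read a value from settings:
banner = cache.get('feature_banner_text', '')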