In Django how can I influence the connection process to the DB?

What is the best method to change the way Django connects to the DB (e.g. MySQL)?
For instance, what should I do if I need Django to connect to the DB via an SSH tunnel whose settings may change dynamically? (I'm planning to use sshtunnel.)
I understand I should subclass django.db.backends.mysql.base.DatabaseWrapper and probably override get_new_connection(self, conn_params), calling super() inside it? (See the example below.)
But then how would I reference this custom class in the settings, given that the settings seem to expect a path to a module rather than a class?
Something along the lines of:
import json

from django.db.backends.mysql.base import DatabaseWrapper

class MyDatabaseWrapper(DatabaseWrapper):
    """Oversimplified example."""

    def get_new_connection(self, conn_params):
        with open('path/to/json.js', 'rt') as file:
            my_conn_params = json.load(file)
        conn_params.update(my_conn_params)
        return super().get_new_connection(conn_params)
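On the settings part of the question: the ENGINE entry is indeed a dotted path to a module, not to a class. Django imports <ENGINE>.base and looks for a class named DatabaseWrapper there, so one approach (package name assumed for illustration) is to put the subclass, renamed to DatabaseWrapper, in the base.py of your own backend package and point ENGINE at that package:
# myproject/db_backends/tunneled_mysql/base.py would hold the subclass above,
# renamed to DatabaseWrapper so Django can discover it.

# settings.py
DATABASES = {
    'default': {
        'ENGINE': 'myproject.db_backends.tunneled_mysql',
        # ... the usual NAME/USER/PASSWORD/HOST/PORT entries
    },
}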

Related

Django settings.py

At login in my project, some variables from settings.py are loaded to enable certain behaviors:
unit_id = settings.COMPANY
When another logged-in user changes the value of this variable through a function, the change is reflected for all other users who are already active:
settings.COMPANY = "coke"
In this case, all users will see "coke" in settings.COMPANY. I believed this change would live in memory and apply only to the user session in question, since I did not write to the physical file.
I wonder if this is how Django handles settings.py variables: does a change propagate dynamically to all instances opened by all users?
This variable is accessed in context_processors.py, below:
def units(request):
    unit_id = settings.COMPANY
    return {'unit_id': unit_id}
You should not change settings at runtime.
This is (mainly) because Django doesn't know anything about its runtime, so it's entirely possible to run multiple instances of the same Django installation. Changing a setting like this will not propagate it to any other processes.
I wonder if this is how Django handles settings.py variables: does a change propagate dynamically to all instances opened by all users?
Django doesn't run an instance for every user. There are one or more processes (for example, if you use something like gunicorn, or multiple servers behind a load balancer) listening on a certain port.
To have a changeable setting, you could specify a default value in settings.py, but you should store something like the active company in the database, as sketched below.
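A minimal sketch of that idea (the SiteConfig model and helper name are hypothetical, not part of the original project):
from django.conf import settings
from django.db import models

class SiteConfig(models.Model):
    """A single-row table holding values that may change at runtime."""
    active_company = models.CharField(max_length=100)

def get_active_company():
    config = SiteConfig.objects.first()
    # fall back to the static default from settings.py
    return config.active_company if config else settings.COMPANY
Since every request reads the value from the database, a change made by one user is seen consistently by all processes.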

How to override database settings in Django tests?

I'm on Django 1.8 (using pytest) and I have the following configuration:
A default and a readonly database managed by a MasterSlaveRouter that directs DB calls to one connection or the other depending on whether they're read or write operations.
In my development environment, both entries in the settings.DATABASES dictionary have the same setup (they just use a different connection, but the database is the same).
In my test environment, however, there's only a default database.
I have a post_save signal fired whenever a model Foo is saved.
I have an atomic operation (decorated with @transaction.atomic) that modifies a Foo instance and calls .save() on it twice. Since no custom using parameter is passed to the decorator, the transaction is only active on the default database.
The post_save callback creates a Bar record with a OneToOneField pointing to Foo, but only after checking whether a Bar record with this foo_id already exists (in order to avoid IntegrityError). This check is done by performing this query:
already_exists = Bar.objects.filter(foo=instance).exists()
This is fine the first time the post_save callback is called: a Bar record is created and everything works. The second time, however, even though such a Bar instance was just created by the previous Foo save, the filtering is a read operation and is therefore performed on the readonly connection. As a result, already_exists ends up False and the creation of a new record is triggered, which eventually throws an IntegrityError, because when the create operation is performed on the default connection there is already a record with that foo_id.
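For reference, a router of the kind described above typically looks something like this (a sketch for illustration, not the exact router in use):
class MasterSlaveRouter(object):
    """Send reads to the readonly alias and writes to the default alias."""

    def db_for_read(self, model, **hints):
        return 'readonly'

    def db_for_write(self, model, **hints):
        return 'default'

    def allow_relation(self, obj1, obj2, **hints):
        # both aliases point at the same data in this setup
        return True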
I tried copying the DATABASES dictionary from dev_settings to test_settings, but this broke many tests. I then read about the override_settings decorator and thought it would be perfect for my situation. To my surprise, however, it didn't work. It seems that at some point, when the application is initialized, the DATABASES dictionary (the one with only default from the test_settings) is cached, and even though I then change settings.DATABASES, the new value is simply never read.
How can I properly override the database configuration for one specific test?
Hmm... well, if you are using only pytest, I think you'll need to clean up your databases after tests.
Now, to override Django settings, it's good to use:
from django.test import override_settings
@override_settings(DATABASE_CONFIG=<new_config>)
def test_foo():
    pass
You should try pytest-django:
import pytest

pytestmark = pytest.mark.django_db

@pytest.mark.django_db
def test_foo():
    pass
When you run your tests, you can pass the --create-db option to force py.test to create a new database, or, if you want to reuse your DB, the --reuse-db option, like:
$ py.test --create-db
$ py.test --reuse-db
Check out the official docs.

How can I know if the connection to an existing database Neo4j has been successful?

I am a newbie in Python/Django. Using the neo4django library and an existing Neo4j database, I would like to connect to it and test whether the connection is successful. How can I achieve this?
You don't 'connect' to a database anymore; that is the framework's job. You just define the parameters and start writing models.
Those models are your entities, with fields that can be used like variables. In other words, your models are your definition of the database tables.
You can test against http://host:7474/db/data/ and check whether it returns a 200 OK.
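A minimal sketch of that check using the requests library (host, port, and lack of authentication are assumptions):
import requests

try:
    response = requests.get('http://localhost:7474/db/data/', timeout=5)
    connected = (response.status_code == 200)
except requests.exceptions.ConnectionError:
    connected = False

print('Connected' if connected else 'No connection to database.')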
I don't know much about neo4django, but you can test whether a database is accessible with py2neo, the general-purpose Python driver (http://py2neo.org/2.0/).
One simple way is to ask the server for its version:
from py2neo import Graph
from py2neo.packages.httpstream import SocketError
# adjust as necessary
graph = Graph("http://localhost:7474/db/data/")
try:
    print(graph.neo4j_version)
except SocketError:
    print('No connection to database.')

Multi-tenant Django applications using Mongoengine

I want to build a multi-tenant architecture for a SaaS system. We are using Django as our backend, MongoEngine as our main database layer, and gunicorn as our web server.
Our clients are a few big companies, so the number of databases, and pre-allocating space for them, shouldn't be a problem.
The first approach we took was to write a middleware to determine the source of the request to properly connect to a mongoengine database. Here is the code:
class MongoConnectionMiddleware(object):
    def process_request(self, request):
        if request.user.is_authenticated():
            mongo_connect(request.user.profile.establishment)
And the mongo_connect method:
def mongo_connect(establishment):
    db_name = 'db_client_%d' % establishment.id
    connect(db_name)
This will register the "default" alias as the db_name for every mongoengine request.
But it seems that when many concurrent users from different companies are making requests, each one sets the default db_name to its own name.
As an example:
Company A makes a request and connects to database A. While A is doing its work, company B connects to database B. This makes A also connect to B's database mid-process, so A fails to find some IDs.
Is there a way to isolate the connection to the Mongo database per request to avoid this problem?
Unfortunately MongoEngine seems to be designed around a very basic use case of a single primary connection and multiple auxiliary connections.
http://docs.mongoengine.org/en/latest/guide/connecting.html#connecting-to-mongodb
To get around the default connection logic, I define the first connection I come across as the default, and I also add it as a named connection. I then add any subsequent connections as named connections only.
https://github.com/MongoEngine/mongoengine/issues/607#issuecomment-38651532
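A sketch of that registration pattern (the helper and its name are mine, not MongoEngine API; only connect and its alias parameter are):
from mongoengine import connect

_registered = set()

def register_tenant_connection(db_name):
    """The first connection seen becomes the default; each also gets its own alias."""
    if db_name in _registered:
        return
    if not _registered:
        connect(db_name)  # registered under the 'default' alias
    connect(db_name, alias=db_name)
    _registered.add(db_name)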
You can use the switch_db context manager to switch from one connection to another, but as soon as you leave the with statement, it will revert. It also still requires a default connection.
http://docs.mongoengine.org/en/latest/guide/connecting.html#switch-database-context-manager
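For reference, its usage looks roughly like this (the document class, field, and alias names are assumed):
from mongoengine.context_managers import switch_db

# 'tenant-db' must be a previously registered connection alias
with switch_db(MyDocument, 'tenant-db') as TenantDocument:
    TenantDocument(name='example').save()
# outside the with block, MyDocument reverts to its default alias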
You might be able to put it inside a function and then yield inside the with statement to prevent it reverting immediately; I'm not sure whether this is valid.
You could use a wrapper of some kind, either a function, a class, or a custom QuerySet, that checks the current Django/Flask session and switches the DB to the appropriate connection.
I'm not sure if a QuerySet can do this, but it would probably be the nicest way if it can.
http://docs.mongoengine.org/en/latest/guide/querying.html#custom-querysets
I included some code in this issue here where I change the database connection for my models.
https://github.com/MongoEngine/mongoengine/issues/605
def switch(model, db):
    model._meta['db_alias'] = db
    # must set _collection to None so it is re-evaluated
    model._collection = None
    return model

MyDocument = switch(MyDocument, 'db-alias')
You'll also want to take a look at the code that mongoengine uses to switch dbs.
Beware that MongoEngine likes to cache things, so changing a few variables here and there doesn't always have an effect. It's full of surprises like this.
Edit:
I should also add that the connect call won't pick up value changes. So calling connect with new parameters won't take effect unless it's a new alias. Even the disconnect function (which isn't exposed publicly) doesn't let you do this, as the models will cache the connection. I mention this in some of the issues linked above and also here: https://github.com/MongoEngine/mongoengine/issues/566

Django Celery Database for Models on Producer and Worker

I want to develop an application which uses Django as the frontend and Celery to do background work.
Now, sometimes Celery workers on different machines need database access to my Django frontend machine (two different servers).
They need to know some real-time information, and to run the Django app with
python manage.py celeryd
they need access to a database with all models available.
Do I have to access my MySQL database through a direct connection? That is, do I have to allow the user "my-django-app" access not only from localhost on my frontend machine but also from my other worker servers' IPs?
Is this the "right" way, or am I missing something? It just doesn't seem very safe (without SSL), but maybe that's the way it has to be.
Thanks for your responses!
They will need access to the database. That access will be through a database backend, which can be one that ships with Django or one from a third party.
One thing I've done in my Django site's settings.py is load database access info from a file in /etc. This way the access setup (database host, port, username, password) can be different for each machine, and sensitive info like the password isn't in my project's repository. You might want to restrict access to the workers in a similar manner, by making them connect with a different username.
You could also pass in the database connection information, or even just a key or path to a configuration file, via environment variables, and handle it in settings.py.
For example, here's how I pull in my database configuration file:
import os

g = {}
dbSetup = {}
execfile(os.environ['DB_CONFIG'], g, dbSetup)

if 'databases' in dbSetup:
    DATABASES = dbSetup['databases']
else:
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.mysql',
            # ...
        }
    }
Needless to say, you need to make sure that the file in DB_CONFIG is not accessible to any user besides the db admins and Django itself. The default case should refer Django to a developer's own test database. There may also be a better solution using the ast module instead of execfile, but I haven't researched it yet.
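One such alternative might read the file as a single dict literal (a sketch under that assumption; ast.literal_eval refuses anything but plain Python literals, so no code in the file gets executed):
import ast
import os

with open(os.environ['DB_CONFIG']) as f:
    DATABASES = ast.literal_eval(f.read())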
Another thing I do is use separate users for DB admin tasks vs. everything else. In my manage.py, I added the following preamble:
# Find a database configuration, if there is one, and set it in the environment.
adminDBConfFile = '/etc/django/db_admin.py'
dbConfFile = '/etc/django/db_regular.py'

import sys
import os

def goodFile(path):
    return os.path.isfile(path) and os.access(path, os.R_OK)

if len(sys.argv) >= 2 and sys.argv[1] in ["syncdb", "dbshell", "migrate"] \
        and goodFile(adminDBConfFile):
    os.environ['DB_CONFIG'] = adminDBConfFile
elif goodFile(dbConfFile):
    os.environ['DB_CONFIG'] = dbConfFile
Where the config in /etc/django/db_regular.py is for a user with access to only the Django database with SELECT, INSERT, UPDATE, and DELETE, and /etc/django/db_admin.py is for a user with these permissions plus CREATE, DROP, INDEX, ALTER, and LOCK TABLES. (The migrate command is from South.) This gives me some protection from Django code messing with my schema at runtime, and it limits the damage an SQL injection attack can cause (though you should still check and filter all user input).
This isn't a solution to your exact problem, but it might give you some ideas for ways to smarten up Django's database access setup for your purposes.