Django - Using multiple databases, one for each client

I am building a companion web/mobile application for an existing web app. The database is implemented in MySQL. I will be writing an API using Django/Django Rest Framework (DRF).
The application is used by various organizations. The interesting part about the implementation is that each organization has its own database. In the existing web application, the user enters the database name along with their login credentials.
How should I go about implementing this in Django? I am going to have lots of models, and they all need to read from the correct database depending on the current user.

In settings.py:
DATABASES = {
    'default': {},
    'database_one': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'db1',
        'PORT': '3306',
        'HOST': '127.0.0.1',
        'USER': 'db1_user',
        'PASSWORD': 'db1_pwd'
    },
    'database_two': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'db2',
        'PORT': '3306',
        'HOST': '127.0.0.1',
        'USER': 'db2_user',
        'PASSWORD': 'db2_pwd'
    }
}
To sync your databases, run:
python manage.py migrate
python manage.py migrate --database=database_one
python manage.py migrate --database=database_two
The first command operates on the default database; with the --database option you can target the database you want.
More information can be found here: https://docs.djangoproject.com/en/2.0/topics/db/multi-db/
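The configuration and migration commands above don't by themselves route queries to the right database per user. One option (a sketch, not part of the original answer; the model, serializer, and the attribute holding the user's database alias are all assumptions) is to resolve the alias from the authenticated user and pass it to .using() in a DRF view:
# views.py -- hypothetical DRF view that routes queries per user (sketch)
from rest_framework import viewsets
from myapp.models import MyModel                      # placeholder model
from myapp.serializers import MyModelSerializer       # placeholder serializer

class MyModelViewSet(viewsets.ModelViewSet):
    serializer_class = MyModelSerializer

    def get_queryset(self):
        # Assume each user's profile stores the alias of their organization's
        # database, e.g. 'database_one' or 'database_two' (an assumption).
        db_alias = self.request.user.profile.db_alias
        return MyModel.objects.using(db_alias).all()
Alternatively, a database router can pick the alias from a thread-local set in middleware, but .using() keeps the routing explicit in the view.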

Related

Django not using correct Postgres schema on Heroku

I have a Django app that I need to connect to a schema "new_schema" by default, rather than public. I've managed to get this working on my local instance by running
CREATE SCHEMA new_schema;
on the local database and then adding a database config like this to my settings:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'OPTIONS': {
            'options': '-c search_path=new_schema',
        },
        'NAME': 'database_name',
        'USER': 'user_name',
        'PASSWORD': 'password',
        'HOST': 'db',
        'PORT': '5432',
    },
}
But when I deploy to Heroku and create the schema there, I can't get the app to point to the new schema. In my production settings file, I have
DATABASES['default'] = dj_database_url.config(conn_max_age=600, ssl_require=True)
DATABASES['default']['OPTIONS']['options'] = '-c search_path=new_schema'
in order to get the Heroku database URL into the right config format and then add my schema to the options. But for some reason this doesn't work: migrations get applied to public, as do database queries.
I could explicitly set new_schema in the Postgres user's search_path, but there are other servers that need different default schemas.
I have been going in circles trying to resolve this for ages and I don't know what I'm missing. I can't tell whether this is a problem with my Heroku/Postgres setup or with my Django app.
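Since dj_database_url.config() builds the settings dict itself, a defensive way to merge the search_path option into whatever it returns (a sketch based on the snippet above, not a confirmed fix for the Heroku behaviour) is:
# production settings (sketch) -- DATABASES is assumed to be defined earlier in this module
import dj_database_url

DATABASES['default'] = dj_database_url.config(conn_max_age=600, ssl_require=True)
# Make sure an OPTIONS dict exists before writing into it, since the dict
# returned by dj_database_url.config() may not include one.
DATABASES['default'].setdefault('OPTIONS', {})
DATABASES['default']['OPTIONS']['options'] = '-c search_path=new_schema'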

Django how to use separate database for default authentication

I have a database created by a non-Django app and have defined the database connection info like below:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': os.environ.get('POSTGRES_DB'),
        'USER': os.environ.get('POSTGRES_USER'),
        'PASSWORD': os.environ.get('POSTGRES_PASSWORD'),
        'HOST': os.environ.get('POSTGRES_HOST'),
        'PORT': os.environ.get('POSTGRES_PORT'),
    }
}
Unfortunately, when I migrate the Django admin related tables, they all go into the default database.
I know that it is possible to declare multiple databases and route read and write actions to different databases; however, what I would like is to have all the default Django admin related tables created in another database. Say I declare a second database like below, how do I make sure that the Django admin related data gets migrated to the second database and is also read from there when I log in to the Django admin?
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': os.environ.get('POSTGRES_DB'),
        'USER': os.environ.get('POSTGRES_USER'),
        'PASSWORD': os.environ.get('POSTGRES_PASSWORD'),
        'HOST': os.environ.get('POSTGRES_HOST'),
        'PORT': os.environ.get('POSTGRES_PORT'),
    },
    'admin': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': os.environ.get('ADMIN_POSTGRES_DB'),
        'USER': os.environ.get('ADMIN_POSTGRES_USER'),
        'PASSWORD': os.environ.get('ADMIN_POSTGRES_PASSWORD'),
        'HOST': os.environ.get('ADMIN_POSTGRES_HOST'),
        'PORT': os.environ.get('ADMIN_POSTGRES_PORT'),
    }
}
First, leave the default dictionary empty, then define your databases for admin and non-admin data. For example:
DATABASES = {
    'default': {},
    'nonAdmin': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': os.environ.get('POSTGRES_DB'),
        'USER': os.environ.get('POSTGRES_USER'),
        'PASSWORD': os.environ.get('POSTGRES_PASSWORD'),
        'HOST': os.environ.get('POSTGRES_HOST'),
        'PORT': os.environ.get('POSTGRES_PORT'),
    },
    'admin': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': os.environ.get('ADMIN_POSTGRES_DB'),
        'USER': os.environ.get('ADMIN_POSTGRES_USER'),
        'PASSWORD': os.environ.get('ADMIN_POSTGRES_PASSWORD'),
        'HOST': os.environ.get('ADMIN_POSTGRES_HOST'),
        'PORT': os.environ.get('ADMIN_POSTGRES_PORT'),
    }
}
Then you can migrate each database using
./manage.py migrate --database=admin
or
./manage.py migrate --database=nonAdmin
If you don’t want every application to be synchronized onto a particular database, you can define a database router that implements a policy constraining the availability of particular models.
by implementing this method:
allow_migrate(db, app_label, model_name=None, **hints)
You can reference this document: https://docs.djangoproject.com/en/3.0/topics/db/multi-db/#topics-db-multi-db-routing
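A minimal router along those lines might look like this (a sketch assuming the 'admin' and 'nonAdmin' aliases above; the set of apps routed to 'admin' is an assumption you would adjust):
# routers.py -- send Django's built-in apps to the 'admin' database (sketch)
class AdminRouter:
    # Apps whose tables should live in the 'admin' database (an assumption).
    route_app_labels = {'admin', 'auth', 'contenttypes', 'sessions'}

    def db_for_read(self, model, **hints):
        if model._meta.app_label in self.route_app_labels:
            return 'admin'
        return 'nonAdmin'

    def db_for_write(self, model, **hints):
        if model._meta.app_label in self.route_app_labels:
            return 'admin'
        return 'nonAdmin'

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        if app_label in self.route_app_labels:
            return db == 'admin'
        return db == 'nonAdmin'
Then register it in settings.py (the module path is a placeholder):
DATABASE_ROUTERS = ['path.to.routers.AdminRouter']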

Using GeoDjango with an Oracle database

When working with GeoDjango I have a problem: after I run makemigrations and then migrate my new models, the log says "No migrations to apply" and the Oracle database still has no new tables.
My settings.py is:
DATABASES = {
    "default": {
        "ENGINE": "django.contrib.gis.db.backends.oracle",
        "NAME":,
        "USER":,
        "PASSWORD":,
    }
}
I need some help.
I can see you are just getting started. Try something like this:
DATABASES = {
    'default': {
        'ENGINE': 'django.contrib.gis.db.backends.oracle',
        'NAME': 'mydatabase',
        'USER': 'mydatabaseuser',
        'PASSWORD': 'mypassword',
        'HOST': '127.0.0.1',
        'PORT': '1521',  # Oracle's default listener port
    }
}
First you need to create a user and password in your Oracle database and put that information in settings.py.
You can find more information in the Django documentation on databases.
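If the engine and credentials are correct but migrate still reports "No migrations to apply", it is worth confirming that migration files actually exist for your app and are being picked up, for example (myapp is a placeholder app label):
python manage.py makemigrations myapp
python manage.py showmigrations myapp
python manage.py migrate myapp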

Multiple databases (MongoDB [MongoEngine] and SQL) with Django 1.8

I am using MongoEngine with Django, and my project needs to connect to one MongoDB instance and one SQL database. What should the DATABASES section of my settings.py look like?
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'admin_db',
        'USER': 'root',
        'PASSWORD': 'root',
        'HOST': 'localhost',
    },
}
from mongoengine import connect

connect(
    db='pom',
    username='admin',
    password='root',
    host='mongodb://admin:root@localhost'
)
You can add multiple databases for your app in settings.py like this:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'admin_db',
        'USER': 'root',
        'PASSWORD': 'root',
        'HOST': 'localhost',
    },
    'your_desired_db_name': {
        'ENGINE': 'django_mongodb_engine',
        'NAME': 'db_name'
    }
}
For integration with MongoDB, you may want to look at:
Django-nonrel
Django-MongoEngine
Also, see the Django documentation on multiple databases.
MongoEngine does not support all Django contrib modules directly. If your project doesn't need them (unlikely), you can use MongoEngine directly. Otherwise you can also try
djongo
which seems to work fine with the latest Django version.
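If you go the plain MongoEngine route, documents are defined outside the Django ORM and use the connect() call shown above rather than DATABASES (a minimal sketch; the Page document and its fields are made up for illustration):
# documents.py -- a MongoEngine document living alongside Django ORM models (sketch)
from mongoengine import Document, StringField, IntField

class Page(Document):
    # Hypothetical fields, just to show the pattern
    title = StringField(required=True, max_length=200)
    views = IntField(default=0)

# Usage, assuming connect(...) has already been called at startup:
# Page(title='Home').save()
# Page.objects(title='Home').first()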

Can I set up a database just for Django testing?

Creating a temporary database when running Django tests is really time-consuming, so I am thinking of setting up a separate database just for Django testing. Is this possible? If so, how can I achieve it?
To be more specific: I don't want Django to create a new database, run migrations, run the tests, and destroy the database every single time I run ./manage.py test. I want it to connect to an existing database (one I have already created for testing) and do all the work there.
You can use the --keepdb option when running the tests, so that the database is not destroyed at the end of each run.
./manage.py test --keepdb
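If you also want Django to reuse a specific, pre-created database for tests rather than creating test_<NAME>, you can point the test runner at it via the TEST key in DATABASES and combine that with --keepdb (the database names below are just examples):
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'mydb',
        'USER': 'mydatabaseuser',
        'PASSWORD': 'mypassword',
        'HOST': '127.0.0.1',
        'PORT': '5432',
        # Used only when running tests; with --keepdb it is created once and reused.
        'TEST': {
            'NAME': 'my_existing_test_db',
        },
    }
}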
I'm not sure if I understand your question, but when I test during development, I just use SQLite as my testing database. When I deploy to production, I use PostgreSQL. This is my settings.py when I'm testing during development:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
    }
}
I used to use PostgreSQL for testing as well (I switched to SQLite because it is quicker to set up).
If I am testing with PostgreSQL, my settings.py would look like this:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'TESTDB',
        'USER': 'mydatabaseuser',
        'PASSWORD': 'mypassword',
        'HOST': '127.0.0.1',
        'PORT': '5432',
    }
}
and when I'm not testing, I'd use this:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'ProductionDB',
        'USER': 'mydatabaseuser',
        'PASSWORD': 'mypassword',
        'HOST': '127.0.0.1',
        'PORT': '5432',
    }
}
Like I mentioned earlier though, I started using SQLite for testing because it is quicker.
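One way to avoid editing settings.py by hand when switching between the two configurations above is to branch on an environment variable (a sketch; the variable name DJANGO_TESTING is an assumption, not something from the original post):
import os

if os.environ.get('DJANGO_TESTING') == '1':
    # Testing configuration (SQLite, as above)
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.sqlite3',
            'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
        }
    }
else:
    # Production configuration (PostgreSQL, as above)
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.postgresql_psycopg2',
            'NAME': 'ProductionDB',
            'USER': 'mydatabaseuser',
            'PASSWORD': 'mypassword',
            'HOST': '127.0.0.1',
            'PORT': '5432',
        }
    }
Then run the tests with something like DJANGO_TESTING=1 ./manage.py test.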