Is it possible to have a model with foreign key fields that live on different databases?
Example:
class MultiDBModel(models.Model):
    db1_user = models.ForeignKey(User)  # this foreign key should live on `db1`
    db2_user = models.ForeignKey(User)  # and this one on `db2`
Maybe by copying User somehow and attaching a custom manager to it that returns a queryset with using='db1'?
in settings.py:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',  # Add 'postgresql_psycopg2', 'mysql', 'sqlite3' or 'oracle'.
        'NAME': 'db1',   # Or path to database file if using sqlite3.
        'USER': '',      # Not used with sqlite3.
        'PASSWORD': '',  # Not used with sqlite3.
        'HOST': '',      # Set to empty string for localhost. Not used with sqlite3.
        'PORT': '',      # Set to empty string for default. Not used with sqlite3.
    },
    'website': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': 'db2',
        'USER': '',
        'PASSWORD': '',
        'HOST': '',
        'PORT': '',
    }
}
No. As Ignacio Vazquez-Abrams wrote, one model must have all of its fields in the same database.
BUT
As an alternative, you can use a proxy model to link models from two different databases.
Aim
One model should provide the fields from db1 and db2 at the same time
General trick
You have the model ContactMessage in db1, which we will call legacy_db. We assume you don't want to touch this model, since it comes from another project.
Create a proxy model ProxyContactMessage, which has the same attributes as ContactMessage.
Use a database router to tell Django to look in legacy_db for ProxyContactMessage objects.
Add a new model ExtendedContactMessage with the fields you would like to add, and declare a OneToOneField to ProxyContactMessage. This data will be saved to your db2, django_db.
Your proxy model cannot hold the new fields itself (a proxy has no table of its own), but it can have methods that ask the related ExtendedContactMessage object (if any). Add the callables you want.
Example
In your legacy_app/models.py, the model on db1 legacy_db is:
class ContactMessage(models.Model):
    subject = models.CharField(max_length=255)
    message = models.TextField()
    created_at = models.DateTimeField()
    created_by = models.CharField(max_length=255)

    class Meta:
        managed = False
        db_table = 'contact_message'

    def __unicode__(self):
        return self.subject
Therefore you create in myapp/models.py:
class ProxyContactMessage(ContactMessage):
    class Meta:
        proxy = True
        verbose_name = 'Contact message'
        verbose_name_plural = 'Contact messages'

    def add_extension(self):
        e = ExtendedContactMessage(contact_message=self)
        e.save()
        return e

    def mark_as_processed(self):
        try:
            e = self.extendedcontactmessage
        except ExtendedContactMessage.DoesNotExist:
            e = self.add_extension()
        e.mark_as_processed()

    def processed(self):
        return self.extendedcontactmessage.processed

    def processed_at(self):
        return self.extendedcontactmessage.processed_at


class ExtendedContactMessage(models.Model):
    contact_message = models.OneToOneField(ProxyContactMessage)
    processed = models.BooleanField(default=False, editable=False)
    processed_at = models.DateTimeField(null=True, default=None, editable=False)

    def mark_as_processed(self):
        # Requires: from django.utils import timezone
        self.processed = True
        self.processed_at = timezone.now()
        self.save()
Note that only ExtendedContactMessage is saved in db2; ProxyContactMessage has no table of its own.
In settings.py, point DATABASE_ROUTERS at a router class such as:
class LegacyRouter(object):
    """
    A router to control all database operations on models in the
    legacy database.
    """
    def db_for_read(self, model, **hints):
        if model.__name__ == 'ProxyContactMessage':
            return 'legacy_db'
        return None

    def db_for_write(self, model, **hints):
        """
        Writes ProxyContactMessage objects to the legacy DB.
        """
        if model.__name__ == 'ProxyContactMessage':
            return 'legacy_db'
        return None
Your default router sends everything to db2.
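For completeness, that default router can be as small as the sketch below; the module path myapp.routers and the alias django_db are illustrative, matching the db2 naming used above.

```python
class DefaultRouter(object):
    """Catch-all router: anything LegacyRouter declines goes to db2."""

    def db_for_read(self, model, **hints):
        return 'django_db'

    def db_for_write(self, model, **hints):
        return 'django_db'


# In settings.py, order matters: the first router with an opinion wins.
DATABASE_ROUTERS = ['myapp.routers.LegacyRouter', 'myapp.routers.DefaultRouter']
```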
Finally you may have an admin class like:
def mark_as_processed(modeladmin, request, queryset):
    for obj in queryset:
        obj.mark_as_processed()
mark_as_processed.short_description = "Mark as processed"


class ProxyContactMessageAdmin(admin.ModelAdmin):
    list_display = (
        'subject',
        'message',
        'created_at',
        'created_by',
        'processed',
        'processed_at',
    )
    actions = (mark_as_processed,)

admin.site.register(ProxyContactMessage, ProxyContactMessageAdmin)
Related:
Use a router for the proxy class
"Hack" the app_name in Meta
Catch the queryset
No. The ORM cannot do anything the database engine isn't capable of.
Related
I am trying to implement multiple-database support for my Django (version 1.11) app. For that purpose I have included this in my settings.py:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'labrin_dbase',
        'USER': 'labrin_admin',
        'PASSWORD': 'ndzwwZHv63STuvAF?C_$L#j#*#epZXaX',
        'HOST': 'localhost',
        'PORT': '5432',
    },
    'comment': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'labrin_dbase_comments',
        'USER': 'labrin_admin_comments',
        'PASSWORD': 'adzwaTHv63STuvAF!C_$L#j#*#epZXaY',
        'HOST': 'localhost',
        'PORT': '5433',
    }
}

DATABASE_ROUTERS = [
    'labrin_task.comment_router.CommentRouter',
]
And my database router is configured as below:
class CommentRouter(object):
    def db_for_read(self, model, **hints):
        if model._meta.db_table == 'todo_comment':
            return 'comment'
        return None

    def db_for_write(self, model, **hints):
        if model._meta.db_table == 'todo_comment':
            return 'comment'
        return None

    def allow_relation(self, obj1, obj2, **hints):
        return True

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        if model_name == 'comment':
            return db == 'comment'
        return None
Models in my "todo" app (which is the only app in the project):
from django.db import models
from django.contrib.auth import get_user_model

UserModel = get_user_model()


class Todo(models.Model):
    name = models.CharField(max_length=64)
    description = models.TextField()
    author = models.ForeignKey(UserModel, on_delete=models.CASCADE)
    deadline = models.DateTimeField()
    created_at = models.DateTimeField(auto_now_add=True)


class Comment(models.Model):
    todo = models.ForeignKey(Todo, on_delete=models.CASCADE)
    author = models.ForeignKey(UserModel, on_delete=models.CASCADE)
    text = models.CharField(max_length=256)
    created_at = models.DateTimeField(auto_now_add=True)


class ShareTodo(models.Model):
    todo = models.ForeignKey(Todo, on_delete=models.CASCADE)
    with_user = models.ForeignKey(UserModel, on_delete=models.CASCADE)
    comment_allowed = models.BooleanField(default=False)
When I remove the comment database and DATABASE_ROUTERS from settings.py, my app works normally. After adding them to settings.py, my app returns an error when I create a Comment object. The error says:
Exception inside application: insert or update on table "todo_comment" violates foreign key constraint "todo_comment_author_id_bb272a3e_fk_auth_user_id"
DETAIL: Key (author_id)=(1) is not present in table "auth_user". What am I doing wrong?
Note: I start two Postgres servers as separate Docker containers, and after running the containers I run python manage.py migrate and python manage.py migrate --database=comment to apply all migrations.
Sorry, but cross-database relations are not possible in Django. You can find the full explanation in the Django docs.
Furthermore, you cannot create cross-database relations in PostgreSQL itself, so trying to hack around it outside of Django won't work either. It may be possible with other database engines, but you would have to research that yourself.
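The limitation is easy to see at the engine level with the stdlib sqlite3 driver: two separate databases cannot enforce a foreign key between each other, so any "relation" has to be resolved in application code, one query per database. The table and column names below mirror the question's error message, but the demo itself is illustrative:

```python
import sqlite3

# Two independent databases: users live in db1, comments in db2.
db1 = sqlite3.connect(':memory:')
db2 = sqlite3.connect(':memory:')

db1.execute("CREATE TABLE auth_user (id INTEGER PRIMARY KEY, username TEXT)")
db1.execute("INSERT INTO auth_user (id, username) VALUES (1, 'alice')")

# No FK constraint is possible here: db2 cannot see auth_user at all.
db2.execute("CREATE TABLE todo_comment (id INTEGER PRIMARY KEY, author_id INTEGER, text TEXT)")
db2.execute("INSERT INTO todo_comment (id, author_id, text) VALUES (1, 1, 'hi')")

def comment_with_author(comment_id):
    """Resolve the cross-database 'relation' manually, one query per database."""
    author_id, text = db2.execute(
        "SELECT author_id, text FROM todo_comment WHERE id = ?", (comment_id,)).fetchone()
    (username,) = db1.execute(
        "SELECT username FROM auth_user WHERE id = ?", (author_id,)).fetchone()
    return {'text': text, 'author': username}

print(comment_with_author(1))  # {'text': 'hi', 'author': 'alice'}
```

The price is an extra round trip and no referential integrity: nothing stops author_id from pointing at a user that was deleted in the other database.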
I'm trying to create an app with a form that, when submitted, updates a table in my database based on the submitted info, but I'm not sure how to go about it. Currently, I have a simple model:
class Client(models.Model):
    company_name = models.CharField(max_length=200)
    launchpad_id = models.PositiveIntegerField()
    client_email = models.EmailField()
    content_id = models.CharField(max_length=200)

    def __str__(self):
        return self.company_name + ' | ' + self.content_id
and my databases configured like so:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'django_project',
        'USER': 'xxx',
        'PASSWORD': 'xxx',
        'HOST': 'xxxx',
        'PORT': 'xxx',
    },
    'info': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'reporting_database',
        'USER': 'xxx',
        'PASSWORD': 'xxx',
        'HOST': 'xxx',
        'PORT': 'xxx',
    }
}
What I want to happen is that when I submit my fields through the Client model, either in admin or in a template, it updates my client_info table in the reporting_database. I can't seem to figure out how to make that connection work, though. I would appreciate any direction you can give me. Thanks.
You need a function that creates or updates the model. You can follow the Django docs:
https://docs.djangoproject.com/en/2.1/ref/models/instances/
Use a model form:
from django import forms

from your_app.models import Client


class ClientUpdateForm(forms.ModelForm):
    class Meta:
        model = Client
        fields = ('company_name', 'launchpad_id', 'client_email', 'content_id')

    def __init__(self, *args, **kwargs):
        super(ClientUpdateForm, self).__init__(*args, **kwargs)
        self.fields['company_name'].required = True
        self.fields['launchpad_id'].required = True
        self.fields['client_email'].required = True
        self.fields['content_id'].required = True

    def clean(self):
        cleaned_data = super(ClientUpdateForm, self).clean()
        company_name = cleaned_data.get('company_name')
        launchpad_id = cleaned_data.get('launchpad_id')
        client_email = cleaned_data.get('client_email')
        content_id = cleaned_data.get('content_id')
        return cleaned_data
Then inherit from UpdateView:
from django.http import HttpResponseRedirect
from django.views.generic import UpdateView

from your_app.models import Client


class ClientUpdateView(UpdateView):  # renamed so it doesn't shadow the form class
    model = Client
    form_class = ClientUpdateForm
    template_name_suffix = '_update_form'

    def form_valid(self, form):
        # form_valid is only called with a valid form, so no is_valid() check is needed
        client = form.save(commit=False)
        client.save()
        return HttpResponseRedirect('/redirect/')
The template_name_suffix means you should call your template where you render the form client_update_form.html.
I think you need a database router for this. See https://docs.djangoproject.com/en/2.1/topics/db/multi-db/. Basically, you set up a router class that decides which database each model's table is read from and written to, then point DATABASE_ROUTERS in your Django settings at it. You will probably also need to run the migrations for this model against the target database, with something like ./manage.py migrate myapp 0005_migration_to_run --database=your_target_database_name. I would also recommend having a separate app per database inside the project, to keep things manageable.
I was able to achieve my desired results by adding the following code to my model:
def update_mysql(self):
    # `db` is a separately opened MySQLdb connection to the reporting database
    cursor = db.cursor()
    # Pass values as query parameters instead of interpolating them into the
    # SQL string, which would be vulnerable to SQL injection
    sql = ("UPDATE tb_reporting_info SET client_email = %s "
           "WHERE content_id = %s AND launchpad_id = %s")
    cursor.execute(sql, (self.client_email, self.content_id, self.launchpad_id))
    db.commit()
I set my form action to action="{% url 'contact:addClient' %}" and my view to:
def addClient(request):
    if request.method == 'POST':
        # Set POST data to variables
        company_name = request.POST['company_name']
        launchpad_id = request.POST['launchpad_id']
        content_id = request.POST['content_id']
        client_email = request.POST['client_email']
        client_added = True  # Tells the template to render a success message

        # Pass POST data to a Client object
        c = Client(company_name=company_name,
                   launchpad_id=launchpad_id,
                   content_id=content_id,
                   client_email=client_email)
        c.update_mysql()
It's bare bones, but it works perfectly for what I need.
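As an aside, interpolating values into the SQL string with % opens the query to SQL injection; database drivers accept the values as a separate argument instead. A quick sketch with the stdlib sqlite3 driver (which uses ? placeholders where MySQLdb uses %s; the table here is a stand-in for tb_reporting_info):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute("CREATE TABLE tb_reporting_info "
             "(launchpad_id INTEGER, content_id TEXT, client_email TEXT)")
conn.execute("INSERT INTO tb_reporting_info VALUES (1, 'abc', 'old@example.com')")

# A hostile value: with string interpolation this could break out of the query.
email = "x'; DROP TABLE tb_reporting_info; --"

# Parameterized update: the driver quotes the values safely.
conn.execute(
    "UPDATE tb_reporting_info SET client_email = ? "
    "WHERE content_id = ? AND launchpad_id = ?",
    (email, 'abc', 1),
)
conn.commit()

row = conn.execute("SELECT client_email FROM tb_reporting_info").fetchone()
# The hostile string was stored literally, and the table still exists.
print(row[0])
```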
I have implemented automatic DB routing in Django, using AWS Aurora with replication for the database. I have found a minor replication lag which hampers the flow. The issue occurs when, say, a queryset is read from 'slave' and then used to update a value: it fails with an error like 'read-only access for that table', meaning the update should have been routed to the master DB.
Here is my DB Settings for Multiple DB:
DATABASES = {
    'master': {
        'ENGINE': 'django.db.backends.mysql',
        'STORAGE_ENGINE': 'MyISAM / INNODB / ETC',
        'NAME': 'db',
        'USER': 'master',
        'PASSWORD': 'master',
        'HOST': 'localhost',
        'PORT': '3306',
    },
    'slave': {
        'ENGINE': 'django.db.backends.mysql',
        'STORAGE_ENGINE': 'MyISAM / INNODB / ETC',
        'NAME': 'db',
        'USER': 'name',
        'PASSWORD': 'pass',
        'HOST': 'localhost',
        'PORT': '3306',
    },
    'slave2': {
        'ENGINE': 'django.db.backends.mysql',
        'STORAGE_ENGINE': 'MyISAM / INNODB / ETC',
        'NAME': 'db',
        'USER': 'name',
        'PASSWORD': 'pass',
        'HOST': 'localhost',
        'PORT': '3306',
    }
}

DATABASE_ROUTERS = ['path.to.AuthRouter']
DATABASE_ROUTERS = ['path.to.AuthRouter']
What is the best way to handle routing across multiple databases automatically in Django?
"""
DB Router core class which auto selects required database configuration
"""
class AuthRouter:
def db_for_read(self, model, **hints):
"""
Reads go to a replica.
"""
print 'db_for_read'
print model
return 'slave'
def db_for_write(self, model, **hints):
"""
Writes always go to master ie default.
"""
print 'db_for_write'
print model
return 'master'
def allow_relation(self, obj1, obj2, **hints):
"""
Relations between objects are allowed if both objects are
in the default/replica pool.
"""
db_list = ('master', 'slave')
if obj1._state.db in db_list and obj2._state.db in db_list:
return True
return None
def allow_migrate(self, db, app_label, model_name=None, **hints):
"""
All non-auth models end up in this pool.
"""
return True
You can split the read operations and isolate the write operations to the master DB like this:
AuthRouter should always use the master DB, since that is where the freshest user information lives.
class AuthRouter:
    def db_for_read(self, model, **hints):
        """
        Auth reads go to master, so user data is never stale.
        """
        return 'master'

    def db_for_write(self, model, **hints):
        """
        Writes always go to master.
        """
        return 'master'

    def allow_relation(self, obj1, obj2, **hints):
        """
        Relations between objects are allowed if both objects are
        in the master/replica pool.
        """
        db_list = ('master', 'slave')
        if obj1._state.db in db_list and obj2._state.db in db_list:
            return True
        return None

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        """
        All models may migrate on any database.
        """
        return True
class PrimaryReplicaRouter(object):
    """
    A router to control all read/write database operations.
    """
    def db_for_read(self, model, **hints):
        """
        Reads go to a randomly-chosen database.
        """
        return select_rand_db()

    def db_for_write(self, model, **hints):
        """
        Writes always go to primary.
        """
        return 'master'

    def allow_relation(self, obj1, obj2, **hints):
        """
        Relations between objects are allowed if both objects are
        in the primary/replica pool.
        """
        return True

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        """
        All models may migrate on any database.
        """
        return True


def select_rand_db():
    """
    Returns a random db name ('master' or 'slave').
    """
    from numpy.random import choice
    return choice(['master', 'slave'], p=[0.5, 0.5])
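A common way to mask replication lag like this is to "pin" reads to the primary once the current thread has written, so a queryset read immediately after a write is served by master rather than a lagging replica. A minimal sketch with illustrative names (in a real project a middleware would reset the flag at the start of each request):

```python
import random
import threading

_local = threading.local()


class PinningRouter(object):
    """Send reads to a replica, unless this thread has recently written."""

    def db_for_read(self, model, **hints):
        if getattr(_local, 'pinned_to_master', False):
            return 'master'
        return random.choice(['slave', 'slave2'])

    def db_for_write(self, model, **hints):
        _local.pinned_to_master = True  # subsequent reads see our own writes
        return 'master'


def start_request():
    """Reset the pin; in a real project a middleware would call this per request."""
    _local.pinned_to_master = False
```

This trades some read scalability (pinned requests hit master) for read-after-write consistency within a request.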
I have two databases which I've been running on a single machine. My question is: how can I add more database servers, and how can I use sharding in Django so that data from both databases is split by user? That is, user 1's data (from db1 and db2) lives on the first server, while user 2's data lives on another.
You need to define a router in settings:
DATABASE_ROUTERS = ['routers.routers.ModelDatabaseRouter']
In a routers package, create a file routers.py:
class ModelDatabaseRouter(object):
    """Allows each model to set its own destiny"""

    def db_for_read(self, model, **hints):
        # Specify target database with field in_db in model's Meta class
        if hasattr(model._meta, 'in_db'):
            return model._meta.in_db
        return None

    def db_for_write(self, model, **hints):
        # Specify target database with field in_db in model's Meta class
        if hasattr(model._meta, 'in_db'):
            return model._meta.in_db
        return None

    def allow_syncdb(self, db, model):
        # Specify target database with field in_db in model's Meta class
        if hasattr(model._meta, 'in_db'):
            return model._meta.in_db == db
        # Models that don't specify a database can only go to 'default'
        return db == 'default'
(from https://djangosnippets.org/snippets/2687/)
Now you can define the DB in your models like this:
class YourClass(models.Model):
    name = models.CharField(primary_key=True, max_length=50)
    creation_time = models.DateTimeField()

    class Meta:
        in_db = 'api'  # <-- THIS
Note that this snippet targets older Django: allow_syncdb predates allow_migrate (Django 1.7), and since Django validates Meta attributes, a custom option like in_db generally needs to be registered in django.db.models.options.DEFAULT_NAMES first.
The db must be defined in settings:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'XXXXX',
        'USER': 'XXXXX',
        'PASSWORD': 'XXXXXX',
        'HOST': '127.0.0.1',
        'PORT': '3306',
    },
    'api': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'xxxx',
        'USER': 'xxxx',
        'PASSWORD': 'xxxxxx',
        'HOST': '127.0.0.1',
        'PORT': '3306',
    }
}
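Since routers are plain Python classes, the routing logic can be sanity-checked without a full Django project by calling a condensed version of the router above with stub models; the stub names here are made up:

```python
class ModelDatabaseRouter(object):
    """Condensed version of the snippet's router: route by Meta.in_db."""

    def db_for_read(self, model, **hints):
        return getattr(model._meta, 'in_db', None)

    def db_for_write(self, model, **hints):
        return getattr(model._meta, 'in_db', None)


# Stubs standing in for Django model classes and their Meta options.
class _ApiMeta:
    in_db = 'api'

class FakeApiModel:
    _meta = _ApiMeta()

class _PlainMeta:
    pass

class FakePlainModel:
    _meta = _PlainMeta()


router = ModelDatabaseRouter()
print(router.db_for_read(FakeApiModel))    # 'api'
print(router.db_for_read(FakePlainModel))  # None -> Django falls back to 'default'
```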
I have a hard time creating data migrations. I use two databases for my apps. I configured the databases in settings.py and also created a router as in the Django docs.
# settings.py
DB_HOST = 'localhost'

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'helios',
        'HOST': DB_HOST,
        'OPTIONS': {
            'read_default_file': join(dirname(__file__), 'default.cnf'),
        },
    },
    'other': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'gala_pol',
        'HOST': DB_HOST,
        'OPTIONS': {
            'read_default_file': join(dirname(__file__), 'other.cnf'),
        },
    },
}

DATABASE_APPS_MAPPING = {
    'contenttypes': 'default',
    'auth': 'default',
    'admin': 'default',
    'sessions': 'default',
    'messages': 'default',
    'staticfiles': 'default',
    'woodsmen': 'default',
    'helios': 'default',
    'hush': 'default',
    'hunt': 'other',
    'meat': 'other',
    'beast': 'other',
}
# routers.py
class DatabaseAppsRouter(object):
    def db_for_read(self, model, **hints):
        if model._meta.app_label in settings.DATABASE_APPS_MAPPING:
            return settings.DATABASE_APPS_MAPPING[model._meta.app_label]
        return None

    def db_for_write(self, model, **hints):
        if model._meta.app_label in settings.DATABASE_APPS_MAPPING:
            return settings.DATABASE_APPS_MAPPING[model._meta.app_label]
        return None

    def allow_relation(self, obj1, obj2, **hints):
        db1 = settings.DATABASE_APPS_MAPPING.get(obj1._meta.app_label)
        db2 = settings.DATABASE_APPS_MAPPING.get(obj2._meta.app_label)
        if db1 and db2:
            return db1 == db2
        return None

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        if db in settings.DATABASE_APPS_MAPPING.values():
            return settings.DATABASE_APPS_MAPPING.get(app_label) == db
        elif app_label in settings.DATABASE_APPS_MAPPING:
            return False
        return None
Here is the model and migrations of one of those apps:
# hunt/models.py
class Dish(models.Model):
    """
    Investigation case
    """
    display_name = models.CharField(max_length=64, unique=True)
    department = models.ForeignKey(Kitchen, null=True)
    case_type = models.PositiveSmallIntegerField(choices=CASE_TYPE_CHOICES, default=DEF_CASE_TYPE)
    created_at = models.DateTimeField(blank=True, null=True)
    comment = models.CharField(max_length=256, blank=True, null=True)

    class Meta:
        verbose_name = 'case'
        app_label = 'hunt'

    def __unicode__(self):
        return (u'%s (%s)' % (self.display_name, self.created_at)).strip()


# hunt/migrations/0001_initial.py
class Migration(migrations.Migration):
    app_label = 'hunt'

    dependencies = []

    operations = [
        migrations.CreateModel(
            name='Dish',
            fields=[
                ('id', models.AutoField(verbose_name='ID', auto_created=True, primary_key=True, serialize=False)),
                ('display_name', models.CharField(max_length=64, unique=True)),
                ('case_type', models.PositiveSmallIntegerField(default=0, choices=[(0, 'Unknown'), (1, 'General'), (2, 'Terror'), (3, 'Narco'), (4, 'Fraud'), (5, 'Slavery'), (6, 'Traffic'), (7, 'RICO'), (8, 'War'), (9, 'Cyber'), (20, 'Other')])),
                ('created_at', models.DateTimeField(null=True, blank=True)),
                ('comment', models.CharField(max_length=256, null=True, blank=True)),
            ],
            options={
                'verbose_name': 'case',
            },
        ),
    ]


# hunt/migrations/0002_add_hunts.py
def create_initial_hunts(apps, schema_editor):
    if settings.DEBUG:
        print('\nContent added')


class Migration(migrations.Migration):
    dependencies = [
        ('hunt', '0001_initial'),
    ]

    operations = [
        migrations.RunPython(create_initial_hunts, hints={'schema_editor': 'other'}),
    ]
The problem is:
When I run the migrate command, only the applications connected to the default database are migrated; the migrations in the rest of the apps never run. If I launch migrate for such an app with the --database option, it works fine.
How can I specify the database per migration? Isn't the router supposed to manage exactly this? Or have I missed something else?
You have to run migrate once for each database, specifying the target with --database. Each time it will consult your router to see which migrations to actually perform on that database.
I'm guessing it was designed this way to favor explicitness over implicitness. For example, your workflow might require you to migrate the different databases at different times.
Note, though, that you won't be able to tell from the output which migrations were actually performed, since:
If allow_migrate() returns False, any migration operations for the model_name will be silently skipped when running migrate on the db.
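In practice that means one migrate invocation per database alias; for the two-database setup from the earlier question this would be (aliases taken from that settings.py):

```shell
python manage.py migrate                   # applies migrations routed to 'default'
python manage.py migrate --database=other  # applies migrations routed to 'other'
```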
Using these helpers you can run Python/SQL migrations on a specific database:
[Helpers]
from django.db.migrations import RunPython, RunSQL


def run_python_specific_db_migration(migration_func, use_db):
    """Calls the RunPython command only for a specific database."""
    return RunPython(migration_func, hints={'use_db': use_db})


def run_sql_specific_db_migration(sql_commands, use_db):
    """Runs one or a list of SQL commands only on a specific database."""
    return RunSQL(sql_commands, hints={'use_db': use_db})


# ** Your specific db helpers for your DB_KEY **
def run_sql_your_db_migration(sql_commands):
    return run_sql_specific_db_migration(sql_commands, use_db=DB_KEY)


def run_python_your_db_migration(migration_func):
    return run_python_specific_db_migration(migration_func, use_db=DB_KEY)
[Usage]
def data_migration(apps, schema_editor):
    ...  # your data migration logic; consider wrapping it with @atomic


class Migration(migrations.Migration):
    operations = [run_python_your_db_migration(data_migration)]
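Note that RunPython/RunSQL hints only take effect if a router inspects them in allow_migrate; a sketch of that side, using the same 'use_db' hint key as the helpers above (the router class itself is illustrative):

```python
class UseDbHintRouter(object):
    """Skips hinted operations on every database except the one they ask for."""

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        if 'use_db' in hints:
            return db == hints['use_db']
        return None  # no opinion for unhinted operations


router = UseDbHintRouter()
print(router.allow_migrate('other', 'hunt', use_db='other'))    # True
print(router.allow_migrate('default', 'hunt', use_db='other'))  # False
```

With this in DATABASE_ROUTERS, a hinted RunPython runs only when migrate is invoked with the matching --database alias.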