I have set up a database router to direct different apps and different models to different databases using the db_for_read and db_for_write router methods.
That works very well, except that ./manage.py syncdb doesn't respect those router settings.
When I syncdb my models, all of them are created in the default database.
The database router only provides an allow_syncdb method, but no sync_to method. Is there a way to tell the syncdb command where to create the new tables?
Note: I can't use the --database option, because sometimes some of an app's models go to a different database than the rest of the app.
When you write your router, make sure you've written the allow_syncdb() method. It takes both a database alias and a model. When you run manage.py syncdb you're essentially passing --database=default. If you don't want your models to sync to the default database, your allow_syncdb() method should return False when db == 'default' and model._meta.app_label == 'myapp'.
You'll need to run syncdb with the --database=your_other_db option to get myapp into that db. In that case, make sure allow_syncdb() returns True only when db == 'your_other_db' and model._meta.app_label == 'myapp'.
Does that make sense? Basically you have to run the manage.py syncdb command twice, once for each database. You cannot run it only once and have it update both databases.
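As a minimal sketch of just that rule (with 'myapp' and 'your_other_db' as placeholders for your app label and database alias, added to the router you already have):

```python
# Sketch only -- replace 'myapp' and 'your_other_db' with your own names.

class MyAppSyncRouter(object):
    """Answers, per database, whether a model's table may be created there."""

    def allow_syncdb(self, db, model):
        if model._meta.app_label == 'myapp':
            return db == 'your_other_db'   # myapp's tables only in the other db
        return db == 'default'             # everything else only in default
```

With that in place, ./manage.py syncdb populates default and ./manage.py syncdb --database=your_other_db populates the other database; neither run creates tables its database is not allowed to have.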
Related
Let's say that we have a Django app that works against a legacy database.
If someone makes changes to some database tables from a db client such as DBeaver, for example, and not through Django models, is there a way to identify those changes?
You can run, in a terminal inside your Django project directory: python manage.py inspectdb > models.py
This will generate models for your tables.
By default, inspectdb creates unmanaged models. That is, managed = False in the model’s Meta class tells Django not to manage each table’s creation, modification, and deletion.
If you do want to allow Django to manage the table’s lifecycle, you’ll need to change the managed option above to True (or remove it because True is its default value).
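For illustration, the generated models.py for a legacy table typically looks something like this (the table and field names below are invented):

```python
# Typical shape of `python manage.py inspectdb` output -- names here are hypothetical.
from django.db import models


class LegacyCustomer(models.Model):
    name = models.CharField(max_length=100, blank=True, null=True)
    created_at = models.DateTimeField(blank=True, null=True)

    class Meta:
        managed = False              # Django will not create, alter or drop this table
        db_table = 'legacy_customer'
```

Setting managed = True (or dropping the line) hands the table's lifecycle back to Django, after which makemigrations/migrate will start tracking it.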
Is there any way I can use my default local database for testing in Django 1.9? I also don't want to run any migrations, and I want to test it locally.
The reason I want to do it this way is that among my migrations there is a data migration that refers to a specific entry of a model using .get(). When the tests run and create a test database, that migration fails because there are no entries in the test table.
I don't know how to resolve this issue. The best way I could think of is to use my default database for testing.
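To make the failure concrete, the data migration described presumably looks something like the sketch below (the app, model and field names are invented for the example); on a freshly created, empty test database the .get() call raises DoesNotExist and the whole test run aborts:

```python
# 0002_backfill_owner.py -- illustrative data migration; all names are hypothetical.
from django.db import migrations


def backfill_owner(apps, schema_editor):
    User = apps.get_model('accounts', 'User')
    Project = apps.get_model('projects', 'Project')

    # Works on the real database, but on an empty test database this
    # raises User.DoesNotExist and the migration (and test run) fails.
    admin = User.objects.get(username='admin')
    Project.objects.filter(owner__isnull=True).update(owner=admin)


class Migration(migrations.Migration):

    dependencies = [
        ('projects', '0001_initial'),
    ]

    operations = [
        migrations.RunPython(backfill_owner, migrations.RunPython.noop),
    ]
```

If changing the migration is an option, a common defensive pattern is to guard the lookup (for example User.objects.filter(username='admin').first() with an early return) so it becomes a no-op on an empty database.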
I have two databases configured: one for all tables except one, table, which lives in the second database, database2. I store and retrieve data perfectly without using any router, just the 'using' argument on save and get. I upgraded our app to Django 1.8 a while ago and it seemed to be working; table in database2 doesn't change very often, but it has worked fine since the upgrade.
But now I need to make a change to table in database2.
So I try manage.py migrate table --database database2. The migration seems to take, but I get an error at the end.
RuntimeError: Error creating new content types. Please make sure contenttypes is migrated before trying to migrate apps individually.
I've been trying to add contenttypes and auth as fake migrations in database2, but I still get the error. Will adding a router make it go away? Are there any other ways?
Thanks!
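For what it's worth, a router of the kind hinted at would typically answer Django 1.8's allow_migrate() hook for every app, so that contenttypes, auth and the other stock apps are only migrated on default, while the app owning table is only migrated on database2. A minimal sketch, with 'myapp' standing in for that app's label:

```python
# routers.py -- a sketch; 'myapp' is a placeholder for the app that owns `table`.

class Database2Router(object):
    """Send myapp to database2; keep contenttypes, auth, etc. on default."""

    def db_for_read(self, model, **hints):
        return 'database2' if model._meta.app_label == 'myapp' else None

    def db_for_write(self, model, **hints):
        return 'database2' if model._meta.app_label == 'myapp' else None

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        if app_label == 'myapp':
            return db == 'database2'
        # Everything else (including contenttypes and auth) only migrates on default.
        return db == 'default'
```

With the class listed in DATABASE_ROUTERS, manage.py migrate table --database database2 should skip the content-type bookkeeping on database2, which is the step that raises the RuntimeError above.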
I run migrations on the server in this way:
1. Upload the models.py file to the server with a new field sfield in model Mobject
2. Run the makemigrations command via manage.py
3. Run the migrate command via manage.py
But some requests between the end of the first step and the end of the third step fail with django.core.exceptions.FieldDoesNotExist: Mobject has no field named 'sfield' (which is obvious: the field is already on the class, so the Django ORM tries to fetch it, even though it isn't in the DB yet).
Is it possible to make all 3 steps "atomic"? Or globally ignore these exceptions, because for now I don't need sfield; I only want to perform the migration without exceptions. Or maybe I can temporarily mark the new field in some way so Django doesn't fetch it from the DB, while it stays visible to makemigrations/migrate?
When Django runs a plain query (the equivalent of select * from yourtable), it tries to fetch all fields defined in the model.
You can use only() in your queries to select specific fields, so no exception is raised while migrating new fields that are not yet used in the ORM.
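For example (Mobject and sfield are the names from the question above; the other names are made up), a query restricted with only() keeps the not-yet-migrated column out of the generated SQL:

```python
# Illustrative sketch -- assumes a hypothetical app `myapp` owning Mobject.
from myapp.models import Mobject

# A plain queryset selects every field declared on the model, including
# sfield, and fails while that column does not exist in the database yet:
#     Mobject.objects.all()

# Listing only the fields this code actually needs leaves sfield out:
objs = Mobject.objects.only('id', 'name').filter(name__startswith='a')

for obj in objs:
    print(obj.name)   # loaded eagerly
    # Touching obj.sfield here would issue an extra query for the deferred
    # field and fail again until the migration has run.
```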
By the way, you should create the migration files locally, test the new field on your local machine, and then commit the migration files to the server. On the server you then only need to run migrate right after deployment, which shortens the window in which these exceptions can happen.
From the Django docs:
The reason that there are separate commands to make and apply migrations is because you’ll commit migrations to your version control system and ship them with your app; they not only make your development easier, they’re also usable by other developers and in production.
I'm not sure how it works, but I need two different models/apps in separate databases. If I run syncdb --database db1, all installed apps end up in db1. So the question is: how do I get just one app into another db?
Thanks
You need to write a database router which filters models based on application and database in its allow_syncdb method. You can start from the AuthRouter example in the documentation.
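As a starting point, a router adapted from the docs' AuthRouter pattern might look like the sketch below, where 'app1' and 'db1' are placeholders for your app label and database alias; it also needs to be listed in DATABASE_ROUTERS in settings.py:

```python
# routers.py -- modelled on the documentation's AuthRouter; names are placeholders.

class App1Router(object):
    """Keep everything belonging to app1 in the 'db1' database."""

    def db_for_read(self, model, **hints):
        if model._meta.app_label == 'app1':
            return 'db1'
        return None

    def db_for_write(self, model, **hints):
        if model._meta.app_label == 'app1':
            return 'db1'
        return None

    def allow_relation(self, obj1, obj2, **hints):
        if obj1._meta.app_label == 'app1' or obj2._meta.app_label == 'app1':
            return True
        return None

    def allow_syncdb(self, db, model):
        if db == 'db1':
            return model._meta.app_label == 'app1'
        elif model._meta.app_label == 'app1':
            return False
        return None
```

With that in place, syncdb --database db1 creates only app1's tables, and a plain syncdb leaves them out of the default database.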