I am trying to create a management command which will create two migrations, one for adding models, fields etc, and the other for deleting, so that I can apply one of the migrations before deploying the app and the other after deployment has been on all the servers.
Is there any simple way of achieving this without human intervention?
You could import the migration file, iterate over Migration.operations, and split it into two files according to your requirements. You will also have to check whether there are any dependencies on that migration and adjust them.
This would still be a hack and most likely doesn't cover all the edge cases. A proper solution would probably involve rewriting large chunks of django.db.migrations.
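To illustrate the idea, here is a minimal sketch (not a complete solution) that loads a migration module and sorts its operations into additive and destructive buckets. The module path in the example call is a placeholder, and only a handful of operation types are classified:

# A minimal sketch of the splitting idea described above. The migration module
# path in the example call is a placeholder, and real migrations contain many
# more operation types (AlterField, RenameField, RunPython, ...).
from importlib import import_module

from django.db import migrations

ADDITIVE = (migrations.CreateModel, migrations.AddField, migrations.AddIndex)
DESTRUCTIVE = (migrations.DeleteModel, migrations.RemoveField, migrations.RemoveIndex)


def split_operations(module_path):
    """Return (additive, destructive, other) operation lists for one migration."""
    migration_class = import_module(module_path).Migration
    additive, destructive, other = [], [], []
    for op in migration_class.operations:
        if isinstance(op, ADDITIVE):
            additive.append(op)
        elif isinstance(op, DESTRUCTIVE):
            destructive.append(op)
        else:
            other.append(op)
    return additive, destructive, other


# Example (placeholder path): split_operations("myapp.migrations.0002_auto")

Writing the two halves back out as separate migration files, and fixing their dependencies so they apply in the right order, is the part this sketch leaves out; django.db.migrations.writer.MigrationWriter can help with the serialisation.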
I'm just beginning my journey with the Django framework, and I read that the Django developers have made migrations mandatory starting from version 2.0. I might be old school, but I like to keep my database separate from my code, and I have always kept it separate from my models. I also don't think migrations will scale with the size of an engineering team.
So my question is two-fold.
Can you use Django 2.0 without migrations, given that I don't think they will scale well or fit a CI/CD pipeline?
If we can't avoid migrations, how can we integrate them into a robust CI/CD pipeline where a model can be changed by different developers from different teams?
Yes, you can. You can create your tables manually and set Django not to manage them.
After your Django project is configured, just run python manage.py inspectdb > models.py in your terminal, and Django will generate models from the tables in the configured database. This is particularly useful if your project uses an already existing or legacy database.
Then, you can tell Django not to manage your tables in the Meta options of the model:
class MyModel(models.Model):
    # your fields here

    class Meta:
        managed = False
See the docs here
But, unless you have a very good way to keep track of your table changes, I must say this is a mistake. Django migrations help you keep track of your model changes along the way. They are really helpful if you need to roll back or understand your database history.
Migrations are not mandatory; it's not clear what you think has changed in 2.0 to make them so.
Migrations are intended for large teams. If you avoid them, you'll make things much much harder for yourself and your fellow team members.
I can think of three reasons why Django splits makemigrations and migrate into two separate steps:
giving users flexibility over when to commit model changes
debugging modularity
perhaps resource consumption in larger databases
However, migrate does always seem to follow shortly after makemigrations in tutorials and YouTube videos.
So is there a philosophy behind this split that I'm missing?
Of course there are some reasons.
First of all, 'makemigrations' doesn't touch the real DB; it just records how your models (the DB schema) have changed, so you can see what's going on before you run 'migrate'.
This makes Django safer.
It also gives you a chance to supply defaults for new fields and review other DB changes before they are applied.
Another reason is 'revert'.
If you want to roll back the DB schema, you can just tell Django to roll back to a specific migration file, for example python manage.py migrate <app_label> <migration_name>.
Another reason is the 'reusable app' principle.
If you create an app with Django, it should be reusable without any DB interaction. That means if you deploy your app (or the whole project) to another project or server, it only needs the 'migrations' files, not the real DB.
Our product has a RESTful API and a server-rendered app (the CMS). Both share the database, and both are written in Django.
The fields and models needed in both are not mutually exclusive: some are particular to the API, some to the CMS, and some are common.
My question is: if I run migrations on one of the repos, will they try to drop the fields that aren't present in the models of that particular repo but are needed by the other? Will running the migrations individually in both repos keep the database up to date and not pose a problem?
The only other valid option IMHO (besides merging the projects) is turning off automatic Django migrations for the common models (Meta.managed = False) and taking table creation and versioning into your own hands. You can still write migration scripts using django.db.migrations, but the makemigrations command won't do anything for these tables.
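For instance, a hand-written migration for such an unmanaged table might look like the sketch below; the app label "common", the table name "shared_article" and the column are placeholders for illustration, not part of the original answer:

# Hand-written migration for a table Django does not manage (Meta.managed = False).
# makemigrations will never generate this file; you create and number it yourself.
# "common", "shared_article" and the column are placeholder names.
from django.db import migrations


class Migration(migrations.Migration):

    dependencies = [
        ("common", "0001_initial"),
    ]

    operations = [
        migrations.RunSQL(
            sql="ALTER TABLE shared_article ADD COLUMN subtitle varchar(200)",
            reverse_sql="ALTER TABLE shared_article DROP COLUMN subtitle",
        ),
    ]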
This was solved by using a schema migration tool external to Django's own. We use yoyo migrations to migrate our schema now.
Will running the migrations individually in both repos keep the database up to date and not pose a problem?
Unfortunately, no. As you suspected, changes in one will attempt to override the other.
The easiest thing to do is merge the two projects into one so this problem goes away entirely.
If this isn't an option, can the code be organised in such a way that both projects share the same models.py files? You could do this by perhaps having the models.py files and migrations folders only exist in one project. The second project could have a symlink across to each models.py file it uses. The trick (and the difficult part) will be to make sure you never create migrations for the app which uses the symlinks.
I think the best thing to do would be to have one repo that contains all the fields. That project will be responsible for applying the migrations.
In the other projects, you'll need a database router with an allow_migrate method that returns False for your shared model classes.
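A rough sketch of such a router, assuming the shared models live in an app labelled "common" (the app label, module path and class name below are placeholders):

# myproject/db_routers.py -- placeholder module path
class NoMigrateSharedAppsRouter:
    """Refuse to migrate apps whose tables are owned by the other project."""

    SHARED_APPS = {"common"}  # placeholder app label

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        if app_label in self.SHARED_APPS:
            return False  # never run migrations for the shared models here
        return None  # no opinion about everything else


# settings.py
DATABASE_ROUTERS = ["myproject.db_routers.NoMigrateSharedAppsRouter"]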
Also, giving each project a different DB user with different DB permissions can prevent one project from altering the other's tables.
I have two Django projects that have different use cases. They are reached using different domains and are hosted on two different servers. Each Django project also has its own database.
Now, both projects have some models and some business logic in common. I don't want to duplicate the code and data, which would become chaotic going forward. I also want the models and code (business logic) to stay in sync when they are altered.
Can anyone guide me towards a pattern that can help me attain the required architecture: two separate projects with common models and business logic?
Thanks in advance.
I've done this before. You will have to move the shared models and business logic into a new Python package (better yet, a Django app that encapsulates these models) in a separate directory.
Add this directory to your Python path (the one that contains the package, not the package itself) and you should be able to use this code from within your projects.
The only downside to this is having to configure PYTHONPATH on your servers, or having to copy this package into your runtimes manually.
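As an illustration only (every name below is a placeholder), the layout and settings could look roughly like this:

# Directory layout (placeholder names):
#
#   shared/               <- the directory you add to the Python path
#       core/             <- the shared Django app with common models/logic
#           __init__.py
#           models.py
#           migrations/
#   project_api/
#       settings.py
#   project_cms/
#       settings.py

# project_api/settings.py (the same idea goes into project_cms/settings.py)
import sys
from pathlib import Path

SHARED_DIR = Path(__file__).resolve().parent.parent / "shared"
sys.path.insert(0, str(SHARED_DIR))  # alternative to exporting PYTHONPATH

INSTALLED_APPS = [
    # ... the usual django.contrib apps ...
    "core",  # the shared app
]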
I've got an app which has grown too big and I'd like to split it into two. It's not an easy task code-wise, but I'm more afraid of the database migrations I'll have to perform. Is there a proven way of doing this kind of thing?
Currently my thoughts are something like this:
Use South
Copy models to another app
Make a data migration to copy the relevant data from one app to the other (see the sketch after this list)
Make another migration that removes the now unused models from the old app
I'll have to make a script that runs those operations in the correct order
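Here is a rough sketch of the data-migration step, written with Django's built-in migration framework rather than South, and using placeholder app, model and field names ("bigapp", "newapp", "Article") that are not from the question:

# Data migration copying rows of a model being moved from "bigapp" to "newapp".
# All names are placeholders; adapt the fields and dependencies to your apps.
from django.db import migrations


def copy_articles(apps, schema_editor):
    OldArticle = apps.get_model("bigapp", "Article")
    NewArticle = apps.get_model("newapp", "Article")
    for old in OldArticle.objects.all():
        NewArticle.objects.create(title=old.title, body=old.body)


class Migration(migrations.Migration):

    dependencies = [
        ("bigapp", "0001_initial"),  # old app, with the source model
        ("newapp", "0001_initial"),  # new app, with the copied model definition
    ]

    operations = [
        migrations.RunPython(copy_articles, migrations.RunPython.noop),
    ]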
Has anybody done anything like this?
No need to do any database migrations if you don't want to. Just set the db_table Meta attribute on your models to refer to the old names, and everything will be perfectly happy.
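For example (the app, model and table names below are placeholders): if an Article model moves from oldapp to newapp but should keep using the table the old app created, the new model can simply point at it:

# newapp/models.py -- placeholder names
from django.db import models


class Article(models.Model):
    title = models.CharField(max_length=200)

    class Meta:
        db_table = "oldapp_article"  # the table name generated for the old app's model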