How to manage version controlled Django migrations as a team?

We're a small team of developers working on a version-controlled Django 3 project. We are worried about utilizing migrations and the possibility of overwriting one another's migration files.
Options we've considered:
Using the --name / -n option of makemigrations to have a naming convention, but that seems cumbersome
The Django documentation mentions tools for version control, but I don't see much detail on the specifics.
How do other teams handle this?

Using version control (git, etc.) will solve your problem.
Just make sure the developers work on a branch rather than directly on master. This way, the migrations will be visible, and if there's a conflict, git will let you know when you try to merge to your master branch. It's even more obvious if you're using GitHub as you can see the conflicts in the UI for the pull request.
The only problem you will have here is if two migrations are generated for the same app in two PRs. The one that gets merged into master last will cause a problem, as there will be multiple leaf nodes. This can be solved easily with ./manage.py makemigrations --merge, and ideally you should run ./manage.py makemigrations --check to see if there are problems before merging anything - in continuous integration, for example.
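As a minimal sketch of those two commands (the CI wiring itself is omitted and the project layout is assumed):
python manage.py makemigrations --check --dry-run  # exits non-zero if model changes are missing a migration
python manage.py makemigrations --merge  # creates a merge migration when two branches each added a leaf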

Related

django migrations and git branches while working on multiple features simultaneously

In Django, when working on multiple new features simultaneously, what is the best way to handle database migrations with git?
(as in python manage.py makemigrations and python manage.py migrate, which are used to update the database after the models.py file has been changed)
I’m in middle of working on feature1, on its own branch.
Now I need to create feature2 and push it to production.
Should I:
fork new feature2 branch off of production, create feature2, merge back to prod, migrate.
fork new feature2 branch off of production, create feature2, migrate, merge back to prod.
Or some other way?
And when I go back to feature1, what do I do to ensure everything will be up to date?
If you use a command-line database migration tool in conjunction with git, you can take advantage of up and down SQL scripts.
Let's say you want to make some changes to your DB: through the migration tool you define a pair of scripts such as script1-up.sql and script1-down.sql, and you can then move the database to a specific version from the command line (after a git checkout).
For example, I use golang-migrate for my go apps (I think it can be used for any language when used by command line).
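As a rough sketch of that workflow with golang-migrate (the directory and database URL are just placeholders):
migrate create -ext sql -dir db/migrations -seq add_users_table  # writes paired .up.sql and .down.sql files
migrate -path db/migrations -database "postgres://localhost:5432/app?sslmode=disable" up  # apply pending migrations
migrate -path db/migrations -database "postgres://localhost:5432/app?sslmode=disable" down 1  # roll back one version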
Anyway, I would say this is an improper use of a migration tool, which is better suited to applying scripts that concern only schema changes, not ordinary INSERT or UPDATE SQL statements.
I would suggest working on a containerized version of your DB, so you can destroy and recreate anything on the fly.

Deleted Migration folder (Django 1.8) by accident what are my options?

I have accidentally deleted one of my migrations folders and have no backup for it.
What are my options?
The DB is Postgres. Right now everything is OK (I have copied over the migrations folder from my DEV server, which uses SQLite, instead), so I am just getting a red message on the server that not all migrations have been applied.
But the next time I run a migration I will be in trouble.
What is my way out?
Migrations are mainly for backward compatibility and for tracking/versioning of the changes to your models/database. If you do not really care about historical changes, etc., then you can just delete the migrations directory and do:
python manage.py makemigrations <app_name>
This creates the initial migrations (like starting from a clean slate), and from there you can track migration history going forward. More on this can be read in the Django migrations documentation.
Now, when you run the migrations, you can do
python manage.py migrate <app_name> --fake-initial
to fake the initial migration.
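If it helps, you can then confirm the result with showmigrations (substitute your own app name); the faked initial migration should be listed as applied:
python manage.py showmigrations <app_name>  # applied migrations are marked with [X]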
Now might be a good time to put your application under version control.
Use version control.
You are not the first developer to delete important files, but recovery often takes less than a second - thanks to version control systems (also called revision control systems). Please stop everything else and install and use one of Git, Mercurial or Subversion.
Don't use FTP
It's totally, I mean totally, insecure. Always use SFTP.
Don't use sqlite3 for local with other db for production
SQLite doesn't enforce strict type checking. PostgreSQL, on the other hand, is very particular about it. Additionally, SQLite only has a subset of the functionality found in PostgreSQL. Last but not least, different RDBMSs have different intricacies. If you use one locally and another in production, there is always a chance that your code will break when you deploy to live.
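As a tiny illustration of the type-checking difference, using Python's built-in sqlite3 module:
import sqlite3
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (n INTEGER)")
conn.execute("INSERT INTO t VALUES ('not a number')")  # SQLite stores this silently
print(conn.execute("SELECT n FROM t").fetchone())  # prints ('not a number',)
# PostgreSQL would reject the same INSERT with a datatype error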
Managing without the migration files
This is not a big loss as long as your database is in sync with your models.
If your database is not in sync with your models, you can use
./manage.py inspectdb
to recreate local models that represent the actual structure in the db. Then you do makemigrations and migrate (as explained by karthik) on the generated models.
Then replace that with your live models and do the step again.
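A rough sketch of that sequence, assuming an app called myapp (the app name and the --fake-initial step are assumptions based on the earlier answer):
./manage.py inspectdb > myapp/models.py  # regenerate models from the actual schema
./manage.py makemigrations myapp  # create an initial migration from those models
./manage.py migrate myapp --fake-initial  # record it as applied without touching existing tables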

Django migrations gives error when run separately in different machines

We are a team of developers working on a Django project, and we are facing issues with Django migrations. If one developer makes changes to a model and runs the makemigrations > migrate sequence, it generates some migration files in the migrations directory. Now when another developer pulls the code and runs the same sequence, it puts the code in a bad state. We've been clearing our migrations directory locally to get rid of the issue, and sometimes clearing all the data. Not sure what we're doing incorrectly. Please suggest the right way of using Django migrations.
Note - All of us use separate instances of DB in local machine.
makemigrations just creates files
By running makemigrations, you’re telling Django that you’ve made some changes to your models (in this case, you’ve made new ones) and that you’d like the changes to be stored as a migration.
Migrations are how Django stores changes to your models (and thus your database schema) - they’re just files on disk. You can read the migration for your new model if you like; it’s the file polls/migrations/0001_initial.py.
Under version control, after you push the migration file (for example, 0001_initial.py), other developers just pull the file and then run
python manage.py sqlmigrate your_app 0001 # to see what will happen
python manage.py migrate your_app 0001
More about this, from the Version control section of the Django documentation:
Because migrations are stored in version control, you’ll occasionally come across situations where you and another developer have both committed a migration to the same app at the same time, resulting in two migrations with the same number.
Don’t worry - the numbers are just there for developers’ reference, Django just cares that each migration has a different name. Migrations specify which other migrations they depend on - including earlier migrations in the same app - in the file, so it’s possible to detect when there’s two new migrations for the same app that aren’t ordered.
When this happens, Django will prompt you and give you some options. If it thinks it’s safe enough, it will offer to automatically linearize the two migrations for you. If not, you’ll have to go in and modify the migrations yourself - don’t worry, this isn’t difficult, and is explained more in Migration files below.
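For illustration, a second migration on top of 0001_initial might look roughly like this (the app, model and field names are made up); the dependencies list is what Django uses to order migrations and to spot two new migrations that share the same parent:
# polls/migrations/0002_question_votes.py (hypothetical)
from django.db import migrations, models

class Migration(migrations.Migration):
    dependencies = [
        ("polls", "0001_initial"),  # must be applied before this one
    ]
    operations = [
        migrations.AddField(
            model_name="question",
            name="votes",
            field=models.IntegerField(default=0),
        ),
    ]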

Stopping ignoring migration files in git for a Django Project

My team has had the Django migration files in .gitignore , and so hasn't been committing these files. Instead we've been making migrations on our production server. We discovered that this is not the recommended practice (upon encountering Should I be adding the Django migration files in the .gitignore file? ). Would it cause problems to - at this point in time - remove migrations from .gitignore, makemigrations on the development machine, commit, push, and apply the migrations on the production server? If so, how can we get around these problems?
This question is motivated largely because we need to apply a custom migration that we partially wrote ourselves. (As such there's actually an additional step between making migrations on the development machine and committing them, namely adding in our custom code to the migration file).
I think the only way is to copy over all the migrations from the production server to your repo and then commit them. This is most likely a manual process because your production server is the only place that tracks all migrations. You don't need to worry about how to migrate your production server, because it keeps the original copy. However, all of this should be done before any new migrations are added and applied.
After everything is fixed, you should create new migrations in your local dev environment, add them to git, push them to production, and then apply them. Remember to have your CI or something else check for duplicate migration files.
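A hedged sketch of that one-off copy step (the app name, server path and SSH access are all assumptions):
# first remove the migrations entry from .gitignore, then:
scp -r deploy@prod.example.com:/srv/project/myapp/migrations myapp/
git add .gitignore myapp/migrations
git commit -m "Track migrations in version control"
git push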

Are there any downsides to doing a schema migration in my post_compile script

I have a django app running on heroku. I would like to run my South migrations before any code that depends on them goes live. After a quick review of current recommended practices I have found two suggested migration procedures.
Recommendation 1
Commit and push all changes
Run heroku run python manage.py migrate <APP_NAME> for each app
This suffers from having a period in between steps 1 and 2 where my code is assuming the latest schema is in place, but the db hasn't yet been updated.
Recommendation 2
commit and push all database changes.
Migrate.
Push all code changes.
This solves the previous problem, but adds a lot more complexity to the deployment process, and some day I will mess this up.
Potential Solution?
It seems that I can avoid the problem in Recommendation 1 and keep my deployment to a single step by utilising a custom post_compile script
that calls python $MANAGE_FILE migrate <APP_NAME> for each of my apps (in dependency order).
I have not seen this recommended anywhere, so my question is twofold. Can you see any potential problem with this approach, and do you have a better method?
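For reference, a post_compile script along those lines might look roughly like this (the app names and their order are placeholders; $MANAGE_FILE is the variable mentioned above):
#!/usr/bin/env bash
# bin/post_compile - run South migrations right after the build finishes
set -e
python $MANAGE_FILE migrate app_one
python $MANAGE_FILE migrate app_two  # apps listed in dependency order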
If your application can afford some downtime, the easiest way seems to me to be:
Pause your app using $ heroku maintenance:on
Migrate all the apps at once with heroku run python manage.py migrate
Restart your app: $ heroku maintenance:off
Is that enough, or do you have more complex needs?