I need some advice on pushing Django updates, specifically the database updates, from my development server to my production server. I believe that updating the scripts, files, and such will be easy -- simply copy the new files from the dev server to the production server. However, the database updates are what have me unsure.
For Django, I have been using South during initial web app creation to change the database schema. If I were to have some downtime on the production server for updates, I could copy all the files over to the production server. These would include any changed models.py files, which describe the database tables. I could then run python manage.py schemamigration my_app --auto and then python manage.py migrate my_app to update the database based on the new files/models.py I've copied over.
Is this an OK solution or are there more appropriate ways to go about updating a database from development to production servers?
Your thoughts?
Thanks
Actually, python manage.py schemamigration my_app --auto will only create the migration based on the changes in models.py. To apply the migration to the database, you need to run python manage.py migrate my_app. Another option would be to create the migrations (by running schemamigration) on the development server, then copy the migration files over to the production server and apply them by running migrate.
Of course, having a source code repository would be way better than copying the files around. You could create migrations on your development server, commit them to the repository, on the production server pull the new files from repository, and finally apply migrations.
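A minimal sketch of that repository-based workflow, reusing the my_app name from your question (the commit message and remote setup are placeholders):

# on the development server
python manage.py schemamigration my_app --auto
python manage.py migrate my_app   # test the migration against the dev database
git add my_app/migrations/
git commit -m "Schema migration for my_app"
git push

# on the production server
git pull
python manage.py migrate my_app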
I have a django REST project and a PostgreSQL database deployed to DigitalOcean. When I develop locally, I have a separate dockerized REST server and a separate PostgreSQL database to test backend features without touching production data.
My question arises when I'm adding/modifying model fields that require me to make migrations using the python manage.py makemigrations and python manage.py migrate commands. Here is my current situation so far:
What I was supposed to do
IN LOCAL ENV, to create the migration files,
python manage.py makemigrations
python manage.py migrate
Now commit these newly created files, something like below.
git add app/migrations/...
git commit -m 'add migration files' app/migrations/...
IN PRODUCTION ENV, run only the below command.
python manage.py migrate
What I did so far
IN LOCAL ENV, created the migration files,
python manage.py makemigrations
python manage.py migrate
I committed & pushed the changes to production WITHOUT the created migration file
IN PRODUCTION ENV, ran BOTH commands.
python manage.py makemigrations
python manage.py migrate
The production server successfully added the isActive field to the database and is working fine, but I still have a 0011_user_isActive.py migration file in my local changes that hasn't been staged/committed/pushed to the GitHub repo.
And because I ran the makemigrations command in the production env, it probably created the same migration file that I haven't pushed from the local env.
My questions are:
What happens if I push the local migration file to production? Wouldn't it create a conflict when I run the migration command on the DigitalOcean console in the future?
How should I fix this situation?
I am just scared I'm going to corrupt my production database or create conflicts in it, as I'm very inexperienced with databases and have too much at risk at the moment. Would appreciate any tips on best practices when dealing with such situations!
As the docs say:
The migration files for each app live in a “migrations” directory inside of that app, and are designed to be committed to, and distributed as part of, its codebase. You should be making them once on your development machine and then running the same migrations on your colleagues’ machines, your staging machines, and eventually your production machines.
So the best practice is to push your migration files and keep your local and production migration files in sync.
And if you get a conflict when pushing and pulling migration files, the makemigrations --merge command is there to solve exactly that.
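For example, if you and the production server each ended up with a migration numbered 0011, you could resolve it locally before pushing (a generic sketch, not specific to your project):

python manage.py makemigrations --merge
python manage.py migrate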
The docs also say:
Because migrations are stored in version control, you’ll occasionally come across situations where you and another developer have both committed a migration to the same app at the same time, resulting in two migrations with the same number.
Don’t worry - the numbers are just there for developers’ reference, Django just cares that each migration has a different name. Migrations specify which other migrations they depend on - including earlier migrations in the same app - in the file, so it’s possible to detect when there’s two new migrations for the same app that aren’t ordered.
When this happens, Django will prompt you and give you some options. If it thinks it’s safe enough, it will offer to automatically linearize the two migrations for you. If not, you’ll have to go in and modify the migrations yourself - don’t worry, this isn’t difficult, and is explained more in Migration files below.
Also be aware that if you need to update existing data in production, you can use RunPython in a migration file. Read about it here.
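A sketch of what such a data migration could look like for the isActive field from your question (the "app" label and field name are taken from the question, not verified against your project):

from django.db import migrations

def activate_existing_users(apps, schema_editor):
    # Use the historical model via apps.get_model(), never a direct import
    User = apps.get_model("app", "User")
    User.objects.update(isActive=True)

class Migration(migrations.Migration):

    dependencies = [
        ("app", "0011_user_isActive"),
    ]

    operations = [
        # The second argument is the reverse operation; noop keeps it reversible
        migrations.RunPython(activate_existing_users, migrations.RunPython.noop),
    ]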
I have a Django application and I'm using MariaDB as the database. Both have been deployed to a namespace on Kubernetes. Now I want to add an additional field, so I made changes to the models.py file in the Django app. These changes were made locally: I pulled the code from Git and just made the changes locally. Normally, to apply the changes, I would run manage.py makemigrations and manage.py migrate, and all the changes would be reflected in the DB if the DB were present locally.
So now my questions are
How can I apply the changes to the MariaDB instance that is on Kubernetes?
Will running manage.py makemigrations and manage.py migrate locally and redeploying the Django app to Kubernetes solve this issue?
TLDR;
If you don't require multiple replicas, then the simplest way to do it would be to run your migrations when the container starts.
If you require multiple replicas, then you'll have to get creative with jobs and init containers. There's a good article with more info on it here: https://andrewlock.net/deploying-asp-net-core-applications-to-kubernetes-part-7-running-database-migrations/
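For the single-replica case, a minimal container entrypoint could look like the following sketch (the project name and app server are placeholders, not taken from your setup):

#!/bin/sh
# Apply any pending migrations, then hand off to the app server
python manage.py migrate --noinput
exec gunicorn myproject.wsgi:application --bind 0.0.0.0:8000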
I have a learning project deployed on Heroku. It had a Postgres database provisioned. I introduced some major changes in the models of my Django project, destroyed the old database, and provisioned a new one, which is totally empty, but it is not behaving like an empty database.
When I run the command heroku run python manage.py makemigrations, I get the error message
You are trying to add a non-nullable field....
Why am I getting this message when I have destroyed the old database?
First of all, you should never run manage.py makemigrations on Heroku.
By the time your code gets there no model changes should exist to generate new migrations. Run makemigrations locally to create migration files. Run migrate locally and on Heroku to apply migrations to your database.
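A minimal sketch of that split, assuming a standard git-based Heroku deploy (the app directory and branch name are placeholders):

# locally
python manage.py makemigrations
python manage.py migrate
git add myapp/migrations/
git commit -m "Add migrations"
git push heroku main

# on Heroku
heroku run python manage.py migrate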
Now that that's out of the way, this is likely caused by existing migration files, not anything in your database. If you truly want to start over, you can delete the files from each of your apps' migrations/ directories.
Finally, there is no need to destroy and reprovision your database to reset it. Instead you can use heroku pg:reset:
The PostgreSQL user your database is assigned doesn’t have permission to create or drop databases. To drop and recreate your database use pg:reset.
Use this command:
heroku pg:reset
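The CLI will ask you to confirm the reset; you can also pass the database and app explicitly, for example (the app name is a placeholder):

heroku pg:reset DATABASE_URL --app my-app --confirm my-app
heroku run python manage.py migrate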
I am trying to create a development server from a production server from which I can test out new ideas.
I created a duplicate of my production server's database by dumping it with Postgres' pg_dump and then importing the dump into a new database.
I then copied my production Django directory and altered all .py files to refer to server_debug. rather than server in my import statements.
Using the admin interface to alter some data works in that only the development server has its data altered.
However, when I then try adding a new field in my models.py in my development server, manage.py syncdb fails to create it.
Is there something I am neglecting that could cause manage.py to refer to my production server rather than my development server?
syncdb doesn't touch tables that already exist. You need to either reset the app (easiest if you don't care about the data), modify the table manually (more of a quick hack), or use a migration app that versions your models, such as South.
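If you adopt South for an app whose tables already exist, South ships a convert_to_south management command that creates an initial migration and fakes it in one step (a sketch; the app name is a placeholder, and you run it once per app):

python manage.py convert_to_south myapp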
Background:
I am using Django 1.3. We are using South for DB migrations and Git for SCM.
Problem:
What is the correct way to deal with the migrations Folder that is formed?
The main problem is that I make changes to the DB schema on the development machine, and when I upload the code to the production server I have to migrate the existing schema. While doing that, there is always some issue with the migration files.
Should I just add the migrations folder to the .gitignore, or is there a better way to go about it?
You should add the migrations folder to your version control system and use the same files for production and development. You may run into some problems on your production system if you didn't introduce migrations from the beginning and you already have existing tables.
Therefore you have to fake the first migration, which normally does the same thing as syncdb did when you created your database for the first time.
So when trying to apply migrations for your app for the first time on the production machine, execute manage.py migrate app_name 0001 --fake. This lets South know that the first migration has already been applied (which already happened with syncdb), and when you run migrate again, it will continue with the following migrations.
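Putting it together, the first deployment could look like this sketch (reusing the app_name placeholder from above):

# on the development machine
python manage.py schemamigration app_name --initial

# on the production machine, where the tables already exist from syncdb
python manage.py migrate app_name 0001 --fake

# subsequent migrations then apply normally
python manage.py migrate app_name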