How to run Django migrations in the Google App Engine Flexible deployment step?

I have a Django app up and running in Google App Engine Flexible. I know how to run migrations using the Cloud SQL Proxy or by setting the DATABASES value, but I would like to automate running migrations by doing it in the deployment step. However, there does not seem to be a way to run a custom script before or after the deployment.
The only way I've come up with is doing it in the entrypoint command, which you can set in the app.yaml:
entrypoint: bash -c 'python3 manage.py migrate --noinput && gunicorn -b :$PORT app.wsgi'
This feels like the wrong approach. A lot of Googling didn't turn up a better answer.

Defining the python3 manage.py migrate command in your app.yaml file will make it run every time a new instance is spawned and set up to serve traffic. Although technically this may not be an issue (no migration will happen if the database schema hasn't changed), this isn't the right place to declare it.
You'd want this command to run once on every new code push. This fits perfectly into a CI/CD approach. There are several tutorials in the Google Cloud online documentation using Bitbucket Pipelines or Travis CI, for example, but you can use many other CI/CD solutions.
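For illustration only (not from the original answer), a sketch of what such a pipeline's deploy step might run, assuming the Cloud SQL Auth Proxy binary and service-account credentials are available in the CI environment and that Django's DATABASES setting points at 127.0.0.1:5432:

set -e
# Start the Cloud SQL Auth Proxy in the background.
./cloud_sql_proxy -instances="$INSTANCE_CONNECTION_NAME"=tcp:5432 &
PROXY_PID=$!
sleep 5                                # give the proxy time to open the port
python3 manage.py migrate --noinput    # migrate against the proxied database
kill $PROXY_PID
gcloud app deploy app.yaml --quiet     # deploy only after migrations succeed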

Related

Deploying Django to production: what is the correct way to do it?

I am developing a Django Wagtail application on my local machine, connected to a local Postgres server.
I have a test server and a production server.
However, when I develop locally and then try to upload my changes, there is always some issue with makemigrations and migrate, e.g. a KeyError.
What are the best practices for making sure I do not run into these issues? Which files do I need to port across?
I'll tell you what I do and what most of the companies I worked for as a Django developer did, and I can tell you from experience that it worked pretty well.
First, containerize your application. This will make your life much easier, remove external influences on your code, and give you an easy way to reproduce your environment.
Your Dockerfile should start from some Python image and should do three basic things (see the sketch after this list):
Install your requirements/dependencies
Run the python manage.py migrate --noinput command
Run an HTTP server such as gunicorn with gunicorn -c /gunicorn.py wsgi:application
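A minimal sketch of such a Dockerfile; the Python version, paths, and module names are assumptions, and note that the migrate step has to run at container start, since the database is not reachable at image build time:

FROM python:3.11-slim
WORKDIR /app

# Install the dependencies first so this layer is cached between builds.
# gunicorn is assumed to be listed in requirements.txt.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

# Apply migrations at startup, then hand off to gunicorn.
CMD python manage.py migrate --noinput && gunicorn -c /gunicorn.py wsgi:application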
You will run makemigrations on your local machine and make sure everything works before committing the generated migration files to the repo.
In your gunicorn.py you put the settings to run the app, such as the number of workers (based on CPU count), the bind port, and the folder your app lives in, something like:
import multiprocessing

# Chdir to the specified directory before loading the apps.
# https://docs.gunicorn.org/en/stable/settings.html#chdir
chdir = '/app/'

# Bind the application to port 8000 on all IPv4 interfaces.
# https://docs.gunicorn.org/en/stable/settings.html#bind
bind = '0.0.0.0:8000'

# Scale the worker count with the number of CPUs.
# https://docs.gunicorn.org/en/stable/settings.html#workers
workers = multiprocessing.cpu_count() * 2 + 1
Second, containerize your other services, for example the Postgres database, Redis (for caching), and a connection pooler for the database, depending on the size of your application.
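As a hedged sketch, a docker-compose file for those services might look like this; the image tags, service names, and credentials are placeholders:

services:
  db:
    image: postgres:16
    environment:
      POSTGRES_PASSWORD: change-me   # placeholder credential
    volumes:
      - pgdata:/var/lib/postgresql/data
  redis:
    image: redis:7

volumes:
  pgdata: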
It's highly recommended to have a step in the pipeline to run tests; they need to run before everything else, maybe just after lint.
OK, what now? Now you need a way to deploy that stuff. The best option for that scenario is: push your image to the GitHub registry, and you can add a tag to it, for example:
IMAGE_ID=ghcr.io/${{ github.repository_owner }}/$IMAGE_NAME
# Change all uppercase to lowercase
IMAGE_ID=$(echo $IMAGE_ID | tr '[A-Z]' '[a-z]')
docker tag $IMAGE_NAME $IMAGE_ID:staging
docker push $IMAGE_ID:staging
This can be added to a GitHub Action in the build step, for example.
After your new code is in a new image on GitHub, you just need to update the current one. This can be done by creating a script that does it on the server and running that script from a GitHub Action; it's something like:
docker pull ghcr.io/${{ github.repository_owner }}/$IMAGE_NAME
echo 'Restarting Application...'
docker stop {YOUR_CONTAINER} && docker-compose up -d
sudo systemctl restart nginx
echo 'Cleaning old images'
sudo docker system prune -af
You can see that I create the image with a staging tag. You can create a rule in GitHub Actions, for example, to trigger that action when you create a new release, and create another action triggered on every new commit that builds/deploys a dev tag.
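Purely as a hedged illustration, a release-triggered workflow tying the two snippets above together might look like this; the image name, secrets, deploy user, host, and script path are all assumptions:

name: deploy-staging
on:
  release:
    types: [published]

env:
  IMAGE_NAME: myapp   # placeholder image name

jobs:
  build-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Log in to the GitHub container registry
        run: echo "${{ secrets.GITHUB_TOKEN }}" | docker login ghcr.io -u ${{ github.actor }} --password-stdin
      - name: Build, tag and push the image
        run: |
          docker build -t $IMAGE_NAME .
          IMAGE_ID=ghcr.io/${{ github.repository_owner }}/$IMAGE_NAME
          IMAGE_ID=$(echo $IMAGE_ID | tr '[A-Z]' '[a-z]')
          docker tag $IMAGE_NAME $IMAGE_ID:staging
          docker push $IMAGE_ID:staging
      - name: Update the server
        # Runs the pull/restart script from above on the target host;
        # assumes SSH access has been configured for the runner.
        run: ssh deploy@${{ secrets.DEPLOY_HOST }} 'bash /opt/deploy.sh'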
For the migration problem, the first thing is: when your application goes live, squash every migration into the first one (you can drop the database and all the migrations, then recreate the database and run the makemigrations command again to get there), so you have a clean migration history on the server. Never create unnecessary relations between tables, prefer cached properties over adding new columns, use UUIDs for unique ids, and try not to make breaking changes in the database; it's hard, but if you plan the database beforehand it's not so difficult to do.
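A sketch of that reset flow, assuming a local Postgres database named myapp; it is destructive and only makes sense before the app has ever gone live:

# Delete every migration file except the package markers.
find . -path "*/migrations/*.py" -not -name "__init__.py" -delete
find . -path "*/migrations/*.pyc" -delete

# Recreate the database, then regenerate a single initial migration per app.
dropdb myapp && createdb myapp
python manage.py makemigrations
python manage.py migrate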
Hit me up if you have any questions. A lot of what I said can be done on other platforms such as GitLab, Travis, or Circle CI, but I used GitHub Actions in the example because I think it's simpler to picture.
EDIT:
I forgot to tell you to have a cron job on your server taking backups of your database. The migrate command will apply changes only after verification, but if something else breaks the database, a backup can save your life.

How to migrate the DB in Django using CI/CD with Google Cloud SQL?

I have a django repository setup in gitlab and I am trying to automate build and deploy on google cloud using gitlab CI/CD.
The app has to be deployed on App Engine and has to use CloudSQL for dynamic data storage.
The issue I am facing is executing migrations on the DB before deploying my application.
I am supposed to run ./manage.py migrate, which connects to Cloud SQL.
I have read that we can use the Cloud SQL Proxy to connect to Cloud SQL and migrate the DB, but it kind of seems like a hack. Is there a way to migrate my DB via the CI/CD pipeline script?
Any help is appreciated. Thanks.
When running Django in the App Engine Standard environment, the recommended way of approaching database migration is to run ./manage.py migrate directly from the Console shell or from your local machine (which requires using the Cloud SQL Proxy).
If you want the database migration to be decoupled from your application deployment and to run it in GitLab CI/CD, you could do something along these lines (see the sketch after this list):
Use google/cloud-sdk:latest as the base image
Acquire credentials with gcloud auth activate-service-account --key-file $GOOGLE_SERVICE_ACCOUNT_FILE
Download the Cloud SQL Proxy with wget https://dl.google.com/cloudsql/cloud_sql_proxy.linux.amd64 -O cloud_sql_proxy and make it executable with chmod +x cloud_sql_proxy.
Start the proxy with ./cloud_sql_proxy -instances="[YOUR_INSTANCE_CONNECTION_NAME]"=tcp:3306.
Finally, run the migration script.
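Put together as a .gitlab-ci.yml job, a hedged sketch under those assumptions might look like this (the $INSTANCE_CONNECTION_NAME variable, the sleep, and the pip install step are illustrative):

migrate:
  image: google/cloud-sdk:latest
  stage: deploy
  script:
    - gcloud auth activate-service-account --key-file $GOOGLE_SERVICE_ACCOUNT_FILE
    - wget https://dl.google.com/cloudsql/cloud_sql_proxy.linux.amd64 -O cloud_sql_proxy
    - chmod +x cloud_sql_proxy
    - ./cloud_sql_proxy -instances="$INSTANCE_CONNECTION_NAME"=tcp:3306 &
    - sleep 5                           # give the proxy a moment to start listening
    - pip3 install -r requirements.txt  # the image ships Python, but not your app's deps
    - python3 manage.py migrate --noinput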
You could also create a custom Docker image that already does all of the above behind the scenes; the result would be the same.
If you want to read further on the matter, I suggest taking a look at the following articles: link1, link2.
I'm also trying to find the correct way to do that.
One other hacky way would be to just add a call to it in the settings file that is loaded with your app,
something like what manage.py does:
from django.core.management import execute_from_command_line
execute_from_command_line(['./manage.py', 'migrate'])
So every time you deploy a new version of the app, it will also run the migrations.
I want to believe there are other ways, not involving the proxy, especially if you also want to work with a private IP for the SQL instance; then this script must run in the same VPC.

What to do to run makemigrations/migrate on Heroku when deploying changed models from GitHub?

I have deployed an app from a GitHub repository to my customer's Heroku account as a collaborator, but this time I had to add some new models.
However, I realized that when I deploy my changes from GitHub, Heroku does not run makemigrations and migrate.
I read some answers on Stack Overflow and understood this is how it is supposed to be.
However, my question is: what should I do? What is the best practice for deploying changed models to a Heroku app? (I assume it is not deleting and recreating my app, since the customer already has data there.)
(I am able to run makemigrations and migrate from the bash shell manually, but with 30+ deployments it's a pain.)
Check out the new feature on Heroku called "Release Phase": https://devcenter.heroku.com/articles/release-phase. It will allow you to run migrations during the deployment. Just add whatever command you want to your Procfile, like this:
web: your_web_command
release: python manage.py migrate
The release command will run after your app is done building, and before it's launched.

Run Django migrate in Docker

I am building a Python + Django development environment using Docker. I defined Dockerfiles and services in docker-compose.yml for the web server (nginx) and database (Postgres) containers, and a container that will run our app using uWSGI. Since this is a dev environment, I am mounting the app code from the host system so I can easily edit it in my IDE.
The question I have is where/how to run migrate command.
In case you don't know Django: the migrate command creates the database structure and later changes it as needed by the project. I have seen people run migrate as part of the Compose command directive, command: python manage.py migrate && uwsgi --ini app.ini, but I do not want migrations to run on every container restart. I only want them to run once, when I create the containers, and never again unless I rebuild.
Where/how would I do that?
Edit: there is now an open issue with the Compose team. With any luck, one-time command containers will get supported by Compose. https://github.com/docker/compose/issues/1896
You cannot use RUN because, as you mentioned in the comments, your source is mounted while the container runs.
You cannot use CMD either, since you don't want it to run every time you restart the container.
I recommend running docker exec manually after starting the container, for example as shown below. I do not think there is a way to automate this inside a Dockerfile or docker-compose, because of the two reasons given above.
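A hedged one-liner; the container name is a placeholder for whatever name your app container gets:

docker exec -it myapp_web_1 python manage.py migrate  # 'myapp_web_1' is a placeholder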
It sounds like what you need is a tool for managing project tasks. dobi is a tool designed to handle these tasks (disclaimer: I am the author of this tool).
You can see an example of how to run a migration here: https://github.com/dnephin/dobi/tree/master/examples/init-db-with-rails. The example uses Rails, but it's basically the same idea for Django.
You could set up a task called migrate which would run the command in a container and write the data to a volume. Then, when you start your docker-compose containers, use that volume as the source for your database service.
https://github.com/docker/compose/issues/1896 is finally resolved by the new service profiles introduced with docker-compose 1.28.0. With profiles you can mark services to be started only in specific profiles:
services:
  nginx:
    # ...
  postgres:
    # ...
  uwsgi:
    # ...
  migrations:
    profiles: ["cli-only"]  # profile name chosen freely
    # ...

docker-compose up              # start only your app services, no migrations
docker-compose run migrations  # run migrations on-demand
docker exec -it container-name bash
Then you will be inside the container and can run any command you normally run when developing without Docker.

Running ./manage.py migrate during Heroku deployment

I am working on a Django app, and I would like my database migrations to be run when deploying on Heroku.
So far we have simply put the following command in the Procfile:
python manage.py migrate
When deploying, the migrations are indeed run, but they seem to be run once per dyno (and we use several dynos). As a consequence, data migrations (as opposed to pure schema migrations) are run several times, and data is duplicated.
Running heroku run python manage.py migrate after the deployment is not satisfactory, since we want the database to be in sync with the code at all times.
What is the correct way to do this in Heroku?
Thanks.
This is my Procfile, and it does exactly what you're asking for:
release: python manage.py migrate
web: run-program waitress-serve --port=$PORT settings.wsgi:application
See Heroku docs on defining a release process:
https://devcenter.heroku.com/articles/release-phase#defining-a-release-command
The release command is run immediately after a release is created, but before the release is deployed to the app’s dyno formation. That means it will be run after an event that creates a new release:
An app build
A pipeline promotion
A config var change
A rollback
A release via the platform API
The app dynos will not boot on a new release until the release command finishes successfully.
If the release command exits with a non-zero exit status, or if it’s shut down by the dyno manager, the release will be discarded and will not be deployed to the app’s formation.
Be aware, however, this feature is still in beta.
Update:
When you have migrations that remove models and content types, Django requires confirmation in the console:
The following content types are stale and need to be deleted:
...
Any objects related to these content types by a foreign key will also be deleted. Are you sure you want to delete these content types? If you're unsure, answer 'no'. Type 'yes' to continue, or 'no' to cancel:
The migrate command in your Procfile does not respond, and the release command fails. In this scenario, remove the migrate line, push live, run the migrate command manually, then add it back for future deploys.
The migrate command does not run automatically on Heroku, but for now you can safely run it once your dyno is deployed, with heroku run python manage.py migrate.
If in production, you can put your app into maintenance mode first with heroku maintenance:on --app=<app name here>
Set up your Procfile like in the docs:
release: python manage.py migrate
web: gunicorn myproject.wsgi --log-file -
documented at https://devcenter.heroku.com/articles/release-phase#specifying-release-phase-tasks
You can create a file bin/post_compile which will run bash commands after the build.
Note that it is still considered experimental.
Read here for more buildpack info.
See here for an example, and a minimal sketch below.
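A minimal sketch of such a hook, assuming the classic Heroku Python buildpack, which executes bin/post_compile after the build; the migrate step is an assumption about what you'd put there:

#!/usr/bin/env bash
# bin/post_compile: run by the Heroku Python buildpack after the build finishes.
set -e
python manage.py migrate --noinput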
Alternatively, Heroku is working on a new Releases feature, which aims to simplify and solve this process. (Currently in Beta).
Good luck!