How to run Django migrations inside Kubernetes and apply changes to the database pod?

I have deployed my app to Azure with Kubernetes, with separate pods for the front end, back end, and database. I ran into problems when one of my database fields changed from id to userId. I have tried to apply this change to my deployment database, but without any luck. I logged into my back-end pod, removed the existing migration file, and ran python manage.py makemigrations && python manage.py migrate. Afterwards I checked that everything was OK in the migration file. Beyond this I don't know what to do. Do I need to remove the database pod and create it again? Or how do I update the database inside the pod?

The id -> userId change is a DDL change for your DB. I suggest that you exec into your DB pod and start your DB shell:
kubectl exec -it mysql-pod-name -- bash
Then you should be able to execute your DDL statement. MySQL example:
ALTER TABLE tableName
RENAME COLUMN id TO userId;
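Alternatively, you can let Django apply the rename instead of writing DDL by hand. A minimal sketch, assuming your back-end pod is named backend-pod-name and the updated migration file is already baked into its image:
kubectl exec -it backend-pod-name -- python manage.py migrate
This route keeps Django's django_migrations bookkeeping table in sync with the actual schema, which a manual ALTER TABLE does not.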

Related

Django app running on AWS Elastic Beanstalk cannot write to database

I have a Django app running on AWS Elastic Beanstalk. It was previously connected to a MySQL RDS instance. I have been through several steps to migrate the data from the MySQL instance to a PostgreSQL instance.
In brief, I connected to the Postgres instance and ran python manage.py migrate
I then ran python manage.py sqlflush and executed the resulting SQL commands
Then I imported a JSON fixture of content types from the MySQL DB. I created the fixture (while connected to the MySQL DB) with
python manage.py dumpdata contenttypes --indent=4 --natural-foreign > contenttype.json
and imported it into the PostgreSQL db with
python manage.py loaddata contenttype.json
The above preserves PKs and FKs
I then used AWS DMS to load the data from the MySQL instance to the PostgreSQL instance.
Using the Django app (e.g. through Django admin) all the data can be read from the Postgres instance. However, trying to post data to the database results in a 500 error.
Could this be a problem with security settings in AWS? It seems odd that I can read data from the DB but not post to it.
I've tried connecting to the DB using pgAdmin 4 with the same connection settings. I was able to use an INSERT statement to post to the database from there (but had to explicitly state the PK value).
Update
As suggested by Mark B, the problem was primary keys. Many of the sequences in Postgres had been reset to 1 as the current value. Finding the max value of the ID (PK) in each table and updating the corresponding sequence got everything working. This question on sequences and primary keys was useful.
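For reference, Django ships a management command that prints the SQL needed to realign each sequence with the current max PK, so you don't have to find the max values by hand. A minimal sketch, assuming the app is named myapp:
python manage.py sqlsequencereset myapp | python manage.py dbshell
The first command only prints the SQL; piping it into dbshell executes it against the configured database.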

How to keep Django 2.2 migrations on Kubernetes while developing?

I have a rather complex K8s environment in which one of the Deployments is a Django application. Currently we have a very hard time whenever I need to update a model that has already been migrated to the PostgreSQL database.
Let's say, for instance, that I create an application named Sample that has a simple table in models.py. My development process (Skaffold) builds the Docker image and applies it locally on Minikube. After this is done, I connect to the pod via kubectl exec and execute python manage.py makemigrations and python manage.py migrate; so far so good.
After some time, let's say I need to create a new table in the models.py file of the Sample application. Skaffold builds the image, kills the old pod, and creates the new pod. So I connect as usual via kubectl exec and try to execute the makemigrations and migrate commands, and, lo and behold, there's no migration to apply. And of course no change is made in PostgreSQL.
Upon researching this further, I believe the reason is that the image is built without the Sample/migrations folder while there's already a table (the original one) in PostgreSQL. So when I run makemigrations it creates only the 0001_initial.py file, which has all the tables; but since the table already exists, when executing migrate Django believes the migration is already applied, and therefore won't apply it.
If what I found out is true, how can I keep these files on a PVC, so that they are preserved between each pod recreation?
Thank you.

how to reflect changes made on the Postgres database of a Django project from one PC to another

The three of us are working with Django, and Postgres is the database. Whenever we push the code to GitHub, the database data is not reflected; the data I stored is visible only to me. The Postgres user, password, and database name are the same on all our laptops. How can I make my data reach their databases too when I push?
If you are talking about changes in the DB schema, then take a look at Django migrations: https://docs.djangoproject.com/en/3.1/topics/migrations/. The workflow is as follows:
Change the model (e.g. add a new field, change an existing field, ...)
Generate a migration file by running python manage.py makemigrations. This generates a migration file in <app_folder>/migrations
Run python manage.py migrate to apply the changes in the models to your database
Add the migration file to your version control (GitHub or whatever) and push it
Now when your colleagues get the updated code from version control, they need to run python manage.py migrate to apply the model changes to their local database.
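Put together, the round trip might look like this (a sketch; myapp and the commit message are placeholders):
# Developer A, after changing models.py:
python manage.py makemigrations myapp
python manage.py migrate
git add myapp/migrations/ && git commit -m "add new field" && git push
# Developers B and C, after pulling:
git pull
python manage.py migrate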
I found that it is not possible to push Postgres data around directly, but SQLite can be used for that. This link helps with it: https://manuelvanrijn.nl/blog/2012/01/18/convert-postgresql-to-sqlite/. But I send data by taking a backup, like this.
From Postgres to Postgres, taking the backup:
pg_dump dbname > dbname.bak
and then restoring it on the new Postgres with:
psql dbname < dbname.bak
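A database-agnostic alternative is Django's own fixture mechanism, which serializes data to JSON that can be committed alongside the code. A sketch; data.json is a placeholder file name:
python manage.py dumpdata --indent=4 > data.json   # on the source laptop
python manage.py loaddata data.json                # on each colleague's laptop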

How to completely reset Postgres database on Heroku?

I have a learning project deployed on Heroku. It had a Postgres database provisioned. I introduced some major changes in the models of my Django project, destroyed the old database, and provisioned a new one, which is totally empty, but it is not behaving like an empty database.
When I run the command heroku run python manage.py makemigrations, I get the error message
You are trying to add a non-nullable field....
Why am I getting this message when I have destroyed the old database?
First of all, you should never run manage.py makemigrations on Heroku.
By the time your code gets there no model changes should exist to generate new migrations. Run makemigrations locally to create migration files. Run migrate locally and on Heroku to apply migrations to your database.
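In other words, the split looks roughly like this (a sketch; the heroku remote and main branch names are assumptions about your setup):
python manage.py makemigrations        # locally; commit the generated files
python manage.py migrate               # locally
git push heroku main                   # deploy the code, migrations included
heroku run python manage.py migrate    # apply them on Heroku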
Now that that's out of the way, this is likely caused by existing migration files, not anything in your database. If you truly want to start over, you can delete the files from each of your apps' migrations/ directories (keep each __init__.py so the package remains importable).
Finally, there is no need to destroy and reprovision your database to reset it. Instead you can use heroku pg:reset:
The PostgreSQL user your database is assigned doesn’t have permission to create or drop databases. To drop and recreate your database use pg:reset.
Use this command:
heroku pg:reset
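If the CLI asks which database to reset or you want to skip the interactive confirmation, a fuller invocation looks like this (a sketch; your-app-name is a placeholder):
heroku pg:reset DATABASE_URL --app your-app-name --confirm your-app-name
heroku run python manage.py migrate --app your-app-name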

How to connect with postgresql database after deployment to Heroku?

I'm new to Django, but I'm deploying a Django-based website to Heroku, using PostgreSQL. The deployment was successful, the website is online, and it has established a connection with the database. However, none of the data from my local database has been migrated to the Heroku database, leaving it blank. If I go into the admin section and manually input a data point, it appears on my site, so I know the database is correctly serving data. What is the proper way to migrate data from your local database to your online Heroku version of the database? I thought the following commands would migrate the data:
heroku run python manage.py makemigrations
heroku run python manage.py migrate
But apparently I'm missing something.
makemigrations will create a migration that contains your schema, but no data. The migrate command applies the migration to the database.
In order to provide data to be sent over as part of the migrate command you need to either create a data migration or use a fixture.
Another option you have is to dump your local database and import it into Heroku Postgres.
All in all, it depends on how much local data you want copied over. If it's only a few rows, I would use either a data migration or a fixture; if it's hundreds or thousands of rows, an export/import of your dataset is your best bet.
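For the export/import route, the Heroku CLI can push a local Postgres database straight into the remote one. A sketch; local_db_name and your-app-name are placeholders, and note that pg:push requires the remote database to be empty:
heroku pg:push local_db_name DATABASE_URL --app your-app-name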