How to add Postgres extension when pushing local database to Heroku - django

I have a Django application which relies upon the postgis Postgres extension.
I want to copy my local Database to Heroku using pg:push, but I get numerous django.db.utils.ProgrammingError: relation does not exist errors, following on from:
pg_restore: error: could not execute query: ERROR: type "public.geography" does not exist
LINE 6: coords public.geography(Point,4326),
^
Command was: CREATE TABLE public.outreach_localcompanylocation (
id bigint NOT NULL,
long_name character varying(200) NOT NULL,
city character varying(200) NOT NULL,
country character varying(3) NOT NULL,
coords public.geography(Point,4326),
google_location_id bigint,
state character varying(200)
);
This seems to be because pg:push does not enable the postgis extension (typically enabled by running the SQL command CREATE EXTENSION postgis;).
If I try to enable it manually on a new Heroku database, then run pg:push, I get this error:
Remote database is not empty. Please create a new database or use heroku pg:reset
So is there a way to run CREATE EXTENSION postgis; as part of the pg:push process?

The docs indicate you cannot push to a non-empty database:
...To prevent accidental data overwrites and loss, the remote database must be empty. You’ll be prompted to pg:reset a remote database that isn’t empty...
https://devcenter.heroku.com/articles/managing-heroku-postgres-using-cli#pg-push
Also, are you running your migrations? The fact that Django is raising "relation does not exist" errors suggests they haven't been applied. Try this:
Delete the files in your local app's migrations folder, leaving the __init__.py file in place.
Run python manage.py makemigrations
Run python manage.py migrate
Reset the Heroku database so it's empty and ready for you to push.
Deploy the app with the updated migrations to Heroku.
Run heroku run python manage.py migrate
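Taken together, the steps above look roughly like the sketch below; myapp and my-heroku-app are placeholder names, and pg:reset assumes you are fine with wiping the remote database.

```
# Sketch of the reset-and-remigrate sequence; names are placeholders.
find myapp/migrations -type f -name "*.py" ! -name "__init__.py" -delete
python manage.py makemigrations
python manage.py migrate
heroku pg:reset DATABASE_URL --app my-heroku-app --confirm my-heroku-app
git push heroku main                                  # deploy the new migration files
heroku run python manage.py migrate --app my-heroku-app
```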

Related

Django's --fake-initial doesn't work when migrating with existing tables

I am migrating a project from Django 1.1 to Django 3.0, and I am done with the conversion. When I load the production dump into my local database for the newly converted project, I get "Table already exists".
Here's what I am doing.
mysql> create database xyx;
docker exec -i <container-hash> mysql -u<user> -p<password> xyx < dbdump.sql
Then I run the migrate step, as I had to make some changes to the previously defined models.
./manage.py migrate --fake-initial
This is the output I get:
_mysql.connection.query(self, query)
django.db.utils.OperationalError: (1050, "Table 'city' already exists")
So, what should I do?
Alright boys and girls, here's the approach I followed to solve this problem.
I loaded the entire database dump:
docker exec -i <container-hash> mysql -u<username> -p<password> <dbname> < dump.sql
Now I listed all the migrations I made using
./manage.py showmigrations <app-name>
This gives me the list of all the migrations I have applied. From inspecting them, I realized that my changes spanned the 7th through the 30th migration.
Here's the tedious part, which any sysadmin could script in under four lines of bash. You can generate the raw SQL of any migration with this command:
./manage.py sqlmigrate <app-name> <migration-name> > changes-i-made.sql
Now that I have created my changes-i-made.sql file, I need to run this command 23 more times (once per remaining migration), but with >>; otherwise, each run with a single > will overwrite the changes file.
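Under the assumption that the migrations are numbered 0007 through 0030, the repetitive part can be scripted as a small bash loop. In this self-contained sketch, echo stands in for the actual ./manage.py sqlmigrate call so it runs anywhere; substitute the real command in your project.

```shell
# A runnable stand-in for the sqlmigrate loop; the echo line replaces
# "./manage.py sqlmigrate <app-name> 00${n}" so the sketch needs no project.
: > changes-i-made.sql                    # start with an empty (truncated) file
for n in $(seq -w 7 30); do
  # '>>' appends; a single '>' here would clobber the file on every iteration
  echo "-- SQL for migration 00${n}" >> changes-i-made.sql
done
wc -l < changes-i-made.sql                # one line per migration: 24
```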
Now, once all of your migration changes are recorded in one file, open your SQL shell, connect to the database, and paste in the changes (or do some SQL magic to apply them directly from the file).
Once you're done, go ahead and fake all the migrations, because you don't need Django to apply them; you already did:
./manage.py migrate --fake
and then log in to your production instance, ready to prove wrong the senior team lead who said you couldn't do it.
I checked that this approach works and that future migrations still apply: I created a new one and everything works like a breeze.

How to sync local Django sqlite3 data with Heroku's postgres database?

I have a Django website served by Heroku, with a model storing data about projects that I've worked on in the past. I already ran makemigrations and migrate locally before pushing to Heroku with git and running heroku run python3 manage.py migrate, so my database models and fields are synced. My question is about the field values. Whenever I update the value of a field for a model instance locally, I want that data to sync with Heroku, and vice versa: values I've updated on Heroku in the admin panel should sync to my local sqlite3 database. Is there a command for updating the values of the database itself, or am I missing something? I've looked all over the internet and couldn't find one.
Side note: I also want the command I'm looking for to sync newly created instances, not just data for existing model instances.
Ok, I've figured it out.
First step is to run python3 manage.py dumpdata --exclude contenttypes > data.json. This copies the local database data into a file called data.json (which > creates if it doesn't exist).
Next, git push to heroku and run heroku run python3 manage.py migrate for good measure.
Finally, heroku run python3 manage.py loaddata data.json. This loads the data dumped from the local sqlite3 database into the Heroku Postgres database; the dump is plain JSON, so it is database-agnostic. This synchronizes the Heroku data with the local data.
I haven't tested syncing in the other direction (Heroku to local), but I expect it to work much the same way: heroku run python3 manage.py dumpdata --exclude contenttypes > data.json (the redirection happens in your local shell, so data.json ends up on your machine), then python3 manage.py loaddata data.json locally.
That's all there is to it. If I locally change the name of a project that I worked on, update the date it was last worked on, and write a few more paragraphs on the work process, and I don't want to redo all of that in the heroku shell, then I just synchronize by dumping the data, pushing it to heroku, and loading it there.
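Put together, one full local-to-Heroku sync looks like the sketch below; my-heroku-app is a placeholder, and committing data.json assumes the repository is the one deployed to Heroku.

```
# Sketch of a local -> Heroku data sync; "my-heroku-app" is a placeholder.
python3 manage.py dumpdata --exclude contenttypes > data.json
git add data.json
git commit -m "Update fixture data"
git push heroku main
heroku run python3 manage.py migrate --app my-heroku-app
heroku run python3 manage.py loaddata data.json --app my-heroku-app
```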

How to completely reset Postgres database on Heroku?

I have a learning project deployed on Heroku. It had a Postgres database provisioned. I introduced some major changes in the models of my Django project, destroyed the old database, and provisioned a new one, which is totally empty, but it is not behaving like an empty database.
When I run the command heroku run python manage.py makemigrations, I get the error message
You are trying to add a non-nullable field....
Why am I getting this message when I have destroyed the old database?
First of all, you should never run manage.py makemigrations on Heroku.
By the time your code gets there no model changes should exist to generate new migrations. Run makemigrations locally to create migration files. Run migrate locally and on Heroku to apply migrations to your database.
Now that that's out of the way, this is likely caused by existing migration files, not anything in your database. If you truly want to start over, you can delete the files from each of your apps' migrations/ directories.
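As a sketch of that division of labour (app and path names are placeholders):

```
# Local machine: generate and apply migrations, then commit the files.
python manage.py makemigrations
python manage.py migrate
git add myapp/migrations/
git commit -m "Add migrations"
git push heroku main
# Heroku: only ever apply migrations, never generate them.
heroku run python manage.py migrate
```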
Finally, there is no need to destroy and reprovision your database to reset it. Instead you can use heroku pg:reset:
The PostgreSQL user your database is assigned doesn’t have permission to create or drop databases. To drop and recreate your database use pg:reset.
Use this command:
heroku pg:reset
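In non-interactive form this looks like the following; the app name is a placeholder, and DATABASE_URL refers to the app's primary database.

```
# --confirm skips the interactive confirmation prompt.
heroku pg:reset DATABASE_URL --app my-heroku-app --confirm my-heroku-app
# Re-apply the schema afterwards:
heroku run python manage.py migrate --app my-heroku-app
```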

Django db back up postgres error

I am using django-dbbackup to create a backup of my postgres db. When I run python manage.py dbbackup it creates a default.backup file.
But, when I run python manage.py dbrestore I get the following error.
Restoring backup for database: hera
Finding latest backup
Restoring: /home/dev/Documents/Program Codes/Python/Django/Hera/default.backup
Restore tempfile created: 670.5 KB
Are you sure you want to continue? [Y/n]y
Running: dropdb --username=****** --host=localhost hera
dropdb: database removal failed: ERROR: must be owner of database hera
CommandError: Error running: [u'dropdb', u'--username=dev', u'--host=localhost', u'hera']
This is about making sure the user you're using is the owner of the database. Right now it looks like you're running the dbrestore command using the dev user (from the line CommandError: Error running: [u'dropdb', u'--username=dev', u'--host=localhost', u'hera']).
There are three likely options:
1/ Change the owner of the database to dev using a psql command, something like:
ALTER DATABASE hera OWNER TO dev;
2/ Change the --username=dev that your dbrestore command uses to the actual owner of the database (you can use \l in psql to list your databases and see which user owns each one).
3/ You could give dev superuser and createdb ability, though this is not ideal in terms of security.
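Options 1 and 3 as psql one-liners, assuming you can connect as the postgres superuser on localhost (role and database names are taken from the question):

```
# Option 1: make dev the owner of the hera database.
psql -U postgres -h localhost -c "ALTER DATABASE hera OWNER TO dev;"
# Option 3: give dev createdb rights (dropping a database still requires
# ownership or superuser, so option 1 is usually the better fix).
psql -U postgres -h localhost -c "ALTER ROLE dev CREATEDB;"
```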

Restoring OSQA database to initial state

I installed OSQA using the Bitnami installer and everything worked fine. Now I am hacking at the OSQA code base. If I need to restore the database to its initial state, do I need to reinstall OSQA, or is there a command to truncate the database and load initial data?
I tried using use_osqa.bat and ran python manage.py migrate forum, but it didn't work.
It uses a PostgreSQL database.
You can use django-admin.py flush:
Returns the database to the state it was in immediately after syncdb was executed. This means that all data will be removed from the database, any post-synchronization handlers will be re-executed, and the initial_data fixture will be re-installed.
django-admin.py flush
Finally, this worked for me. Note that this applies only to OSQA in the Bitnami environment with a PostgreSQL database.
If you want to restore your database to its original state, first connect to the postgres database using admin credentials (use any client; I used DBeaver):
database : postgres
username : postgres
password : **admin password** (the same password you provided while installing bitnami-osqa)
Now drop the bitnami_osqa database and create it again (if you already have connections to bitnami_osqa, close them first):
DROP DATABASE bitnami_osqa;
CREATE DATABASE bitnami_osqa;
Now open use_osqa.bat in your bitnami root folder and execute the following
cd apps\osqa
python manage.py syncdb --all --verbosity 2
[answer yes when it prompts you to create a superuser, and create one]
python manage.py migrate forum --fake