A write EPIPE error occurs while loading the Heroku PostgreSQL database from my local database:
heroku addons:create heroku-postgresql:hobby-dev
heroku pg:push postgres://localhost:5432/design-sa-1 DATABASE_URL --app design-sa
This appeared only recently; it did not happen before.
My system is Windows 10, and my local databases are v11 or v12 - the result is the same either way: this error. I have already tried it on two old applications - same result, same error.
The applications themselves (Django 3.0 / Python 3.7) deploy to Heroku without errors.
Locally, python manage.py dumpdata > db.json works perfectly.
I tried removing the tables, and heroku pg:push works, but once I add the tables back it won't proceed.
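For what it's worth, heroku pg:push refuses to write into a remote database that already contains tables, which matches the symptom above. A common workaround (a sketch, reusing the app and database names from the commands above) is to reset the remote database and push again:
heroku pg:reset DATABASE_URL --app design-sa --confirm design-sa
heroku pg:push postgres://localhost:5432/design-sa-1 DATABASE_URL --app design-sa
Note that pg:reset destroys everything in the remote database, so only do this if Heroku holds no data you need.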
Related
I've been stuck on an issue for a while: I have a model that I created locally (class User(AbstractUser)), and I ran "python manage.py makemigrations" and "python manage.py migrate", which works and interacts correctly with the Django server when run locally.
When I push this to Heroku, however, I receive an error response to certain API requests that essentially says "relation "app_user" does not exist". I assumed this was because the PostgreSQL db somehow didn't receive the migrations when the app was deployed to Heroku. After logging into the db, I noticed that it had a few other tables, including "auth_user" (I assume this is a default Django table), but not "app_user". I have run "heroku run bash" and "python manage.py migrate" in an attempt to migrate the changes to the PostgreSQL db, but this has no effect.
I'm not sure if this will be useful, but the PostgreSQL server I'm using is the standard Heroku Postgres Hobby Dev add-on.
My questions are:
Should running "python manage.py migrate" on Heroku update the PostgreSQL database? If so, how does this work?
Given that the migrations are all on Heroku, how can I manually apply them to the PostgreSQL database?
Fixed by deleting the old Heroku db and creating a new one. I'm unsure what exactly caused the issue above, but I suspect it's because we deleted all migrations in the repo at some point in the past (to start from scratch), which meant the current db state no longer corresponded to the migrations in the repo, so they could not be applied. After doing this, running migrate on the remote server works perfectly.
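For anyone landing here, the equivalent CLI sequence would look roughly like this (a sketch; appname is a placeholder, and pg:reset wipes all remote data):
heroku run python manage.py showmigrations -a appname
heroku pg:reset DATABASE_URL -a appname --confirm appname
heroku run python manage.py migrate -a appname
showmigrations lists which migrations Django believes have been applied, which is useful for confirming the out-of-sync state before resetting.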
There is a really old thread on Stack Overflow here - Getting 'DatabaseOperations' object has no attribute 'geo_db_type' error when doing a syncdb - but the difference from their issue is that my containers do have PostGIS and Postgres installed. Specifically, I used QGIS, and the image is like so:
db:
image: kartoza/postgis:13.0
volumes:
- postgis-data:/var/lib/postgresql
So locally I have two docker images - one is web and the other is kartoza/postgis.
I also have this in the settings.py file:
import dj_database_url
db_from_env = dj_database_url.config(conn_max_age=500)
DATABASES['default'].update(db_from_env)
which should support the GIS data. I see all my GIS and geolocation packages installed with no issues, but I am getting the above error when I run heroku run python manage.py migrate.
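One thing worth checking here (my assumption, not something established in this post): dj_database_url derives the Django database engine from the URL scheme, so a plain postgres:// DATABASE_URL resolves to the non-GIS backend, which is exactly the situation that raises the geo_db_type error. The library accepts an engine argument to force the PostGIS backend regardless of the scheme:
import dj_database_url

# Force the GeoDjango backend even when DATABASE_URL starts with postgres://
db_from_env = dj_database_url.config(
    conn_max_age=500,
    engine='django.contrib.gis.db.backends.postgis',
)
DATABASES['default'].update(db_from_env)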
The website runs with very limited functionality, as the geo variables are needed to get past the landing page.
The steps I have taken to deploy are:
heroku create appname
heroku stack:set container -a appname
heroku addons:create heroku-postgresql:hobby-dev -a appname
heroku git:remote -a appname
git push heroku main
EDIT: The db url on Heroku is postgres://foobar:3242q34rq2rq32rf3q2rfq2q2r3vq23rvq23vr@er3-13-234-91-69.compute-
I have also run the command below, and it shows that the db now accepts GIS, but I still get the error:
$ heroku pg:psql
create extension postgis;
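You can confirm from the same psql session that the extension is actually active (a quick check using PostGIS's built-in version function):
select postgis_full_version();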
Try replacing db with localhost:
heroku config:add DATABASE_URL=postgis://foo:bar@localhost:5432/gis
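Before changing anything, it's worth checking what the app is currently pointed at (a sketch; appname is a placeholder):
heroku config:get DATABASE_URL -a appname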
OK, so after 14 or so hours of reading, the issue here is that:
The Heroku database does NOT have PostGIS capabilities enabled by default - you have to use heroku pg:psql and then run CREATE EXTENSION postgis;
Having QGIS installed within the docker container means it cannot perform operations such as heroku pg:push localdatabase postgresql-remote-heroku-db and heroku config:add DATABASE_URL=postgis://foo:bar@localhost:5432/gis. So if anyone is following the Python nearbyshops tutorial, you might end up here as well. I got fixated here thinking it was a db issue, but it had more to do with Heroku configuration settings.
If you are coming here from Docker, you might need to install the django_heroku package to modify your settings. There are 2018 and 2019 guides on how to deploy geospatial apps to Heroku, but again, you don't need those (at least, I didn't).
GIS is now available as an extension; you no longer need to add buildpacks.
There are still some configuration issues if you are using gunicorn for static files, so at first the website won't launch properly. I highly suggest that you track your Heroku log errors (via manual settings.py logging or Heroku add-ons).
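For the static files part, a common approach (my suggestion, not something this answer prescribes) is to let WhiteNoise serve the collected files so gunicorn doesn't have to:
# settings.py - a minimal WhiteNoise setup (pip install whitenoise)
import os

MIDDLEWARE = [
    'django.middleware.security.SecurityMiddleware',
    'whitenoise.middleware.WhiteNoiseMiddleware',  # must come right after SecurityMiddleware
    # ... the rest of your middleware ...
]
STATIC_ROOT = os.path.join(BASE_DIR, 'staticfiles')  # where collectstatic output goes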
Background
I'm migrating a Django app from DigitalOcean to Heroku. I had problems migrating the data, so I used pg_dump to get the schema and the data of each table, then ran those scripts against Heroku. I loaded my website and I can see the new data coming through.
Problem
Now when I push new code with the Heroku CLI, which auto-runs the deployment, it fails with: psycopg2.errors.DuplicateTable: relation "django_content_type" already exists
The commands I run are
git add .
git commit -m "some message"
git push heroku master
The Procfile has release: python manage.py migrate, which runs the migrations on every deploy. I thought about taking that line out, but it will cause an issue when I have migrations to run in the future.
Any thoughts?
This sent me down a rabbit hole this morning, and I figured it out. I am going to leave the question up since I could not find a similar one.
The issue came down to migrations being out of sync locally and remotely. Following the instructions in the top answer on this post cleared up the issue: Django Heroku Error "Your models have changes that are not yet reflected in a migration"
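For anyone who hits the same DuplicateTable error: since the tables were created by hand-run SQL rather than by Django's migrations, another standard trick (a sketch, not the fix from that post) is to mark the initial migrations as applied without running them:
heroku run python manage.py migrate --fake-initial
--fake-initial makes Django skip the CreateTable operations for tables that already exist, after which the release: python manage.py migrate phase runs cleanly on later deploys.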
I have a Django website served by Heroku, with a model storing data about projects that I've worked on in the past. I already ran makemigrations and migrate locally before pushing to Heroku with git and running heroku run python3 manage.py migrate, so my database models and fields are synced; my question is about the field values. Whenever I update the value of a field for a model instance locally, I want that data to sync to Heroku, and vice versa - values I've updated on Heroku in the admin panel should sync back to my local sqlite3 database. Is there a command for updating the values of the database itself, or am I missing something? I've looked all over the internet for the last hour and couldn't find a command to do it.
Side note: I also want the command I'm looking for to sync newly created instances, not just data for existing model instances.
Ok, I've figured it out.
The first step is to run python3 manage.py dumpdata --exclude contenttypes > data.json. This copies the local database data into a file called data.json (which the > redirection creates if it doesn't exist).
Next, git push to heroku and run heroku run python3 manage.py migrate for good measure.
Finally, heroku run python3 manage.py loaddata data.json. This loads the data dumped from the local sqlite3 database into the Heroku Postgres database; the dump itself is engine-neutral JSON, which is why it works across different database backends. Either way, this synchronizes the Heroku data with the local data.
I haven't tested synchronizing the local data with the Heroku data, but I'm sure it'll work much the same way: heroku run python3 manage.py dumpdata --exclude contenttypes > data.json and then git fetch from Heroku (I've never fetched to sync a directory with what's on GitHub before, but it should be straightforward).
That's all there is to it. If I locally change the name of a project I worked on, update the date it was last worked on, and write a few more paragraphs on the work process, and I don't want to redo all of that in the Heroku shell, I just synchronize by dumping the data, pushing it to Heroku, and loading it there.
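Putting the whole cycle together (a sketch; the extra --exclude auth.permission is my addition - permissions, like content types, are auto-generated rows that commonly cause integrity errors on load):
python3 manage.py dumpdata --exclude contenttypes --exclude auth.permission > data.json
git add data.json
git commit -m "sync local data"
git push heroku master
heroku run python3 manage.py migrate
heroku run python3 manage.py loaddata data.json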
I am running a Heroku Django app with Postgres. On my local machine, I have the same app with a local db, and now I want to import my data from the local db into the Heroku db. I am following this guide. I have created a dump file from the local db using:
PGPASSWORD=mypassword pg_dump -Fc --no-acl --no-owner -h localhost -U myuser mylocaldb > mylocaldb.dump
I have uploaded this dump file to Dropbox. On the Heroku terminal, when I enter:
heroku pg:backups restore 'link/to/dump/file' DATABASE_URL
The output after selecting app is:
b008 ---restore---> HEROKU_POSTGRESQL_NAVY
Running... 32.8kB
It gets stuck at that XX.XkB with no progress after that. It drops all the tables in my Heroku db and then does nothing (I checked using psql).
My question is:
How do I check the logs for this restore process (and hence track the error)? heroku logs --tail shows nothing related to the restore.
My local and Heroku db have different names. Is that okay?
My local db has lots of users, foreign keys, South migrations, and admin logs. Is that okay, or does one have to remove all of them before dumping?
Do I run pg:backups restore before or after running syncdb and South migrate?
You can run this command to get the info for the import:
heroku pg:backups info b008
I just ran into a similar issue where I uploaded the file to Dropbox and used the share link as the backup URL. However, I failed to change the ?dl=0 param in the URL to 1, which meant Heroku was trying to download an HTML page instead of the database dump.
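In other words, the restore needs a direct-download link (a sketch; this Dropbox path is made up):
heroku pg:backups restore 'https://www.dropbox.com/s/abc123/mylocaldb.dump?dl=1' DATABASE_URL
With ?dl=1, Dropbox serves the raw file instead of its HTML preview page, and the restore can proceed.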