Heroku Postgres Database to ElephantSQL - django

I have a Postgres database on Heroku and I want to move it to ElephantSQL, but I cannot for the life of me find out how to do it. I have a DUMP file downloaded from the Heroku Postgres database, but I cannot find out how to load the data into ElephantSQL.
My database was linked to a Django app, and I already have my instance on ElephantSQL linked to a copy of the same Django app, but hosted on Render. I can see on ElephantSQL that the instance has all the tables, so I just need to put the data in there somehow.
Any tips or hints are welcome!

The tables that you already see may be the problem. It may also be that the objects containing those tables (the schema or database definitions) are the problem. An SQL dump is just a series of SQL commands that run in order to write schema and data back into a new database.
The first SQL commands set up the databases, schemas, and tables. If your tables are already there, your import may be failing because it tries to create tables that already exist. Sorry, without more information on the specific errors you are seeing, it's hard to be more precise. Hopefully one of these four options helps you. Let me know in the comments how it goes.
For all of the options below, I would suggest backing up your destination server database first, to be sure that you don't break the Django app that currently works there even without your data. A sketch of the corresponding commands follows the list of options.
Option 1: Look for a way to export your source database without database, schema, and table definitions.
Option 2: If you're careful, you can edit your dump file to remove those initial setup commands, and have the dump file start with the commands that only push the data to your tables.
Option 3: (IMPORTANT: back up the destination server first for this one.) Drop the database, schema, and/or tables on the destination server so that nothing is pre-defined, and see if your dump file can reconstruct everything the way Django needs it on the destination server, with your data.
Option 4: It may be a lot of tables, but you can usually export individual tables' data with an option to exclude the table definitions. Then you have to import all the tables as separate imports. A mix of options 3 and 4 may also work: if you can't find an option to exclude the definitions, drop the tables at the destination and import each table independently.
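For reference, here is a rough sketch of what those options can look like with the standard Postgres tools. The connection values are placeholders for your own credentials, and my_table is a hypothetical table name:
# Back up the destination database first (custom-format archive)
pg_dump -h elephant_host -d elephant_database -U elephant_user -Fc -f destination_backup.dump
# Options 1 and 2: dump only the data, skipping CREATE TABLE and other definitions
pg_dump -h heroku_host -d heroku_database -U heroku_user --data-only -f data_only.sql
# Option 3: wipe the destination's public schema so a full dump can recreate everything
psql -h elephant_host -d elephant_database -U elephant_user -c "DROP SCHEMA public CASCADE; CREATE SCHEMA public;"
# Option 4: dump one table's data at a time
pg_dump -h heroku_host -d heroku_database -U heroku_user --data-only -t my_table -f my_table.sql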

Here is what worked for me using a Mac terminal:
Create a backup file from Heroku:
pg_dump -h heroku_host -d heroku_database -U heroku_user -p heroku_port -W -Ft > latest.dump
You'll be prompted to input the password for Heroku PostgreSQL. Note that the heroku_xxx values are found in the Heroku PostgreSQL credentials.
Restore the backup from Heroku to ElephantSQL:
pg_restore -h elephant_host -d elephant_database -U elephant_user -p 5432 -W -Ft latest.dump
You'll be prompted to input the password for ElephantSQL. Note that the elephant_xxx values are found in the ElephantSQL credentials, and that they use the term "server" instead of "host".
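One extra note, offered as an assumption about this setup rather than part of the answer above: the Heroku role that owns the dumped objects won't exist on ElephantSQL, so if pg_restore stops on ownership or privilege errors, skipping them with --no-owner and --no-acl usually helps:
pg_restore -h elephant_host -d elephant_database -U elephant_user -p 5432 -W --no-owner --no-acl -Ft latest.dump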

Related

problem when restoring postgresql database for a django application

I am trying to transfer my testing server's database to my working PC. So I do this:
(Postgres 12.5)
pg_dump -U nutrition -d mednutrition_db --clean -f mednut_bak.sql
and then I try to restore it like this on my Postgres 9.5:
psql -U chris mednutrition_db < mednut_bak.sql
This simple method was working fine for me, but now there seems to be a problem which I cannot solve.
I noticed that Postgres creates the (Django) ID fields using a default value like this (for example):
nextval('django_migrations_id_seq'::regclass)
However, when I restore the database, the default value for the ID fields remains empty, and of course I cannot insert any new data to the restored database.
How can I resolve this?
UPDATE: I just noticed that in my backup file (mednut_bak.sql) there are the proper instructions to create the default ID sequence:
ALTER SEQUENCE public.django_migrations_id_seq OWNED BY public.django_migrations.id;
and
ALTER TABLE ONLY public.django_migrations ALTER COLUMN id SET DEFAULT nextval('public.django_migrations_id_seq'::regclass);
So I guess the problem happens while restoring, because for some reason Postgres ignores these lines.
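One possible explanation, offered as an assumption since the restore errors aren't shown: pg_dump output is only guaranteed to load into the same or newer server versions, and a 12.5 dump can contain syntax (such as CREATE SEQUENCE ... AS integer) that a 9.5 server rejects, so the sequence and the DEFAULT that depends on it never get created. A rough way to check, using the names from the question:
# Re-run the restore and capture the errors instead of letting them scroll by
psql -U chris mednutrition_db < mednut_bak.sql 2> restore_errors.log
# Inspect whether the sequence and the column default actually exist
psql -U chris -d mednutrition_db -c "\d django_migrations"
# If the sequence exists but the default is missing, it can be reattached by hand
psql -U chris -d mednutrition_db -c "ALTER TABLE django_migrations ALTER COLUMN id SET DEFAULT nextval('django_migrations_id_seq'::regclass);"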

Data migration to Heroku Postgres DB?

What options do I have to migrate data to a Postgres database hosted with Heroku?
I have a Django app, and my client is moving from his existing software, which will most likely produce data in Excel format. I will figure out the data model and data conversion; I just need to know what tools to use to do the actual update of the DB.
Your question is probably too broad for this site. But briefly, for Postgres in general…
INSERT
The usual way to put data into a SQL database is the INSERT command.
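For example, against a hypothetical Django-style table (the table and column names here are made up, not from the question):
INSERT INTO myapp_product (name, price)
VALUES ('Widget', 9.99);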
COPY FROM
To add data in bulk rather than one record at a time, call COPY FROM. You specify a file to be imported.
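A minimal sketch with a hypothetical table and CSV file. Note that on a hosted database like Heroku Postgres the server cannot read files from your machine, so psql's client-side \copy variant of the same command is the usual route:
\copy myapp_product (name, price) FROM 'products.csv' WITH (FORMAT csv, HEADER)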

Drop and recreate old tables in Django

I'm in the initial phase of development, and the models are changing around quite a lot.
I have to keep dropping the old tables and then performing a "syncdb"
While I appreciate the reason why syncdb does not alter the old tables, is it possible (or is there any other alternative) to drop the old tables automatically and then run syncdb?
The way I typically do this is at the database level. If, for example, you were using Postgres and just wanted to blow away the whole DB to start fresh, you could do:
dropdb -U postgres "db_name"
createdb -U postgres -O "db_user" "db_name"
For long projects I'm working on, I use a fabfile for automating tasks like the above, as well as grabbing the latest database from my production server, and overwriting my local development db.
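As a rough illustration of that kind of automation, written here as a plain shell script rather than a fabfile, with placeholder names:
#!/bin/sh
# reset_local_db.sh -- drop and recreate the local development database
set -e
dropdb -U postgres db_name
createdb -U postgres -O db_user db_name
python manage.py syncdb --noinput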
Also related is database "migration", which becomes a requirement when you change code after it's been running in production a while. A lot of people/apps use South, but I prefer Nashvegas for my sites.
With Nashvegas, I would create a 0001_add_field_blah.sql file containing my raw SQL commands for altering the db, e.g.:
ALTER TABLE myapp_model RENAME COLUMN first_name TO given_name;
I use python manage.py reset <app>. I don't think there's a way to do it project-wide though.

Django switch from sqlite3 to Postgresql issues

I've recently changed the database server on my project from sqlite3 to Postgresql and I have a few questions that I hope will give an answer to my issues.
I understand that switching from sqlite to Postgres implies that I create the new database and the tables inside it, right? I've done that but I haven't seen any new files created in my project to show me that the database I've made is visible. (Btw, I've changed the database name in settings.py)
I probably should mention that I'm working in a virtual environment and I would like to know if that affects my references in any way. I've tried to import the tables in Django to try and count the number of records in a table, but I get the error: "No module named psdemo". (psdemo is my database name and I'm trying to import the table with:
from ps.psdemo import Product
where ps is my application, psdemo is my database, and Product the table in the database.)
In conclusion I'm trying to get access to my database and tables but I can't manage to find them. I repeat, there is no new database file in my project or in my virtual environment (I've searched thoroughly). Using a terminal, I can connect to my virtual environment and change directories to get to the application folder; and if I connect to the PostgreSQL server I can create the database and the tables, insert into them, make queries, etc. But I cannot access them from the Django code.
I understand that switching from sqlite to Postgres implies that I create the new database and the tables inside it, right? I've done that but I haven't seen any new files created in my project to show me that the database I've made is visible. (Btw, I've changed the database name in settings.py)
All you have to do with Postgres is create the database, not the tables. Django will create the tables, and anything else it thinks is useful, once you call syncdb.
You won't have any new files in your project like you did with sqlite. If you want to view your database, you should download and install pgAdmin III (which I would recommend in any event).
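A minimal sketch of those two steps, assuming psdemo is the database name in your settings.py and db_user is a placeholder for your Postgres user:
createdb -U postgres -O db_user psdemo
python manage.py syncdb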
I probably should mention that I'm working in a virtual environment and I would like to know if that affects my references in any way. I've tried to import the tables in Django to try and count the number of records in a table, but I get the error: "No module named psdemo". (psdemo is my database name and I'm trying to import the table with:
Here, you import models via normal Python syntax, and they then reference your tables. Each model should represent a single table. You define your models first, and then call
python manage.py syncdb
In conclusion I'm trying to get access to my database and tables but I can't manage to find them.
See above, but you should definitely read about postgres installation from the postgres docs, and read the psycopg2 docs as well as the Django docs for setting up a postgres database.
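Since you're working in a virtualenv, one quick sanity check (an assumption about your setup, not something from the question) is that the Postgres driver is importable inside the active environment:
python -c "import psycopg2; print(psycopg2.__version__)"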
I understand that switching from sqlite to Postgres implies that I create the new database and the tables inside it, right? I've done that but I haven't seen any new files created in my project to show me that the database I've made is visible. (Btw, I've changed the database name in settings.py)
Database files are not created in the project directory with PostgreSQL. They are created in the database server's data directory (like /var/lib/postgres; it depends on the distribution). You should generally query the database through a PostgreSQL client that connects to the PostgreSQL server rather than messing with the files directly.
You can, for example, run the command:
manage.py dbshell
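dbshell opens a psql session using the credentials from settings.py, so you can verify the tables from there; for example (ps_product is the default table name Django would generate for a Product model in an app named ps):
\dt
SELECT count(*) FROM ps_product;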
As to your first issue, see #jpic's answer.
On your second issue, your database is not a package, and you do not import models from your database; models are imported from the app's models module (for example, from ps.models import Product, assuming Product is defined in ps/models.py). If you were able to import your models correctly before you made any changes, change your import statements back to how they were.

Tables not recognized on restored geodjango postgis backup

So I backed up my GeoDjango PostGIS database using pg_dump before performing some calculations, which I've managed to mess up. I've created a new database using:
createdb -T template0 -O proj proj_backup
psql proj_backup < proj_backup.pg
This seemed to work fine (though there were a few errors during the import), and when connecting to the database using psql, all my tables are there and appear to have the correct numbers of rows, etc.
However, changing my settings.py to connect to my newly imported backup db (proj_backup in my example), gives me the following errors:
DatabaseError: relation "appname_model" does not exist
Any ideas? I'm guessing I did the dump wrong, or that I haven't maintained the ForeignKeys somehow. Thanks very much.
Update
So I figured out my first mistake: I had two similarly named backup databases and connected to the wrong one. Connecting to the correct one seems to have fixed everything. However, it's still quite strange that it didn't recognize the tables in the other backup database, which definitely did exist. Running syncdb on the incorrect database ends up duplicating those tables (if I remember correctly, there were duplicate table names when I listed them all from within psql). Unfortunately, the way I discovered my mistake was by dropping the bad table to recreate it, so in order to reproduce this error I'll probably have to use Time Machine. Still very strange; I'll give that a shot when I can get physical access to my work machine.
So is your appname_model table there or is it a view? Was it in public or another named schema?
If the table is there, then chances are you have it in a schema that is not in your database search path. Check the search_path of your old database. It might have included something other than the default, or your default search schema is set in postgresql.conf and is non-standard.
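Both are quick to check from psql; these commands are generic rather than specific to this database:
SHOW search_path;
-- and list tables in every schema, not just those on the search path:
\dt *.*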