So I backed up my GeoDjango PostGIS database using pg_dump before performing some calculations, which I've managed to mess up. I've created a new database using
createdb -T template0 -O proj proj_backup
psql proj_backup < proj_backup.pg
This seemed to work fine (though there were a few errors during the import), and when I connect to the database using psql, all my tables are there and appear to have the correct numbers of rows, etc.
However, changing my settings.py to connect to my newly imported backup db (proj_backup in my example) gives me the following error:
DatabaseError: relation "appname_model" does not exist
Any ideas? I'm guessing I did the dump wrong, or that I haven't maintained the ForeignKeys somehow. Thanks very much.
Update
So I figured out my first mistake: I had two similarly named backup databases and connected to the wrong one. Connecting to the correct one seems to have fixed everything. However, it's still quite strange that it didn't recognize the tables in the other backup database, which definitely did exist. Running syncdb on the incorrect database ends up duplicating those tables (if I remember correctly, there were duplicate table names when I listed them all from within psql). Unfortunately, the way I discovered my mistake was by dropping the bad table to recreate it, so in order to reproduce this error I'll probably have to use Time Machine. Still very strange; I'll give that a shot when I can get physical access to my work machine.
So is your appname_model table there or is it a view? Was it in public or another named schema?
If the table is there, then chances are you have it in a schema that is not in your database search path. Check the search_path of your old database. It might have included something other than the default, or your default search schema is set in postgresql.conf and is non-standard.
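For example, a minimal sketch of how you might check this from a Django shell (python manage.py shell) pointed at the backup database; the table name here is just the one from the error message:

from django.db import connection

with connection.cursor() as cursor:
    # Show the schema search path this connection is actually using.
    cursor.execute("SHOW search_path")
    print(cursor.fetchone())
    # Find out which schema(s) contain a table with that name.
    cursor.execute(
        "SELECT schemaname, tablename FROM pg_tables WHERE tablename = %s",
        ["appname_model"],
    )
    print(cursor.fetchall())

If the table turns up in a schema that is not on the search path, that would explain the "relation does not exist" error.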
Related
I'm working in a development environment on a flask-app with a Postgres 10 database that has ~80 tables. There are lots of relationships and ForeignKeyConstraints networking it all together.
It was working fine with Flask-Migrate. I'd bootstrapped and migrated up to this point with ~80 tables. But, I wanted to test out some new scripts to seed the database tables, and thought it would be quickest to just drop the database and bring it back up again using Flask-Migrate.
In this process, the migration folder was deleted, so I just started over fresh with a db init. Then ran db migrate. I manually fixed a few imports in the migrate script. Finally, I ran db upgrade.
However, now with all these 80 create_table commands in my migrate script, when I run db upgrade, I receive an error:
sqlalchemy.exc.ProgrammingError: (psycopg2.ProgrammingError) relation "items" does not exist
I receive this error for every Child table that has a ForeignKeyConstraint, if the Child table does not appear below its Parent table in the migration file.
But, the autogenerated script from db migrate has the tables sorted alphabetically, ordered by table name.
Referring to the documentation, I don't see the importance of sort order mentioned.
Bottom line: it seems I'm either forced to write a script to sort all these tables so that each Parent table comes above its Child tables, or else just cut and paste like a jigsaw puzzle until all the tables are in the required order.
What am I missing? Is there an easier way to do this with Flask-Migrate or Alembic?
After researching this, it seems Flask-Migrate and Alembic do not have any built-in method to resolve this sort-order issue. I fixed it by cutting and pasting the tables into an order which ensured each Parent table was above its Child tables in the migration file.
I've just encountered this myself, and could not find a better and/or official answer.
My approach was to separate the table creation from the creation of the foreign key constraints (see the sketch after these steps):
1. Edit Alembic's auto-generated migration script: in each table create operation, remove all lines creating foreign key constraints.
2. Run Alembic's upgrade command (tables are created, minus the FK constraints, of course).
3. Run Alembic's migrate command (an additional migration script is created that adds all the FK constraints).
4. Run Alembic's upgrade command (FK constraints are added to the tables).
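A rough sketch of what steps 1 and 3-4 end up looking like, assuming a parent table items and a child table order_items (the real names come from your own models); each snippet lives inside the upgrade() of its respective migration script:

from alembic import op
import sqlalchemy as sa

# First migration (step 1): keep the create_table calls but delete their
# ForeignKeyConstraint lines.
op.create_table(
    "order_items",
    sa.Column("id", sa.Integer(), primary_key=True),
    sa.Column("item_id", sa.Integer(), nullable=False),
    # removed: sa.ForeignKeyConstraint(["item_id"], ["items.id"]),
)

# Second migration (steps 3-4): only adds the constraints, which succeeds
# because every table already exists by now.
op.create_foreign_key(
    "fk_order_items_item_id",  # constraint name
    "order_items",             # source (child) table
    "items",                   # referenced (parent) table
    ["item_id"],               # local column(s)
    ["id"],                    # referenced column(s)
)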
I've faced some problems when using Flask-SQLAlchemy and Flask-Migrate; I solved them using the Python interactive shell.
>>> from yourapp import db, create_app
>>> db.create_all(app=create_app())
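Note that this depends on your Flask-SQLAlchemy version; as far as I know, on 3.x create_all() no longer accepts an app argument, so the rough equivalent is to push an application context first:

>>> from yourapp import db, create_app
>>> app = create_app()
>>> with app.app_context():
...     db.create_all()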
Check this link to get more information.
Happy coding...
I have two different databases in django. Initially, I had a table called cdr in my secondary database. I decided to get rid of the second database and just add the cdr table to the first database.
I deleted references (all of them, I think) to the secondary database in the settings file and throughout my app. I deleted all of the migration files and ran makemigrations fresh.
The table that used to be in the secondary database is not created when I run migrate, even though it doesn't already exist in my postgres database.
I simply cannot for the life of me understand why: makemigrations will create the migration file for the table when I add it back into the model definition, and I have verified that the table is in the migration file, yet when I run migrate, it tells me there are no migrations to apply.
Why is this so? I have confirmed that I have managed=True. I have confirmed that the model is not in my postgres database by logging into the first database and running \dt.
Why does Django still think that this table still exists such that it is telling me no migrations to apply even though it shows a create command in the migrations file? I even dropped the secondary database to make sure it wasn't somehow being referenced.
I suspect code isn't needed to explain this to me but I will post if needed. I figure I am missing something simple here.
Why does Django still think that this database still exists such that it is telling me no migrations to apply even though it shows a create command in the migrations file
Because Django maintains a table called django_migrations in your database which lists all the migrations that have been applied. Since you are almost starting afresh, clear out this table and then run the migrations.
If this still doesn't work, and still assuming that you are on a fresh start, it's a simple matter to drop all the tables (or even the database) and do the migration again. On the other hand, if you have data you want to save, you need to look at the --fake and --fake-initial options to migrate.
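A sketch of clearing the recorded migrations for a single app from the Django shell (python manage.py shell); "appname" is a placeholder for your actual app label:

from django.db.migrations.recorder import MigrationRecorder

# Delete the rows in django_migrations that belong to this app only.
MigrationRecorder.Migration.objects.filter(app="appname").delete()

After that, python manage.py migrate appname should apply the initial migration again; if the tables themselves already exist, migrate appname --fake-initial records the migration as applied without trying to recreate them.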
I'm in the initial phase of development, and the models are changing around quite a lot.
I have to keep dropping the old tables and then performing a "syncdb"
While I appreciate the reason why syncdb does not alter the old tables, is it possible (or is there any other alternative) to drop the old tables automatically and then run syncdb?
The way I typically do this is at the database level. If, for example, you were using postgres and just wanted to blow away the whole DB to start fresh, you could do:
dropdb -U postgres "db_name"
createdb -U postgres -O "db_user" "db_name"
For long projects I'm working on, I use a fabfile for automating tasks like the above, as well as grabbing the latest database from my production server, and overwriting my local development db.
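As an illustration, a rough sketch of that kind of task in a fabfile (Fabric 1.x style; the database name and owner are placeholders):

# fabfile.py
from fabric.api import local

def reset_db():
    """Drop and recreate the local development database, then rebuild the schema."""
    local('dropdb -U postgres db_name')
    local('createdb -U postgres -O db_user db_name')
    local('python manage.py syncdb --noinput')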
Also, related is database "migration", which becomes a requirement when you change code after it's been running in production a while. A lot of people / apps use South, but I prefer Nashvegas for my sites.
With Nashvegas, I would create a 0001_add_field_blah.sql file which contained my raw SQL commands for altering the db, e.g.:
ALTER TABLE myapp_model RENAME COLUMN first_name TO given_name;
I use python manage.py reset <app>. I don't think there's a way to do it project-wide though.
I've recently changed the database server on my project from sqlite3 to PostgreSQL, and I have a few questions that I hope will clear up my issues.
I understand that switching from sqlite to Postgres implies that I create the new database and the tables inside it, right? I've done that but I haven't seen any new files created in my project to show me that the database I've made is visible. (Btw, I've changed the database name in settings.py)
I probably should mention that I'm working in a virtual environment, and I would like to know if that affects my references in any way. I've tried to import the tables in Django to try and count the number of records in a table, but I get the error: "No module named psdemo". (psdemo is my database name and I'm trying to import the table with:
from ps.psdemo import Product
where ps is my application, psdemo is my database, and Product is the table in the database.)
In conclusion, I'm trying to get access to my database and tables but I can't manage to find them. I repeat, there is no new database file in my project or in my virtual environment (I've searched thoroughly). If I use a terminal connection, I can connect to my virtual environment and change directories to get to the application folder; then, if I connect to the PostgreSQL server, I can create the database and the tables, insert into them, make queries, etc. But I cannot access them from the Django code.
I understand that switching from sqlite to Postgres implies that I create the new database and the tables inside it, right? I've done that but I haven't seen any new files created in my project to show me that the database I've made is visible. (Btw, I've changed the database name in settings.py)
All you have to do with postgres is create the database, not the tables. Django will create the tables, and anything else it thinks is useful, once you call syncdb.
You won't have any new files in your project like you did with sqlite. If you want to view your database, you should download and install pgAdmin III (which I would recommend in any event).
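For reference, the DATABASES setting for PostgreSQL typically looks something like this (the name, user and password values are placeholders; fill in your own):

# settings.py
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'psdemo',
        'USER': 'db_user',
        'PASSWORD': 'secret',
        'HOST': 'localhost',
        'PORT': '5432',
    }
}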
I probably should mention that I'm working in a virtual environment, and I would like to know if that affects my references in any way. I've tried to import the tables in Django to try and count the number of records in a table, but I get the error: "No module named psdemo". (psdemo is my database name and I'm trying to import the table with:
Here, you import models via normal Python syntax, and Django then maps them to your tables. Each model should represent a single table. You define your models first, and then call
python manage.py syncdb
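For example, a minimal sketch of what such a model might look like; Product here just stands in for whatever your real model is:

# ps/models.py
from django.db import models

class Product(models.Model):
    name = models.CharField(max_length=100)
    price = models.DecimalField(max_digits=8, decimal_places=2)

Once syncdb runs, Django creates a ps_product table for this model in the PostgreSQL database.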
In conclusion I'm trying to get access to my database and tables but I can't manage to find them.
See above, but you should definitely read about postgres installation from the postgres docs, and read the psycopg2 docs as well as the Django docs for setting up a postgres database.
I understand that switching from sqlite to Postgres implies that I create the new database and the tables inside it, right? I've done that but I haven't seen any new files created in my project to show me that the database I've made is visible. (Btw, I've changed the database name in settings.py)
Database files are not created in the project directory with PostgreSQL. They are created in the database server's data directory (like /var/lib/postgres; it depends on the distribution). You should generally query the database through a PostgreSQL client that connects to the PostgreSQL server rather than messing with the files directly.
You can, for example, run the command:
manage.py dbshell
As to your first issue, see #jpic's answer.
On your second issue, your database is not a package, and you do not import models from your database. If you were able to import your models correctly before you made any changes, change your import statements back to how they were.
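For example, assuming the Product model is defined in ps/models.py, the import would look roughly like:

# Import the model from the app's models module, not from the database name.
from ps.models import Product

Product.objects.count()  # number of rows in the ps_product table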
I have two Django sites: one for development and one for production. Every once in a while, the data from the development database needs to be transferred to the production database or the other way around. I use postgresql.
This works fine: I empty the tables from the database I want to copy to, generate SQL from the applicable tables, and insert the data into the emptied tables. So far, so good.
But when I enter data into the database via the admin interface, Django raises IntegrityErrors, because appname_modelname_pkey already exists.
I think this is because the admin interface wants to add data with id 1, but that's already an imported record. Django isn't aware that id '1' is already taken.
How do I fix this problem? I want Django to increment the id (like SQL auto_increment would do), no matter what data is already stored.
Any help is appreciated!
If you're using postgres on both sides, then the sequences associated with the primary keys are going to be different.
For example, suppose you're moving production data to development. Also, suppose the sequence value in production is 20 and the sequence in development is 10. Then the first new item you add in development will have an id of 11. That id (probably) already exists in the production data, so you get an integrity error.
When you restore tables from a dump, you can reset the sequences by dropping and recreating the existing tables before you restore.
(Or, you can probably use the ALTER SEQUENCE command to sync up the sequences. However, I'm not familiar enough with postgres to say whether that's the right way to do it.)
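For what it's worth, a sketch of resetting one table's primary key sequence from a Django shell on the target database ("appname_modelname" is a placeholder for the real table name):

from django.db import connection

with connection.cursor() as cursor:
    # Move the sequence past the highest imported id so new inserts don't collide.
    cursor.execute(
        "SELECT setval(pg_get_serial_sequence('appname_modelname', 'id'), "
        "COALESCE((SELECT MAX(id) FROM appname_modelname), 1))"
    )

Django also ships a sqlsequencereset management command that prints the equivalent SQL for a whole app, which you can then pipe into manage.py dbshell.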