Background
Hi. I'm uploading my Django app's database to my Heroku production environment. When I run manage.py loaddata I get the following error:
django.db.utils.IntegrityError: Problem installing fixtures: insert or update in table "catalog_contactos" violates key constraint.
I found this post where someone suggests that you should first load the models that the foreign keys depend on: related post
I also found this post, which refers to moving a database to another project (the production version, for example). Basically it says:
When you back up the whole database using the dumpdata command, it will back up all the database tables.
If you use this database dump to load a fresh database (in another Django project), it can cause an IntegrityError (if you loaddata into the same database, it works fine).
To fix this problem, make sure to back up the database while excluding the contenttypes and auth.permission tables.
Link to the post
My code
Following those suggestions, I run:
python manage.py dumpdata --exclude auth.permission --exclude contenttypes > data.json
Then on the server I run the following command and get the integrity error:
heroku run python manage.py loaddata data.json
Should I dumpdata model by model and loaddata them in a certain order? Or am I doing something wrong? Any clues?
Thanks in advance!
Update
Could this "Natural keys" issue be the solutuion? Django serializing objects
When I had an integrity error with Django/PostgreSQL, I dropped the offending table with DROP TABLE in PostgreSQL.
Related
My project uses an SQLite3 DB, but now I need to migrate from SQLite3 to PostgreSQL. I used DBeaver to do this, with the "Import Data" option. After importing all the tables from SQLite3 to PostgreSQL, I noticed this error when I try to add new content:
IntegrityError at /engine/kernel/keyconcept/add/
duplicate key value violates unique constraint
"kernel_keyconcept_pkey" DETAIL: Key (id)=(4) already exists.
If I add a new model to this project that uses the new DB (PostgreSQL), there are no problems adding content, but if I use the old models I get the IntegrityError above.
I did a test with a model that has only two rows: on the first attempt I got the same error but for key id 1, and on the second attempt the same error but for key id 2. On the third attempt I was able to add the content, and the same thing happened for subsequent attempts.
So I think that after the migration the auto-increment counter of every table does not continue from the last record but starts from zero. It's strange, because previously I did the same kind of migration from SQLite3 to SQLite3 without problems.
How can I solve this?
You have to reset the sequences used to generate primary keys. There is a management command that prints the necessary SQL: sqlsequencereset
Note that it only prints out the SQL; you have to execute it yourself (which you can probably do with DBeaver).
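For example, guessing that the app label is kernel (from the kernel_keyconcept table name), something along these lines should do it:
# assumption: the app label is kernel; add any other app labels as needed
python manage.py sqlsequencereset kernel | python manage.py dbshell   # pipes the generated setval statements straight into psql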
I solved it by using dumpdata and loaddata instead of DBeaver; following the examples below, everything works fine.
In my case I needed to exclude some apps such as auth and contenttypes, so with dumpdata I used this:
python3 manage.py dumpdata --exclude contenttypes --exclude auth --indent 2 > db.json
To load the data I used this:
python3 manage.py loaddata db.json
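As far as I know, loaddata also resets the primary-key sequences for the models it loads, which would explain why this route avoids the duplicate-key errors that the DBeaver import produced. If you want to double-check, you can inspect a sequence from dbshell (the sequence name below is an assumption based on PostgreSQL's default <table>_<column>_seq naming):
python manage.py dbshell
SELECT last_value FROM kernel_keyconcept_id_seq;   -- should now be >= the highest existing id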
I have a PostgreSQL database on which I ran python manage.py dumpdata to back up the data into a JSON file. I created a new PostgreSQL database, ran migrate, and everything worked like clockwork. But when I try to load the backup.json file with python manage.py loaddata backup.json, it gives me this error:
Could not load contenttypes.ContentType(pk=15): duplicate key value violates unique constraint "django_content_type_app_label_76bd3d3b_uniq"
DETAIL: Key (app_label, model)=(navigation, navigation) already exists.
I checked phpPgAdmin, and there is a row for News. Is there a way to load the backup JSON file without including the content types, or better yet, to dump everything except the content types data?
I was getting a similar error:
django.db.utils.IntegrityError: Problem installing fixture '/home/knysys/ogmius/ogmius/db.json': Could not load contenttypes.ContentType(pk=2): duplicate key value violates unique constraint "django_content_type_app_label_model_76bd3d3b_uniq" DETAIL: Key (app_label, model)=(auth, user) already exists
After that, I found out that if you are restoring a fresh database from another database, you need to run dumpdata like this:
./manage.py dumpdata --exclude auth.permission --exclude contenttypes > db.json
and then load the fresh database like this:
./manage.py loaddata db.json
You can use the following command to load data:
python manage.py loaddata --exclude auth.permission --exclude contenttypes back.json
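If re-dumping is not an option because the dump you have already contains the content types, another workaround I've seen suggested (a sketch only; use with care, since deleting content types cascades to rows that reference them, such as permissions) is to empty the content types table in the fresh database before loading:
python manage.py shell
>>> from django.contrib.contenttypes.models import ContentType
>>> ContentType.objects.all().delete()
>>> exit()
python manage.py loaddata backup.json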
I have a django 1.8 app working with a db.
I'm trying to change the schema of a table using the built-in migration.
Here are the steps I did:
In my dev environment, I grabbed the app source and ran
python manage.py syncdb
then I ran
python manage.py loaddata ~/my_data.json
then I modified models.py: added a field and renamed a field, all on the same table 'TABLE1', which had no data.
then
python manage.py makemigrations myapp
python manage.py migrate
Error: django.db.utils.OperationalError: table "myapp_someother_table" already exists
then ran
python manage.py migrate --fake-initial
worked!
but when I browsed to the admin page for TABLE1, I got this error:
OperationalError: no such column: myapp_table1.my_new_field_id
I checked the db and yes, there is no such column.
How can I proceed from here? I'd prefer to fix this via Django.
If I fix it straight in the db, then the migration goes out of sync.
Migrations do not automagically see that you have made changes. Migrations detect changes by comparing the current model with the historical model saved in the migration files.
In this case, you didn't have any historical models, since you didn't have any migrations. Django was not able to detect any changes in your models, even though they were different from your database.
The correct way to make changes to your model is to first run manage.py makemigrations <my_app>, then make the changes to your model, followed by another manage.py makemigrations.
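Put as commands, the flow looks roughly like this (a sketch, assuming the app is called myapp):
python manage.py makemigrations myapp   # creates 0001_initial.py from the current models
python manage.py migrate                # or: migrate --fake-initial if the tables already exist in the db
# ...now edit models.py (add the field, rename the field)...
python manage.py makemigrations myapp   # changes are detected by diffing against the saved 0001 state
python manage.py migrate                # applies the new migration and adds the missing column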
You might not be able to do it via pure Django and keep your data. I don't have personal experience with South, but there are a lot of mentions of this tool. Just in case nothing else works for you...
Here is what I did to make things work (rough commands are sketched below), but there must be a better way, so please add more answers/comments...
I deleted the sqlite db and the migration folder
I made the desired changes to models.py
ran syncdb
ran loaddata to load the json data dump that I had saved previously.
just started the dev server
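Roughly, the commands were (a sketch; the SQLite file name db.sqlite3 and the app name myapp are assumptions):
rm db.sqlite3                 # assumption: the SQLite file is named db.sqlite3
rm -r myapp/migrations/       # assumption: the app is called myapp
# edit models.py with the desired changes, then rebuild the schema and reload the data
python manage.py syncdb
python manage.py loaddata ~/my_data.json
python manage.py runserver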
This is a very basic question, but the other SO questions I read do not seem to answer it.
I checked out a project from a repository which contains some models, some of them having migration files.
I did not create any local database yet
What is the correct way of creating all the tables for the project?
I thought I'd start by executing
./manage.py syncdb
This creates some tables, but for the ones using South it tells me to run migrate. So I execute
./manage.py migrate
but then I get the error that
DatabaseError: table "mytable" already exists
I usually use ./manage.py syncdb --migrate and it does not throw that kind of error.
Are you sure that none of your other apps use a table with the same name?
You can check which tables your app wants to create by running something like this from your project directory:
./manage.py sql YourAppName
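If the table that "already exists" belongs to an app that South manages, the usual workaround in that situation is to fake that app's initial migration so South records it as applied without trying to create the table again (a sketch; myapp and the migration number 0001 are assumptions):
./manage.py migrate myapp 0001 --fake   # mark the initial South migration as applied without touching the db
./manage.py migrate myapp               # then apply any remaining migrations normally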
I'm trying to do a transition from MySQL to SQLite for a small site. django-tagging is used for one of the models. For the transition I'm using the dumpdata » loaddata method.
The dumpdata command works fine to export everything from the MySQL database into JSON. When I try to run the loaddata command for the SQLite database, I get this error:
IntegrityError: column name is not unique
I have isolated this error to the tagging app. The loaddata command won't tell me more (the --verbosity and --traceback options don't help).
How can I find out what's wrong in the input data (the tags) so I can import it? Or is there another way to do this whole thing?
It looks like doing
manage.py reset tagging
before running the loaddata command solves this problem.
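For reference, the full sequence is roughly this (a sketch; note that the reset management command only exists in older Django versions, having been removed around 1.5, and the dump file name is an assumption):
python manage.py reset tagging      # old Django only: drops and re-creates the tagging app's tables after a confirmation prompt
python manage.py loaddata dump.json # assumption: dump.json is the file produced earlier by dumpdata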