I'm trying to migrate a small site from MySQL to SQLite. django-tagging is used for one of the models. For the transition I'm using the dumpdata » loaddata method.
The dumpdata command works fine to export everything from the MySQL database into JSON. When I try to run the loaddata command for the SQLite database, I get this error:
IntegrityError: column name is not unique
I have isolated this error to the tagging app. The loaddata command won't tell me more (the --verbosity and --traceback options don't help).
How can I find out what's wrong in the input data (the tags) so I can import it? Or is there another way to do this whole thing?
It looks like doing
manage.py reset tagging
before running the loaddata command solves this problem.
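For reference, the whole sequence looked roughly like this (the fixture name dump.json is just an example, and the reset command only exists in older Django versions):
python manage.py dumpdata > dump.json    # run against the MySQL settings
# switch DATABASES to SQLite and create the tables, then:
python manage.py reset tagging           # empties the tagging tables so the fixture rows don't collide
python manage.py loaddata dump.json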
Related
My project uses an SQLite3 DB, but now I need to migrate from SQLite3 to PostgreSQL. I used DBeaver's "Import Data" option to do this. After importing all the tables from SQLite3 to PostgreSQL, I noticed this error when I try to add new content:
IntegrityError at /engine/kernel/keyconcept/add/
duplicate key value violates unique constraint
"kernel_keyconcept_pkey" DETAIL: Key (id)=(4) already exists.
If I add a new model to this project that uses the new DB (PostgreSQL), there is no problem adding content, but if I use the old models I see the IntegrityError above.
I did a test with a model that has only two rows: on the first attempt I received the same error but for key id 1, on the second attempt the same error but for key id 2. On the third attempt I was able to add the content, and the same thing happened on the following attempts.
So I think that after the migration the primary key counter for every table starts from zero instead of from the last existing row. It's strange, because I previously did the same migration from SQLite3 to SQLite3 without problems.
How can I solve this?
You have to reset the sequences used to generate primary keys. There is a management command that prints the necessary SQL: sqlsequencereset
Note that it only prints out the SQL, you have to execute it manually (which you can probably do with DBeaver).
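For example, assuming the app label is kernel (inferred from the kernel_keyconcept table name), you could run:
python manage.py sqlsequencereset kernel
# or pipe the generated SQL straight into the database shell
python manage.py sqlsequencereset kernel | python manage.py dbshell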
I solved it using dumpdata and loaddata instead of DBeaver, and following these examples everything works fine.
In my case I needed to exclude some apps such as auth and contenttypes, so with dumpdata I used this:
python3 manage.py dumpdata --exclude contenttypes --exclude auth --indent 2 > db.json
To load the data I used this:
python3 manage.py loaddata db.json
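For context, the full sequence was roughly the following (switching DATABASES in settings.py between the two databases is an assumption about the setup):
# with settings still pointing at the old SQLite3 database
python3 manage.py dumpdata --exclude contenttypes --exclude auth --indent 2 > db.json
# point DATABASES in settings.py at the new PostgreSQL database and create the schema
python3 manage.py migrate
# then load the dump
python3 manage.py loaddata db.json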
Background
Hi. I'm uploading my Django app database to my Heroku production environment. When I try to run manage.py loaddata I get the following error:
django.db.utils.IntegrityError: Problem installing fixtures: insert or update on table "catalog_contactos" violates foreign key constraint.
I found this post where someone suggests that you should first load the models that are the base for foreign keys: related post
Also I found this post about moving a database to another project (the production version, for example). Basically it says:
When you back up the whole database using the dumpdata command, it will back up all the database tables.
If you use this database dump to load a fresh database (in another Django project), it can cause an IntegrityError (if you loaddata into the same database it works fine).
To fix this problem, make sure to back up the database excluding the contenttypes and auth.permissions tables.
Link to the post
My code
Following those suggestions I do:
python manage.py dumpdata --exclude auth.permission --exclude contenttypes > data.json
Then on the server I run the following code and get the integrity error:
heroku run python manage.py loaddata data.json
Should I dumpdata model by model and loaddata them in a certain order? Or am I coding something wrong? Any clues?
Thanks in advance!
Update
Could this "Natural keys" issue be the solutuion? Django serializing objects
When I had an integrity error with Django/PostgreSQL, I ran DROP TABLE <table_name> in PostgreSQL.
I deleted that table (whoops), and now I'd like to regenerate it. Is there a command that allows me to do that? Looking online, everyone is saying to use the command ./manage.py syncdb, but that command is no longer available in the most recent version of Django. So I tried ./manage.py migrate, but that didn't generate the table. I also tried ./manage.py migrate --run-syncdb, but that didn't do it either.
I'm pretty sure I can do it by hand, but I'm hoping there's a way to do this with a built-in command.
Since the admin app only has one table, django_admin_log, you can revert all migrations for the admin app by running
python manage.py migrate admin zero
then re-apply the admin app migrations by running
python manage.py migrate
You might want to create a backup before doing this (or any migration, really) :)
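A quick way to back up just the admin log entries before reverting (the output filename is only an example):
python manage.py dumpdata admin.logentry --indent 2 > admin_log_backup.json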
I have a Django 1.8 app working with a database.
I'm trying to change the schema of a table using the built-in migration.
Here are the steps I did:
In my dev environment, I grabbed the app source and ran
python manage.py syncdb
then I ran
python manage.py loaddata ~/my_data.json
then I modified models.py: added a field and renamed a field, both on the same table 'TABLE1', which had no data.
then
python manage.py makemigrations myapp
python manage.py migrate
Error: django.db.utils.OperationalError: table "myapp_someother_table" already exists
then ran
python manage.py migrate --fake-initial
worked!
but when I browsed to the admin page for TABLE1, I get this error:
OperationalError: no such column: myapp_table1.my_new_field_id
I checked the db and yes, there is no such column.
How can I proceed from here? I'd prefer to fix this via Django.
If I fix it straight in the db, then the migration goes out of sync.
Migrations do not automagically see that you have made changes. Migrations detect changes by comparing the current model with the historical model saved in the migration files.
In this case, you didn't have any historical models, since you didn't have any migrations. Django was not able to detect any changes in your models, even though they were different from your database.
The correct way to make changes to your model is to first run manage.py makemigrations <my_app>, then make the changes to your model, followed by another manage.py makemigrations.
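Applied to this case, the sequence would look roughly like this (the app label myapp is taken from the error messages above):
python manage.py makemigrations myapp   # initial migration from the current, unmodified models
python manage.py migrate myapp
# ... now edit models.py (add the field, rename the field) ...
python manage.py makemigrations myapp   # this second migration picks up the changes
python manage.py migrate myapp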
You might not be able to do it via pure Django and keep your data. I don't have personal experience with South, but there are a lot of mentions of this tool. Just in case nothing else works for you...
Here is what I did to make things work, but there must be a better way so please add more answers/comments...
I deleted the SQLite db and the migrations folder
I made the desired changes to models.py
ran syncdb
ran loaddata to load the JSON data dump that I had saved previously
then just started the dev server
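In command form, those steps were roughly the following (the file and app names are assumed from the question):
rm db.sqlite3                       # delete the SQLite db (default filename assumed)
rm -r myapp/migrations              # delete the migrations folder
# edit models.py with the desired changes, then:
python manage.py syncdb             # Django 1.8; use makemigrations + migrate on newer versions
python manage.py loaddata ~/my_data.json
python manage.py runserver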
I had a Django project "demo_project"; I ran the syncdb command, which created the tables, and saved some data. Then I took a copy of this project, "demo_project_copy", and ran syncdb again. Now I can't log in with the credentials previously stored in the database (error: invalid credentials), and when I run syncdb on "demo_project" I get an error. Did syncdb on "demo_project_copy" overwrite the data of "demo_project"? What happened?
syncdb doesn't overwrite the database data; it just adds the database tables needed by the applications installed in your project. So if you add a new application to INSTALLED_APPS in the settings.py file and then run syncdb, the command will add the necessary tables, nothing more.
If you are using Django 1.7.x you should use the migrate command instead, since syncdb is deprecated as of Django 1.7.
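On Django 1.7+ the equivalent is simply:
python manage.py migrate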
Reference for the syncdb command (the Django documentation is very good):
https://docs.djangoproject.com/en/1.6/ref/django-admin/#syncdb