Loading from pickled data causes database error with new saves - django

In order to save time moving data I pickled some models and dumped them to file. I then reloaded them into another database using the same exact model. The save worked fine and the objects kept their old id which is what I wanted. However, when saving new objects I run into nextval errors.
Not being very adept with postgres, I'm not sure how to fix this so I can keep old records with their existing ID while being able to continue adding new data.
Thanks,
Thomas

There is actually a Django management command that prints out the sequence-reset SQL, called sqlsequencereset.
$ python manage.py sqlsequencereset issues
BEGIN;
SELECT setval('"issues_project_id_seq"', coalesce(max("id"), 1), max("id") IS NOT null) FROM "issues_project";
COMMIT;

I think you are talking about the sequence used to autoincrement your id fields.
The easiest solution here would be, in a "psql" shell:
select max(id)+1 from YOURAPP_YOURMODEL;
and then use that value in this command:
alter sequence YOURAPP_YOURMODEL_id_seq restart with MAX_ID_FROM_PREV_STATEMENT;
That should do the trick.
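If you restored many tables, the statement Django emits can also be generated in a loop; a minimal sketch (the helper name reset_sequence_sql is made up, and it assumes Django's default <table>_<column>_seq sequence naming):

```python
# Sketch: build the same setval() statement that sqlsequencereset emits,
# assuming the default Django sequence name of <table>_<column>_seq.
def reset_sequence_sql(table, pk_column="id"):
    seq = f'"{table}_{pk_column}_seq"'
    col = f'"{pk_column}"'
    return f"SELECT setval('{seq}', coalesce(max({col}), 1), max({col}) IS NOT null) FROM \"{table}\";"

print(reset_sequence_sql("issues_project"))
```

Feed the result to psql (or cursor.execute()) once per table you restored from pickle.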

Related

How can I fix a Django DB migration error without deleting migration files

What is the best way to fix this kind of database error without having to delete my DB and migration files and start entering data from scratch?
django.db.utils.IntegrityError: The row in table 'store_product_category' with primary key '1' has an invalid foreign key: store_product_category.category_id contains a value '1' that does not have a corresponding value in store_category.id.
While inspecting the SQLite DB I observed that there is a mismatch between the IDs in store_product_category.category_id and store_category.id.
Is there any way I can modify the ID directly in the DB? I don't want to start deleting database files and migrations.
If I've understood right:
The model StoreProductCategory has a FK - category, linking to a model StoreCategory.
You have a SPC record with category == 1 but no record in StoreCategory with this ID?
If so, the fix is reasonably simple.
1. Enter the DB shell using python manage.py dbshell and run an SQL INSERT command to add the appropriate record.
2. Change your model for StoreProductCategory and set on_delete for that FK. I would suggest maybe PROTECT might be appropriate here, but it's up to you - just make sure it's something that will keep things consistent.
If (2) is already done, I do question how this happened in the first place - that would kind of indicate somebody has messed directly with the DB. You may want to investigate who has access and what gets done there.
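The INSERT step can be sketched against an in-memory SQLite database (the table layout is reconstructed from the error message; the placeholder category name is made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")
conn.execute("CREATE TABLE store_category (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute(
    "CREATE TABLE store_product_category ("
    "id INTEGER PRIMARY KEY, "
    "category_id INTEGER REFERENCES store_category(id))"
)

# Insert the missing parent row first, so the child row's FK has a target.
conn.execute("INSERT INTO store_category (id, name) VALUES (1, 'placeholder')")
conn.execute("INSERT INTO store_product_category (id, category_id) VALUES (1, 1)")
conn.commit()
```

In the real database only the first INSERT is needed, since the orphaned store_product_category row already exists.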

How to delete all data for one and only one app in Django

I have a set up (Django 1.11) with several apps including OOK, EEK, and others irrelevant ones. I want to delete all the data for OOK while leaving EEK (and the rest) untouched. Ideally, I want all the primary keys to be reset as well so the first new OOK model will get 1 and so on…
Is this possible?
All I can find is reset and sqlclear, which are both deprecated. flush removes all data from the database and is thus not what I want.
I do realise that this is an odd thing to do, but this is the hand I've been given…
I think you can achieve this behaviour by dropping all the tables of that <app> and then migrating only that <app>. That way you'll reset the tables of the <app>. In Django 1.7+ you can do:
$ python manage.py migrate OOK zero  # unapplies all of the app's migrations
$ python manage.py migrate OOK
https://docs.djangoproject.com/en/2.0/ref/django-admin/#django-admin-migrate
If you are allowed to replace the DB, you could export the data you need to a fixture, then do some clever text processing on the JSON in there, say by finding all ID fields and renumbering them from 1, and then reimport the result into a clean DB.
The IDs are autoincremented by PostgreSQL; according to this answer you can reset the index sequence, but I'm not even sure it can go back to 1.
But really what's the point of resetting the indexes?
This might not be possible with Django itself. However, it is doable with raw SQL (MySQL syntax shown):
SET FOREIGN_KEY_CHECKS = 0;
TRUNCATE OOK_table1;
TRUNCATE OOK_table2;
[…]
SET FOREIGN_KEY_CHECKS = 1;
⚠ Do take a backup of your database before doing that!
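On SQLite, which has no TRUNCATE, the per-table wipe-and-reset can be sketched like this (the table name ook_item is made up; the sqlite_sequence trick only applies to AUTOINCREMENT columns):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE ook_item (id INTEGER PRIMARY KEY AUTOINCREMENT, name TEXT)"
)
conn.executemany("INSERT INTO ook_item (name) VALUES (?)", [("a",), ("b",)])

# Wipe the table and reset its autoincrement counter so new rows start at 1.
conn.execute("DELETE FROM ook_item")
conn.execute("DELETE FROM sqlite_sequence WHERE name = 'ook_item'")

conn.execute("INSERT INTO ook_item (name) VALUES ('c')")
row = conn.execute("SELECT id, name FROM ook_item").fetchone()
print(row)  # -> (1, 'c')
```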

Received "ValueError: Found wrong number (0) of constraints for ..." during Django migration

While using Django 1.7 migrations, I came across a migration that worked in development, but not in production:
ValueError: Found wrong number (0) of constraints for table_name(a, b, c, d)
This is caused by an AlterUniqueTogether rule:
migrations.AlterUniqueTogether(
    name='table_name',
    unique_together=set([('a', 'b')]),
)
Reading up on bugs and such in the Django bug DB, it seems to be caused by the existing unique_together in the db not matching the migration history.
How can I work around this error and finish my migrations?
(Postgres and MySQL Answer)
If you look at your actual table (use \d table_name) and look at the indexes, you'll find an entry for your unique constraint. This is what Django is trying to find and drop. But it can't find an exact match.
For example,
"table_name_...6cf2a9c6e98cbd0d_uniq" UNIQUE CONSTRAINT, btree (d, a, b, c)
In my case, the order of the keys (d, a, b, c) did not match the constraint it was looking to drop (a, b, c, d).
I went back into my migration history and changed the original AlterUniqueTogether to match the actual order in the database.
The migration then completed successfully.
I had a similar issue come up while I was switching over a CharField to become a ForeignKey. Everything worked with that process, but I was left with Django thinking it still needed to update the unique_together in a new migration. (Even though everything looked correct from inside postgres.) Unfortunately applying this new migration would then give a similar error:
ValueError: Found wrong number (0) of constraints for program(name, funder, payee, payer, location, category)
The fix that ultimately worked for me was to comment out all the previous AlterUniqueTogether operations for that model. The manage.py migrate worked without error after that.
"unique_together in the db not matching the migration history" - Every time an index is altered on a table it checks its previous index and drops it. In your case it is not able to fetch the previous index.
Solution:
1. Either you can generate it manually, or
2. Revert to the code where the previous index is used and migrate. Then finally change to the new index in your code and run the migration. (The django_migrations entries need to be taken care of.)
Also worth checking that you only have the expected number of Unique indexes for the table in question.
For example, if your table has multiple Unique indexes, then you should delete them to make sure you have only 1 (or whatever the number of expected Unique indexes is) pre-migration index present.
To check how many Unique indexes are there for a given table in PostgreSQL:
SELECT *
FROM information_schema.table_constraints AS c
WHERE c.table_name = '<table_name>'
  AND c.constraint_type = 'UNIQUE';
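That query is PostgreSQL-specific; on SQLite the analogous check can be sketched with PRAGMA index_list (the demo table and columns are made up, taken loosely from the error message above):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE program (name TEXT, funder TEXT, UNIQUE (name, funder))")

# PRAGMA index_list reports one row per index on the table; the third
# column is a 0/1 flag marking unique indexes, including the implicit
# index SQLite creates for a UNIQUE constraint.
rows = conn.execute("PRAGMA index_list('program')").fetchall()
unique_indexes = [r for r in rows if r[2] == 1]
print(len(unique_indexes))  # -> 1
```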
Just in case someone runs into this and the previous answers haven't solved it: in my case the issue was that when I modified the unique_together constraint, the migration was attempted but the data didn't allow it (because the new unique_together constraint was more restrictive). However, the migration managed to delete the old unique_together constraint from the table, leaving it in an inconsistent state. I had to migrate back to zero and re-apply the migration without data; then it went through without problems.
In summary, make sure your data will be able to accept the new constraint before you apply the migration.
1. Find the latest migration file for the respective table, locate unique_together, and replace the current unique-constraint fields.
2. Migrate the database using ./manage.py migrate your_app_name.
3. Revert or undo the changes in the previous migration file.
In my case the problem was that the previous migration was not present in the django_migrations table. I added the missing entry and then the new migration worked.
Someone may get this issue while modifying the unique_together. Basically, the table state is not consistent with the migrations. You may need to re-add the previous constraints manually using the MySQL shell.
In case you are using migrate with Django and there is no data in the database yet, you can drop the database and then run python manage.py migrate again.

python shell not able to find sqlite table

I have created a sqlite table using sqlite browser. I have added one row too. When I connect to the table through terminal, I am able to find the table as well as row.
However, I am trying to connect to the table through Python, as I have lots of rows to insert. Although I wrote a Python program, I also tried to connect and insert the row through the shell.
import sqlite3
import os
home=os.environ['HOME']
conn=sqlite3.connect(home+'/AndroidStudioProjects/TableTopicPractice/database/dbTableTopic')
cur=conn.cursor()
cur.execute("SELECT name FROM sqlite_master WHERE type='table'")
print cur.rowcount
The last statement printed -1. When I tried to query the specific table that I created, the result was the same. I tried both queries in the SQLite browser and they work.
Please note sqlite3 is installed in my system. I used the following tutorial as guide.
http://zetcode.com/db/sqlitepythontutorial/
Where am I going wrong? What do I need to correct?
Any pointers will be highly appreciated.
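For what it's worth, cursor.rowcount in Python's sqlite3 module is -1 for SELECT statements by design: the driver cannot know the row count until the rows are actually fetched, so the rows come from fetchall() (or iteration), not from rowcount. A minimal sketch with a made-up table name:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dbTableTopic (topic TEXT)")
conn.execute("INSERT INTO dbTableTopic VALUES ('hello')")

cur = conn.cursor()
cur.execute("SELECT name FROM sqlite_master WHERE type='table'")
print(cur.rowcount)  # -> -1: sqlite3 does not count rows for SELECT
tables = cur.fetchall()
print(tables)        # -> [('dbTableTopic',)]
```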

South data migration in Django after modifying the model

I have a project with existing class model as
class Disability(models.Model):
    child = models.ForeignKey(Child, verbose_name=_("Child"))
But with the recent architecture change i have to modify it as
class Disability(models.Model):
    child = models.ManyToManyField(Child, verbose_name=_("Child"))
Now for this new change (I even have to modify the existing database for it), I guess data migration is the best way to do it rather than doing it manually.
I referred to this online doc:
http://south.readthedocs.org/en/latest/commands.html#commands-datamigration
but it says very little about data migration and more about schema migration.
Question 1: If I do the schema migration, will this make me lose all my previous data belonging to the old model?
Question 2: When I try a schema migration, it asks this:
(rats)rats#Inspiron:~/profoundis/kenyakids$ ./manage.py schemamigration web --auto
? The field 'Disability.child' does not have a default specified, yet is NOT NULL.
? Since you are removing this field, you MUST specify a default
? value to use for existing rows. Would you like to:
? 1. Quit now, and add a default to the field in models.py
? 2. Specify a one-off value to use for existing columns now
? 3. Disable the backwards migration by raising an exception.
? Please select a choice: 1
Can anyone Explain the concept and difference between schema and data migration and how this can be achieved separately .
Schema and data migrations are not different options you can take to modify your table structure. They are completely different things. Of course, data migrations are fully described in the South docs.
Here a data migration will not help you, because you need to modify your schema. And the whole point of South and other migration systems is that they allow you to do that without losing data.
South will try to do a transaction by moving your table data to a temporary table (I could be wrong there), then restructure the table and try to add the original data into the new structure. Like this:
old_table -> clone -> tmp_table
old_table ->restructure
tmp_table.data -> table
South will look at the field types. If there are big changes it will ask what to do. For example, changing a text field to an int field would be very hard to convert :)
When you remove fields you may still want to be able to convert back to the old structure, so South will need some default data to be able to create a table with the old structure.
Moving data is always an issue, since you may change table structure and field types. For example, how would you manually deal with data going from a Char(max_length=100) to a Char(max_length=50)?
Best suggestion is to keep good backups.
Also take advantage of Django's fixtures. You can save fixtures for different data structures along with South migrations.
South will load initial_data files in the same way as syncdb, but it
loads them at the end of every successful migration process
http://south.readthedocs.org/en/latest/commands.html#initial-data-and-post-syncdb