Primary key validation after Django's sqlsequencereset [duplicate] - django

I'm following up on a question I asked earlier, in which I was trying to convert a goofy, poorly written MySQL query to PostgreSQL. I believe I succeeded with that. Anyway, I'm working with data that was manually moved from a MySQL database to a PostgreSQL database, and I'm using a query that looks like this:
UPDATE krypdos_coderound cru
set is_correct = case
        when t.kv_values1 = t.kv_values2 then True
        else False
    end
from (
    select cr.id,
        array_agg(
            case when kv1.code_round_id = cr.id
                 then kv1.option_id
                 else null end
        ) as kv_values1,
        array_agg(
            case when kv2.code_round_id = cr_m.id
                 then kv2.option_id
                 else null end
        ) as kv_values2
    from krypdos_coderound cr
    join krypdos_value kv1 on kv1.code_round_id = cr.id
    join krypdos_coderound cr_m
        on cr_m.object_id = cr.object_id
        and cr_m.content_type_id = cr.content_type_id
    join krypdos_value kv2 on kv2.code_round_id = cr_m.id
    WHERE
        cr.is_master = False
        AND cr_m.is_master = True
        AND cr.object_id = %s
        AND cr.content_type_id = %s
    GROUP BY cr.id
) t
where t.id = cru.id
""" % (self.object_id, self.content_type.id)
)
I have reason to believe that this works well. However, it has led to a new issue. When trying to submit, I get an error from Django that states:
IntegrityError at (some url):
duplicate key value violates unique constraint "krypdos_value_pkey"
I've looked at several of the responses posted on here and I haven't quite found the solution to my problem (although the related questions have made for some interesting reading). I see this in my logs, which is interesting because I never explicitly call INSERT; Django must be issuing it:
STATEMENT: INSERT INTO "krypdos_value" ("code_round_id", "variable_id", "option_id", "confidence", "freetext")
VALUES (1105935, 11, 55, NULL, E'')
RETURNING "krypdos_value"."id"
However, trying to run that results in the duplicate key error. The actual error is thrown in the code below.
# Delete current coding
CodeRound.objects.filter(
    object_id=o.id, content_type=object_type, is_master=True
).delete()
code_round = CodeRound(
    object_id=o.id,
    content_type=object_type,
    coded_by=request.user,
    comments=request.POST.get('_comments', None),
    is_master=True,
)
code_round.save()
for key in request.POST.keys():
    if key[0] != '_' or key != 'csrfmiddlewaretoken':
        options = request.POST.getlist(key)
        for option in options:
            Value(
                code_round=code_round,
                variable_id=key,
                option_id=option,
                confidence=request.POST.get('_confidence_' + key, None),
            ).save()  # This is where it dies
# Resave to set is_correct
code_round.save()
o.status = '3'
o.save()
I've checked the sequences and such, and they seem to be in order. At this point I'm not sure what to do; I assume it's something on Django's end, but I'm not sure. Any feedback would be much appreciated!

This happened to me too: it turns out you need to resync your primary key sequences in Postgres. The key is this SQL statement:
SELECT setval('tablename_id_seq', (SELECT MAX(id) FROM tablename)+1);
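For the krypdos_value table from the error above, assuming Django's default sequence naming (<table>_id_seq), that would be:
SELECT setval('krypdos_value_id_seq', (SELECT MAX(id) FROM krypdos_value)+1);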

It appears to be a known difference in behaviour between backends: MySQL and SQLite update the next available primary key even when inserting an object with an explicit id, while other backends such as PostgreSQL and Oracle do not.
There is a ticket describing the same issue. Even though it was closed as invalid, it provides a hint that there is a Django management command to update the next available key.
To display the SQL updating all next ids for the application MyApp:
python manage.py sqlsequencereset MyApp
In order to have the statement executed, you can provide it as the input for the dbshell management command. For bash, you could type:
python manage.py sqlsequencereset MyApp | python manage.py dbshell
The advantage of the management commands is that they abstract away the underlying DB backend, so they will keep working even if you later migrate to a different backend.

I had an existing table in my "inventory" app and I wanted to add new records in Django admin and I got this error:
Duplicate key value violates unique constraint "inventory_part_pkey"
DETAIL: Key (part_id)=(1) already exists.
As mentioned before, I ran the command below to get the SQL statements that reset the ids:
python manage.py sqlsequencereset inventory
Piping it into dbshell (python manage.py sqlsequencereset inventory | python manage.py dbshell) was not working for me, so I copied the generated raw SQL instead.
Then I opened my db in pgAdmin3 (https://www.pgadmin.org) for PostgreSQL, clicked the "Execute arbitrary SQL queries" icon, and pasted in the generated statements.
In my case the raw SQL command was:
BEGIN;
SELECT setval(pg_get_serial_sequence('"inventory_signup"','id'), coalesce(max("id"), 1), max("id") IS NOT null) FROM "inventory_signup";
SELECT setval(pg_get_serial_sequence('"inventory_supplier"','id'), coalesce(max("id"), 1), max("id") IS NOT null) FROM "inventory_supplier";
COMMIT;
Executed it with F5.
This fixed everything.

In addition to zapphod's answer:
In my case the index was indeed incorrect, since while developing I had deleted all migrations and rebuilt the database probably 10-15 times, as I wasn't at the stage of migrating anything yet.
I was getting an IntegrityError on finished_product_template_finishedproduct_pkey.
The fix was to reindex the table and restart runserver: I was using pgAdmin3, and for whichever index was incorrect and throwing duplicate key errors, I navigated to its constraints and reindexed it.
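The SQL equivalent of that pgAdmin action, using the index name from the error above, would be something along these lines:
REINDEX INDEX finished_product_template_finishedproduct_pkey;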

The solution is that you need to resync your primary key sequences, as reported by "Hacking Life", who wrote an example SQL statement; but, as suggested by "Ad N", it is better to run the Django command sqlsequencereset to get the exact SQL, which you can then copy and paste or run with another command.
As a further improvement on these answers, I would suggest not copying and pasting the SQL but, more safely, executing the SQL generated by sqlsequencereset from within your Python code, like this (using the default database):
from django.core.management.color import no_style
from django.db import connection

from myapps.models import MyModel1, MyModel2

sequence_sql = connection.ops.sequence_reset_sql(no_style(), [MyModel1, MyModel2])
with connection.cursor() as cursor:
    for sql in sequence_sql:
        cursor.execute(sql)
I tested this code with Python3.6, Django 2.0 and PostgreSQL 10.
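If you prefer to keep this out of view code, the same calls can live in a data migration. A minimal sketch, with the app label ('myapp') and model names used here as placeholders:
from django.core.management.color import no_style
from django.db import migrations


def reset_sequences(apps, schema_editor):
    # Historical versions of the models, looked up via the (placeholder) app label
    models = [apps.get_model('myapp', 'MyModel1'), apps.get_model('myapp', 'MyModel2')]
    connection = schema_editor.connection
    with connection.cursor() as cursor:
        for sql in connection.ops.sequence_reset_sql(no_style(), models):
            cursor.execute(sql)


class Migration(migrations.Migration):
    dependencies = [('myapp', '0001_initial')]  # adjust to your latest migration
    operations = [migrations.RunPython(reset_sequences, migrations.RunPython.noop)]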

If you want to reset the PK on all of your tables, like me, you can use the PostgreSQL recommended way:
SELECT 'SELECT SETVAL(' ||
quote_literal(quote_ident(PGT.schemaname) || '.' || quote_ident(S.relname)) ||
', COALESCE(MAX(' ||quote_ident(C.attname)|| '), 1) ) FROM ' ||
quote_ident(PGT.schemaname)|| '.'||quote_ident(T.relname)|| ';'
FROM pg_class AS S,
pg_depend AS D,
pg_class AS T,
pg_attribute AS C,
pg_tables AS PGT
WHERE S.relkind = 'S'
AND S.oid = D.objid
AND D.refobjid = T.oid
AND D.refobjid = C.attrelid
AND D.refobjsubid = C.attnum
AND T.relname = PGT.tablename
ORDER BY S.relname;
After running this query, you will need to execute its results. I typically copy and paste them into Notepad, then find and replace the leading "SELECT with SELECT and the trailing ;" with ; (stripping the quotes that come with the copied output). I paste the result into pgAdmin III and run it. This resets the sequences for all of the tables in the database. More "professional" instructions are provided at the link above.
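If you are on psql 9.6 or newer, you can skip the copy/paste step: type the generator query above but end it with the \gexec meta-command instead of a semicolon, and psql executes every statement the query returns.
-- paste the generator query here, without the final semicolon, then:
\gexec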

If you have manually copied the databases, you may be running into the issue described here.

I encountered this error because I was passing extra arguments to the save method in the wrong way.
For anybody who encounters this, try forcing UPDATE with:
instance_name.save(..., force_update=True)
If you get an error that you cannot pass force_insert and force_update at the same time, you're probably passing some custom arguments the wrong way, like I did.
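A concrete illustration of that mistake, based on the signature of Model.save() (force_insert comes first):
# Model.save(force_insert=False, force_update=False, using=None, update_fields=None)
instance_name.save(True)                # positional True becomes force_insert: an INSERT with an existing pk, hence the duplicate key error
instance_name.save(force_update=True)   # what was intended: force an UPDATE instead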

This question was asked about 9 years ago, and lots of people gave their own ways to solve it.
For me, I had put unique=True on my custom email model field, but creating a superuser didn't require the email to be filled in.
So after creating the superuser, its email field was simply saved as blank or NULL. This is how I then created and saved a new user:
obj = mymodel.objects.create_user(username='abc', password='abc')
obj.email = 'abc@abc.com'
obj.save()
It threw the duplicate-key-value-violates error on the first line, because the email defaulted to empty, which was the same as the admin user's. Django spotted a duplicate!
Solution
Option1: Make email mandatory while creating any user (for superuser as well)
Option2: Remove unique=True and run migrations
Option3: If you don't know where the duplicates are (one way to find them is sketched after this list), you can either drop the column or clear the database using python manage.py flush
It is highly recommended to know the reason why the error occurred in your case.
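A quick way to locate the duplicates before choosing an option (a sketch, assuming the custom model is your AUTH_USER_MODEL):
from django.contrib.auth import get_user_model
from django.db.models import Count

User = get_user_model()
duplicates = (User.objects.values('email')
                          .annotate(n=Count('id'))
                          .filter(n__gt=1))
print(list(duplicates))  # e.g. [{'email': '', 'n': 2}]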

I was getting the same error as the OP.
I had created some Django models, created a Postgres table based on the models, and added some rows to the Postgres table via Django Admin. Then I fiddled with some of the columns in the models (changing around ForeignKeys, etc.) but had forgotten to migrate the changes.
Running the migration commands solved my problem, which makes sense given the SQL answers above.
To see what changes would be applied, without actually applying them:
python manage.py makemigrations --dry-run --verbosity 3
If you're happy with those changes, then run:
python manage.py makemigrations
Then run:
python manage.py migrate

I was getting a similar issue and nothing seemed to work. If you need the data (i.e. you can't exclude it when doing the dump), make sure you have turned off (commented out) any post_save receivers before loading it. I think the data gets imported, but the receivers then create the same objects again, which is what causes the duplicates. Worked for me.
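Instead of commenting the receivers out, they can also be made to skip fixture loads: loaddata passes raw=True to post_save, so a receiver can bail out early. A sketch with a hypothetical model name:
from django.db.models.signals import post_save
from django.dispatch import receiver

from myapp.models import MyModel  # hypothetical


@receiver(post_save, sender=MyModel)
def create_related_objects(sender, instance, created, raw=False, **kwargs):
    if raw:
        # object is being loaded from a fixture (loaddata); skip side effects
        return
    if created:
        pass  # normal post-save side effects go here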

You just have to go to pgAdmin III and execute this script there, with the name of your table:
SELECT setval('tablename_id_seq', (SELECT MAX(id) FROM tablename)+1);

Based on Paolo Melchiorre's answer, I wrote a small function to be called before any .save():
from django.db import connection

def setSqlCursor(db_table):
    sql = """SELECT pg_catalog.setval(pg_get_serial_sequence('""" + db_table + """', 'id'), MAX(id)) FROM """ + db_table + """;"""
    with connection.cursor() as cursor:
        cursor.execute(sql)
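For example, for the Value model from the question you would call it as shown below (note the function assumes a PostgreSQL backend and a primary key column named id):
setSqlCursor(Value._meta.db_table)  # for the question's model this resolves to 'krypdos_value'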

This is the right statement. It mostly happens when rows have been inserted with an explicit id value:
SELECT setval('tablename_id_seq', (SELECT MAX(id) FROM tablename));

Related

unique_together does not replace primary key

In my Django app, I want to insert a record with a composite primary key. Apparently this should be possible by making use of "unique_together". I'm quite sure this code was working in the past, but for some reason it does not seem to be working now. This code used to run on a Linux VM, and now I'm hosting it in Google App Engine. However I don't see how this can be the cause for this error.
class TermsAndConditionsDocument(models.Model):
    organization = models.ForeignKey(Organization, on_delete=models.CASCADE, verbose_name=_("Organization"))
    language = models.CharField(_('Language'), choices=LANGUAGE_CHOICES, max_length=5, help_text=_("The language of the content."))
    content = models.TextField()

    class Meta:
        unique_together = ('organization', 'language')
The error:
IntegrityError at /transactions/settings/terms_and_conditions
null value in column "id" violates not-null constraint
DETAIL: Failing row contains (null, nl-BE, <p>B</p>, 10).
According to what I've read, using "unique_together" should cause Django to not need or include an ID as primary key. I checked the database, and the ID field DOES exist. I do not understand where the database constraint and the ID field are still coming from?
Apparently, as pointed out in the comments, a primary key "id" field is always added, even if you don't need it. It's supposed to get out of your way, so you don't even notice its existence. In my case, it required me to give it a value when I created a new record, which is not how things are supposed to work.
A while back I migrated this database from one Postgres database to another Postgres database. I used an SQL dump and load method for this. Some sequences seem to have been lost during that migration.
Because there are no sequences, some fields now lacked autoincrement capabilities, explaining the IntegrityError on insertion.
In order to fix this, I did the following:
1) Export the current data:
manage.py dumpdata > data.json
2) Drop your database and create a new empty one.
3) Run database migrations:
manage.py migrate
4) Load the data again, excluding some default data already recreated by Django.
manage.py loaddata --exclude auth.permission --exclude contenttypes data.json
This procedure seems to have recreated the sequences while also keeping the data.
unique_together only creates a DB constraint (https://docs.djangoproject.com/en/2.2/ref/models/options/#unique-together); it does not replace the primary key.
You could create a custom primary key with the primary_key field option (https://docs.djangoproject.com/en/2.2/ref/models/fields/#django.db.models.Field.primary_key), but you can only do that for one field.
I suggest just keeping the auto-increment id field; that works better with Django.
As for the error: are you saving through a model, or doing a raw import?
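To illustrate the primary_key option mentioned above, a minimal made-up example (not the questioner's model):
from django.db import models

class Country(models.Model):
    # the ISO code becomes the primary key, so no automatic id column is added
    iso_code = models.CharField(max_length=2, primary_key=True)
    name = models.CharField(max_length=100)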

Django: How to insert sql statements into sqlite3 database and configure with models

I'm using Django for a project. I have a .sql file containing insert statements. How can I get them into my sqlite3 database and have them work with my models?
The project "fortune" has a models.py file that looks like the following:
class fortune(models.Model):
    id = models.IntegerField(primary_key=True)
    category = models.CharField(max_length=50)
    length = models.IntegerField()
    aphorism = models.CharField(max_length=5000)
I have a .sql file with a list of Insert statements like the follwing:
INSERT INTO "fortune_fortune" VALUES(1,'fortunes',127,'Arbitrary Text');
When I run .schema on my db.sqlite3 file which is configured with my project I see:
CREATE TABLE fortune_fortune(id integer, category varchar(50), length integer, aphorism varchar(5000));
I've tried using .read in my sqlite shell with no luck. I've tried typing "sqlite3 file.sqlite3 < file.sql" in bash as well. There is something I'm missing here, but I can't seem to ID the problem.
Thanks for your help.
OK, wait: normally you don't use raw SQL statements to insert data into the db when you work with Django.
To insert data, you work with the Django ORM, which is much more pleasant than these raw SQL statements:
f = fortune(category='new cat', length=19, aphorism='i love life')
f.save()
As a result you will have one new row in the fortune table of your db. Just read the Django docs and you'll be happy!
And one more thing: class names are conventionally capitalized (Fortune, not fortune).
As for your issue:
Django provides a hook for passing the database arbitrary SQL that's executed just after the CREATE TABLE statements when you run migrate. You can use this hook to populate default records, or you could also create SQL functions, views, triggers, etc.
The hook is simple: Django just looks for a file called sql/<modelname>.sql in your app directory, where <modelname> is the model's name in lowercase.
More in the docs.
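On current Django versions (1.7+), the usual replacement for that hook is a data migration with RunSQL. A sketch, assuming the app label is fortune and an initial migration already exists:
# fortune/migrations/0002_load_fortunes.py  (hypothetical file name)
from django.db import migrations


class Migration(migrations.Migration):
    dependencies = [('fortune', '0001_initial')]
    operations = [
        migrations.RunSQL(
            "INSERT INTO fortune_fortune VALUES (1, 'fortunes', 127, 'Arbitrary Text');",
            reverse_sql="DELETE FROM fortune_fortune WHERE id = 1;",
        ),
    ]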

django added a new class in a model

This has never happened to me before, as I never had to create another class in the models file after already running syncdb. I am now reviewing one of my past projects and I need to add another class to the models.py file. I have only a very basic, procedural understanding of South.
When I do this:
./manage.py sql app_name
it shows the new table, but when I run the server it throws an OperationalError: 'no such table found'. Am I missing something here? Is there a way to fix it?
According to this,
./manage.py sql app_name
just prints the SQL statements for creating the tables; it does not run them.
You can write them to a file:
./manage.py sql app_name > command.sql
and feed that file to the database. For example, if you use PostgreSQL you can use:
psql -U user db_name < command.sql
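If you do want to go through South (which the question mentions), the usual flow for adding a new model is roughly:
./manage.py schemamigration app_name --auto   # generate a migration for the new model
./manage.py migrate app_name                  # apply it, creating the missing table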

Django queryset filter on boolean not working

I have a model:
class mymodel(models.Model):
    order_closed = models.BooleanField(default=False)
I added this field to my development SQLite db manually, since it's a new field for a model/table that already existed. I then tried:
mymodel.objects.filter(order_closed=False)  # and with True
and it's producing incorrect or unpredictable results. I saw some mention that it could be an SQLite thing, but I'm not sure. The templates seem to understand whether it's a true or false value, but Python code doesn't. To clarify with some examples:
{{ mymodel.order_closed }} will print 0 after I set the default to 0 in SQLite, but using .filter(order_closed=value) will still return every record.
I think you made a mistake when writing the SQL by hand. If the db holds important data, use South (http://south.aeracode.org/); once you have it, you can easily upgrade/edit your database schema.
If you don't want to install new 'plugins', try this:
1. Delete the field from the DB manually.
2. Run: python manage.py sql 'name of app'
It will return the CREATE TABLE SQL statements for the app. You can then update your database manually, using the relevant column definition from that CREATE statement.
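One quick way to check whether the hand-added column is the problem is to look at what SQLite actually stored; Django's BooleanField expects integer 0/1 there. A sketch, with the table name assumed to be appname_mymodel:
-- see which values and storage types are really in the column
SELECT DISTINCT order_closed, typeof(order_closed) FROM appname_mymodel;
-- anything other than integer 0/1 here explains the filter behaviour and should be normalised, e.g.:
UPDATE appname_mymodel SET order_closed = 0 WHERE order_closed = 'False';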
