Django's ORM uses an integer datatype for the automatically created ID column, but I need it to be a bigint (using the Postgres backend). Is there a way of doing this?
Django 1.10
Use the newly added BigAutoField
A 64-bit integer, much like an AutoField except that it is guaranteed
to fit numbers from 1 to 9223372036854775807.
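A minimal sketch of how that looks in a model (the model and field names here are mine, not from the question):

class MyModel(models.Model):
    # 64-bit auto-incrementing primary key (Django 1.10+)
    id = models.BigAutoField(primary_key=True)
    name = models.CharField(max_length=100)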
Older versions of Django
You need to create your model like this:

class MyModel(models.Model):
    id = models.BigIntegerField(primary_key=True)
    name = models.CharField(max_length=100)
Then, after ./manage.py makemigrations has been run, open the generated migration and add the following to the operations list:
migrations.RunSQL("CREATE SEQUENCE myapp_seq"),
migrations.RunSQL("ALTER TABLE myapp_mymodel ALTER COLUMN id SET DEFAULT NEXTVAL('myapp_seq')");
Update
A valid point was raised by Daniel Roseman in the comments. In PostgreSQL the following query works:
INSERT INTO myapp_mymodel(name) values('some name');
but the following doesn't, because the primary key column is NOT NULL:
INSERT INTO myapp_mymodel(id, name) values(null,'some name');
Unfortunately, it's the second form of the query that Django sends. This can still be solved with a bit of work:
from django.db import connection

def save(self, *args, **kwargs):
    if not self.id:
        # fetch the next value from the sequence ourselves, since Django
        # would otherwise send an explicit NULL for the primary key
        cursor = connection.cursor()
        cursor.execute("SELECT NEXTVAL('myapp_seq')")
        self.id = cursor.fetchone()[0]
    super(MyModel, self).save(*args, **kwargs)
Related
I have data from a raw SQL query with records that look like this:
"product_id": 12345
"date": 2022-12-25
"qty": 10
This is to go into a table backed by the model ProductMovement.
Its unique key is product_id and date.
ProductMovement has product_id as a foreign key to the Product model.
ProductMovement.objects.update_or_create() requires me to provide an instance of the Product model, not merely the unique key. This is inconvenient.
I guess I can use raw sql (backend is postgresql) but I wonder if there is another way. Can I add something to the model manager that intercepts the key value and replaces it with the instance, so at least I can hide this complexity from this part of the code? (I have never worked with model manager overrides).
Try this first
If your call looks like ProductMovement.objects.update_or_create(product=productObj) (no _id), try running it as ProductMovement.objects.update_or_create(product_id=12345) (with _id).
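The _id form skips the instance lookup entirely, because Django stores the raw column value on that attribute. A sketch of the full call, assuming the date and qty fields from the question's data:

movement, created = ProductMovement.objects.update_or_create(
    product_id=12345,        # raw FK value, no Product instance needed
    date='2022-12-25',
    defaults={'qty': 10},    # fields to update when the row already exists
)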
Manager
Yes! You can override the update_or_create method. I've done it for filter/get calls to fetch M2M relations more easily:
class ProductMovementManager(models.Manager):
    def update_or_create(self, *args, **kwargs):
        product = kwargs.pop('product')
        if isinstance(product, int):
            # a raw primary key was passed in; swap it for the instance
            kwargs['product'] = Product.objects.filter(pk=product).first()
        else:
            # oops! It was already the object
            kwargs['product'] = product
        return super().update_or_create(*args, **kwargs)

class ProductMovement(models.Model):
    product = models.ForeignKey(Product, on_delete=models.PROTECT)

    objects = ProductMovementManager()
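With the manager in place, both call styles work (again assuming the question's date and qty fields):

ProductMovement.objects.update_or_create(product=12345, date='2022-12-25', defaults={'qty': 10})
ProductMovement.objects.update_or_create(product=product_obj, date='2022-12-25', defaults={'qty': 10})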
I'm trying to avoid having duplicate localized items stored in a Django REST Framework app that uses the django-localized-fields package with a PostgreSQL database, but I can't find any way to make this work.
(https://pypi.org/project/django-localized-fields/)
I've tried writing custom duplicate-detection logic in the serializer, which works for create, but on update the localized fields become null (they are required fields, so I receive a not-null constraint error). It seems to be the django-localized-fields machinery that is causing this problem.
The serializer runs correctly for create/update as long as I don't override create/update in the serializer by defining them separately.
I've also tried adding unique options to the model, which does not work: duplicates are still created. That goes for both the standard unique methods and the method in the django-localized-fields documentation (uniqueness=['en', 'ro']).
I've also tried the UniqueTogetherValidator in Django, which also doesn't seem to support HStore/localizedfields.
I'd appreciate some help in tracking down either how to fix the update in the serializer or how to place a unique constraint in the database. Since django-localized-fields uses hstore in PostgreSQL, this must be a common enough problem for applications that use hstore to maintain uniqueness.
For those who aren't familiar, Hstore stores items as key/value pairs within a database. Here's an example of how django-localized-fields stores language data within the database:
"en"=>"english word!", "es"=>"", "fr"=>"", "frqc"=>"", "fr-ca"=>""
django-localized-fields constrains unique values only within the same language. If you want to guarantee that values in one row don't collide with values in another row, you have to validate them at both the Django and the database level.
Validation in Django
In Django you can create a custom function validate_hstore_uniqueness, which is called every time the model is validated.
from django.core.exceptions import ValidationError
from django.utils.translation import gettext_lazy as _

def validate_hstore_uniqueness(obj, field_name):
    value_dict = getattr(obj, field_name)
    cls = obj.__class__
    values = list(value_dict.values())

    # find all existing objects whose values overlap with ours
    duplicate_objs = cls.objects.filter(**{field_name + '__values__overlap': values})
    if obj.pk:
        duplicate_objs = duplicate_objs.exclude(pk=obj.pk)

    if len(duplicate_objs):
        # extract the colliding values
        existing_values = []
        for obj2 in duplicate_objs:
            existing_values.extend(getattr(obj2, field_name).values())
        duplicate_values = list(set(values) & set(existing_values))

        # raise an error for the field
        raise ValidationError({
            field_name: ValidationError(
                _('Values %(values)s already exist.'),
                code='unique',
                params={'values': duplicate_values},
            ),
        })
class Test(models.Model):
    slug = LocalizedField(blank=True, null=True, required=False)

    def validate_unique(self, exclude=None):
        super().validate_unique(exclude)
        validate_hstore_uniqueness(self, 'slug')
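Model.full_clean() calls validate_unique(), so the check runs anywhere full_clean() does (ModelForm validation, or explicitly before saving):

t = Test(slug={'en': 'foo', 'ro': 'bar'})
t.full_clean()  # raises ValidationError if any value already exists in another row
t.save()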
Constraint in DB
For a constraint at the DB level you have to use a constraint trigger.
from django.db import connection

def slug_uniqueness_constraint(apps, schema_editor):
    print('Recreating trigger quotes.slug_uniqueness_constraint')

    # define the trigger
    trigger_sql = """
        -- slug_hstore_unique
        CREATE OR REPLACE FUNCTION slug_uniqueness_constraint() RETURNS TRIGGER
        AS $$
        DECLARE
            duplicate_count INT;
        BEGIN
            EXECUTE format('SELECT count(*) FROM %I.%I ' ||
                           'WHERE id != $1 AND avals("slug") && avals($2)',
                           TG_TABLE_SCHEMA, TG_TABLE_NAME)
                INTO duplicate_count
                USING NEW.id, NEW.slug;
            IF duplicate_count > 0 THEN
                RAISE EXCEPTION 'Duplicate slug value %', avals(NEW.slug);
            END IF;
            RETURN NEW;
        END;
        $$ LANGUAGE plpgsql;

        DROP TRIGGER IF EXISTS slug_uniqueness_constraint ON quotes_author;
        CREATE CONSTRAINT TRIGGER slug_uniqueness_constraint
            AFTER INSERT OR UPDATE OF slug ON quotes_author
            FOR EACH ROW EXECUTE PROCEDURE slug_uniqueness_constraint();
    """

    cursor = connection.cursor()
    cursor.execute(trigger_sql)
And enable it in migrations:
class Migration(migrations.Migration):

    dependencies = [
        ('quotes', '0031_auto_20200109_1432'),
    ]

    operations = [
        migrations.RunPython(slug_uniqueness_constraint),
    ]
It's probably a good idea to also create a GIN index in the database to speed up the lookups:
CREATE INDEX ON test_table using GIN (avals("slug"));
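If you prefer to keep that in migrations too, a sketch using RunSQL (the index name here is made up):

migrations.RunSQL(
    'CREATE INDEX test_table_slug_avals_idx ON test_table USING GIN (avals("slug"));',
    reverse_sql='DROP INDEX IF EXISTS test_table_slug_avals_idx;',
)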
I'm trying to build a custom model manager, but have run into an error. The code looks like this:
class LookupManager(models.Manager):
    def get_options(self, *args, **kwargs):
        return [(t.key, t.value)
                for t in Lookup.objects.filter(group=args[0].upper())]

class Lookup(models.Model):
    group = models.CharField(max_length=1)
    key = models.CharField(max_length=1)
    value = models.CharField(max_length=128)

    objects = LookupManager()
(I have played around with get_options quite a lot using super() and other ways to filter the results)
When I run syncdb, I get the following error (ops_lookup being the corresponding table):
django.db.utils.DatabaseError: no such table: ops_lookup
I noticed that if I change the manager to return [] instead of a filter, then syncdb works. Also, if I've already run syncdb and all the tables exist, and then change the code to the above, it works as well.
How can I get Django to not expect this table to exist when running syncdb for the first time?
Update
After looking through the traceback I realised what was happening. The lookup table is meant to contain values which populate the choices of some columns in other tables. I think what happens is that the manager gets called when the other tables are created which, it seems, happens before the lookup table is created.
Is there any way to force Django to create the lookup table first (short of renaming it)?
What's happening is that you're trying to access the database during module load time. For example:
class MyModel(models.Model):
    name = models.CharField(max_length=255)

class OtherModel(models.Model):
    some_field = models.CharField(
        max_length=255,
        # Next line fails on syncdb because the database table hasn't been created yet,
        # but the model is being queried at module load time (during class definition)
        choices=[(o.pk, o.name) for o in MyModel.objects.all()]
    )
This is equivalent to what you're doing because, as you've stated, you're using the manager method (transitively) to generate choices for other models.
Replacing the list comprehension with a generator expression will return an iterable, but will not evaluate the filtered queryset until the first iteration. So, this would fix the above example:
choices=((o.pk, o.name) for o in MyModel.objects.all())
Using your example, it would be:
class LookupManager(models.Manager):
    def get_options(self, *args, **kwargs):
        return ((t.key, t.value) for t in Lookup.objects.filter(group=args[0].upper()))
(Note the use of ( and ) instead of [ and ] for the outer brackets; that is the syntax for creating a generator expression.)
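A quick way to see the deferred evaluation (a minimal sketch):

options = Lookup.objects.get_options('a')  # builds a lazy queryset; no query yet, safe at module load
first = next(iter(options))                # the database is only hit here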
Thanks for taking the time to read my question.
I have a django app with the following model:
class UserProfile(models.Model):
    user = models.OneToOneField(User)
    ...

class Visit(models.Model):
    profile = models.ForeignKey(UserProfile)
    date = models.DateField(auto_now_add=True, db_index=True)
    ip = models.IPAddressField()

    class Meta:
        unique_together = ('profile', 'date', 'ip')
In a view:
profile = get_object_or_404(Profile, pk=...)
visit, created = Visit.objects.get_or_create(profile=profile, date=now.date(), ip=request.META['REMOTE_ADDR'])
if created: DO SOMETHING
Everything works fine, except that the Postgres logs are full of duplicate key errors:
2012-02-15 14:13:44 CET ERROR: duplicate key value violates unique constraint "table_visit_profile_id_key"
2012-02-15 14:13:44 CET STATEMENT: INSERT INTO "table_visit" ("profile_id", "date", "ip") VALUES (1111, E'2012-02-15', E'xx.xx.xxx.xxx') RETURNING "table_visit"."id"
I tried different solutions, e.g.:
from datetime import datetime

from django.db import transaction
from django.db import IntegrityError

@transaction.commit_on_success
def my_get_or_create(prof, ip):
    try:
        visit = Visit.objects.create(profile=prof, date=datetime.now().date(), ip=ip)
    except IntegrityError:
        transaction.commit()
        visit = Visit.objects.get(profile=prof, date=datetime.now().date(), ip=ip)
    return visit

....
created = my_get_or_create(prof, request.META['REMOTE_ADDR'])
if created: DO SOMETHING
This only seems to help for MySQL. Does anyone know how to avoid the duplicate key value errors on Postgres?
Another possible reason for these errors in get_or_create() is a data type mismatch in one of the search fields, for example passing False instead of None for a nullable field. The .get() inside .get_or_create() will not find it, so Django will continue with creating a new row, which then fails due to PostgreSQL constraints.
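A sketch of that failure mode, using a hypothetical Thing model with a unique name and a nullable boolean archived:

# The existing row is Thing(name='x', archived=None).
Thing.objects.get_or_create(name='x', archived=False)  # .get() misses NULL; the INSERT then violates the unique name
Thing.objects.get_or_create(name='x', archived=None)   # matches the existing row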
I had issues with get_or_create when using Postgres. In the end I abandoned the boilerplate code for the traditional:
try:
    jobInvite = Invite.objects.get(sender=employer.user, job=job)
except Invite.DoesNotExist:
    jobInvite = Invite(sender=employer.user, job=job)
    jobInvite.save()
# end try
Have you at some point had unique=True set on Visit's profile field?
It looks like there's a unique constraint generated for Postgres that's still in effect. "table_visit_profile_id_key" is what its auto-generated name would be, and naturally it would cause those errors if you're recording multiple visits for a user.
If this is the case, are you using South to manage your database changes? If you aren't, grab it!
PostgreSQL behaves somewhat differently on some subtle queries, which results in IntegrityError errors, especially after you switch to Django 1.6. Here's the solution: you need to add the select_on_save option to each failing model:
class MyModel(models.Model):
    ...

    class Meta:
        select_on_save = True
It's documented here: Options.select_on_save
I'm trying to write an internal API in my application without necessarily coupling it with the database.
class Product(models.Model):
    name = models.CharField(max_length=4000)
    price = models.IntegerField(default=-1)
    currency = models.CharField(max_length=3, default='INR')

class Image(models.Model):
    # NOTE -- Have changed the table name to products_images
    width = models.IntegerField(default=-1)
    height = models.IntegerField(default=-1)
    url = models.URLField(max_length=1000, verify_exists=False)
    product = models.ForeignKey(Product)
def create_product():
    p = Product()
    i = Image(height=100, width=100, url='http://something/something')
    p.image_set.add(i)
    return p
Now, when I call create_product() Django throws up an error:
IntegrityError: products_images.product_id may not be NULL
However, if I call p.save() and i.save() before calling p.image_set.add(i), it works. Is there any way that I can add objects to a related object set without saving both to the DB first?
def create_product():
    product_obj = Product.objects.create(name='Foobar')
    image_obj = Image.objects.create(height=100, width=100, url='http://something/something', product=product_obj)
    return product_obj
Explanation:
The Product object has to be created first and then assigned to the Image object, because id and name are required fields.
I'm wondering why you wouldn't want to make a product entry in the DB in the first case. If there's a specific reason, I may be able to suggest a workaround.
EDIT: Okay! I think I get you: you don't want to assign a product to an image object initially. How about making the product field nullable?
product = models.ForeignKey(Product, null=True)
Now, your function becomes something like this:
def create_product():
    image_obj = Image.objects.create(height=100, width=100, url='http://something/something')
    return image_obj
Hope that helps!
I ran into the same issue as @Saurabh Nanda.
I'm using Django 1.4.2. Reading the Django source, I see:
# file django/db/models/fields/related.py
def get_query_set(self):
    try:
        return self.instance._prefetched_objects_cache[rel_field.related_query_name()]
    except (AttributeError, KeyError):
        db = self._db or router.db_for_read(self.model, instance=self.instance)
        return super(RelatedManager, self).get_query_set().using(db).filter(**self.core_filters)
# file django/db/models/query.py
qs = getattr(obj, attname).all()
qs._result_cache = vals
# We don't want the individual qs doing prefetch_related now, since we
# have merged this into the current work.
qs._prefetch_done = True
obj._prefetched_objects_cache[cache_name] = qs
That makes sense: we only need to set the _prefetched_objects_cache property on the object.
p = Product()
image_cached = []
for i in xrange(100):
    image = Image(height=100, width=100, url='http://something/something')
    image_cached.append(image)

qs = p.images.all()
qs._result_cache = image_cached
qs._prefetch_done = True
p._prefetched_objects_cache = {'images': qs}
Your problem is that the id isn't set by Django, but by the database (it's represented in the database by an auto-incremented field), so until the object is saved there's no id. More about this in the documentation.
I can think of three possible solutions:
Set a different field of your Image model as the primary key (documented here).
Set a different field of your Product model as the foreign key (documented here).
Use Django's database transactions API (documented here); a sketch follows below.
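For the third option, a sketch using the modern transaction API (transaction.atomic() arrived in Django 1.6; at the time of this question the equivalent was commit_on_success):

from django.db import transaction

def create_product():
    # Both rows commit together, or neither does
    with transaction.atomic():
        p = Product.objects.create()
        i = Image.objects.create(height=100, width=100,
                                 url='http://something/something', product=p)
    return p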