Identifying a new Model Instance in Django Save with a UUID pk

If I have a model with a UUID primary key, and the user may set the value on creation, is there any way to tell within the save() method that the instance is new?
Previous techniques that check the auto-assigned fields (see: In a django model custom save() method, how should you identify a new object?) do not work.

Use self._state.adding. It defaults to True and gets set to False after saving the model instance or loading it from the DB.
You should also check the force_insert argument of save.
Note that this will not work if you attempt to copy an instance by changing its id and saving (a common shortcut). If you need to detect this, you could override the instance saving and loading to also store the pk on self._state, then compare the current pk with self._state.pk.
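For example, a minimal sketch of a save() override built on self._state.adding (the model and field names here are illustrative, not from the question):
import uuid

from django.db import models

class MyModel(models.Model):
    id = models.UUIDField(primary_key=True, default=uuid.uuid4)

    def save(self, *args, **kwargs):
        # _state.adding is True until the instance has been saved to,
        # or loaded from, the database
        is_new = self._state.adding
        super().save(*args, **kwargs)
        if is_new:
            ...  # runs only when the row was just created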

In save(), checking self.pk is None does not work with a UUID primary key, because the field should have default=uuid.uuid4; and if you set default=None, the primary key column would no longer get a valid UUID default in the DB. So keep default=uuid.uuid4 on the UUID field.
The easy way is to add a created_at field:
created_at = models.DateTimeField(auto_now_add=True)
and in save() use:
if self.created_at is None:
    # your code here
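Put together, a minimal sketch of the created_at approach (the model name is illustrative):
from django.db import models

class MyModel(models.Model):
    created_at = models.DateTimeField(auto_now_add=True)

    def save(self, *args, **kwargs):
        # auto_now_add only fills the field during the INSERT, so before
        # the first save() it is still None
        is_new = self.created_at is None
        super().save(*args, **kwargs)
        if is_new:
            ...  # runs only on the first save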

save takes an optional parameter, force_insert. Passing that as True will force Django to do an INSERT. See the documentation.
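A quick usage sketch (the model and field are illustrative): with force_insert=True Django issues an INSERT and raises an IntegrityError if a row with that primary key already exists:
import uuid

obj = MyModel(id=uuid.uuid4(), name='example')
obj.save(force_insert=True)  # INSERT; fails if the pk is already taken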

You can use django-model-utils TimeStampedModel (you can also use django-extensions TimeStampedModel or make your own).
This provides each model with created and modified fields. You can then compare the timedelta between the new instance's created and modified fields against an arbitrary threshold (this example uses 5 seconds) to identify whether an instance is new:
def save(self, *args, **kwargs):
    super(<ModelName>, self).save(*args, **kwargs)
    if (self.modified - self.created).seconds < 5:
        <the instance is new>

Related

Django "duplicate key value violates unique constraint" because of the save method

I am trying to update an existing instance of a model in the Django admin interface, but I'm getting django.db.utils.IntegrityError: duplicate key value violates unique constraint because of the save method.
According to the documentation, the save method should not have any problems UPDATING the model. https://docs.djangoproject.com/en/4.0/ref/models/instances/#how-django-knows-to-update-vs-insert .
Django abstracts the need to use INSERT or UPDATE SQL statements. Specifically, when you call save() and the object's primary key attribute does not define a default, Django follows this algorithm:
If the object's primary key attribute is set to a value that evaluates to True (i.e., a value other than None or the empty string), Django executes an UPDATE.
If the object's primary key attribute is not set, or if the UPDATE didn't update anything (e.g. the primary key is set to a value that doesn't exist in the database), Django executes an INSERT.
Now this is my model and the save method:
class Examene(models.Model):
    saptamani = models.IntegerField(default=3, validators=[MinValueValidator(1), MaxValueValidator(5)])
    zile = models.IntegerField(null=False, blank=True)
    profesor = models.ForeignKey(Materii_profesor, on_delete=models.CASCADE)
    materii = models.ManyToManyField(Materie)

    def save(self, *args, **kwargs):
        self.zile = (self.saptamani * 5)
        super(Examene, self).save(self, *args, **kwargs)
I am using Postgres as the database. How would I avoid the error? Is there a way to delete the existing instance from the save method and save the updated version, or is there another way around this? I'm having the same problem for all the models that override the save method.
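Note the stray self forwarded into super().save() in the code above: Model.save()'s first positional parameter is force_insert, so a truthy value there forces Django to run an INSERT even for existing rows, which matches the duplicate-key error described here. A minimal sketch of the save method without that extra argument (assuming this is indeed the cause):
def save(self, *args, **kwargs):
    self.zile = self.saptamani * 5
    # forward only *args/**kwargs; passing self as an extra positional
    # argument would be interpreted as force_insert=True
    super(Examene, self).save(*args, **kwargs)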

Getting model instance by variable model name and pk

I am using Django 1.9 and am now trying to override the save method's behaviour. The problem is that when I do instance.some_field = some_value, the self object is already modified, whereas I need to know what the original value of some_field was. To do this, the only way out seems to be fetching the object from the database by self's pk.
So I wonder what the GENERIC syntax is - if there is any - to fetch the instance from the database. By saying GENERIC, I imply that I don't want to explicitly type the model name (like MYMODEL.objects.get(...)), but rather make Django figure out the right model based on the target instance's model.
To make the question clearer, let me illustrate my goal in a pseudo-code:
def save_extented(self, *args, **kwargs):
    original_object = (self model).objects.get(pk=self.pk)
Is it possible ? Or maybe I don't need this, and there's some smart Django hack to fetch the instance with rolled back field values ?
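A minimal sketch of the generic lookup the pseudo-code above describes: the instance's own class is available as type(self) (or self.__class__), so no model name needs to be hard-coded (assuming this method lives on the model class):
def save(self, *args, **kwargs):
    if self.pk is not None:
        # fetch the row as it currently exists in the database
        original = type(self).objects.get(pk=self.pk)
        # ... compare original.some_field with self.some_field here ...
    super().save(*args, **kwargs)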
You can use django-dirtyfields and the pre_save signal:
from django.db.models.signals import pre_save
from django.dispatch import receiver

@receiver(pre_save, sender=MyModel)
def pre_save_logic(sender, instance, **kwargs):
    if 'my_field' in instance.get_dirty_fields():
        do_some_logic()

Restrict a model to access only rows with a specific condition?

I want to use a Django model to access a subset of database rows. Working with a number of legacy databases, I'd rather not create views to the database, if possible.
In short, I'd like to tell my model that there's field foo which should always have the value bar. This should span any CRUD operation for the table, so that newly created rows would also have foo=bar. Is there a simple Django way for what I'm trying to achieve?
UPDATE: I want to ensure that this model doesn't write anything to the table where foo != bar. It must be able to read, modify or delete only those rows where foo=bar.
For newly created items you can set the default value in the model definition:
class MyModel(models.Model):
    # a lot of fields
    foo = models.CharField(max_length=10, default='bar')

    # Set the manager
    objects = BarManager()

    def save(self, force_insert=False, force_update=False, using=None):
        self.foo = 'bar'
        super(MyModel, self).save(force_insert, force_update, using)
To make MyModel.objects.all() return only rows with foo='bar', you should implement a custom manager. You can redefine the get_query_set method to add the filtering.
class BarManager(models.Manager):
    use_for_related_fields = True

    def get_query_set(self):
        return super(BarManager, self).get_query_set().filter(foo='bar')
Update after @tuomassalo's comment
1) The custom manager will affect calls like MyModel.objects.get(id=42), since that call just proxies to .get_query_set().get(id=42). To achieve this you have to set the manager as the default manager for the model (assign it to the objects attribute).
To use this manager for related lookups (e.g. another_model_instance.my_model_set.get(id=42)) you need to set use_for_related_fields = True on your BarManager. See Controlling automatic Manager types in the docs.
2) If you want to enforce foo=bar, then a default value is not enough. You can either use the pre_save signal or override the save method on your model. Don't forget to call the original save method.
I updated the MyModel example above.
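As a side note, the manager method was renamed in Django 1.6, so on newer versions the same manager would be written with get_queryset (a sketch; the behaviour is otherwise identical):
class BarManager(models.Manager):
    def get_queryset(self):
        return super().get_queryset().filter(foo='bar')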

Django: making a custom PK auto-increment?

I've been using custom primary keys for a model in Django. (This was because I was importing values into the database that already had IDs attached, and it made sense to preserve the existing values.)
class Transaction(models.Model):
    id = models.IntegerField(primary_key=True)
    transaction_type = models.IntegerField(choices=TRANSACTION_TYPES)
    date_added = models.DateTimeField(auto_now_add=True)
However, now I want to add new instances of the model to the database, and I'd like to autogenerate a unique primary key. But if I don't specify the ID at the time of creating the instance, I get an error:
t = Transaction(transaction_type=0)
t.save()
gives:
IntegrityError at /page
(1048, "Column 'id' cannot be null")
How can I autogenerate a unique ID to specify for new values, without having to alter the way I import the existing values?
UPDATE
I've written this custom method, which seems to work...
class Transaction(models.Model):
    def save(self, *args, **kwargs):
        if not self.id:
            i = Transaction.objects.all().order_by('-id')[0]
            self.id = i.id + 1
        super(Transaction, self).save(*args, **kwargs)
You can use AutoField for the column id instead of IntegerField. The following should work for you:
id = models.AutoField(primary_key=True)
id will now increase automatically and won't have the concurrency problems that the save-method approach may run into.
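With an AutoField you can still assign explicit primary keys when importing, and simply omit the id for new rows (a sketch; legacy_id stands in for the imported value, and on PostgreSQL you may also need to advance the sequence afterwards, as described further below):
# imported rows keep their original IDs
Transaction.objects.create(id=legacy_id, transaction_type=0)

# new rows get an auto-generated ID
t = Transaction(transaction_type=0)
t.save()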
I've ended up using a very similar piece of code, but have made it slightly more generic:
def save(self, *args, **kwargs):
    if self.id is None:
        self.id = self.__class__.objects.all().order_by("-id")[0].id + 1
    super(self.__class__, self).save(*args, **kwargs)
It uses self.__class__, so you can just copy and paste this code into any model class without changing anything.
How are you importing the existing values? It would be trivial to write something into your Transaction's __init__ to generate a new ID for you, but without knowing how you're importing the other values I can't say for sure whether it would alter the way you work with them.
If you remove your declared id field, django will automatically assume this:
id = models.AutoField(primary_key=True)
In Django 1.8, inspectdb will automatically detect auto_increment and use an AutoField when generating models.
Django migrations will do most of the hard work for you here.
Firstly, stop any access to your app so users can't change the database whilst you are working on it.
It would then be very wise to backup your database, before performing any work, as a precaution.
Remove your manually declared id field from your models.py (i.e. delete it).
Run makemigrations and then migrate. Django will modify the id field to the correct implementation for your database version.
Run this (example) command in psql adapting, if need be, to your table names:
select setval(pg_get_serial_sequence('transactions_transaction', 'id'), max(id)) from transactions_transaction;
This sets the sequence behind your id column to the correct value in Postgres for your table (i.e. the largest value of the id field among your existing records). This is crucial, as otherwise the sequence would start at 1!
And that's it: from now on everything will be automatic again.

How to update multiple fields of a django model instance?

I'm wondering, what is a standard way of updating multiple fields of an instance of a model in django? ... If I have a model with some fields,
class foomodel(models.Model):
    field1 = models.CharField(max_length=10)
    field2 = models.CharField(max_length=10)
    field3 = models.CharField(max_length=10)
    ...
... and I instantiate it with one field given, and then in a separate step I want to provide the rest of the fields, how do I do that by just passing a dictionary or key value params? Possible?
In other words, say I have a dictionary with some data in it that has everything I want to write into an instance of that model. The model instance has been instantiated in a separate step and let's say it hasn't been persisted yet. I can say foo_instance.field1 = my_data_dict['field1'] for each field, but something tells me there should be a way of calling a method on the model instance where I just pass all of the field-value pairs at once and it updates them. Something like foo_instance.update(my_data_dict). I don't see any built-in methods like this, am I missing it or how is this efficiently done?
I have a feeling this is an obvious, RTM kind of question but I just haven't seen it in the docs.
It's tempting to mess with __dict__, but that won't apply to attributes inherited from a parent class.
You can either iterate over the dict to assign to the object:
for (key, value) in my_data_dict.items():
    setattr(obj, key, value)
obj.save()
Or you can directly modify it from a queryset (making sure your query set only returns the object you're interested in):
FooModel.objects.filter(whatever="anything").update(**my_data_dict)
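The setattr loop can also be wrapped in a small reusable helper (names here are illustrative; unlike the queryset .update() call, this path still goes through the instance's save() method and signals):
def update_instance(obj, data, save=True):
    # assign each key/value pair from the dict onto the instance
    for key, value in data.items():
        setattr(obj, key, value)
    if save:
        obj.save()
    return obj

update_instance(foo_instance, my_data_dict)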
You could try this:
obj.__dict__.update(my_data_dict)
It seems like such a natural thing you'd want to do, but like you I've not found it in the docs either. The docs do say you should override save() on the model, and that's what I do.
def save(self, **kwargs):
    mfields = iter(self._meta.fields)
    mods = [(f.attname, kwargs[f.attname]) for f in mfields if f.attname in kwargs]
    for fname, fval in mods: setattr(self, fname, fval)
    super(MyModel, self).save()
I get the primary key's name, use it to filter with Queryset.filter(), and update with Queryset.update().
fooinstance = ...
# Find the primary key and build a dict for filtering
pk_name = foomodel._meta.pk.name
filtr = {pk_name: getattr(fooinstance, pk_name)}
# Create a dict of attributes to update
updat = {'name': 'foo', 'lastname': 'bar'}
# Apply
foomodel.objects.filter(**filtr).update(**updat)
This allows me to update an instance whatever its primary key is.
Update using update()
Discussion.objects.filter(slug=d.slug).update(
    title=form_data['title'],
    category=get_object_or_404(Category, pk=form_data['category']),
    description=form_data['description'],
    closed=True,
)