I am using Django 1.9 and am trying to override the save method's behaviour. The problem is that by the time save() runs, instance.some_field = some_value has already modified self, whereas I need to know what the original value of some_field was. The only way to do this seems to be fetching the object from the database by self's pk.
So I wonder: what is the GENERIC syntax, if there is any, to fetch the instance from the database? By GENERIC, I mean that I don't want to explicitly type the model name (like MYMODEL.objects.get(...)), but rather have Django figure out the right model based on the target instance's model.
To make the question clearer, let me illustrate my goal in pseudo-code:
def save_extended(self, *args, **kwargs):
    original_object = (self model).objects.get(pk=self.pk)
Is it possible? Or maybe I don't need this, and there's some smart Django hack to fetch the instance with rolled-back field values?
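For reference, the generic lookup the pseudo-code asks for can be written with type(self) (equivalently self.__class__), without naming the model; a minimal sketch, assuming the instance already has a pk:

def save(self, *args, **kwargs):
    if self.pk is not None:
        # fetch the unmodified row without naming the model explicitly
        original_object = type(self).objects.get(pk=self.pk)
        # ...compare original_object.some_field with self.some_field here...
    super(MyModel, self).save(*args, **kwargs)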
You can use django-dirtyfields and the pre_save signal:
from django.db.models.signals import pre_save
from django.dispatch import receiver

@receiver(pre_save, sender=MyModel)
def pre_save_logic(sender, instance, **kwargs):
    if 'my_field' in instance.get_dirty_fields():
        do_some_logic()
Related
How to use PUT method for creating an object on particular id if no object is available on that id in Django Rest Framework?
You can try update_or_create().
For example:
from rest_framework.response import Response
from rest_framework.views import APIView

class YourAPIView(APIView):
    def put(self, request, **kwargs):
        serializer = YourSerializer(data=request.data)
        serializer.is_valid(raise_exception=True)
        obj, created = YourModel.objects.update_or_create(
            id=kwargs['id'],
            defaults=serializer.validated_data,
        )
        return Response()
A RESTful API should error out for a PUT request on an object that doesn't exist. The idea is that if it had existed at one point (to create the id), it has since been deleted, and it makes more sense to keep it deleted than to re-create it.
This is especially true if the id is auto-generated, and even more so if it's an auto-incrementing integer id like the default id of Django models. If you were to support this functionality in that case, a user could create an instance with an id that the table's sequence hasn't incremented past yet, potentially leading to errors (such as duplicate-key failures once the sequence catches up).
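For comparison, a minimal sketch of the stricter behaviour (same hypothetical view, serializer, and model names as above), returning 404 instead of re-creating the object:

from django.shortcuts import get_object_or_404

class YourAPIView(APIView):
    def put(self, request, **kwargs):
        # 404 if the object never existed or has since been deleted
        obj = get_object_or_404(YourModel, id=kwargs['id'])
        serializer = YourSerializer(obj, data=request.data)
        serializer.is_valid(raise_exception=True)
        serializer.save()
        return Response(serializer.data)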
I have a custom model manager and a custom queryset defined specifically for related objects, which means I have defined Meta.base_manager_name on the model.
I would like to have an all() manager method which fetches the related object on a OneToOneField.
Now I know this does not make sense, since a OneToOneField will always return one object and there is no need for an all() method. I am working on a django-oscar project and am extending its "Partner" model. It originally had a "users" field as a ManyToManyField, which has now been changed to a OneToOneField.
The users field is called in code several times using the relation user.partners.all(). I don't want to modify all these places (am I being lazy here?) since I want to keep the code as upgrade-friendly as possible, so instead I wanted an all() model manager defined in a way that still works. Not sure if it is a good idea?
The all() method takes a user arg to return the queryset for the user instance:
class PartnerQuerySet(models.QuerySet):
    def all(self, user):
        return self.filter(user=user)

class PartnerManager(models.Manager):
    def get_queryset(self):
        return PartnerQuerySet(self.model, using=self._db)

    def all(self, user):
        return self.get_queryset().all(user)

class Partner(models.Model):
    objects = PartnerManager()

    class Meta:
        base_manager_name = 'objects'
The problem is that when it is used from the related object it asks for the user arg, which makes sense, but since I am calling it from a related object I want the related object itself to be used as the arg. So:
user.partner.all() - should use user as the arg and fetch the results
user.partner.all(user) - is what I should not have to write
2 related questions:
1) Does this make sense - should I be doing this?
2) How can I achieve user.partner.all() without adding user as an arg?
PS: I know I can use middleware to get_current_user, but that function is not reliable according to some of the responses to a different question on SO.
I don't think what you are trying to do will work. Your new situation with a OneToOneField gives you the Partner instance:
>>> user.partner
<Partner xxx>
While in the old situation with the ManyToManyField, a PartnerQuerySet would have been returned:
>>> user.partner
<PartnerQuerySet []>
A solution would be to create a custom OneToOneField, but this would most probably violate the "simple is better than complex" rule and in the end may even be more work than changing all the existing .all() calls.
If I have a model that has a UUID primary key, whose value the user may set on creation, is there any way to tell within the save method that the instance is new?
Previous techniques of checking the auto-assigned fields (see: In a django model custom save() method, how should you identify a new object?) do not work.
Use self._state.adding. It defaults to True and gets set to False after saving the model instance or loading it from the DB.
You should also check the force_insert argument of save.
Note that this will not work if you attempt to copy an instance by changing its id and saving (a common shortcut). If you need to detect that case, you could override the instance's saving and loading to also store the pk on self._state, then compare the current pk with self._state.pk.
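A minimal sketch combining both checks (Ticket is a hypothetical model):

import uuid
from django.db import models

class Ticket(models.Model):
    id = models.UUIDField(primary_key=True, default=uuid.uuid4)

    def save(self, *args, **kwargs):
        # _state.adding is True until the instance has been saved or loaded
        is_new = self._state.adding or kwargs.get('force_insert', False)
        super(Ticket, self).save(*args, **kwargs)
        if is_new:
            pass  # runs only on the first INSERT, even though the pk was set up front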
In save(), checking self.pk is None does not work with a UUID pk, because the field should have default=uuid.uuid4, and if you set default=None the primary key would still need a valid UUID default in the DB; so leave default=uuid.uuid4 on the UUIDField.
The easy way is to add a created_at field:
created_at = models.DateTimeField(auto_now_add=True)
and in save() use:
if self.created_at is None:
    ...  # your code here (auto_now_add has not fired yet, so the instance is new)
save takes an optional parameter, force_insert. Passing that as True will force Django to do an INSERT. See the documentation.
You can use django-model-utils TimeStampedModel (you can also use django-extensions TimeStampedModel or make your own).
This provides each model with a created and a modified field. Then compare the timedelta between the new instance's created and modified fields to an arbitrary threshold (this example uses 5 seconds) to decide whether the instance is new:
def save(self, *args, **kwargs):
    super(<ModelName>, self).save(*args, **kwargs)
    if (self.modified - self.created).seconds < 5:
        <the instance is new>
I have a model which is overriding save() to slugify a field:
from django.utils.text import slugify

class MyModel(models.Model):
    name = models.CharField(max_length=200)
    slug = models.SlugField(max_length=200)

    def save(self, *args, **kwargs):
        self.slug = slugify(self.name)
        super(MyModel, self).save(*args, **kwargs)
When I run loaddata to load a fixture, this save() does not appear to be called, because the slug field is empty in the database. Am I missing something?
I can get it to work with a pre_save signal hook, but this is a bit of a hack and it would be nice to get save() working:
def mymodel_pre_save(sender, **kwargs):
    instance = kwargs['instance']
    instance.slug = slugify(instance.name)

pre_save.connect(mymodel_pre_save, sender=MyModel)
Thanks in advance.
No, you're not. save() is NOT called by loaddata, by design (calling it would be way more resource-intensive, I suppose). Sorry.
EDIT: According to the docs, pre_save is not called either (even though apparently it is).
Data is saved to the database as-is, according to https://docs.djangoproject.com/en/dev/ref/django-admin/#what-s-a-fixture
I'm doing something similar now - I need a second model to have a parallel entry for each of the first model in the fixture. The second model can be enabled/disabled, and has to retain that value across loaddata calls. Unfortunately, having a field with a default value (and leaving that field out of the fixture) doesn't seem to work - it gets reset to the default value when the fixture is loaded (The two models could have been combined otherwise).
So I'm on Django 1.4, and this is what I've found so far:
You're correct that save() is not called. There's a special DeserializedObject that does the insertion, by calling save_base() on the Model class - overriding save_base() on your model won't do anything since it's bypassed anyway.
@Dave is also correct: the current docs still say the pre_save signal is not called, but it is. It's behind a condition: if origin and not meta.auto_created.
origin is the class for the model being saved, so I don't see why it would ever be falsy.
meta.auto_created has been False so far with everything I've tried, so I'm not yet sure what it's for. Looking at the Options object, it seems to have something to do with abstract models.
So yes, the pre_save signal is indeed being sent.
Further down, there's a post_save signal behind the same condition that is also being sent.
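Related and worth knowing: during loaddata these signals are sent with raw=True, so a handler can tell fixture loading apart from a normal save; a minimal sketch (reusing the slug handler from the question):

from django.db.models.signals import pre_save
from django.dispatch import receiver

@receiver(pre_save, sender=MyModel)
def mymodel_pre_save(sender, instance, raw=False, **kwargs):
    # raw is True when the save comes from loaddata; the handler still runs,
    # but you may want to skip extra database queries in that case
    instance.slug = slugify(instance.name)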
Using the post_save signal works. My models are more complex, including a ManyToMany on the "Enabled" model, but basically I'm using it like this:
from django.db.models.signals import post_save

class Info(models.Model):
    name = models.TextField()

class Enabled(models.Model):
    info = models.ForeignKey(Info)

def create_enabled(sender, instance, *args, **kwargs):
    if Info == sender:
        Enabled.objects.get_or_create(id=instance.id, info=instance)

post_save.connect(create_enabled)
And of course, initial_data.json only defines instances of Info.
Sorry for the somewhat crazy subject line.
I'd like to override the Django model save() method and call some additional code if the model instance is newly created.
Sure, I can use signals, or check whether the model has an empty pk field and, if so, set a temporary variable and call the code afterwards:
class EmailModel(models.Model):
    email = models.EmailField()

    def save(self, *args, **kwargs):
        is_new = self.pk is None
        super(EmailModel, self).save(*args, **kwargs)
        # Create necessary objects
        if is_new:
            self.post_create()

    def post_create(self):
        # do job, send mails
        pass
But I'd like to have beautiful code and avoid the temporary variable in save().
So the question is: is it possible to find out whether the model instance is a newly created object right after the super save_base() call?
I've checked the Django sources but can't find a right way to do that.
Thanks
We have a related post.
For real - signals are the best approach in this case.
You could use the post_save() signal and, in the listener, just check whether the credit_set exists for the current model instance and, if not, create one. That would be my choice - there is no need to overdo such a simple task.
Of course, if you really need to know exactly when the model was initialized (I doubt it), use the post_init() signal. You don't need to override the save() method just to set some additional variables; just catch the post_init() signal, or pre_save(), and change/add what you want. IMHO there is no sense in overriding the save() method to check whether this is a new instance or not - that's what the signals are there for.
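For reference, a minimal sketch of the recommended signal approach: post_save passes created=True for newly inserted rows, which avoids the temporary variable in save() entirely.

from django.db.models.signals import post_save
from django.dispatch import receiver

@receiver(post_save, sender=EmailModel)
def on_email_model_created(sender, instance, created, **kwargs):
    if created:
        instance.post_create()  # do job, send mails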