Let's say I have a model called BookModel with four fields: (title, author, price, publish_year).
And I have a handler in signals:
@receiver([post_save, post_delete], sender=BookModel)
def signal_handler(sender, instance, **kwargs):
    ...
The question is how to detect that a specific model field has changed during save(). For example, if price has changed I want to do something. Better explained in pseudo code:
@receiver([post_save, post_delete], sender=BookModel)
def signal_handler(sender, instance, **kwargs):
    # pseudo code below
    if field "price" has changed:
        do stuff
    else:
        do nothing
According to the docs, this is possible if I pass update_fields to save(), but what if I don't use it?
Also, is it possible to distinguish whether the signal came from post_save or from post_delete while still using one handler?
@receiver([post_save, post_delete], sender=BookModel)
def signal_handler(sender, instance, **kwargs):
    # pseudo code below
    if signal is post_save:
        if field "price" has changed:
            do stuff
        else:
            do nothing
    else:
        do other stuff
Thanks
You can try django-model-utils' FieldTracker to track changes in model fields. It also works with the post_save signal.
Checking changes using signals
The field tracker methods may also be used in pre_save and post_save signal handlers to identify field changes on model save.
I have a User model and a UserImage model that contains a foreign key to a User. The foreign key is set to CASCADE delete.
Here is what the receivers look like in my models.py:
@receiver(pre_delete, sender=User)
def deleteFile(sender, instance, **kwargs):
    print("User pre_delete triggered")
    instance.thumbnail.delete()

@receiver(pre_delete, sender=UserImage)
def deleteFile(sender, instance, **kwargs):
    print("UserImage pre_delete triggered")
    instance.image.delete()
When I execute the following lines of code:
>>> User.objects.last().delete()
"UserImage pre_delete triggered"
For some reason the associated UserImage signal is being received but the actual User model's signal is not.
Am I missing something?
If you read the documentation carefully, you will see that the delete() method on a model will execute purely in SQL (if possible). So the delete() method on UserImage will not be called by Django, and thus the signal will not be triggered. If you want it to be triggered, you could override the delete() method on your User model to also call delete() on the related objects. Something like this:
class User(models.Model):
    def delete(self, using=None):
        self.userimage_set.all().delete()
        super().delete(using=using)
UPDATE:
I did not read the question correctly, so I have to update my answer. I think what is happening is that both receiver functions have the same name, so the second definition shadows the first. Django signals hold only weak references to function receivers by default, so once the first function is shadowed it gets garbage-collected and is never called. I would suggest renaming one of the functions and seeing if that changes things.
I want to use the m2m_changed signal when a relation in a ManyToMany field is removed. I wrote this code in models.py:
class Unit(Model):
    # ....

class Package(Model):
    # ...
    Lesson = ManyToManyField(Unit, blank=True)

def toppings_changed(sender, **kwargs):
    if kwargs.get("pk_set") and kwargs.get("action") == "pre_clear":
        # get id of lesson to delete and do something with it

m2m_changed.connect(toppings_changed, sender=Package.Lesson.through)
I want to do something when a Lesson relation is removed from a Package. But when I remove a relation, pre_clear is fired and I can't access pk_set (it is None). Is there a way to access the Unit id when a relation in the Lesson field is removed?
First: m2m_changed is fired when either side of a many-to-many relation changes, i.e. some_unit.package_set.clear() and some_package.Lesson.clear() both trigger the registered signal.
Second: in the pre_clear state, all relations still exist (it is, after all, pre-clear), so you can see what is about to be removed like this:
def toppings_changed(sender, instance, reverse, action, **kwargs):
    # note: sender is the through model class; the object whose
    # relation changed is passed as `instance`
    if action == "pre_clear":
        if reverse:
            # instance is a Unit,
            # i.e. the signal was raised by: some_unit.package_set.clear()
            packages_to_be_removed = instance.package_set.all()
        else:
            # instance is a Package,
            # i.e. the signal was raised by: some_package.Lesson.clear()
            lessons_to_be_removed = instance.Lesson.all()
Per the Django 1.8 docs:
pk_set ...
For the pre_clear and post_clear actions, this is None.
https://docs.djangoproject.com/en/1.8/ref/signals/#m2m-changed
So you'll have to get the ids by other means, as @Emran BatmanGhelich suggested.
I have the following models:
class Item(models.Model):
    # fields
    # ...
    pass

class Collection(models.Model):
    items = models.ManyToManyField(Item, related_name="collections")
    # other fields
    # ...
Now I want two things:
I want to control if an Item can be added to a Collection.
I want the Collection to update some of its fields if an Item was added or removed.
For the second issue I know that there is the django.db.models.signals.m2m_changed which I can use to hook into changes of the relation. Is it allowed/ok to change the Collection within the signal callback? Can I use the signal also for "aborting" the insertion for issue 1?
I think the best way to achieve both of your desired behaviors is not with signals, but with overridden save() and delete() methods on an explicit through table, which you define with the through argument of ManyToManyField. See: https://docs.djangoproject.com/en/dev/ref/models/fields/#django.db.models.ManyToManyField.through and https://docs.djangoproject.com/en/dev/topics/db/models/#overriding-predefined-model-methods
Something like this:
# -*- coding: utf-8 -*-
from django.db import models

class Item(models.Model):
    # fields
    # ...
    pass

class Collection(models.Model):
    items = models.ManyToManyField(Item, related_name="collections", through="CollectionItem")
    # other fields
    # ...

class CollectionItem(models.Model):
    collection = models.ForeignKey(Collection, on_delete=models.CASCADE)
    item = models.ForeignKey(Item, on_delete=models.CASCADE)

    def save(self, *args, **kwargs):
        # Only allow this relationship to be created on some_condition
        # (part 1 of your question)
        if some_condition:
            super(CollectionItem, self).save(*args, **kwargs)
            # Update some fields on Collection when this relationship
            # is created (part 2 of your question, 1/2)
            self.collection.updateSomeFields()

    def delete(self, *args, **kwargs):
        collection = self.collection
        super(CollectionItem, self).delete(*args, **kwargs)
        # Update some fields on Collection when this relationship
        # is destroyed (part 2 of your question, 2/2)
        collection.updateSomeFields()
Incidentally, you'll find that adding a relationship will cause a save-signal on this through model.
And, regarding signals, once you have the through table in place, you'd be able to listen for pre_save and/or post_save signals, but neither of them will allow you to directly veto the creation of the relationship.
If one or both of your models are supplied by a 3rd party and you really cannot create the through table, then, yes, the signal route may be the only way to go.
https://docs.djangoproject.com/en/dev/ref/signals/#m2m-changed
In that case, you could listen for the m2m_changed signal, trigger updates to your Collection objects (part 2 of your question), and retroactively delete inappropriately created relationships (part 1 of your question). However, since this latter bit is an ugly kludge, I'd stick with the explicit through table if you can.
The pre_save signal is sent before an instance is saved, but you are not able to abort the save operation from there. A better solution is to add a method to your Collection model that is responsible for checking whether an Item can be added:
class Collection(models.Model):
    items = models.ManyToManyField(Item, related_name="collections")
    ...

    def add_item(self, item):
        if self.check_if_item_can_be_added(item):
            self.items.add(item)
            self.save()

    def check_if_item_can_be_added(self, item):
        # do your checks here
        ...
When adding an instance to an m2m field, the save() method does not get called. You are right, the m2m_changed signal is the way to go; you can safely update the Collection instance there.
When is the appropriate time to do validation on related fields in a model?
For example if I have a class Video that has a ManyToMany relationship with a class Playlist, when the Video is changed to 'private', it should be removed from all Playlists.
Doing this in the model's clean() method seems dangerous - since the model might fail validation and not save, but the Playlist references will have been deleted.
Is doing it in a post_save or pre_save signal the right way to go?
You have two choices:
The first is using a post_save signal that does the job. Not pre_save, because saving can fail, and I guess you only want to do this if the save actually worked.
The other option is overriding the model's save() method, like:
def save(self, *args, **kwargs):
    super(MyModel, self).save(*args, **kwargs)
    # do stuff for removing whatever you want to remove
My personal choice is the first, because you are dealing with different models. If I only needed to deal with the current one, I'd go with the second. But that's just a personal preference.
I have a model which is overriding save() to slugify a field:
class MyModel(models.Model):
    name = models.CharField(max_length=200)
    slug = models.SlugField(max_length=200)

    def save(self, *args, **kwargs):
        self.slug = slugify(self.name)
        super(MyModel, self).save(*args, **kwargs)
When I run loaddata to load a fixture, this save() does not appear to be called, because the slug field is empty in the database. Am I missing something?
I can get it to work by a pre_save hook signal, but this is a bit of a hack and it would be nice to get save() working.
def mymodel_pre_save(sender, **kwargs):
    instance = kwargs['instance']
    instance.slug = slugify(instance.name)

pre_save.connect(mymodel_pre_save, sender=MyModel)
Thanks in advance.
No, you're not. save() is NOT called by loaddata, by design (it's far more resource-intensive, I suppose). Sorry.
EDIT: According to the docs, pre-save is not called either (even though apparently it is?).
Data is saved to the database as-is, according to https://docs.djangoproject.com/en/dev/ref/django-admin/#what-s-a-fixture
I'm doing something similar now - I need a second model to have a parallel entry for each of the first model in the fixture. The second model can be enabled/disabled, and has to retain that value across loaddata calls. Unfortunately, having a field with a default value (and leaving that field out of the fixture) doesn't seem to work - it gets reset to the default value when the fixture is loaded (The two models could have been combined otherwise).
So I'm on Django 1.4, and this is what I've found so far:
You're correct that save() is not called. There's a special DeserializedObject that does the insertion, by calling save_base() on the Model class - overriding save_base() on your model won't do anything since it's bypassed anyway.
@Dave is also correct: the current docs still say the pre_save signal is not called, but it is. It's behind a condition: if origin and not meta.auto_created
origin is the class for the model being saved, so I don't see why it would ever be falsy.
meta.auto_created has been False so far with everything I've tried, so I'm not yet sure what it's for. Looking at the Options object, it seems to have something to do with abstract models.
So yes, the pre_save signal is indeed being sent.
Further down, there's a post_save signal behind the same condition that is also being sent.
Using the post_save signal works. My models are more complex, including a ManyToMany on the "Enabled" model, but basically I'm using it like this:
from django.db.models.signals import post_save

class Info(models.Model):
    name = models.TextField()

class Enabled(models.Model):
    info = models.ForeignKey(Info)

def create_enabled(sender, instance, *args, **kwargs):
    if Info == sender:
        Enabled.objects.get_or_create(id=instance.id, info=instance)

post_save.connect(create_enabled)
And of course, initial_data.json only defines instances of Info.