Modifying Django's pre_save/post_save Data - django

I'm having a hard time grasping Django's pre_save/post_save signals.
My model has a field called status, and when an entry for this model is added/saved, its status must be changed according to some condition.
My model looks like this:
class Ticket(models.Model):
    (...)
    status = models.CharField(max_length=1, choices=OFFERT_STATUS, default='O')
And my signal handler, configured for pre_save:
def ticket_handler(sender, **kwargs):
    ticket = kwargs['instance']
    (...)
    if someOtherCondition:
        ticket.status = 'C'
Now, if I put a ticket.save() just below that last if statement, I fall into infinite recursion, since that save triggers the signal again. This happens with both pre_save and post_save.
Well... I guess the ability to alter an entry before (or even after) saving it is pretty common in Django's universe. So, what am I doing wrong here? Are signals the wrong approach, or am I missing something else?
Also, once this pre_save/post_save function is triggered, would it be possible to access another model's instance and change a specific row of that other model?
Thanks

Signals are not a proper approach for updating fields in the same model prior to save. Override the model's save method instead of using a signal for this case.
def save(self, force_insert=False, force_update=False):
    if someOtherCondition:
        self.status = 'C'  # set the status here, based on your condition
    super(Ticket, self).save(force_insert, force_update)
For updates to other models, signals are a great approach because you can easily decouple your models. Specifically, you can add a pre_/post_save signal to trigger actions without the need to modify the code of the saved model (which could reside in another application from a third party).
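For the second part of the question (touching another model when a Ticket is saved), a post_save receiver works well. A minimal sketch, assuming a hypothetical Notification model with a ticket_status field; adjust the names to your own models:
from django.db.models.signals import post_save

def ticket_saved(sender, instance, created, **kwargs):
    # Notification and ticket_status are illustrative names only.
    Notification.objects.filter(ticket=instance).update(ticket_status=instance.status)

post_save.connect(ticket_saved, sender=Ticket)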

I agree with Carles that this may belong in save(). When you must do this with signals, make sure you have a very tight condition around the save(). Your test could be rewritten as:
if someOtherCondition and ticket.status != 'C':
    ticket.status = 'C'
    ticket.save()
You won't get into infinite recursion with the test done this way.
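For completeness, here is how the handler is wired up (a sketch). Note that with pre_save you can usually drop the extra save() entirely: the instance has not been written yet, so mutating it in the handler is enough.
from django.db.models.signals import pre_save

def ticket_handler(sender, instance, **kwargs):
    if someOtherCondition and instance.status != 'C':
        instance.status = 'C'  # written by the save that triggered the signal

pre_save.connect(ticket_handler, sender=Ticket)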

Related

How to reliably run custom code when related (ForeignKey) objects change

In a simple ForeignKey relation, I want to run specific code when one of the related objects is modified.
Here is some schematic code:
class Car(models.Model):
    pass

class Wheel(models.Model):
    car = models.ForeignKey('Car',
                            on_delete=models.CASCADE,
                            related_name='wheels',
                            null=True)

    def save(self, *args, **kwargs):
        super().save(*args, **kwargs)
        my_custom_code()
According to Django's documentation, the reverse related manager's .add() method performs the database save using update() instead of save() by default (the same goes for .remove(), etc.).
So in the code below, my_custom_code isn't called when using .add():
car = Car.objects.create()
wheel = Wheel.objects.create()  # ok, my_custom_code called here
car.wheels.add(wheel)  # not called here: Django uses the "update mechanism", which does not go through save()
We need to pass bulk=False to the add() method in order to force it to use save().
car.wheels.add(wheel, bulk=False)  # ok, my_custom_code called here
This is a problem for me, as it is important that my_custom_code is called whenever any of the related objects is modified. If someone forgets to pass bulk=False, it will generate inconsistent data.
Is there a signal for this case (like the m2m_changed signal)?
Is there a way to force bulk=False for all methods of a ForeignKey relation?
Thanks for your help.
Why do I need this? My underlying problem is to store, in the parent record, a computed result that depends on the related elements (and I need this value to always be up to date).
My initial idea was to do the computation every time a related model is modified, by overriding save().
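To make that concrete, here is a sketch of what the idea boils down to - the body of my_custom_code, written here to take the wheel explicitly, with wheel_count as a placeholder for the computed result stored on Car:
def my_custom_code(wheel):
    car = wheel.car
    if car is not None:
        # wheel_count is a hypothetical field; recompute it from the related wheels
        car.wheel_count = car.wheels.count()
        car.save()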

Django post_save signal seemingly triggered only on 2nd save or after

So I'm using a signal-triggered function on post_save to create instances of another model when the first is saved:
The model triggering the signal:
class Product(models.Model):
    # ...
    colors = models.ManyToManyField(Color)
    sizes = models.ManyToManyField(Size)
And the function:
def create_skus(instance, **kwargs):
    for color in instance.colors.select_related():
        for size in instance.sizes.select_related():
            SKU.objects.get_or_create(product=instance, color=color, size=size)
My issue is that create_skus should be called on every post_save, but it only seems to work on the 2nd save or after, resulting in users having to save a Product twice. What is the origin of this?
EDIT: I think this has something to do with how these M2M relations are added (i.e. instance.colors.add(<Color object>)), but I'm not sure, and if you know of a workaround, I'd love you forever.
The signal is sent when the Product instance is saved, not when the Color and Size instances are saved. Therefore, on the first try, your post_save() function's Product instance will not (yet) have the Color and Size instances, as they are not saved through the Product model's save() method.
Check out these two links:
A possible solution posted by a fellow SO'er
You could also work with the m2m_changed signal.
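A sketch of the m2m_changed route, reusing the SKU creation logic from the question - the handler only runs once objects have actually been added to the relation:
from django.db.models.signals import m2m_changed

def update_skus(sender, instance, action, **kwargs):
    if action == 'post_add':
        for color in instance.colors.all():
            for size in instance.sizes.all():
                SKU.objects.get_or_create(product=instance, color=color, size=size)

m2m_changed.connect(update_skus, sender=Product.colors.through)
m2m_changed.connect(update_skus, sender=Product.sizes.through)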

Post create instance code call in django models

Sorry for the somewhat cryptic subject.
I'd like to override the Django model's save method and call some additional code if the model instance is newly created.
Sure, I can use signals, or check whether the model has an empty pk field and, if so, set a temporary variable and call the extra code afterwards:
class EmailModel(models.Model):
    email = models.EmailField()

    def save(self, *args, **kwargs):
        is_new = self.pk is None
        super(EmailModel, self).save(*args, **kwargs)
        # Create necessary objects
        if is_new:
            self.post_create()

    def post_create(self):
        # do job, send mails
        pass
But I'd like to have cleaner code and avoid the temporary variable in the save method.
So the question is: is it possible to tell whether the model instance is a newly created object right after the call to the parent's save_base method?
I've checked the Django sources but can't find the right way to do that.
Thanks
There is a related post.
For real - signals are the best approach in this case.
You could use the post_save() signal and, in the listener, just check whether the credit_set exists for the current model instance and, if not, create one. That would be my choice - there is no need to overdo such a simple task.
Of course, if you really need to know exactly when the model was initialized (I doubt it), use the post_init() signal. You don't need to override the save() method just to set some additional variables. Just catch post_init(), or pre_save(), and change/add what you want. IMHO there is no sense in overriding save() and checking whether this is a new instance or not - that's what the signals are there for.
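For reference, a minimal sketch of the post_save route, using the created flag that the signal passes (reusing post_create from the question):
from django.db.models.signals import post_save

def email_model_saved(sender, instance, created, **kwargs):
    if created:
        # Only runs when the row has just been inserted.
        instance.post_create()

post_save.connect(email_model_saved, sender=EmailModel)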

Django ORM, aggregation?

Not sure how to phrase the question, but here goes.
Say I have a table of events, and each event links to an object using a generic foreign key.
I don't want to show two events that link to the same foreign object - only the most recent event among those which link to the same object.
Can anyone give any hints?
I think it may be very hard to do what you are asking. Since you are looking for a list of events and using generic relations, your only bet is probably to add some SQL via the .extra() filter. I'm not sure I would even know what SQL you would need to run for this since it would probably need GROUP BY and HAVING clauses.
As an alternative though, consider adding a BooleanField to your event object called latest. Then in the save method of your event, write some code like this:
def save(self, *args, **kwargs):
    similar_events = Event.objects.filter(content_type=self.content_type,
                                          object_id=self.object_id)
    later_events = similar_events.filter(date__gt=self.date)
    if later_events:
        self.latest = False
    else:
        self.latest = True
        similar_events.filter(date__lte=self.date).update(latest=False)
    super(Event, self).save(*args, **kwargs)
Then to get your list of events simply do this:
Event.objects.filter(latest=True)
When you use foreign keys to point to a model, the model which is pointed to gets a descriptor referring to all the models that point to it. By default this is modelname_set, or event_set in this case, unless you've changed it.
You can use this descriptor to get the most recent event for the object. Since the descriptor returns a query set, you can order it by the date field, and take the first one.
latest_event_for_obj = obj.event_set.order_by('-your_date_field')[0]

Django pre_save signal called when commit=False

I am writing a Django application that will track changes to the models, in a similar way to the admin interface. For example, I will be able to display a list of changes to a model, that look something like Changed Status from Open to Closed.
I am using the pre_save signal to do this, comparing the relevant fields between the existing item in the database and the "instance" which is being saved. To get the existing item, I have to do sender._default_manager.get(pk=instance.pk), which seems a bit messy, but that part works.
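In outline, the handler looks something like this (Foo stands in for the tracked model; the actual logging of the change is elided):
from django.db.models.signals import pre_save

def log_status_change(sender, instance, **kwargs):
    if instance.pk is None:
        return  # brand new object, nothing to compare against
    old = sender._default_manager.get(pk=instance.pk)
    if old.status != instance.status:
        pass  # record "Changed Status from <old> to <new>" here

pre_save.connect(log_status_change, sender=Foo)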
The problem is, the view for changing this model calls the save() method on the form twice (first with commit=False) - this means that 2 changes get logged in the database, as the pre_save signal is emitted twice.
Is there any way I can accomplish this? Maybe in a different way altogether, though I remember reading that the Django admin app uses signals to track changes that users make.
Looking through the Django source, it seems that pre_save signals are sent on every call to save, even if commit is false. I would suggest inserting on the first pre_save, but add a flag column to the changes table, e.g.
class FooChanges(models.Model):
    foo = models.ForeignKey(Foo)
    dt = models.DateTimeField(default=datetime.now)
    field = models.CharField(max_length=50)
    value = models.CharField(max_length=50)  # or whatever is appropriate here
    finished = models.BooleanField(default=False)
Then, your pre_save handler can be:
def pre_save_handler(sender, instance, **kwargs):
    foo_changes, created = FooChanges.objects.get_or_create(
        foo=instance, finished=False, field='Status', value=instance.status)
    if not created:
        foo_changes.finished = True
        foo_changes.save()
So on the first pre_save, you actually insert the change. On the second pass, you retrieve it from the database and set the flag to True, so you don't pick it up the next time Foo's status changes.
use dispatch_uid:
http://docs.djangoproject.com/en/1.2/topics/signals/#preventing-duplicate-signals
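That covers the case where the same receiver ends up connected more than once (for example because the module that connects it is imported twice); passing dispatch_uid makes the connection idempotent. A minimal sketch, reusing the pre_save_handler from the previous answer:
from django.db.models.signals import pre_save

pre_save.connect(pre_save_handler, sender=Foo,
                 dispatch_uid="track_foo_status_changes")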
Django Audit Log
django-audit-log is a pluggable app that does exactly what you want with little effort. I've used it in a project and I'll surely use it in many more now that I know it.