Django: Update record before save

Model:
class Tester(models.Model):
    test = models.ForeignKey(Platform)
    count = models.IntegerField()
    status = models.CharField(max_length=1, default='A')
I need to change the status to 'D' every time I insert a new record for the same test.
I tried using the pre_save signal, but my handler went into an infinite loop. I would really appreciate any help.

Your signal function probably goes into an infinite loop because you save instances of the same model inside it, and each of those saves triggers the signal function again. With a little care you can prevent this from happening:
from django.db.models.signals import pre_save
from django.dispatch import receiver

@receiver(pre_save, sender=Tester)
def tester_pre_save(sender, instance, **kwargs):
    if not instance.pk:
        # A new record is being created. We need this check because you
        # only want the operation when a new entry is *inserted*.
        Tester.objects.filter(test=instance.test).update(status='D')
or, with the post_save signal:

from django.db.models.signals import post_save
from django.dispatch import receiver

@receiver(post_save, sender=Tester)
def tester_post_save(sender, instance, created, **kwargs):
    if created:
        # A new record has just been created. We need this check because you
        # only want the operation when a new entry is *inserted*. Note that at
        # post_save time the new row already exists, so exclude it here.
        Tester.objects.filter(test=instance.test).exclude(pk=instance.pk).update(status='D')
An important point here: since we use the queryset's update() method to modify the existing entries, those updates don't go through the model's save() method and therefore don't emit signals, so our signal function is not triggered for the instances we update. And even if it were, the update runs only under the condition check (a new instance being created), so those other saves would do nothing and could not cause an infinite loop.
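To illustrate that control flow, here is a small plain-Python stand-in (FakeTester is hypothetical and only mimics the guarded dispatch logic; it is not Django code):

```python
# Minimal stand-in mimicking the guarded signal flow (not Django itself).
class FakeTester:
    rows = []  # stands in for the database table

    def __init__(self, test):
        self.pk = None
        self.test = test
        self.status = 'A'

    def save(self):
        if self.pk is None:  # the "new record" guard from the receiver
            # Bulk-demote existing rows WITHOUT calling save() on them,
            # so no further "signal" fires and nothing can recurse.
            for row in FakeTester.rows:
                if row.test == self.test:
                    row.status = 'D'
            self.pk = len(FakeTester.rows) + 1
            FakeTester.rows.append(self)

first = FakeTester(test='platform-1')
first.save()
second = FakeTester(test='platform-1')
second.save()
# first is demoted to 'D'; only the newest row keeps 'A'
```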

Alternatively, override the save method of your Tester class:

def save(self, *args, **kwargs):
    if Tester.objects.filter(test=self.test).exists():
        self.status = 'D'
    else:
        self.status = 'A'
    super(Tester, self).save(*args, **kwargs)

Put this into the class definition of Tester.

Related

Circumventing F expression problem of django-simple-history by overriding save

django-simple-history inserts a new record on each save of the target model. The docs describe a problem with F expressions. I am trying to circumvent it with an overridden save method.
def save(self, *args, **kwargs):
    super().save(*args, **kwargs)
    # some other actions
    self.refresh_from_db()
But it seems that this is not working. Is the post_save signal of the base model called directly after the super().save() call? If so, is there a way to solve this problem while keeping the F expression in the target model update?
UPDATE: The saved instance has one of its attributes set using an F expression, so this code is called in some other module:

instance.some_attribute = F('some_attribute') + 15
instance.save(update_fields=['some_attribute'])
This throws an error in django-simple-history's post_save signal when it tries to insert an extended copy of the instance into the history table. I tried to refresh the instance in the overridden save method to get rid of the F expression in some_attribute, so that the actual value is loaded. From the traceback it seems that post_save is called right after the super().save() call, before the refresh. Is that how Django's post_save works with an overridden save? If so, is there a way to leave the update code unchanged (keeping the F expression) and still fix the history insert in the model's save?
django-simple-history provides signals for before and after the historical record is created: https://django-simple-history.readthedocs.io/en/2.7.0/signals.html
I suggest using these to update the instance before it gets saved to the historical table. Something like this should work:
from django.dispatch import receiver
from simple_history.signals import (
    pre_create_historical_record,
    post_create_historical_record,
)

@receiver(pre_create_historical_record)
def pre_create_historical_record_callback(sender, **kwargs):
    instance = kwargs["instance"]
    history_instance = kwargs["history_instance"]
    if isinstance(instance, ModelYouWantToRefresh):
        instance.refresh_from_db()
        history_instance.some_attribute = instance.some_attribute
Based on Ross Mechanic's answer, I made a universal solution:
from django import dispatch
from django.db.models import expressions
from simple_history import signals

@dispatch.receiver(
    signals.pre_create_historical_record, dispatch_uid="simple_history_refresh"
)
def remove_f_expressions(sender, instance, history_instance, **kwargs) -> None:
    # Collect every historical field whose value is still an unresolved expression.
    f_expression_fields = []
    for field in history_instance._meta.fields:
        field_value = getattr(history_instance, field.name)
        if isinstance(field_value, expressions.BaseExpression):
            f_expression_fields.append(field.name)
    if f_expression_fields:
        # Reload the real values from the database and copy them across.
        instance.refresh_from_db()
        for field_name in f_expression_fields:
            field_value = getattr(instance, field_name)
            setattr(history_instance, field_name, field_value)

Unable to override model save()

I'm trying to execute some code when the Profile model gets updated, but it appears the save method never gets called, as the print statement never shows.
Profile.objects.filter(user__id=1).update(field_a='test')
models.py
class Profile(models.Model):
    def save(self, *args, **kwargs):
        print("Test")
        super(Profile, self).save(*args, **kwargs)
No, the save method is not called. See the docs on update():
Finally, realize that update() does an update at the SQL level and, thus, does not call any save() methods on your models, nor does it emit the pre_save or post_save signals (which are a consequence of calling Model.save()). If you want to update a bunch of records for a model that has a custom save() method, loop over them and call save(), like this:
for e in Entry.objects.filter(pub_date__year=2010):
    e.comments_on = False
    e.save()

Is the save() method committing changes asynchronously?

I have a simple class model with Django Admin (v. 1.9.2) like this:
from django.contrib.auth.models import User

class Foo(models.Model):
    ...
    users = models.ManyToManyField(User)
    bar = None
I have also overridden the save() method like this:
def save(self, *args, **kwargs):
    self.bar = 1
    async_method.delay(...)
    super(Foo, self).save(*args, **kwargs)
Here async_method is an asynchronous call to a task that will run on Celery, which takes the users field and will add some values to it.
At the same time, whenever a user is added to the ManyToManyField, I want to do an action depending on the value of the bar field. For that, I have defined a m2m_changed signal:
def process_new_users(sender, instance, **kwargs):
    if kwargs['action'] == 'post_add':
        # Do some stuff
        print instance.bar

m2m_changed.connect(process_new_users, sender=Foo.users.through)
And there's the problem. Although I'm changing the value of bar inside the save() method and before I call the asynchronous method, when the process_new_users() method is triggered, instance.bar is still None (initial value).
I'm not sure if this is because the save() method commits changes asynchronously, so that when process_new_users() is triggered the changes have not yet been committed and it is retrieving the old value, or if I'm missing something else.
Is my assumption correct? If so, is there a way to force the values in save() to be committed synchronously so I can then call the asynchronous method?
Note: Any alternative way of achieving this is also welcome.
UPDATE 1: Following @Gert's answer, I implemented a transaction.on_commit() trigger so that whenever the Foo instance is saved, I can safely call the asynchronous function afterwards. To do that I implemented this:
bar = models.BooleanField(default=False)  # bar has become a BooleanField

def call_async(self):
    async_method.delay(...)

def save(self, *args, **kwargs):
    self.bar = True
    super(Foo, self).save(*args, **kwargs)
    transaction.on_commit(lambda: self.call_async())
Unfortunately, this changes nothing. Instead of None I'm now getting False when I should be getting True in the m2m_changed signal.
You want to make sure that your database is up to date. In Django 1.9 there is a new transaction.on_commit, which you can use to trigger Celery tasks only after the transaction has committed.
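To see why on_commit matters here, a toy simulation of its semantics can help (FakeTransaction is hypothetical; Django's real implementation lives in django.db.transaction). Callbacks queued with on_commit run only after the data is committed, so a Celery task fired from such a callback sees the saved values:

```python
# Toy illustration of transaction.on_commit semantics (not Django code):
# callbacks are queued during the transaction and run only after commit,
# so they observe the committed values.
class FakeTransaction:
    def __init__(self):
        self._pending = {}
        self._callbacks = []
        self.committed = {}

    def save(self, key, value):
        self._pending[key] = value        # written, not yet committed

    def on_commit(self, callback):
        self._callbacks.append(callback)  # deferred until commit

    def commit(self):
        self.committed.update(self._pending)
        for callback in self._callbacks:
            callback()                    # a Celery task would fire here

seen = []
txn = FakeTransaction()
txn.save('bar', True)
txn.on_commit(lambda: seen.append(txn.committed.get('bar')))
txn.commit()
# seen == [True]: the deferred callback saw the committed value
```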

Django post_save signal recursion: how to bypass signal firing

I have a situation where, when one of my models (MyModel) is saved, I want to check a field and trigger the same change in any other MyModel instance with the same some_key.
The code works fine, but it calls the signals recursively. As a result I am wasting CPU/DB/API calls. I basically want to bypass the signals during .save(). Any suggestions?
class MyModel(models.Model):
    # bah
    some_field = #
    some_key = #

# in package code __init__.py
@receiver(post_save, sender=MyModel)
def my_model_post_processing(sender, instance, **kwargs):
    # do some unrelated logic...
    logic = 'fun! '
    # if something has changed, update any other record with the same some_key
    cascade_update = MyModel.objects.exclude(id=instance.id).filter(some_key=instance.some_key)
    for c in cascade_update:
        c.some_field = instance.some_field
        c.save()
Disconnect the signal before calling save and then reconnect it afterwards:
post_save.disconnect(my_receiver_function, sender=MyModel)
instance.save()
post_save.connect(my_receiver_function, sender=MyModel)
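One caveat with this pattern: if save() raises, the receiver stays disconnected. A sketch with a toy dispatcher (ToySignal is a hypothetical stand-in, not Django's Signal class) shows how a try/finally keeps the reconnect safe:

```python
# Toy dispatcher illustrating why the reconnect belongs in a finally block.
class ToySignal:
    def __init__(self):
        self.receivers = []

    def connect(self, func):
        if func not in self.receivers:
            self.receivers.append(func)

    def disconnect(self, func):
        if func in self.receivers:
            self.receivers.remove(func)

toy_post_save = ToySignal()

def my_receiver(instance):
    pass

toy_post_save.connect(my_receiver)

def failing_save():
    raise RuntimeError("save blew up")

toy_post_save.disconnect(my_receiver)
try:
    try:
        failing_save()
    finally:
        toy_post_save.connect(my_receiver)  # always reconnect, even on error
except RuntimeError:
    pass
# the receiver is connected again despite the failed save
```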
Disconnecting the signal is not as DRY and consistent a solution as, for example, using update() instead of save().
To bypass signal firing on your model, a simple way to go is to set an attribute on the current instance to prevent upcoming signals from firing.
This can be done with a simple decorator that checks whether the given instance has a 'skip_signal' attribute and, if so, prevents the receiver from being called:
from functools import wraps

def skip_signal(signal_func):
    @wraps(signal_func)
    def _decorator(sender, instance, **kwargs):
        if hasattr(instance, 'skip_signal'):
            return None
        return signal_func(sender, instance, **kwargs)
    return _decorator
Based on your example, that gives us:

from django.db.models.signals import post_save
from django.dispatch import receiver

@receiver(post_save, sender=MyModel)
@skip_signal
def my_model_post_save(sender, instance, **kwargs):
    instance.some_field = my_value
    # Here we flag the instance with 'skip_signal',
    # so my_model_post_save won't be called again
    # thanks to our decorator, avoiding any signal recursion
    instance.skip_signal = True
    instance.save()
Hope this helps.
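As a sanity check, the decorator's guard can be exercised entirely without Django (Dummy and my_receiver below are hypothetical stand-ins for a model instance and a receiver; in Django the dispatcher supplies the sender and instance arguments):

```python
from functools import wraps

# Same decorator as above: skip the receiver when the instance is flagged.
def skip_signal(signal_func):
    @wraps(signal_func)
    def _decorator(sender, instance, **kwargs):
        if hasattr(instance, 'skip_signal'):
            return None
        return signal_func(sender, instance, **kwargs)
    return _decorator

calls = []

@skip_signal
def my_receiver(sender, instance, **kwargs):
    calls.append(instance)

class Dummy:
    pass

plain = Dummy()
my_receiver(Dummy, plain)    # runs: no flag on the instance

flagged = Dummy()
flagged.skip_signal = True
my_receiver(Dummy, flagged)  # skipped: flag present, receiver body never runs
```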
Another solution is to use the update() method, which bypasses signals:

MyModel.objects.exclude(id=instance.id).filter(
    some_key=instance.some_key
).update(some_field=instance.some_field)
"Be aware that the update() method is converted directly to an SQL statement. It is a bulk operation for direct updates. It doesn't run any save() methods on your models, or emit the pre_save or post_save signals"
You could move the related-objects update code into the MyModel.save method; no playing with signals is needed then:

class MyModel(models.Model):
    some_field = #
    some_key = #

    def save(self, *args, **kwargs):
        super(MyModel, self).save(*args, **kwargs)
        # update() won't call save() again, so the cascade can't recurse
        MyModel.objects.exclude(id=self.id).filter(
            some_key=self.some_key
        ).update(some_field=self.some_field)

Django: recursion using the post_save signal

Here's the situation:
Let's say I have a model A in Django. When I'm saving an object (of class A) I need to save its fields into all other objects of this class. I mean, I need every other A object to be a copy of the last saved one.
When I use signals (post_save, for example) I get recursion (the objects try to save each other, I guess) and my Python dies.
I expected that calling .save() on the same class inside a pre/post_save signal would cause recursion, but I just don't know how to avoid it.
What do we do?
@ShawnFumo Disconnecting a signal is dangerous if the same model is saved elsewhere at the same time; don't do that!
@Aram Dulyan, your solution works but prevents you from using signals, which are so powerful!
If you want to avoid recursion and keep using signals, a simple way to go is to set an attribute on the current instance to prevent upcoming signals from firing.
This can be done using a simple decorator that checks if the given instance has the 'skip_signal' attribute, and if so prevents the method from being called:
from functools import wraps

def skip_signal():
    def _skip_signal(signal_func):
        @wraps(signal_func)
        def _decorator(sender, instance, **kwargs):
            if hasattr(instance, 'skip_signal'):
                return None
            return signal_func(sender, instance, **kwargs)
        return _decorator
    return _skip_signal
We can now use it this way:

from django.db.models.signals import post_save
from django.dispatch import receiver

@receiver(post_save, sender=MyModel)
@skip_signal()
def my_model_post_save(sender, instance, **kwargs):
    # your processing
    pass

m = MyModel()
# Here we flag the instance with 'skip_signal',
# so my_model_post_save won't be called,
# thanks to our decorator, avoiding any signal recursion
m.skip_signal = True
m.save()
Hope this helps.
This will work:

class YourModel(models.Model):
    name = models.CharField(max_length=50)

    def save_dupe(self):
        # Plain save that skips the copying logic below,
        # so saving the duplicates can't recurse.
        super(YourModel, self).save()

    def save(self, *args, **kwargs):
        super(YourModel, self).save(*args, **kwargs)
        for model in YourModel.objects.exclude(pk=self.pk):
            model.name = self.name
            # Repeat the above for all your other fields
            model.save_dupe()
If you have a lot of fields, you'll probably want to iterate over them when copying them to the other model. I'll leave that to you.
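One way to do that iteration generically is a plain getattr/setattr loop over a list of field names (a sketch; FIELD_NAMES and Obj are hypothetical, and on a real model you could derive the names from self._meta.fields instead of hard-coding them):

```python
# Generic field copy via getattr/setattr (plain-Python sketch).
FIELD_NAMES = ['name', 'color', 'size']  # hypothetical field list

def copy_fields(source, target, field_names=FIELD_NAMES):
    # Copy each named attribute from source to target.
    for name in field_names:
        setattr(target, name, getattr(source, name))

class Obj:
    pass

src = Obj()
src.name, src.color, src.size = 'widget', 'red', 3

dst = Obj()
copy_fields(src, dst)
# dst now mirrors src's listed fields
```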
Another way to handle this is to remove the listener while saving. So:

class Foo(models.Model):
    ...

def foo_post_save(sender, instance, **kwargs):
    post_save.disconnect(foo_post_save, sender=Foo)
    do_stuff_toSaved_instance(instance)
    instance.save()
    post_save.connect(foo_post_save, sender=Foo)

post_save.connect(foo_post_save, sender=Foo)