I'm using Celery in my Django project to create tasks that send email at a specific time in the future. A user can create a Notification instance with a notify_on datetime field. I then pass the value of notify_on as the eta.
class Notification(models.Model):
    ...
    notify_on = models.DateTimeField()


def notification_post_save(instance, *args, **kwargs):
    send_notification.apply_async((instance,), eta=instance.notify_on)


signals.post_save.connect(notification_post_save, sender=Notification)
The problem with that approach is that if the user changes notify_on, they will get two (or more) notifications instead of one.
The question is: how do I update the task associated with a specific notification, or somehow delete the old one and create a new one?
First of all, by using post_save we can't fetch the old data, so here I'm overriding the save() method of the Notification model instead. Apart from that, create a field to store the Celery task_id.
from celery.task.control import revoke  # Celery < 5; on Celery 5+ use app.control.revoke


class Notification(models.Model):
    ...
    notify_on = models.DateTimeField()
    celery_task_id = models.CharField(max_length=100)

    def save(self, *args, **kwargs):
        # fetch the old value before saving (None on the initial save)
        pre_notify_on = None
        if self.pk:
            pre_notify_on = Notification.objects.get(pk=self.pk).notify_on
        super().save(*args, **kwargs)
        if not self.celery_task_id:  # initial task creation
            task_object = send_notification.apply_async((self,), eta=self.notify_on)
            Notification.objects.filter(pk=self.pk).update(celery_task_id=task_object.id)
        elif pre_notify_on != self.notify_on:
            # revoke the old task and schedule a new one
            revoke(self.celery_task_id, terminate=True)
            task_object = send_notification.apply_async((self,), eta=self.notify_on)
            Notification.objects.filter(pk=self.pk).update(celery_task_id=task_object.id)
Reference
Cancel an already executing task with Celery?
Django: How to access original (unmodified) instance in post_save signal
I think there is no need to delete the previous tasks. You just have to validate that the task being executed is the latest one. For that, create a new UUID field called checksum and update it every time you change notify_on, then check this checksum in the task that sends the email.
import uuid


class Notification(models.Model):
    checksum = models.UUIDField(default=uuid.uuid4)
    notify_on = models.DateTimeField()


def notification_post_save(instance, *args, **kwargs):
    send_notification.apply_async((instance.id, str(instance.checksum)), eta=instance.notify_on)


signals.post_save.connect(notification_post_save, sender=Notification)


@shared_task
def send_notification(notification_id, checksum):
    notification = Notification.objects.get(id=notification_id)
    if str(notification.checksum) != checksum:
        return False  # a newer task supersedes this one
    # send email
Also, please don't schedule the task on every save of the notification object; only do it when notify_on changes. You can also check this:
Identify the changed fields in django post_save signal
Related
Model:
class Tester(models.Model):
    test = models.ForeignKey(Platform, on_delete=models.CASCADE)
    count = models.IntegerField()
    status = models.CharField(max_length=1, default='A')
I need to change the status to 'D' every time I insert a new record for the same test.
I tried using Signals pre_save, but that function went into a loop. I would really appreciate any help.
The signal function probably goes into an infinite loop because you save instances of the same model inside it, each save in turn triggering the signal function again. With a little care you can prevent this from happening:
from django.db.models.signals import pre_save
from django.dispatch import receiver


@receiver(pre_save, sender=Tester)
def tester_pre_save(sender, instance, **kwargs):
    if not instance.pk:
        # A new record is being created. We need this check because you want
        # to do the operation only when a new entry is inserted into the table.
        Tester.objects.filter(test=instance.test).update(status='D')
or, with the post_save signal:
from django.db.models.signals import post_save
from django.dispatch import receiver


@receiver(post_save, sender=Tester)
def tester_post_save(sender, instance, created, **kwargs):
    if created:
        # A new record has been created. We need this check because you want
        # to do the operation only when a new entry is inserted into the table.
        # Note: unlike the pre_save version, the new row is already in the
        # table here, so exclude it from the update.
        Tester.objects.filter(test=instance.test).exclude(pk=instance.pk).update(status='D')
An important point here: since we use the queryset's update() method to modify the existing entries, those updates won't trigger signals, because update() bypasses the model's save() method; the signal function therefore isn't invoked for the other instances we change. And even if it were triggered, since we perform the update under a condition check (only when a new instance is being created), the signal handlers for those other saves wouldn't do anything, so they cannot cause an infinite loop.
Override the save method of our Tester class:
def save(self, *args, **kwargs):
    if Tester.objects.filter(test=self.test).exists():
        self.status = "D"
    else:
        self.status = "A"
    super(Tester, self).save(*args, **kwargs)
Put this into the class definition of Tester.
I am trying to get around the difficulty of bulk importing users with user info via django-allauth social login. I am doing this by creating a separate model to which I can CSV-upload user information, which will be ported over to the real table after the user logs in for the first time. Essentially, it will pre-populate the user's information based on their email. The issue I am running into is that because I am using a post_save sender, it tries to update the information every time the user changes an aspect of their profile. My issue with this is that in the case where user information legitimately changes but wasn't updated in the UserData table (which is only meant to be used for the initial import), it will simply change back. I am curious about any ideas to get around this. Thanks!
@receiver(post_save, sender=User)
def imported_info_update(sender, instance, **kwargs):
    imported_info = UserData.objects.get(email=instance.email)
    job = Jobs.objects.get(job=imported_info.department)
    location = Locations.objects.get(location=imported_info.team)
    school = Schools.objects.get(school=imported_info.school)
    UserProfile.objects.update_or_create(user_id=instance.id,
                                         job=job,
                                         location=location,
                                         school=school)
Try this
@receiver(post_save, sender=User)
def imported_info_update(sender, instance=None, created=False, **kwargs):
    if created:
        imported_info = UserData.objects.get(email=instance.email)
        job = Jobs.objects.get(job=imported_info.department)
        location = Locations.objects.get(location=imported_info.team)
        school = Schools.objects.get(school=imported_info.school)
        UserProfile.objects.update_or_create(user_id=instance.id,
                                             job=job,
                                             location=location,
                                             school=school)
Reference: https://docs.djangoproject.com/en/2.0/ref/signals/#post-save
The post_save signal passes an extra parameter: created [Django-doc]. It is a boolean that specifies whether the object has been created. You can thus write it like:
@receiver(post_save, sender=User)
def imported_info_update(sender, instance, created=None, **kwargs):
    if created:
        imported_info = UserData.objects.get(email=instance.email)
        job = Jobs.objects.get(job=imported_info.department)
        location = Locations.objects.get(location=imported_info.team)
        school = Schools.objects.get(school=imported_info.school)
        UserProfile.objects.update_or_create(user_id=instance.id,
                                             job=job,
                                             location=location,
                                             school=school)
Mind however that signals can be bypassed in some cases, since there are ways to create rows without .save() ever being called.
For example, Model.objects.bulk_create(collection) does not call .save() on the items of the collection, so neither your save() logic nor the signal handlers will fire.
I'm using Django 2.0
I have created a few actions to use in the admin interface:
def clear_arbitrase(modeladmin, request, queryset):
    queryset.update(arbitrase_generated=False)


@admin.register(WallmartRecord)
class WallmartRecordAdmin(admin.ModelAdmin):
    actions = [
        clear_arbitrase
    ]
This updates the arbitrase_generated flag to False for the WallmartRecord model.
I have another model, ArbitraseRecord, in which I save arbitrase records.
I want to delete the related records from ArbitraseRecord as well when flagging arbitrase_generated to False.
I tried implementing a post_save signal in models.py:
@receiver(post_save, sender=WallmartRecord)
def post_save_wallmart_record_receiver(sender, instance, *args, **kwargs):
    if not instance.arbitrase_generated:
        # delete records from ArbitraseRecord
        arbitrase_record = ArbitraseRecord.objects.filter(wallmart_record=instance)
        if arbitrase_record is not None:
            for record in arbitrase_record:
                record.delete()
But the receiver does not seem to be called.
How can I delete records from another model when the admin action is selected?
update() operates directly at the SQL level and does not call the model's save(), so the post_save signal will not be triggered in your case. You can move the deletion into clear_arbitrase like this:
def clear_arbitrase(modeladmin, request, queryset):
    queryset.update(arbitrase_generated=False)
    ArbitraseRecord.objects.filter(wallmart_record__id__in=queryset.values_list('id', flat=True)).delete()
I have a simple model with Django Admin (v. 1.9.2), like this:
from django.contrib.auth.models import User


class Foo(models.Model):
    ...
    users = models.ManyToManyField(User)
    bar = None
I have also overloaded the save() method like this:
def save(self, *args, **kwargs):
    self.bar = 1
    async_method.delay(...)
    super(Foo, self).save(*args, **kwargs)
Here async_method is an asynchronous call to a task that will run on Celery, which takes the users field and will add some values to it.
At the same time, whenever a user is added to the ManyToManyField, I want to do an action depending on the value of the bar field. For that, I have defined a m2m_changed signal:
def process_new_users(sender, instance, **kwargs):
    if kwargs['action'] == 'post_add':
        # Do some stuff
        print(instance.bar)


m2m_changed.connect(process_new_users, sender=Foo.users.through)
And there's the problem. Although I'm changing the value of bar inside the save() method and before I call the asynchronous method, when the process_new_users() method is triggered, instance.bar is still None (initial value).
I'm not sure if this is because the save() method commits changes asynchronously, so that when process_new_users() is triggered the changes have not yet been committed and it retrieves the old value, or if I'm missing something else.
Is my assumption correct? If so, is there a way to force the values in save() to be committed synchronously so I can then call the asynchronous method?
Note: Any alternative way of achieving this is also welcome.
UPDATE 1: As per Gert's answer, I implemented a transaction.on_commit() trigger so that whenever the Foo instance is saved, I can safely call the asynchronous function afterwards. To do that I implemented this:
bar = models.BooleanField(default=False)  # bar has become a BooleanField


def call_async(self):
    async_method.delay(...)


def save(self, *args, **kwargs):
    self.bar = True
    super(Foo, self).save(*args, **kwargs)
    transaction.on_commit(lambda: self.call_async())
Unfortunately, this changes nothing. Instead of None I'm now getting False when I should be getting True in the m2m_changed signal.
You want to make sure that your database is up to date. In Django 1.9, there is a new transaction.on_commit which can trigger celery tasks.
I have a model that saves an Excursion. The user can change this excursion, but I need to know what the excursion was before they changed it, because I keep track of how many "bookings" are made per excursion, and if they change their excursion, I need to remove one booking from the previous excursion.
I'm not entirely sure how this should be done.
I'm guessing you use a signal for this?
Should I use pre_save, pre_init, or what would be the best for this?
pre_save is not the correct one, it seems, as it prints the new values, not the "old value" as I expected:
@receiver(pre_save, sender=Delegate)
def my_callback(sender, instance, *args, **kwargs):
    print(instance.excursion)
You have several options.
The first one is to override the save() method:
# In Delegate
def save(self, *args, **kwargs):
    previous_excursion = None
    if self.pk:
        previous_excursion = Delegate.objects.get(pk=self.pk).excursion
    super(Delegate, self).save(*args, **kwargs)
    if previous_excursion and self.excursion != previous_excursion:
        # change booking
The second one is binding a function to the post_save signal plus django-model-utils' field tracker:
@receiver(post_save, sender=Delegate)
def create_change_booking(sender, instance, signal, created, **kwargs):
    if not created:  # only updates can have a previous excursion
        # previous_excursion = get it from django-model-utils' field tracker
        # change booking
And another solution is in pre_save, as you were already trying:
@receiver(pre_save, sender=Delegate)
def my_callback(sender, instance, *args, **kwargs):
    if instance.pk:
        previous_excursion = Delegate.objects.get(pk=instance.pk).excursion
        if instance.excursion != previous_excursion:
            # change booking
You can use django-model-utils to track changes to Django model fields. Check this example:
pip install django-model-utils
Then you can define your model and use FieldTracker in your model:
from django.db import models
from model_utils import FieldTracker


class Post(models.Model):
    title = models.CharField(max_length=100)
    body = models.TextField()
    status = models.CharField(choices=STATUS, default=STATUS.draft, max_length=20)

    tracker = FieldTracker()
After that, in post_save you can use it like this:
@receiver(post_save, sender=Post)
def my_callback(sender, instance, *args, **kwargs):
    print(instance.title)
    print(instance.tracker.previous('title'))
    print(instance.status)
    print(instance.tracker.previous('status'))
This will help you a lot to act on status changes, because overriding the save() method is not always a good idea.
As an alternative and if you are using Django forms:
The to-be version of your instance is stored in form.instance of the Django form of your model. On save, validations are run and this new version is applied to the model and then the model is saved.
Meaning that you can check differences between the new and the old version by comparing form.instance to the current model.
This is what happens when the Django Admin's save_model method is called. (See contrib/admin/options.py)
If you can make use of Django forms, this is the most Djangothic way to go, I'd say.
This is the essence on using the Django form for handling data changes:
form = ModelForm(request.POST, request.FILES, instance=obj)
new_object = form.instance # not saved yet
# changes are stored in form.changed_data
new_saved_object = form.save()
form.changed_data will contain the changed fields which means that it is empty if there are no changes.
There's yet another option:
Django's documentation has an example showing exactly how you could do this by overriding model methods.
In short:
override Model.from_db() to add a dynamic attribute containing the original values
override the Model.save() method to compare the new values against the originals
This has the advantage that it does not require an additional database query.