Django post_save signal not working properly

In my models.py file, I have a class named OrderPayment. Whenever I create a new OrderPayment object, I also want to create a Transaction object. I'm trying to do this with the post_save signal, but it's not working. Below is how the code looks:
@receiver(post_save, sender=OrderPayment)
def orderpayment_setup(sender, instance, **kwargs):
    print(instance.order.dispatch_rider.company)
    print(instance)
    wt = Transaction.objects.create(
        wallet=Wallet.objects.get(user=instance.order.dispatch_rider.company),
        amount=instance.amount,
        description="Dispatch Order",
        status='success',
        transaction_type='deposit'
    )
    print(wt)
The orderpayment_setup function gets triggered, but the Transaction object is not created. I also see the first two print statements, but I don't see the last one.
I don't know what I'm not doing right.
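Since the first two prints show up but the last one never does, the Wallet lookup or Transaction.objects.create() is most likely raising before print(wt) is reached. A minimal way to confirm that (a debugging sketch based on the handler above, not a fix) is to wrap that part in a try/except and log the exception:

import logging

from django.db.models.signals import post_save
from django.dispatch import receiver

logger = logging.getLogger(__name__)

@receiver(post_save, sender=OrderPayment)
def orderpayment_setup(sender, instance, **kwargs):
    print(instance.order.dispatch_rider.company)
    print(instance)
    try:
        wt = Transaction.objects.create(
            wallet=Wallet.objects.get(user=instance.order.dispatch_rider.company),
            amount=instance.amount,
            description="Dispatch Order",
            status='success',
            transaction_type='deposit'
        )
        print(wt)
    except Exception:
        # Surfaces the real error, e.g. Wallet.DoesNotExist or a field/validation error
        logger.exception("Creating the Transaction failed for %s", instance)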

Related

Django's m2m_changed trigger doesn't trigger properly

I have a model that looks as follows, and I wish to trigger a method every time the user_ids field gets changed. Using the post_save signal obviously didn't do anything, as ManyToMany relationships are special in that way.
class Lease(models.Model):
    unit = models.ForeignKey(Unit, on_delete=models.CASCADE)
    user_ids = models.ManyToManyField('user.User')
Using the m2m_changed signal as follows also didn't do anything, which got me puzzled. I don't really understand what is wrong with this code, having also tried leaving the '.user_ids' out. There are no errors or anything; it just doesn't trigger when the user_ids on the Lease model are changed.
@receiver(m2m_changed, sender=Lease.user_ids)
def update_user_unit(sender, instance, **kwargs):
    print('Test')
Reading the documentation, I suppose the sender should be the intermediate model, not the ManyToMany field itself. Try this:
@receiver(m2m_changed, sender=Lease.user_ids.through)
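A fuller sketch of the corrected receiver (assuming the Lease model above; checking action is optional, but it keeps the handler from also running during the pre_add/pre_remove phases):

from django.db.models.signals import m2m_changed
from django.dispatch import receiver

@receiver(m2m_changed, sender=Lease.user_ids.through)
def update_user_unit(sender, instance, action, **kwargs):
    # Only react once the relation has actually changed
    if action in ('post_add', 'post_remove', 'post_clear'):
        print('Test', action, kwargs.get('pk_set'))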

How to trigger Django's pre_delete signal?

I have a User model and a UserImage model that contains a foreign key to a User. The foreign key is set to CASCADE delete.
Here is what the receivers look like in my models.py:
@receiver(pre_delete, sender=User)
def deleteFile(sender, instance, **kwargs):
    print("User pre_delete triggered")
    instance.thumbnail.delete()

@receiver(pre_delete, sender=UserImage)
def deleteFile(sender, instance, **kwargs):
    print("UserImage pre_delete triggered")
    instance.image.delete()
When I execute the following lines of code:
>>> User.objects.last().delete()
"UserImage pre_delete triggered"
For some reason the associated UserImage signal is being received but the actual User model's signal is not.
Am I missing something?
If you read the documentation carefully you will see that the delete() method on a model will execute purely in SQL (if possible). So the delete() method on UserImage will not be called by Django, thus the signal will not be triggered. If you want it to be triggered you could override the delete method on your User model to also call the delete() on the related object. Something like this:
class User(models.Model):
    def delete(self, using=None):
        self.userimage_set.all().delete()
        super().delete(using=using)
UPDATE:
I did not read the question correctly, so I have to update my answer. I think what is happening is that both receivers have the same name, so the first one is overwritten by the second, and only the second one is executed. I would suggest changing the function names to something distinct and seeing if that changes things.
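A sketch with the two receivers given distinct names (assuming the same User and UserImage models), so that one definition no longer shadows the other:

from django.db.models.signals import pre_delete
from django.dispatch import receiver

@receiver(pre_delete, sender=User)
def delete_user_thumbnail(sender, instance, **kwargs):
    print("User pre_delete triggered")
    instance.thumbnail.delete()

@receiver(pre_delete, sender=UserImage)
def delete_userimage_file(sender, instance, **kwargs):
    print("UserImage pre_delete triggered")
    instance.image.delete()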

Triggering post_save signal only after transaction has completed

I have written some APIs whose functions execute inside a transaction block. I am calling the save() method (after some modifications) on instances of one or several models, and also consecutively indexing some JSON-related information about those instances in Elasticsearch. I want the database to roll back if, for some reason, the save() for one of the instances or the indexing to Elasticsearch fails.
Now, the problem is that even inside the transaction block, the post_save() signals get sent, and that is an issue because some notifications are triggered from those signals.
Is there a way to trigger post_save() signals only after the transaction has completed successfully?
I think the simplest way is to use transaction.on_commit(). Here's an example using the models.Model subclass Photo that will only talk to Elasticsearch once the current transaction is over:
from django.db import transaction
from django.db.models.signals import post_save
from django.dispatch import receiver

@receiver(post_save, sender=Photo)
def save_photo(**kwargs):
    transaction.on_commit(lambda: talk_to_elasticsearch(kwargs['instance']))
Note that if the transaction.on_commit() gets executed while not in an active transaction, it will run right away.
Not really. The signals have nothing to do with the DB transaction's success or failure, but with the save method itself: before the call the pre_save signal is fired, and after the call the post_save signal is fired.
There are two approaches here:
1. Inspect the instance in the post_save handler and decide whether the model was saved successfully or not. The simplest way to do that: in the save method, after the transaction has executed successfully, annotate your instance with a flag, say instance.saved_successfully = True, which you then test in the post_save handler.
2. Ditch the post_save signal and create a custom signal for yourself, which you trigger after the transaction has run successfully (see the sketch at the end of this answer).
Makes sense?
P.S.
If you strictly need to bind to the transaction commit signal, have a look at this package: https://django-transaction-hooks.readthedocs.org/en/latest/; it looks like the functionality is integrated into Django 1.9a.
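A minimal sketch of the second approach above (the custom signal; photo_saved, create_photo and index_in_elasticsearch are illustrative names, not part of Django):

import django.dispatch

from django.db import transaction

photo_saved = django.dispatch.Signal()

def create_photo(data):
    with transaction.atomic():
        photo = Photo.objects.create(**data)
        index_in_elasticsearch(photo)
    # Only reached if the atomic block committed without raising
    photo_saved.send(sender=Photo, instance=photo)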
I was having serious issues with Django's admin not allowing post_save transactions on parent objects when they had inline children being modified.
This was my solution to an error complaining about conducting queries in the middle of an atomic block:
from django.db import transaction

def on_user_post_save_impl(user):
    do_something_to_the_user(user)

def on_user_post_save(sender, instance, **kwargs):
    if not transaction.get_connection().in_atomic_block:
        on_user_post_save_impl(instance)
    else:
        transaction.on_commit(lambda: on_user_post_save_impl(instance))
We are using this little nugget:
from django.db import transaction
from django.db.models.signals import post_save

def atomic_post_save(sender, instance, **kwargs):
    if hasattr(instance, "atomic_post_save") and transaction.get_connection().in_atomic_block:
        transaction.on_commit(lambda: instance.atomic_post_save(sender, instance=instance, **kwargs))

post_save.connect(atomic_post_save)
Then we simply define an atomic_post_save method on any model we like:
class MyModel(Model):
    def atomic_post_save(self, sender, created, **kwargs):
        talk_to_elasticsearch(self)
Two things to notice:
We only call atomic_post_save when inside a transaction.
It's too late in the flow to send messages and have them included in the current request from inside atomic_post_save.

Object is not in the database when Django post_save is running?

I'm not sure whether what this ticket is talking about is what I'm experiencing.
https://code.djangoproject.com/ticket/14051
MyClass.objects.create() calls the post_save() handlers, and inside one of the handlers, I do MyClass.objects.filter(id=instance.id) and it returns nothing.
So when you are inside a post_save signal handler, your instance is not yet found in the DB because it's not committed yet? Is this true?
This is the create part:
thread = ReviewThread.objects.create(**validated_data)
Each review thread has a review_meta (ForeignKey), and the related_name is 'review_threads':
class ReviewThread(forum_models.Thread):
    thread = models.OneToOneField(forum_models.Thread, parent_link=True)
    review_meta = models.ForeignKey(ReviewMeta, related_name='review_threads')
This is the receiving part:
def maybe_update_review_meta_primary_image(review_thread):
    ReviewThread.objects.filter(id=review_thread.id)  # returns nothing
    # ...

@receiver(post_save, sender=ReviewThread)
def update_review_meta(sender, instance, **kwargs):
    review_thread = instance
    maybe_update_review_meta_primary_image(review_thread)
If you use transactions then yes, it is true. And it is a pretty valid behaviour.
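If the lookup only needs to run once the row is actually committed, one option (a sketch in the spirit of the transaction.on_commit() answer above, reusing the ReviewThread handler from the question) is to defer it:

from django.db import transaction
from django.db.models.signals import post_save
from django.dispatch import receiver

@receiver(post_save, sender=ReviewThread)
def update_review_meta(sender, instance, **kwargs):
    # Deferred until the surrounding transaction commits, when the row is visible
    transaction.on_commit(lambda: maybe_update_review_meta_primary_image(instance))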

Django - post_save timing issue - how to implement a signal for action "database save finalized"

It seems to me as if the post_save signal is triggered not when the entire save process has been finalized in the database, but at the end of the model.save() call.
I made this observation when executing the following code:
models.py
class PurchaseOrder(models.Model):
    total_value = models.DecimalField(max_digits=12, decimal_places=2, null=True, blank=True)

def purchase_order_post_save_handler(sender, instance, **kwargs):
    project_po_value_update_thread = accounting.threads.ProjectPoValueUpdateThread(instance)
    project_po_value_update_thread.start()

post_save.connect(purchase_order_post_save_handler, sender=PurchaseOrder)
threads.py
class ProjectPoValueUpdateThread(threading.Thread):
    """
    thread to update the po.project with the latest po values
    called via the save method of the purchase order
    """
    def __init__(self, purchase_order):
        self.purchase_order = purchase_order
        threading.Thread.__init__(self)

    def run(self):
        "starts the thread to update project po values"
        try:
            project = self.purchase_order.project
            # update active po budget
            active_po_sum = PurchaseOrder.objects.filter(project=project, is_active=True).aggregate(total_value_sum=Sum('total_value'))
            active_total_value_sum = active_po_sum['total_value_sum']
            project.total_value = active_total_value_sum
            project.save()
In some cases, this code did not update the project total_value correctly, as the instance (which I had just saved) queried with PurchaseOrder.objects.filter(project=project, is_active=True) was obviously not updated. Thus it seems to me as if the thread has overtaken the instance's save method and queried an old version of the model.
I know how to overcome this specific problem (just take the latest value from the instance provided by the post_save signal), but I'm wondering how one could create a post_save signal that is triggered only once the save action has been finalized in the database.
You are correct, the post_save signal is executed at the end of the save() method. That fact is in the Django docs:
django.db.models.signals.post_save
Like pre_save, but sent at the end of the save() method.
If your intent is to use a signal in several areas, you may be better off writing your own.
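For example, a sketch of a home-grown signal that only fires once the database work is committed (purchase_order_committed is an illustrative name, reusing the PurchaseOrder model from the question):

import django.dispatch

from django.db import models, transaction

purchase_order_committed = django.dispatch.Signal()

class PurchaseOrder(models.Model):
    total_value = models.DecimalField(max_digits=12, decimal_places=2, null=True, blank=True)

    def save(self, *args, **kwargs):
        super().save(*args, **kwargs)
        # on_commit runs the callback after the surrounding transaction commits,
        # or immediately if there is no active transaction
        transaction.on_commit(
            lambda: purchase_order_committed.send(sender=PurchaseOrder, instance=self)
        )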