Django AuditTrail Update

I want to audit one of my models with AuditTrail, but in some places in the project I have code like this:
q = Transaction.objects.filter(id=self.id).update(summa=self.summa)
Will AuditTrail work with this code, or does it work only on model.save()?

A little late, but better late than never, for completeness:
As can be seen in the AuditTrail source code, AuditTrail is connected to the model through the post_save signal:
models.signals.post_save.connect(_audit, sender=cls, weak=False)
(so there is no relation to database triggers that would do automatic inserts etc.).
Also, as can be seen in @isbadawi's answer to
django post_save signals on update
the post_save signal is fired only on save(). So the answer is that AuditTrail will not work in your case (a queryset update) :)

Related

Django Rest Framework call custom function when model is created

Suppose I want to do some extra stuff whenever a model is created in my Django app:
models.py:
class MyModel(models.Model):
    name = models.CharField(max_length=255)

    def do_some_extra_stuff(self):
        # whatever...
        return "I did some extra stuff"
Suppose the extra stuff is some long-running asynchronous task.
I am using Django Rest Framework and have a standard ModelSerializer and ViewSet.
My question is where is the best place to put the call to do_some_extra_stuff ? I can think of the following options:
In the serializer.create method
In the viewset.create method
In the model.save method
In a post_save signal
I am leaning towards #1 or #2 because I only want this function to be called when normal users interact with the API - not when using the Django Admin or the shell.
This question will probably be closed because it's opinion-based, so here are my opinions on it!
Background Tasks
Use Celery or Django-RQ.
Celery is "better", but Django-RQ (and RQ itself) is almost always enough if you have good logging and error reporting. These aren't usually bank account transfers we're dealing with.
#2 - Viewset Based
I believe serializers should only serialize and validate. Because of this, any custom save logic belongs in the ViewSet - up front, visible at the point of use, and with access to everything without any serializer-context tricks.
class MyViewSet(ModelViewSet):
    def perform_create(self, serializer):
        # pass a value into save(), which forwards it into model creation
        serializer.save(org_id=self.request.user.secret_org_id)
        # use the serializer instance that was created to enqueue a bg task
        my_background_task.delay(item_pk=serializer.instance.pk)
This is just a preference. It's perfectly fine to put it in the serializer if that is how you normally do other tasks. The only goal is consistency.
This may make more sense if you imagine using 1 serializer for each action - list, create, and update.
Other Methods
#3 In the model.save method
The model.save() method is not called in some cases (bulk operations and queryset updates, for example), and it's not the place I expect to see logic that runs other tasks. Some model field cleanups work well there. Again, this is a preference, but using this method with DRF seems unusual, as opposed to pure Django with the admin site.
#4 In a post_save signal
I feel post_save signals are too far away from the model, and they are also not called in some cases (mostly the same ones as #3). I would rather be explicit with things like this, so I keep the logic close to where I expect to see it happen. We use post_save signals for other things, but they do suffer from not being able to see the request context, and it's difficult to pass additional parameters into them.

Can I detect ManyToManyField mutations before they are applied?

I have a Django app that broadcasts changes of some of its models to its clients, to keep them up-to-date.
I have a separate app for that, that binds to the post_save signal of these models and triggers the broadcast.
Problems have come with models that have ManyToManyFields. I am not very familiar with Django's handling of these, but I understand that such fields are actually updated after the model is saved.
I was able to use the m2m_changed signal to react to the post_* actions and dispatch the broadcast once the object is fully up to date. However, in the post_save handler, I still need to detect that the m2m field is about to be mutated, to avoid broadcasting incomplete data. How can I detect this, either in the post_save signal handler or in the model's save method? Is there a way to raise a flag on an object when an m2m field is about to be mutated?
Here's what I've tried:
Handling the pre_* actions of the m2m_changed signal to detect an incoming mutation of the field. This does not work, because the signal is fired after post_save, which is too late (the data has already been broadcast).
Storing the initial values when the model instance is created and comparing them in the overridden save method to look for changes. This does not work because the m2m fields have not been changed yet and still compare equal to the initial values; plus, I read in other questions that this practice can cause race conditions.
Thanks a lot for the help.
I finally sorted this out. For reference:
TL;DR: the model layer is not the right place to do this; the view layer is.
I re-implemented the notification system at the view level instead of doing it in the model layer. I made a mixin that adds notification features to my views.
Thanks to this solution:
I got rid of the signals
I got much better control over what’s going on, which simplified the notification handler code quite a bit.
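For the next visitor, here is a rough sketch of what such a mixin might look like (all names are assumptions, since the original code isn't shown). It hooks DRF's perform_* methods, which run only after serializer.save() has written both the regular and the many-to-many fields, so the broadcast always sees complete data:

```python
class NotifyOnWriteMixin:
    """Broadcast a change after the write (including m2m fields) completes.

    Intended to be mixed into a DRF generic view; broadcast_change is a
    placeholder for whatever transport (websocket, message queue) you use.
    """

    def broadcast_change(self, instance, action):
        raise NotImplementedError  # supply your transport here

    def perform_create(self, serializer):
        super().perform_create(serializer)  # saves, including m2m fields
        self.broadcast_change(serializer.instance, "created")

    def perform_update(self, serializer):
        super().perform_update(serializer)
        self.broadcast_change(serializer.instance, "updated")

    def perform_destroy(self, instance):
        self.broadcast_change(instance, "deleted")  # notify before the row is gone
        super().perform_destroy(instance)
```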

Adding cache.clear() to global save in Django

Hopefully somebody knows a simple way to do this.
Is it possible to run the cache.clear() function on every save to any model?
I know the Django docs show how to implement this per model, and overhead isn't an issue. I would like to clear the cache each and every time a change is made.
There are a couple of ways you could do this.
Firstly, you could create a common abstract base class that inherits from models.Model and overrides save() to clear the cache, then make all your models inherit from that.
Another way would be to use signals - the pre_save and post_save signals take sender as an optional argument; if you don't specify a particular model, the signal fires for all models.
@Daniel Roseman: sorry, as I don't have much reputation I couldn't remove my comment.
Thanks for pointing me in the right direction.
It was as simple as:
from django.db.models.signals import post_save
from django.core.cache import cache
from django.dispatch import receiver

@receiver(post_save)
def clear_the_cache(**kwargs):
    cache.clear()

Does django-reversion require me to re-define all of my model pre_save & post_save signals?

I recently added django-reversion to my project to keep a history of model changes. Now, any time I save a model in my system, I get the following warning in the log:
.../site-packages/reversion/models.py:218: UserWarning: pre_save
and post_save signals will not longer be sent for Revision and Version
models in django-reversion 1.8. Please use the pre_revision_commit and
post_revision_commit signals instead.
However, I'm not using the pre_save and post_save signals on the Revision and Version models themselves, or on anything that extends those… Is this warning saying that I need to change the pre_save and post_save signals on every model in my project that makes use of versioning?
No, the deprecated signals are not for every model; they are just for the Revision and Version models. You can continue to use the pre_save and post_save signals on your own models.
http://django-reversion.readthedocs.org/en/latest/signals.html?highlight=post_save
I'm sure you've figured it out in the intervening 10 months, but I thought I'd clarify for the next visitor.

Django Haystack RealTimeSearchIndex on ManyToMany Relationships strange behaviour

Following a related (as yet unanswered) question, I did some investigation and found that the current implementation of Django Haystack's RealTimeSearchIndex makes no attempt to also update on related field (Many to Many) changes. I thought this would be an easy fix - after all, I could just extend RealTimeSearchIndex like this:
class RealTimeM2MSearchIndex(RealTimeSearchIndex):
    def _setup_save(self, model):
        signals.m2m_changed.connect(self.update_object, sender=model)
        signals.post_save.connect(self.update_object, sender=model)
But then I realized (or at least assumed, since it's not working) that this only works if the M2M field is defined on the model itself, and not if it's the "reverse" side of the M2M relationship. Trying to fix that, I then did something like the following:
signals.m2m_changed.connect(self.update_object, sender=model.related_field.through)
Where related_field is the reverse accessor named after the model on the other side of the ManyToMany definition. Strangely enough, upon running, Django then complains that the model has no such field, related_field.
And sure enough, if I inspect the object, Django has not yet added the related_field attribute to the model. If I inspect the same object while rendering a view, however, it does have that related_field attribute.
Summary
So the problem seems to be that Django's automatic behavior to add an attribute to the reverse side of an M2M relationship has yet to happen when Haystack runs its code. How can I overcome this obstacle, and allow Haystack's RealTimeSearchIndex to also update on related field changes?
I think the simplest solution is to just use the built-in RealTimeSearchIndex and add a signal listener in your models.py to reindex the model on m2m_changed, or whenever appropriate. See my answer to the other question - you could easily modify it to index on m2m_changed instead of post_save.
Just tried implementing this myself, and your problem is the value of the sender argument in this line:
signals.m2m_changed.connect(self.update_object, sender=model)
I read the documentation for the m2m_changed signal, and the sender will be something like MyModel.my_field.through, so you need to use that. This means you can't have a generic class as you are trying to do; you will need to define the _setup_save method in each case, connecting the m2m_changed signal for each ManyToManyField the model has.
For example, if your model had two ManyToManyFields called region and sector, you could do:
# we implement these to force the update when the ManyToMany fields change
def _setup_save(self, model):
    signals.m2m_changed.connect(self.update_object,
                                sender=MyModel.sector.through)
    signals.m2m_changed.connect(self.update_object,
                                sender=MyModel.region.through)
    signals.post_save.connect(self.update_object, sender=model)
You should also really define the _teardown_save() method:
def _teardown_save(self, model):
    signals.m2m_changed.disconnect(self.update_object,
                                   sender=MyModel.sector.through)
    signals.m2m_changed.disconnect(self.update_object,
                                   sender=MyModel.region.through)
    signals.post_save.disconnect(self.update_object, sender=model)
(This is based on code I have tested; it appears to work - certainly no errors about fields not existing.)
Update: Just read your question more closely. Is it possible that your model has the ManyToManyField added dynamically? Is the call to register() after you have defined all your classes?