Can I detect ManyToManyField mutations before they are applied?

I have a Django app that broadcasts changes of some of its models to its clients, to keep them up-to-date.
I have a separate app for that, that binds to the post_save signal of these models and triggers the broadcast.
Problems have arisen with models that have ManyToManyFields. I am not very familiar with Django's handling of these, but I understand that such fields are actually updated after the model is saved.
I was able to use the m2m_changed signal to react to the post_* actions and dispatch the broadcast once the object is fully up-to-date. However, in the post_save handler, I still need to detect that the m2m field is going to be mutated, to avoid broadcasting incomplete data. How can I detect this, either in the post_save signal handler or in the model's save method? Is there a way to raise a flag on an object when an m2m field is about to be mutated?
Here's what I've tried:
Handling the pre_* actions of the m2m_changed signal to detect an incoming mutation of the field. That does not work, because the signal gets fired after post_save, which is too late (the data has already been broadcast).
Storing the initial values when the model instance is created, and comparing against them in the overridden save method to look for changes. This does not work because, at save time, the m2m fields have not changed yet and still compare equal to the initial values; plus, I read in other questions that this practice can cause race conditions.
Thanks a lot for the help.

I finally sorted this out. For reference:
TL;DR: The model layer is not the right place to do this; the view layer is.
I re-implemented the notification system at the view level instead of in the model layer: I made a mixin that adds notification features to my views (a rough sketch follows below).
Thanks to this solution:
I got rid of signals
I got much better control over what’s going on, which simplified the notification handler code quite a bit.
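For reference, here is a minimal sketch of the view-layer approach, assuming a ModelForm-based update view; the mixin name, the broadcast() helper, and the view name are made up for illustration and are not the exact code I use:

from django.views.generic.edit import UpdateView

def broadcast(obj):
    # Placeholder for the app's actual notification/broadcast logic.
    pass

class NotifyOnSaveMixin:
    def form_valid(self, form):
        # ModelFormMixin.form_valid() calls form.save(), which saves the
        # instance and its m2m fields before returning, so the object is
        # complete by the time we broadcast it.
        response = super().form_valid(form)
        broadcast(self.object)
        return response

# Usage: class ArticleUpdateView(NotifyOnSaveMixin, UpdateView): ...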


How to not trigger `m2m_changed` signal when updating many to many relationship in Django?

The Use Case
In my case, I have two signal handlers listening on two m2m fields, each of which lives in a different model. The problem is that when one signal is triggered, its handler triggers the other signal and vice versa, resulting in a recursive loop that never ends. I need a convenient way to run one handler without triggering the second signal.
Understand More
If you are curious how this case can happen: I have two models that I need to keep mutually synced; if I update the m2m field on one model, I need those changes to be reflected on the corresponding m2m field of the other model, and vice versa.
The solution is to use set() instead of add(), remove(), or clear(), because those methods trigger the m2m_changed signal multiple times.
This is not mentioned in the documentation for set(), nor in the part covering m2m_changed.
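To illustrate with hypothetical names (an object whose tags attribute is a ManyToManyField manager): every incremental add()/remove() call fires its own round of m2m_changed, while a single set() call diffs against the current state and dispatches at most one add and one remove action:

def retag_incrementally(article, new_tags, old_tag):
    # Each of these calls fires its own round of m2m_changed signals.
    for tag in new_tags:
        article.tags.add(tag)
    article.tags.remove(old_tag)

def retag_with_set(article, new_tags):
    # One set() call: Django diffs against the current relation and emits
    # at most one add and one remove action.
    article.tags.set(new_tags)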

To use signals or override model save method?

Simple use case:
After a user updates a record, I want to get the changed fields and save them in a history table. I'm using django-dirtyfields to grab this history. So my thought process was to use the pre_save signal to grab all the 'dirty' fields and then store them in my history table.
The problem there is that I can't get request.user while using signals, and I need it to see which user made the change to the record. My other thought was to override the save method of my model, but I can't get request.user from a model directly either. I would have to pass the user from the view as a keyword argument to save(). This is fine, but I'm going to be making save calls from a bunch of different places in the code, and I don't want to keep passing request.user every time I edit an object. This is why I'd love to have one spot, like a signal, to handle all of this. Perhaps some middleware I'm not familiar with?
Is there a better way to achieve such a thing?
You cannot access the user object from a signal.
You can consider using a third-party package such as django-requestprovider to access the request object in the signal.
The other way would be to override the model's save method.
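For completeness, a rough sketch of the save()-override route, under the assumption that you use django-dirtyfields; the Record and History models, their fields, and the user keyword argument are all made up for illustration:

from dirtyfields import DirtyFieldsMixin
from django.conf import settings
from django.db import models

class History(models.Model):
    # Hypothetical history table.
    model_name = models.CharField(max_length=100)
    object_pk = models.CharField(max_length=64)
    user = models.ForeignKey(settings.AUTH_USER_MODEL, null=True,
                             on_delete=models.SET_NULL)
    changes = models.JSONField()

class Record(DirtyFieldsMixin, models.Model):
    title = models.CharField(max_length=200)

    def save(self, *args, user=None, **kwargs):
        # get_dirty_fields() comes from django-dirtyfields and returns the
        # fields whose values differ from what was loaded from the database.
        changed = self.get_dirty_fields()
        super().save(*args, **kwargs)
        if changed:
            History.objects.create(
                model_name="Record",
                object_pk=str(self.pk),
                user=user,
                changes=changed,
            )

# In a view: record.save(user=request.user)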

What can you do with Ember Data Models when in the error state?

I'm struggling to understand the workflow that would be used in the following scenario:
A user creates a model, let's call it Product. We present them with a form to fill in. The save fails for some reason other than validations (timeout, access denied, etc.). In Ember, this puts the model into an error state. From a UI perspective, all I want to do is put a message on the screen (easy) and allow the user to try again (apparently not so easy).
I've seen it written many times not to reuse a transaction, and I understand the logic of that. In the case of a new Product, I simply create another new Product, merge in the data from the original product (attributes, relationships), and replace the content of my controller with the new Product. This wasn't hard and appears to work nicely, although there may be (hopefully) a better way.
However, when I'm editing a Product, I have run into a serious issue and the above solution does not work. The Product model is now in the error state and I cannot find any way to get a copy of this Product that isn't also in the same state.
What I can't figure out is what I can do with this model once it hits the error state. I have tried the following:
Rollback: This doesn't work. You can't roll back a transaction in the error state.
Reload: Same as above. Not allowed to reload a record in the error state.
Grab a new copy of the record: So I try App.Product.find(id) with the same id as the existing record. It just gives me a copy of the existing record, in the error state.
I'm hoping I'm missing something fairly basic here. Is it possible to roll a record nicely out of an error state (or invalid state for that matter)?
If there is a simple way to change the state of these models, should we still be creating a new transaction for further attempts to commit?
So after a few days of reading source and experimenting, I have come to the conclusion that this is functionality that is not yet implemented. To move a record into another state, you are supposed to send an event to it, which it passes on to its stateManager. There appear to be no events registered on the error state that allow us to recover the record.
There is an ugly workaround: I can call transitionTo on the record's stateManager and force it into the state we want. I did not decide to do this lightly, but at this point I must continue with the project while I wait for ember-data to evolve. So if the record is as yet unsaved, we can rescue it from an invalid or error state by calling:
model.get('stateManager').transitionTo('loaded.created.uncommitted')
or for an existing record:
model.get('stateManager').transitionTo('loaded.updated')
Once this has been called, you can attempt to call commit again on the transaction that the model resides in. This will be the default transaction, as the behaviour of ember-data is to move a model into the default transaction once commit has been called on its original transaction. You can always retrieve the current transaction for a model by calling model.get('transaction').
So at the end of this, I have a way to create the typical CRUD scenario that we might see in Ruby on Rails, but I don't believe this is the ideal way to do it. I do believe, however, that at this point in time, neither does the ember-data team.
For those of you interested in having CRUD functionality as controller and route mixins for Ember, I have a Gist that contains the code I am currently using. It is working fine and recovers from save errors as well as validation errors. Hopefully I can continue to refine this as ember-data evolves.
With the addition of DS.Errors in 1.0.0-beta5 (see https://github.com/emberjs/data/commit/994f3b234ef899b79138ddece60d8b214c5449e3) you should be able to call:
record.get("errors").clear();
This will clear out the previous errors and trigger becameValid.
You can trigger a becameValid event on it:
record.send("becameValid");
This should transition the model to the uncommitted state.
You could try creating a parallel representation of the model as an Ember.Object that is not persisted but has the same properties as your persisted model. If your AJAX call bounces back in an error state, you can use the error callback provided by the ajax method to do something.
In this case, the "something" might be to delete the record, then clone the properties from your dummy object into a new record and re-save it. On a success callback, simply destroy your temp object, and if all records are clean, clear your temp objects (to prevent them lingering).
This could also be insane... but it strikes me as an option.

Django AuditTrail Update

I want to audit one of my models with AuditTrail. But in some places in the project I have code like this:
q = Transaction.objects.filter(id = self.id).update(summa = self.summa)
Will AuditTrail work with this code, or does it only work on model.save()?
A late answer, but better late than never, for completeness:
As can be seen in the AuditTrail source code, AuditTrail is connected to the model through the post_save signal:
models.signals.post_save.connect(_audit, sender=cls, weak=False)
(so it does not rely on database triggers that would do automatic inserts, etc.).
Also, as can be seen in #isbadawi's answer here:
django post_save signals on update
the post_save signal is fired only on save(). So the answer is that AuditTrail will not work in your case (a queryset update) :)
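If you need AuditTrail (or any other post_save listener) to see such a change, one possible workaround is to load the instance(s) and call save() instead of using the bulk update(), since update() runs as a single SQL statement and bypasses save() and its signals. A rough sketch using the names from the question:

# Instead of:
#     Transaction.objects.filter(id=self.id).update(summa=self.summa)
# fetch and save each matching instance so pre_save/post_save (and hence
# the audit hook) are fired:
for transaction in Transaction.objects.filter(id=self.id):
    transaction.summa = self.summa
    transaction.save()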

Advice on django form change emailing

Goal: On submission of a form to add/update/delete an object, an email is sent out in HTML with both the current contents of the DB object and the new contents.
Example
Object Title was oldTitle and has been changed to newTitle
Object Date was oldDate and has been changed to newDate
My assumption is that this can be done in two different ways: directly through send_mail, or via signals. My gut leans towards using signals, to make sure I can grab both the previous content and the new content, but I'm not quite sure that's the right way to go. Any advice you can give would be much appreciated; I couldn't find very much on this subject online.
Queue up the message to be sent in the pre_save signal, and send (or reap) it in the post_save signal.
Try overriding save() in your model objects, and grabbing the fields in question before calling super().
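A brief sketch of that save()-override idea; the Item model, its fields, and the email addresses are illustrative only:

from django.core.mail import send_mail
from django.db import models

class Item(models.Model):
    title = models.CharField(max_length=200)
    date = models.DateField()

    def save(self, *args, **kwargs):
        old = None
        if self.pk:
            # Grab the current DB contents before they are overwritten.
            old = type(self).objects.filter(pk=self.pk).first()
        super().save(*args, **kwargs)
        if old is not None and old.title != self.title:
            send_mail(
                subject="Object changed",
                message=f"Object Title was {old.title} and has been changed to {self.title}",
                from_email="noreply@example.com",
                recipient_list=["admin@example.com"],
            )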
Here's a database email queue I wrote, which may help you with the actual sending of the mail:
http://gist.github.com/629663