I have a model whose objects I need to edit when they are in certain states.
Using post_save signals I can identify the given state, and from there run the specified method(s).
My issue now is that in these methods I need to save the object after editing, causing an endless save loop. Is there a 'correct' way to handle this kind of situation, or are there only bad and worse kinds of workarounds?
In order to avoid this kind of loop, you need to disconnect from the signal at the beginning of your custom method, and then reconnect to it afterwards.
Inside the method connected to the signal:
from django.db.models import signals

signals.post_save.disconnect(YourModel.your_custom_method, sender=YourModel)
# Do your stuff, then save without re-triggering this handler
instance.save()
signals.post_save.connect(YourModel.your_custom_method, sender=YourModel)
Note: I'm using YourModel.your_custom_method because I tend to place the method inside the model; this is obviously not mandatory.
EDIT :
This SO question brings up the use of update() instead of save() as a cleaner way; it might be what you're looking for:
Django post_save preventing recursion without overriding model save()
Although it might be cleaner, I wouldn't think of it as a perfect solution unless it is properly managed with caching, since you'll be forced to hit the database to retrieve the queryset while you already have the instance in your hands as an argument of the signal...
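For illustration, here is a minimal sketch of that update()-based approach, assuming a hypothetical "processed" flag on YourModel; QuerySet.update() issues a single UPDATE and does not send post_save, so the handler is not re-entered:

from django.db.models.signals import post_save
from django.dispatch import receiver

@receiver(post_save, sender=YourModel)
def your_custom_method(sender, instance, **kwargs):
    # update() bypasses save() and therefore post_save, avoiding the loop,
    # at the cost of an extra query and of not refreshing the in-memory instance.
    sender.objects.filter(pk=instance.pk).update(processed=True)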
The idea of save_without_signal seems interesting but I wonder how it could be done without the above caveats.
Hope this helps.
I am new to Django and want to understand the concept of signals more deeply.
I know how they work, but I really don't understand when one should actually use them.
The docs say they're 'especially useful when many pieces of code may be interested in the same events.'
What are some real applications that use signals to their advantage?
E.g. I'm trying to do phone verification after user signup. Because it can be integrated inside a single app, and the only code interested in the event is this 'verify' function, I don't really need a signal: I can just pass the information from one view to the other, rather than using a pre_save signal from the registration.
I'm sorry if my question is kind of basic. But I really want some insight into real applications where many pieces of code are interested in one particular event, and what the trade-offs would be in my application.
Thanks!!
Signals are often used when you need to do some database-specific low-level work. For example, if you use ElasticSearch for better document search on your site, you may want to automatically update the search index when a new document is created or an old one is edited.
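A hedged sketch of that indexing pattern; the Document model and the update_search_index/remove_from_index helpers are hypothetical stand-ins for whatever ElasticSearch client code you use:

from django.db.models.signals import post_save, post_delete
from django.dispatch import receiver

from myapp.models import Document  # hypothetical model
from myapp.search import update_search_index, remove_from_index  # hypothetical helpers

@receiver(post_save, sender=Document)
def index_document(sender, instance, created, **kwargs):
    # Keep the search index in sync whenever a document is created or edited.
    update_search_index(instance)

@receiver(post_delete, sender=Document)
def unindex_document(sender, instance, **kwargs):
    remove_from_index(instance)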
You may also have some complex logic for managing database objects. For example, you may need specific logic when deleting an object: when a user is deleted, you may want to replace all links to their profile with some placeholder. Or when a new message is created or some other action is performed by a user, you may want to update the "last visited" field in the user's profile, even though there is no direct relation between that action and updating the profile.
But when you're just implementing business logic, as in your example with verification, you don't need to use signals, because you don't need any universal logic related to deleting/creating/editing any object: you have a certain object you are working with and can do the work directly.
I'm wondering what the appropriate things to put in my model's clean() method are.
Does it make sense to put all of the verification and manipulation of a model's properties needed to ensure it is valid (i.e. business logic) there? There is a lot of that in my case, and I'm wondering if it makes sense to execute it all every time a model is saved.
For example, I'm doing things like:
- if a video is marked as private, remove all its references in playlists
- ensure that the video's title is unique relative to the user's other videos
- etc.
Some of the things I'm doing only really need to be done on creation of a new video, so checking/setting them every time the model is saved also seems excessive.
Is this the correct use of the clean() method?
Clearing relationships is probably best handled by a signal. To validate your signals are working properly, you can write a unit test.
Validating that the title is unique is something that definitely belongs in a form/model validator. To me, that seems like a better separation of concerns.
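For the uniqueness check, a minimal sketch of what that validation could look like, assuming a hypothetical Video model with user and title fields (the field names are made up):

from django.conf import settings
from django.core.exceptions import ValidationError
from django.db import models

class Video(models.Model):
    user = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE)
    title = models.CharField(max_length=200)
    is_private = models.BooleanField(default=False)

    def clean(self):
        # Enforce per-user title uniqueness at validation time (forms and
        # full_clean() will call this), instead of on every save().
        clashes = Video.objects.filter(user_id=self.user_id, title=self.title).exclude(pk=self.pk)
        if clashes.exists():
            raise ValidationError({"title": "You already have a video with this title."})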
I've sort of coded myself into a corner with my data abstraction scheme, and it has resulted in my needing editorOpened(QModelIndex) and editorClosed(QModelIndex) signals in my views (almost exclusively QTableView).
The reason is that my data classes have automatic behavior that needs to be blocked/disabled during editing, then re-enabled afterwards.
At first I thought to do it with custom delegates, but ran into problems for a couple of reasons: one being that it seems a bit excessive to use a custom delegate to provide the same behavior for every single item; in other words, it seems like it ought to be done for all items by the view itself. The second being that the delegate methods seem to be const, which prevents me from setting an internal handle to the data object within the delegate.
Looking at the view methods, I found QAbstractItemView::edit and QAbstractItemView::closeEditor which would be perfect candidates for re-implementing with opened() and closed() signals, however I need an index/handle to the specific item being edited, which I don't think I can obtain from within those methods...
At this point I have no idea what else I could do. I'd appreciate any tips or pointers in the right direction! Thanks for reading
I solved it on my own...
I found QAbstractItemDelegate::editorEvent, which is non-const (I must not have been looking as closely as I thought when reading the docs before).
I was able to set an internal handle in the custom delegate within this method, which allowed me to simply create a slot to do what I needed upon closing, and connect the closeEditor(QWidget*,QAbstractItemDelegate::EndEditHint) signal to it.
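A rough PyQt5 sketch of that idea (the original is presumably C++; the data-object lookup via Qt.UserRole and the enable/disable methods are hypothetical):

from PyQt5.QtCore import Qt
from PyQt5.QtWidgets import QStyledItemDelegate

class EditAwareDelegate(QStyledItemDelegate):
    def __init__(self, parent=None):
        super().__init__(parent)
        self._edited_index = None
        # closeEditor(QWidget*, EndEditHint) fires when the editor widget is done.
        self.closeEditor.connect(self._on_editor_closed)

    def editorEvent(self, event, model, option, index):
        # Non-const hook: remember which item is being interacted with and
        # disable the data object's automatic behavior (hypothetical API).
        self._edited_index = index
        obj = index.data(Qt.UserRole)
        if obj is not None:
            obj.set_auto_behavior_enabled(False)
        return super().editorEvent(event, model, option, index)

    def _on_editor_closed(self, editor, hint):
        if self._edited_index is not None:
            obj = self._edited_index.data(Qt.UserRole)
            if obj is not None:
                obj.set_auto_behavior_enabled(True)
            self._edited_index = None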
I am curious what others think about this problem...
I have been going back and forth in the past few days about using QuerySet.update() versus ModelInstance.save(). Obviously if there are lots of fields being changed, I'd use save(), but for updating a couple of fields, I think it's better to use QuerySet.update(). The benefit of using QuerySet.update() is that you can have multiple threads running update() at the same time, on different fields of the same object, and you won't have race issues. The default save() method saves all the fields, so parallel save() from two threads will be problematic.
So then the issue is what happens if you have overloaded, custom save() methods. The best I can think of is to abstract whatever is in the custom save() method into separate updater methods that actually use QuerySet.update() to set a couple of fields on the model. Has anyone used this pattern?
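For illustration, a hedged sketch of that updater-method pattern on a hypothetical model (the field names are made up):

from django.db import models
from django.utils import timezone

class Article(models.Model):
    title = models.CharField(max_length=200)
    view_count = models.PositiveIntegerField(default=0)
    last_viewed = models.DateTimeField(null=True, blank=True)

    def record_view(self):
        # Touch only these two fields with a single UPDATE; any custom save()
        # logic and all other fields are left alone, so concurrent writers to
        # different fields don't clobber each other.
        type(self).objects.filter(pk=self.pk).update(
            view_count=models.F("view_count") + 1,
            last_viewed=timezone.now(),
        )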
What's a bit irritating is that in the Django admin, even when editing just one field in change-list mode, the entire model is saved. This basically means that if someone has a change list open in their browser while, somewhere else in the system, a field gets updated, that updated value will be thrown away when this user saves their changes from the change list. Is there a solution to this problem?
Thoughts?
Thanks.
The main reason for using QuerySet.update() is that you can update more than one object with just one database query, while every call to an object's save method will hit the database!
Another point worth mentioning is that Django's pre_save and post_save signals are only sent when you call an object's save() method, not upon QuerySet.update().
Coming to the conflict issues you describe, I think it would also be irritating to hit 'save' and then discover afterwards that some values are as you changed them, but some that you left unchanged have changed. Of course it's up to you to modify the admin's save_model or the object's save method to get the behaviour you suggest.
The problem you described with the Django admin is essentially about updating a model instance from an outdated copy. It is easy to fix by adding a version number to each model instance and incrementing it on each update. Then, in the save method of the model, just make sure that what you are saving is not behind what is already in the database.
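A minimal sketch of that version-number check, assuming a hypothetical abstract base model; select_for_update() is used so the comparison and the write happen under a row lock:

from django.db import models, transaction

class VersionedModel(models.Model):
    version = models.PositiveIntegerField(default=0)

    class Meta:
        abstract = True

    def save(self, *args, **kwargs):
        # Refuse to save over a copy that is newer than the one we loaded.
        if self.pk is not None:
            with transaction.atomic():
                stored = (type(self).objects
                          .select_for_update()
                          .values_list("version", flat=True)
                          .get(pk=self.pk))
                if stored != self.version:
                    raise RuntimeError("Stale save: the object changed in the database.")
                self.version += 1
                return super().save(*args, **kwargs)
        return super().save(*args, **kwargs)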
I want to make sure that when there are parallel writes to the same object, each updating a different field, they don't overwrite each other's values.
Depending on the application, this may or may not be a sensible thing. Saving a whole model, even if only a single field was edited, can often avoid breaking the integrity of the data. Think about a travel itinerary for a three-leg flight. Assume there is an instance with three fields representing the three legs, currently SF->LA, LA->DC, DC->NY. Now if one update changes the first two legs to SF->SD, SD->DC, and another update changes the last two legs to LA->SJ, SJ->NY, and you allow both to happen with update() instead of saving the full model instance, you end up with the broken itinerary SF->SD, LA->SJ, SJ->NY.
I am working on a Django project and I want to send a signal when something gets added to some model's related set. E.g. we have an owner who has a set of collectables, and each time the method owner.collectable_set.add(something) is called, I want a signal like collectable_added or something. Signals are clear to me, but I don't know which manager(?) contains the "add" method that I want to override.
Edit, for Xavier's request to provide more details: you can easily override a model's save method by simply defining it and calling the "super-save", so the object gets properly saved with some extra functionality. But I wonder where to override a related set's add method.
Gosh, I think I haven't brought in any further details, but I think it should be clear what I want to do even from the first paragraph.
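For contrast, the save-override pattern mentioned above looks roughly like this (Owner and its field are hypothetical):

from django.db import models

class Owner(models.Model):
    name = models.CharField(max_length=100)  # hypothetical field

    def save(self, *args, **kwargs):
        # extra functionality before/after the normal save goes here
        super().save(*args, **kwargs)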
Edit 2: This is the method I want to override. Is it recommended to do so, or do you suggest another way to place the sending of the signal?
This is the solution I found: the m2m_changed signal. It took me quite some searching and reading. Furthermore, I found out that it is not trivial to extend the ManyRelatedManager class, which would have been the other option. But with the m2m_changed signal I can rely on built-in functionality, which is the preferred way most of the time.
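A hedged sketch of that m2m_changed hookup, assuming the many-to-many field is declared on Collectable as owners = models.ManyToManyField(Owner), so that owner.collectable_set.add(...) is the reverse side (adjust the sender/through model to your actual layout):

from django.db.models.signals import m2m_changed
from django.dispatch import receiver

from myapp.models import Collectable  # hypothetical app/model layout

@receiver(m2m_changed, sender=Collectable.owners.through)
def collectable_set_changed(sender, instance, action, reverse, pk_set, **kwargs):
    if action == "post_add":
        # For owner.collectable_set.add(obj): instance is the owner, reverse is
        # True, and pk_set holds the primary keys of the added collectables.
        # React here, e.g. send your own collectable_added signal.
        ...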
I think you're looking for the RelatedManager class.
After much searching (thanks to Paul's hint), I came across this snippet, which helped explain how to use m2m_changed to intercept, not override, the add method on the ManyRelatedManager. It appears that the manager on a many-to-many relationship is created on the fly, so it's not trivial to override the method.