I'm overriding Django's model delete() method in order to delete orphan files on disk for image fields, something like this:
class Image(models.Model):
    img = models.ImageField(upload_to=get_image_path)
    ...

    def delete(self, *args, **kwargs):
        self.img.delete()
        super(Image, self).delete(*args, **kwargs)
This works fine when I delete a single object from the admin, but when I select multiple objects and delete them, this doesn't seem to get called. I have been googling for a while but haven't hit the right keywords to get the answer for this, nor does the official documentation seem to address this.
The official documentation does address it, actually:
The delete() method does a bulk delete and does not call any delete() methods on your models. It does, however, emit the pre_delete and post_delete signals for all deleted objects (including cascaded deletions).
To handle that case, you can override the delete() method on a QuerySet, and then attach that QuerySet to the model as its manager:
class ImageQuerySet(models.QuerySet):
    def delete(self, *args, **kwargs):
        for obj in self:
            obj.img.delete(save=False)  # save=False: skip re-saving rows that are about to be deleted
        return super(ImageQuerySet, self).delete(*args, **kwargs)
class Image(models.Model):
    objects = ImageQuerySet.as_manager()

    img = models.ImageField(upload_to=get_image_path)
    ...

    def delete(self, *args, **kwargs):
        self.img.delete(save=False)  # save=False: the row is deleted immediately after
        super(Image, self).delete(*args, **kwargs)
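With the manager in place, both paths now clean up the image files. A quick shell sketch (the pk is made up):

Image.objects.all().delete()      # bulk delete, goes through ImageQuerySet.delete()
Image.objects.get(pk=1).delete()  # single delete, goes through Image.delete()

The admin's multi-select delete action calls queryset.delete() under the hood, so this covers the case from the question.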
I believe this issue is addressed in the docs, where it says:
Overridden model methods are not called on bulk operations
Note that the delete() method for an object is not necessarily called when deleting objects in bulk using a QuerySet or as a result of a cascading delete. To ensure customized delete logic gets executed, you can use pre_delete and/or post_delete signals.
Unfortunately, there isn’t a workaround when creating or updating objects in bulk, since none of save(), pre_save, and post_save are called.
As suggested in the docs above, I believe a better solution is to use the post_delete signal, like so:
from django.db.models.signals import post_delete
from django.dispatch import receiver

class Image(models.Model):
    img = models.ImageField(upload_to=get_image_path)
    ...

@receiver(post_delete, sender=Image)
def delete_image_hook(sender, instance, using, **kwargs):
    instance.img.delete(save=False)  # save=False: the instance is already deleted, don't re-save it
Unlike overriding the delete method, the delete_image_hook function should be called on bulk deletes and cascading deletes as well. Here is more information on using Django's Signals: https://docs.djangoproject.com/en/1.11/topics/signals/#connecting-to-signals-sent-by-specific-senders
Note on previous answers:
Some of the earlier posts suggest overriding QuerySet.delete(), which may have performance implications and other unintended behavior. Perhaps those answers were written before Django's signals were available, but I think using signals is a cleaner approach.
The QuerySet delete() method works directly on the database; it does not call the Model.delete() method. From the docs:
Keep in mind that this will, whenever possible, be executed purely in SQL, and so the delete() methods of individual object instances will not necessarily be called during the process. If you’ve provided a custom delete() method on a model class and want to ensure that it is called, you will need to “manually” delete instances of that model (e.g., by iterating over a QuerySet and calling delete() on each object individually) rather than using the bulk delete() method of a QuerySet.
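In practice, that manual route looks something like this (a minimal sketch using the Image model from above):

# Slower than a bulk delete, but the overridden Image.delete() is guaranteed to run
for image in Image.objects.all():
    image.delete()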
If you want to override the Django admin's default behavior, you can write a custom delete action:
https://docs.djangoproject.com/en/dev/ref/contrib/admin/actions/
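A sketch of such an action (the function name and description are my own; Image is the model from earlier):

from django.contrib import admin

def delete_selected_images(modeladmin, request, queryset):
    # Call each instance's delete() so the overridden method runs
    for obj in queryset:
        obj.delete()
delete_selected_images.short_description = "Delete selected images and their files"

class ImageAdmin(admin.ModelAdmin):
    actions = [delete_selected_images]

admin.site.register(Image, ImageAdmin)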
Another approach is to hook into the post_delete (or pre_delete) signal instead of overriding the delete method:
https://docs.djangoproject.com/en/dev/ref/signals/#django.db.models.signals.post_delete
Like pre_delete, but sent at the end of a model’s delete() method and a queryset’s delete() method.
The accepted answer may not work for everyone. I couldn't get it to work on Django 3.2, but it may only be because I already had a custom manager and was not confident that I could combine my customizations to a models.Manager with customizations to a models.QuerySet.
I found that overriding delete_queryset (available in Django 2.1+) on the model's admin (as described in this thorough and fully-illustrated answer from user Kushan Gunasekera to another related SO question) was quick and easy.
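A sketch of that override, reusing the Image model from earlier (delete_queryset is the ModelAdmin hook; the file-cleanup body is my own):

from django.contrib import admin

class ImageAdmin(admin.ModelAdmin):
    def delete_queryset(self, request, queryset):
        # Remove each file from storage first, then let Django do the bulk delete
        for obj in queryset:
            obj.img.delete(save=False)
        super().delete_queryset(request, queryset)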
Related
I have a User model and a UserImage model that contains a foreign key to a User. The foreign key is set to CASCADE delete.
Here is what the receivers look like in my models.py:
from django.db.models.signals import pre_delete
from django.dispatch import receiver

@receiver(pre_delete, sender=User)
def deleteFile(sender, instance, **kwargs):
    print("User pre_delete triggered")
    instance.thumbnail.delete()

@receiver(pre_delete, sender=UserImage)
def deleteFile(sender, instance, **kwargs):
    print("UserImage pre_delete triggered")
    instance.image.delete()
When I execute the following lines of code:
>>> User.objects.last().delete()
UserImage pre_delete triggered
For some reason the associated UserImage signal is being received but the actual User model's signal is not.
Am I missing something?
If you read the documentation carefully, you will see that deleting objects is executed purely in SQL whenever possible. So the delete() method on UserImage will not be called by Django, and thus the signal will not be triggered. If you want it to be triggered, you could override the delete() method on your User model to also call delete() on the related objects. Something like this:
class User(models.Model):
    def delete(self, using=None):
        self.userimage_set.all().delete()
        super().delete(using=using)
UPDATE:
I did not read the question correctly, so I have to update my answer. I think what is happening is that both receiver functions have the same name, so the second definition overwrites the first; since signal connections hold weak references by default, the overwritten function can be garbage-collected and its connection silently dropped, leaving only the second one to execute. I would suggest changing one of the function names and seeing if that changes things.
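A minimal sketch of that fix (the receiver names below are my own; everything else is unchanged):

@receiver(pre_delete, sender=User)
def delete_user_thumbnail(sender, instance, **kwargs):
    print("User pre_delete triggered")
    instance.thumbnail.delete()

@receiver(pre_delete, sender=UserImage)
def delete_user_image(sender, instance, **kwargs):
    print("UserImage pre_delete triggered")
    instance.image.delete()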
I have 3 models: User, UserItem (the m2m through model), and Item.
A User can create an Item. This automatically creates a UserItem.
A different User can see that Item, and add it to their own list of items, creating another UserItem.
If that first User wants to delete the Item, the other User won't be happy - it needs to stay, but appear gone to the initial User. However, if only one User is still related to it, then the Item is safe to delete, and should be deleted to avoid filling the database with dead records.
This is how I think it should be handled:
Item delete call made to API from User
Item pre_delete checks whether item.user_set.count() > 1
If True, manually delete the UserItem, leave Item where it is. If False, delete the Item
This way UserItem isn't exposed via the API, and management for a client is simplified.
Is this the right/common way to go? How can it be done with Django? I'm unsure how to prevent Item.delete() from happening within pre_delete without raising an exception, but as this is expected behaviour raising an exception doesn't seem like the right way to do this.
This seems fine to me. But instead of working with signals, you could override the delete() method on the Item model. See the official documentation for an example with the save() method. Your delete() method could be implemented likewise, i.e. it won't call super() when there are still other Users with a UserItem.
From django docs: "If you want customized deletion behavior, you can override the delete() method."
I think it's what you want:
def delete(self, *args, **kwargs):
    if self.user_set.count() > 1:
        # Other users still reference this item, so keep it
        return
    super(Item, self).delete(*args, **kwargs)  # Call the "real" delete() method.
Here's what I went with. It keeps the logic in the model, but the view gives it the current user.
I thought it best to keep out of delete() because an admin user should be able to delete an Item regardless of related users, and there is no simple way to access the current user within delete().
Constructive criticism welcome!
models.py
class Item(TimeStampedModel):
    ...

    def delete_item_or_user_item(self, user):
        """
        Delete the Item if the current User is the only User related to it.
        If multiple Users are related to the Item, delete the UserItem.
        """
        if UserItem.objects.filter(item=self).count() > 1:
            UserItem.objects.filter(item=self, user=user).delete()
        else:
            self.delete()
views.py
class ItemViewSet(viewsets.ModelViewSet):
    ...

    def perform_destroy(self, instance):
        instance.delete_item_or_user_item(self.request.user)
Is there a way, hopefully without breaking admin, to disable editing existing model instances on the ORM level?
I'm not talking about removing 'Save' and 'Save and continue' buttons from templates - there should be no operations that can change the values of a committed instance of a model.
Preferably, the 'Save As' option should work instead.
Override the save() method for your model, like so:
class MyModel(models.Model):
    def save(self, *args, **kwargs):
        if self.pk is None:
            super(MyModel, self).save(*args, **kwargs)
This only calls the superclass's save() method (which actually saves the change) if there is no pk, i.e. if the model instance is new.
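A quick sketch of the resulting behavior, assuming MyModel has a name field (not shown in the snippet above):

obj = MyModel(name="first")
obj.save()           # pk is None, so this saves normally
obj.name = "second"
obj.save()           # pk is now set, so this is silently ignored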
You could override your model class's save() (do nothing if self.pk is set) and delete() (always do nothing).
But really, the database level is the safest place for that. For example, in PostgreSQL you could write two simple rules:
CREATE RULE noupd_myapp_mymodel AS ON UPDATE TO myapp_mymodel
DO NOTHING;
CREATE RULE nodel_myapp_mymodel AS ON DELETE TO myapp_mymodel
DO NOTHING;
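If you want Django itself to apply those rules, one option is a RunSQL migration. A sketch, assuming an app named myapp with an existing initial migration:

from django.db import migrations

class Migration(migrations.Migration):
    dependencies = [("myapp", "0001_initial")]

    operations = [
        migrations.RunSQL(
            "CREATE RULE noupd_myapp_mymodel AS ON UPDATE TO myapp_mymodel DO NOTHING;",
            reverse_sql="DROP RULE noupd_myapp_mymodel ON myapp_mymodel;",
        ),
        migrations.RunSQL(
            "CREATE RULE nodel_myapp_mymodel AS ON DELETE TO myapp_mymodel DO NOTHING;",
            reverse_sql="DROP RULE nodel_myapp_mymodel ON myapp_mymodel;",
        ),
    ]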
Either way, the admin wouldn't know anything about this, so everything still looks editable. See my answer to Whole model as read-only for an attempt at making a model read-only in the admin. For your purposes, keep the add permission as-is, and only declare all fields read-only when not adding.
EDIT: One reason why overriding delete() in your model class is not safe, is the fact that "bulk delete" (Queryset.delete(), e.g. admin checkboxes action) will not call the individual instances' delete() method, it will go straight to SQL: https://docs.djangoproject.com/en/dev/topics/db/queries/#deleting-objects
For those who need to prevent MyModel.objects.filter(pk=123).update(name="bob"):
class NoUpdateQuerySet(models.QuerySet):
    def update(self, *args, **kwargs):
        # Silently ignore bulk updates
        pass

class MyModel(models.Model):
    objects = NoUpdateQuerySet.as_manager()
    ...
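With that manager attached, the bulk update from above becomes a no-op:

MyModel.objects.filter(pk=123).update(name="bob")  # silently does nothing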
Django docs - link
I have a model which is overriding save() to slugify a field:
from django.utils.text import slugify

class MyModel(models.Model):
    name = models.CharField(max_length=200)
    slug = models.SlugField(max_length=200)

    def save(self, *args, **kwargs):
        self.slug = slugify(self.name)
        super(MyModel, self).save(*args, **kwargs)
When I run loaddata to load a fixture, this save() does not appear to be called, because the slug field ends up empty in the database. Am I missing something?
I can get it to work with a pre_save signal hook, but this is a bit of a hack and it would be nice to get save() working.
from django.db.models.signals import pre_save

def mymodel_pre_save(sender, **kwargs):
    instance = kwargs['instance']
    instance.slug = slugify(instance.name)

pre_save.connect(mymodel_pre_save, sender=MyModel)
Thanks in advance.
No, you're not. save() is NOT called by loaddata, by design (it's way more resource intensive, I suppose). Sorry.
EDIT: According to the docs, pre-save is not called either (even though apparently it is?).
Data is saved to the database as-is, according to https://docs.djangoproject.com/en/dev/ref/django-admin/#what-s-a-fixture
I'm doing something similar now - I need a second model to have a parallel entry for each of the first model in the fixture. The second model can be enabled/disabled, and has to retain that value across loaddata calls. Unfortunately, having a field with a default value (and leaving that field out of the fixture) doesn't seem to work - it gets reset to the default value when the fixture is loaded (The two models could have been combined otherwise).
So I'm on Django 1.4, and this is what I've found so far:
You're correct that save() is not called. There's a special DeserializedObject that does the insertion, by calling save_base() on the Model class - overriding save_base() on your model won't do anything since it's bypassed anyway.
@Dave is also correct: the current docs still say the pre-save signal is not called, but it is. It's behind a condition: if origin and not meta.auto_created
origin is the class for the model being saved, so I don't see why it would ever be falsy.
meta.auto_created has been False so far with everything I've tried, so I'm not yet sure what it's for. Looking at the Options object, it seems to have something to do with abstract models.
So yes, the pre_save signal is indeed being sent.
Further down, there's a post_save signal behind the same condition that is also being sent.
Using the post_save signal works. My models are more complex, including a ManyToMany on the "Enabled" model, but basically I'm using it like this:
from django.db import models
from django.db.models.signals import post_save

class Info(models.Model):
    name = models.TextField()

class Enabled(models.Model):
    info = models.ForeignKey(Info)  # pre-Django 2.0 style; newer versions require on_delete

def create_enabled(sender, instance, *args, **kwargs):
    if Info == sender:
        Enabled.objects.get_or_create(id=instance.id, info=instance)

post_save.connect(create_enabled)
And of course, initial_data.json only defines instances of Info.
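For reference, a minimal initial_data.json for this setup might look like the following (the app label and values are made up):

[
    {
        "model": "myapp.info",
        "pk": 1,
        "fields": {"name": "example"}
    }
]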
Sorry for the somewhat cryptic subject line.
I'd like to override the Django model save() method and call some additional code if the model instance is newly created.
Sure, I can use signals, or check whether the model has an empty pk field and, if so, set a temporary variable and call the code later:
class EmailModel(models.Model):
    email = models.EmailField()

    def save(self, *args, **kwargs):
        is_new = self.pk is None
        super(EmailModel, self).save(*args, **kwargs)
        # Create necessary objects
        if is_new:
            self.post_create()

    def post_create(self):
        # do job, send mails
        pass
But I'd like to have cleaner code and avoid the temporary variable in the save method.
So the question is: is it possible to tell whether the model instance is a newly created object right after the parent save_base() call?
I've checked the Django sources but can't find a proper way to do that.
Thanks
We have a related post.
For real - signals are the best approach in this case.
You could use the post_save signal and, in the listener, just check whether the credit_set exists for the current model instance and, if not, create one. That would be my choice - there is no need to overdo such a simple task.
Of course, if you really need to know exactly when the model was initialized (I doubt it), use the post_init signal. You don't need to override the save() method just to set some additional variables - just catch post_init, or pre_save, and change/add what you want. IMHO there is no sense in overriding save() just to check whether this is a new instance or not - that's what the signals are there for.
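Note that post_save even tells you directly whether the instance was just created, via its created argument. A minimal sketch using the EmailModel from the question (the receiver name is my own):

from django.db.models.signals import post_save
from django.dispatch import receiver

@receiver(post_save, sender=EmailModel)
def email_model_saved(sender, instance, created, **kwargs):
    if created:  # True only for the save() call that inserted the row
        instance.post_create()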