In Django, how do I avoid/skip/undo a pre-save event?

I am using django_extensions TimeStampedModel, which provides a modified field that sets itself via a pre_save event. Which is great, except I am converting an old schema and want to preserve the original modified datestamp. How can I monkeypatch, avoid, cancel, or replace the pre_save'd modified with another value?

In the end, I just did an end-around:
from django.db import connection

cursor = connection.cursor()
# Write the timestamp straight to the table, bypassing the ORM (and with it
# the pre_save machinery that would overwrite "modified").
cursor.execute("update %s set modified='%s' where id=%s" % (
    my_model._meta.db_table, desired_modified_date, my_model.id))
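For what it's worth, an alternative that stays inside the ORM: assuming the modified column is only auto-populated on Model.save() (auto_now-style behaviour), a QuerySet.update() should achieve the same thing, since update() issues a direct SQL UPDATE and skips save(), the field's pre_save() hook, and the pre_save/post_save signals:
# Sketch: type(my_model) is the model class of the instance above.
type(my_model).objects.filter(pk=my_model.pk).update(
    modified=desired_modified_date
)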

You cannot. Not in the sense you're asking.
What you can do is create a shadow attribute and populate it in clean(), then restore it from a pre_save receiver of your own:
from django.db.models.signals import pre_save
from django.dispatch import receiver


class MyModel(models.Model):
    def clean(self):
        self._modified = self.modified
        ...


@receiver(pre_save, sender=MyModel)
def receiver_(sender, instance, **kwargs):
    instance.modified = instance._modified
So you're backing up the field value and putting it back later. Note: make sure your application (and therefore this receiver) is loaded after django-extensions, so it runs later.
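One way to control when that receiver actually gets connected (the "loaded later" note above) is to connect it from your app's AppConfig.ready(); app configs are processed in INSTALLED_APPS order, so listing your app after django_extensions keeps your receiver registered later. A minimal sketch, with a hypothetical app name:
# myapp/apps.py  (hypothetical app; adjust the names to your project)
from django.apps import AppConfig


class MyAppConfig(AppConfig):
    name = 'myapp'

    def ready(self):
        # Importing the module that holds the @receiver(...) functions
        # connects them once the app registry is fully populated.
        from . import signals  # noqa: F401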

Related

Circumventing F expression problem of django-simple-history by overriding save

django-simple-history inserts a new record on each save of the target model. The docs describe a problem with F expressions. I am trying to circumvent this with an overridden save method:
def save(self, *args, **kwargs):
    super().save(*args, **kwargs)
    # some other actions
    self.refresh_from_db()
But it seems that this is not working. Is the post_save signal of the base model called directly after the super().save() call? If so, is there a way to solve this problem while keeping the F expression in the target model update?
UPDATE: Saved instance has one of its attributes defined using an F expression, so this code is called in some other module:
instance.some_attribute = (F('some_attribute') + 15)
instance.save(update_fields=['some_attribute'])
This throws an error in django-simple-history's post_save signal, when it tries to insert an extended copy of the instance into the history table. I tried to refresh the instance in the overridden save method to get rid of the F expression in some_attribute, so that the actual value is loaded. From the traceback it seems that post_save is called right after the super().save() call, before the refresh. Is that how Django's post_save works with an overridden save? If so, is there a way to keep the update code unchanged (leave the update with the F expression) and still fix the history insert in the model's save?
django-simple-history provides signals for before and after the historical record is created: https://django-simple-history.readthedocs.io/en/2.7.0/signals.html
I suggest using these to update the instance before it gets saved to the historical table. Something like this should work:
from django.dispatch import receiver
from simple_history.signals import (
    pre_create_historical_record,
    post_create_historical_record
)


@receiver(pre_create_historical_record)
def pre_create_historical_record_callback(sender, **kwargs):
    instance = kwargs["instance"]
    history_instance = kwargs["history_instance"]
    if isinstance(instance, ModelYouWantToRefresh):
        instance.refresh_from_db()
        history_instance.some_attribute = instance.some_attribute
Based on Ross Mechanic's answer, I made a universal solution:
from django import dispatch
from django.db.models import expressions
from simple_history import signals


@dispatch.receiver(
    signals.pre_create_historical_record, dispatch_uid="simple_history_refresh"
)
def remove_f_expressions(sender, instance, history_instance, **kwargs) -> None:  # noqa
    f_expression_fields = []
    for field in history_instance._meta.fields:  # noqa
        field_value = getattr(history_instance, field.name)
        if isinstance(field_value, expressions.BaseExpression):
            f_expression_fields.append(field.name)
    if f_expression_fields:
        instance.refresh_from_db()
        for field_name in f_expression_fields:
            field_value = getattr(instance, field_name)
            setattr(history_instance, field_name, field_value)

Check previous model value on save

I have a model that saves an Excursion. The user can change this excursion, but I need to know what the excursion was before they changed it, because I keep track of how many "bookings" are made per excursion, and if you change your excursion, I need to remove one booking from the previous excursion.
I'm not entirely sure how this should be done.
I'm guessing you use a signal for this?
Should I use pre_save, pre_init, or what would be the best for this?
pre_save does not seem to be the correct one, as it prints the new value, not the "old value" I expected:
@receiver(pre_save, sender=Delegate)
def my_callback(sender, instance, *args, **kwargs):
    print(instance.excursion)
You have several options.
The first one is to override the save method:
# Delegate
def save(self, *args, **kwargs):
    previous_excursion = None
    if self.pk:
        previous_excursion = Delegate.objects.get(pk=self.pk).excursion
    super(Delegate, self).save(*args, **kwargs)
    if previous_excursion and self.excursion != previous_excursion:
        pass  # change booking
The second one is binding a function to the post_save signal plus django-model-utils' field tracker:
@receiver(post_save, sender=Delegate)
def create_change_booking(sender, instance, signal, created, **kwargs):
    if created:
        previous_excursion = ...  # get it from django-model-utils' field tracker
        # change booking
And another solution is in pre_save, as you are already doing:
@receiver(pre_save, sender=Delegate)
def my_callback(sender, instance, *args, **kwargs):
    if instance.pk:
        previous_excursion = Delegate.objects.get(pk=instance.pk).excursion
        if instance.excursion != previous_excursion:
            pass  # change booking
You can use django-model-utils to track changes to Django model fields. Check this example.
pip install django-model-utils
Then you can define your model and use FieldTracker in it:
from django.db import models
from model_utils import Choices, FieldTracker

# The original snippet left STATUS undefined; a Choices instance is assumed here.
STATUS = Choices('draft', 'published')


class Post(models.Model):
    title = models.CharField(max_length=100)
    body = models.TextField()
    tracker = FieldTracker()
    status = models.CharField(choices=STATUS, default=STATUS.draft, max_length=20)
After that, in post_save you can use it like this:
@receiver(post_save, sender=Post)
def my_callback(sender, instance, *args, **kwargs):
    print(instance.title)
    print(instance.tracker.previous('title'))
    print(instance.status)
    print(instance.tracker.previous('status'))
This will help you a lot when you need to act on a status change, since overriding the save method is not always a good idea.
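For completeness, a minimal sketch of acting only when the field actually changed, built on the same tracker.previous() call used above (FieldTracker also offers has_changed(); the handler name and the print are illustrative):
from django.db.models.signals import post_save
from django.dispatch import receiver


@receiver(post_save, sender=Post)
def on_status_change(sender, instance, created, **kwargs):
    # previous() returns the value as it was loaded from the database
    # before this save, so comparing it with the current value tells
    # us whether the field changed.
    old_status = instance.tracker.previous('status')
    if not created and old_status != instance.status:
        print('status changed from %s to %s' % (old_status, instance.status))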
As an alternative, if you are using Django forms:
The to-be version of your instance is stored in form.instance of the Django form of your model. On save, validations are run and this new version is applied to the model and then the model is saved.
Meaning that you can check differences between the new and the old version by comparing form.instance to the current model.
This is what happens when the Django Admin's save_model method is called. (See contrib/admin/options.py)
If you can make use of Django forms, this is the most Djangothic way to go, I'd say.
This is the essence on using the Django form for handling data changes:
form = ModelForm(request.POST, request.FILES, instance=obj)
new_object = form.instance # not saved yet
# changes are stored in form.changed_data
new_saved_object = form.save()
form.changed_data will contain the changed fields which means that it is empty if there are no changes.
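A sketch of how that comparison might look inside a view for the excursion case, with illustrative names (DelegateForm as a ModelForm over Delegate, release_booking() as whatever frees a booking slot):
obj = Delegate.objects.get(pk=pk)
old_excursion = obj.excursion           # capture before validation mutates obj
form = DelegateForm(request.POST, request.FILES, instance=obj)
if form.is_valid():
    if 'excursion' in form.changed_data:
        release_booking(old_excursion)  # the excursion the user moved away from
    form.save()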
There's yet another option:
Django's documentation has an example showing exactly how you could do this by overriding model methods.
In short:
override Model.from_db() to add a dynamic attribute containing the original values
override the Model.save() method to compare the new values against the originals
This has the advantage that it does not require an additional database query.
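Roughly what that looks like, adapted to the Delegate/Excursion example from the question (a sketch of the documented from_db() pattern; it assumes excursion is a ForeignKey, and the booking adjustment is left as a comment):
class Delegate(models.Model):
    excursion = models.ForeignKey('Excursion', on_delete=models.CASCADE)

    @classmethod
    def from_db(cls, db, field_names, values):
        instance = super().from_db(db, field_names, values)
        # Stash the values exactly as they were loaded from the database.
        instance._loaded_values = dict(zip(field_names, values))
        return instance

    def save(self, *args, **kwargs):
        # _state.adding is True for unsaved instances, so only compare on updates.
        if not self._state.adding and (
            self.excursion_id != self._loaded_values.get('excursion_id')
        ):
            pass  # the excursion changed: release the booking on the old one here
        super().save(*args, **kwargs)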

Django, post_save signal recursion. How to bypass signal firing

I have a situation where, when one of my models (MyModel) is saved, I want to check a field and trigger the same change in any other instance with the same some_key.
The code works fine, but it recursively calls the signals. As a result I am wasting CPU/DB/API calls. I basically want to bypass the signals during the .save(). Any suggestions?
class MyModel(models.Model):
    # bah
    some_field = ...  # placeholder
    some_key = ...    # placeholder


# in package code __init__.py
from django.db.models.signals import post_save
from django.dispatch import receiver


@receiver(post_save, sender=MyModel)
def my_model_post_processing(sender, instance, **kwargs):
    # do some unrelated logic...
    logic = 'fun! '
    # if something has changed... update any other instance with the same key
    cascade_update = MyModel.objects.exclude(id=instance.id).filter(some_key=instance.some_key)
    for c in cascade_update:
        c.some_field = instance.some_field
        c.save()
Disconnect the signal before calling save and then reconnect it afterwards:
post_save.disconnect(my_receiver_function, sender=MyModel)
instance.save()
post_save.connect(my_receiver_function, sender=MyModel)
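If you go this route, it's worth wrapping the save in try/finally so the receiver is reconnected even when save() raises (same idea, sketched; it is still not thread-safe):
from django.db.models.signals import post_save

post_save.disconnect(my_receiver_function, sender=MyModel)
try:
    instance.save()
finally:
    # Reconnect even if save() blew up, otherwise the signal stays off.
    post_save.connect(my_receiver_function, sender=MyModel)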
Disconnecting a signal is not a DRY and consistent solution, and neither is working around it with update() instead of save().
To bypass signal firing on your model, a simple way to go is to set an attribute on the current instance to prevent upcoming signals firing.
This can be done using a simple decorator that checks if the given instance has the 'skip_signal' attribute, and if so prevents the method from being called:
from functools import wraps


def skip_signal(signal_func):
    @wraps(signal_func)
    def _decorator(sender, instance, **kwargs):
        if hasattr(instance, 'skip_signal'):
            return None
        return signal_func(sender, instance, **kwargs)
    return _decorator
Based on your example, that gives us:
from django.db.models.signals import post_save
from django.dispatch import receiver


@receiver(post_save, sender=MyModel)
@skip_signal
def my_model_post_save(sender, instance, **kwargs):
    instance.some_field = my_value
    # Here we flag the instance with 'skip_signal'
    # and my_model_post_save won't be called again
    # thanks to our decorator, avoiding any signal recursion.
    instance.skip_signal = True
    instance.save()
Hope this helps.
A solution may be to use the update() method to bypass the signal:
cascade_update = MyModel.objects.exclude(
    id=instance.id
).filter(
    some_key=instance.some_key
).update(
    some_field=instance.some_field
)
"Be aware that the update() method is converted directly to an SQL statement. It is a bulk operation for direct updates. It doesn't run any save() methods on your models, or emit the pre_save or post_save signals"
You could move the related-objects update code into the MyModel.save method. No playing with signals is needed then:
class MyModel(models.Model):
    some_field = ...  # placeholder
    some_key = ...    # placeholder

    def save(self, *args, **kwargs):
        super(MyModel, self).save(*args, **kwargs)
        for c in MyModel.objects.exclude(id=self.id).filter(some_key=self.some_key):
            c.some_field = self.some_field
            c.save()

How do I get Django Admin to delete files when I remove an object from the database/model?

I am using Django 1.2.5 with a standard ImageField and the built-in storage backend. Files upload fine, but when I remove an entry from admin, the actual file on the server is not deleted.
You can receive the pre_delete or post_delete signal (see @toto_tico's comment below) and call the delete() method on the FileField object, thus (in models.py):
class MyModel(models.Model):
    file = models.FileField()
    ...


# Receive the pre_delete signal and delete the file associated with the model instance.
from django.db.models.signals import pre_delete
from django.dispatch.dispatcher import receiver


@receiver(pre_delete, sender=MyModel)
def mymodel_delete(sender, instance, **kwargs):
    # Pass false so FileField doesn't save the model.
    instance.file.delete(False)
Try django-cleanup
pip install django-cleanup
settings.py
INSTALLED_APPS = (
    ...
    'django_cleanup.apps.CleanupConfig',
)
Django 1.5 solution: I use post_delete for various reasons that are internal to my app.
from django.db.models.signals import post_delete
from django.dispatch import receiver


@receiver(post_delete, sender=Photo)
def photo_post_delete_handler(sender, **kwargs):
    photo = kwargs['instance']
    storage, path = photo.original_image.storage, photo.original_image.path
    storage.delete(path)
I stuck this at the bottom of the models.py file.
the original_image field is the ImageField in my Photo model.
This code also runs well on Django 1.4 with the admin panel.
class ImageModel(models.Model):
    image = ImageField(...)

    def delete(self, *args, **kwargs):
        # You have to prepare what you need before deleting the model
        storage, path = self.image.storage, self.image.path
        # Delete the model before the file
        super(ImageModel, self).delete(*args, **kwargs)
        # Delete the file after the model
        storage.delete(path)
It's important to get the storage and the path before deleting the model, or the latter will no longer be available once the instance has been deleted.
You need to remove the actual file on both delete and update.
from django.db import models


class MyImageModel(models.Model):
    image = models.ImageField(upload_to='images')

    def remove_on_image_update(self):
        try:
            # is the object in the database yet?
            obj = MyImageModel.objects.get(id=self.id)
        except MyImageModel.DoesNotExist:
            # object is not in db, nothing to worry about
            return
        # is the save due to an update of the actual image file?
        if obj.image and self.image and obj.image != self.image:
            # delete the old image file from the storage in favor of the new file
            obj.image.delete()

    def delete(self, *args, **kwargs):
        # object is being removed from db, remove the file from storage first
        self.image.delete()
        return super(MyImageModel, self).delete(*args, **kwargs)

    def save(self, *args, **kwargs):
        # object is possibly being updated, if so, clean up.
        self.remove_on_image_update()
        return super(MyImageModel, self).save(*args, **kwargs)
You may consider using a pre_delete or post_delete signal:
https://docs.djangoproject.com/en/dev/topics/signals/
Of course, the same reasons that FileField automatic deletion was removed also apply here. If you delete a file that is referenced somewhere else you will have problems.
In my case this seemed appropriate because I had a dedicated File model to manage all of my files.
Note: For some reason post_delete doesn't seem to work right. The file got deleted, but the database record stayed, which is completely the opposite of what I would expect, even under error conditions. pre_delete works fine though.
Maybe it's a little late, but the easiest way for me is to use a post_delete signal. Just remember that signals are executed even during a QuerySet delete, while the [model].delete() method is not executed during a QuerySet delete, so overriding it is not the best option.
core/models.py:
from django.db import models
from django.db.models.signals import post_delete
from core.signals import delete_image_slide

SLIDE1_IMGS = 'slide1_imgs/'


class Slide1(models.Model):
    title = models.CharField(max_length=200)
    description = models.CharField(max_length=200)
    image = models.ImageField(upload_to=SLIDE1_IMGS, null=True, blank=True)
    video_embed = models.TextField(null=True, blank=True)
    enabled = models.BooleanField(default=True)


"""---------------------------- SLIDE 1 -------------------------------------"""
post_delete.connect(delete_image_slide, Slide1)
"""--------------------------------------------------------------------------"""
core/signals.py
import os


def delete_image_slide(sender, **kwargs):
    slide = kwargs.get('instance')
    try:
        os.remove(slide.image.path)
    except:
        pass
This functionality (automatic file deletion when a model instance is deleted) will be removed in Django 1.3, so I wouldn't rely on it.
You could override the delete method of the model in question to delete the file before removing the entry from the database completely.
Edit:
Here is a quick example.
class MyModel(models.Model):
    somefile = models.FileField(...)

    def delete(self, *args, **kwargs):
        somefile.delete()
        super(MyModel, self).delete(*args, **kwargs)
Using post_delete is for sure the right way to go. Sometimes, though, things can go wrong and files don't get deleted. There is also the case where you have a bunch of old files that weren't deleted before post_delete was in use. I created a function that deletes files for objects: if the file an object references does not exist, the object is deleted; if a file has no object referencing it, the file is deleted; and it can also delete objects based on an "active" flag, something I added to most of my models. You have to pass it the objects you want to check, the path to the objects' files, the file field, and a flag for deleting inactive objects:
def cleanup_model_objects(m_objects, model_path, file_field='image', clear_inactive=False):
    # PART 1 ------------------------- INVALID OBJECTS
    # Creates a file list based on the model path, takes all files there
    model_path_list = os.listdir(model_path)
    # Gets the file path for each object
    model_files = list()
    invalid_files = list()
    valid_files = list()
    for obj in m_objects:
        f = ntpath.basename(getattr(obj, file_field).path)  # select the appropriate file/image field
        model_files.append(f)  # Checks for valid and invalid objects (using file path)
        if f not in model_path_list:
            invalid_files.append(f)
            obj.delete()
        else:
            valid_files.append(f)
    print "Total objects", len(model_files)
    print "Valid objects:", len(valid_files)
    print "Objects without file deleted:", len(invalid_files)

    # PART 2 ------------------------- INVALID FILES
    print "Files in model file path:", len(model_path_list)
    # Checks for valid and invalid files
    invalid_files = list()
    valid_files = list()
    for f in model_path_list:
        if f not in model_files:
            invalid_files.append(f)
        else:
            valid_files.append(f)
    print "Valid files:", len(valid_files)
    print "Files without model object to delete:", len(invalid_files)
    for f in invalid_files:
        os.unlink(os.path.join(model_path, f))

    # PART 3 ------------------------- INACTIVE OBJECTS
    if clear_inactive:
        # inactive_objects = Photo.objects.filter(active=False)
        inactive_objects = m_objects.filter(active=False)
        print "Inactive Objects to Delete:", inactive_objects.count()
        for obj in inactive_objects:
            obj.delete()
    print "Done cleaning model."
This is how you can use this:
photos = Photo.objects.all()
photos_path, tail = ntpath.split(photos[0].image.path) # Gets dir of photos path, this may be different for you
print "Photos -------------->"
cleanup_model_objects(photos, photos_path, file_field='image', clear_inactive=False) # image file is default
Make sure you write "self" before the file, so the example above should be:
def delete(self, *args, **kwargs):
    self.somefile.delete()
    super(MyModel, self).delete(*args, **kwargs)
I had forgotten the "self" before my file, and that didn't work as it was looking in the global namespace.
If you already have a number of unused files in your project and want to delete them, you can use the Django utility django-unused-media.
Django 2.x Solution:
There's no need to install any packages! It's very easy to handle in Django 2. I've tried the following solution using Django 2 and SFTP storage (however, I think it would work with any storage).
First, write a custom manager. If you want to be able to delete files of a model by using objects methods, you must write and use a custom manager (to override the delete() method of objects):
class CustomManager(models.Manager):
    def delete(self):
        for obj in self.get_queryset():
            obj.delete()
Now you must delete the image before deleting the model itself, and to assign the CustomManager to the model, you must initialize objects inside your model:
class MyModel(models.Model):
    image = models.ImageField(upload_to='/pictures/', blank=True)
    objects = CustomManager()  # add CustomManager to the model

    def delete(self, using=None, keep_parents=False):
        self.image.storage.delete(self.image.name)
        super().delete()
I may have a special case since I am using the upload_to option on my file field with dynamic directory names but the solution I found was to use os.rmdir.
In models:
import os

from django.conf import settings
...


class Some_Model(models.Model):
    save_path = models.CharField(max_length=50)
    ...

    def delete(self, *args, **kwargs):
        os.rmdir(os.path.join(settings.MEDIA_ROOT, self.save_path))
        super(Some_Model, self).delete(*args, **kwargs)

django: recursion using post-save signal

Here's the situation:
Let's say I have a model A in Django. When I'm saving an object (of class A) I need to save its fields into all other objects of this class. I mean, I need every other A object to be a copy of the last saved one.
When I use signals (post_save for example) I get a recursion (objects try to save each other, I guess) and my Python dies.
I mean, I expected that using the .save() method on the same class in a pre/post_save signal would cause a recursion, but I just don't know how to avoid it.
What do we do?
@ShawnFumo Disconnecting a signal is dangerous if the same model is saved elsewhere at the same time, don't do that!
@Aram Dulyan, your solution works but prevents you from using signals, which are so powerful!
If you want to avoid recursion and keep using signals, a simple way to go is to set an attribute on the current instance to prevent upcoming signals firing.
This can be done using a simple decorator that checks if the given instance has the 'skip_signal' attribute, and if so prevents the method from being called:
from functools import wraps


def skip_signal():
    def _skip_signal(signal_func):
        @wraps(signal_func)
        def _decorator(sender, instance, **kwargs):
            if hasattr(instance, 'skip_signal'):
                return None
            return signal_func(sender, instance, **kwargs)
        return _decorator
    return _skip_signal
We can now use it this way:
from django.db.models.signals import post_save
from django.dispatch import receiver


@receiver(post_save, sender=MyModel)
@skip_signal()
def my_model_post_save(sender, instance, **kwargs):
    # your processing
    pass


m = MyModel()
# Here we flag the instance with 'skip_signal'
# and my_model_post_save won't be called
# thanks to our decorator, avoiding any signal recursion.
m.skip_signal = True
m.save()
Hope this helps.
This will work:
class YourModel(models.Model):
    name = models.CharField(max_length=50)

    def save_dupe(self):
        super(YourModel, self).save()

    def save(self, *args, **kwargs):
        super(YourModel, self).save(*args, **kwargs)
        for model in YourModel.objects.exclude(pk=self.pk):
            model.name = self.name
            # Repeat the above for all your other fields
            model.save_dupe()
If you have a lot of fields, you'll probably want to iterate over them when copying them to the other model. I'll leave that to you.
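A rough sketch of that iteration, assuming you want to copy every concrete field except the primary key (adjust the exclusions to your needs):
def save(self, *args, **kwargs):
    super(YourModel, self).save(*args, **kwargs)
    # Copy every concrete, non-primary-key field onto the other rows.
    field_names = [
        f.name for f in YourModel._meta.concrete_fields if not f.primary_key
    ]
    for other in YourModel.objects.exclude(pk=self.pk):
        for name in field_names:
            setattr(other, name, getattr(self, name))
        other.save_dupe()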
Another way to handle this is to remove the listener while saving. So:
class Foo(models.Model):
    ...


def foo_post_save(sender, instance, **kwargs):
    post_save.disconnect(foo_post_save, sender=Foo)
    do_stuff_to_saved_instance(instance)
    instance.save()
    post_save.connect(foo_post_save, sender=Foo)

post_save.connect(foo_post_save, sender=Foo)