Using a proxy model as a signal sender - Django

I have a proxy model:

class ProxyModel(ParentModel):
    objects = ProxyModelManager()

    class Meta:
        proxy = True
I'm trying to define my proxy model as a sender:

@receiver(post_save, sender=core_models.ProxyModel)
def test_receiver(sender, instance, **kwargs):
    print("test receiver")
But the function is not called after the model is saved.
Is it possible to define a proxy model as a sender? If so, how can I do it?
Thanks.
Edit: I noticed that if I save the object like so,

>>> from core.models import ProxyModel
>>> p = ProxyModel.objects.get(name="Object Name")
>>> p.save()
test receiver

the function is called and prints "test receiver".
But if I save the same object via the admin panel through the parent model, it doesn't.
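This matches how the dispatcher behaves: post_save is sent with the concrete class whose save() actually ran, so a save through the parent-model admin uses sender=ParentModel and a receiver registered only for the proxy never fires. A sketch of one way around it (stacked @receiver registrations; the core_models module path is assumed from the question):

```python
from django.db.models.signals import post_save
from django.dispatch import receiver

from core import models as core_models  # module layout assumed from the question

# post_save dispatches on the exact class that was saved, so a save done
# through the parent model (e.g. in the admin) arrives with
# sender=ParentModel. Registering the same receiver for both classes
# catches either path.
@receiver(post_save, sender=core_models.ProxyModel)
@receiver(post_save, sender=core_models.ParentModel)
def test_receiver(sender, instance, **kwargs):
    print("test receiver")
```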

Related

Django why model foreign key cascade will not trigger delete?

There are two basic ways to do something when an instance gets deleted:
Override Model.delete
Use a signal
I used to reckon both serve the same purpose and just offer different ways of writing the same behaviour.
However, on this occasion, I realised I was wrong:
class Human(models.Model):
    name = models.CharField(max_length=20)

class Pet(models.Model):
    name = models.CharField(max_length=20)
    owner = models.ForeignKey(Human, related_name="pet", on_delete=models.CASCADE)

    def delete(self, *args, **kwargs):
        print('------- Pet.delete is called')
        return super().delete(*args, **kwargs)

h = Human(name='jason')
h.save()
p = Pet(name="dog", owner=h)
p.save()
h.delete()
# nothing is shown
Why is Pet.delete not called when Human.delete cascades over the foreign key? Do I have to use a signal for this? If so, would it cost more performance?
I am building something fairly heavy: a comment system that filters the relevant records and deletes them when the commented target gets deleted. The comment model has many nullable foreign key fields with models.CASCADE set, and only one of them is assigned a value. But in the product delete view, I call product.delete, the cascade triggers, and yet comment.delete is not called.
Currently, the project has delete defined on many models, on the assumption that it is always triggered when the instance is removed from the database, and it would be tremendous work to rewrite it all with signals. Is there a way to call delete when cascading? (I know it is likely impossible, since it is a field-level specification.)
I implemented a mixin for commentable models with extra methods defined, so I decided to move the delete logic into signals, something like this:
from django.db import models
from django.dispatch import receiver
from django.db.models.signals import pre_delete

# Create your models here.
class Base:
    def __init_subclass__(cls):
        @receiver(pre_delete, sender=cls)
        def pet_pre_delete1(sender, instance, **kwargs):
            print('pet pre delete 1 is called')

        @receiver(pre_delete, sender=cls)
        def pet_pre_delete2(sender, instance, **kwargs):
            print('pet pre delete 2 is called')
class Human(models.Model):
    name = models.CharField(max_length=20)

    def __str__(self):
        return f'<human>{self.name}'

class Pet(Base, models.Model):
    name = models.CharField(max_length=20)
    owner = models.ForeignKey(Human, related_name="pet", on_delete=models.CASCADE)

    def __str__(self):
        return f'<pet>{self.name}'

# ------- Pet.delete is called
# pet pre delete 1 is called
# pet pre delete 2 is called
It works fine in testing. I wonder if there is any risk in using this; would the receivers be garbage collected?
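The garbage-collection concern is real: Signal.connect (and therefore @receiver) holds only a weak reference to the receiver by default, and a function defined inside __init_subclass__ has no other strong reference once the method returns, so it can be collected and silently stop firing. Passing weak=False (or keeping a module-level reference) pins the receiver. The failure mode can be sketched with the stdlib alone:

```python
import gc
import weakref

def register():
    # a local function, analogous to a receiver defined inside __init_subclass__
    def receiver(sender, **kwargs):
        return 'called'
    # keeping only a weak reference mimics Signal.connect's default weak=True
    return weakref.ref(receiver)

ref = register()
gc.collect()
# the local function had no strong reference left, so it was collected
print(ref() is None)  # True
```

If the __init_subclass__ pattern is kept, registering with @receiver(pre_delete, sender=cls, weak=False) should avoid this, at the cost of the receivers never being unregistered.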

Attach a related object, to the newly created object using django-rest-framework

I've got a model looking something like this:

class SomeModel(models.Model):
    user = models.ForeignKey(User, null=True)
    some_info = models.CharField(max_length=200)
There can exist a lot of these without a User attached. But at some point I might want to attach a user to some of them.
I don't know exactly where I should do this though. I have this serializer that works fine for creating users:
class UserSerializer(serializers.ModelSerializer):
    class Meta:
        model = User
        fields = ('username', 'password')
        write_only_fields = ('password',)

    def restore_object(self, attrs, instance=None):
        user = super(UserSerializer, self).restore_object(attrs, instance)
        user.set_password(attrs['password'])
        user.save()
        return user
But where is the correct place to connect it to an already existing SomeModel? Preferably with easy permission handling. (APIView?)
Yes, you can. I guess your point is to attach a user on creation of a new "SomeModel" object. You do this in the view where you create the new object (the "SomeModel" object, not the user). Here is a quick example:
from rest_framework import generics

class SomeModelListCreateAPIView(generics.ListCreateAPIView):
    """
    API list and create view for SomeModel objects
    """
    model = SomeModel
    serializer_class = SomeModelSerializer

    def pre_save(self, obj):
        obj.user = self.request.user
Hook it up to your URL, and every time you create a new SomeModel object through your API, the user will be the one creating it. In your "pre_save" method you can apply whatever logic you want.
And of course the serializer:

class SomeModelSerializer(serializers.ModelSerializer):
    user = serializers.Field(source='user.username', label="user")

    class Meta:
        model = SomeModel  # the model here is SomeModel, not User
        fields = ('user', 'some_info')
Hope this helps
Edit, on your comment:
I think there are two ways to connect SomeModel to your newly created User.
You have a view for creating a user; in the example before, it had a "pre_save" method.
Your APIView for creating, updating, or deleting the user also has a "post_save" method. So create your user, and in "post_save" put the logic that hooks it up to the "SomeModel" object you want.
class UserListCreateAPIView(generics.ListCreateAPIView):
    """
    API list and create view for User objects; hooks them to SomeModel objects
    """
    model = User
    serializer_class = UserSerializer

    def post_save(self, obj, created=False):
        somemodel_object = SomeModel.objects.get(pk=1)
        somemodel_object.user = obj
        somemodel_object.save()
This is one way. You also have pre_save if you want to do something before saving the user, post_save as shown here, and pre_delete or post_delete; play around.
The second choice would be signals: you send a signal when saving the user, catch it on your SomeModel side, and do some logic there. I find this messier, but as you like:
https://docs.djangoproject.com/en/dev/topics/signals/
Does this help?
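For completeness, the signal-based route might look roughly like this. It is only a sketch: the myapp module path and the linking rule (adopting the first unattached row) are hypothetical placeholders, not something from the question:

```python
from django.contrib.auth.models import User
from django.db.models.signals import post_save
from django.dispatch import receiver

from myapp.models import SomeModel  # assumed module path

@receiver(post_save, sender=User)
def attach_user_to_somemodel(sender, instance, created, **kwargs):
    # hypothetical rule: give the new user the first unattached SomeModel row
    if created:
        orphan = SomeModel.objects.filter(user__isnull=True).first()
        if orphan is not None:
            orphan.user = instance
            orphan.save()
```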

Create an element of one model while creating an element of another model?

I'm pretty sure my question isn't clear, but I didn't know how else to put it.
The thing is: I have a Task model and a TaskHistory model. When I create a Task object, I want an associated TaskHistory object to be created automatically.
For example, I want to create a Task at datetime.now() with the value "example", and when I do so, a TaskHistory object is created with

created_on = datetime.now()
last_modification = datetime.now()
old_value = "example"
new_value = "example"

Can I do that in the model definition of my Task model?
I'm pretty new to Django.
Edit: the two models are linked to each other; my TaskHistory object would have task = Task if it's created like that.
If all the fields you've specified are available on the TaskHistory model, you could override the create() method.
from datetime import datetime

class Task(models.Model):
    # other fields snipped for brevity
    history = models.ForeignKey(TaskHistory)

    @classmethod
    def create(cls, *args, **kwargs):
        task = cls(**kwargs)
        history_kwargs = {'created_on': datetime.now(), 'last_modification': datetime.now(),
                          'old_value': 'None', 'new_value': 'wee'}
        history = TaskHistory(**history_kwargs)
        history.save()
        task.history = history
        return task
and in your view (or wherever) call it like this:

task = Task.create(insert_values_here_for_Task)
task.save()
Two ways to do this:
Override the save method on your Task model.
Use the post_save signal.
This is how you do it by using signals:
from datetime import datetime

from django.db.models.signals import post_save
from django.dispatch import receiver
from myapp.models import Task, TaskHistory

@receiver(post_save, sender=Task)
def my_handler(sender, instance, created, **kwargs):
    # sender is the Task class; the saved object arrives as instance
    if created:  # guard, so the save() below doesn't repeat this work
        history_kwargs = {'created_on': datetime.now(), 'last_modification': datetime.now(),
                          'old_value': 'None', 'new_value': 'wee'}
        history = TaskHistory(**history_kwargs)
        history.save()
        instance.history = history
        instance.save()
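The first option, overriding save(), might look like this. It is only a sketch; the field names and placeholder values are taken from the snippets above:

```python
from datetime import datetime

from django.db import models

class Task(models.Model):
    # other fields snipped for brevity
    history = models.ForeignKey(TaskHistory)

    def save(self, *args, **kwargs):
        # no primary key yet means this is the first save of the object
        if self.pk is None:
            self.history = TaskHistory.objects.create(
                created_on=datetime.now(),
                last_modification=datetime.now(),
                old_value='None',
                new_value='wee',
            )
        super(Task, self).save(*args, **kwargs)
```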

Django post_save signal on parent class with multi-table inheritance

In Django, if you have models that use multi-table inheritance, and you define a receiver for a post_save signal on the parent class, does that receiver function get called when an instance of the child class is saved?
Borrowing an example from another question:
class Animal(models.Model):
    category = models.CharField(max_length=20)

class Dog(Animal):
    color = models.CharField(max_length=10)

def echo_category(sender, **kwargs):
    print("category: '%s'" % kwargs['instance'].category)

post_save.connect(echo_category, sender=Animal)
If I do:
>>> dog = Dog.objects.get(...)
>>> dog.category = "canine"
>>> dog.save()
Will the echo_category receiver function be called?
post_save.connect(my_handler, ParentClass)

# connect all subclasses of the base content item too
for subclass in ParentClass.__subclasses__():
    post_save.connect(my_handler, subclass)
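One caveat with the loop above: __subclasses__() returns only direct children, so grandchild models would be missed. A small recursive walk covers the whole tree (plain-Python demonstration; all_subclasses is a helper name introduced here):

```python
def all_subclasses(cls):
    # depth-first walk of the inheritance tree; cls.__subclasses__()
    # alone returns only the direct children
    for sub in cls.__subclasses__():
        yield sub
        yield from all_subclasses(sub)

# plain classes stand in for the models
class Animal: pass
class Dog(Animal): pass
class Puppy(Dog): pass

print(set(all_subclasses(Animal)) == {Dog, Puppy})  # True
```

With that helper, `for subclass in all_subclasses(ParentClass): post_save.connect(my_handler, subclass)` registers the handler for every descendant, not just the first level.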
have a nice day!
Check out:
https://code.djangoproject.com/ticket/9318
It appears that most people propagate the signal to the parent class from the subclass.
No, it will not be called. See #9318 in Django trac.
I managed to get inherited signal receivers working with the @receiver decorator. See the relevant Django documentation.
from django.db import models
from django.db.models.signals import post_save
from django.dispatch import receiver

class Animal(models.Model):
    category = models.CharField(max_length=20)

# no sender argument, so this receiver fires for saves of any model,
# including the Dog subclass below
@receiver(post_save)
def echo_category(sender, **kwargs):
    print("category: '%s'" % kwargs['instance'].category)

class Dog(Animal):
    color = models.CharField(max_length=10)
This solution is valid with Python 3.6.8 and Django 2.2.
When I do this
>>> from myapp.models import Dog
>>> dog = Dog()
>>> dog.category = "canine"
>>> dog.save()
category: 'canine'
>>>
No problems. Everything seems to work from the shell.
Slightly unrelated, but when I edited models through the admin panel there was an issue with the receiver being called twice, so I filtered the calls by checking the 'created' kwarg: it was False in one call and True in the other, so I just put in a simple if block.
Credit for that workaround goes to Pratik Mandrekar and his answer:
from django.db import models
from django.db.models.signals import post_save
from django.dispatch import receiver

class Animal(models.Model):
    category = models.CharField(max_length=20)

@receiver(post_save)
def echo_category(sender, **kwargs):
    if not kwargs.get('created'):
        print("category: '%s'" % kwargs['instance'].category)

class Dog(Animal):
    color = models.CharField(max_length=10)

Define an attribute in a model as an object of another model in Django

Is it possible to define an attribute in one data model as an object of another data model in Django?
This is the scenario:
models.py
class Inmueble(models.Model):
    calle = models.CharField(max_length=20, verbose_name="Calle")
    numero = models.CharField(max_length=6, verbose_name="Numero")
    piso = models.IntegerField(verbose_name="Piso", blank=True, null=True)
    # string reference, since Galeria is defined further down
    galeria_id = models.OneToOneField('Galeria', verbose_name="Galería del Inmueble")

class Galeria(Gallery):
    nombre = models.CharField(max_length=30, verbose_name="Nombre")
The point is: I need to create a new Galeria object automatically every time an Inmueble object is created. Thanks in advance!
Analía.
There are two ways to handle this:
Override the save() method of the Inmueble model.
Create a signal handler on Galeria that receives signals emitted by Inmueble.
Both methods would work and are acceptable; however, I recommend using a signal, for a couple of reasons:
It's a bit more decoupled: if you later change or remove Galeria, your code doesn't break.
The signal handler for post_save includes a boolean value to indicate whether the model is being created or not. You could technically implement the same functionality in the model's save() by checking whether the model has an .id set, but IMO the signal is a cleaner solution.
Here's an idea of the code for both of these...
Using a Signal (recommended)

from django.db.models.signals import post_save
from wherever.models import Inmueble

class Galeria(Gallery):
    # ...

def inmueble_postsave(sender, instance, created, **kwargs):
    if created:
        instance.galeria_id = Galeria.objects.create()
        instance.save()

post_save.connect(inmueble_postsave, sender=Inmueble, dispatch_uid='galeria.inmueble_postsave')
Overriding the Model save() Method

from wherever.models import Galeria

class Inmueble(models.Model):
    # ...

    def save(self, force_insert=False, force_update=False):
        # no id = newly created model
        if not self.id:
            self.galeria_id = Galeria.objects.create()
        super(Inmueble, self).save(force_insert, force_update)
Maybe AutoOneToOneField is the answer.
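A sketch of that suggestion, assuming the third-party django-annoying package (which ships AutoOneToOneField). Note the semantics differ from the signal approach: the related object is created lazily, on first access of the reverse attribute, rather than at Inmueble save time:

```python
# assumes the third-party django-annoying package is installed
from annoying.fields import AutoOneToOneField
from django.db import models

class Galeria(Gallery):
    nombre = models.CharField(max_length=30, verbose_name="Nombre")
    # accessing some_inmueble.galeria creates the Galeria row on demand
    inmueble = AutoOneToOneField(Inmueble, related_name='galeria')
```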
Finally, I did:

from django.db.models.signals import post_save

class Galeria(Gallery):
    inmueble_id = models.ForeignKey(Inmueble)

def inmueble_postsave(sender, instance, created, **kwargs):
    if created:
        # create() already saves the new Galeria, so no extra save() is needed
        Galeria.objects.create(inmueble_id=instance,
                               title=instance.calle + ' ' + instance.numero,
                               title_slug=instance.calle + ' ' + instance.numero)

post_save.connect(inmueble_postsave, sender=Inmueble, dispatch_uid='galeria.inmueble_postsave')