Adding to the "constructor" of a django model - django

I want to do some extra initialization whenever instances of a specific django model are created. I know that overriding __init__ can lead to trouble. What other alternatives should I consider?
Update. Additional details: the intent is to initialize a state machine that instances of that model represent. This state machine is provided by an imported library, and its inner state is persisted by my django model. The idea is that whenever the model is loaded, the state machine is automatically initialized with the model's data.

Overriding __init__ might work, but it's a bad idea and it's not the Django way.
The proper way of doing it in Django is using signals.
The ones that are of interest to you in this case are pre_init and post_init.
django.db.models.signals.pre_init
Whenever you instantiate a Django model, this signal is sent at the beginning of the model's __init__() method.
django.db.models.signals.post_init
Like pre_init, but this one is sent when the __init__() method finishes.
So your code should be something like:

from django.db import models
from django.db.models.signals import post_init

class MyModel(models.Model):
    # normal model definition...
    pass

def extraInitForMyModel(**kwargs):
    instance = kwargs.get('instance')
    do_whatever_you_need_with(instance)

post_init.connect(extraInitForMyModel, sender=MyModel)
You can connect signals to Django's predefined models as well.
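Applied to the state machine described in the question, a post_init receiver could rebuild the machine from the persisted field on every load. This is only a sketch: StateMachine, its initial argument, the library import, and the state field are hypothetical stand-ins for the imported library and your actual model data.

from django.db import models
from django.db.models.signals import post_init

from some_fsm_library import StateMachine  # hypothetical import

class Workflow(models.Model):
    state = models.CharField(max_length=50, default='new')  # persisted machine state

def init_state_machine(sender, instance, **kwargs):
    # runs after __init__, both for new instances and for rows loaded from the DB
    instance.machine = StateMachine(initial=instance.state)  # hypothetical API

post_init.connect(init_state_machine, sender=Workflow)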

While I agree that there is often a better approach than overriding __init__ for what you want to do, it is possible, and there are cases where it can be useful.
Here is an example of how to correctly override the __init__ method of a model without interfering with Django's internal logic:
from django.db import models

class Book(models.Model):
    title = models.CharField(max_length=100)

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        # add your own logic

The two methods suggested in the docs only run when you create the instance through them; they won't fire for instances Django loads from the database:
Add a classmethod on the model class:
from django.db import models

class Book(models.Model):
    title = models.CharField(max_length=100)

    @classmethod
    def create(cls, title):
        book = cls(title=title)
        # do something with the book
        return book

book = Book.create("Pride and Prejudice")
Add a method on a custom manager:
class BookManager(models.Manager):
    def create_book(self, title):
        book = self.create(title=title)
        # do something with the book
        return book

class Book(models.Model):
    title = models.CharField(max_length=100)
    objects = BookManager()

book = Book.objects.create_book("Pride and Prejudice")
If that is your case, I would go that way. If not, I would stick to @vartec's answer.

Related

Adding a common validation to all text fields on all serializers

I am looking for a way (or several ways if needed) to add a common/shared validation function to all text fields in a DRF API. I hope to be able to do this in the least intrusive way possible, since there are already so many serializers throughout the API.
This is a horrible thing, and wrong, but it's a requirement. Saying "don't do that" or "you shouldn't do this" is not helpful. I know. It's not up to me.
Given a serializer like this:
class MySerializer(ModelSerializer):
    description = CharField()

    class Meta:
        model = SomeModel
        fields = ["name", "description"]
... both of these fields would somehow run a shared validation function. For example, in the base CharField the framework already adds two validators; essentially I'd like to add a third.
class CharField(Field):  # site-packages/rest_framework/fields.py
    def __init__(self):
        ...
        self.validators.append(ProhibitNullCharactersValidator())
        self.validators.append(ProhibitSurrogateCharactersValidator())
Is there some clever way to do this? I don't want to resort to literally hacking the source code, or replacing CharField throughout the application.
The solution I ended up going with is below. It loads at Django startup via my settings module, which imports a z_patches.py where other patches like this live (replacing the default filter classes, etc.):
import functools

from rest_framework.fields import CharField

def wrap_init(old_init):
    @functools.wraps(old_init)
    def __new_init__(self, **kwargs):
        old_init(self, **kwargs)
        self.validators.append(MyCustomValidator())
    return __new_init__

CharField.__init__ = wrap_init(CharField.__init__)
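MyCustomValidator above is not a DRF built-in; it stands for whatever shared rule you need. A minimal sketch of such a validator, with a forbidden-substring rule purely as an illustration:

from rest_framework import serializers

class MyCustomValidator:
    """Hypothetical shared rule applied to every CharField."""
    forbidden = 'badword'  # illustrative; substitute your real check

    def __call__(self, value):
        if self.forbidden in str(value).lower():
            raise serializers.ValidationError('This value contains a forbidden term.')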
If you absolutely know the risks, then you could do something like this in one of your apps.py:
from django.apps import AppConfig
from rest_framework.fields import Field
from rest_framework.serializers import CharField

def _init(self, **kwargs):
    ...
    Field.__init__(self, **kwargs)
    ...
    self.validators.append(YourCustomValidator())

class MyAppConfig(AppConfig):
    ...
    def ready(self):
        CharField.__init__ = _init
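With either variant in place, every CharField picks up the extra validator. A quick sanity check, assuming the hypothetical validator above rejects the string 'badword':

field = CharField()
field.run_validation('this contains badword')  # now raises serializers.ValidationError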

Log all save/update/delete actions in all django models

There are several models in my Django app. Some of them derive from models.Model, some from django-hvad's translatable model.
I want to log every save/delete/update operation on them. I am aware of Django's standard logging of admin actions, but it is too brief and non-verbose to satisfy my needs.
Generally speaking, one common way to achieve this is to define a super-class with these operations and extend each model from it. That won't work for me, because some of my models are translatable and some are not.
A second way would be aspects/decorators. I guess Python/Django must have something like that, but I don't know what exactly :)
Please suggest the most suitable way to do this logging.
Thanks!
You could write a mixin for your model.
import logging

from django.db import models

class LogOnUpdateDeleteMixin(models.Model):
    def delete(self, *args, **kwargs):
        super(LogOnUpdateDeleteMixin, self).delete(*args, **kwargs)
        logging.info("%s instance %s (pk %s) deleted" % (self._meta, self, self.pk))  # or whatever you like

    def save(self, *args, **kwargs):
        super(LogOnUpdateDeleteMixin, self).save(*args, **kwargs)
        logging.info("%s instance %s (pk %s) updated" % (self._meta, self, self.pk))  # or whatever you like

    class Meta:
        abstract = True
Now just use it in your model.
class MyModel(LogOnUpdateDeleteMixin, models.Model):
    ...
    # update/delete actions on this model will now write to the log

You can re-use this mixin in as many models as needed. Handle translation however you wish: set attributes on your models and check for them in the mixin, as in the sketch below.
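As one example of "set some attributes in your models and check for them in the mixin", here is a hedged sketch of an opt-out flag; log_changes is a made-up attribute name, not part of the original answer:

import logging

from django.db import models

class LogOnUpdateDeleteMixin(models.Model):
    log_changes = True  # subclasses may set this to False to opt out

    def save(self, *args, **kwargs):
        super(LogOnUpdateDeleteMixin, self).save(*args, **kwargs)
        if self.log_changes:
            logging.info("%s instance %s (pk %s) updated" % (self._meta, self, self.pk))

    class Meta:
        abstract = True

class QuietModel(LogOnUpdateDeleteMixin, models.Model):
    log_changes = False  # saves of this model are not logged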

Subclassing AbstractUser in Django for two types of users

I'm developing a school database system in Django 1.5 and was planning on having a number of different user types (Student, Staff, Parent) which subclass AbstractUser (actually, another abstract subclass of AbstractUser). I was just attempting to add an externally developed app to my system, which uses User in a ForeignKey for some of its models; however, this fails, as my user type is not a User instance. I can't point the app's models at AbstractUser, as one can't use abstract classes for ForeignKeys. I then considered adding AUTH_USER_MODEL = 'myapp.MyUser' to my settings.py and using settings.AUTH_USER_MODEL in place of User for the ForeignKey in the app. However, I have 3 different user types, so I can't do this either.
An earlier prototype used Django 1.4, which did not support custom User models, and hence had a reference to User instead; but this required an extra join for every query and led to quite complex queries. Is this the only way I can go forward with this, or is there another solution?
I have successfully used the following solution:
1. Create SchoolUser class in models.py - this will be your AUTH_USER_MODEL class
from django.contrib.auth.models import AbstractUser
from django.db import models

TYPES = (('Student', 'Student'), ('Staff', 'Staff'), ('Parent', 'Parent'))

class SchoolUser(AbstractUser):
    type = models.CharField(max_length=10, choices=TYPES, default='Student')
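For Django to pick up the custom class, point AUTH_USER_MODEL at it in settings.py (myapp is a placeholder for your actual app label):

# settings.py
AUTH_USER_MODEL = 'myapp.SchoolUser'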
2. Create a users.py file and put the whole user logic there. Have one base class that all the others inherit from and which implements the factory method:
class UserManager(object):
    def __init__(self, user):
        self.user = user

    @classmethod
    def factory(cls, user):
        """
        Dynamically creates user object
        """
        if cls.__name__.startswith(user.type):  # child class naming convention is important
            return cls(user)
        for sub_cls in cls.__subclasses__():
            result = sub_cls.factory(user)
            if result is not None:
                return result
Sample child classes (these also go in the users.py file):
class StudentUser(UserManager):
    def do_something(self):
        pass

class StaffUser(UserManager):
    def do_something(self):
        pass

class ParentUser(UserManager):
    def do_something(self):
        pass
Views are where the magic happens ;)

def my_view(request):
    school_user = UserManager.factory(request.user)
    school_user.do_something()  # each class can have different behaviour

This way you don't need to know which type of user it is; just implement your logic.
I hope this is clear enough, if not let me know!

Django model.save() not working with loaddata

I have a model which overrides save() to slugify a field:

from django.db import models
from django.template.defaultfilters import slugify

class MyModel(models.Model):
    name = models.CharField(max_length=200)
    slug = models.SlugField(max_length=200)

    def save(self, *args, **kwargs):
        self.slug = slugify(self.name)
        super(MyModel, self).save(*args, **kwargs)
When I run loaddata to load a fixture, this save() does not appear to be called, because the slug field is empty in the database. Am I missing something?
I can get it to work with a pre_save signal hook, but this feels like a bit of a hack and it would be nice to get save() working.

from django.db.models.signals import pre_save

def mymodel_pre_save(sender, **kwargs):
    instance = kwargs['instance']
    instance.slug = slugify(instance.name)

pre_save.connect(mymodel_pre_save, sender=MyModel)
Thanks in advance.
No you're not. save() is NOT called by loaddata, by design (it's way more resource intensive, I suppose). Sorry.
EDIT: According to the docs, pre-save is not called either (even though apparently it is?).
Data is saved to the database as-is, according to https://docs.djangoproject.com/en/dev/ref/django-admin/#what-s-a-fixture
I'm doing something similar now - I need a second model to have a parallel entry for each of the first model in the fixture. The second model can be enabled/disabled, and has to retain that value across loaddata calls. Unfortunately, having a field with a default value (and leaving that field out of the fixture) doesn't seem to work - it gets reset to the default value when the fixture is loaded (The two models could have been combined otherwise).
So I'm on Django 1.4, and this is what I've found so far:
You're correct that save() is not called. There's a special DeserializedObject that does the insertion, by calling save_base() on the Model class - overriding save_base() on your model won't do anything since it's bypassed anyway.
@Dave is also correct: the current docs still say the pre-save signal is not called, but it is. It's behind a condition: if origin and not meta.auto_created
origin is the class for the model being saved, so I don't see why it would ever be falsy.
meta.auto_created has been False so far with everything I've tried, so I'm not yet sure what it's for. Looking at the Options object, it seems to have something to do with abstract models.
So yes, the pre_save signal is indeed being sent.
Further down, there's a post_save signal behind the same condition that is also being sent.
Using the post_save signal works. My models are more complex, including a ManyToMany on the "Enabled" model, but basically I'm using it like this:
from django.db import models
from django.db.models.signals import post_save

class Info(models.Model):
    name = models.TextField()

class Enabled(models.Model):
    info = models.ForeignKey(Info)

def create_enabled(sender, instance, *args, **kwargs):
    if Info == sender:
        Enabled.objects.get_or_create(id=instance.id, info=instance)

post_save.connect(create_enabled)
And of course, initial_data.json only defines instances of Info.
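One more detail worth knowing: pre_save and post_save receivers are passed raw=True while loaddata is inserting fixture data, so a receiver can tell fixture loads apart from normal saves. A minimal sketch against the question's slug example:

def mymodel_pre_save(sender, instance, raw=False, **kwargs):
    # raw is True during loaddata; the slug should be filled in either way,
    # but you could branch on raw if fixture loads need different handling
    instance.slug = slugify(instance.name)

pre_save.connect(mymodel_pre_save, sender=MyModel)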

Loose coupling of apps & model inheritance

I have a design question concerning Django. I am not quite sure how to apply the principle of loose coupling of apps to this specific problem:
I have an order-app that manages orders (in an online shop). Within this order-app I have two classes:
from django.db import models

class Order(models.Model):
    # some fields

    def order_payment_complete(self):
        # do something when payment is complete, e.g. ship products
        pass

class Payment(models.Model):
    order = models.ForeignKey(Order)
    # some more fields

    def save(self, *args, **kwargs):
        # determine if payment has been updated to status 'PAID'
        if is_paid:
            self.order.order_payment_complete()
        super(Payment, self).save(*args, **kwargs)
Now the actual problem: I have a more specialized app that kind of extends this order. So it adds some more fields to it, etc. Example:
class SpecializedOrder(Order):
    # some more fields

    def order_payment_complete(self):
        # here we do some specific stuff
        pass
Now of course the intended behaviour would be as follows: I create a SpecializedOrder, the payment for this order is placed and the order_payment_complete() method of the SpecializedOrder is called. However, since Payment is linked to Order, not SpecializedOrder, the order_payment_complete() method of the base Order is called.
I don't really know the best way to implement such a design. Maybe I am completely off - but I wanted to build this order-app so that I can use it for multiple purposes and wanted to keep it as generic as possible.
It would be great if someone could help me out here!
Thanks,
Nino
I think what you're looking for is the GenericForeignKey from the ContentTypes framework, which is shipped with Django in the contrib package. It handles recording the type and id of the subclass instance, and provides a seamless way to access the subclasses as a foreign key property on the model.
In your case, it would look something like this:
from django.db import models
from django.contrib.contenttypes.models import ContentType
from django.contrib.contenttypes import generic

class Payment(models.Model):
    order_content_type = models.ForeignKey(ContentType)
    order_object_id = models.PositiveIntegerField()
    order = generic.GenericForeignKey('order_content_type', 'order_object_id')
You don't need to do anything special in order to use this foreign key... the generics handle setting and saving the order_content_type and order_object_id fields transparently:
s = SpecializedOrder()
s.save()  # the order needs a pk before anything can point at it
p = Payment()
p.order = s
p.save()
Now, when your Payment save method runs:
if is_paid:
    self.order.order_payment_complete()  # self.order will be the SpecializedOrder
The thing you want is called dynamic polymorphism and Django is really bad at it. (I can feel your pain)
The simplest solution I've seen so far is something like this:
1) Create a base class for all your models that need this kind of feature. Something like this: (code blatantly stolen from here)
class RelatedBase(models.Model):
    childclassname = models.CharField(max_length=20, editable=False)

    def save(self, *args, **kwargs):
        if not self.childclassname:
            self.childclassname = self.__class__.__name__.lower()
        super(RelatedBase, self).save(*args, **kwargs)

    @property
    def rel_obj(self):
        return getattr(self, self.childclassname)

    class Meta:
        abstract = True
2) Inherit your order from this class.
3) Whenever you need an Order object, use its rel_obj attribute, which will return the underlying object.
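A usage sketch, assuming SpecializedOrder subclasses Order (which in turn inherits RelatedBase) via multi-table inheritance, so Django creates the lowercase specializedorder accessor that rel_obj looks up:

class Order(RelatedBase):
    # some fields

    def order_payment_complete(self):
        pass

class SpecializedOrder(Order):
    def order_payment_complete(self):
        # specialized behaviour
        pass

# inside Payment.save():
#   self.order.rel_obj.order_payment_complete()  # dispatches to SpecializedOrder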
This solution is far from being elegant, but I've yet to find a better one...