I have a bunch of classes that I'm now trying to incorporate into django.
For example, I have a Base class that all my other classes derive from:
class Base:
    def __init__(self, label: str = 'Base'):
        self.label = label
An example of a subclass would be a Person class:
from typing import Any, Dict

class Person(Base):
    def __init__(self, name: str, attributes_to_options: Dict[str, Any], **kwargs):
        super().__init__(**kwargs)
        self.name = name
        self.attributes_to_options = attributes_to_options
I would use this as:
alex = Person(name='Alex', attributes_to_options={'age': 10, 'is_happy': True}, label='Person:Alex')
My question is, how do I incorporate such a class into Django? Is it as simple as inheriting from models.Model? e.g.
from django.db import models

class Person(Base, models.Model):
    def __init__(self, name: str, attributes_to_options: Dict[str, Any], **kwargs):
        super().__init__(**kwargs)
        self.name = name
        self.attributes_to_options = attributes_to_options
But then how do I specify the models.CharField for the two attributes name and attributes_to_options?
Thanks for any help here.
Bear in mind that in general, any Django Model subclass corresponds to a database table. Inheriting from such a class ("concrete inheritance") means that another database table will be created, with a one-to-one linkage between rows, and that every query will implicitly perform a join in the DB. This is bad for performance. But for tables with few rows, or tables that are queried infrequently, you probably don't need to care.
Django provides two special cases that can be defined via the Meta class in the Model subclass. The first is an "abstract base class", which allows you to define a bunch of stuff that will be present in any derived model. In the case of fields, they are "copied" into the class that inherits them, rather than getting their own DB table. The second is a "proxy" class, which allows you to place a new set of methods on top of an existing database table, and which goes some way towards allowing polymorphic models. Careful reading of the Django documentation is a good idea.
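For illustration, here is a minimal sketch of both options applied to the Person example above; the concrete field choices are assumptions, not taken from the question:

from django.db import models

class LabelledBase(models.Model):
    """Abstract base: no table of its own; 'label' is copied into each child's table."""
    label = models.CharField(max_length=100, default='Base')

    class Meta:
        abstract = True

class Person(LabelledBase):
    """Concrete model: gets one table containing 'label', 'name' and 'attributes_to_options'."""
    name = models.CharField(max_length=100)
    attributes_to_options = models.JSONField(default=dict)  # requires Django 3.1+

class ShoutyPerson(Person):
    """Proxy model: reuses Person's table and only adds Python behaviour."""
    class Meta:
        proxy = True

    def shout(self):
        return self.name.upper()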
I have experimentally established that one can also use mix-in classes in the same way as one uses them with class-based views. In other words,
class ExtraMethodsMixin(object):  # NB must inherit from object
    # NB no model field definitions allowed here

    @property
    def something_or_other(self):
        return something_based_on_model_fields_defined_elsewhere

    # etc.
and then
class Foo(ExtraMethodsMixin, models.Model):  # NB mixin goes first
    # define names and fields that the ExtraMethodsMixin uses
    # (and anything else that a Foo needs)
    ...
The one snag I have found is that migrations do remember the dependency of Foo on ExtraMethodsMixin and so it's a PITA should you desire to completely remove the mixin at a later date. However, you can stub it out to a single pass statement without any problems, so this is probably not a significant worry. My other worry is that this usage is completely undocumented (other than as standard Python), so it's just about possible that it's trampling on Django internals in some very subtle way I've not yet spotted. So I'm definitely a bit short of recommending this technique.
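For example, a minimal concrete pairing might look like this (the mixin, field, and property names here are hypothetical, not from the text above):

from django.db import models

class PriceDisplayMixin(object):
    # behaviour only; assumes the concrete model defines price_cents
    @property
    def price_display(self):
        return '%.2f' % (self.price_cents / 100)

class Product(PriceDisplayMixin, models.Model):  # mixin goes first
    name = models.CharField(max_length=100)
    price_cents = models.IntegerField()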
Related
In Django, is there a way to combine two models that inherit the same class into one serializer?
from django.db.models import IntegerField, Model

class A(Model):
    a = IntegerField(...)
    # other fields...

    class Meta:
        abstract = True

class B(A):
    # There may or may not be such a thing as class C.
    pass

class C(A):
    # There may or may not be such a thing as class B.
    pass
I have code like above.
Is it possible to create a serializer based on the abstract model A?
I tried to create a view table in SQL, but gave up due to performance issues.
Any good ideas please. It's so painful...😥
In general it is not a good idea to reuse serializers, because doing so may expose you to unexpected behaviour when something changes in the base serializer or when you add or remove attributes on one of your models.
If models B and C have some attributes in common, then perhaps you should consider changing your DB design.
However, in this case I would define two serializers that each include the attributes of the abstract model:
from rest_framework import serializers

class BSerializer(serializers.Serializer):
    a = serializers.IntegerField()
    ...

class CSerializer(serializers.Serializer):
    a = serializers.IntegerField()
    ...
I have a use case where a particular class can either be transient or persistent. Transient instances are built from a JSON payload on a PUT call, and may either be persisted to the database or used during the server call and then either returned or discarded. What is best practice for this case? My options seem to be:
Write two classes, one of which is a models.Model subclass, and the other of which isn't, and make them implement the same API, or
Use the Model subclass, but be careful not to call save().
Is either of these preferable, according to conventional use of Django models?
You'll need both:
abstract = True is useful if the inheriting classes should still be concrete models but no table should be created just for the parent class. It lets you opt out of multi-table inheritance and instead have the shared attributes copied into each inheriting class's table (abstract base inheritance).
managed = False is useful if the inheriting class should never be persisted at all. Django migrations and fixtures won't generate any database table for it.
from django.db import models

class TransientModel(models.Model):
    """Inherit from this class to use Django constructors and serialization but no database management."""

    def save(self, *args, **kwargs):
        pass  # avoid exceptions if called

    class Meta:
        abstract = True   # no table for this class
        managed = False   # no database management

class Brutto(TransientModel):
    """This is not persisted. No table app_brutto."""
    # do more things here
    pass
In order to remain as DRY as possible, you could have an abstract mock class deriving your model:
class A(models.Model):
    # fields'n'stuff
    ...

class TransientA(A):
    def save(self, *args, **kwargs):
        pass  # avoid exceptions if called

    class Meta:
        abstract = True  # no table created
Now, even if you call save on it anywhere (even in methods inherited from A), you'll be shooting blanks.
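For illustration, a small usage sketch using the TransientModel base from above (the Draft model and its fields are hypothetical):

class Draft(TransientModel):
    # never persisted: managed = False is inherited from TransientModel's Meta
    name = models.CharField(max_length=100)
    age = models.IntegerField()

draft = Draft(name='Alex', age=10)   # e.g. built from a JSON payload
draft.save()                         # harmless no-op: nothing touches the database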
Assuming a simple set of inherited Model classes, like this:
class BaseObject(models.Model):
    some_field = models.SomeField(...)

class AwesomeObject(BaseObject):
    awesome_field = models.AwesomeField(...)

class ExcellentObject(BaseObject):
    excellent_field = models.ExcellentField(...)
and a query that looks like this:
found_objects = BaseObject.objects.filter(some_field='bogus')
What's the best way to take each found object and turn it back into its derived class? The code I'm using now is like this:
for found in found_objects:
    if hasattr(found, 'awesomeobject'):
        ProcessAwesome(found.awesomeobject)
    elif hasattr(found, 'excellentobject'):
        ProcessExcellent(found.excellentobject)
But, it feels like this is an abuse of "hasattr". Is there a better way to do this without creating an explicit "type" field on the base class?
For this specific problem, there is django-polymorphic. It works by using Django's content types framework to store which derived model each row belongs to. When you evaluate the queryset, it will upcast all models to their specific type.
You'll get:
>>> BaseObject.objects.all()
[ <AwesomeObject>, <ExcellentObject>, <BaseObject>, <AwesomeObject> ]
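A minimal sketch of how the models from the question might look with django-polymorphic (assuming the library is installed; the field types here are placeholders):

from django.db import models
from polymorphic.models import PolymorphicModel

class BaseObject(PolymorphicModel):
    some_field = models.CharField(max_length=100)

class AwesomeObject(BaseObject):
    awesome_field = models.CharField(max_length=100)

class ExcellentObject(BaseObject):
    excellent_field = models.CharField(max_length=100)

# Queries on the base class now return instances of the right subclass:
# BaseObject.objects.filter(some_field='bogus')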
That's the best way that I know of. Unfortunately, inheritance is a little clunky in this regard. Multiple table inheritance is basically just a one-to-one relationship between the parent model and the extra fields the child adds, which is why that hasattr trick works. You can think of each of those as a OneToOneField attribute on your parent model. When you think of it that way, Django has no way of knowing which child to return or even if to return a child, so you have to handle that logic yourself:
I tend to create a method on the parent such as get_child, which simply cycles through the attributes and returns the one that pops:
class BaseObject(models.Model):
    some_field = models.SomeField(...)

    def get_child(self):
        if hasattr(self, 'awesomeobject'):
            return ProcessAwesome(self.awesomeobject)
        elif hasattr(self, 'excellentobject'):
            return ProcessExcellent(self.excellentobject)
        else:
            return None
At least then, you can just call found.get_child(), and maybe forget about the hackery that gets you there.
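For example, with that method in place, the query loop from the question reduces to (same names as above):

for found in BaseObject.objects.filter(some_field='bogus'):
    result = found.get_child()   # dispatches to the right Process* call, or returns None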
Going from a base class to a derived class is generally a sign of bad design in a program. The method you propose, using hasattr, can be a serious problem. I'll show you:
# defined in some open source library
class MyObject(object):
    def what_is_derived(self):
        if hasattr(self, 'derived1'):
            return 'derived1'
        elif hasattr(self, 'derived2'):
            return 'derived2'
        else:
            return 'base'
Let's pretend that classes Derived1 and Derived2 are defined in that same library. Now, you want to use the features of MyObject, so you derive from it in your own code.
# defined in your own code
class MyBetterObject(MyObject):
    pass

better_object = MyBetterObject()
better_object.what_is_derived()  # returns 'base'
The whole point of polymorphism is that you can have many derived classes without the base class having to change. By making the base class aware of all of its derived classes, you severely reduce the usefulness of such a class. You can't create a derived class without changing the base class.
Either you want to work with a derived class, or you don't care what the specific class is and all you need are the properties/methods of the base class. It is the same in all OOP languages. There are facilities for finding out what the derived class is, but usually it's a bad idea.
From a django models perspective, I usually use inheritance in such a way:
class Address(models.Model):
    # fields...
    ...

class Person(Address):
    # fields...
    ...

class Business(Address):
    # fields...
    ...

Address.objects.all()    # find all addresses for whatever reason
Person.objects.all()     # I'm only interested in people
Business.objects.all()   # need to work with businesses

# Need to show all addresses in a postcode, and what type of address they are?
businesses = Business.objects.filter(postcode='90210')
people = Person.objects.filter(postcode='90210')
# use the address properties on both
Deeply nested inheritance chains with Django models are awkward, and also pretty unnecessary in most cases. Instead of polluting your base class with hasattr checks, define a helper method that is capable of querying the required derived classes if such a thing is called for. Just don't define it on the Base class.
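As a hedged sketch, such a helper could live outside the model, along these lines (model names follow the earlier question; the helper itself is hypothetical):

def resolve_child(base_obj):
    # return the derived instance for a BaseObject row, without teaching BaseObject about its children
    for attr in ('awesomeobject', 'excellentobject'):
        if hasattr(base_obj, attr):
            return getattr(base_obj, attr)
    return base_obj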
I use introspection:
import sys

class Base(models.Model):
    # we have some unique 'key' attribute
    class_name = models.CharField(..., editable=False)

    def get_base(self):
        if self.__class__ == Base:
            return self
        # if we are not an instance of Base we 'go up'
        return Base.objects.get(key=self.key)

    def get_specific(self):
        if self.__class__ != Base:
            return self
        # if we are an instance of Base we find the specific class
        class_type = getattr(sys.modules["project.app.models"], self.class_name)
        return class_type.objects.get(key=self.key)
You need some factory to create the specific instances, so you are sure to correctly save the class name (self.__class__.__name__) in class_name.
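One way to do that, sketched here as an assumption rather than as part of the original answer, is to skip the factory and fill class_name automatically in a save() override on Base (attribute names follow the code above):

    def save(self, *args, **kwargs):
        # record the concrete class so get_specific() can look it up later
        if not self.class_name:
            self.class_name = self.__class__.__name__
        super().save(*args, **kwargs)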
You can also use InheritanceQuerySet from django-model-utils in case you want to explicitly state which queries to affect, like this:
from model_utils.managers import InheritanceQuerySet

class UserManager([...]):
    def get_queryset(self):
        return InheritanceQuerySet(self.model).select_subclasses()
(code from https://stackoverflow.com/a/25108201)
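For comparison, django-model-utils also ships an InheritanceManager that can be attached directly to the base model; a minimal sketch under that assumption, reusing the model names from the earlier question:

from django.db import models
from model_utils.managers import InheritanceManager

class BaseObject(models.Model):
    some_field = models.CharField(max_length=100)
    objects = InheritanceManager()

# Returns AwesomeObject / ExcellentObject instances instead of plain BaseObject rows:
found_objects = BaseObject.objects.filter(some_field='bogus').select_subclasses()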
I have tried multiple approaches, but as I cannot use self in the class body, self.__class__.__name__ is not available. Would I need to override the save method to do this? Thanks for your help.
Your question is oddly phrased, so I'm going to come at it sideways.
Assume that you have defined a model Foo as follows:
from django.db import models

class Foo(models.Model):
    foo = models.IntegerField()
    bar = models.IntegerField()

    def klass(self):
        return self.__class__.__name__
Supposing you start a Django shell (python manage.py shell), you can do the following:
>>> from foo.models import Foo
>>> foo = Foo()
>>> print(foo.klass())
Foo
This demonstrates that you can certainly use self.__class__.__name__ in the body of any method for model Foo. Thus you must have some other context where you need to dynamically determine the actual class name for your model, but not from an instance of your model.
If you've finished defining the model, then the following is legal:
>>> print(Foo._meta.object_name)
Foo
This mechanism would allow you to do introspection directly against the model, without having to create an instance of the model.
If this doesn't work for you, you must need this during the actual definition of the model. In that case, I would respectfully suggest that if you know you're defining the Foo model, you should just hardcode Foo wherever you need it. If you really need a dynamic way during the creation of your model to determine the name of the model ... could you describe the actual problem you're trying to solve, so that we can help you solve it?
This is more or less what I want:
class VFXContainer(models.Model):
    classname = models.CharField(max_length=60, editable=False, blank=True)
    parent = models.ForeignKey("self", blank=True, null=True)

    def save(self, *args, **kwargs):
        self.classname = self.__class__.__name__
        super(VFXContainer, self).save(*args, **kwargs)

class Company(VFXContainer):
    class Meta:
        verbose_name_plural = "companies"

class Project(VFXContainer):
    pass

class CustomVFXContainer(VFXContainer):
    pass
What I don't know how to do is "override" the limit_choices_to option on the parent field in the child classes. I want CustomVFXContainer to be parented to any type of class, Project to be parented only by Company, and Company not to be parented at all. I'm using this structure for the following reason: there is going to be a ton of fields that I want in all the subclasses, and I also have a separate Tasks model that links through a foreign key to the base VFXContainer class (and is thus attachable to any of the child classes). Hope this makes it clearer what I'm trying to achieve; thanks for your help.
I have a design question concerning Django. I am not quite sure how to apply the principle of loose coupling of apps to this specific problem:
I have an order-app that manages orders (in an online shop). Within this order-app I have two classes:
class Order(models.Model):
    # some fields

    def order_payment_complete(self):
        # do something when payment complete, i.e. ship products
        pass

class Payment(models.Model):
    order = models.ForeignKey(Order)
    # some more fields

    def save(self):
        # determine if payment has been updated to status 'PAID'
        if is_paid:
            self.order.order_payment_complete()
        super(Payment, self).save()
Now the actual problem: I have a more specialized app that kind of extends this order. So it adds some more fields to it, etc. Example:
class SpecializedOrder(Order):
    # some more fields

    def order_payment_complete(self):
        # here we do some specific stuff
        pass
Now of course the intended behaviour would be as follows: I create a SpecializedOrder, the payment for this order is placed and the order_payment_complete() method of the SpecializedOrder is called. However, since Payment is linked to Order, not SpecializedOrder, the order_payment_complete() method of the base Order is called.
I don't really know the best way to implement such a design. Maybe I am completely off - but I wanted to build this order-app so that I can use it for multiple purposes and wanted to keep it as generic as possible.
It would be great if someone could help me out here!
Thanks,
Nino
I think what you're looking for is the GenericForeignKey from the ContentTypes framework, which is shipped with Django in the contrib package. It handles recording the type and id of the subclass instance, and provides a seamless way to access the subclasses as a foreign key property on the model.
In your case, it would look something like this:
from django.db import models
from django.contrib.contenttypes.models import ContentType
# In current Django, GenericForeignKey lives in contenttypes.fields
from django.contrib.contenttypes.fields import GenericForeignKey

class Payment(models.Model):
    order_content_type = models.ForeignKey(ContentType, on_delete=models.CASCADE)
    order_object_id = models.PositiveIntegerField()
    order = GenericForeignKey('order_content_type', 'order_object_id')
You don't need to do anything special in order to use this foreign key... the generics handle setting and saving the order_content_type and order_object_id fields transparently:
s = SpecializedOrder()
p = Payment()
p.order = s
p.save()
Now, when your Payment save method runs:
if is_paid:
self.order.order_payment_complete() # self.order will be SpecializedOrder
The thing you want is called dynamic polymorphism and Django is really bad at it. (I can feel your pain)
The simplest solution I've seen so far is something like this:
1) Create a base class for all your models that need this kind of feature. Something like this: (code blatantly stolen from here)
class RelatedBase(models.Model):
    childclassname = models.CharField(max_length=20, editable=False)

    def save(self, *args, **kwargs):
        if not self.childclassname:
            self.childclassname = self.__class__.__name__.lower()
        super(RelatedBase, self).save(*args, **kwargs)

    @property
    def rel_obj(self):
        return getattr(self, self.childclassname)

    class Meta:
        abstract = True
2) Inherit your order from this class.
3) Whenever you need an Order object, use its rel_obj attribute, which will return you the underlying object.
This solution is far from being elegant, but I've yet to find a better one...
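To make steps 2 and 3 concrete, here is a small usage sketch under that scheme (Order and SpecializedOrder are from the question; the field is an assumption):

class Order(RelatedBase):
    total = models.IntegerField(default=0)   # plus the shared order fields

class SpecializedOrder(Order):
    pass   # plus the extra fields

s = SpecializedOrder.objects.create(total=42)
o = Order.objects.get(pk=s.pk)   # comes back as a plain Order
child = o.rel_obj                # the underlying SpecializedOrder instance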