Is this a valid approach to sharing a function between Python classes? - django

My goal is to use the same function from multiple classes in Python.
I've seen discussion about mixins, inheritance, etc., but they all seem to come with caveats and cautions about doing things just right.
So I wondered if I could just call a plain old function that lives outside the classes. It seems to work, but maybe I'm failing to understand something important?
So my question is - is this a valid approach to sharing a function between Python classes?
from django import forms

def clean_reference_url(self):
    if not self.cleaned_data['reference_url']:
        return ''
    try:
        if 'illegaltext' in self.cleaned_data['reference_url']:
            raise Exception
    except Exception:
        raise forms.ValidationError('You cannot put illegaltext in this field.')
    return self.cleaned_data['reference_url']

class ContactForm(forms.ModelForm):
    def clean_reference_url(self):
        return clean_reference_url(self)

class TripForm(forms.ModelForm):
    def clean_reference_url(self):
        return clean_reference_url(self)

It's valid, but it's unnecessary to have the extra layer of wrapping. The mixin approach is the simplest, but yes, it has some caveats (largely related to metaclasses), so if you want to avoid that, you can still share a method across multiple classes by simply assigning the function as a class attribute during the definition of each class. Keep the function definition the same, and change the classes to:
class ContactForm(forms.ModelForm):
    clean_reference_url = clean_reference_url

class TripForm(forms.ModelForm):
    clean_reference_url = clean_reference_url
Again, a mixin is even cleaner, e.g.:
class CleanableUrl:  # Change name as appropriate
    def clean_reference_url(self):
        # ...

class ContactForm(CleanableUrl, forms.ModelForm):
    # No need to talk about clean_reference_url at all

class TripForm(CleanableUrl, forms.ModelForm):
    # No need to talk about clean_reference_url at all
and it's usually the most Pythonic approach, assuming it works for your scenario (no conflicting metaclasses on the base types).
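Putting it together, a complete CleanableUrl might look like this — a sketch that reuses the validation logic from the question (the Meta details of each form are omitted, as in the original post):

from django import forms

class CleanableUrl:
    """Mixin providing a shared clean_reference_url() for any form with a reference_url field."""
    def clean_reference_url(self):
        url = self.cleaned_data.get('reference_url')
        if not url:
            return ''
        if 'illegaltext' in url:
            raise forms.ValidationError('You cannot put illegaltext in this field.')
        return url

class ContactForm(CleanableUrl, forms.ModelForm):
    pass  # Meta omitted

class TripForm(CleanableUrl, forms.ModelForm):
    pass  # Meta omitted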

Related

correct use of class method in django

I had studied class methods in Python but never really understood their application in Django. My coding life was going well without them, but I came across a situation where a class method will be useful.
My model:
class Example(models.Model):
    post_count = models.IntegerField(default=0)

    @classmethod
    def total_counts(cls):
        # e.g. sum post_count across all rows (illustrative aggregate)
        return cls.objects.aggregate(total=models.Sum('post_count'))
In the above function, if I had used the property decorator, I could only get a single object's post_count, because the method works on self. But if I use classmethod, I can count the post views across all the objects.
So my thought is that whenever we have to deal with all the objects rather than a single instance, we need a class method. Is this correct?
Also, can we use this class method directly as a model serializer field, just like we use a property in a serializer?
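Assuming "model serializer" refers to Django REST Framework, a minimal sketch of exposing the classmethod's result through a method-backed serializer field (field names are only illustrative):

from rest_framework import serializers

class ExampleSerializer(serializers.ModelSerializer):
    # the classmethod's result exposed via SerializerMethodField
    total_counts = serializers.SerializerMethodField()

    class Meta:
        model = Example
        fields = ['post_count', 'total_counts']

    def get_total_counts(self, obj):
        # class-wide aggregate, identical for every serialized object
        return Example.total_counts()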

Transient use of Django Model instances

I have a use case where a particular class can either be transient or persistent. Transient instances are built from a JSON payload on a PUT call, and may either be persisted to the database or used during the server call and then either returned or discarded. What is best practice for this case? My options seem to be:
Write two classes, one of which is a models.Model subclass, and the other of which isn't, and make them implement the same API, or
Use the Model subclass, but be careful not to call save().
Is either of these preferable, according to conventional use of Django models?
You'll need both:
abstract = True is useful if the subclasses should still be concrete models but no table should be created just for the parent class. It lets you opt out of multi-table inheritance and instead have the shared attributes duplicated into the subclasses' tables (abstract base inheritance).
managed = False is useful if the inheriting class should never be persisted at all: Django migrations and fixtures won't generate any database table for it.
class TransientModel(models.Model):
    """Inherit from this class to use django constructors and serialization but no database management"""

    def save(self, *args, **kwargs):
        pass  # avoid exceptions if called

    class Meta:
        abstract = True  # no table for this class
        managed = False  # no database management

class Brutto(TransientModel):
    """This is not persisted. No table app_brutto"""
    # do more things here
    pass
In order to remain as DRY as possible, you could have an abstract mock class derived from your model:
class A(models.Model):
    # fields'n'stuff
    ...

class TransientA(A):
    def save(self, *args, **kwargs):
        pass  # avoid exceptions if called

    class Meta:
        abstract = True  # no table created
Now, even if you call save on it anywhere (even in methods inherited from A), you'll be shooting blanks.
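Usage might then look roughly like this (the field name is hypothetical, since A's fields are elided above):

payload = {'some_field': 'value from the PUT body'}  # hypothetical JSON payload

transient = TransientA(**payload)  # plain in-memory instance, no table behind it
transient.save()                   # no-op: the overridden save() does nothing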

Best practice: Where should I write functions (not methods) in Django?

This is maybe not a fundamental question, but curious as I am, I've been asking myself this for a long time.
Part of my Django code consists of functions that are not directly related to Django models.
Let's take for example this function:
def make_random_raw_password(length=8):
    # Do some stuff and return an 8-char randomised string
Technically, this function is not related to the User class, but functionally, it is!
Where would you write this function?
From my beginner's point of view, I can see at least two options:
I create a myutils.py file which contains all these kinds of functions, then I import myutils.py in files like models.py / views.py to use them (a sketch of this option appears after the code below).
I move this function into my UserProxy class as a classmethod (since I cannot directly modify the User class methods in Django), which gives me something close to:
class UserProxy(User):
    class Meta(object):
        proxy = True

    @classmethod
    def make_random_raw_password(cls, length=8):
        # Do some stuff...
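For comparison, option 1 is just a plain module — a minimal sketch under the names used in the question (the implementation is only illustrative):

# myutils.py
import secrets
import string

def make_random_raw_password(length=8):
    # return a randomised string of `length` characters
    alphabet = string.ascii_letters + string.digits
    return ''.join(secrets.choice(alphabet) for _ in range(length))

# views.py (or models.py)
from myutils import make_random_raw_password

raw_password = make_random_raw_password(12)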
Which one would you use? Could you explain why? Thank you in advance.

django: Downcasting automatically in multitable inheritance/polymorphic tables

I have something like this:
from django.core.exceptions import ObjectDoesNotExist

class Base(Model):
    ...
    def downcast(self):
        # try each hardcoded child accessor in turn
        try:
            return self.childa
        except ObjectDoesNotExist:
            pass
        try:
            return self.childb
        except ObjectDoesNotExist:
            pass
        return self
class ChildA(Base):
    ...

class ChildB(Base):
    ...

class Foo(Model):
    child = ForeignKey(Base)
Whenever I have a Foo object, the child foreignkey is always an instance of Base - my understanding is that's how Django works. For now I have added a downcast() method to Base (see above). I don't mind hardcoding the possible derived types.
What I would like is to somehow centralize that downcast automatically in Foo. I added this multi-table inheritance to existing code, and I keep finding places where the code really needs the downcast version -- so I then have to downcast manually at each call site.
I was using the django-polymorphic package, but it is giving me some side effects I don't know how/nor want to deal with (like I can't delete rows - got some error about opts.pk being None deep in queryset code.)
So I've wondered -- would putting something in __init__() (after calling the base class init) be ok? Are there side effects I'm not thinking of? This seems like it could be a problem when creating new instances from scratch.
def __init__(self, *args, **kwargs):
    super(Foo, self).__init__(*args, **kwargs)
    self.child = self.child.downcast()
Should I just rename child?
class Foo(Model):
    child_poly = ForeignKey(Base)  # was child

    @property
    def child(self):
        return self.child_poly.downcast()
This could be a problem when creating Foo() from scratch. I can't say Foo(child=c).
Is there a better approach? Not looking for a generic polymorphic solution/mixin -- not after trying to debug django and finding that removing django-polymorphic fixed the deletion issue.
In the end, I went back to django-polymorphic and haven't had the issue I was having before again.

How to go from a Model base to derived class in Django?

Assuming a simple set of inherited Model classes, like this:
class BaseObject(models.Model):
    some_field = models.SomeField(...)

class AwesomeObject(BaseObject):
    awesome_field = models.AwesomeField(...)

class ExcellentObject(BaseObject):
    excellent_field = models.ExcellentField(...)
and a query that looks like this:
found_objects = BaseObject.objects.filter(some_field='bogus')
What's the best way to take each found object and turn it back into its derived class? The code I'm using now is like this:
for found in found_objects:
    if hasattr(found, 'awesomeobject'):
        ProcessAwesome(found.awesomeobject)
    elif hasattr(found, 'excellentobject'):
        ProcessExcellent(found.excellentobject)
But, it feels like this is an abuse of "hasattr". Is there a better way to do this without creating an explicit "type" field on the base class?
For this specific problem, there is django-polymorphic. It works by using Django's content type framework to store a reference to the derived model on the base table. When you evaluate the queryset, it will cast each model to its specific type.
You'll get:
>>> BaseObject.objects.all()
[ <AwesomeObject>, <ExcellentObject>, <BaseObject>, <AwesomeObject> ]
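A minimal sketch of the declaration side, assuming a recent django-polymorphic release (the concrete field types stand in for the placeholders in the question):

from django.db import models
from polymorphic.models import PolymorphicModel

class BaseObject(PolymorphicModel):
    some_field = models.CharField(max_length=100)

class AwesomeObject(BaseObject):
    awesome_field = models.CharField(max_length=100)

class ExcellentObject(BaseObject):
    excellent_field = models.CharField(max_length=100)

# The same query now yields derived instances where they exist:
found_objects = BaseObject.objects.filter(some_field='bogus')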
That's the best way that I know of. Unfortunately, inheritance is a little clunky in this regard. Multi-table inheritance is basically just a one-to-one relationship between the parent model and the extra fields the child adds, which is why that hasattr trick works. You can think of each of those as a OneToOneField attribute on your parent model. When you think of it that way, Django has no way of knowing which child to return, or even whether to return a child at all, so you have to handle that logic yourself:
I tend to create a method on the parent such as get_child, which simply cycles through the attributes and returns the one that pops:
class BaseObject(models.Model):
    some_field = models.SomeField(...)

    def get_child(self):
        if hasattr(self, 'awesomeobject'):
            return self.awesomeobject
        elif hasattr(self, 'excellentobject'):
            return self.excellentobject
        else:
            return None
At least then, you can just call found.get_child(), and maybe forget about the hackery that gets you there.
Going from a base class to a derived class is generally a sign of bad design in a program. The method you propose, using hasattr, can be a serious problem. I'll show you:
# defined in some open source library
class MyObject(object):
    def what_is_derived(self):
        if hasattr(self, 'derived1'):
            return 'derived1'
        elif hasattr(self, 'derived2'):
            return 'derived2'
        else:
            return 'base'
Let's pretend that classes Derived1 and Derived2 are defined in that same library. Now, you want to use the features of MyObject, so you derive from it in your own code.
# defined in your own code
class MyBetterObject(MyObject):
    pass

better_object = MyBetterObject()
better_object.what_is_derived()  # returns 'base'
The whole point of polymorphism is that you can have many derived classes without the base class having to change. By making the base class aware of all of its derived classes, you severely reduce the usefulness of such a class. You can't create a derived class without changing the base class.
Either you want to work with a derived class, or you don't care what the specific class is and all you need are the properties/methods of the base class. It is the same in all OOP languages. There are facilities for finding out what the derived class is, but usually it's a bad idea.
From a Django models perspective, I usually use inheritance like this:
class Address(models.Model):
    # fields...

class Person(Address):
    # fields...

class Business(Address):
    # fields...

Address.objects.all()   # find all addresses for whatever reason
Person.objects.all()    # I'm only interested in people
Business.objects.all()  # need to work with businesses

# need to show all addresses in a postcode, and what type of address they are?
businesses = Business.objects.filter(postcode='90210')
people = Person.objects.filter(postcode='90210')
# use the address properties on both
Deeply nested inheritance chains with django models are awkward. They are also pretty unnecessary in most cases. Instead of polluting your base class with hasattr checks, define a helper method which is capable of querying the required derived classes if such a thing is called for. Just don't define it on the Base class.
I use introspection:
import sys

class Base(models.Model):
    # assume every row has some unique 'key' attribute shared with its subclasses
    class_name = models.CharField(..., editable=False)

    def get_base(self):
        if self.__class__ == Base:
            return self
        # if we are not an instance of Base we 'go up'
        return Base.objects.get(key=self.key)

    def get_specific(self):
        if self.__class__ != Base:
            return self
        # if we are an instance of Base we find the specific class
        class_type = getattr(sys.modules["project.app.models"], self.class_name)
        return class_type.objects.get(key=self.key)
You need some factory to create the specific classes so that you are sure to correctly save the class name (self.__class__.__name__) in class_name.
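Alternatively, instead of a separate factory, class_name could be filled in automatically in save() — a small sketch of that variation (the max_length is an assumption, since the original field args are elided):

class Base(models.Model):
    class_name = models.CharField(max_length=100, editable=False)  # max_length assumed

    def save(self, *args, **kwargs):
        # record the concrete class name so get_specific() can resolve it later
        if not self.class_name:
            self.class_name = self.__class__.__name__
        super().save(*args, **kwargs)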
You can also use InheritanceQuerySet from django-model-utils in case you want to explicitly state which queries to affect, like this:
from model_utils.managers import InheritanceQuerySet

class UserManager([...]):
    def get_queryset(self):
        return InheritanceQuerySet(self.model).select_subclasses()
(code from https://stackoverflow.com/a/25108201)
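If every query on the base model should behave this way, django-model-utils also provides InheritanceManager, which can be attached directly to the model — a minimal sketch using the models from the question (the concrete field type is illustrative):

from django.db import models
from model_utils.managers import InheritanceManager

class BaseObject(models.Model):
    some_field = models.CharField(max_length=100)
    objects = InheritanceManager()

# returns AwesomeObject / ExcellentObject instances where subclass rows exist
found_objects = BaseObject.objects.filter(some_field='bogus').select_subclasses()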