Django Models Unit Tests: Help for a newbie - django

Right, this is kind of the last place I would like to have asked, since the question is quite vague, but I'm at a loss.
Basically, I'm trying to learn how to code, and I'm currently working with Django to get to grips with the back end of things. I keep being reminded of the importance of unit testing, so I want to include tests in the dummy project I'm working on and start understanding them early in my programming journey... only... they seem easy enough on the surface, but they're clearly more complicated for a beginner than I gave them credit for.
I'm not entirely sure where (or if) I'm going wrong here, so feel free to poke me and make fun, but please point me in the right direction, or to some beginner-friendly resources (I've scoured the internet and there isn't a massive amount of info out there).
My project is a dummy GUI related to policing (I'm hoping to use it in my portfolio in the future).
Here's the model class I want help with testing. There are others, but I want to do those by myself once I know more:
class Warrant(models.Model):
    """
    """
    name = models.CharField(max_length=50)

    WARRANT_TYPE_CHOICES = (
        ('ARREST', 'Arrest'),
        ('SEARCH', 'Search'),
    )
    warrant_type = models.CharField(max_length=10, choices=WARRANT_TYPE_CHOICES, default='ARREST')
    start_date = models.DateTimeField(auto_now=True, null=True)
    expiry_date = models.DateTimeField(auto_now_add=True, null=True)
    renewal_date = models.DateTimeField(auto_now_add=True, null=True)
As you can see, fairly simple model.
Here are the tests that I'm currently aimlessly fiddling around with:
def test_model_creation(self):
    object = Warrant.objects.create()
    self.assertIsNotNone(object)

def test_warrant_str(self):
    warrant = Warrant.objects.create(
        name="Warrant Name",
    )
    self.assertEqual(str(warrant), "Warrant Name")

def test_datetime_model(self):
    start_date = Warrant.objects.create(start_date=datetime.now())
    expiry_date = Warrant.objects.create(expiry_date=datetime.now())
    renewal_date = Warrant.objects.create(renewal_date=datetime.now())
To me, this reads fine, and all the tests return OK. However, I'm not sure whether this is best practice or whether it actually does what I think it's doing.
In addition, I'm not sure how to test the warrant_type / WARRANT_TYPE_CHOICES field either.
I know this isn't typically the type of question that should be asked here, but I'm essentially just typing stuff at random that I've picked up from tutorials, and I have no idea whether any of it is even correct.
Thanks

It's good that you're thinking about tests early! It's usually not models themselves that you test, but behaviour, or in other words, methods. If you add a method to your model with some actual logic in it, that's a good time to write a test for it. For example:
from django.db import models
from django.utils import timezone


class Warrant(models.Model):
    name = models.CharField(max_length=50)

    WARRANT_TYPE_CHOICES = (
        ('ARREST', 'Arrest'),
        ('SEARCH', 'Search'),
    )
    warrant_type = models.CharField(max_length=10, choices=WARRANT_TYPE_CHOICES, default='ARREST')
    start_date = models.DateTimeField(auto_now=True, null=True)
    expiry_date = models.DateTimeField(null=True)
    renewal_date = models.DateTimeField(null=True)

    @property
    def is_expired(self):
        return self.expiry_date < timezone.now()
With these test cases:
import datetime

from django.test import TestCase
from django.utils import timezone

from .models import Warrant


class WarrantTestCase(TestCase):
    def test_is_expired_returns_true_if_expiry_date_before_now(self):
        now = timezone.now()
        hour_ago = now - datetime.timedelta(hours=1)
        warrant = Warrant.objects.create(expiry_date=hour_ago)
        self.assertTrue(warrant.is_expired)

    def test_is_expired_returns_false_if_expiry_date_after_now(self):
        now = timezone.now()
        in_an_hour = now + datetime.timedelta(hours=1)
        warrant = Warrant.objects.create(expiry_date=in_an_hour)
        self.assertFalse(warrant.is_expired)
Note that I removed auto_now_add=True from the expiry_date and renewal_date fields. That is probably behaviour you don't want, since auto_now_add silently replaces whatever value you pass in with the current time on creation. I figured this out by writing this test case, because one of the tests failed until I removed it. Long live tests!

I don't think your tests are actually doing anything useful, but there is a way to do what I believe you are looking for. If I wanted to write a test to make sure that a model was creating objects correctly, I would do something like this:
from django.test import TestCase

from .models import Warrant


class WarrantTestCase(TestCase):
    def setUp(self):
        self.warrant = Warrant.objects.create(name="Warrant Name", warrant_type="SEARCH")

    def test_warrant_create(self):
        warrant = Warrant.objects.get(name="Warrant Name")
        self.assertEqual(warrant.warrant_type, "SEARCH")
The setUp method first creates a sample object, and then my test grabs that object and checks that the warrant type is equal to what I expect it to be.
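Since the question also asks how to test the warrant_type / WARRANT_TYPE_CHOICES field, here's a minimal sketch of one way to cover the default value, the human-readable label, and rejection of an invalid choice. It assumes the Warrant model above; the test class name and the "BANANA" value are just illustrative:

from django.core.exceptions import ValidationError
from django.test import TestCase

from .models import Warrant


class WarrantChoicesTestCase(TestCase):
    def test_default_warrant_type_is_arrest(self):
        warrant = Warrant.objects.create(name="Default Warrant")
        self.assertEqual(warrant.warrant_type, "ARREST")

    def test_display_label_for_choice(self):
        warrant = Warrant.objects.create(name="Search Warrant", warrant_type="SEARCH")
        # get_<field>_display() is generated by Django for fields with choices.
        self.assertEqual(warrant.get_warrant_type_display(), "Search")

    def test_invalid_choice_fails_validation(self):
        warrant = Warrant(name="Bad Warrant", warrant_type="BANANA")
        # full_clean() enforces choices (save() alone does not); the date
        # fields are excluded so only name and warrant_type are validated here.
        with self.assertRaises(ValidationError):
            warrant.full_clean(exclude=["start_date", "expiry_date", "renewal_date"])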
More info on unit testing in Django can be found in the docs: https://docs.djangoproject.com/en/3.1/topics/testing/overview/

Related

Django, What is the advantage of Modifying a model manager’s initial QuerySet?

The model below has an EditorManager:
class EditorManager(models.Manager):
    def get_queryset(self):
        return super().get_queryset().filter(role='E')


class Person(models.Model):
    first_name = models.CharField(max_length=50)
    role = models.CharField(max_length=1, choices=[('A', _('Author')), ('E', _('Editor'))])
    people = models.Manager()
    editors = EditorManager()
If I query Person.objects.filter(role='E') or Person.editors.all(), I get the same result.
So then why do we bother writing EditorManager()?
The above code is from Django documentation (https://docs.djangoproject.com/en/3.0/topics/db/managers/).
As mentioned in the documentation (under "using multiple managers on the same model"):
You can attach as many Manager() instances to a model as you'd like. This is a non-repetitive way to define common "filters" for your models.
Since you just have one action, it may be hard for you to see the benefits. However, as your code gets larger, say:
good = Book.objects.filter(author="PersonA", stars=5).order_by("-date_created").exclude(outdated=True)
normal = Book.objects.filter(author="PersonA", stars=3).order_by("-date_created").exclude(outdated=True)
bad = Book.objects.filter(author="PersonA", stars=1).order_by("-date_created").exclude(outdated=True)
You can see that's an awful lot of code. With managers, you can do something like this:
class AuthorAManager(models.Manager):
    def get_queryset(self):
        return super().get_queryset().filter(author="PersonA").order_by("-date_created").exclude(outdated=True)


class Book(models.Model):
    # ...
    objects = models.Manager()  # keep the default manager, since adding a custom one means Django no longer adds it for you
    author_a = AuthorAManager()


good = Book.author_a.filter(stars=5)
normal = Book.author_a.filter(stars=3)
bad = Book.author_a.filter(stars=1)
Overall, it can make your code look a lot cleaner and more understandable. As you said, you can't see the difference right now because you haven't run into complex or repetitive queries yet, but as your project expands, I'd say it's a worthwhile investment.

Is there any downside to Django proxy models?

I'm getting rather tired of paging through lots of irrelevant little fiddly properties while looking for the actual database structure of my models. Would it be a bad thing to use proxy models universally just to keep my code better organized / more readable? I.e.
class Foo_Base(models.Model):
    title = models.CharField(...)
    # other DB fields. As little as possible of anything else.


class Bar_Base(models.Model):
    foo = models.ForeignKey(Foo_Base, ...)
etc. not many more lines than there are columns in the DB tables. Then at the bottom or elsewhere,
class Foo(Foo_Base):
    class Meta:
        proxy = True

    @property
    def some_pseudo_field(self):
        # compute something based on the DB fields in Foo_Base
        return result

    @property
    # etc. pages of etc.
The fact that makemigrations and migrate tracks proxy models makes me slightly concerned, although this usage seems to be exactly what the Django documentation says they are for (wrapping extra functionality around the same database table).
Or is there another way to organize my code that accomplishes the same thing (keeping the fundamental stuff and the fiddly little support bits apart)?
[Edit] I'm offering up something that seems to work as a self-answer below. I'd still very much like to hear from anybody who knows for a fact that this is OK, given the deep Django magic in its declarative field declarations.
(About the only thing I dislike about Python is that it does not have an include mechanism for reading in a heap of code from another file!)
I think I may have found an answer: use a plugin class inheriting from object,
as is commonplace for class-based Views.
I'd still very much like to hear from anybody who knows for a fact that this is OK, given the deep Django magic on its declarative field declarations.
Minimal proof of concept:
from django.db import models
from django.utils.html import format_html


class PenMethods1(object):
    @property
    def namey(self):
        return format_html('<div class="namey">{name}</div>', name=self.name)


class PenTest1(PenMethods1, models.Model):
    name = models.CharField(max_length=16, blank=True)

    def __repr__(self):
        return f'<Pentest1 object id={self.id} name={self.name}>'
Initial migration was OK. Then I added
pnum = models.ForeignKey( 'Pennum', models.SET_NULL, null=True)
(Pennum was something already lying around in my playpen) and ran makemigrations and migrate. Again OK and basic functioning checks out...
>>> from playpen.models import PenTest1, Pennum
>>> n = Pennum.objects.last()
>>> n
<Pennum object id=3 name="t3" num=14 >
>>> p = PenTest1.objects.get(name='bar')
>>> p
<Pentest1 object id=2 name=bar>
>>> p.namey
'<div class="namey">bar</div>'
>>> p.pnum=n
>>> p.save()
>>> n=Pennum.objects.last()
>>> list(n.pentest1_set.all())
[<Pentest1 object id=2 name=bar>]
>>>

Access to child object in Django relation

Edit: I would very much like to accomplish this without installing a 3rd-party app. It seems simple/common enough that someone would have posted a line of code that accomplishes this by now?
Couldn't this be done easily in SQL? Would it be taboo to just hit the DB with a custom SQL in the index view?
So I have a parent Class and 2 child Classes. I would like to query all items and return a quick list.
from django.db import models

VIDEO_TYPE_CHOICES = (
    ('dvd', 'DVD'),
    ('downloaded', 'Downloaded'),
)

BOOK_TYPE_CHOICES = (
    ('e_book', 'E-Book'),
    ('print', 'Print'),
    ('audio', 'Audio Book'),
)


class Unit(models.Model):
    name = models.CharField(max_length=200)
    image = models.ImageField()

    def __unicode__(self):
        return self.name


class Video(Unit):
    this_type = models.CharField(max_length=20, choices=VIDEO_TYPE_CHOICES, default='dvd')
    run_time = models.CharField(max_length=200)


class Book(Unit):
    this_type = models.CharField(max_length=20, choices=BOOK_TYPE_CHOICES, default='print')
    pages = models.CharField(max_length=200)
All I want to do is display a list of all "Units" with this_type mushed in there on my index page.
Such as:
Lord Of The Rings, lotr.jpeg, DVD
Treasure Island, treasure_island.jpeg, Print
But I only have access to the Unit's name and image properties if I do a standard "gimme all Units" query... not this_type. Unless, of course, I make an assumption about the object and try object.book.this_type, for example, which throws an exception if that particular object is not a Book.
I've been researching this for a while now, and while I can find several related questions and several possible methods (generic relations, for example?), I cannot find an example that I can relate to my own use case, or understand at all for that matter. I've only been at this stuff (Python and Django) for about a week now. I learn best when I can just make something work, get an understanding of all the moving parts, and then build on that understanding.
In that light, if someone could give me an example of how to generate the previously mentioned object list, I would be extremely grateful!
Pretty PLS???
I would recommend using the Django app django-model-utils.
OOP is generally not the best design pattern for models, but if you are going to go that route, model_utils has an InheritanceManager which does exactly what you want.
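For reference, here's a minimal sketch of what that could look like, assuming django-model-utils is installed and using the Unit model from the question (the usage lines in the comments are illustrative, not from the original post):

from django.db import models
from model_utils.managers import InheritanceManager


class Unit(models.Model):
    name = models.CharField(max_length=200)
    image = models.ImageField()

    # InheritanceManager lets queries return the concrete child classes.
    objects = InheritanceManager()

    def __unicode__(self):
        return self.name


# Usage, e.g. in the index view: select_subclasses() yields Video and Book
# instances instead of bare Units, so this_type (and get_this_type_display())
# is available on each row, provided every Unit row has a child record.
# units = Unit.objects.select_subclasses()
# rows = [(u.name, u.image.name, u.get_this_type_display()) for u in units]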

Django - Lazy results with a context processor

I am working on a django project that requires much of the common page data be dynamic. Things that appear on every page, such as the telephone number, address, primary contact email etc all need to be editable via the admin panel, so storing them in the settings.py file isn't going to work.
To get around this, I created a custom context processor which acts as an abstract reference to my "Live Settings" model. The model looks like this:
class LiveSetting(models.Model):
    id = models.AutoField(primary_key=True)
    title = models.CharField(max_length=255, blank=False, null=False)
    description = models.TextField(blank=True, null=True)
    key = models.CharField(max_length=100, blank=False, null=False)
    value = models.CharField(max_length=255, blank=True)
And the context processor like so:
from livesettings.models import LiveSetting


class LiveSettingsProcessor(object):
    def __getattr__(self, name):
        # 'name' is the attribute looked up in the template, e.g. 'primary_email'.
        val = LiveSetting.objects.get(key=name)
        setattr(self, val.key, val.value)
        return val.value


l = LiveSettingsProcessor()


def livesetting_processors(request):
    return {'settings': l}
It works really nicely, and in my template I can use {{ settings.primary_email }} and get the corresponding value from the database.
The problem with the above code is that it handles each live setting individually and hits the database every time a {{ settings.* }} tag is used in a template.
Does anyone have any idea how I could make this process lazy, so that rather than retrieving and returning each value separately, it fetches everything in one hit just before the page is rendered?
You are trying to invent something complex, and there is no reason for that. Something as simple as this will work well enough for you:
def livesetting_processors(request):
    # One query per request; a dict keeps {{ settings.primary_email }} working.
    settings = {s.key: s.value for s in LiveSetting.objects.all()}
    return {'settings': settings}
EDIT:
This is how you could solve your problem with your current implementation:
class LiveSettingsProcessor(object):
    def __getattr__(self, name):
        # Load every LiveSetting in one query on first access and cache the
        # values as attributes; later lookups never touch the database.
        if '_loaded' not in self.__dict__:
            for setting in LiveSetting.objects.all():
                setattr(self, setting.key, setting.value)
            self._loaded = True
        return self.__dict__[name]
@Hanpan, I've updated my answer to show how you could solve your problem, but what I want to say is that what you are trying to achieve doesn't give any practical win; it just increases complexity and takes up your time. It might also be harder to set up caching on top of all of this later. And with caching enabled, this will not give any improvement in performance at all.
I don't know if you've heard this before: premature optimization is the root of all evil. I think this thread on SO is worth reading: https://stackoverflow.com/questions/211414/is-premature-optimization-really-the-root-of-all-evil
Maybe you could try Django's caching?
In particular, you may want to check out the low-level caching feature. It seems like it would be a lot less work than what you plan on.
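For example, a rough sketch of the low-level cache API applied to the context processor above (the cache key and the five-minute timeout are arbitrary choices for illustration):

from django.core.cache import cache

from livesettings.models import LiveSetting


def livesetting_processors(request):
    # Try the cache first; fall back to a single DB query and cache the
    # resulting dict so subsequent requests skip the database entirely.
    settings = cache.get('live_settings')
    if settings is None:
        settings = {s.key: s.value for s in LiveSetting.objects.all()}
        cache.set('live_settings', settings, 60 * 5)
    return {'settings': settings}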

Caching of querysets and re-evaluation

I'm going to post some incomplete code to make the example simple. I'm running a recursive function to compute some metrics on a hierarchical structure.
class Category(models.Model):
    parent = models.ForeignKey('self', null=True, blank=True, related_name='children', default=1)

    def compute_metrics(self, shop_object, metric_queryset=None, rating_queryset=None):
        if(metric_queryset == None):
            metric_queryset = Metric.objects.all()
        if(rating_queryset == None):
            rating_queryset = Rating.objects.filter(shop_object=shop_object)
        for child in self.children.all():
            # do stuff
            child_score = child.compute_metrics(shop_object, metric_queryset, rating_queryset)
        metrics_in_cat = metric_queryset.filter(category=self)
        for metric in metrics_in_cat:
            # do stuff
I hope that's enough code to see what's going on. What I'm after here is a recursive function that runs each of those queries only once and then passes the results down. That doesn't seem to be happening right now, and it's killing performance. Were this PHP/MySQL (as much as I dislike them after working with Django!) I could just run the queries once and pass them down.
From what I understand of Django's querysets, they aren't going to be evaluated in my "if queryset == None then queryset = stuff" part. How can I force this? Will they be re-evaluated when I do things like metric_queryset.filter(category=self)?
I don't care about data freshness. I just want to read from the DB once each for metrics and ratings, then filter on them later without hitting the DB again. It's a frustrating problem that feels like it should have a very simple answer. Pickling looks like it could work, but it's not very well explained in the Django documentation.
I think the problem here is that the querysets are never actually evaluated before the recursive calls, so every call hits the database again. If you use list() to force evaluation of each queryset, then each one should only hit the database once. Note that you will then have to change the metrics_in_cat line to a Python-level filter rather than a queryset filter.
class Category(models.Model):
    parent = models.ForeignKey('self', null=True, blank=True, related_name='children', default=1)

    def compute_metrics(self, shop_object, metric_queryset=None, rating_queryset=None):
        if metric_queryset is None:
            # Force evaluation once; subsequent use is a plain Python list.
            metric_queryset = list(Metric.objects.all())
        if rating_queryset is None:
            rating_queryset = list(Rating.objects.filter(shop_object=shop_object))
        for child in self.children.all():
            # do stuff
            child_score = child.compute_metrics(shop_object, metric_queryset, rating_queryset)
        # metrics_in_cat = metric_queryset.filter(category=self)
        metrics_in_cat = [m for m in metric_queryset if m.category == self]
        for metric in metrics_in_cat:
            # do stuff