I'm currently working on my first advanced Django project and have what seems to be more of an architectural question. I have two models that represent an approval process and are related as follows:
import logging
import uuid

from django.contrib.auth.models import User
from django.db import models
from django.db.models.signals import post_save
from django.dispatch import receiver

logger = logging.getLogger(__name__)


class Transaction(models.Model):
    uid = models.UUIDField(default=uuid.uuid4, editable=False, unique=True)
    requester = models.ForeignKey(User, on_delete=models.PROTECT)
    type = models.ForeignKey(TransactionType, on_delete=models.PROTECT)


class TransactionApproval(models.Model):
    ACTIONS = (
        (0, 'Declined'),
        (1, 'Approved'),
    )
    transaction = models.ForeignKey(Transaction, on_delete=models.PROTECT)
    user = models.ForeignKey(User, on_delete=models.PROTECT, related_name='approval_user')
    action = models.PositiveSmallIntegerField(choices=ACTIONS, null=True, blank=True)


@receiver(post_save, sender=Transaction)
def create_transaction_approval(sender, instance, created, **kwargs):
    """ Automatically adds approvers to the transaction when it's created. """
    if created:
        approvers = instance.requester.profile.get_manager()
        logger.info('[TRANSACTION] Adding managers {0} to approval list (UUID {1})'.format(str(approvers), instance.uid))
        for approver in approvers:
            TransactionApproval.objects.create(transaction=instance, user=approver)
I've been creating the TransactionApproval objects automatically whenever a Transaction is created, using the @receiver decorator as outlined above, since approvals are a crucial and required part of every Transaction. However, I'm not sure whether it would make more sense to move these actions into the already overridden save() method of Transaction, to avoid scattering logic across models and linking back and forth between them. Which of the two is architecturally cleaner and more standards-compliant?
""" Alternative def save(): method of Transaction """
def save(self, *args, **kwargs):
logger.info('[TRANSACTION] {0} created transaction (UUID {1})'.format(self.requester, self.uid))
if not self.pk:
super.save()
approvers = self.requester.profile.get_manager()
logger.info('[TRANSACTION] Adding managers {0} to approval list (UUID {1})'.format(str(approvers), instance.uid))
for approver in approvers:
TransactionApproval.objects.create(transaction=self, user=approver)
else:
super.save()
# Other logic dealing with whether or not a transaction is fully approved
PS: Any other architectural leads are also appreciated.
I would avoid dealing with signals whenever I can. As noted in the documentation, signal receivers can easily end up being registered, and therefore called, more than once.
So, between the two, I would choose overriding the save() method.
But as a better practice, and to be more robust:
Consider a case where you have already deployed the project and a problem occurs in production, so you have to create a TransactionApproval without automatically adding approvers.
Rather than automatically adding the approvers at the model or database-signal level, you should handle this at the view level.
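For illustration, here is a minimal sketch of what view-level handling could look like (the view, the form field, and the get_manager() helper from the question are assumptions, not a definitive implementation):

from django.db import transaction as db_transaction

def create_transaction_view(request):
    # Hypothetical function-based view; adapt to your forms/serializers.
    if request.method == 'POST':
        with db_transaction.atomic():
            trx = Transaction.objects.create(
                requester=request.user,
                type_id=request.POST['type'],  # assumed form field
            )
            # Approvers are added explicitly here instead of in a signal or save().
            for approver in request.user.profile.get_manager():
                TransactionApproval.objects.create(transaction=trx, user=approver)
        # ... redirect / render as appropriate

This way, a one-off fix in production can create the objects directly (for example from the shell) without any implicit side effects firing.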
Hope this helps!
There are two basic ways to do something when an instance gets deleted:
Override Model.delete
Use a signal (pre_delete / post_delete)
I used to reckon that both of them serve the same purpose and just provide different ways of writing the same thing, working identically.
However, on this occasion, I realised I was wrong:
from django.db import models

class Human(models.Model):
    name = models.CharField(max_length=20)

class Pet(models.Model):
    name = models.CharField(max_length=20)
    owner = models.ForeignKey(Human, related_name="pet", on_delete=models.CASCADE)

    def delete(self, *args, **kwargs):
        print('------- Pet.delete is called')
        return super().delete(*args, **kwargs)

h = Human(name='jason')
h.save()
p = Pet(name="dog", owner=h)
p.save()
h.delete()
# nothing is shown
Why is Pet.delete not firing when Human.delete cascades through the foreign key? Do I have to use a signal for this? If so, would it cost more in performance?
I am building something fairly heavy: a comment system that filters the relevant records and deletes them when the commented target gets deleted. The comment model has many nullable foreign key fields with models.CASCADE set, and only one of them is assigned a value. In the product delete view I call product.delete, which triggers the cascade, but comment.delete is not firing.
Currently, the project has delete defined on many models, under the assumption that it is always triggered when the instance gets removed from the database, and it would be tremendous work to rewrite all of it as signals. Is there a way to have delete called when cascading? (I know it is likely impossible, since the cascade is a database-level specification.)
I implement a mixin for commentable models with extra methods defined, so I decided to move the delete logic into signals, something like this:
from django.db import models
from django.db.models.signals import pre_delete
from django.dispatch import receiver

# Create your models here.
class Base:
    def __init_subclass__(cls):
        @receiver(pre_delete, sender=cls)
        def pet_pre_delete1(sender, instance, **kwargs):
            print('pet pre delete 1 is called')

        @receiver(pre_delete, sender=cls)
        def pet_pre_delete2(sender, instance, **kwargs):
            print('pet pre delete 2 is called')

class Human(models.Model):
    name = models.CharField(max_length=20)

    def __str__(self):
        return f'<human>{self.name}'

class Pet(Base, models.Model):
    name = models.CharField(max_length=20)
    owner = models.ForeignKey(Human, related_name="pet", on_delete=models.CASCADE)

    def __str__(self):
        return f'<pet>{self.name}'

# ------- Pet.delete is called
# pet pre delete 1 is called
# pet pre delete 2 is called
It works fine in testing, but I wonder if there is any risk in using this approach. Would the receivers be garbage collected?
Suppose we have two models that are connected to the User model via signals:
from django.db import models
from django.contrib.auth.models import User
from django.db.models import signals

class Company(User):
    name = models.CharField(null=True, blank=True, max_length=30)

def create_company(sender, instance, created, **kwargs):
    if created:
        Company.objects.create(
            user_ptr_id=instance.id,
            username=instance.username,
            password=instance.password,
            email=instance.email,
            first_name=instance.first_name,
            last_name=instance.last_name,
            is_active=instance.is_active,
            is_superuser=instance.is_superuser,
            is_staff=instance.is_staff,
            date_joined=instance.date_joined,
        )

signals.post_save.connect(
    create_company, sender=User, weak=False, dispatch_uid="create_companies"
)
class Individual(User):
    name = models.CharField(null=True, blank=True, max_length=30)

def create_job_seeker(sender, instance, created, **kwargs):
    """
    :param instance: Current context User instance
    :param created: Boolean value for User creation
    :param kwargs: Any
    :return: New Seeker instance
    """
    if created:
        '''
        We should add a condition on whether the Company uses the same username.
        If true, then we must not create a JobSeeker, and we would disable the
        account using Firebase Admin.
        '''
        Individual.objects.create(
            user_ptr_id=instance.id,
            username=instance.username,
            password=instance.password,
            email=instance.email,
            first_name=instance.first_name,
            last_name=instance.last_name,
            is_active=instance.is_active,
            is_superuser=instance.is_superuser,
            is_staff=instance.is_staff,
            date_joined=instance.date_joined,
        )

signals.post_save.connect(
    create_job_seeker, sender=User, weak=False, dispatch_uid="create_job_seekers"
)
Now, each time a User is created, it can be extended through both the Individual and Company models. But I want to prohibit using both: a User can have either a Company or an Individual object, not both. Should I override the save method, like this:
def save(self, *args, **kwargs):
    if not Company.objects.filter(username=self.username).exists():
        super().save(*args, **kwargs)
    else:
        raise ValidationError('Some error')  # e.g. django.core.exceptions.ValidationError
Or should I add a condition in the signal handler, like this:
...
if created and not Company.objects.filter(username=instance.username).exists():
    Company.objects.create(
        ...
Which approach is better? And is there another approach that you might suggest?
Signals, in most cases, are I believe the best way to handle sharing data between models, assuming the CRUD for each related model isn't done together. So post_save, pre_save, post_delete, pre_delete and so on are typically the best way to go about handling data that any given model instance relies on, including manipulating model data after a save. Signals were designed specifically for this. The other great thing about signals is that you can connect them anywhere in your project, not necessarily just where the model is defined. Just import the model and the signal you want to connect to it, and bam!
How to use signals? Follow Django's documentation here; it's very simple:
https://docs.djangoproject.com/en/4.0/topics/signals/
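As an illustration of connecting a signal outside the models module, here is a minimal sketch using the common signals.py plus AppConfig.ready() pattern (the app name "myapp" and the receiver body are assumptions):

# myapp/signals.py
from django.contrib.auth.models import User
from django.db.models.signals import post_save
from django.dispatch import receiver

@receiver(post_save, sender=User, dispatch_uid="user_post_save")
def user_post_save(sender, instance, created, **kwargs):
    if created:
        print('User %s was created' % instance.username)

# myapp/apps.py
from django.apps import AppConfig

class MyAppConfig(AppConfig):
    name = "myapp"

    def ready(self):
        # Importing the module is what registers the receivers.
        from . import signals  # noqa: F401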
My app has Users that can be Doctors/Patients/Secretaries. To create a Doctor, therefore, I perform two POST requests: one for the creation of a User and one for a Doctor. The way I do this, the User has to be created first so that I can later create the Doctor (Doctor requires a 'User' field). I am using Django Rest Framework to create the API.
class User(AbstractUser):
    # defined roles so when I retrieve user, I know to perform a
    # request to api/doctors/ or api/secretaries/ etc depending on role.
    ROLES = (
        ('d', 'Doctor'),
        ('s', 'Secretary'),
        ('p', 'Patient'),
    )
    role = models.CharField(
        max_length=1, choices=ROLES, blank=True, default='p', help_text='Role')

class Doctor(models.Model):
    user = models.ForeignKey(User, on_delete=models.CASCADE)
    national_id = models.CharField(max_length=32, blank=False)
    ...
Since I'm new to Django, I don't know if two requests are the standard/best way of creating this User/Doctor pair.
This comes to mind as I am also thinking of the GET requests that will be performed later on (two GET requests when a Doctor logs in, if I want to retrieve all of their info (User/Doctor)?).
I read about subclassing, which would be something like Doctor(User); then the only necessary request would be a single POST to create a Doctor (which would also create the User). I am, however, skeptical of subclassing User, as I read at least 3 SO answers stating it could cause problems in the future.
Have a look at this good tutorial, which explains two different approaches: https://simpleisbetterthancomplex.com/tutorial/2018/01/18/how-to-implement-multiple-user-types-with-django.html
First approach: extend AbstractUser with the flags is_doctor, is_secretary, is_patient:
class User(AbstractUser):
    is_doctor = models.BooleanField('Doctor status', default=False)
    is_secretary = models.BooleanField('Secretary status', default=False)
    is_patient = models.BooleanField('Patient status', default=False)
Second approach: use roles, which suits your case better:
class Role(models.Model):
    '''
    The Role entries are managed by the system,
    automatically created via a Django data migration.
    '''
    DOCTOR = 1
    SECRETARY = 2
    PATIENT = 3
    ROLE_CHOICES = (
        (DOCTOR, 'Doctor'),
        (SECRETARY, 'Secretary'),
        (PATIENT, 'Patient'),
    )
    id = models.PositiveSmallIntegerField(choices=ROLE_CHOICES, primary_key=True)

    def __str__(self):
        return self.get_id_display()

class User(AbstractUser):
    roles = models.ManyToManyField(Role)
Problem
Determine best practices for handling multiple user types and adding attributes to a user in Django.
Solution
The following are recommendations based on design patterns that have been commonly used in Django since version 0.96. These recommendations are present in Django's documentation (see References).
Roles
Use Django's built-in permissions module for role and group management instead of rolling your own.
Instead of a model for each type, create a single UserProfile model and relegate user types to being managed by the permissions model.
I recommend using a data migration to add groups so that default groups are automatically seeded on the initial migrate call; this reduces overhead for anyone setting up your project for the first time. A sketch of such a migration is shown below.
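For illustration, a minimal sketch of such a data migration (the group names and app label are assumptions):

# myapp/migrations/0002_seed_groups.py  (hypothetical)
from django.db import migrations

GROUPS = ["Doctor", "Secretary", "Patient"]  # assumed default groups

def create_groups(apps, schema_editor):
    # Use the historical model so the migration stays valid as models change.
    Group = apps.get_model("auth", "Group")
    for name in GROUPS:
        Group.objects.get_or_create(name=name)

def remove_groups(apps, schema_editor):
    Group = apps.get_model("auth", "Group")
    Group.objects.filter(name__in=GROUPS).delete()

class Migration(migrations.Migration):
    dependencies = [
        ("myapp", "0001_initial"),  # hypothetical previous migration
    ]
    operations = [
        migrations.RunPython(create_groups, remove_groups),
    ]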
OneToOne
Use Django's OneToOneField instead of ForeignKey.
UserProfile and Signals
Create a signal receiver that creates a UserProfile when a User is created.
Example
from django.db.models.signals import post_save

# Assumes this lives in the same models.py as the custom User model.
class UserProfile(models.Model):
    user = models.OneToOneField("User", on_delete=models.CASCADE)
    national_id = models.CharField(max_length=32)

def create_profile(sender, **kwargs):
    user = kwargs["instance"]
    if kwargs["created"]:
        user_profile = UserProfile(user=user)
        user_profile.save()

post_save.connect(create_profile, sender=User)
References
Django permissions (groups): https://docs.djangoproject.com/en/3.1/topics/auth/default/#groups
Django Data Migrations: https://docs.djangoproject.com/en/3.1/topics/migrations/#data-migrations
Django extending User model recommendation: https://docs.djangoproject.com/en/1.8/topics/auth/customizing/#extending-django-s-default-user
Django post_save signal: https://docs.djangoproject.com/en/3.1/ref/signals/#post-save
In your case (with the model you made), you can create the Doctor and the User in one POST request to the doctor-creation endpoint by overriding the create method of the view:
from rest_framework.generics import CreateAPIView, ListAPIView

class CreateDoctorViewSet(CreateAPIView, ListAPIView):
    def create(self, request, *args, **kwargs):
        data = self.request.data
        user_dict_keys = ["username", "email", "first_name", "last_name"]
        user_dict = {key: data.pop(key, None) for key in user_dict_keys}
        user_dict['role'] = "d"
        user_serializer = UserSerializer(data=user_dict)
        # if it's not valid it will return the exception details to the requester
        user_serializer.is_valid(raise_exception=True)
        user = user_serializer.create(user_dict)
        user.set_password(data['password'])
        data.pop("password", None)
        user.save()
        response = super().create(request, *args, **kwargs)
        if response.status_code == 201:
            return response
        # if an error happened in the Doctor model (model or serializer error) >> delete the created user
        user.delete()
        return response
Or, made a little more DRY:
from rest_framework.generics import CreateAPIView, ListAPIView

def create_user(request):
    data = request.data
    user_dict_keys = ["username", "email", "first_name", "last_name"]
    user_dict = {key: data.pop(key, None) for key in user_dict_keys}
    user_dict['role'] = "d"
    user_serializer = UserSerializer(data=user_dict)
    # if it's not valid it will return the exception details to the requester
    user_serializer.is_valid(raise_exception=True)
    user = user_serializer.create(user_dict)
    user.set_password(data['password'])
    data.pop("password", None)
    user.save()
    return user

class CreateDoctorViewSet(CreateAPIView, ListAPIView):
    def create(self, request, *args, **kwargs):
        user = create_user(request)
        response = super().create(request, *args, **kwargs)
        if response.status_code == 201:
            return response
        # if an error happened in the Doctor model (model or serializer error) >> delete the created user
        user.delete()
        return response
Personal advice:
In the case you're describing, the User model has a single role, so it's better to make the user field in the Doctor class a OneToOneField instead of a ForeignKey (see the sketch below).
Of course, if you have cases where people convert, for example, from Doctor to Secretary and you want them to switch between roles on the same account, you can keep the ForeignKey on the Doctor model, but then you have to allow multiple roles in the User model.
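A minimal sketch of that change (field names taken from the question):

class Doctor(models.Model):
    # One Doctor record per User; the reverse access becomes user.doctor.
    user = models.OneToOneField(User, on_delete=models.CASCADE)
    national_id = models.CharField(max_length=32, blank=False)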
What you mean by subclassing is called multi-table inheritance, and there is no problem in using it: no side effects, and it is perfectly compatible with Django REST Framework (which you have tagged). This is the way it works:
class User(AbstractUser):
    # Your common fields for all user types.
    ...

class Doctor(User):
    national_id = models.CharField(max_length=32, blank=False)

class Secretary(User):
    # Your specific fields for the secretary model
    ...

class Patient(User):
    # Your specific fields for the patient model
    ...
In the background, it uses a OneToOne relationship for each subtype.
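For example (a sketch using the field names from the question), creating a Doctor also creates the underlying User row, and the parent fields stay directly accessible:

doc = Doctor.objects.create(
    username='drsmith',        # inherited User field
    national_id='12345678',    # Doctor-specific field
)
doc.first_name = 'Jane'        # parent fields are accessed directly
doc.save()

parent = doc.user_ptr          # the implicit OneToOne link to the User row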
Advantages of using multi-table inheritance:
It is simple and elegant: you don't have to take care of different tables, queries, etc.; Django does it for you.
It also ensures a good, formalized database structure: different tables for common and specific data, linked by OneToOne relationships.
Is it suitable for your needs? That depends.
If each subtype has its own specific fields -> use multi-table inheritance without doubt.
If each subtype has the same set of fields but different behaviours (different class/model methods, code, etc.) -> use proxy models (a short sketch follows below).
If all the subtypes have the same set of fields and the same behaviour (same class/model methods, code, etc.) -> use a role-based approach (one field identifying the role).
(extra) If you have dozens of subtypes, each with different fields, and you don't care too much about database formalization -> don't use multi-table inheritance. In this case, you can use a mix of the role-based approach with JSON fields (for storing all the specific fields) and proxy models (for handling different behaviours).
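For reference, a minimal sketch of the proxy-model approach mentioned above, reusing the role field from the question's User model (the manager and the method are assumed examples):

class DoctorManager(models.Manager):
    def get_queryset(self):
        # Only users whose role marks them as a doctor.
        return super().get_queryset().filter(role='d')

class Doctor(User):
    class Meta:
        proxy = True  # no extra table; rows live in the User table

    objects = DoctorManager()

    def greet_patient(self):  # hypothetical doctor-specific behaviour
        return 'Dr. %s' % self.last_name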
I have a model that looks somewhat like this:
class Passenger(models.Model):
    name = models.CharField(max_length=50)
    surname = models.CharField(max_length=50)

class Flight(models.Model):
    capacity = models.IntegerField()
    passengers = models.ManyToManyField(Passenger)
Before adding a new passenger to the flight I would like to validate whether the number of passengers is not going to exceed the capacity. I was wondering what would be the best way to go about this.
Obviously I could manually check the number of passengers before adding a new one, but maybe there is some support for this in Django? I tried writing a validator, but wasn't sure how to do it.
According to Django docs you can listen to the m2m_changed signal, which will trigger pre_add and post_add actions.
Using add() with a many-to-many relationship, however, will not call
any save() methods (the bulk argument doesn’t exist), but rather
create the relationships using QuerySet.bulk_create(). If you need to
execute some custom logic when a relationship is created, listen to
the m2m_changed signal, which will trigger pre_add and post_add
actions.
Based on @M.Void's answer, a code example:
from django.db import models
from django.db.models.signals import m2m_changed
from django.core.exceptions import ValidationError

class MyModel(models.Model):
    m2mField = models.ManyToManyField('self')
    m2mFieldLimit = 2

def m2mField_changed(sender, **kwargs):
    instance = kwargs['instance']
    # Only check when objects are about to be added.
    if kwargs['action'] == 'pre_add':
        if instance.m2mField.count() >= instance.m2mFieldLimit:
            raise ValidationError(f'Max number of records is {instance.m2mFieldLimit}')

m2m_changed.connect(m2mField_changed, sender=MyModel.m2mField.through)
Override the clean method on the model to do the check you want:
from django.core.exceptions import ValidationError

class Passenger(models.Model):
    name = models.CharField(max_length=50)
    surname = models.CharField(max_length=50)

    def clean(self, *args, **kwargs):
        # clean gets called automatically by other things, so we can't always
        # expect flight_id to be provided
        if 'flight_id' in kwargs:
            flight = Flight.objects.get(pk=kwargs['flight_id'])
            if flight.passengers.all().count() >= flight.capacity:
                # flight is full!
                raise ValidationError('Flight is full')
        super(Passenger, self).clean()

class Flight(models.Model):
    capacity = models.IntegerField()
    passengers = models.ManyToManyField(Passenger)
Note that to do this, you will have to pass in the flight ID when validating the passenger:
f = Flight.objects.get(...)
p = Passenger(name='First', surname='Last')
try:
    p.clean(flight_id=f.id)  # full_clean() calls clean() too, but here we call it directly to pass flight_id
    p.save()
except ValidationError as e:
    # do something to handle the error
    pass
Note that it is possible in multi-threaded applications for something to get validated successfully, but still fail to save in a race condition. You would need to add additional code to handle that.
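One possible way to guard against that race (a sketch, assuming the check is moved to the point where the passenger is added; select_for_update locks the Flight row for the duration of the transaction):

from django.core.exceptions import ValidationError
from django.db import transaction

def add_passenger(flight_id, passenger):
    # Hypothetical helper: locking the flight row serializes concurrent adds.
    with transaction.atomic():
        flight = Flight.objects.select_for_update().get(pk=flight_id)
        if flight.passengers.count() >= flight.capacity:
            raise ValidationError('Flight is full')
        flight.passengers.add(passenger)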
See the Django documentation on model validation for details.
What I want to do is: whenever I create a new Message, I want the sender of the Message to be added to the user field of that particular Thread (the Thread the Message relates to).
How do I do that? Can it be done by overriding the save method? I could do it in views.py, but I was hoping it would be better to add it in models.py itself. Any help will be greatly appreciated. Thank you!
class Thread(models.Model):
    user = models.ManyToManyField(User)
    is_hidden = models.ManyToManyField(User, related_name='hidden_thread', blank=True)

    def __unicode__(self):
        return unicode(self.id)

class Message(models.Model):
    thread = models.ForeignKey(Thread)
    sent_date = models.DateTimeField(default=datetime.now)
    sender = models.ForeignKey(User)
    body = models.TextField()
    is_hidden = models.ManyToManyField(User, related_name='hidden_message', blank=True)

    def __unicode__(self):
        return "%s - %s" % (unicode(self.thread.id), self.body)
You could look up the reverse foreign key and get all the users for a particular thread without having to manually store them on Thread.
Then you can get users associated with a thread by the reverse lookup:
User.objects.filter(message__thread=thread)
If you don't want to actively pull the user set as dm03514 showed, such as if you want to add users to the thread by default but maintain the ability to remove them from the thread many-to-many later, you can indeed do this by overriding the save method or by using a post_save signal.
save is good enough for almost all cases - the advantage of post_save is that it can more reliably distinguish between saving a new message and saving edits to an existing message. But if you're not creating messages with preselected PKs or loading them from fixtures, save can work fine:
class Message(models.Model):
    def save(self, *args, **kwargs):
        probably_new = (self.pk is None)
        super(Message, self).save(*args, **kwargs)
        if probably_new:
            self.thread.user.add(self.sender)
A signal would look like this:
from django.db.models.signals import post_save

def update_thread_users(sender, **kwargs):
    created = kwargs['created']
    raw = kwargs['raw']
    if created and not raw:
        instance = kwargs['instance']
        instance.thread.user.add(instance.sender)

post_save.connect(update_thread_users, sender=Message)
And then review the docs on preventing duplicate signals in case of multiple imports:
https://docs.djangoproject.com/en/dev/topics/signals/#preventing-duplicate-signals
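In short, the usual safeguard is to pass a dispatch_uid when connecting, so repeated imports don't register the receiver twice:

post_save.connect(
    update_thread_users,
    sender=Message,
    dispatch_uid="update_thread_users_on_message_save",
)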