I have three models: my custom User model, one called Events, and one called Tasks. Events and Tasks each have a field called deadline.
Goal: I need to create a section in my custom User model called Todos and a section called Overdue, where I filter the Events and Tasks whose deadline has already passed (deadline less than now) into Overdue, and put the remaining Tasks and Events objects into Todos.
from Myapp.models import Events
from MyOtherApp.models import Tasks

class User(models.Model):
    field1 = models...
    field2 = models...

    def something(self, *args, **kwargs):
        # here, build the new "sections"
        # Events and Tasks are other models
        for event in Events.objects.all():
            if event.deadline < now():
                self.overdue.append(event)
        for task in Tasks.objects.all():
            if task.deadline < now():
                self.overdue.append(task)
I know how to filter and all that; what I need is an idea for how to get that data and expose it on the User model without adding a new database field to store it, just by using the relational data.
I think you need to do something like this:
from datetime import datetime

class User(models.Model):
    # ....

    def overdue_tasks(self):
        now = datetime.now()
        return Tasks.objects.filter(deadline__lt=now)

    def overdue_events(self):
        now = datetime.now()
        return Events.objects.filter(deadline__lt=now)
deadline__lt means "deadline less than", which is equivalent to deadline < now.
Since you need them from two different models, you'll need to merge them in the serializer or somewhere in the front end.
Note that this currently fetches ALL the events for ALL users. You didn't share your models completely (the Tasks and Events models are missing), so I don't know how you are linking them to the User model; you should have a foreign key in each of them pointing to the User model.
assuming your Tasks model looks like this
class Tasks(models.Model):
    deadline = ...
    user = models.ForeignKey(User, on_delete=models.CASCADE)
the query will instead be
self.tasks_set.filter(deadline__lt=now)
# and
self.events_set.filter(deadline__lt=now)
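For illustration, here is a minimal sketch of how those two reverse lookups could be merged into "todos"/"overdue" sections on the User model (this assumes both Tasks and Events have such a user foreign key; the method names are just examples, and timezone.now is used rather than datetime.now to stay timezone-aware):

from itertools import chain

from django.utils import timezone


class User(models.Model):
    # ... existing fields ...

    def overdue(self):
        # Everything whose deadline has already passed, from both related models.
        now = timezone.now()
        return list(chain(
            self.tasks_set.filter(deadline__lt=now),
            self.events_set.filter(deadline__lt=now),
        ))

    def todos(self):
        # Everything that is still due in the future.
        now = timezone.now()
        return list(chain(
            self.tasks_set.filter(deadline__gte=now),
            self.events_set.filter(deadline__gte=now),
        ))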
Related
I have a medium size Django REST app that I'm looking to add gamification features to.
The application in question is a school webapp where students can create mockup quizzes, participate in exams that teachers publish, and write didactical content that gets voted by other students and teachers.
I want to add some gamification features to make the app more interesting and to incentivize participation and usage of the various features: for example, each student will have a personal "reputation" score, and gain points upon completing certain actions--a student may gain points when completing a quiz with a high score, when submitting some content, or when receiving upvotes to such content.
The tricky part is I want to be able to have this logic be as separate as possible from the existing codebase, for various reasons: separation of concerns, ability to plug the engine in/out if needed, ability to easily deactivate features for certain groups of users, etc.
What I'm looking for here is some software engineering advice that's also Django-specific. Here's a high level description of what I'm thinking of doing--I'd like some advice on the approach.
create a new gamification app. Here I will have models that describe a change in reputation for a user and possibly other related events. The app should also send notifications when gamification-related events occur
from the gamification app, expose a callback-based interface, which the other primary app can call into to dispatch events
use the django-lifecycle package to call the callbacks from gamification when triggers occur.
This way, my existing models would only get touched to register the triggers from django-lifecycle (similar to signals). For example, let's say I want to give students points when they turn in an assignment. Let's say I have an AssignmentSubmission model to handle assignment submissions. With the added lifecycle hook, it'd look like this:
class AssignmentSubmission(models.Model):
    NOT_TURNED_IN = 0
    TURNED_IN = 1
    STATES = ((NOT_TURNED_IN, 'NOT_TURNED_IN'), (TURNED_IN, 'TURNED_IN'))

    user = models.ForeignKey(User, on_delete=models.CASCADE)
    assignment = models.ForeignKey(Assignment, on_delete=models.CASCADE)
    state = models.PositiveSmallIntegerField(choices=STATES, default=NOT_TURNED_IN)

    @hook(AFTER_UPDATE, when="state", was=NOT_TURNED_IN, is_now=TURNED_IN)
    def on_turn_in(self):
        get_gamification_interface().on_assignment_turn_in(self.user)
The on_assignment_turn_in method might look something like:
def on_assignment_turn_in(user):
    ReputationIncrease.objects.create(user=user, points=50)
    notifications.notify(user, "You gained 50 points")
This is pretty much just a sketch to give an idea.
I am unsure how get_gamification_interface() would work. Should it return a singleton? Maybe instantiate an object? Or return a class with static methods? I think it'd be best to have a getter like this as opposed to manually importing methods from the gamification app, but maybe it could also create too much overhead.
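For what it's worth, one lightweight reading of that getter (purely a sketch; the module path, the GAMIFICATION_ENABLED setting and the class names below are made up for illustration) is a plain factory function that returns either a real interface or a no-op one, so the feature can be switched on or off per deployment:

# gamification/interface.py

from django.conf import settings


class GamificationInterface:
    """Real implementation: records reputation changes for a user."""

    def on_assignment_turn_in(self, user):
        from .models import ReputationIncrease  # local import to avoid circular imports
        ReputationIncrease.objects.create(user=user, points=50)


class NullGamificationInterface:
    """No-op implementation used when gamification is disabled."""

    def on_assignment_turn_in(self, user):
        pass


def get_gamification_interface():
    # A hypothetical settings flag decides which implementation the callers get.
    if getattr(settings, "GAMIFICATION_ENABLED", True):
        return GamificationInterface()
    return NullGamificationInterface()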
What's a good way to handle adding "pluggable" features to a project that are inherently entangled with existing models and business logic while also touching those as little as possible?
The foreign key approach is fine. You can easily chain and link queries using information from existing tables, and you could even avoid touching the original code by importing your models into the new app. You can use Django signals in your new app and ditch the django-lifecycle extension to avoid adding lines to your core models. I used the following approach to keep track of modified records in a table: take a TrackedModel with fields field_one, field_two, field_n..., which will be tracked by one of your new app's models, namely RecordTrackingModel:
from datetime import datetime

from django.db import models
from django.db.models.signals import post_save  # I'm choosing post_save just to illustrate.
from django.dispatch import receiver

from parent_app.models import TrackedModel  # The model you want to track from a parent app.


class RecordTrackingModel(models.Model):
    record = models.ForeignKey(TrackedModel, verbose_name="Tracked Model", on_delete=models.CASCADE)
    field_one = models.TextField(verbose_name="Tracked Field One", null=True, blank=True)  # Use the same field type as the original
    field_two = models.TextField("Tracked Field Two")
    field_n = ...
    notes = models.TextField(verbose_name="notes", null=True, blank=True)
    created = models.DateTimeField(verbose_name="Record creation date", auto_now=False, auto_now_add=True)


@receiver(post_save, sender=TrackedModel)  # Here, listen for the save signal coming from a saved or updated TrackedModel record.
def modified_records(sender, instance, **kwargs):
    record = instance
    tracked_field_one = instance.field_one
    tracked_field_two = instance.field_two
    tracked_field_n = another_function(instance.field_n)  # an external function that could take a field's content as input.
    ...
    note = 'Timestamp: ' + str(datetime.now().isoformat(timespec='seconds'))
    track_record = RecordTrackingModel.objects.create(
        record=record,
        field_one=tracked_field_one,
        field_two=tracked_field_two,
        field_n=tracked_field_n,
        # ...
        notes=note,
    )
    return track_record
There's no need to add functions to your pre-existing models, as the signal receiver triggers whenever a save or delete signal comes from TrackedModel. You could then add "if" statements to decide whether or not to perform actions based on field values, e.g. just pass if an AssignmentSubmission record has a "Not Turned In" status.
Check the Django signals reference for more information about when they trigger.
Additionally, I would suggest changing the "state" field to a boolean type and renaming it to something more explicit like "is_turned_in" for ease of use. It will simplify your forms and code. Edit: For more than 2 choices (non-boolean), I prefer using a ForeignKey instead. It will let you modify the choices over time easily from the admin site.
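As a small illustration of that suggestion (a sketch only; is_turned_in follows the rename proposed above, and SubmissionState is a hypothetical lookup model):

class AssignmentSubmission(models.Model):
    # Two-state case: a boolean reads more naturally than a choices field.
    is_turned_in = models.BooleanField(default=False)

    # For more than two states, a ForeignKey to a small lookup table that can
    # be edited from the admin site:
    # state = models.ForeignKey('SubmissionState', on_delete=models.PROTECT)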
Edit:
Another approach could be mirroring the original models in your gamification app and calling for a mirror-record update whenever the save method is used on the original model.
gamification_app/models.py:
from datetime import datetime

from django.db import models
from django.db.models.signals import post_save  # I'm choosing post_save just to illustrate.
from django.dispatch import receiver

from parent_app.models import OriginalModel  # The model you want to track from a parent app.


def gamification_function(input, parameters):
    output = ...  # *your gamification logic*
    return output


class MirrorModel(models.Model):
    original_model = models.ForeignKey(OriginalModel, verbose_name="Original Model", on_delete=models.CASCADE)
    field_one = ...  # Same type as the original
    field_two = ...
    field_n = ...

    @hook(AFTER_UPDATE, when="field_n", was=OPTION_1, is_now=OPTION_2)
    def on_turn_in(self):
        gamification_function(self.field, self.other_field)


@receiver(post_save, sender=OriginalModel)  # Here, listen for the save signal coming from the original app's record changes.
def mirror_model_update(sender, instance, **kwargs):
    pk = instance.pk
    if MirrorModel.objects.filter(original_model__pk=pk).exists():
        mirror_model = MirrorModel.objects.get(original_model__pk=pk)
        mirror_model.field_one = instance.field_one  # Map field values
        # *other logic ...*
        mirror_model.save()  # This should trigger your hook
    else:
        mirror_model = MirrorModel(
            original_model=instance,
            field_one=instance.field_one,
            # ...
        )
        mirror_model.save()  # This should trigger your hooks as well, but for a new record
This way you can decouple the gamification app and even choose not to keep a record of all or the same fields as the original model, but fields specific to your functionality needs.
Your idea was good.
In the gamification app, add your views, protect them with LoginRequiredMixin, and extend them with a check that a related record in the AssignmentSubmission table exists and is turned in.
In this way you have 100% separated gamification views, models, logic, etc.
In the existing views you can add a link to a gamification view.
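A rough sketch of what such a view could look like (the app and template paths are made up; the model and state names come from the question):

# gamification/views.py

from django.contrib.auth.mixins import LoginRequiredMixin
from django.views.generic import TemplateView

from core.models import AssignmentSubmission  # hypothetical path to the existing app


class ReputationView(LoginRequiredMixin, TemplateView):
    template_name = "gamification/reputation.html"

    def get_context_data(self, **kwargs):
        context = super().get_context_data(**kwargs)
        # Only consider submissions that exist and have been turned in.
        context["turned_in_count"] = AssignmentSubmission.objects.filter(
            user=self.request.user,
            state=AssignmentSubmission.TURNED_IN,
        ).count()
        return context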
from django.db import models
from django.db.models import F, Q


class Order(models.Model):
    order_date = models.DateField()


class OrderLine(models.Model):
    order = models.ForeignKey(Order, on_delete=models.CASCADE)
    loading_date = models.DateField()

    class Meta:
        constraints = [
            models.CheckConstraint(
                check=Q(loading_date__gte=F("order__order_date")),
                name="disallow_backdated_loading",
            ),
        ]
I want to make sure that an OrderLine's loading_date is always on or after its Order's order_date.
A CHECK constraint can span over a column, or over a table, but not over multiple tables, so this is not possible through a CHECK constraint.
Some databases allow defining triggers. These triggers run, for example, when a record is created or updated; they can run SQL queries and decide to reject the creation/update based on such queries, but currently the Django ORM does not support that.
One could also work with a composite primary key of an id and the creation date of the order, in which case that date would also be stored in the OrderLine table, and one could then implement the check at the table level; but Django does not support composite primary keys, for a number of reasons.
One alternative is to run raw SQL, for example in a migration file with a RunSQL operation [Django-doc], but this will likely be specific to the database backend.
Therefore, probably the most sensible check is to override the model's clean() method [Django-doc]. Django, however, does not run the .clean() method before saving an object to the database; this is only done by ModelForms and ModelAdmins. We can thus add a check with:
from django.core.exceptions import ValidationError


class OrderLine(models.Model):
    order = models.ForeignKey(
        Order,
        on_delete=models.CASCADE,
    )
    loading_date = models.DateField()

    def clean(self):
        if self.loading_date < self.order.order_date:
            raise ValidationError('Can not load before ordering')
        return super().clean()
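If the check should also run for plain .save() calls (outside of ModelForms and the admin), one common pattern, sketched here, is to call full_clean() from save(), since full_clean() invokes clean() and raises ValidationError before anything hits the database:

class OrderLine(models.Model):
    # ... fields and clean() as above ...

    def save(self, *args, **kwargs):
        self.full_clean()  # runs field validation and clean(); raises ValidationError on failure
        return super().save(*args, **kwargs)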
I have a model Event which has a ManyToManyField named users. I want to get all the events with an extra attribute which denotes whether a specific user belongs to this event or not.
class Event(models.Model):
    users = models.ManyToManyField(User)

# Expected code format
user = User.objects.get(id=1)
events = ...  # Event query
print(events[0].your_event)  # It should be True if the above user is part of this event, else False
Here is the code I tried:
from django.db.models import Case, When, BooleanField

user = User.objects.get(id=1)
Event.objects.annotate(
    your_event=Case(
        When(users=user, then=True),
        default=False,
        output_field=BooleanField(),
    )
)
Problem with the above code: if an Event object has multiple users (say 4), it returns 4 objects, three of which have your_event as False and one as True. But I want only one object for each event.
I believe a better solution must exist for this purpose.
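For reference, one way to keep a single row per event is to annotate with an Exists() subquery against the auto-created through table instead of a Case over the join (a sketch based on the models above; Exists/OuterRef are available in Django 1.11+):

from django.db.models import Exists, OuterRef

user = User.objects.get(id=1)

membership = Event.users.through.objects.filter(
    event=OuterRef('pk'),  # the row of the outer Event query
    user=user,
)
events = Event.objects.annotate(your_event=Exists(membership))

print(events[0].your_event)  # True only if the user belongs to that event; one row per event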
I want to present general statistics about the instances of the class "ModelClass" on a web page for my users. Let's say there are a few thousand ModelClass objects, and that there are a lot of statistics I need to calculate. I have figured out that I can do this with model managers; here is a (very) simplified example:
class ModelClassCustomManager(models.Manager):
    def get_queryset(self):
        return super(ModelClassCustomManager, self).get_queryset().filter(is_complete=True)


class ModelClass(models.Model):
    is_complete = models.BooleanField(default=False)
    ...

    objects = models.Manager()
    complete_objects = ModelClassCustomManager()
My concern is that this will drain a lot of resources if it is calculated every time users view the page. I would therefore like to calculate it only when I change or create ModelClass objects, since that is the only time the statistics really change. I guess this can be done by overriding the ModelClass save() method.
What is the best way to save these results? Should I create another Django model for holding the calculated statistics, or is there another way of storing this information?
EDIT: Thanks to pastylegs for a good answer. Doing it this way, however, causes some minor bugs, and I figured I'd explain them here in case anyone else runs into this issue.
First of all, I incorrectly put the imports in ModelClass at the top, so I got a circular dependency, which gave some weird results. To save yourself some frustration, put them where pastylegs did. Secondly, in order to overwrite the previous calculation of the statistics (and not create a new one every time), just replace
if sender is ModelClass and instance is not None:
    count = ModelClass.objects.all().count()
    Stat(key='some_name', value=count).save()
with this:
if sender is ModelClass:
    count = ModelClass.objects.all().count()
    try:
        stat = Stat.objects.get(key="Total")
        stat.value = count
        stat.save()  # Update the existing statistic
    except Stat.DoesNotExist:
        Stat(key="Total", value=count).save()  # Create a new one
I'd create an app called statistics (or similar) that has a simple model to hold key/value pairs:
class Stat(models.Model):
    key = models.CharField(...)
    value = models.CharField(...)
then use a signal on your own model to run a function every time a user saves/updates an instance of your model. In models.py:
class ModelClass(models.Model):
    ...

from django.db.models.signals import post_save
from myapp.signals import update_stats

post_save.connect(update_stats, sender=ModelClass)
and then create the receiver function to calculate whatever statistics you need in signals.py:
from myapp.models import ModelClass
from stats.models import Stat


def update_stats(sender, instance, signal, *args, **kwargs):
    if sender is ModelClass and instance is not None:
        count = ModelClass.objects.all().count()
        Stat(key='some_name', value=count).save()
This is a very simple and basic approach and only outlines how you can make use of signals to perform calculations, but it is a good start. Ideally you should try to calculate these values outside of the request/response cycle (especially if they are big calculations), as doing them inline will slow your server down and potentially time out your requests, so consider using a queuing/task system like Celery to perform the calculation in the background.
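To illustrate the Celery suggestion (only a sketch, assuming Celery is already configured for the project and reusing the Stat model from above):

# stats/tasks.py

from celery import shared_task

from myapp.models import ModelClass
from stats.models import Stat


@shared_task
def update_total_stat():
    # Runs in a worker process, outside the request/response cycle.
    count = ModelClass.objects.count()
    Stat.objects.update_or_create(key="Total", defaults={"value": str(count)})

The post_save receiver would then just enqueue the work with update_total_stat.delay() instead of counting inline.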
(Django 1.1) I have a Project model that keeps track of its members using a m2m field. It looks like this:
class Project(models.Model):
    members = models.ManyToManyField(User)
    sales_rep = models.ForeignKey(User)
    sales_mgr = models.ForeignKey(User)
    project_mgr = models.ForeignKey(User)
    ... (more FK user fields) ...
When the project is created, the selected sales_rep, sales_mgr, project_mgr, etc Users are added to members to make it easier to keep track of project permissions. This approach has worked very well so far.
The issue I am dealing with now is how to update the project's membership when one of the User FK fields is updated via the admin. I've tried various solutions to this problem, but the cleanest approach seemed to be a post_save signal like the following:
from django.db.models import signals


def update_members(instance, created, **kwargs):
    """
    Signal to update project members
    """
    if not created:  # Created projects are handled differently
        instance.members.clear()
        members_list = []
        if instance.sales_rep:
            members_list.append(instance.sales_rep)
        if instance.sales_mgr:
            members_list.append(instance.sales_mgr)
        if instance.project_mgr:
            members_list.append(instance.project_mgr)
        for m in members_list:
            instance.members.add(m)

signals.post_save.connect(update_members, sender=Project)
However, the Project still has the same members even if I change one of the fields via the admin! I have had success updating members m2m fields using my own views in other projects, but I never had to make it play nice with the admin as well.
Is there another approach I should take other than a post_save signal to update membership? Thanks in advance for your help!
UPDATE:
Just to clarify, the post_save signal works correctly when I save my own form in the front end (old members are removed, and new ones added). However, the post_save signal does NOT work correctly when I save the project via the admin (members stay the same).
I think Peter Rowell's diagnosis is correct in this situation. If I remove the "members" field from the admin form, the post_save signal works correctly. When the field is included, it saves the old members based on the values present in the form at the time of the save. No matter what changes I make to the members m2m field when the project is saved (whether via a signal or a custom save method), they will always be overwritten by the members that were present in the form prior to the save. Thanks for pointing that out!
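For completeness, the workaround described above (dropping "members" from the admin form so the form's stale m2m data no longer overwrites what the signal sets) could look roughly like this; the import path is assumed:

# admin.py

from django.contrib import admin

from myproject.models import Project  # hypothetical import path


class ProjectAdmin(admin.ModelAdmin):
    exclude = ('members',)  # keep the m2m out of the form; the post_save signal manages it


admin.site.register(Project, ProjectAdmin)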
Having had the same problem, my solution is to use the m2m_changed signal. You can use it in two places, as in the following example.
The admin upon saving will proceed to:
save the model fields
emit the post_save signal
for each m2m:
emit pre_clear
clear the relation
emit post_clear
emit pre_add
populate again
emit post_add
Here you have a simple example that changes the content of the saved data before actually saving it.
from django.db.models.signals import m2m_changed


class MyModel(models.Model):
    m2mfield = models.ManyToManyField(OtherModel)

    @staticmethod
    def met(sender, instance, action, reverse, model, pk_set, **kwargs):
        if action == 'pre_add':
            # here you can modify things, for instance:
            pk_set.intersection_update([1, 2, 3])
            # only save relations to objects 1, 2 and 3, ignoring the others
        elif action == 'post_add':
            print(pk_set)
            # should contain at most 1, 2 and 3

m2m_changed.connect(receiver=MyModel.met, sender=MyModel.m2mfield.through)
You can also listen to pre_remove, post_remove, pre_clear and post_clear. In my case I am using them to filter one list ('active things') within the contents of another ('enabled things'), independent of the order in which the lists are saved:
def clean_services(sender, instance, action, reverse, model, pk_set, **kwargs):
    """ Ensures that the active services are a subset of the enabled ones.
    """
    if action == 'pre_add' and sender == Account.active_services.through:
        # remove from the selection the disabled ones
        pk_set.intersection_update(instance.enabled_services.values_list('id', flat=True))
    elif action == 'pre_clear' and sender == Account.enabled_services.through:
        # clear everything
        instance._cache_active_services = list(instance.active_services.values_list('id', flat=True))
        instance.active_services.clear()
    elif action == 'post_add' and sender == Account.enabled_services.through:
        _cache_active_services = getattr(instance, '_cache_active_services', None)
        if _cache_active_services:
            instance.active_services.add(*list(instance.enabled_services.filter(id__in=_cache_active_services)))
            delattr(instance, '_cache_active_services')
    elif action == 'pre_remove' and sender == Account.enabled_services.through:
        # de-default any service we are disabling
        instance.active_services.remove(*list(instance.active_services.filter(id__in=pk_set)))
If the "enabled" ones are updated (cleared/removed + added back, like in admin) then the "active" ones are cached and cleared in the first pass ('pre_clear') and then added back from the cache after the second pass ('post_add').
The trick was to update one list on the m2m_changed signals of the other.
I can't see anything wrong with your code, but I'm confused as to why you think the admin should work any differently from any other app.
However, I must say I think your model structure is wrong. I think you need to get rid of all those ForeignKey fields and just have a ManyToMany, but use a through table to keep track of the roles.
class Project(models.Model):
    members = models.ManyToManyField(User, through='ProjectRole')


class ProjectRole(models.Model):
    ROLES = (
        ('SR', 'Sales Rep'),
        ('SM', 'Sales Manager'),
        ('PM', 'Project Manager'),
    )
    project = models.ForeignKey(Project)
    user = models.ForeignKey(User)
    role = models.CharField(max_length=2, choices=ROLES)
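To sketch how the role-specific lookups from the original model could then be expressed with this structure (field names as above; project and some_user are illustrative instances):

# All members of a project, regardless of role:
project.members.all()

# The sales rep(s) of a given project:
User.objects.filter(projectrole__project=project, projectrole__role='SR')

# Adding a user to a project in a given role:
ProjectRole.objects.create(project=project, user=some_user, role='PM')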
I was stuck in a situation where I needed to find the latest item from a set of items connected to a model via an m2m field.
Following Saverio's answer, the following code solved my issue:
from django.db.models.signals import m2m_changed


def update_item(sender, instance, action, **kwargs):
    if action == 'post_add':
        instance.related_field = instance.m2m_field.all().order_by('-datetime')[0]
        instance.save()

m2m_changed.connect(update_item, sender=MyCoolModel.m2m_field.through)