I am building an app using Django 1.10 as the backend.
Is it possible to set a model field's default relative to another field of the same instance?
Specifically, I need second_visit's default to be 3 weeks after first_visit:
class SomeModel(models.Model):
    first_visit = models.DateField()
    second_visit = models.DateField(default=second_visit_default)

    def second_visit_default(self):
        # Set second_visit to 3 weeks after first_visit
You cannot give a field a default value that depends on another field, because the default is evaluated before a model instance exists. To achieve the same result you can override the model's save() method:
import datetime

class SomeModel(models.Model):
    ...

    def save(self, *args, **kwargs):
        self.second_visit = self.first_visit + datetime.timedelta(weeks=3)
        super().save(*args, **kwargs)
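For illustration, here is how the override behaves (a minimal sketch; the import path myapp.models is a placeholder for wherever SomeModel lives):

import datetime

from myapp.models import SomeModel  # hypothetical import path

visit = SomeModel(first_visit=datetime.date(2017, 1, 2))
visit.save()               # save() fills in second_visit
print(visit.second_visit)  # datetime.date(2017, 1, 23), three weeks later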
You can override save() or use pre_save:
import datetime

from django.db.models.signals import pre_save
from django.dispatch import receiver

@receiver(pre_save, sender=SomeModel)
def my_handler(sender, instance, **kwargs):
    # Set second_visit to 3 weeks after instance.first_visit
    instance.second_visit = instance.first_visit + datetime.timedelta(weeks=3)
This is a late answer, but @BjornW raises a valid concern in a comment on the accepted answer that neither of the two existing answers (at the time of writing) addresses: overriding save() is not the same thing as setting a default value. A default value takes effect only once, when the instance is first created, whereas modifying save() also affects every subsequent modification of an existing instance. The second answer suffers from the same deviation from the principle of a default value.
Providing a default value relative to other attributes of self is not (yet?) possible, but you can use the pre_save signal to set a default for a field relative to other fields.
Here is an example:
# signals.py
import datetime

from django.dispatch import receiver
from django.db.models.signals import pre_save
# do all other necessary imports, e.g.:
from .models import SomeModel


@receiver(pre_save, sender=SomeModel)
def set_default_second_visit(sender, instance, raw, **kwargs):
    """
    Set a default value for `second_visit`.
    """
    if instance.pk is None:
        instance.second_visit = instance.second_visit or instance.first_visit + datetime.timedelta(weeks=3)
A note is in order:
It is not that the pre_save signal is sent only once. It is actually sent whenever an instance of SomeModel is saved, i.e. also when an existing entry from the db is modified. The line if instance.pk is None: is present and necessary exactly for this reason: it makes sure that the value is set only if the object does not yet exist in the database (i.e. has no primary key yet).
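One practical detail worth adding: a signals.py module like the one above is not imported automatically. A common way to register the handlers is to import it in the app config's ready() method (a sketch, assuming the app is called surveys; adjust the names to your project):

# apps.py
from django.apps import AppConfig

class SurveysConfig(AppConfig):
    name = "surveys"

    def ready(self):
        # Importing the module runs the @receiver decorators,
        # which connects the pre_save handler.
        from . import signals  # noqa: F401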
Related
I have the following model:
class FeeModel(models.Model):
    user = models.ForeignKey(User, on_delete=models.CASCADE, null=True)
    total_fee = models.IntegerField(default=100000)
    paid_fee = models.IntegerField()
    remaining_fee = models.IntegerField(default=0)
I need remaining_fee to be filled with the result of (total_fee - paid_fee). How would I do that?
You can do this in numerous ways and places. A pre_save signal is one approach.
(In models.py, below your FeeModel class)
from django.db.models.signals import pre_save
from django.dispatch import receiver

@receiver(pre_save, sender=FeeModel)
def set_remaining_fee(sender, instance, *args, **kwargs):
    instance.remaining_fee = instance.total_fee - instance.paid_fee
How this works: a user enters the values for total_fee and paid_fee into a form. Upon submit, just before save(), the signal handler calculates the difference and assigns it to the remaining_fee field. Note that this will run every time the model instance is saved.
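As a quick sanity check (a sketch assuming the model and signal above are in place):

fee = FeeModel(paid_fee=30000)
fee.save()                # the pre_save handler runs just before the INSERT
print(fee.remaining_fee)  # 70000 (100000 - 30000)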
I want to use the username of the account in which my Django app is running as a string, to load the model fields specific to that username. I have created a file 'survey.py' which returns a dictionary, and I want the keys to be the fields.
How can I get the username as a string?
from django.db import models
from django.contrib.auth.models import User
from multiselectfield import MultiSelectField
from survey_a0_duplicate import details, analysis
import ast


class HomeForm1(models.Model):
    user = models.OneToOneField(User, on_delete=models.CASCADE,)
    details.loadData(survey_name=user)  # <===== This loads the data for the specific user <======
    global f1
    f1 = analysis.getQuestion(in_json=False)  # <==== We get the dictionary here <========
    d = list(f1.keys())

    ################### assign the filters ###################
    for k in d:
        q = list(f1[k].keys())
        q.sort()
        choices = tuple(map(lambda f: (f, f), q))
        locals()[k] = MultiSelectField(max_length=1000, choices=choices, blank=True)

    def save(self, *args, **kwargs):
        if self.pk is None:
            self.user = self.user.username
        super(HomeForm1, self).save(*args, **kwargs)

    def __str__(self):
        return self.title
This is not how you write Django code. Global variables are a bad idea anyway, but you must not use them in a multi-user, multi-process environment like Django. You will immediately have thread-safety issues; you must not do it.
Not only is there an explicit global in the code you have shown, there is clearly one inside survey_a0_duplicate - since details.loadData() does not actually return anything but you then "get the dictionary" from analysis.getQuestion. You must remove the globals from both locations.
Also, your save method is totally wrong. You have the user relationship; why would you overwrite it with the username? That not only makes no sense, it specifically destroys the type of the field that you have set. Just don't do it. Remove the entire save method.
But you need to stop messing about with choices at class level. That is never going to work. If you need to set choices dynamically, do it in a form, where you can customise the __init__ method to accept the current user and build up the choices based on that.
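For example, something along these lines keeps the per-user choices out of the model entirely (a minimal sketch, not a drop-in solution: it assumes details.loadData/analysis.getQuestion can be called per request once the globals are removed, and the form name is illustrative):

from django import forms
from survey_a0_duplicate import details, analysis

class SurveyForm(forms.Form):
    def __init__(self, *args, **kwargs):
        user = kwargs.pop("user")  # pass request.user in from the view
        super(SurveyForm, self).__init__(*args, **kwargs)
        details.loadData(survey_name=user.username)
        questions = analysis.getQuestion(in_json=False)
        # Build one multiple-choice field per question, per request,
        # instead of mutating the model class at import time.
        for name, answers in questions.items():
            choices = [(a, a) for a in sorted(answers)]
            self.fields[name] = forms.MultipleChoiceField(choices=choices, required=False)

In the view you would then instantiate it as form = SurveyForm(request.POST or None, user=request.user).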
I have the following models:
class Item(models.Model):
    # fields
    # ...

class Collection(models.Model):
    items = models.ManyToManyField(Item, related_name="collections")
    # other fields
    # ...
Now I want two things:
I want to control whether an Item can be added to a Collection.
I want the Collection to update some of its fields when an Item is added or removed.
For the second issue I know there is django.db.models.signals.m2m_changed, which I can use to hook into changes of the relation. Is it allowed/OK to change the Collection within the signal callback? Can I also use the signal to "abort" the insertion for issue 1?
I think the best way to approach both of your desired behaviors is not with signals, but rather with overridden save() and delete() methods on the through table, which you would define explicitly using the through argument. See https://docs.djangoproject.com/en/dev/ref/models/fields/#django.db.models.ManyToManyField.through and https://docs.djangoproject.com/en/dev/topics/db/models/#overriding-predefined-model-methods
Something like this:
# -*- coding: utf-8 -*-
from django.db import models


class Item(models.Model):
    # fields
    # ...

class Collection(models.Model):
    items = models.ManyToManyField(Item, related_name="collections", through="CollectionItem")
    # other fields
    # ...

class CollectionItem(models.Model):
    collection = models.ForeignKey(Collection)
    item = models.ForeignKey(Item)

    def save(self, *args, **kwargs):
        # Only allow this relationship to be created on some_condition
        # Part 1 of your question.
        if some_condition:
            super(CollectionItem, self).save(*args, **kwargs)
            # Update some fields on Collection when this
            # relationship is created
            # Part 2 of your question (1/2)
            self.collection.updateSomeFields()

    def delete(self, *args, **kwargs):
        collection = self.collection
        super(CollectionItem, self).delete(*args, **kwargs)
        # Update some fields on Collection when this relationship
        # is destroyed.
        # Part 2 of your question (2/2)
        collection.updateSomeFields()
Incidentally, you'll find that adding a relationship will cause a save-signal on this through model.
And, regarding signals, once you have the through table in place, you'd be able to listen for pre_save and/or post_save signals, but neither of them will allow you to directly veto the creation of the relationship.
If one or both of your models are supplied by a 3rd party and you really cannot create the through table, then, yes, the signal route may be the only way to go.
https://docs.djangoproject.com/en/dev/ref/signals/#m2m-changed
In which case, you could listen for the m2m_changed event and trigger updates to your collection objects (part 2 of your question) and retroactively delete inappropriately created relationships (part 1 of your question). However, as this latter approach is a fairly ugly kludge, I'd stick with the explicit through table if you can.
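If you do go that route, a handler could look roughly like this (a sketch; item_is_allowed() and updateSomeFields() stand in for whatever check and recalculation you need, and only the forward direction collection.items.add(...) is handled):

from django.db.models.signals import m2m_changed
from django.dispatch import receiver

@receiver(m2m_changed, sender=Collection.items.through)
def items_changed(sender, instance, action, reverse, pk_set, **kwargs):
    if not reverse and action == "post_add":
        # Part 1: retroactively remove items that should not have been added.
        bad_pks = [pk for pk in pk_set if not item_is_allowed(instance, pk)]
        if bad_pks:
            instance.items.remove(*bad_pks)
        # Part 2: refresh the collection's derived fields.
        instance.updateSomeFields()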
The pre_save signal is called before saving an instance, but you are not able to abort the save operation from there. A better solution is to add a new method to your Collection model that is responsible for checking whether an Item can be added:
class Collection(models.Model):
    items = models.ManyToManyField(Item, related_name="collections")
    ...

    def add_item(self, item):
        if self.check_if_item_can_be_added(item):
            self.items.add(item)
            self.save()

    def check_if_item_can_be_added(self, item):
        # do your checks here
        ...
When adding an instance to an m2m field, the save method does not get called. You are right, the m2m_changed signal is the way to go. You can safely update the collection instance in there.
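A minimal handler for that could look like this (a sketch; update_derived_fields() is a placeholder for whatever recalculation your Collection needs):

from django.db.models.signals import m2m_changed
from django.dispatch import receiver

@receiver(m2m_changed, sender=Collection.items.through)
def collection_items_changed(sender, instance, action, reverse, **kwargs):
    # Recalculate the collection's fields whenever items are added or removed.
    if not reverse and action in ("post_add", "post_remove", "post_clear"):
        instance.update_derived_fields()
        instance.save()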
I have a simple model:
class Article(models.Model):
    name = models.CharField(max_length=1000)
    custom_name = models.CharField(max_length=1000)
and a custom function:
def process_text(my_string):
    return len(my_string)
I want the following:
custom_name = process_text(name)
Suppose the admin enters name as Mark Pilgrim; then custom_name should get the auto-populated value of 12.
In admin.py, can I have something like
prepopulated_fields
What would be an easy way to go about this?
Thanks!!
The easiest way to do this is to add a method that listens for the pre_save signal.
Here is a sample you can use (this code goes in your models.py for the app):
from django.db.models.signals import pre_save
from django.dispatch import receiver

# Your models go here

def process_text(mystring):
    return len(mystring)


@receiver(pre_save, sender=Article)
def my_handler(sender, **kwargs):
    if not kwargs['raw']:
        obj = kwargs['instance']
        obj.custom_name = process_text(obj.name)
The signals documentation has more information on signals, and the pre_save documentation lists what arguments the method expects.
If you have a form with those fields, then you can use some JavaScript to update the second field when the first loses focus: when someone enters something into the first field and leaves it, the JS can calculate the length and put the value into the second field.
If this should be done at the backend level, then you can, for example, override the model's save() method. More here: https://docs.djangoproject.com/en/dev/topics/db/models/#overriding-predefined-model-methods
Cheers.
In my opinion the best option is to set the value at the form validation stage.
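For example, with a ModelForm the derived value can be set during cleaning (a minimal sketch, assuming Article and process_text are importable from models.py):

from django import forms

from .models import Article, process_text  # assumed locations

class ArticleForm(forms.ModelForm):
    class Meta:
        model = Article
        fields = ["name"]  # custom_name is filled in automatically

    def clean(self):
        cleaned_data = super(ArticleForm, self).clean()
        if cleaned_data.get("name"):
            # Derive custom_name from name during validation.
            self.instance.custom_name = str(process_text(cleaned_data["name"]))
        return cleaned_data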
I have a model which is overriding save() to slugify a field:
class MyModel(models.Model):
    name = models.CharField(max_length=200)
    slug = models.SlugField(max_length=200)

    def save(self, *args, **kwargs):
        self.slug = slugify(self.name)
        super(MyModel, self).save(*args, **kwargs)
When I run loaddata to load a fixture, this save() does not appear to be called, because the slug field is empty in the database. Am I missing something?
I can get it to work with a pre_save signal hook, but this is a bit of a hack and it would be nice to get save() working.
def mymodel_pre_save(sender, **kwargs):
    instance = kwargs['instance']
    instance.slug = slugify(instance.name)

pre_save.connect(mymodel_pre_save, sender=MyModel)
Thanks in advance.
No, you're not. save() is NOT called by loaddata, by design (it's way more resource intensive, I suppose). Sorry.
EDIT: According to the docs, pre_save is not called either (even though apparently it is?).
Data is saved to the database as-is, according to https://docs.djangoproject.com/en/dev/ref/django-admin/#what-s-a-fixture
I'm doing something similar now: I need a second model to have a parallel entry for each instance of the first model in the fixture. The second model can be enabled/disabled and has to retain that value across loaddata calls. Unfortunately, having a field with a default value (and leaving that field out of the fixture) doesn't seem to work: it gets reset to the default value when the fixture is loaded (the two models could have been combined otherwise).
So I'm on Django 1.4, and this is what I've found so far:
You're correct that save() is not called. There's a special DeserializedObject that does the insertion, by calling save_base() on the Model class - overriding save_base() on your model won't do anything since it's bypassed anyway.
@Dave is also correct: the current docs still say the pre-save signal is not called, but it is. It's behind a condition: if origin and not meta.auto_created
origin is the class for the model being saved, so I don't see why it would ever be falsy.
meta.auto_created has been False so far with everything I've tried, so I'm not yet sure what it's for. Looking at the Options object, it seems to have something to do with abstract models.
So yes, the pre_save signal is indeed being sent.
Further down, there's a post_save signal behind the same condition that is also being sent.
Using the post_save signal works. My models are more complex, including a ManyToMany on the "Enabled" model, but basically I'm using it like this:
from django.db import models
from django.db.models.signals import post_save


class Info(models.Model):
    name = models.TextField()

class Enabled(models.Model):
    info = models.ForeignKey(Info)


def create_enabled(sender, instance, *args, **kwargs):
    if Info == sender:
        Enabled.objects.get_or_create(id=instance.id, info=instance)

post_save.connect(create_enabled)
And of course, initial_data.json only defines instances of Info.