I have a model Property with certain fields and a relevant method:
class Property(models.Model):
    table = models.ForeignKey(Table)
    field1 = models.CharField()
    field2 = models.IntegerField()
    field3 = models.BooleanField()

    class Meta:
        abstract = True

    def post(self):
        pass
But then I have a definite number of types of columns, conceptually speaking. There is no difference in the fields, only in how the behavior of a certain method is implemented:
class Property1(Property):
    def post(self):
        # execute behavior for Property1
        pass

class Property2(Property):
    def post(self):
        # execute behavior for Property2
        pass
and so on.
If I turn Property into an abstract base model class and have the rest inherit from it, I will end up with a different table for each property type. I am not sure I want that: all the tables will look the same, which is redundant.
But at the same time, when running a query to get all properties in a table and calling post(), I want the corresponding behavior to be executed:
for prop in table.property_set.all():
    prop.post()
What are my options?
For that, you can use proxy models. Try it like this:
class Property(models.Model):
    table = models.ForeignKey(Table)
    field1 = models.CharField()
    field2 = models.IntegerField()
    field3 = models.BooleanField()

class Property1(Property):
    class Meta:
        proxy = True

    def post(self):
        # execute behavior for Property1
        pass

class Property2(Property):
    class Meta:
        proxy = True

    def post(self):
        # execute behavior for Property2
        pass
As per the documentation:
The MyPerson class operates on the same database table as its parent Person class. In particular, any new instances of Person will also be accessible through MyPerson, and vice-versa:
So you can get the proxy instances like this:
Property1.objects.filter(pk__in=table.property_set.all())
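If each row needs to pick its own behavior when you iterate, one hedged option (not part of the answer above, just a sketch) is to store a discriminator on Property and swap in the matching proxy class; the kind field and the mapping below are assumptions for illustration:
# Hypothetical: assumes Property gains a `kind` CharField recording each row's type.
PROXY_BY_KIND = {
    'one': Property1,
    'two': Property2,
}

for prop in table.property_set.all():
    # Proxies share the same table and fields, so reassigning the class is safe here.
    prop.__class__ = PROXY_BY_KIND[prop.kind]
    prop.post()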
I want to serialize a model, but want to include an additional field that requires doing some database lookups on the model instance to be serialized:
class FooSerializer(serializers.ModelSerializer):
    my_field = ...  # result of some database queries on the input Foo object

    class Meta:
        model = Foo
        fields = ('id', 'name', 'my_field')
What is the right way to do this? I see that you can pass in extra "context" to the serializer, is the right answer to pass in the additional field in a context dictionary?
With that approach, the logic of getting the field I need would not be self-contained within the serializer definition, which is not ideal since every serialized instance will need my_field. Elsewhere in the DRF serializers documentation it says "extra fields can correspond to any property or callable on the model". Are "extra fields" what I'm talking about?
Should I define a function in Foo's model definition that returns my_field value, and in the serializer I hook up my_field to that callable? What does that look like?
Happy to clarify the question if necessary.
I think SerializerMethodField is what you're looking for:
class FooSerializer(serializers.ModelSerializer):
    my_field = serializers.SerializerMethodField('is_named_bar')

    def is_named_bar(self, foo):
        return foo.name == "bar"

    class Meta:
        model = Foo
        fields = ('id', 'name', 'my_field')
http://www.django-rest-framework.org/api-guide/fields/#serializermethodfield
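Since the question is about a field that needs database lookups, the method body can run whatever queries it needs on the instance being serialized; with no argument, the method name defaults to get_<field_name>. A minimal sketch, assuming a hypothetical related Bar model with a foo foreign key:
class FooSerializer(serializers.ModelSerializer):
    my_field = serializers.SerializerMethodField()

    def get_my_field(self, obj):
        # Any lookup on the instance being serialized can go here.
        return Bar.objects.filter(foo=obj).count()

    class Meta:
        model = Foo
        fields = ('id', 'name', 'my_field')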
You can change your model method to a property and use it in the serializer with this approach.
class Foo(models.Model):
    . . .

    @property
    def my_field(self):
        return stuff

    . . .

class FooSerializer(ModelSerializer):
    my_field = serializers.ReadOnlyField(source='my_field')

    class Meta:
        model = Foo
        fields = ('my_field',)
Edit: With recent versions of REST framework (I tried 3.3.3), you don't need to change it to a property. The model method will work just fine.
With the latest version of Django REST Framework, you need to create a method in your model with the name of the field you want to add. No need for @property, and source='field' raises an error.
class Foo(models.Model):
    . . .

    def foo(self):
        return 'stuff'

    . . .

class FooSerializer(ModelSerializer):
    foo = serializers.ReadOnlyField()

    class Meta:
        model = Foo
        fields = ('foo',)
If you want to read and write your extra field, you can use a new custom serializer that extends serializers.Serializer, and use it like this:
class ExtraFieldSerializer(serializers.Serializer):
    def to_representation(self, instance):
        # this would have the same body as a SerializerMethodField
        return 'my logic here'

    def to_internal_value(self, data):
        # This must return a dictionary that will be used to
        # update the caller's validation data, i.e. if the result
        # produced should just be set back into the field that this
        # serializer is set to, return the following:
        return {
            self.field_name: 'Any python object made with data: %s' % data
        }

class MyModelSerializer(serializers.ModelSerializer):
    my_extra_field = ExtraFieldSerializer(source='*')

    class Meta:
        model = MyModel
        fields = ['id', 'my_extra_field']
I use this for related nested fields with some custom logic.
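Roughly how that behaves on input, as a hedged sketch (the data values are just examples): with source='*', whatever to_internal_value returns is merged into the parent serializer's validated_data, so you would typically pop or handle that key yourself in create() or update().
serializer = MyModelSerializer(data={'my_extra_field': 'raw input'})
serializer.is_valid(raise_exception=True)
# validated_data now contains the dict returned by to_internal_value, e.g.
# {'my_extra_field': 'Any python object made with data: raw input'}
print(serializer.validated_data)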
My response to a similar question (here) might be useful.
If you have a Model Method defined in the following way:
class MyModel(models.Model):
    ...

    def model_method(self):
        return "some_calculated_result"
You can add the result of calling said method to your serializer like so:
class MyModelSerializer(serializers.ModelSerializer):
    model_method_field = serializers.CharField(source='model_method')
p.s. Since the custom field isn't really a field in your model, you'll usually want to make it read-only, like so:
class Meta:
    model = MyModel
    read_only_fields = (
        'model_method_field',
    )
If you want to add a field dynamically for each object, you can use to_representation.
class FooSerializer(serializers.ModelSerializer):
    class Meta:
        model = Foo
        fields = ('id', 'name',)

    def to_representation(self, instance):
        representation = super().to_representation(instance)
        if instance.name != '':  # condition
            # adding a key and value
            representation['email'] = instance.name + "@xyz.com"
            # adding a key and value from some other related field
            representation['currency'] = instance.task.profile.currency
        return representation
In this way you can add keys and values for each object dynamically. This worked for me.
If we just want to add an additional field to a ModelSerializer, we can do it as below. The field can be assigned a value after some calculation or lookup, or when we want to send extra parameters in the API response.
In models.py:
class Foo(models.Model):
    """Model Foo"""
    name = models.CharField(max_length=30, help_text="Customer Name")
In serializers.py:
class FooSerializer(serializers.ModelSerializer):
    retrieved_time = serializers.SerializerMethodField()

    def get_retrieved_time(self, obj):
        """getter method to add field retrieved_time"""
        return None

    class Meta:
        model = Foo
        fields = ('id', 'name', 'retrieved_time')
Hope this could help someone.
class Demo(models.Model):
    ...

    @property
    def property_name(self):
        ...
If you want to use the same property name:
class DemoSerializer(serializers.ModelSerializer):
    property_name = serializers.ReadOnlyField()

    class Meta:
        model = Demo
        fields = '__all__'  # or you can choose your own fields
If you want to use a different property name, just change this:
new_property_name = serializers.ReadOnlyField(source='property_name')
As Chemical Programer said in this comment, in the latest DRF you can just do it like this:
class FooSerializer(serializers.ModelSerializer):
    extra_field = serializers.SerializerMethodField()

    def get_extra_field(self, foo_instance):
        return foo_instance.a + foo_instance.b

    class Meta:
        model = Foo
        fields = ('extra_field', ...)
DRF docs source
Even though this is not what the author wanted, it can still be useful for people here:
If you are using the ModelSerializer's .save() method, you can pass **kwargs into it. This way, you can save multiple dynamic values.
i.e. .save(**{'foo': 'bar', 'lorem': 'ipsum'})
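For example, in a view (a hedged sketch; owner and channel are hypothetical fields on Foo): extra keyword arguments passed to save() are merged into validated_data before the instance is created or updated.
serializer = FooSerializer(data=request.data)
serializer.is_valid(raise_exception=True)
# `owner` and `channel` are hypothetical model fields, shown for illustration.
serializer.save(owner=request.user, channel='api')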
Add the following to the serializer class:
def to_representation(self, instance):
    representation = super().to_representation(instance)
    representation['package_id'] = "custom value"
    return representation
Here's my attempt at a generalized natural key model manager. It's like the docs except it tries (unsuccessfully) to determine the natural key field names from the Meta.unique_together attribute.
class NaturalKeyModelManager(Manager):
    def get_by_natural_key(self, *args):
        field_dict = {}
        for i, k in enumerate(self.model.Meta.unique_together[0]):
            field_dict[k] = args[i]
        return self.get(**field_dict)
If I insert a debug print just before the for loop like this:
print dir(self.model.Meta)
it doesn't list the unique_together attribute at all:
['__doc__', '__module__', 'abstract']
The 'abstract' bit worried me, but another debug print shows that the model I'm trying to manage with natural keys is not abstract:
>>> print self.model.Meta.abstract
False
I am mixing in a lot of abstract base classes. Could that be the problem?
class MixedModel(NamedModel, TimeStampedModel, VersionedModel, Model):
    objects = NaturalKeyModelManager()

    class Meta:
        unique_together = (('name', 'version',),)
For completeness here's one of the mixins:
class TimeStampedModel(Model):
    created = DateTimeField(_("Created"), auto_now_add=True, null=True, editable=False)
    updated = DateTimeField(_("Updated"), auto_now=True, null=True, editable=True)

    class Meta:
        abstract = True
The hard-coded model manager works just fine:
class MixedModelManager(Manager):
    def get_by_natural_key(self, name, version):
        return self.get(name=name, version=version)
In order to get the actual options passed to Meta, you should use self.model._meta rather than self.model.Meta.
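For example, a corrected version of the manager might look like this (a sketch, assuming unique_together is declared on the concrete model and its first tuple holds the natural key fields):
from django.db.models import Manager

class NaturalKeyModelManager(Manager):
    def get_by_natural_key(self, *args):
        # _meta holds the options collected from the class and all its bases;
        # model.Meta only reflects what was written in that one class body.
        field_names = self.model._meta.unique_together[0]
        return self.get(**dict(zip(field_names, args)))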
I have a Django Model that looks like this:
class MyModel(models.Model):
    field1 = models.IntegerField()
    field2 = models.IntegerField()
    nonDbField = SomeObject()
    objects = MyCustomManager()
field1 is actually a PK to an abstract class of SomeObject.
I want a custom manager that for every value returned by any of the functions (all, filter, get, etc) does the following:
value.nonDbField = SomeObject.objects.get(pk=value.field1)
I've tested that I can manually override get like so:
class MyCustomManager(models.Manager):
    def get(self, *args, **kwargs):
        value = super(MyCustomManager, self).get(*args, **kwargs)
        value.nonDbField = SomeObject.objects.get(listid=value.itemListID)
        return value
but wondered if there was an easier way to do it across all functions.
There's going to be plenty of you that will say, "Why are you doing this?". It has to do with a model inheritance of a legacy, but still active database.
If you need nonDbField's value to be related to field1 (or any other field on the model), you can try something like this:
class MyModel(models.Model):
    # your fields here...

    def _nonDbField(self):
        return SomeObject.objects.get(pk=self.field1)

    nonDbField = property(_nonDbField)
This allows you to do something like this:
MyModel.objects.get(pk=1).nonDbField
Keep in mind that you are making a database query each time you access nonDbField (which may or may not be detrimental to your DB performance).
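If the repeated query matters, one hedged alternative (assuming Django 1.8+) is to cache the lookup per instance with cached_property:
from django.utils.functional import cached_property

class MyModel(models.Model):
    # your fields here...

    @cached_property
    def nonDbField(self):
        # Evaluated once per instance, then cached on the object.
        return SomeObject.objects.get(pk=self.field1)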
You can use a property for your calculated fields:
class MyModel(models.Model):
    # your fields here...
    first_name = models.CharField()
    last_name = models.CharField()

    @property
    def fullname(self):
        return f"{self.first_name} {self.last_name}"
This allows you to do something like this:
obj = MyModel.objects.get(pk=1)
print(obj.fullname)
I have a normal model and an abstract model like so:
class TaggedSubject(models.Model):
    user = models.ForeignKey(User, null=True, blank=True)
    category = models.CharField(max_length=200)
    foo = models.CharField(max_length=50)
    bar = models.CharField(max_length=50)
    # etc
    content_type = models.ForeignKey(ContentType)
    content_object_pk = models.CharField(max_length=255)
    content_object = generic.GenericForeignKey("content_type", "content_object_pk")

    def __unicode__(self):
        if self.user:
            return "%s" % (self.user.get_full_name() or self.user.username)
        else:
            return self.label

class Taggable(models.Model):
    tagged_subjects = generic.GenericRelation(TaggedSubject, content_type_field='content_type', object_id_field='content_object_pk')

    @property
    def tagged_users(self):
        return User.objects.filter(pk__in=self.tagged_subjects.filter(user__isnull=False).values("user"))

    class Meta:
        abstract = True
The Taggable abstract model class then gets used like so:
class Photo(Taggable):
    image = models.ImageField(upload_to="foo")
    # ... etc
So if we have a photo object:
photo = Photo.objects.all()[0]
I can get all the users tagged in the photo with photo.tagged_users.all()
I want to add the inverse relation to the user object, so that if I have a user:
user = User.objects.filter(pk__in=TaggedSubject.objects.exclude(user__isnull=True).values("user"))[0]
I can call something like user.tagged_photo_set.all() and have it return all the photo objects.
I suspect that since TaggedSubject connects to the Taggable model on a generic relation that it won't be possible to use it as a through model with a ManyToMany field.
Assuming this is true, this is the function I believe I'd need to add (somehow) to the User model:
def tagged_photo_set(self):
    return Photo.objects.filter(pk__in=TaggedSubject.objects.filter(
        user=self,
        content_type=ContentType.objects.get_for_model(Photo)
    ))
I'm wondering if it's possible to set it up so that each time a new model class is created based on Taggable, it creates a version of the function above and adds it (ideally as a function that behaves like a property!) to User.
Alternatively, if it is somehow possible to do ManyToMany field connections on a generic relation (which I highly doubt), that would work too.
Finally, if there is a third even cooler option that I am not seeing, I'm certainly open to it.
You could use add_to_class and the class_prepared signal to do some post processing when models subclassing your base class are set up:
def add_to_user(sender, **kwargs):
    def tagged_FOO_set(self):
        return sender.objects.filter(pk__in=TaggedSubject.objects.filter(
            user=self,
            content_type=ContentType.objects.get_for_model(sender)))

    if issubclass(sender, Taggable):
        method_name = 'tagged_{model}_set'.format(model=sender.__name__.lower())
        User.add_to_class(method_name, property(tagged_FOO_set))

class_prepared.connect(add_to_user)
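With the handler connected before the concrete models are defined (for example in the same models module as Taggable), each subclass such as Photo should end up exposing a matching property on User. A hedged usage sketch:
user = User.objects.get(pk=1)
photos = user.tagged_photo_set  # a Photo queryset, evaluated lazily
print(photos.count())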