Is it possible to create a dependency between serializer fields, like in the code below?
class ProSerializer(serializers.ModelSerializer):
    entity = serializers.PrimaryKeyRelatedField(many=False, queryset=Entity.objects.all())
    foo = serializers.PrimaryKeyRelatedField(many=True, queryset=Foo.objects.filter(entity=entity))

    class Meta:
        model = ..............
What I want is to limit the queryset on foo to just the Foo objects belonging to the chosen entity. Is there a way to do that?
Django Rest Framework does not make this easy, at least in version 2.x – and I am not sure whether there are/were any plans to make it better in version 3.
I hacked fixes for this in various places, with try/except blocks in serializer __init__ methods that filtered the applicable field's queryset by the parent value passed in the data dictionary, before making an attempt at standardising the problem – the following is what I came up with.
SlugRelatedDependentField
from functools import reduce

from django.utils.functional import SimpleLazyObject
from rest_framework.relations import SlugRelatedField


class SlugRelatedDependentField(SlugRelatedField):
    def __init__(self, depends_on=None, **kwargs):
        assert depends_on is not None, 'The `depends_on` argument is required.'
        self.depends_on = depends_on  # e.g. 'archive_unit__organization' or 'organization'
        self.depends_segments = self.depends_on.split('__')
        self.depends_parent = self.depends_segments.pop(0)
        self.depends_field = SimpleLazyObject(lambda: self.parent.parent.fields[self.depends_parent])
        self.depends_queryset = SimpleLazyObject(lambda: self.depends_field.queryset)
        self.depends_model = SimpleLazyObject(lambda: self.depends_queryset.model)
        super(SlugRelatedDependentField, self).__init__(**kwargs)

    def contextualize(self, instance, data):
        self.data = data
        self.instance = instance

    def get_queryset(self):
        try:
            return self.queryset.filter(**{self.depends_on: reduce(getattr, self.depends_segments, self.get_relation())})
        except self.depends_model.DoesNotExist:
            # if the parent was absent or invalid, empty the queryset
            return self.queryset.none()
        except TypeError:
            # if the parent was a Page instance, use the full queryset; it's only a list view
            return self.queryset.all()

    def get_relation(self):
        try:
            # if an allowed parent was passed, filter by it
            return self.depends_queryset.get(**{self.depends_field.slug_field: self.data[self.depends_parent]})
        except (KeyError, TypeError):
            # if data was empty or no parent was passed, try to grab it off of the model instance
            if isinstance(self.instance, self.parent.parent.Meta.model):
                return getattr(self.instance, self.depends_parent)
            elif self.instance is None:
                raise self.depends_model.DoesNotExist
            else:
                raise TypeError
Usage
class RepositorySerializer(ModelSerializer):
    organization = SlugRelatedField(queryset=Organization.objects.all(), slug_field='slug')
    teams = SlugRelatedDependentField(allow_null=True, depends_on='organization', many=True, queryset=Team.objects.all(), required=False, slug_field='slug')

    def __init__(self, instance=None, data=empty, **kwargs):
        name = 'teams'
        f = self.fields[name]
        # assign instance and data for get_queryset
        f.child_relation.contextualize(instance, data)
        # inject relation values from the instance if they were omitted, so they are validated regardless
        if data is not empty and instance and name not in data:
            data[name] = [getattr(relation, f.child_relation.slug_field) for relation in getattr(instance, name).all()]
        super(RepositorySerializer, self).__init__(instance=instance, data=data, **kwargs)
Summary
SlugRelatedDependentField expands on the regular SlugRelatedField with a depends_on kwarg, which takes a string describing the field's relation to another field – in this example, the usage declares that the teams assigned to this repository must belong to the chosen organization.
A few gotchas
I empty the queryset with .none() if the parent does not exist; this avoids leaking choices, which may otherwise be exposed via OPTIONS requests and validation messages and is usually undesirable.
I used data when querying for the parent record; IIRC the reason was that data is consistently available, whilst the parent field's object may not be, e.g. in the case of PATCH requests.
You'll notice I inject any omitted relation values in the latter portion of the serializer init; this forces validation to run on the many field – useful e.g. if the user changed the organization of the record in a PATCH request, meaning the assigned teams no longer apply.
Support for distant relations
Another problem this solution caters for is referencing distant relations. This can be done by passing a __-delimited string to depends_on, e.g. repository__organization. I don't have a great example use case for this, but it's there if you need it, as sketched below.
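As a rough, hypothetical sketch of what that might look like (the Pipeline, Repository and Deployment models below are invented purely for illustration, and the same __init__ wiring shown in the usage example above is still needed):

class PipelineSerializer(ModelSerializer):
    repository = SlugRelatedField(queryset=Repository.objects.all(), slug_field='slug')
    # Deployments offered here are limited to the organization of the chosen
    # repository: the field resolves repository.organization and then filters
    # Deployment.objects.filter(repository__organization=<that organization>).
    deployments = SlugRelatedDependentField(
        depends_on='repository__organization', many=True,
        queryset=Deployment.objects.all(), required=False, slug_field='slug')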
Related
I currently have a form object that looks like this
class MainLoginForm(forms.Form):
    PART_CHOICES = (
        ("0", "0"),
        ("1", "1"),
    )

    user_name = forms.CharField(required=True)
    user_category = forms.ChoiceField(choices=PART_CHOICES)
    user_password = forms.CharField(widget=forms.PasswordInput, required=True)
I then use this object in the following manner in my view
def home(request):
    # if this is a POST request we need to process the form data
    if request.method == 'POST':
        form = MainLoginForm(request.POST)
        if form.is_valid():
            return HttpResponse('Hello World')
    # if a GET (or any other method) we'll create a blank form
    else:
        form = MainLoginForm()
    return render(request, 'main/home.html', {
        'form': form,
    })
Everything seems to work, but I am curious as to why the object I created has class members and not instance members, like this:
class MainLoginForm(forms.Form):
    PART_CHOICES = (
        ("0", "0"),
        ("1", "1"),
    )

    def __init__(self):
        self.user_name = forms.CharField(required=True)
        self.user_category = forms.ChoiceField(choices=self.PART_CHOICES)
        self.user_password = forms.CharField(widget=forms.PasswordInput, required=True)
Wouldn't the current way be unsafe when multiple simultaneous requests come in?
The following syntax is called declarative syntax:
class MainLoginForm(forms.Form):
    user_name = forms.CharField(required=True)
You can have a look at the comments in forms.Form
class Form(six.with_metaclass(DeclarativeFieldsMetaclass, BaseForm)):
    "A collection of Fields, plus their associated data."
    # This is a separate class from BaseForm in order to abstract the way
    # self.fields is specified. This class (Form) is the one that does the
    # fancy metaclass stuff purely for the semantic sugar -- it allows one
    # to define a form using declarative syntax.
    # BaseForm itself has no way of designating self.fields.
Note that the fields you define on the form as above are only available as self.fields['user_name'], not directly as self.user_name, where self refers to the form instance.
The DeclarativeFieldsMetaclass is responsible for reading the fields defined using declarative syntax and then populating self.fields on the form instance.
Wouldn't the current way be unsafe when multiple simultaneous requests come in?
It is not exactly clear what you mean by unsafe, because that is very subjective here. But if you just meant the fields being defined at the class level, you can see from the explanation above that each instance gets its own self.fields, so from that perspective the current method is safe.
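If it helps to convince yourself, here is a quick sketch (using the MainLoginForm above, e.g. in a Django shell) showing that every instance gets its own deep copy of the declared fields, so one request mutating self.fields cannot affect another:

form_a = MainLoginForm()
form_b = MainLoginForm()

# The declarative field definitions live on the class as base_fields,
# but BaseForm.__init__ deep-copies them into each instance's own self.fields.
form_a.fields['user_name'].required = False

assert form_b.fields['user_name'].required is True
assert MainLoginForm.base_fields['user_name'].required is True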
If you do wish to create a form where fields need to be defined within the __init__ method, you have to add them to the self.fields collection:
class MainLoginForm(forms.Form):
    def __init__(self, *args, **kwargs):
        super(MainLoginForm, self).__init__(*args, **kwargs)
        # create a new field
        self.fields['user_name'] = forms.CharField(required=True)
I am using field-level validation in my serializer, and I have a situation where I need the value of the first field while validating the second field. I know I can access that in object-level validation, but my serializer has many=True, and in object-level validation I cannot tell the client side which iteration raised the error.
Serializer:
class Keys_Serializer(serializers.Serializer):
    """
    """
    key_id = serializers.IntegerField(required=True)
    key_name = serializers.CharField(required=True)
    value_id = serializers.IntegerField(required=False)

    def validate_key_id(self, value):
        """
        validate key_id
        """
        # *** validate key_id here ***
        return value

    def validate_value_id(self, value):
        """
        validate value_id w.r.t. key_id
        """
        # *** I need key_id of the current iteration here so that I can validate value_id ***
        return value
Is there any way of accessing the value of key_id in the value_id validation?
No, that is not possible. If you need to access more than one value, you have to use object-level validation (see the docs):
class Keys_Serializer(serializers.Serializer):
    key_id = serializers.IntegerField(required=True)
    key_name = serializers.CharField(required=True)
    value_id = serializers.IntegerField(required=False)

    def validate(self, data):
        # here you can access all values
        key_id = data['key_id']
        value_id = data.get('value_id')
        # perform your validation
        if key_id != value_id:
            raise serializers.ValidationError("key_id must be equal to value_id")
        return data
I dug around the DRF codebase a little bit. You can get the values of all fields using the following approach. This way you can throw the serialization error as {'my_field': 'error message'} instead of {'non_field_errors': 'error message'}.
def validate_myfield(self, value):
    data = self.get_initial()  # data for all the fields
    # do your validation
    return value
However, if you wish to do this for a ListSerializer, i.e. for serializer = serializer_class(many=True), it won't work; you will get a list of empty values. In that scenario you could write your validations in the validate method, and to avoid non_field_errors in your serialization error you can raise ValidationError with the error message as a dictionary instead of a string.
def validate(self, data):
    # do your validation
    raise serializers.ValidationError({"your_field": "error_message"})
def validate(self, validated_data):
    """
    Validate and verify the user data before it gets saved.
    :param validated_data: dict obj
    :return: validated_data
    """
    existing_data = self.to_representation(self.instance)
So if you're performing a create or update, the plain unvalidated data can be accessed in:
self.context['view'].get_serializer().data
Although the other solutions are in fact cleaner, we have a single model on a partitioned table and need another field that specifies the partition; since we're not partitioned on the primary key, the database has no clue how to look it up.
EDIT: It appears that field is actually blank; you may have luck with the following, depending on how the serializer is used (for this it must be used in a ViewSet):
self.context['view'].get_serializer().context['request'].data
One other simple solution is accessing the properties as below:
class Keys_Serializer(serializers.Serializer):
    key_id = serializers.IntegerField(required=True)
    key_name = serializers.CharField(required=True)
    value_id = serializers.IntegerField(required=False)

    def validate_key_name(self, value):
        # getting the other field values as below
        other_values = self.initial_data
        # scrape what you need from other_values;
        # at update time, use self.instance and self.initial_data
        # together to get the respective values
        return value
I had a similar problem where I just needed other values for the output, not for validation, but this could probably be a good starting point. It is based on https://www.django-rest-framework.org/api-guide/fields/#examples.
You need to override the get_attribute function to return the object instead of the attribute. From the docs:
class ClassNameField(serializers.Field):
    def get_attribute(self, instance):
        # We pass the object instance onto `to_representation`,
        # not just the field attribute.
        return instance
Then I was able to do stuff like
def to_representation(self, instance):  # pylint: disable=arguments-renamed
    return instance.my_function()
or
    return f"{instance.attribute_1}, {instance.attribute_2}"
def validate_fieldname(self, value):
    data = self.context['request'].data
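Note that this relies on the request being present in the serializer context. DRF's generic views and viewsets add it automatically via get_serializer_context(); if you instantiate the serializer yourself, a rough sketch (MySerializer being a placeholder name) would be:

# pass the request explicitly so self.context['request'] is available in validators
serializer = MySerializer(data=request.data, context={'request': request})
serializer.is_valid(raise_exception=True)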
When I try to deserialize some data into an object, if I include a field that is unique and give it a value that is already assigned to an object in the database, I get a key constraint error. This makes sense, as it is trying to create an object with a unique value that is already in use.
Is there a way to have a get_or_create type of functionality for a ModelSerializer? I want to be able to give the Serializer some data, and if an object exists that has the given unique field, then just return that object.
In my experience nmgeek's solution won't work in DRF 3+ as serializer.is_valid() correctly honors the model's unique_together constraint. You can work around this by removing the UniqueTogetherValidator and overriding your serializer's create method.
class MyModelSerializer(serializers.ModelSerializer):
    def run_validators(self, value):
        # iterate over a copy so removing validators is safe
        for validator in list(self.validators):
            if isinstance(validator, validators.UniqueTogetherValidator):
                self.validators.remove(validator)
        super(MyModelSerializer, self).run_validators(value)

    def create(self, validated_data):
        instance, _ = models.MyModel.objects.get_or_create(**validated_data)
        return instance

    class Meta:
        model = models.MyModel
The Serializer restore_object method was removed starting with the 3.0 version of REST Framework.
A straightforward way to add get_or_create functionality is as follows:
class MyObjectSerializer(serializers.ModelSerializer):
    class Meta:
        model = MyObject
        fields = (
            'unique_field',
            'other_field',
        )

    def get_or_create(self):
        defaults = self.validated_data.copy()
        identifier = defaults.pop('unique_field')
        return MyObject.objects.get_or_create(unique_field=identifier, defaults=defaults)
def post(self, request, format=None):
    serializer = MyObjectSerializer(data=request.data)
    if serializer.is_valid():
        instance, created = serializer.get_or_create()
        if not created:
            serializer.update(instance, serializer.validated_data)
        return Response(serializer.data, status=status.HTTP_202_ACCEPTED)
    return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
However, the resulting code doesn't seem to me any more compact or easier to understand than querying whether the instance exists and then updating or saving depending on the result of that query.
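For comparison, here is a minimal sketch of that query-then-update-or-create approach living entirely in the view (reusing the same hypothetical MyObject/MyObjectSerializer names from above):

def post(self, request, format=None):
    # Look the object up first; if it exists, bind the serializer to it so that
    # is_valid() runs as an update and unique checks exclude this instance.
    instance = MyObject.objects.filter(
        unique_field=request.data.get('unique_field')).first()
    serializer = MyObjectSerializer(instance, data=request.data)
    if serializer.is_valid():
        serializer.save()  # calls .update() if an instance was found, .create() otherwise
        return Response(serializer.data,
                        status=status.HTTP_200_OK if instance else status.HTTP_201_CREATED)
    return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)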
@Groady's answer works, but you have now lost your ability to validate uniqueness when creating new objects (the UniqueTogetherValidator has been removed from your list of validators regardless of the circumstance). The whole idea of using a serializer is that you have a comprehensive way to create a new object that validates the integrity of the data you want to use to create it. Removing validation isn't what you want. You DO want this validation to be present when creating new objects; you'd just like to be able to throw data at your serializer and get the right behavior under the hood (get_or_create), validation and all.
I'd recommend overriding your serializer's is_valid() method instead. With the code below, you first check whether the object exists in your database; if not, you proceed with full validation as usual. If it does exist, you simply attach this object to your serializer and then proceed with validation as usual, as if you'd instantiated the serializer with the associated object and data. Then, when you hit serializer.save(), you'll simply get back your already-created object, and you keep the same high-level code pattern: instantiate your serializer with data, call .is_valid(), then call .save() and get your model instance back (à la get_or_create). No need to override .create() or .update().
The caveat here is that you will get an unnecessary UPDATE query on your database when you hit .save(), but the cost of one extra database call to keep a clean developer API with full validation in place seems worthwhile. It also gives you the extensibility of using a custom models.Manager and custom models.QuerySet to uniquely identify your model from a few fields only (whatever the primary identifying fields may be) and then treating the rest of the data in the serializer's initial_data as an update to the object in question. That lets you grab unique objects from a subset of the data fields and treat the remaining fields as updates to the object (in which case the UPDATE call would not be extra).
Note that the calls to super() use Python 3 syntax. If you're on Python 2 you'd want the old style: super(MyModelSerializer, self).is_valid(raise_exception).
from django.core.exceptions import ObjectDoesNotExist, MultipleObjectsReturned


class MyModelSerializer(serializers.ModelSerializer):
    def is_valid(self, raise_exception=False):
        if hasattr(self, 'initial_data'):
            # If we are instantiating with data={something}
            try:
                # Try to get the object in question
                obj = models.MyModel.objects.get(**self.initial_data)
            except (ObjectDoesNotExist, MultipleObjectsReturned):
                # Except not finding the object or the data being ambiguous
                # for defining it. Then validate the data as usual
                return super().is_valid(raise_exception)
            else:
                # If the object is found, add it to the serializer. Then
                # validate the data as usual
                self.instance = obj
                return super().is_valid(raise_exception)
        else:
            # If the serializer was instantiated with just an object, and no
            # data={something}, proceed as usual
            return super().is_valid(raise_exception)

    class Meta:
        model = models.MyModel
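Usage then keeps the standard pattern; a brief sketch (inside a view, say):

serializer = MyModelSerializer(data=request.data)
serializer.is_valid(raise_exception=True)  # attaches the existing instance if one matches
instance = serializer.save()               # .update() for an existing object, .create() otherwise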
There are a couple of scenarios where a serializer might need to be able to get or create objects based on data received by a view – where it's not logical for the view to do the lookup/create itself – and I ran into this this week.
Yes, it is possible to have get_or_create functionality in a serializer. There is a hint about this in the documentation here: http://www.django-rest-framework.org/api-guide/serializers#specifying-which-fields-should-be-write-only where:
restore_object method has been written to instantiate new users.
The instance attribute is fixed as None to ensure that this method is not used to update Users.
I think you can go further with this to put full get_or_create into the restore_object - in this instance loading Users from their email address which was posted to a view:
class UserFromEmailSerializer(serializers.ModelSerializer):
    class Meta:
        model = get_user_model()
        fields = [
            'email',
        ]

    def restore_object(self, attrs, instance=None):
        assert instance is None, 'Cannot update users with UserFromEmailSerializer'
        (user_object, created) = get_user_model().objects.get_or_create(
            email=attrs.get('email')
        )
        # You can extend here to work on `user_object` as required - update etc.
        return user_object
Now you can use the serializer in a view's post method, for example:
def post(self, request, format=None):
    # Serialize the "new" member's email
    serializer = UserFromEmailSerializer(data=request.DATA)
    if not serializer.is_valid():
        return Response(serializer.errors,
                        status=status.HTTP_400_BAD_REQUEST)
    # The loaded or created user is now available in the serializer object:
    person = serializer.object
    # Save / update etc.
A better way of doing this is to use the PUT verb instead and then override the get_object() method in the ModelViewSet, as sketched below. I answered this here: https://stackoverflow.com/a/35024782/3025825.
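The linked answer boils down to something like the following sketch (model and field names here are illustrative only); with get_object() overridden this way, a PUT to the detail URL either updates the existing record or creates it:

class MyModelViewSet(viewsets.ModelViewSet):
    queryset = MyModel.objects.all()
    serializer_class = MyModelSerializer
    lookup_field = 'unique_field'  # ideally a natural key rather than pk

    def get_object(self):
        # Instead of raising 404 for a missing object, create a stub keyed by
        # the URL kwarg so the PUT behaves as get-or-create.
        lookup_url_kwarg = self.lookup_url_kwarg or self.lookup_field
        obj, _ = MyModel.objects.get_or_create(
            **{self.lookup_field: self.kwargs[lookup_url_kwarg]})
        self.check_object_permissions(self.request, obj)
        return obj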
A simple workaround is to use the to_internal_value method:
class MyModelSerializer(serializers.ModelSerializer):
    def to_internal_value(self, validated_data):
        instance, _ = models.MyModel.objects.get_or_create(**validated_data)
        return instance

    class Meta:
        model = models.MyModel
I know it's a hack, but in case you need a quick solution.
P.S. Of course, editing is not supported
class ExpoDeviceViewSet(viewsets.ModelViewSet):
    permission_classes = [IsAuthenticated, ]
    serializer_class = ExpoDeviceSerializer

    def get_queryset(self):
        user = self.request.user
        return ExpoDevice.objects.filter(user=user)

    def perform_create(self, serializer):
        existing_token = self.request.user.expo_devices.filter(
            token=serializer.validated_data['token']).first()
        if existing_token:
            return existing_token
        return serializer.save(user=self.request.user)
In case anyone needs to create an object if it does not exist on a GET request:
class MyModelViewSet(viewsets.ModelViewSet):
    queryset = models.MyModel.objects.all()
    serializer_class = serializers.MyModelSerializer

    def retrieve(self, request, pk=None):
        instance, _ = models.MyModel.objects.get_or_create(pk=pk)
        serializer = self.serializer_class(instance)
        return response.Response(serializer.data)
Another solution: I found that the UniqueValidator wasn't in the serializer's validators, but rather in the field's validators.
def is_valid(self, raise_exception=False):
    self.fields["my_field_to_fix"].validators = [
        v
        for v in self.fields["my_field_to_fix"].validators
        if not isinstance(v, validators.UniqueValidator)
    ]
    return super().is_valid(raise_exception)
I've overridden the default manager of my models in order to show only allowed items, according to the logged-in user (a sort of object-specific permission):
class User_manager(models.Manager):
    def get_query_set(self):
        """ Filter results according to the logged-in user """
        # Compose a filter dictionary with the current user (stored by a middleware method)
        user_filter = middleware.get_user_filter()
        return super(User_manager, self).get_query_set().filter(**user_filter)


class Foo(models.Model):
    objects = User_manager()
    ...
In this way, whenever I use Foo.objects, the current user is retrieved and a filter is applied to the default queryset so that only allowed records are shown.
Then, I have a model with a ForeignKey to Foo:
class Bar(models.Model):
    foo = models.ForeignKey(Foo)


class BarForm(forms.ModelForm):
    class Meta:
        model = Bar
When I compose BarForm I'm expecting to see only the filtered Foo instances, but the filter is not applied. I think it is because the queryset is evaluated and cached at Django start-up, when no user is logged in and no filter is applied.
Is there a way to make Django evaluate the ModelChoice queryset at run time, without having to make it explicit in the form definition (all performance issues aside)?
EDIT
I've found where the queryset is evaluated (django\db\models\fields\related.py: 887):
def formfield(self, **kwargs):
    db = kwargs.pop('using', None)
    defaults = {
        'form_class': forms.ModelChoiceField,
        'queryset': self.rel.to._default_manager.using(db).complex_filter(self.rel.limit_choices_to),
        'to_field_name': self.rel.field_name,
    }
    defaults.update(kwargs)
    return super(ForeignKey, self).formfield(**defaults)
Any hint?
I had exactly this problem – I needed to populate a select field with user objects from a group – but fun_vit's answer is incorrect (at least for Django 1.5).
Firstly, you don't want to overwrite the fields['somefield'].choices object – it is a ModelChoiceIterator object, not a queryset. Secondly, a comment in django.forms.BaseForm warns you against overriding base_fields:
# The base_fields class attribute is the *class-wide* definition of
# fields. Because a particular *instance* of the class might want to
# alter self.fields, we create self.fields here by copying base_fields.
# Instances should always modify self.fields; they should not modify
# self.base_fields.
This worked for me (django 1.5):
class MyForm(ModelForm):
    users = ModelMultipleChoiceField(queryset=User.objects.none())

    def __init__(self, *args, **kwargs):
        super(MyForm, self).__init__(*args, **kwargs)
        site = Site.objects.get_current()
        self.fields['users'].queryset = site.user_group.user_set.all()

    class Meta:
        model = MyModel
I use the __init__ of a custom form:
class BT_Form(forms.ModelForm):
    def __init__(self, *args, **kwargs):
        super(BT_Form, self).__init__(*args, **kwargs)

        # prepare new values
        cities = [(u'', u'------')]  # default value
        cities.extend([
            (
                c.pk,
                c.__unicode__()
            ) for c in City.objects.filter(enabled=True).all()
        ])

        self.fields['fly_from_city'].choices = cities  # renew values
No way: I had to rewrite the queryset definition (which is evaluated at startup).
I suppose a similar problem has been discussed here before, but I couldn't find it.
Let's suppose I have an Editor and a Supervisor. I want the Editor to be able to add new content (e.g. a news post), but before publication it has to be acknowledged by the Supervisor.
When the Editor lists all items, I want some fields on the models (like an 'ack' field) to be read-only (so he can see what has been ack'ed and what's still awaiting approval), but the Supervisor should be able to change everything (list_editable would be perfect).
What are the possible solutions to this problem?
I think there is an easier way to do that.
Suppose we have the same Blog/Post problem.
blog/models.py:
class Blog(models.Model):
    ...
    # fields like author, title, etc.
    ...


class Post(models.Model):
    ...
    # fields like blog, title, etc.
    ...
    approved = models.BooleanField(default=False)
    approved_by = models.ForeignKey(User)

    class Meta:
        permissions = (
            ("can_approve_post", "Can approve post"),
        )
And the magic is in the admin:
blog/admin.py:
...
from django.views.decorators.csrf import csrf_protect
...


def has_approval_permission(request, obj=None):
    if request.user.has_perm('blog.can_approve_post'):
        return True
    return False


class PostAdmin(admin.ModelAdmin):
    # csrf_protect
    def changelist_view(self, request, extra_context=None):
        if not has_approval_permission(request):
            self.list_display = [...]  # list of fields to show if the user can't approve the post
            self.editable = [...]
        else:
            self.list_display = [...]  # list of fields to show if the user can approve the post
        return super(PostAdmin, self).changelist_view(request, extra_context)

    def get_form(self, request, obj=None, **kwargs):
        if not has_approval_permission(request, obj):
            self.fields = [...]  # same thing
        else:
            self.fields = ['approved']
        return super(PostAdmin, self).get_form(request, obj, **kwargs)
In this way you can use Django's custom-permission API, and you can override the methods for saving the model or getting the queryset if you need to (a rough sketch follows). In the has_approval_permission method you define the logic of when the user can or can't do something.
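As a rough sketch of those overrides, building on the has_approval_permission helper above (the author field on Post is assumed here purely for illustration, and get_queryset is the Django 1.6+ spelling):

class PostAdmin(admin.ModelAdmin):

    def get_queryset(self, request):
        qs = super(PostAdmin, self).get_queryset(request)
        if not has_approval_permission(request):
            # e.g. editors only see their own posts (assumes an `author` field)
            qs = qs.filter(author=request.user)
        return qs

    def save_model(self, request, obj, form, change):
        if not has_approval_permission(request):
            # anything saved by a non-approver goes back to "awaiting approval"
            obj.approved = False
        super(PostAdmin, self).save_model(request, obj, form, change)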
Starting with Django 1.7, you can use the get_fields() hook, which makes it much simpler to implement conditional fields:
class MyModelAdmin(admin.ModelAdmin):
    ...

    def get_fields(self, request, obj=None):
        fields = super(MyModelAdmin, self).get_fields(request, obj)
        if request.user.is_superuser:
            fields += ('approve',)
        return fields
I have a system kind of like this on a project that I'm just finishing up. It took a lot of work to put together, but here are some of the components I needed to make my system work:
You need a way to define an Editor and a Supervisor. The three ways this could be done are: 1) have an M2M field that defines the Supervisors (and assume that everyone else with permission to read/write is an Editor); 2) make two new User models that inherit from User (probably more work than necessary); or 3) use the django.auth ability to have a UserProfile class. Method #1 is probably the most reasonable.
Once you can identify what type the user is, you need a way to generically enforce the authorization you're looking for. I think the best route here is probably a generic admin model.
Lastly you'll need some type of "parent" model that will hold the permissions for whatever needs to be moderated. For example, if you had a Blog model and BlogPost model (assuming multiple blogs within the same site), then Blog is the parent model (it can hold the permissions of who approves what). However, if you have a single blog and there is no parent model for BlogPost, we'll need some place to store the permissions. I've found the ContentType works out well here.
Here are some ideas in code (untested and more conceptual than actual).
Make a new app called 'moderated' which will hold our generic stuff.
moderated.models.py
class ModeratedModelParent(models.Model):
    """Class to govern rules for a given model"""
    content_type = models.OneToOneField(ContentType)
    can_approve = models.ManyToManyField(User)


class ModeratedModel(models.Model):
    """Class to implement a model that is moderated by a supervisor"""
    is_approved = models.BooleanField(default=False)

    def get_parent_instance(self):
        """
        If the model already has a parent, override to return the parent's type
        For example, for a BlogPost model it could return self.parent_blog
        """
        # Get self's ContentType, then return the ModeratedModelParent for that type
        self_content_type = ContentType.objects.get_for_model(self)
        try:
            return ModeratedModelParent.objects.get(content_type=self_content_type)
        except ModeratedModelParent.DoesNotExist:
            # Create it if it doesn't already exist...
            return ModeratedModelParent.objects.create(content_type=self_content_type)

    class Meta:
        abstract = True
So now we have a generic, re-usable bit of code with which we can look up the permissions for a given model (the model being identified by its ContentType).
Next, we can implement our policies in the admin, again through a generic model:
moderated.admin.py
class ModeratedModelAdmin(admin.ModelAdmin):

    # Save our request object for later
    def __call__(self, request, url):
        self.request = request
        return super(ModeratedModelAdmin, self).__call__(request, url)

    # Adjust our 'is_approved' widget based on the parent permissions
    def formfield_for_dbfield(self, db_field, **kwargs):
        if db_field.name == 'is_approved':
            # The parent rules only depend on the ContentType, so an unsaved
            # instance is enough to look them up
            parent = self.model().get_parent_instance()
            if self.request.user not in parent.can_approve.all():
                kwargs['widget'] = forms.CheckboxInput(attrs={'disabled': 'disabled'})
        return super(ModeratedModelAdmin, self).formfield_for_dbfield(db_field, **kwargs)

    # Enforce our "unapproved" policy on saves
    def save_model(self, request, obj, form, change):
        if request.user not in obj.get_parent_instance().can_approve.all():
            obj.is_approved = False
        return super(ModeratedModelAdmin, self).save_model(request, obj, form, change)
Once these are set up and working, we can re-use them across many models – I've found that once you add structured permissions for something like this, you soon want them for many other things.
Say, for instance, you have a news model; you would simply make it inherit from the model we just made and you're good.
# in your app's models.py
class NewsItem(ModeratedModel):
    title = models.CharField(max_length=200)
    text = models.TextField()


# in your app's admin.py
class NewsItemAdmin(ModeratedModelAdmin):
    pass

admin.site.register(NewsItem, NewsItemAdmin)
I'm sure I made some code errors and mistakes in there, but hopefully this can give you some ideas to act as a launching pad for whatever you decide to implement.
The last thing you have to do, which I'll leave up to you, is to implement filtering for the is_approved items (i.e. you don't want unapproved items being listed in the news section, right?). A possible sketch follows.
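One way to sketch that last piece (assuming the ModeratedModel base class above) is a second manager that public views use, while the admin keeps the default manager:

class ApprovedManager(models.Manager):
    def get_queryset(self):
        return super(ApprovedManager, self).get_queryset().filter(is_approved=True)


class NewsItem(ModeratedModel):
    title = models.CharField(max_length=200)
    text = models.TextField()

    objects = models.Manager()    # default: everything, e.g. for the admin
    approved = ApprovedManager()  # public listings: NewsItem.approved.all()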
The problem with the approach outlined by @diegueus9 is that the ModelAdmin acts like a singleton and is not instantiated for each request. This means that each request is modifying the same ModelAdmin object that is being accessed by other requests, which isn't ideal. Below is the solution proposed by @diegueus9:
# For example, get_form() modifies the single PostAdmin's fields on each request
...
class PostAdmin(ModelAdmin):
    def get_form(self, request, obj=None, **kwargs):
        if not has_approval_permission(request, obj):
            self.fields = [...]  # list of fields to show if the user can't approve the post
        else:
            self.fields = ['approved', ...]  # add 'approved' to the list of fields if the user can approve the post
...
An alternative approach would be to pass fields as a keyword arg to the parent's get_form() method like so:
...
from django.contrib.admin.util import flatten_fieldsets


class PostAdmin(ModelAdmin):
    def get_form(self, request, obj=None, **kwargs):
        if has_approval_permission(request, obj):
            fields = ['approved']
            if self.declared_fieldsets:
                fields += flatten_fieldsets(self.declared_fieldsets)
            # Update the keyword args as needed to allow the parent to build
            # and return the ModelForm instance you require for the user given their perms
            kwargs.update({'fields': fields})
        return super(PostAdmin, self).get_form(request, obj=obj, **kwargs)
...
This way, you are not modifying the PostAdmin singleton on every request; you are simply passing the appropriate keyword args needed to build and return the ModelForm from the parent.
It is probably worth looking at the get_form() method on the base ModelAdmin for more info: https://code.djangoproject.com/browser/django/trunk/django/contrib/admin/options.py#L431