I've created a model, and I'm rendering the default/unmodified model form for it. This alone generates 64 SQL queries because it has quite a few foreign keys, and those in turn have more foreign keys.
Is it possible to force it to always (by default) perform a select_related every time one of these models is returned?
You can create a custom manager, and simply override get_queryset for it to apply everywhere. For example:
class MyManager(models.Manager):
    def get_queryset(self):
        return super(MyManager, self).get_queryset().select_related('foo', 'bar')
(Prior to Django 1.6, it was get_query_set).
Here's also a fun trick:
class DefaultSelectOrPrefetchManager(models.Manager):
    def __init__(self, *args, **kwargs):
        self._select_related = kwargs.pop('select_related', None)
        self._prefetch_related = kwargs.pop('prefetch_related', None)
        super(DefaultSelectOrPrefetchManager, self).__init__(*args, **kwargs)

    def get_queryset(self, *args, **kwargs):
        qs = super(DefaultSelectOrPrefetchManager, self).get_queryset(*args, **kwargs)

        if self._select_related:
            qs = qs.select_related(*self._select_related)
        if self._prefetch_related:
            qs = qs.prefetch_related(*self._prefetch_related)

        return qs


class Sandwich(models.Model):
    bread = models.ForeignKey(Bread)
    extras = models.ManyToManyField(Extra)
    # ...

    objects = DefaultSelectOrPrefetchManager(select_related=('bread',), prefetch_related=('extras',))
Then you can re-use the manager easily between model classes. As an example use case, this would be appropriate if you had a __unicode__ method on the model which rendered a string that included some information from a related model (or anything else that meant a related model was almost always required).
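For instance, with the Sandwich model above, a __unicode__ like this sketch (the bread.name attribute is an assumption) would otherwise cost one extra query per sandwich whenever it is rendered:

class Sandwich(models.Model):
    bread = models.ForeignKey(Bread)
    extras = models.ManyToManyField(Extra)

    objects = DefaultSelectOrPrefetchManager(select_related=('bread',), prefetch_related=('extras',))

    def __unicode__(self):
        # Without the default select_related('bread'), each rendered Sandwich
        # would trigger an extra query just to fetch its bread.
        return u'{} on {}'.format(self.pk, self.bread.name)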
...and if you really want to get wacky, here's a more generalized version. It allows you to call any sequence of methods on the default queryset with any combination of args or kwargs. There might be some errors in the code, but you get the idea.
from django.db import models


class MethodCalls(object):
    """
    A mock object which logs chained method calls.
    """
    def __init__(self):
        self._calls = []

    def __getattr__(self, name):
        c = Call(self, name)
        self._calls.append(c)
        return c

    def __iter__(self):
        for c in self._calls:
            yield tuple(c)


class Call(object):
    """
    Used by `MethodCalls` objects internally to represent chained method calls.
    """
    def __init__(self, calls_obj, method_name):
        self._calls = calls_obj
        self.method_name = method_name

    def __call__(self, *method_args, **method_kwargs):
        self.method_args = method_args
        self.method_kwargs = method_kwargs
        return self._calls

    def __iter__(self):
        yield self.method_name
        yield self.method_args
        yield self.method_kwargs


class DefaultQuerysetMethodCallsManager(models.Manager):
    """
    A model manager class which allows specification of a sequence of
    method calls to be applied by default to base querysets.

    `DefaultQuerysetMethodCallsManager` instances expose a property
    `default_queryset_method_calls` to which chained method calls can be
    applied to indicate which methods should be called on base querysets.
    """
    def __init__(self, *args, **kwargs):
        self.default_queryset_method_calls = MethodCalls()
        super(DefaultQuerysetMethodCallsManager, self).__init__(*args, **kwargs)

    def get_queryset(self, *args, **kwargs):
        qs = super(DefaultQuerysetMethodCallsManager, self).get_queryset(*args, **kwargs)
        for method_name, method_args, method_kwargs in self.default_queryset_method_calls:
            qs = getattr(qs, method_name)(*method_args, **method_kwargs)
        return qs


class Sandwich(models.Model):
    bread = models.ForeignKey(Bread)
    extras = models.ManyToManyField(Extra)
    # Other field definitions...

    objects = DefaultQuerysetMethodCallsManager()
    objects.default_queryset_method_calls.filter(
        bread__type='wheat',
    ).select_related(
        'bread',
    ).prefetch_related(
        'extras',
    )
The python-mock-inspired MethodCalls object is an attempt at making the API more natural. Some might find that a bit confusing. If so, you could sub out that code for an __init__ arg or kwarg that just accepts a tuple of method call information.
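For example, a plainer variant (a sketch, not the code above) could accept the call information as plain tuples in __init__:

class DefaultQuerysetMethodCallsManager(models.Manager):
    def __init__(self, queryset_method_calls=(), *args, **kwargs):
        # Each entry is a (method_name, args, kwargs) tuple applied to the base queryset.
        self._queryset_method_calls = queryset_method_calls
        super(DefaultQuerysetMethodCallsManager, self).__init__(*args, **kwargs)

    def get_queryset(self, *args, **kwargs):
        qs = super(DefaultQuerysetMethodCallsManager, self).get_queryset(*args, **kwargs)
        for method_name, method_args, method_kwargs in self._queryset_method_calls:
            qs = getattr(qs, method_name)(*method_args, **method_kwargs)
        return qs


class Sandwich(models.Model):
    bread = models.ForeignKey(Bread)
    extras = models.ManyToManyField(Extra)

    objects = DefaultQuerysetMethodCallsManager(queryset_method_calls=(
        ('filter', (), {'bread__type': 'wheat'}),
        ('select_related', ('bread',), {}),
        ('prefetch_related', ('extras',), {}),
    ))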
Create a custom models.Manager, override all the methods (filter, get, etc.) so they append select_related to every query, and then set this manager as the objects attribute on the model.
I would recommend just going through your code and adding the select_related where needed, because doing select_related on everything is going to cause some serious performance issues down the line (and it wouldn't be entirely clear where it's coming from).
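For example, spelling it out at the call site keeps the cost visible (field names borrowed from the sandwich example above):

# Only this query pays for the joins, and it is obvious where the extra SQL comes from.
sandwiches = Sandwich.objects.select_related('bread').prefetch_related('extras')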
I have a Django application that uses a JSON API as its data source.
Here's a simplified example of use in one of my views.py:
class GroupsList(LoginRequiredMixin):
    def get(self, request, **kwargs):
        # Get file list and totals
        try:
            group_list = group_adapter.list()  # makes an API call and ALSO populates a meta info class
        except APIAccessForbidden:
            return HttpResponseRedirect(reverse('logout'))

        return render(request, 'groups/index.html', {
            # can I make a mixin to add data here gained from the API call?
            'group_list': group_list,
        })
The group_adapter.list() call populates some meta information into another class that is not related to the group_list itself. I'd like to pass that data to the template. Ordinarily I'd use a context_processor, but when the context processor is called, the API call hasn't been made yet. I could manually check the information and add it to the render() call, but then I'd need to do that in dozens of different views.
Potential Solution #1: Create a Mixin For It
Can I use a mixin here that adds this information to context AFTER the view code runs but BEFORE render passes information to the template?
In other words is there a way to do this:
class GroupsList(LoginRequiredMixin, AddMetaInfoToContextMixin):
and then create a mixin something like this?
class AddMetaInfoToContextMixin(ContextMixin):
    def get_context_data(self, **kwargs):
        # self.request
        context = super().get_context_data(**kwargs)
        context['global_meta_information'] = get_global_meta_information()
        return context
Potential Solution #2: Make an Overridden TemplateView
Commenter Melvyn pointed out that I can potentially subclass TemplateView and override get_context_data(), so would something like this work?
class TemplateViewWithMeta(TemplateView):
    def get_context_data(self, *args, **kwargs):
        context = super(TemplateViewWithMeta, self).get_context_data(*args, **kwargs)
        context['global_meta_information'] = get_global_meta_information()
        return context

class GroupsList(LoginRequiredMixin, TemplateViewWithMeta):
    [...]
The typical workflow for a Django generic TemplateView is:
get()
get_context_data()
render_to_response()
So in your case keeping with the spirit of generic views, you could do it like this:
from django.views import generic


class BaseRemoteApiView(generic.TemplateView):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.group_list = None

    def get(self, request, *args, **kwargs):
        try:
            self.group_list = group_adapter.list()  # makes an API call and ALSO populates a meta info class
        except APIAccessForbidden:
            return HttpResponseRedirect(reverse('logout'))

        return super().get(request, *args, **kwargs)


class RemoteApiContextMixin(generic.base.ContextMixin):
    def get_context_data(self, **kwargs):
        context = super().get_context_data(**kwargs)
        context["group_list"] = self.group_list
        context["meta_information"] = get_global_meta_information()
        return context


class ConcreteRemoteApiView(RemoteApiContextMixin, BaseRemoteApiView):
    pass
Of course, you don't have to make 3 classes and can just combine the 3 into one - depends on how mixable you want to be.
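Combined into a single view it might look like this (a sketch under the same assumptions about group_adapter and get_global_meta_information):

class GroupsList(LoginRequiredMixin, generic.TemplateView):
    template_name = 'groups/index.html'

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.group_list = None

    def get(self, request, *args, **kwargs):
        try:
            self.group_list = group_adapter.list()
        except APIAccessForbidden:
            return HttpResponseRedirect(reverse('logout'))
        return super().get(request, *args, **kwargs)

    def get_context_data(self, **kwargs):
        context = super().get_context_data(**kwargs)
        context['group_list'] = self.group_list
        context['meta_information'] = get_global_meta_information()
        return context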
I stumbled upon some code that is used to provide extra arguments to the request method. The problem is that I'm not sure it is the cleanest way to handle this case.
def check_permissions(check_mixins):
    """
    :param check_mixins: is given to the inner decorator

    Decorator that will automatically populate some parameters when
    using dispatch() toward the right method (get(), post())
    """
    def _decorator(_dispatch):
        def wrapper(request, *args, **kwargs):
            # Is it a problem if "self" isn't passed in the method definition in here...
            for mixin in check_mixins:
                kwargs = mixin.check(request, *args, **kwargs)

                if isinstance(kwargs, HttpResponseRedirect):
                    return kwargs

            return _dispatch(request, *args, **kwargs)
        return wrapper
    return _decorator
class UserLoginMixin(object):
    def check(request, *args, **kwargs):
        # ... and here? It seems so ugly in my IDE
        user = request.user
        if user.is_authenticated() and not user.is_anonymous():
            kwargs['user'] = user
            return kwargs

        return redirect('user_login')
class AppoExistMixin(object):
    def check(request, *args, **kwargs):
        # Here too...
        appo_id = kwargs['appo_id']
        try:
            appoff = IdAppoff.objects.get(id=appo_id)
            kwargs['appoff'] = appoff
            del kwargs['appo_id']
            return kwargs
        except IdAppoff.DoesNotExist:
            pass

        messages.add_message(request, messages.ERROR,
                             "Item doesn't exist!")
        return redirect('home')
class SecurityMixin(View):
    """
    Mixin that dispatch() to the right method with augmented kwargs.
    kwargs are added if they match to specific treatment.
    """
    data = []

    def __init__(self, authenticators):
        super(SecurityMixin, self).__init__()
        # Clearing data in order to not add useless param to kwargs
        self.data.clear()
        # Build the list that contain each authenticator providing
        # context increase
        for auth in authenticators:
            self.data.append(auth)

    # Why data and not self.data? How is it possible?
    @method_decorator(check_permissions(data))
    def dispatch(self, request, *args, **kwargs):
        return super(SecurityMixin, self).dispatch(request, *args, **kwargs)
Each view then inherits from SecurityMixin and gets authenticators = [UserLoginMixin, ...] as a class attribute.
The problem I have sometimes (I can't reproduce the bug...) is that I get a KeyError on the augmented kwargs even though the URL definition is properly set, e.g.:
appo_id = kwargs['appo_id']
KeyError: 'appo_id'
I've been looking for hours and it seems that I will never have the solution... It's a bit frustrating.
If someone could help, it would be greatly appreciated.
I have a hunch that improper handling of class attributes is at fault.
CLASS VS INSTANCE
The class attribute data is shared by every instance and is cleared and refilled each time SecurityMixin.__init__ is called:
class A:
    data = []

    def __init__(self, *args):
        self.data.clear()  # self.data references the class attribute
        for x in args:
            self.data.append(x)

x = A('foo')
# A.data = ['foo']
# x.data = ['foo']

y = A('bar')
# A.data = ['bar']
# y.data = ['bar']
# x.data = ['bar'] !!
HOWEVER:
class A:
    data = ['I am empty']

    def __init__(self, *args):
        self.data = []  # redeclaring data as an instance attribute
        for x in args:
            self.data.append(x)

x = A('foo')
# A.data = ['I am empty']
# x.data = ['foo']

y = A('bar')
# A.data = ['I am empty']
# y.data = ['bar']
# x.data = ['foo']
This class attribute data is passed to the decorator (you cannot pass an instance attribute to a method decorator, i.e. self.data, because the instance does not yet exist during decorator declaration).
The wrapped function, however, does have access to the instance if it is passed in ('self' argument).
Django's method_decorator removes this self argument; it is used to transform a function decorator (which does not receive self implicitly) into a method decorator (which does). That's why you do not have to include self in the parameter list of the various mixin check methods: it has already been removed by method_decorator. To put it simply: use method_decorator to decorate a method with a function decorator. Read up on it in the Django docs on decorating class-based views.
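As a minimal illustration of that mechanism (not the question's code; login_required is just a stand-in function decorator):

from django.contrib.auth.decorators import login_required
from django.utils.decorators import method_decorator
from django.views.generic import View

class ProtectedView(View):
    # login_required is a plain function decorator; method_decorator adapts it
    # so the implicit 'self' of dispatch() doesn't end up where 'request' is expected.
    @method_decorator(login_required)
    def dispatch(self, request, *args, **kwargs):
        return super(ProtectedView, self).dispatch(request, *args, **kwargs)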
Knowing that, I am not really sure why check_permissions should be a function decorator as it is now when you only use it to decorate methods.
You could just decorate dispatch with check_permissions itself:
def check_permissions(_dispatch):
    def _decorator(self, request, *args, **kwargs):  # adding self
        for mixin in self.data:  # referencing the INSTANCE data
            kwargs = mixin.check(request, *args, **kwargs)

            if isinstance(kwargs, HttpResponseRedirect):
                return kwargs

        return _dispatch(self, request, *args, **kwargs)  # don't forget self here
    return _decorator


@check_permissions
def dispatch(self, request, *args, **kwargs):
    ...
Maybe some view is trying to check AppoExistMixin because it is in that view's data list, although it should not be, and the view's kwargs do not include 'appo_id'. You could also try being explicit by passing the wanted check mixins directly to the decorator: @method_decorator(check_permissions([UserLoginMixin, ...])). This way you don't have to mess with class vs. instance attributes.
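In that explicit variant, each view names its own check mixins right at the decorator, using the original check_permissions from the question (AppoView is just a hypothetical example view):

class AppoView(View):
    @method_decorator(check_permissions([UserLoginMixin, AppoExistMixin]))
    def dispatch(self, request, *args, **kwargs):
        return super(AppoView, self).dispatch(request, *args, **kwargs)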
Also... you should rename data to something that you are unlikely to overwrite with your own variable.
If you want to be super-lazy you could just do:
appo_id = kwargs.get('appo_id', False)
if not appo_id:
    return kwargs
But this would only fix that particular error in that one view. It's ignoring a symptom instead of curing the disease.
Some more explanation:
function vs. method: check_permissions is a function decorator, while dispatch() is a method. You cannot simply use a function decorator on a method: for one, the implicit self argument (the instance the method is bound to) gets passed along as well, even though the wrapper may not expect it.
That is where Django's method_decorator comes in: it removes and stores self within the decorator. Compare the two signatures: wrapper(request, *args, **kwargs) vs. _decorator(self, request, *args, **kwargs). In the former, method_decorator 'absorbed' self before the wrapped function is called.
Think of it as an adapter, a decorator for the decorator, that 'bridges the gap' between function and method. Use it if you don't want to/cannot alter the decorator.
In your case, however, you can change the decorator to make it work with a method - thus you don't need django's method_decorator.
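Putting the pieces together, each view could then simply declare its own list and let the rewritten decorator read it from the instance. A sketch (AppoView is a hypothetical example; as noted above, a less generic name than data would be wise):

class SecurityMixin(View):
    data = []  # overridden per view

    @check_permissions
    def dispatch(self, request, *args, **kwargs):
        return super(SecurityMixin, self).dispatch(request, *args, **kwargs)


class AppoView(SecurityMixin):
    data = [UserLoginMixin, AppoExistMixin]  # self.data in the decorator resolves to this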
The goal is to remove duplicates from a list field when saving the model. For example, creating an object in a migration:
def migrate_model(apps, *args):
    MyModel = apps.get_model('my_app.MyModel')

    m = MyModel.objects.create(
        array_field=['123', '123'],
    )
    m.array_field  # ['123']
I tried overriding save, but it doesn't work:
class MyModel(models.Model):
    array_field = ArrayField(models.CharField(max_length=5))

    def save(self, *args, **kwargs):
        if self.array_field:
            self.array_field = list(set(self.array_field))
        super(MyModel, self).save(*args, **kwargs)
How can I do this?
Careful: in a data migration, apps.get_model() returns a historical version of the model, and historical models do not include custom methods such as your overridden save() (see the Django docs on historical models).
That is most likely why your override never actually gets called there.
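Given that, one option is to deduplicate explicitly in the migration itself instead of relying on the override (a sketch; like your save(), set() does not preserve order):

def migrate_model(apps, *args):
    MyModel = apps.get_model('my_app.MyModel')

    m = MyModel.objects.create(
        # The historical model has no custom save(), so deduplicate here.
        array_field=list(set(['123', '123'])),
    )
    m.array_field  # ['123']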
I have been combing through the internet for quite a while without finding any solution to this problem.
What I am trying to do...
I have the following models:
class TrackingEventType(models.Model):
    required_previous_event = models.ForeignKey('self')

class TrackingEvent(models.Model):
    tracking = models.ForeignKey('Tracking')

class Tracking(models.Model):
    last_event = models.ForeignKey(TrackingEvent)
Now the main model is Tracking, so my admin for Tracking looks like this:
class TrackingEventInline(admin.TabularInline):
    model = TrackingEvent
    extra = 0

class TrackingAdmin(admin.ModelAdmin):
    inlines = [TrackingEventInline]
That's it for the current setup.
Now my quest:
In the TrackingAdmin, when I add new TrackingEvent inlines, I want to limit the choices of TrackingEventType to only those that are allowed to follow the last TrackingEvent of the Tracking (Tracking.last_event == TrackingEventType.required_previous_event).
For this, I would need to access the related Tracking from the TrackingEventInline, so I can read its last_event and filter the options for TrackingEventType accordingly.
So I found this: Accessing parent model instance from modelform of admin inline, but when I set up TrackingEventInline accordingly:
class MyFormSet(forms.BaseInlineFormSet):
    def _construct_form(self, i, **kwargs):
        kwargs['parent_object'] = self.instance
        print self.instance
        return super(MyFormSet, self)._construct_form(i, **kwargs)

class MyForm(forms.ModelForm):
    def __init__(self, *args, **kwargs):
        print kwargs
        self.parent_object = kwargs.pop('parent_object')
        super(MyForm, self).__init__(*args, **kwargs)

class TrackingEventInline(admin.TabularInline):
    form = MyForm
    formset = MyFormSet
    model = TrackingEvent
    extra = 0
I get a KeyError: 'parent_object' at /admin/.../tracking/2/change/, raised by self.parent_object = kwargs.pop('parent_object').
Does anyone know how to solve this? Am I approaching the problem the wrong way? I guess this would be pretty easy in a custom form on the frontend, but I really want to use the admin, because the whole application is built to be used from the admin, and it would be a lot of work to build a custom admin interface just because of this problem :)
OK, so posting on Stack Overflow always helps to get the problem straight. I was able to put together a solution that works for me.
It involves defining my own form in an outer function, as well as defining two inline admin classes for TrackingEvent (one for update/edit, one just for insert).
Here's the code:
def create_trackingevent_form(tracking):
    """
    Factory that builds a TrackingEventForm bound to the given `tracking`.
    """
    class TrackingEventForm(forms.ModelForm):
        """
        Form for Tracking Event Inline
        """
        def clean(self):
            """
            May not be needed anymore, since event type choices are limited when creating new event.
            """
            next_eventtype = self.cleaned_data['event_type']
            tracking = self.cleaned_data['tracking']
            # get last event; this also ensures last_event gets updated every time the change form for TrackingEvent is loaded
            last_eventtype = tracking.set_last_event()
            if last_eventtype:
                last_eventtype = last_eventtype.event_type
            pk = self.instance.pk
            insert = pk is None
            # check if the event is updated or newly created
            if insert:
                if next_eventtype.required_previous_event == last_eventtype:
                    pass
                else:
                    raise forms.ValidationError('"{}" requires "{}" as last event, "{}" found. Possible next events: {}'.format(
                        next_eventtype,
                        next_eventtype.required_previous_event,
                        last_eventtype,
                        '"%s" ' % ', '.join(map(str, [x.name for x in tracking.next_tracking_eventtype_options()]))
                    ))
            else:
                pass
            return self.cleaned_data

        def __init__(self, *args, **kwargs):
            # You can use the outer function's 'tracking' here
            self.parent_object = tracking
            super(TrackingEventForm, self).__init__(*args, **kwargs)
            self.fields['event_type'].queryset = tracking.next_tracking_eventtype_options()
            #self.fields['event_type'].limit_choices_to = tracking.next_tracking_eventtype_options()

    return TrackingEventForm


class TrackingEventInline(admin.TabularInline):
    #form = MyForm
    #formset = MyFormSet
    model = TrackingEvent
    extra = 0
    #readonly_fields = ['datetime', 'event_type', 'note']

    def has_add_permission(self, request):
        return False


class AddTrackingEventInline(admin.TabularInline):
    model = TrackingEvent
    extra = 0

    def has_change_permission(self, request, obj=None):
        return False

    def queryset(self, request):
        return super(AddTrackingEventInline, self).queryset(request).none()

    def get_formset(self, request, obj=None, **kwargs):
        if obj:
            self.form = create_trackingevent_form(obj)
        return super(AddTrackingEventInline, self).get_formset(request, obj, **kwargs)
I hope this helps other people with the same problem. Some credit to the Stack Overflow threads that helped me come up with this:
Prepopulating inlines based on the parent model in the Django Admin
Limit foreign key choices in select in an inline form in admin
https://docs.djangoproject.com/en/1.9/ref/models/instances/#django.db.models.Model.clean_fields
Please do not hesitate to ask questions if you have any.
It seems like if a ModelForm is given an instance, it ignores any values you provide for initial and instead sets it to the value of the instance -- even if that instance is an empty model record.
Is there any way to create a form with an instance and have it set initial data?
I need it because I'm saving related records and they don't appear to save correctly unless the ModelForm is given an instance when created.
I'm sure the answer to this is straightforward and I'm just missing something obvious.
Here is the relevant code:
in the view:
form = form_class(person=person, conference=conference, initial=initial, instance=registration)
where form_class is RegisterForm, and then in the registration form:
class RegisterForm(forms.ModelForm):
    ... fields here ...

    def __init__(self, *args, **kwargs):
        ... other code ...
        self.person = kwargs.pop('person')
        super(RegisterForm, self).__init__(*args, **kwargs)
        for key in self.fields.keys():
            if hasattr(self.person, key):
                self.fields[key].initial = getattr(self.person, key)
Then when I render the form, the related fields are empty.
Figured this out after a little bit of googling.
You have to set the initial value before calling super.
So instead of looping through self.fields.keys(), I had to type out the list of fields that I wanted and looped through that instead:
class RegisterForm(forms.ModelForm):
    ... fields here ...

    initial_fields = ['first_name', 'last_name', ... ]

    def __init__(self, *args, **kwargs):
        ... other code ...
        self.person = kwargs.pop('person')
        for key in self.initial_fields:
            if hasattr(self.person, key):
                self.fields[key].initial = getattr(self.person, key)
        super(RegisterForm, self).__init__(*args, **kwargs)
@Daria rightly points out that you don't have self.fields before calling super. I'm pretty sure this will work:
class RegisterForm(forms.ModelForm):
    ... fields here ...

    initial_fields = ['first_name', 'last_name', ... ]

    def __init__(self, *args, **kwargs):
        ... other code ...
        initial = kwargs.pop('initial', {})
        self.person = kwargs.pop('person')
        for key in self.initial_fields:
            if hasattr(self.person, key):
                initial[key] = initial.get(key) or getattr(self.person, key)
        kwargs['initial'] = initial
        super(RegisterForm, self).__init__(*args, **kwargs)
In this version, we use the initial argument to pass the values in. It's also written so that if we already have a value in initial for that field, we don't overwrite it.
Sounds to me like you may be looking for a bound form. Not entirely sure; I'm trying to unpick a similar issue:
Django forms can be instantiated with two arguments which control this kind of thing. As I understand it:
form = MyForm(initial={...}, data={...}, ...)
initial sets the starting (pre-filled) values for the fields, while data sets the actual (submitted) values and creates a bound form. Maybe that is what you want. Another, tangential point you might find interesting: consider a factory method rather than a constructor; I think the syntax is more natural:
class MyForm(forms.ModelForm):
    ...

    @staticmethod
    def makeBoundForm(user):
        myObjSet = MyObject.objects.filter(some_attr__user=user)
        if len(myObjSet) != 0:
            data = {'myObject': myObjSet[0]}
        else:
            raise ValueError()
        initial = {'myObject': myObjSet}
        return MyForm(initial=initial, data=data)
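Usage would then look something like this (a sketch; the view and template names are just placeholders):

def my_view(request):
    form = MyForm.makeBoundForm(request.user)
    return render(request, 'my_template.html', {'form': form})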
You can also pass extra variables to the class when initializing it. The values you pass can then override initial or POST data.
class RegisterForm(forms.ModelForm):
    ... fields here ...

    def __init__(self, person, conference, *args, **kwargs):
        ... other code ...
        super(RegisterForm, self).__init__(*args, **kwargs)
        self.fields['person'] = person
        self.fields['conference'] = conference

form = RegisterForm(person, conference, initial=initial, instance=registration)
Use ModelAdmin.get_changeform_initial_data. For example, to add initial data for the form field "report_datetime":
def get_changeform_initial_data(self, request):
    initial_data = super().get_changeform_initial_data(request)
    initial_data.update(report_datetime=<my_initial_datetime>)
    return initial_data
Works for 3.2+. I'm not sure about older versions.
See the Django docs.