I am defining a function in my context_processor to pass a value from my settings to use in templates. So what is the right way, and more importantly what is the difference?
def baseurl(request):
    return {'BASE_URL': settings.BASE_URL}
or
def baseurl(context):
    return {'BASE_URL': settings.BASE_URL}
I have always used the first, but have run into a few examples of the second
From django/template/context.py:
class RequestContext(Context):
    """
    This subclass of template.Context automatically populates itself using
    the processors defined in TEMPLATE_CONTEXT_PROCESSORS.
    Additional processors can be specified as a list of callables
    using the "processors" keyword argument.
    """
    def __init__(self, request, dict_=None, processors=None, current_app=None,
                 use_l10n=None, use_tz=None):
        Context.__init__(self, dict_, current_app=current_app,
                         use_l10n=use_l10n, use_tz=use_tz)
        if processors is None:
            processors = ()
        else:
            processors = tuple(processors)
        for processor in get_standard_processors() + processors:
            self.update(processor(request))
The last two lines are the most important: each processor is called positionally, as processor(request), so the name you give the parameter is irrelevant; it always receives the current HttpRequest.
You are free to call this argument whatever you like, but request is the most common and the clearest choice. From the Django docs: https://docs.djangoproject.com/en/1.4/ref/templates/api/#writing-your-own-context-processors
A context processor has a very simple interface: It's just a Python function that takes one argument, an HttpRequest object, and returns a dictionary that gets added to the template context.
While there is nothing stopping you from naming this argument context, doing so would be misleading, since it is passed an HttpRequest object, not a template context.
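For completeness, here is a minimal sketch of a processor and its registration; the module path myapp.context_processors is hypothetical, and the setting name matches the TEMPLATE_CONTEXT_PROCESSORS attribute referenced in the source above:

# myapp/context_processors.py (hypothetical location)
from django.conf import settings

def baseurl(request):
    # The parameter name is irrelevant; it always receives an HttpRequest.
    return {'BASE_URL': settings.BASE_URL}

# settings.py
TEMPLATE_CONTEXT_PROCESSORS = (
    'django.contrib.auth.context_processors.auth',
    'myapp.context_processors.baseurl',
)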
This is mostly curiosity, but is the DB penalty for wrapping an entire view with @transaction.atomic a negligible one?
I'm thinking of views where the GET of a form or its re-display after a validation fail involves processing querysets. (ModelChoiceFields, for example, or fetching an object that the template displays.)
It seems to me to be far more natural to use with transaction.atomic() around the block of code which actually alters a bunch of related DB objects only after the user's inputs have validated.
Am I missing something?
From the source code:
def atomic(using=None, savepoint=True, durable=False):
    # Bare decorator: @atomic -- although the first argument is called
    # `using`, it's actually the function being decorated.
    if callable(using):
        return Atomic(DEFAULT_DB_ALIAS, savepoint, durable)(using)
    # Decorator: @atomic(...) or context manager: with atomic(...): ...
    else:
        return Atomic(using, savepoint, durable)
It's the same. In both cases the function returns an Atomic object, which handles whether the transaction should commit or roll back.
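To make the two entry points concrete, here is a sketch of both spellings; the view bodies are placeholders:

from django.db import transaction
from django.http import HttpResponse

@transaction.atomic             # bare decorator: atomic() receives the view itself
def whole_view(request):
    # everything here, including form re-display, runs in one transaction
    return HttpResponse('ok')

def narrower_view(request):
    # validation and queryset processing happen outside the transaction
    with transaction.atomic():  # atomic(...) returns an Atomic context manager
        # only the block that actually alters related DB objects is wrapped
        pass
    return HttpResponse('ok')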
Like the title states, I'm wondering if it's possible to use mixins in a function-based view. If so, how would I import it? The following code doesn't work:
def payment_method_view(request, MyMixin):
Thanks!
No.
First of all, it makes no sense. A mixin is something you mix into a class: it overrides or adds certain methods. But how would you do that with a function? A function can have attributes, but the idea is that a function typically does not do much with its attributes; it has no workflow in which it calls attached functions. If you could apply a mixin, your function would suddenly have, for example, a payment_method_view.get_queryset function. Although that could perhaps be useful, it is not how functions are designed to be used.
A function typically encodes a form of computation, it is typically not used as some sort of collection that stores named attributes that interact with each other. A class is typically used for that: it contains elements (class attributes and methods), and they can interact with each other.
But furthermore it would result in a lot of problems. A function has no built-in inheritance mechanism, so if you applied two mixins that both patched a function, there would be no method resolution order (MRO) to "guide" super() calls, etc. As a result, this would break very easily.
Syntactically it makes no sense either. You have simply defined a function that takes two parameters: request and MyMixin. The fact that MyMixin happens to be the name of a class is, as far as Python is concerned, a coincidence. Function parameters define a scope, so if you used MyMixin inside the function, you would be referring to whatever value was passed for that parameter, not to the class.
What you typically do to change a function is write a decorator. A decorator is a function that takes the function as input and alters it, or creates a new one; the decorated function is then used in its place. For example, we can make a decorator @login_required:
from functools import wraps

from django.http import HttpResponse

def login_required(f):
    @wraps(f)
    def g(request, *args, **kwargs):
        # Only call the wrapped view if the request's user is logged in.
        if request.user.is_authenticated:
            return f(request, *args, **kwargs)
        else:
            return HttpResponse('Unauthorized', status=401)
    return g
Here we have thus defined a function login_required that takes a function f as input, and we construct a new function g. This function takes the parameters request, *args and **kwargs. First g checks whether the user attached to the request is authenticated (has logged in); if so, it calls the given function f with request, *args, and **kwargs. If not, it returns a 401 response (unauthorized).
Now we can apply our decorator to a function:
@login_required
def payment_method_view(request):
    # ...
    pass
So now Python calls our login_required function with payment_method_view as its argument, and the result (the g function) takes the place of payment_method_view. We thus now require the user to be logged in before the original payment_method_view is called.
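If you do want mixin behaviour, the idiomatic route is a class-based view. For example, Django (1.9+) ships LoginRequiredMixin, which does roughly what the decorator above does; a sketch, with a hypothetical template name:

from django.contrib.auth.mixins import LoginRequiredMixin
from django.views.generic import TemplateView

class PaymentMethodView(LoginRequiredMixin, TemplateView):
    # The MRO ensures the mixin's dispatch() runs before TemplateView's.
    template_name = 'payment_method.html'  # hypothetical template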
I know I can use * to force all keyword arguments to a function/method to be "named".
If I have
def abc(a, *, x=10, z=30):
    pass
then the following all work
abc(5)
abc(8, x=12)
abc(9, z=31)
abc(x=17, a=4)
even if I change the function signature to def abc(a, *, x=10, y=20, z=30),
and
abc(7, 13)
throws an error.
This is extremely important because I can put each argument in its logical place, which helps maintenance over time, without being forced into a position dictated by history.
But * is not valid in Python 2.7, and def abc(a, *args, x=10, z=30) (which I tried) is a syntax error there as well.
Is there a way to force the use of x=12 in Python 2.7? Or, to put it another way: how do I make abc(7, 13) invalid in Python 2.7?
One way of doing this is to add a dummy keyword argument whose default is a sentinel that no caller could legitimately pass positionally (so don't check against None, which a caller might pass deliberately):
_dummy = object()

def abc(a, dummy_kw=_dummy, x=10, z=30):
    if dummy_kw is not _dummy:
        raise TypeError("abc() takes 1 positional argument but at least 2 were given")
That will prohibit abc(7, 13) while allowing all the other calls above. It works on both Python 2 and Python 3, which is useful for code that must run on both.
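A quick demonstration of the resulting behaviour:

abc(5)          # fine: a=5, x=10, z=30
abc(8, x=12)    # fine: keyword arguments still work as before
abc(7, 13)      # TypeError: 13 is captured by dummy_kw, so the sentinel check fires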
Originally I used:
def _dummy():
    pass
but as @mata pointed out, _dummy = object() works as well and is cleaner. Essentially, any unique sentinel object that is not used in any other way will work.
What about the following:
def abc(a, **kwargs):
    # Get arguments from kwargs, otherwise use the default values
    x = kwargs.pop('x', 10)
    z = kwargs.pop('z', 30)
    if kwargs:  # kwargs is not empty: extra parameters were passed
        print 'extra parameters passed'
This forces the use of keyword arguments while still providing default values.
pop removes the key from kwargs once you have consumed it.
This is potentially very useful: you can check whether the caller passed extra parameters that do not belong to the function and, in that case, raise an error (for example).
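A stricter variant of the same idea raises instead of printing, which mirrors what Python 3's keyword-only syntax would do (a sketch):

def abc(a, **kwargs):
    x = kwargs.pop('x', 10)
    z = kwargs.pop('z', 30)
    if kwargs:  # anything left over is an unrecognised keyword
        raise TypeError('abc() got unexpected keyword arguments %r' % kwargs.keys())
    return a + x + z

abc(5, x=12)  # returns 47
abc(7, 13)    # TypeError: abc() takes exactly 1 argument (2 given)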
I'm going through the Django tutorial and I was wondering what happens when you have two functions with the same name in views.py?
for example:
def results(request, poll_id):
    p = get_object_or_404(Poll, pk=poll_id)
    return render_to_response('polls/results.html', {'poll': p})

def results(request, poll_id):
    return HttpResponse("You're looking at the results of poll %s." % poll_id)
When I ran the code, the bottom function was the one that was called. How does this work?
In Python, methods and functions can take any number of arguments, which removes the need for different function "signatures" to support different kinds of arguments (the common use case for function overloading). See 4.7.3. Arbitrary Argument Lists in the Python documentation.
The reason the second function gets called is that you simply overwrite the first definition when you bind the same name again; the argument list makes no difference. For Python, it is the same as:
>>> x = 1
>>> x = 'Hello'
>>> print x
Hello
You just re-defined the same function, so Python uses the last definition.
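You can see the same rebinding with any function, because def is just an assignment of a function object to a name (Python 2 syntax, to match the tutorial):

def results(poll_id):
    return 'first definition'

def results(poll_id):  # rebinds the name; the first function object is discarded
    return 'second definition'

print results(1)  # prints: second definition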
If I'm not mistaken, you need to use classes if you want to extend or override the view method... or use an "if" statement :)
https://docs.djangoproject.com/en/dev/topics/class-based-views/
In your example, that's just Python's normal behaviour: the file is read from top to bottom, and each def rebinds the name, so the last definition wins.
In each view of my application I need to have the navigation menu prepared. So right now, in every view, I execute a complicated query and store the menu in a dictionary that is passed to the template. In the templates, the variable holding the data is wrapped in the "cache" tag, so even though the queries are quite costly, that doesn't bother me.
But I don't want to repeat myself in every view. I figured the best place to prepare the menu is my own context processor, and so I wrote one. But I noticed that even when I don't use the data from the context processor, the queries that prepare the menu are still executed. Is there a way to "lazy load" such data from a context processor, or do I have to use the "low level" cache API inside it? Or is there a better solution to my problem?
Django has a SimpleLazyObject. In Django 1.3, it is used by the auth context processor (source code). This makes user available in the template context for every request, but the user is only fetched when the template actually contains {{ user }}.
You should be able to do something similar in your context processor.
from django.utils.functional import SimpleLazyObject

def my_context_processor(request):
    def complicated_query():
        result = do_stuff()
        return result
    return {
        'result': SimpleLazyObject(complicated_query),
    }
If you pass a callable object into the template context, Django will call it when it is used in the template. This provides one simple way to get laziness: just pass in callables:
def my_context_processor(request):
    def complicated_query():
        result = do_stuff()
        return result
    return {'my_info': complicated_query}
The problem with this is that it does not memoize the call: if you use it multiple times in a template, complicated_query gets called multiple times.
The fix is to use something like SimpleLazyObject, as in the other answer, or something like functools.lru_cache:
from functools import lru_cache

def my_context_processor(request):
    @lru_cache()
    def complicated_query():
        result = do_stuff()
        return result
    return {'my_info': complicated_query}
You can now use my_info in your template, and it will be evaluated lazily, just once.
Or, if the function already exists, you would do it like this:
from functools import lru_cache

from somewhere import complicated_query

def my_context_processor(request):
    return {'my_info': lru_cache()(complicated_query)}
I would prefer this method over SimpleLazyObject, because the latter can sometimes produce strange bugs.
(I was the one who originally implemented LazyObject and SimpleLazyObject, and discovered for myself that there is a curse on any code artefact labelled "simple".)
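To see what the memoization buys you, here is a tiny standalone illustration; the counter exists only to show how often the function body runs:

from functools import lru_cache

calls = []

@lru_cache()
def complicated_query():
    calls.append(1)   # record each real execution
    return 'result'

complicated_query()
complicated_query()
print(len(calls))  # 1 -- the second call was served from the cache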