I have a middleware where I want to log every request/response. How can I access POST data? - django

I have this middleware
import logging

request_logger = logging.getLogger('api.request.logger')


class LoggingMiddleware(object):
    def process_response(self, request, response):
        request_logger.log(logging.DEBUG,
                           "GET: {}. POST: {} response code: {}. response "
                           "content: {}".format(request.GET, request.DATA,
                                                response.status_code,
                                                response.content))
        return response
The problem is that the request in the process_response method has no .POST, .DATA, or .body. I am using django-rest-framework and my requests have Content-Type: application/json.
Note that if I put the logging in the process_request method, it has .body and everything I need. However, I need both the request and the response in a single log entry.

Here is the complete solution I made:
"""
Api middleware module
"""
import logging
request_logger = logging.getLogger('api.request.logger')
class LoggingMiddleware(object):
"""
Provides full logging of requests and responses
"""
_initial_http_body = None
def process_request(self, request):
self._initial_http_body = request.body # this requires because for some reasons there is no way to access request.body in the 'process_response' method.
def process_response(self, request, response):
"""
Adding request and response logging
"""
if request.path.startswith('/api/') and \
(request.method == "POST" and
request.META.get('CONTENT_TYPE') == 'application/json'
or request.method == "GET"):
request_logger.log(logging.DEBUG,
"GET: {}. body: {} response code: {}. "
"response "
"content: {}"
.format(request.GET, self._initial_http_body,
response.status_code,
response.content), extra={
'tags': {
'url': request.build_absolute_uri()
}
})
return response
Note that this:
'tags': {
    'url': request.build_absolute_uri()
}
will allow you to filter by URL in Sentry.

Andrey's solution will break on concurrent requests. You'd need to store the body somewhere in the request scope and fetch it in the process_response().
class RequestLoggerMiddleware(object):

    def process_request(self, request):
        request._body_to_log = request.body

    def process_response(self, request, response):
        if not hasattr(request, '_body_to_log'):
            return response

        msg = "method=%s path=%s status=%s request.body=%s response.body=%s"
        args = (request.method,
                request.path,
                response.status_code,
                request._body_to_log,
                response.content)
        request_logger.info(msg, *args)

        return response
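This is written for the old MIDDLEWARE_CLASSES style (process_request/process_response hooks). If you are on Django 1.10+ with the new MIDDLEWARE setting, roughly the same idea can be expressed as a callable middleware. The following is only a minimal sketch, assuming the same 'api.request.logger' logger as above:
import logging

request_logger = logging.getLogger('api.request.logger')


class RequestLoggerMiddleware:
    """New-style middleware: capture the body before the view consumes it."""

    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        # Read the body up front; once the view has parsed the request,
        # the raw stream may no longer be accessible.
        body_to_log = request.body
        response = self.get_response(request)
        request_logger.info(
            "method=%s path=%s status=%s request.body=%s response.body=%s",
            request.method, request.path, response.status_code,
            body_to_log, response.content)
        return response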

All the answers above have one potential problem: a large request.body passed to the server. In Django, request.body is a property (from the framework source):
@property
def body(self):
    if not hasattr(self, '_body'):
        if self._read_started:
            raise RawPostDataException("You cannot access body after reading from request's data stream")
        try:
            self._body = self.read()
        except IOError as e:
            six.reraise(UnreadablePostError, UnreadablePostError(*e.args), sys.exc_info()[2])
        self._stream = BytesIO(self._body)
    return self._body
The Django framework accesses body directly in only one case (from the framework source):
elif self.META.get('CONTENT_TYPE', '').startswith('application/x-www-form-urlencoded'):
As you can see, the body property reads the entire request into memory. As a result, your server can simply crash. Moreover, it becomes vulnerable to a DoS attack.
In this case I would suggest using another method of the HttpRequest class (from the framework source):
def readlines(self):
    return list(iter(self))
So, you no longer need to do this:
def process_request(self, request):
    request._body_to_log = request.body
You can simply do:
def process_response(self, request, response):
    msg = "method=%s path=%s status=%s request.body=%s response.body=%s"
    args = (request.method,
            request.path,
            response.status_code,
            request.readlines(),
            response.content)
    request_logger.info(msg, *args)
    return response
EDIT: this approach with request.readlines() has problems. Sometimes it does not log anything.
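If the worry is how much ends up in your logs rather than how the body is read, one alternative (just a sketch, not from the answers above) is to keep only a bounded prefix of the payload. Note that request.body itself still loads the full body into memory once; the cap only bounds what is stored and written to the log:
MAX_LOGGED_BODY = 4096  # hypothetical cap on how much of the payload to log


def process_request(self, request):
    # request.body still reads the whole payload into memory once,
    # but only the first MAX_LOGGED_BODY bytes are kept for the log entry.
    request._body_to_log = request.body[:MAX_LOGGED_BODY]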

It's frustrating and surprising that there is no easy-to-use request logging package in Django.
So I created one myself. Check it out: https://github.com/rhumbixsf/django-request-logging.git
It uses the logging system, so it is easy to configure. This is what you get with DEBUG level:
GET/POST request url
POST BODY if any
GET/POST request url - response code
Response body

It is like accessing the form data to create a new form.
You must use request.POST for this (perhaps request.FILES is something you'd log as well).
class LoggingMiddleware(object):
    def process_response(self, request, response):
        request_logger.log(logging.DEBUG,
                           "GET: {}. POST: {} response code: {}. response "
                           "content: {}".format(request.GET, request.POST,
                                                response.status_code,
                                                response.content))
        return response
See Here for request properties.
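One caveat worth noting for the original question, which sends Content-Type: application/json: request.POST is only populated for form-encoded payloads, so for JSON bodies it will be an empty QueryDict. Here is a sketch of capturing both cases in process_request, while the body is still readable; the _data_to_log attribute name is purely illustrative:
import json


class LoggingMiddleware(object):
    def process_request(self, request):
        if request.META.get('CONTENT_TYPE', '').startswith('application/json'):
            # request.POST stays empty for JSON payloads; parse the raw body instead.
            try:
                request._data_to_log = json.loads(request.body.decode('utf-8'))
            except ValueError:
                request._data_to_log = request.body
        else:
            request._data_to_log = request.POST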

You can use something like the below:
"""
Middleware to log requests and responses.
"""
import socket
import time
import json
import logging
request_logger = logging.getLogger(__name__)
class RequestLogMiddleware:
"""Request Logging Middleware."""
def __init__(self, get_response):
self.get_response = get_response
def __call__(self, request):
log_data = {}
# add request payload to log_data
req_body = json.loads(request.body.decode("utf-8")) if request.body else {}
log_data["request_body"] = req_body
# request passes on to controller
response = self.get_response(request)
# add response payload to our log_data
if response and response["content-type"] == "application/json":
response_body = json.loads(response.content.decode("utf-8"))
log_data["response_body"] = response_body
request_logger.info(msg=log_data)
return response
# Log unhandled exceptions as well
def process_exception(self, request, exception):
try:
raise exception
except Exception as e:
request_logger.exception("Unhandled Exception: " + str(e))
return exception
You can also check this out: log requests via middleware explains this in more detail.
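For completeness, a minimal sketch of how you might register this middleware and route its logger to the console; the dotted path 'yourapp.middleware.RequestLogMiddleware' is a placeholder for wherever you put the class:
# settings.py -- sketch; adjust the dotted paths to where the class actually lives
MIDDLEWARE = [
    # ... default middleware ...
    'yourapp.middleware.RequestLogMiddleware',
]

LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'handlers': {
        'console': {'class': 'logging.StreamHandler'},
    },
    'loggers': {
        # __name__ in the module above resolves to its dotted module path
        'yourapp.middleware': {
            'handlers': ['console'],
            'level': 'INFO',
        },
    },
}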

Also note that response.content returns a bytestring, not a unicode string, so if you need to print unicode you need to call response.content.decode("utf-8").

You cannot access request.POST (or, equivalently, request.body) in the process_response part of the middleware. Here is a ticket raising the issue, though you can access it in the process_request part. The previous answers give class-based middleware; Django 2.0+ and 3.0+ also allow function-based middleware.
from .models import RequestData  # Model that stores all the request data


def requestMiddleware(get_response):
    # One-time configuration and initialization.

    def middleware(request):
        # Code to be executed for each request before
        # the view (and later middleware) are called.
        try:
            metadata = request.META
        except Exception:
            metadata = 'no data'
        try:
            data = request.body
        except Exception:
            data = 'no data'
        try:
            u = str(request.user)
        except Exception:
            u = 'nouser'

        response = get_response(request)

        w = RequestData.objects.create(userdata=u, metadata=metadata, data=data)
        w.save()
        return response

    return middleware
The RequestData model looks as follows:
class RequestData(models.Model):
    time = models.DateTimeField(auto_now_add=True)
    userdata = models.CharField(max_length=10000, default=' ')
    data = models.CharField(max_length=20000, default=' ')
    metadata = models.CharField(max_length=20000, default=' ')
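To use this, register the function by its dotted path in settings, after the authentication middleware so that request.user is available. A sketch, where 'yourapp.middleware' is a placeholder for wherever the function lives:
# settings.py -- sketch; paths are placeholders
MIDDLEWARE = [
    'django.contrib.sessions.middleware.SessionMiddleware',
    'django.contrib.auth.middleware.AuthenticationMiddleware',
    # ... other middleware ...
    'yourapp.middleware.requestMiddleware',
]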

Related

DRF - POST Request Unit Test

I have a POST method under a ViewSet. I need to write a unit test case for the method. When I pass the param it gives None. How should I pass both the param and the data (payload)?
views.py
@action(detail=True, methods=['post'])
def complete_task(self, request, *args, **kwargs):
    """
    Method for complete the task
    input post request : task_id : str, variable_return:boolean, request data: dict
    output Response : gives whether task is completed or not
    """
    try:
        get_task_id = self.request.query_params.get("task_id")
        get_process_variables = request.data
        print(get_task_id)
        print(get_process_variables)
        complete_task = CamundaWriteMixins.complete_task(url=CAMUNDA_URL, task_id=get_task_id,
                                                         process_variable_data=get_process_variables)
        print("compl", complete_task)
        return Response({"task_status": str(complete_task)})
    except Exception as error:
        return Response(error)
test.py
def test_completed_task(self):
    self.client = Client()
    url = reverse('complete-task')
    data = {"variables": {
        "dept_status": {"value": "approved", "type": "String"}}
    }
    response = self.client.post(url, data=data, params={"task_id": "000c29840512"},
                                headers={'Content-Type': 'application/json'})
    print(response.data)
    self.assertTrue(response.data)
I have tried the above test case method, which is getting the request data, but the param I got is None.
Thanks in advance.
If you just modify your request a bit and add the query param as part of your URL, then I guess you are good to go.
Example:
response = self.client.post(f'{url}?task_id=000c29840512', data=data,
                            headers={'Content-Type': 'application/json'})
You can refer to the official documentation for examples: https://docs.djangoproject.com/en/4.0/topics/testing/tools/
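If you are testing a DRF view, it can also be easier to use rest_framework's APIClient, which serializes the payload as JSON for you. A sketch; the URL name is taken from the question and assumed to resolve:
from django.urls import reverse
from rest_framework.test import APIClient


def test_completed_task(self):
    client = APIClient()
    url = reverse('complete-task')
    data = {"variables": {"dept_status": {"value": "approved", "type": "String"}}}
    # Query params go in the URL; the body is sent as JSON.
    response = client.post(f'{url}?task_id=000c29840512', data, format='json')
    self.assertEqual(response.status_code, 200)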

Skip Django Rest Throttling Counter

How can I prevent Django REST throttling from counting the request when the user's request is invalid or the server fails to complete the process?
For example, I need params from the user, but when the user does not provide the params, Django REST throttling still counts the request.
Is there any way to skip the throttling counter when the request is not successful?
Example
class OncePerHourAnonThrottle(AnonRateThrottle):
    rate = "1/hour"


class Autoliker(APIView):
    throttle_classes = [OncePerHourAnonThrottle]

    def get(self, request):
        content = {"status": "get"}
        return Response(content)

    def post(self, request):
        post_url = request.POST.get("url", None)
        print(post_url)
        content = {"status": "post"}
        return Response(content)

    def throttled(self, request, wait):
        raise Throttled(
            detail={
                "message": "request limit exceeded",
                "availableIn": f"{wait} seconds",
                "throttleType": "type",
            }
        )
You can create a decorator to do so.
from functools import wraps

from rest_framework import exceptions
from rest_framework.throttling import AnonRateThrottle
from rest_framework.views import APIView


class OncePerHourAnonThrottle(AnonRateThrottle):
    rate = "1/hour"

    def allow_request(self, request, view):
        """
        This function is a copy of SimpleRateThrottle.allow_request.
        The only difference is, instead of executing self.throttle_success,
        it directly returns True and doesn't mark this request as a success yet.
        """
        if self.rate is None:
            return True

        self.key = self.get_cache_key(request, view)
        if self.key is None:
            return True

        self.history = self.cache.get(self.key, [])
        self.now = self.timer()

        # Drop any requests from the history which have now passed the
        # throttle duration
        while self.history and self.history[-1] <= self.now - self.duration:
            self.history.pop()
        if len(self.history) >= self.num_requests:
            return False
        return True


def rate_limiter(view_function):
    @wraps(view_function)
    def inner(view_obj, request, *args, **kwargs):
        throttle = OncePerHourAnonThrottle()
        allowed = throttle.allow_request(request, None)
        if not allowed:
            raise exceptions.Throttled(throttle.wait())

        try:
            response = view_function(view_obj, request, *args, **kwargs)
        except Exception as exc:
            response = view_obj.handle_exception(exc)

        if response.status_code == 200:
            # now if everything goes OK, count this request as a success
            throttle.throttle_success()

        return response
    return inner


class Autoliker(APIView):
    @rate_limiter
    def post(self, request):
        # view logic
        pass
This is the basic idea of how you can do it; you can now turn it into a generic decorator or even a class-based decorator.
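Building on the snippet above, here is a sketch of the decorator applied to the earlier Autoliker view. Returning a 400 for a missing url param is an assumption added purely to illustrate that unsuccessful requests never reach throttle_success() and so are not counted:
from rest_framework.response import Response
from rest_framework.views import APIView


class Autoliker(APIView):

    @rate_limiter
    def post(self, request):
        post_url = request.data.get("url")
        if not post_url:
            # A non-200 response never reaches throttle.throttle_success(),
            # so invalid requests do not consume the 1/hour quota.
            return Response({"error": "url is required"}, status=400)
        return Response({"status": "post"})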

Django REST Framework - Testing POST request

I'm currently struggling to make this unit test pass:
def test_markNotifications(self):
    request_url = f'Notifications/mark_notifications/'
    view = NotificationsViewSet.as_view(actions={'post': 'mark_notifications'})
    request = self.factory.post(request_url)
    request.POST = {'id_notifs': "1"}
    force_authenticate(request, user=self.user)
    response = view(request)
    self.assertEqual(response.status_code, 200)
Here's the associated view:
@action(detail=False, methods=['POST'])
def mark_notifications(self, request, pk=None):
    """
    Put Notifications as already read.
    """
    id_notifs = request.POST.get("id_notifs")
    if not id_notifs:
        return Response("Missing parameters.", status=400)
    id_notifs = str(id_notifs).split(",")
    print(id_notifs)
    for id in id_notifs:
        notif = Notification.objects.filter(pk=id).first()
        if not notif:
            return Response("No existent notification with the given id.", status=400)
        notif.isRead = True
        notif.save()
    return Response("Notifications have been marked as read.", status=200)
The problem is that even though I'm passing "id_notifs" through the request in test, I'm getting None when I do id_notifs = request.POST.get("id_notifs").
It seems that the id_notifs I'm passing in the POST request are neither in the body nor in the form data. In this context, I have no idea how to access them.
Looking forward to some help, thanks.
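One thing worth trying (a sketch, not a confirmed fix): let APIRequestFactory encode the payload itself, since it sends form data by default, which is exactly what request.POST parses; assigning to request.POST after the request object has been built is ignored once DRF re-parses the request:
def test_markNotifications(self):
    view = NotificationsViewSet.as_view(actions={'post': 'mark_notifications'})
    # Passing the dict to factory.post() puts it in the (form-encoded) body,
    # so the view's request.POST.get("id_notifs") can find it.
    request = self.factory.post('Notifications/mark_notifications/', {'id_notifs': '1'})
    force_authenticate(request, user=self.user)
    response = view(request)
    self.assertEqual(response.status_code, 200)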

how to access POST data inside tastypie custom Authentication

I'm trying to write custom authentication in Tastypie. Basically, I want to do the authentication using the POST parameters, and I don't want to use the Django auth at all, so my code looks something like:
class MyAuthentication(Authentication):
    def is_authenticated(self, request, **kwargs):
        if request.method == 'POST':
            token = request.POST['token']
            key = request.POST['key']
            return is_key_valid(token, key)
This is more or less the idea. The problem is that I keep getting the following error:
"error_message": "You cannot access body after reading from request's data stream"
I understand that this is related to the fact that I'm accessing the POST, but I could not figure out whether there is a way to solve it. Any ideas?
Thanks.
EDIT: Maybe I forgot to mention the most important thing. I'm handling form data using a trick I found on GitHub. My resource derives from a multipart resource:
class MultipartResource(object):
    def deserialize(self, request, data, format=None):
        if not format:
            format = request.META.get('CONTENT_TYPE', 'application/json')
        if format == 'application/x-www-form-urlencoded':
            return request.POST
        if format.startswith('multipart'):
            data = request.POST.copy()
            data.update(request.FILES)
            return data
        return super(MultipartResource, self).deserialize(request, data, format)
The problem is that the Content-Type in your request headers isn't correctly set. [Reference]
Tastypie only recognizes xml, json, yaml and bplist. So when sending the POST request, you need to set Content-Type in the request headers to one of them (e.g., application/json).
EDIT:
It seems like you are trying to send a multipart form with files through Tastypie.
A little background on Tastypie's file upload support, by Issac Kelly, for the roadmap 1.0 final (which hasn't been released yet):
Implement a Base64FileField which accepts base64 encoded files (like the one in issue #42) for PUT/POST, and provides the URL for GET requests. This will be part of the main tastypie repo.
We'd like to encourage other implementations to implement as independent projects. There's several ways to do this, and most of them are slightly finicky, and they all have different drawbacks, We'd like to have other options, and document the pros and cons of each
That means, for now at least, Tastypie does not officially support multipart file upload. However, there are forks in the wild that are supposedly working well; this is one of them. I haven't tested it, though.
Now let me try to explain why you are encountering that error.
In Tastypie resource.py, line 452:
def dispatch(self, request_type, request, **kwargs):
    """
    Handles the common operations (allowed HTTP method, authentication,
    throttling, method lookup) surrounding most CRUD interactions.
    """
    allowed_methods = getattr(self._meta, "%s_allowed_methods" % request_type, None)
    if 'HTTP_X_HTTP_METHOD_OVERRIDE' in request.META:
        request.method = request.META['HTTP_X_HTTP_METHOD_OVERRIDE']

    request_method = self.method_check(request, allowed=allowed_methods)
    method = getattr(self, "%s_%s" % (request_method, request_type), None)

    if method is None:
        raise ImmediateHttpResponse(response=http.HttpNotImplemented())

    self.is_authenticated(request)
    self.is_authorized(request)
    self.throttle_check(request)

    # All clear. Process the request.
    request = convert_post_to_put(request)
    response = method(request, **kwargs)

    # Add the throttled request.
    self.log_throttled_access(request)

    # If what comes back isn't a ``HttpResponse``, assume that the
    # request was accepted and that some action occurred. This also
    # prevents Django from freaking out.
    if not isinstance(response, HttpResponse):
        return http.HttpNoContent()

    return response
convert_post_to_put(request) is called from here. And here is the code for
convert_post_to_put:
# Based off of ``piston.utils.coerce_put_post``. Similarly BSD-licensed.
# And no, the irony is not lost on me.
def convert_post_to_VERB(request, verb):
    """
    Force Django to process the VERB.
    """
    if request.method == verb:
        if hasattr(request, '_post'):
            del(request._post)
            del(request._files)

        try:
            request.method = "POST"
            request._load_post_and_files()
            request.method = verb
        except AttributeError:
            request.META['REQUEST_METHOD'] = 'POST'
            request._load_post_and_files()
            request.META['REQUEST_METHOD'] = verb
        setattr(request, verb, request.POST)

    return request


def convert_post_to_put(request):
    return convert_post_to_VERB(request, verb='PUT')
And this method isn't really intended to handle multipart, as it has the side effect of preventing any further access to request.body, because the _load_post_and_files() method will set the _read_started flag to True:
Django request.body and _load_post_and_files():
@property
def body(self):
    if not hasattr(self, '_body'):
        if self._read_started:
            raise Exception("You cannot access body after reading from request's data stream")
        try:
            self._body = self.read()
        except IOError as e:
            six.reraise(UnreadablePostError, UnreadablePostError(*e.args), sys.exc_info()[2])
        self._stream = BytesIO(self._body)
    return self._body

def read(self, *args, **kwargs):
    self._read_started = True
    return self._stream.read(*args, **kwargs)

def _load_post_and_files(self):
    # Populates self._post and self._files
    if self.method != 'POST':
        self._post, self._files = QueryDict('', encoding=self._encoding), MultiValueDict()
        return
    if self._read_started and not hasattr(self, '_body'):
        self._mark_post_parse_error()
        return

    if self.META.get('CONTENT_TYPE', '').startswith('multipart'):
        if hasattr(self, '_body'):
            # Use already read data
            data = BytesIO(self._body)
        else:
            data = self
        try:
            self._post, self._files = self.parse_file_upload(self.META, data)
        except:
            # An error occured while parsing POST data. Since when
            # formatting the error the request handler might access
            # self.POST, set self._post and self._file to prevent
            # attempts to parse POST data again.
            # Mark that an error occured. This allows self.__repr__ to
            # be explicit about it instead of simply representing an
            # empty POST
            self._mark_post_parse_error()
            raise
    else:
        self._post, self._files = QueryDict(self.body, encoding=self._encoding), MultiValueDict()
So, you can (though probably shouldn't) monkey-patch Tastypie's convert_post_to_VERB() method by setting request._body via a call to request.body and then immediately setting _read_started = False, so that _load_post_and_files() will read from _body and won't set _read_started = True:
def convert_post_to_VERB(request, verb):
    """
    Force Django to process the VERB.
    """
    if request.method == verb:
        if hasattr(request, '_post'):
            del(request._post)
            del(request._files)

        request.body  # now request._body is set
        request._read_started = False  # so it won't cause side effects

        try:
            request.method = "POST"
            request._load_post_and_files()
            request.method = verb
        except AttributeError:
            request.META['REQUEST_METHOD'] = 'POST'
            request._load_post_and_files()
            request.META['REQUEST_METHOD'] = verb
        setattr(request, verb, request.POST)

    return request
You say you need custom auth, which is fine, but please consider using the Authorization header instead. By using POST you force Django to parse the entire payload, assuming the data is either urlencoded or multipart form encoded. This effectively makes it impossible to use non-form payloads such as JSON or YAML.
class MyAuthentication(Authentication):
    def is_authenticated(self, request, **kwargs):
        auth_info = request.META.get('HTTP_AUTHORIZATION')
        # ...
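For illustration, the client would then send the credentials in a header rather than in the form body. A sketch using the requests library; the endpoint and the token format shown are assumptions, so use whatever scheme your is_key_valid check expects:
import requests

# Hypothetical endpoint and credential scheme, for illustration only.
response = requests.post(
    "https://example.com/api/v1/resource/",
    json={"field": "value"},
    headers={"Authorization": "ApiKey token:key"},
)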
This error occurs when you access request.body (or request.raw_post_data if you're still on Django 1.3) a second time or, I believe, if you access it after having accessed the POST, GET, META or COOKIES attributes.
Tastypie will access the request.body (raw_post_data) attribute when processing PUT or PATCH requests.
With this in mind and without knowing more detail, I would:
Check if this only happens for POST/PUTs. If so, then you would have to do some overriding of some tastypie methods or abandon your approach for authentication.
Look for places in your code where you access request.body (raw_post_data)
Look for calls on 3rd party modules (perhaps a middleware) that might try to access body/raw_post_data
Hope this helps!
I've created a utility method that works well for me. Though I am not sure how this affects the underlying parts of Django, it works:
import io


def copy_body(request):
    data = getattr(request, '_body', request.body)
    request._body = data
    request._stream = io.BytesIO(data)
    request._files = None
    return data
I use it in a middleware to add a JSON attribute to request: https://gist.github.com/antonagestam/9add2d69783287025907
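As a rough sketch of the kind of middleware that link describes (not the gist's exact code): call copy_body() early, then parse the copy so later consumers can still read the stream. The request.json attribute name is just illustrative:
import json


class JSONBodyMiddleware(object):
    def process_request(self, request):
        # Copy the raw body and reset the stream so downstream code
        # (e.g. Tastypie) can still read it.
        data = copy_body(request)
        try:
            request.json = json.loads(data.decode('utf-8')) if data else None
        except ValueError:
            request.json = None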

Safari doesn't respect Cache-Control HTTP headers?

Using a variety of resources, I've come up with the following Django middleware to prevent browser caching for authenticated users:
import re


class NoBrowserCachingMiddleware:

    def add_to_header(self, response, key, value):
        if response.has_header(key):
            values = re.split(r'\s*,\s*', response[key])
            if not value in values:
                response[key] = ', '.join(values + [value])
        else:
            response[key] = value

    def process_response(self, request, response):
        if hasattr(request, 'user') and request.user.is_authenticated():
            response['Expires'] = 0
            self.add_to_header(response, 'Cache-Control', 'no-cache')
            self.add_to_header(response, 'Cache-Control', 'no-store')
            self.add_to_header(response, 'Cache-Control', 'must-revalidate')
            self.add_to_header(response, 'Pragma', 'no-cache')  # HTTP 1.0

        if request.is_ajax():
            return response
        if response.status_code != 200:
            return response
        if 'text/html' not in response['Content-Type']:
            return response

        # safari back button fix
        response.content = response.content.replace('<body', '<body onunload=""')
        return response
I would like to remove the piece where I have to modify the response content. If I do, however, Safari will display the previous cached page after a logout if the user hits the back button. Is there any way to prevent this using standard HTTP headers?
Thanks,
Pete