Logging web service requests in Django - django

I have created a REST web service using Django. This web service has a log file, and I'd like to log all web service (HTTP) requests in it. However, request handling is done by Django; I only set up the URL-to-handler mapping and create the request handlers (views, in Django nomenclature). Is there a way to log all requests at a central point, without logging each request in its associated request handler (view)?
Thanks in advance.

Yes, Django has a built-in signals framework.
It allows you to register a function to be called every time a request starts.
The documentation explains how to do it step by step.
Using the decorator method:
from django.core.signals import request_started
from django.dispatch import receiver

@receiver(request_started)
def my_callback(sender, **kwargs):
    # log the request here
    pass
Where should this code live? You can put signal handling and
registration code anywhere you like. However, you’ll need to make sure
that the module it’s in gets imported early on so that the signal
handling gets registered before any signals need to be sent. This
makes your app’s models.py a good place to put registration of signal
handlers.
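A minimal sketch of what the callback body might look like. This assumes Django 1.8+, where request_started passes the WSGI environ as a keyword argument; the logger name and the helper function are illustrative, not part of Django:

```python
import logging

logger = logging.getLogger("webservice.requests")  # arbitrary logger name

def format_request_line(environ):
    # build a one-line summary like "GET /api/items/" from the WSGI environ
    return "%s %s" % (environ.get("REQUEST_METHOD", "-"),
                      environ.get("PATH_INFO", "-"))

def log_request(sender, environ=None, **kwargs):
    # signature matches what request_started sends; environ may be absent
    # on older Django versions, so guard against None
    if environ is not None:
        logger.info(format_request_line(environ))

# Wiring it up (e.g. in your app's models.py, so it is imported early):
# from django.core.signals import request_started
# request_started.connect(log_request)
```

Point the "webservice.requests" logger at a file handler in your LOGGING setting and every request will end up in your log file.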

Related

Can a POST to Django REST Framework backend trigger an update of the frontend through Django Channels?

Background:
I am creating a chat app using Django Channels that is the pilot for a more complicated app down the road that will need real-time data updates. Right now, the backend for the chat app is set up with Channels and Django REST Framework. When the chat app is opened, a websocket is created and messages are sent through the websocket, as any other basic chat app does. Also, when a message is sent, the message is posted to the REST framework to store the message in the database.
Question:
This app that I will be creating has other resources that will be posting data to the Django REST Framework, and this data needs to be shown on the front end in real-time. My idea for this is when the REST framework gets a POST request from one of the resources, the data from the POST request gets sent as a message through the websocket so the data gets updated on the frontend. Is there a way to do this? I have been struggling finding resources on this.
Yes, you can, and it is explained in the docs.
You can interact with the consumer through the channel layer from anywhere in your code, including your DRF views, serializers, etc.
So, when you receive the message in your DRF view, you can do something like this:
from channels.layers import get_channel_layer
from asgiref.sync import async_to_sync
channel_layer = get_channel_layer()
async_to_sync(channel_layer.group_send)("chat", {"type": "send_message", "message": message})
Here "chat" is the group name, "send_message" is a handler in your consumer that delivers the message, and message is the payload you want to send; it could be a dict or some other serializable type.
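The key convention is that the "type" key of the event names the handler method on the consumer. To illustrate that dispatch with a runnable toy stand-in (this FakeChannelLayer is not the real channels API, just a demonstration of the routing convention):

```python
import asyncio

class FakeChannelLayer:
    # toy stand-in for a channel layer: delivers group_send events to every
    # consumer in the group by calling the method named in event["type"]
    def __init__(self):
        self.groups = {}

    def group_add(self, group, consumer):
        self.groups.setdefault(group, []).append(consumer)

    async def group_send(self, group, event):
        for consumer in self.groups.get(group, []):
            await getattr(consumer, event["type"])(event)

class ChatConsumer:
    # in real code this would subclass channels' AsyncWebsocketConsumer
    def __init__(self):
        self.sent = []

    async def send_message(self, event):
        # a real handler would do: await self.send(text_data=json.dumps(...))
        self.sent.append(event["message"])

layer = FakeChannelLayer()
consumer = ChatConsumer()
layer.group_add("chat", consumer)
asyncio.run(layer.group_send("chat",
                             {"type": "send_message", "message": "hello"}))
```

With the real channel layer, the group_send call from your DRF view reaches every connected consumer in the "chat" group the same way, and send_message pushes the data down the websocket.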

Django Signal that provides access to response of a request?

So far in our code we have implemented Django's built-in signals for handling user login, logout, and failed login, as well as DB actions with the pre_save, post_save, and post_delete signals. To stay consistent, I would like to use signals to perform some operations on the HTTP response after a request has been made. However, I do not see any built-in signal that grants access to the response object. I am aware of the request_finished signal, but it does not seem to provide access to the response. I have achieved what I need using Django's process_response hook in middleware, but I was wondering whether there is a way to stay consistent and use a signal here as well.
We are using Django 2.2 and DRF 3.8.2
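For reference, the middleware approach the question mentions can be sketched like this (new-style Django middleware as used in Django 2.2; the class and logger names are placeholders):

```python
import logging

logger = logging.getLogger("responses")  # arbitrary logger name

class ResponseLoggingMiddleware:
    # new-style Django middleware: every response passes through __call__
    # on its way out, so the response object is available here
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        response = self.get_response(request)
        # operate on the response here, e.g. log its status code
        logger.info("status=%s", getattr(response, "status_code", None))
        return response
```

Add the dotted path of this class to the MIDDLEWARE setting to activate it. There is indeed no built-in signal carrying the response, so middleware remains the idiomatic place for this.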

Log http requests in db table with django and tastypie without increase response time

I have an Android app to which my Django server sends push notifications using Firebase Cloud Messaging. As soon as the client receives a notification, it makes a GET request to my API, written using Tastypie.
I want to store this GET request in my Log table so that I can verify that my Android app is working correctly and makes the right requests on receiving notifications from the server.
In one of the answers, I read that this can be achieved by overriding the dispatch function of a ModelResource in Tastypie. Could this logging be achieved without overriding dispatch, since it can potentially increase the response time?

How to manage user in flask using session?

I'm creating a chat website using Flask-SocketIO. I keep track of who has joined the chat using the session: when a user logs in, I set session['name'] = username, and when the user leaves the chat, I set their session entry to None. But when a user closes the website without clicking the logout button, their session entry remains even though they have left, so their name stays on the list.
How can I manage users who leave by closing the website instead of logging out?
You might want to try using Javascript to detect a browser close event. It's not a perfect solution, but it will give you a chance in many cases to detect that the user is leaving, and fire off some code to end their session.
You can check out various approaches, but I saw this question which seemed pretty helpful.
As per this doc, http://flask.pocoo.org/docs/0.12/api/, flask.session has this attribute:
permanent If set to True the session lives for
permanent_session_lifetime seconds. The default is 31 days. If set to
False (which is the default) the session will be deleted when the user
closes the browser.
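So one option is simply to leave session.permanent at its default of False, so the cookie dies with the browser. A minimal sketch (route, secret key, and username are placeholders):

```python
from flask import Flask, session

app = Flask(__name__)
app.secret_key = "change-me"  # placeholder secret key

@app.route("/login")
def login():
    session.permanent = False  # the default: cookie is deleted when the
                               # user closes the browser
    session["name"] = "alice"  # placeholder username
    return "ok"
```

Note this only controls the cookie's lifetime on the client; any server-side list of logged-in users still needs its own cleanup, e.g. via the browser-close detection mentioned above.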
The Flask-SocketIO docs also mention this session behaviour:
The session context global behaves in a different way than in regular
requests. A copy of the user session at the time the SocketIO
connection is established is made available to handlers invoked in the
context of that connection. If a SocketIO handler modifies the
session, the modified session will be preserved for future SocketIO
handlers, but regular HTTP route handlers will not see these changes.
Effectively, when a SocketIO handler modifies the session, a “fork” of
the session is created exclusively for these handlers. The technical
reason for this limitation is that to save the user session a cookie
needs to be sent to the client, and that requires HTTP request and
response, which do not exist in a SocketIO connection. When using
server-side sessions such as those provided by the Flask-Session or
Flask-KVSession extensions, changes made to the session in HTTP route
handlers can be seen by SocketIO handlers, as long as the session is
not modified in the SocketIO handlers.

Properly Securing GAE Task Queue URLs (without using app.yaml)

I want to secure my Task Queue URLs against malicious access.
In the views that the Task Queue requests I've got:
if not users.is_current_user_admin():
    return HttpResponse(status=403)
But my Task Queues are receiving 403 errors! I was under the impression from this GAE documentation that the Task Queue user was guaranteed to be an admin. What gives?
NOTE: I'm using DjangoNonRel so I can't specify the admin only url access in my app.yaml, I have to do it programmatically in the views.
Tasks can bypass login: admin restrictions; however, users.is_current_user_admin() will still return False, as there is technically no current user.
Using Django-nonrel shouldn't stop you from protecting your tasks with app.yaml. Just add a protected handler above your Django catch-all:
handlers:
- url: /tasks/.+
script: main.py
login: admin
- url: .*
script: main.py
Any URLs that start with /tasks/ will be accessible to the task queue and inaccessible to non-admin visitors, without changing how anything routes.
Your handlers can look for a task queue HTTP header, such as X-AppEngine-QueueName.
From the official GAE docs:
Requests from the Task Queue service contain the following HTTP headers:
X-AppEngine-QueueName
X-AppEngine-TaskName
X-AppEngine-TaskRetryCount
X-AppEngine-TaskExecutionCount
X-AppEngine-TaskETA
These headers are set internally by Google App Engine. If your
request handler finds any of these headers, it can trust that the
request is a Task Queue request. If any of the above headers are
present in an external user request to your app, they are stripped.
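In a Django view, such a header surfaces in request.META with an HTTP_ prefix, so the check can be a one-liner (a sketch; the function and view names are illustrative):

```python
def is_task_queue_request(meta):
    # App Engine strips X-AppEngine-* headers from external requests, so
    # seeing one here means the request came from the Task Queue service.
    # meta is Django's request.META dict (header names get an HTTP_ prefix).
    return "HTTP_X_APPENGINE_QUEUENAME" in meta

# In a view:
# def process_task(request):
#     if not is_task_queue_request(request.META):
#         return HttpResponse(status=403)
#     ...  # handle the task
```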
You can accomplish this by doing two checks:
Check the remote address; it will be 0.1.0.1.
Check for the existence of the X-AppEngine-QueueName header (X-Appengine-Cron is the equivalent for cron requests).
This will secure your Task Queue URLs (as far as I know, this applies to push queues, since those are the ones that make HTTP requests to your handlers).
I wrote a decorator which does these checks for me. Hope this was helpful.
For more info, please refer to the docs.
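Such a decorator might look like the following. This is a hypothetical sketch, not GAE-provided code; either signal is accepted on its own, since App Engine strips these headers from external requests:

```python
from functools import wraps

def task_queue_only(view):
    # hypothetical decorator: let a Django-style view run only for requests
    # that carry the internal Task Queue header or come from App Engine's
    # internal address
    @wraps(view)
    def wrapper(request, *args, **kwargs):
        from_task_queue = (
            "HTTP_X_APPENGINE_QUEUENAME" in request.META
            or request.META.get("REMOTE_ADDR") == "0.1.0.1"
        )
        if not from_task_queue:
            # in a real Django view, return HttpResponse(status=403) instead
            raise PermissionError("not a Task Queue request")
        return view(request, *args, **kwargs)
    return wrapper
```

Applied as @task_queue_only above each task-handling view, it replaces the failing users.is_current_user_admin() check from the question.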