I am currently working on a web project in Django, and there is a requirement to ensure the safety of data transmitted over the network (passwords, usernames, etc.).
I've read on the OWASP Authentication Cheat Sheet that, for safety reasons, all passwords should be sent from the client to the server over the TLS protocol.
https://www.owasp.org/index.php/Authentication_Cheat_Sheet#Transmit_Passwords_Only_Over_TLS_or_Other_Strong_Transport
The Django framework sends these over plain HTTP. Is it possible to make Django send them over TLS, or to work around this in another way?
When you run a Django application on the Internet, the setup usually looks something like this:
[Django Application] <-> [uWSGI] <-> [nginx] <-> [web browser]
You can use different components, e.g. Gunicorn instead of uWSGI or Apache instead of nginx.
The thing is, you simply configure the web server (Apache or nginx or whatever) with an SSL certificate and have it listen for HTTPS instead of HTTP.
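For completeness, once TLS terminates at nginx/Apache there are a few Django settings that are commonly enabled so Django knows it is sitting behind HTTPS; a minimal sketch, to be adjusted to your own deployment (the proxy must be configured to set the X-Forwarded-Proto header for the first setting to be safe):

```python
# settings.py — sketch of the Django-side settings that usually accompany
# TLS termination at the web server.

# Trust the header the proxy sets for requests it received over HTTPS.
SECURE_PROXY_SSL_HEADER = ("HTTP_X_FORWARDED_PROTO", "https")

# Redirect any plain-HTTP request that still reaches Django to HTTPS.
SECURE_SSL_REDIRECT = True

# Only send the session and CSRF cookies over HTTPS connections.
SESSION_COOKIE_SECURE = True
CSRF_COOKIE_SECURE = True
```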
I think you're using Django's runserver command to serve your app over HTTP. It is absolutely not made for production; it is really an HTTP-only server for development.
To serve your app over SSL/TLS, you must use a frontend as described in henrikstroem's answer.
Related
Is it possible to deploy a Django project without using third-party tools like nginx or Apache just to serve up https:// webpages? Being forced to set up a reverse proxy or some other web server just to serve HTTPS seems a bit overkill.
Using the built-in development server (manage.py runserver) is a bad idea in a production environment. But yes, you can use an SSL connection even with the built-in server.
A better idea is to use an application server, for example Gunicorn. And yes again, you can serve SSL connections with Gunicorn.
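If you do want Gunicorn itself to terminate TLS, it accepts a certificate and key in its Python config file; a rough sketch, with placeholder paths and module name:

```python
# gunicorn.conf.py — sketch of serving TLS directly from Gunicorn
# (certificate paths and the project module are placeholders).
bind = "0.0.0.0:8443"   # use a non-privileged port unless running as root
workers = 3
certfile = "/etc/ssl/certs/example.crt"   # your certificate
keyfile = "/etc/ssl/private/example.key"  # your private key

# Run with: gunicorn -c gunicorn.conf.py myproject.wsgi:application
```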
Apache or Nginx servers are not just for HTTPS. They let you effectively control other server resources, like the maximum number of processes, request/response headers, etc. Web servers support many features that you can set without writing Python code, and that will be more understandable to infra/server engineers.
I have a Django application that I want to deploy using Daphne.
The Django application supports both websockets and HTTP requests. I've converted the project to support ASGI.
I'm starting the server using:
daphne <project_name>.asgi:application
The server is able to accept websocket connections but is unable to handle incoming HTTP requests (it throws 404).
Where am I going wrong over here?
P.S.: I'm not using django channels.
I had forgotten to instantiate get_asgi_application while creating the Django application; hence, it wasn't able to accept HTTP requests.
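For anyone hitting the same thing, here is a minimal sketch of an asgi.py that both instantiates the Django HTTP application and dispatches websocket scopes to a hand-rolled handler. The websocket echo part is purely illustrative (this question doesn't use Channels, so any non-HTTP handling has to be your own code):

```python
# <project_name>/asgi.py — minimal sketch, no Channels.
import os

from django.core.asgi import get_asgi_application

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "<project_name>.settings")

# Instantiating this is the step that was missing: it is what lets Daphne
# route plain HTTP requests into Django's view layer.
django_asgi_app = get_asgi_application()


async def websocket_echo(scope, receive, send):
    """Tiny illustrative websocket handler: accept, then echo text frames."""
    while True:
        event = await receive()
        if event["type"] == "websocket.connect":
            await send({"type": "websocket.accept"})
        elif event["type"] == "websocket.receive":
            await send({"type": "websocket.send", "text": event.get("text") or ""})
        elif event["type"] == "websocket.disconnect":
            break


async def application(scope, receive, send):
    # Dispatch on the ASGI scope type so HTTP and websocket traffic
    # each reach the right sub-application.
    if scope["type"] == "http":
        await django_asgi_app(scope, receive, send)
    elif scope["type"] == "websocket":
        await websocket_echo(scope, receive, send)
```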
I think I have some confusion in my understanding of websocket servers versus web servers.
So I followed the Django Channels tutorial, where I created a little app that listens on a channel and returns some response.
At the same time, I can still serve webpages with normal view functions, so how does Django do this magic so that it works without me modifying anything in the nginx server config?
The documentation mentions how this works:
It separates Django into two process types:
One that handles HTTP and WebSockets
One that runs views, websocket handlers and background tasks (consumers)
They communicate via a protocol called ASGI, which is similar to WSGI but runs over a network and allows for more protocol types. [...] probably Daphne
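In other words, the "magic" lives in the ASGI application itself: the protocol server (Daphne) receives everything nginx proxies to it and dispatches by protocol type, so nginx doesn't need to know the difference. A rough sketch of what that wiring typically looks like with Channels (project, app, consumer names and the ws/chat/ path are placeholders, not from your setup):

```python
# asgi.py with Channels — HTTP goes to Django's views, websockets to consumers.
import os

from django.core.asgi import get_asgi_application

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")

# Initialise Django first so the app registry is ready before consumers import models.
django_asgi_app = get_asgi_application()

from channels.auth import AuthMiddlewareStack
from channels.routing import ProtocolTypeRouter, URLRouter
from django.urls import path

from chat.consumers import ChatConsumer  # hypothetical consumer from the tutorial

application = ProtocolTypeRouter({
    # Normal views keep working: HTTP requests go straight to Django.
    "http": django_asgi_app,
    # Daphne hands websocket connections to the consumer layer instead.
    "websocket": AuthMiddlewareStack(
        URLRouter([
            path("ws/chat/", ChatConsumer.as_asgi()),
        ])
    ),
})
```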
I have followed this tutorial: http://www.stephendiehl.com/?p=309 describing how to run a gevent pywsgi server serving Django with socketio behind a nginx front-end.
As this tutorial says, nginx doesn't support websockets unless you use a TCP proxy module. This proxy module doesn't support using the same port for socketio and classic serving; from what I understood, the configuration looks like this:
nginx listen on port 80
nginx tcp proxy listen on port 7000
Everything is forwarded to port 8000
Problem: the resulting socketio request doesn't include the Django cookie containing the session ID, so I have no information about the requesting user in my Django view.
I guess it's caused by the fact that the request is made to another port (7000), causing the browser to identify the request as cross-domain?
What would be the cleanest way to include the Django cookie in the request?
Most answers to this question seem to indicate that the port doesn't matter.
I also checked, and supposedly the WebSocket handshake is regarded as HTTP, so HttpOnly cookies should still be sent.
SocketIO seems to be using a custom session manager to track users. Maybe try and link that up?
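If the cookie does reach the socketio handler (or once the ports are unified), "linking that up" amounts to loading the Django session from the sessionid value; a hedged sketch, assuming you can read the request cookies inside your socketio handler:

```python
# Recover the Django user from the session cookie inside a socketio handler.
from django.conf import settings
from django.contrib.auth import get_user_model
from django.contrib.sessions.backends.db import SessionStore


def user_from_session_cookie(cookies):
    """cookies: a dict of cookie name -> value taken from the socketio request."""
    session_key = cookies.get(settings.SESSION_COOKIE_NAME)  # usually 'sessionid'
    if not session_key:
        return None
    # Load the server-side session and pull the authenticated user's id from it.
    session = SessionStore(session_key=session_key)
    user_id = session.get("_auth_user_id")
    if user_id is None:
        return None
    User = get_user_model()
    try:
        return User.objects.get(pk=user_id)
    except User.DoesNotExist:
        return None
```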
Major web frameworks (such as Django, Pyramid, Rails, etc.) are often run as persistent servers, with a separate web server like nginx serving as a frontend. The web server connects to them via a protocol like FastCGI or SCGI:
browser --[http]--> nginx --[fastcgi]--> flup -> django
This seems convoluted to me; why is the request converted to an entirely different protocol, when the backend could just run its own HTTP server?
browser --[http]--> nginx --[http]--> wsgiref -> django
This approach appears to be both simpler and more flexible, since there's only one transport protocol and it's an RFC.
However, I don't think I've ever seen a web framework encourage the HTTP-only design, so I assume there must be a reason for it.
What are the advantages of using a protocol like FastCGI/SCGI here?
HTTP is a large, complex protocol. Paring the interface down to the capabilities provided by FastCGI or WSGI allows the framework to handle requests faster than if it had to deal with the original.
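For comparison, this is how small the pared-down interface is on the framework side; a minimal WSGI application, served here with the stdlib wsgiref server just for illustration:

```python
# A complete WSGI application: the framework only has to implement a single
# callable taking an environ dict and a start_response function, rather than
# parsing and speaking full HTTP itself.
def application(environ, start_response):
    body = b"Hello from behind the frontend"
    start_response("200 OK", [
        ("Content-Type", "text/plain"),
        ("Content-Length", str(len(body))),
    ])
    return [body]


if __name__ == "__main__":
    # Quick local test with the reference server from the standard library.
    from wsgiref.simple_server import make_server
    make_server("127.0.0.1", 8000, application).serve_forever()
```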