How to access mod_ssl environment variables from django using mod_wsgi? - django

I'm trying to get the client's certificate and sign an xml file using it.
I have added the following to my virtual hosts:
SSLVerifyClient optional
SSLVerifyDepth 1
SSLOptions +stdEnvVars
This should allow mod_ssl to get the user's certificate. But I don't know how to pass it along to my django app. Any help is appreciated. Thanks.

You should use
SSLOptions +StdEnvVars
SSLOptions +ExportCertData
in apache config to have SSL_CLIENT_CERT in the environment.
With flask, it will be in request.environ['SSL_CLIENT_CERT']
Based on the discussion in the other answer, it might be request.META['SSL_CLIENT_CERT'] for django.

Those Apache configuration directives make mod_ssl add its variables to the per-request WSGI environ that mod_wsgi hands to Django (they are not copied into the process-wide os.environ). You can therefore read them from request.META in your Django view:
client_cert = request.META['SSL_CLIENT_CERT']
The SSL_CLIENT_CERT variable contains the PEM-encoded client certificate.
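If you need to inspect that certificate (subject, issuer, expiry), it can be parsed inside the view. A minimal sketch, assuming the third-party cryptography package (3.1+, where the backend argument is optional):
# Sketch: parse the PEM client certificate exposed by the configuration above.
from cryptography import x509

def client_subject(request):
    pem = request.META.get('SSL_CLIENT_CERT')
    if not pem:
        return None  # no client certificate was presented
    cert = x509.load_pem_x509_certificate(pem.encode('ascii'))
    return cert.subject.rfc4514_string()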

SSLOptions +StdEnvVars +ExportCertData
SSL_CLIENT_CERT will contain the PEM encoded certificate.
SSL_CLIENT_CERT_CHAIN_n (where n is a number) and SSL_SERVER_CERT are also included, but probably uninteresting.
It's a pity that you can't configure exactly which items get added to the environment. It would be much leaner to export only what's needed (for me, the common name and whether verification succeeded - though that may be implied when verification is required; for you, the client cert PEM).
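For the common-name-plus-verification case mentioned above, a sketch of a Django view using the standard per-request variables (SSL_CLIENT_VERIFY and SSL_CLIENT_S_DN_CN are exported by +StdEnvVars):
# Sketch: check that client verification succeeded and read the certificate's common name.
from django.http import HttpResponse, HttpResponseForbidden

def whoami(request):
    if request.META.get('SSL_CLIENT_VERIFY') != 'SUCCESS':
        return HttpResponseForbidden('A verified client certificate is required')
    common_name = request.META.get('SSL_CLIENT_S_DN_CN', 'unknown')
    return HttpResponse('Hello, %s' % common_name)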

Related

Incorrect validation on ssl

I was trying to set up SSL using certbot. My webserver is nginx. When I run the command "sudo ./certbot-auto certonly" I enter my domain, which I purchased from netfirms. The domain is pointed to my Amazon EC2 instance (public IP). I get this error: "Type: unauthorized Detail: Incorrect validation certificate for TLS-SNI-01 challenge." Why is this happening?
I'm assuming it's the apache plugin that you are using.
The way the apache plugin works is that it adds a temporary <VirtualHost> with a "fake" certificate and SNI hostname that solves the TLS-SNI-01 challenge. Since this server has multiple IP addresses, I'm not certain if the apache plugin is capable of determining the correct IP address to listen on for this temporary <VirtualHost>. I haven't seen any success stories that explicitly mention this scenario, at least.
Your best bet might be to switch to the webroot plugin, which works by writing files to your existing DocumentRoot. If you'd like to continue using the automatic apache configuration while using the webroot authenticator, try something like this:
./certbot-auto --authenticator webroot --installer apache -w /var/www/html -d example.com
I had a similar problem - only when trying to update an existing key.
What I noticed was that the validation error said that it found a certificate that had all the other domain names in it that I had already requested in the certificate before.
Why does the validator see the previous certificate?
From the logs, it seems to set up a new VirtualHost for each domain in the new cert in order to verify that the server is the one pointed to by the DNS. Validation requests to these mini VirtualHosts do not work correctly if the validator is seeing the existing cert with every domain in it - I thought "my VirtualHost setup is somehow causing a problem!"
I thought maybe because I have a wildcard in my virtualHost it is somehow getting picked up before the mini temporary VirtualHosts.
I had named my existing hosts with 3 digit numeric prefixes so that I could carefully order them given that Apache said it processes .conf files in alphabetical order. This would mean they would get processed BEFORE any other .conf files starting with a letter.
I renamed my .conf files by adding a 'c' prefix before the number, and now it appears as though it's working because it got past the verification phase at least - except now I have exceeded my 20 key requests for the week, so I can't complete the process just yet!! Doh!

Using lets encrypt without control over the root directory

I'm running a Django website and using lets encrypt for my SSL. Configuration of the framework is such that I can't allow access on: http://url.com/.xxxx
What I can allow free access to is:
http://url.com/static/.xxxx
My /static/ URL can accept and host any random files lets encrypt needs. Is there a way to have certbot support /static/ instead of just using / for the URL?
Thanks
EDIT
I've found a workaround that is acceptable for me. Digging further, I found that /.well-known/ is always the base directory for the challenge files. That means we can add a static directory which will work nicely with certbot. Here's how. First, add this to your Apache config:
Alias /.well-known/ /var/www/XXXXX/website/static/.well-known/
<Directory /var/www/XXXXX/website/static/.well-known/>
Require all granted
</Directory>
Then add this into your settings.py file:
STATIC_ENCRYPT_URL = '/.well-known/'
STATIC_ENCRYPT_ROOT = '/var/www/XXXXX/website/static/'
Add this into your urls.py:
from django.conf import settings
from django.conf.urls.static import static
urlpatterns = [
...
] + static(settings.STATIC_ENCRYPT_URL, document_root=settings.STATIC_ENCRYPT_ROOT)
Restart your webserver. Now you have a special URL /.well-known/ which will host any file certbot requires.
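To sanity-check the alias before running certbot, you can drop a test file into the static .well-known directory and fetch it over plain HTTP; a rough sketch (the paths and domain are the same placeholders used above):
# Rough check that files under /.well-known/ are reachable before invoking certbot.
import urllib.request

with open('/var/www/XXXXX/website/static/.well-known/ping.txt', 'w') as f:
    f.write('ok')

print(urllib.request.urlopen('http://url.com/.well-known/ping.txt').read())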
I'd still like a better answer than this.
In case other users come this way like I did from Google, here's how I improved this situation:
I was unsatisfied by my options when it came to creating ACME challenges for Let's Encrypt when running a Django application. So, I rolled my own solution and created a Django app! Basically, you can manage your ACME challenges as just another object, and the app will produce the proper end-point URL.
Simply pip install django-letsencrypt and follow the README to be on your way.

how to retrieve a ssl certificate in django?

Is it possible to retrieve the client's SSL certificate from the current connection in Django?
I don't see the certificate in the request context passed from lighttpd.
My setup has lighttpd and django working in fastcgi mode.
Currently, I am forced to manually connect back to the client's IP to verify the certificate.
Is there a clever technique to avoid this? Thanks!
Update:
I added these lines to my lighttpd.conf:
ssl.verifyclient.exportcert = "enable"
setenv.add-request-header = (
"SSL_CLIENT_CERT" => env.SSL_CLIENT_CERT
)
Unfortunately, the env.SSL_CLIENT_CERT fails to dereference (does not exist?) and lighttpd fails to start.
If I replace the "env.SSL_CLIENT_CERT" with a static value like "1", it is successfully passed to django in the request.META fields.
Anything else, I could try? This is lighttpd 1.4.29.
Yes. Though this question is not Django specific.
Usually web servers have an option to export SSL client-side certificate data as environment variables or HTTP headers. I have done this myself with Apache (not Lighttpd).
This is how I did it
On Apache, export SSL certificate data to environment variables
Then, add new HTTP request headers containing those environment variables (a sketch of this follows at the end of this answer)
Read headers in Python code
http://redmine.lighttpd.net/projects/1/wiki/Docs_SSL
Looks like the option name is ssl.verifyclient.exportcert.
Though I am not sure how to do step 2 with lighttpd, as I have little experience on it.
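For the Apache route, step 2 can be done with mod_headers, e.g. RequestHeader set X-SSL-Client-Cert "%{SSL_CLIENT_CERT}s" inside the SSL virtual host (the header name is an arbitrary choice, and the multi-line PEM may need re-encoding to survive as a header value). Step 3 then looks roughly like this in a Django view:
# Rough sketch for step 3: Django exposes incoming headers as HTTP_* keys in request.META.
def client_cert_from_header(request):
    pem = request.META.get('HTTP_X_SSL_CLIENT_CERT')
    if not pem:
        return None  # header absent: no client certificate was forwarded
    return pem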

Git with authentication but without ssh

I have the project to set up a git server for my school with a web interface to create repositories and display them. This web part will be handled by Django, which knows the users.
Now the problem: I want authentication for pulling and pushing private repositories, but I can't use SSH to handle that part (the IT guys don't want to do support on that). The HTTP protocol is read-only without "complex WebDAV" (according to the official doc) and uses .htaccess for authentication. The problem with .htaccess is managing it from Django: I tried to use a Django user's hash in it, but it didn't work. And, finally, the Git protocol is read-write but it lacks authentication.
Summing up:
I want authentication linked with Django's users database. (To avoid having multiple places with same data)
No SSH
Avoid WebDAV and .htaccess
With these constraints I found that rewriting the git daemon (code on github) to handle authentication would be an idea but I don't know for sure how a Git client would react to that.
If you guys have another idea or want to tell me how better it would be to use WebDAV/.htaccess/..., I will be glad to hear it !
You could set up an Apache server (even one with https, like I do in my config!), except you would ask your wsgi application to handle the authentication.
See "Access Control Mechanisms" in the mod_wsgi documentation:
AuthType Basic
AuthName "Top Secret"
AuthBasicProvider wsgi
WSGIAuthUserScript /usr/local/wsgi/scripts/auth.wsgi
Require valid-user
The auth.wsgi script can check the credentials against your Django users database.
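A minimal sketch of such a script, assuming Django 1.7+ and a placeholder settings module (for Basic auth, mod_wsgi calls the check_password() function defined in the script):
# auth.wsgi - sketch only; 'mysite.settings' is a placeholder for your real settings module.
import os
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'mysite.settings')

import django
django.setup()

from django.contrib.auth.models import User

def check_password(environ, user, password):
    # mod_wsgi expects True (ok), False (bad password) or None (unknown user).
    try:
        u = User.objects.get(username=user, is_active=True)
    except User.DoesNotExist:
        return None
    return u.check_password(password)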
That solution means calling the git-http-backend (smart HTTP transport), which is more efficient than WebDAV.
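For reference, the git-http-backend half of that setup is roughly the following, per the git-http-backend documentation (the CGI path varies by distribution, e.g. /usr/lib/git-core/ or /usr/libexec/git-core/, and /var/www/git is a placeholder):
SetEnv GIT_PROJECT_ROOT /var/www/git
SetEnv GIT_HTTP_EXPORT_ALL
ScriptAlias /git/ /usr/lib/git-core/git-http-backend/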

Django WSGI list enabled modules

I am using Django on Apache with mod_wsgi. I would like to use X-Sendfile for sending files, but with a fallback when X-Sendfile is not available.
Is there any way to list the loaded Apache modules, or to check whether X-Sendfile is enabled, directly from Django? I tried to dump the request variable, but there's no such information.
Add into Apache configuration:
<IfModule mod_xsendfile.c>
SetEnv apache.modules.mod_xsendfile On
</IfModule>
The request meta variables, i.e. the WSGI environ dictionary, would then have an entry for 'apache.modules.mod_xsendfile' with the value 'On'. You can check for the presence of that variable and modify behaviour accordingly.
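A hedged sketch of the corresponding Django view, falling back to serving the file through Django when the module is absent (path validation is omitted for brevity; FileResponse needs Django 1.8+):
# Sketch: let Apache serve the file via X-Sendfile when available, otherwise stream it from Django.
from django.http import HttpResponse, FileResponse

def download(request, path):
    if request.META.get('apache.modules.mod_xsendfile') == 'On':
        response = HttpResponse()
        response['X-Sendfile'] = path  # Apache intercepts this header and sends the file
        return response
    return FileResponse(open(path, 'rb'))  # fallback when mod_xsendfile is not loaded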