docker, nginx, django and how to serve static files

Goal: The set of docker containers for a production django website deployment.
My hang-up in this process is that nginx usually serves static files directly... Based on my understanding of a good Docker architecture, you would have a container for your WSGI server (probably gunicorn) and a separate nginx container with an upstream server configuration pointing to your gunicorn container. The nginx container can load balance between multiple gunicorn containers.
But this implies that I have to install my Django app's static files in the nginx container, which seems like bad practice since its primary goal is really load balancing.
Is it better to have three containers: nginx, gunicorn, and a dedicated static server (possibly nginx or lighttpd) for static files?

With reference to serving static files, your options depend on the functionality of your application. There's a very nifty tool called dj-static which will help you serve static files by adding very minimal code.
The documentation is fairly simple; all you have to do is follow the steps there.
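For reference, the core of the dj-static setup is wrapping your WSGI application. A minimal sketch based on the dj-static docs (it assumes STATIC_ROOT is already configured in settings.py):
# wsgi.py
from django.core.wsgi import get_wsgi_application
from dj_static import Cling

# Cling serves everything collected into STATIC_ROOT alongside the app,
# so no separate static server or nginx volume is needed
application = Cling(get_wsgi_application())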

I found this answer from Michael Hampton:
"This only works if the processes are in the same host, VM or container, because it tries to make a connection to the same machine. When they are in different containers, it does not work.
You need to alter your nginx configuration so that it uses the internal IP address of the uwsgi container." (Link in the original post.)
This is definitely something to keep in mind if Nginx will live in a different container. You also have to configure nginx.conf to expose your static files directory with an alias, which avoids a security issue.
I hope this code works for everybody; it took me a couple of hours to figure out how to compose Gunicorn, Docker, and Nginx:
# nginx.conf
upstream djangoA {
    server $DOCKER_CONTAINER_SERVICE:9000 max_fails=3 fail_timeout=0;
    # In my case it looks like: web:9000
}

server {
    include mime.types;
    # The port your site will be served on
    listen 80;
    # The domain name it will serve for
    server_name $YOUR_SERVER_NAME;  # substitute your machine's IP address or FQDN
    charset utf-8;
    # Max upload size
    client_max_body_size 512M;  # adjust to taste

    location /site_media {
        # Your Django project's media files have to be inside the nginx
        # container as well; you can share them with volumes.
        alias $DIRECTORY_STATIC_FILES/site_media;
        expires 30d;
    }

    location / {
        # Named locations use @, not # (a # would start a comment)
        try_files $uri @proxy_to_app;
    }

    # Finally, send all non-media requests to the Django server.
    location @proxy_to_app {
        proxy_set_header X-Real-IP $remote_addr;
        proxy_redirect off;
        proxy_set_header Host $host;
        proxy_pass http://djangoA;
    }
}
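Note that nginx does not expand environment variables in its configuration, so the $DOCKER_CONTAINER_SERVICE, $YOUR_SERVER_NAME and $DIRECTORY_STATIC_FILES placeholders above have to be filled in by hand. One way to template them, assuming you save the file as nginx.conf.template, is the envsubst tool from gettext; listing the variables explicitly keeps nginx's own $uri, $remote_addr and $host untouched:
envsubst '$DOCKER_CONTAINER_SERVICE $YOUR_SERVER_NAME $DIRECTORY_STATIC_FILES' < nginx.conf.template > /etc/nginx/conf.d/default.conf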
And for the docker-compose:
# production.yml
version: '2'

services:
  db:
    extends:
      file: base.yml
      service: db

  nginx:
    image: nginx:latest
    volumes:
      - ./nginx:/etc/nginx/conf.d/
      - ./$STATIC_FILE_ROOT/site_media:/$STATIC_FILE_ROOT/site_media
    ports:
      - "80:80"
    depends_on:
      - web

  web:
    extends:
      file: base.yml
      service: web
    build:
      args:
        - DJANGO_ENV=production
    # gunicorn must bind to 0.0.0.0:9000 (not its default of 127.0.0.1:8000)
    # so the nginx container can reach it as web:9000
    command: bash -c "python manage.py collectstatic --noinput && chmod -R 775 project/site_media/static && gunicorn project.wsgi:application --bind 0.0.0.0:9000"
    volumes:
      - ./$DIRECTORY_APP:/$DIRECTORY_APP
    ports:
      - "9000:9000"
    depends_on:
      - db

volumes:
  db_data:
    external: true
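With the file saved as production.yml (as the comment above suggests), one way to bring the stack up is:
docker-compose -f production.yml up --build -d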

If it's a Django app running in Docker and/or Kubernetes, take a look at WhiteNoise (http://whitenoise.evans.io/en/stable/), which lets the app serve its own static files and so solves the problem here.
This is a straightforward recommendation, but I spent much time searching around before I found it referenced.
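For context, the WhiteNoise setup is a couple of lines in settings.py. A minimal sketch, assuming a stock Django layout (check the WhiteNoise docs for the storage backend recommended for your Django version):
# settings.py
MIDDLEWARE = [
    "django.middleware.security.SecurityMiddleware",
    # WhiteNoise goes directly after SecurityMiddleware
    "whitenoise.middleware.WhiteNoiseMiddleware",
    # ... the rest of your middleware ...
]
# collectstatic output; WhiteNoise serves these with compression and caching headers
STATIC_ROOT = BASE_DIR / "staticfiles"
With this in place the gunicorn container serves /static/ itself, so nginx needs no static-file volume at all.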

Related

Frontend + NGINX can't make a request to backend endpoints (2 different docker containers)

I am having issues making requests to a backend Django container from a frontend app that is reverse proxied by NGINX.
I have a backend Django server which serves database information, carries out authentication, etc. It is containerised in a Docker container and served locally on http://127.0.0.1:8000/. I then have the NGINX project.conf as follows:
server {
    listen 80;
    server_name docker_flask_gunicorn_nginx;

    location / {
        proxy_pass http://my_app:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }

    location /static {
        rewrite ^/static(.*) /$1 break;
        root /static;
    }
}
There are a few different endpoints in the backend app, but it fails at the first hurdle, which is trying to authenticate at /api/token/. When the frontend app makes a request to http://127.0.0.1:8000/api/token/ the following error is returned:
HTTPConnectionPool(host='127.0.0.1', port=8000): Max retries exceeded with url: /api/token/ (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fc3208f6910>: Failed to establish a new connection: [Errno 111] Connection refused'))
For completeness, the docker-compose for the frontend / NGINX setup is:
version: '3'

services:
  my_app:
    container_name: my_app-frontend
    restart: always
    build: ./my_app
    ports:
      - "8080:8080"
    command: gunicorn -w 2 -b :8080 app:server
    env_file:
      - ./my_app/.env

  nginx:
    container_name: nginx
    restart: always
    build: ./nginx
    ports:
      - "80:80"
    depends_on:
      - my_app
From what I can see on the Django container, the request never reaches the backend, and when I run the frontend app without NGINX it works as expected. So I guess it is an issue with the NGINX setup. I did search through existing questions and some looked similar, but I tried the proposed solutions and could not get them to work. For example, I tried changing the API URL to point to the Docker bridge IP, but that didn't seem to work either. Apologies if this has been answered before, but any help is much appreciated!
Thanks!
The default IP address that gunicorn binds to is 127.0.0.1, which means it will only accept connections from inside the container.
Use
command: gunicorn -w 2 -b 0.0.0.0:8080 app:server
to make it accept connections from outside the container.
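If the bind address lives in the frontend image's Dockerfile rather than in the compose file, the same fix applies there; a sketch, assuming a Dockerfile that ends with a gunicorn CMD:
# Dockerfile (frontend image) — bind to all interfaces, not just loopback
CMD ["gunicorn", "-w", "2", "-b", "0.0.0.0:8080", "app:server"]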

Django returning "CSRF verification failed. Request aborted. " behind Nginx proxy locally

I'm running a simple Django application without any complicated setup (mostly the defaults, plus Django allauth & Django Rest Framework).
The infrastructure for running both locally and remotely is in a docker-compose file:
version: "3"

services:
  web:
    image: web_app
    build:
      context: .
      dockerfile: Dockerfile
    command: gunicorn my_service.wsgi --reload --bind 0.0.0.0:80 --workers 3
    env_file: .env
    volumes:
      - ./my_repo/:/app:z
    depends_on:
      - db
    environment:
      - DOCKER=1

  nginx:
    image: nginx_build
    build:
      context: nginx
      dockerfile: Dockerfile
    volumes:
      - ./my_repo/:/app:z
    ports:
      - "7000:80"

  ... # db and so on
As you can see, I'm using Gunicorn to serve the application and Nginx as a proxy (for static files and the Let's Encrypt setup). The Nginx container has some customizations:
FROM nginx:1.21-alpine
RUN rm /etc/nginx/conf.d/default.conf
COPY nginx.conf /etc/nginx/conf.d
And the nginx.conf file is a reverse proxy with a static mapping:
server {
    listen 80;

    location / {
        proxy_pass http://web;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header Host $host;
        proxy_redirect off;
    }

    location /static/ {
        alias /app/my_repo/static/;
    }
}
Running this on the server after setting up Let's Encrypt in the Nginx container works without any issue, but locally I get the "CSRF verification failed. Request aborted." error every time I submit a form (e.g. creating a dummy user in Django Admin). When I exposed the web port directly and used it to submit the forms, it worked.
Because of that, I deduce that something is missing in the Nginx config, or something needs to "tell" Django how to handle it. What am I missing, and how should I investigate this?
Since you're using a proxy that translates https requests into http, you need to configure Django to allow POST requests from a different scheme (since Django 4.0) by adding this to settings.py:
CSRF_TRUSTED_ORIGINS = ["https://yourdomain.com", "https://www.yourdomain.com"]
If this does not solve your problem, you can temporarily set DEBUG = True in production and try again. On the error page, you will see a "Reason given for failure" that you can post here.
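A related thing worth checking if the error persists: when Nginx terminates TLS and proxies plain http to Django, Django only knows the original request was secure if the proxy says so. The usual pairing (both lines are additions to the configs above, not part of them) is to forward the scheme in the nginx location block:
proxy_set_header X-Forwarded-Proto $scheme;
and tell Django to trust that header in settings.py:
SECURE_PROXY_SSL_HEADER = ("HTTP_X_FORWARDED_PROTO", "https")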

Bad Gateway when configuring nginx with Django app container and Gunicorn

I'm using docker-compose to deploy a Django app on a VM, with Nginx installed on the VM itself as the web server.
But I'm getting "502 Bad Gateway". I believe it's a network issue: I think Nginx can't access the Docker container. When I use the same configuration in an Nginx container it works perfectly with the Django app, but I need to use the Nginx installed on the VM, not the one in Docker.
This is my docker-compose file:
version: "3.2"

services:
  web:
    image: ngrorra/newsapp:1.0.2
    restart: always
    ports:
      - "8000:8000"
    volumes:
      - type: volume
        source: django-static
        target: /code/static
      - type: volume
        source: django-media
        target: /code/media
    environment:
      - "DEBUG_MODE=False"
      - "DB_HOST=…"
      - "DB_PORT=5432"
      - "DB_NAME=db_1"
      - "DB_USERNAME=username1111"

volumes:
  django-static:
  django-media:
And this is my nginx.conf file:
upstream web_app {
    server web:8000;
}

server {
    listen 80;

    location /static/ {
        autoindex on;
        alias /code/static/;
    }

    location /media/ {
        autoindex on;
        alias /code/media/;
    }

    location / {
        proxy_pass http://web_app;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header Host $host;
        proxy_redirect off;
    }

    # For favicon
    location /favicon.ico {
        alias /code/assets/favicon.ico;
    }

    # Error pages
    error_page 404 /404.html;
    location = /404.html {
        root /code/templates/;
    }
}
Does anyone know what is the issue?
Thank you!
As commented above, using "web" as the host name will not work, because the Nginx on the VM is not inside Docker's network where that name resolves. You could try localhost or the Docker bridge IP (you can get it using ifconfig in Ubuntu, for example).
For the network issue, you could create an external Docker network using docker network create and reference it in the networks definition inside your compose file (https://docs.docker.com/compose/networking/#use-a-pre-existing-network). Another possibility is to use the host network.
When I run Docker applications with Nginx, I usually first create an external Docker network with a defined IP (some Docker network IP, usually 172.x.x.x), then add an Nginx container to my docker-compose.yaml, and the server block in my nginx.conf looks something like this:
upstream web_app {
    server 172.x.x.x:8000;
}
...
It works without problems. Hope this can help you.
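For illustration, a sketch of that external-network approach; the network name, subnet and address here are assumptions, so pick your own:
# create the network once, with a subnet so addresses can be pinned
docker network create --subnet=172.25.0.0/16 webnet
Then pin the web service to an address in docker-compose.yml:
services:
  web:
    image: ngrorra/newsapp:1.0.2
    networks:
      webnet:
        ipv4_address: 172.25.0.10
networks:
  webnet:
    external: true
With that, the Nginx installed on the VM can point its upstream at 172.25.0.10:8000 and the address survives container restarts.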

How does nginx paired with docker treat referenced files?

I am running a dockerized Django app deployed on EC2. Nginx is also in a Docker container, configured to use SSL certificates from Let's Encrypt.
Let's Encrypt certificates are only valid for 90 days, which is why I set up a cronjob to renew them.
My question now is: will the nginx running in a Docker container automatically use the updated files? Or do I need to spin up my Docker container again and build it anew for the changes to take effect? In the latter case, is it possible to tell nginx to use the renewed files so I don't have to rebuild my container? I'm asking because I'd like to minimize downtime for my application.
For clarity I provide my config; the important parts are the referenced SSL certificates:
server {
    listen 443 ssl;
    server_name mydomain;
    charset utf-8;

    ssl_stapling off;
    ssl_stapling_verify off;
    ssl_certificate /etc/letsencrypt/live/mydomain/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/mydomain/privkey.pem;

    location / {
        proxy_pass http://django:5000;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
Here is my compose file:
production-nginx-container:
  container_name: 'production-nginx-container'
  image: nginx:latest
  ports:
    - "80:80"
    - "443:443"
  volumes:
    - /home/ubuntu/nginx-conf/myconf.conf:/etc/nginx/conf.d/default.conf
    - /etc/letsencrypt/live/mydomain/fullchain.pem:/etc/letsencrypt/live/mydomain/fullchain.pem
    - /etc/letsencrypt/live/mydomain/privkey.pem:/etc/letsencrypt/live/mydomain/privkey.pem
  depends_on:
    - django
I can only see two options: either nginx keeps this file open the whole time my Docker container is running, or it doesn't.
In case it keeps it open, I assume I need to restart the Docker container, which I do not want :).
I'd appreciate any input! Thanks in advance!
Nginx reads the certificates and configs it was given once, at startup. To make it re-read them you can restart the nginx container, or send a reload signal to nginx:
run nginx -s reload inside the container. Pair it with nginx -t beforehand to check that the config files' syntax is OK.
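From the host, using the container name from the compose file above, that looks like:
docker exec production-nginx-container nginx -t
docker exec production-nginx-container nginx -s reload
The reload can also be chained onto renewal so there is no manual step; this assumes your cronjob uses certbot's renew command, whose --deploy-hook only runs when a certificate was actually renewed:
certbot renew --deploy-hook "docker exec production-nginx-container nginx -s reload"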

Dockerized Nginx and Django, how to serve static files

I'm using Docker to containerize my Django environment, which looks like this (simplified a bit):
An Nginx (official image) Docker container
An Ubuntu Docker container with uwsgi and Django
The Nginx container is serving the uwsgi app just fine, but I have not found a way to serve static files. My current nginx config:
upstream proceed {
    server proceed:8000;
}

server {
    listen 80;
    server_name mydomain.com;

    location /static {
        alias /srv/www/proceed/static/; # What to do here?
    }

    location / {
        uwsgi_pass proceed;
        include uwsgi_params;
    }
}
Question: What's the best way to serve static files from another container? A solution not involving volumes is preferable.
As @larsks said, you need to share volumes between containers.
Your Django Dockerfile needs to contain a volume definition:
FROM ubuntu
....
# the path to your static files
VOLUME /srv/www/proceed/static
When you run the nginx container, you need to add a --volumes-from argument:
docker run --volumes-from django nginx
And then your nginx config will work fine.
Note! The path to the static content must be the same inside the django container and inside the nginx container!
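If you are on docker-compose rather than raw docker run, the same sharing is usually done with a named volume; a sketch, with the service names taken from the question and the volume name assumed:
version: '2'
services:
  proceed:
    build: .
    volumes:
      - static-data:/srv/www/proceed/static
  nginx:
    image: nginx:latest
    ports:
      - "80:80"
    volumes:
      # note: same path in both containers
      - static-data:/srv/www/proceed/static
    depends_on:
      - proceed
volumes:
  static-data: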