I'm running into an issue with one of my projects, which will run in Docker on my Ubuntu server with an NGINX Docker container acting as the reverse proxy for the Django project. The problem is that I already have other Django projects running on that server, so port 80 is already taken by an NGINX server block running on the host itself.
Is there a workaround that lets the Docker NGINX run alongside the host's NGINX, with my Docker image acting as an "add-on" site? The Django sites already hosted there are clients' websites, so I would prefer not to interfere with them if I don't have to.
My project needs HTTPS because it serves data to a React Native app running on Android API 28, which blocks cleartext (non-HTTPS) connections by default. If anyone else has run into an issue like this, I would greatly appreciate advice on how to tackle it.
I have tried running NGINX in Docker on port 81 instead of port 80, and that works perfectly, but I don't think there is a way to make a secure connection to port 81, is there?
Thanks in advance.
You can't just move away from the default HTTP(S) ports for public endpoints: users' browsers use 80 and 443 by default. If you change those, your users would have to connect to your.server.com:81 or something similar. Nobody would do that for a public server, but it can be an option for a private one.
I think a reasonable way out of this is to use the host's NGINX to proxy requests into the Docker NGINX (if it makes sense to keep the latter at all). You can terminate HTTPS on the host's NGINX and pass plain HTTP to the one in Docker.
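Something along these lines on the host's NGINX would be a starting point. It's only a sketch, assuming the Docker NGINX stays published on port 81 as in the question; app.example.com and the certificate paths are placeholders for your own values.

# Host-side server block (e.g. /etc/nginx/sites-available/myapp):
# terminates HTTPS and proxies plain HTTP to the Docker NGINX on port 81
server {
    listen 443 ssl;
    server_name app.example.com;

    ssl_certificate     /etc/letsencrypt/live/app.example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/app.example.com/privkey.pem;

    location / {
        proxy_pass http://127.0.0.1:81;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}

# optional: redirect plain HTTP for this hostname to HTTPS
server {
    listen 80;
    server_name app.example.com;
    return 301 https://$host$request_uri;
}

Because NGINX picks the server block by server_name, a block like this can coexist with the client sites already listening on ports 80 and 443.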
Another reasonable option is to use a separate server, so that everything works without any workarounds.
Related
I have Django hosted with Nginx on DigitalOcean. Now I want to install Plausible Analytics. How do I do this? How do I change the Nginx config to get to the Plausible dashboard with mydomain/plausible for example?
Set up Plausible by either running the software directly or in a Docker container; let's say it runs on port 8080.
Then in your nginx.conf you should have a server block for your domain.
Within that, add a location block for the path you want Plausible on, with a proxy_pass directive that forwards requests to localhost:8080 (see the sketch after these steps).
Monitor access.log and error.log to debug any issues that may happen
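As a rough illustration of the NGINX part, here's a sketch assuming mydomain.com, the /plausible path from the question, and Plausible listening on localhost:8080. Plausible itself may also need its base URL adjusted when served under a sub-path.

server {
    listen 80;
    server_name mydomain.com;

    # ... existing Django locations stay as they are ...

    location /plausible/ {
        # the trailing slash on the upstream URL strips the /plausible prefix
        proxy_pass http://127.0.0.1:8080/;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}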
I have a stack of Django + Gunicorn + NGINX running in Docker containers. It is accessible from outside by domain and port, e.g. web.example.com:1300. There is also Nginx Proxy Manager (NPM) running (it uses ports 80 and 443) and successfully managing some other resources (for example Nextcloud), but it doesn't proxy to my Django project on port 1300; it shows "502 Bad Gateway".
In the Proxy Hosts of NPM I've added config:
domain names: web.example.com
Forward Hostname / IP: nginx_docker_container_name (this way it works with other resources)
Forward Port: 1300
Other settings: tried multiple combinations without success (like with and without SSL certificates etc.)
Is it possible to proxy using NPM?
Sorry if I've left out some information; I'm not sure what else to include.
I managed to solve the problem myself.
So, the NGINX in the Docker container serves the site's static pages. Nginx Proxy Manager proxies HTTP to that NGINX and secures the communication (it also runs in a Docker container in my setup).
My mistake was that I hadn't connected those Docker containers with a shared virtual network.
Once I put them into one network, everything worked.
Then I unpublished the NGINX port (1300).
The NPM proxy settings are "standard", i.e. no "custom location" and nothing in the "Advanced" tab. "Forward Hostname / IP" is just the Docker container name and "Forward Port" is the port NGINX listens on inside the container (80 by default).
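In command form the fix looks roughly like this; it's only a sketch, with "npm" and "nginx_docker_container_name" standing in for whatever your containers are actually called.

# create a user-defined network and attach both containers to it
docker network create proxy-net
docker network connect proxy-net npm
docker network connect proxy-net nginx_docker_container_name

# NPM can then reach the other container by name on its internal port (80),
# so the published port 1300 is no longer needed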
With WhiteNoise, you don't need to configure NGINX for Django static files.
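For reference, the usual minimal setup is just a middleware entry and a static root; this is a sketch of the standard WhiteNoise configuration, not something specific to the project above.

# settings.py (after `pip install whitenoise`)
MIDDLEWARE = [
    "django.middleware.security.SecurityMiddleware",
    "whitenoise.middleware.WhiteNoiseMiddleware",  # serves files collected by collectstatic
    # ... the rest of your middleware ...
]

STATIC_ROOT = BASE_DIR / "staticfiles"  # populate with: python manage.py collectstatic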
I have a docker container running a frontend (react) on port 3000 and a backend (django) on port 8000. From inside the container I can run
wget localhost:8000/
and I get back what the server has to give me back. This also works if I forward port 8000 and I call wget from outside the container.
But what about the frontend? Since it resides in the same container as the backend, I assumed it shares the same localhost, so it should be able to retrieve the information from the backend using
wget localhost:8000/
But this is not what happens (I get ERR_CONNECTION_REFUSED)
Is it because when I run the frontend, the request comes actually from the browser on my local machine, which lives outside the container (and also outside the server)?
Or am I getting something wrong and wget localhost:8000/ should work also from my browser?
The frontend is running in your browser and therefore your thought that the request comes from your browser is the correct one. In this case you would have to expose a port for the Django backend so your browser can get to it from the "public" IP space.
Sounds like only port 8000 is mapped.
When you start the container, make sure to also add -p 3000:3000.
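Roughly, assuming a single image (here called my-image as a placeholder) runs both processes, publishing both ports looks like:

# publish the frontend (3000) and backend (8000) ports to the host
docker run -p 3000:3000 -p 8000:8000 my-image

The browser then reaches the backend at http://<host>:8000 rather than through the container's own localhost.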
I'm a complete newbie when it comes to networking. I have two PCs on my LAN both running Manjaro. My main aim is to test functionality on a Django server running on one PC, from the other. I am running the Django server on the PC with ip address 192.168.1.138 using the command
python manage.py runserver 192.168.1.138:8000
and in settings.py
ALLOWED_HOSTS = ['localhost', '192.168.1.138']
I can ping 192.168.1.138 from the client PC, and ping the client PC from the server PC. But if I enter the ip address/port into the browser, it fails with
took too long to respond
I don't know if this is a separate problem or a manifestation of the first, but when I run NitroShare, I am able to 'see' the PC running the Django server from the PC acting as the client, but if I try to transfer a file, it again times out. I am unable to see the client from the server in NitroShare.
Any suggestions or help gratefully received
Ensure you don't have a firewall running, or that it allows connections to port 8000 (example commands after this list). Manjaro's docs imply there might be no firewall by default, but in case there is, see https://wiki.manjaro.org/index.php?title=Firewalls
Set ALLOWED_HOSTS = ['*'] while testing; don't bother with limiting the hosts.
Run with python manage.py runserver 0:8000; the 0 is shorthand for 0.0.0.0, i.e. it makes the server listen on all network interfaces.
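If a firewall does turn out to be running, the allow rule is a single line; which command applies depends on whether your Manjaro install happens to use ufw or firewalld (neither may be present at all).

# ufw
sudo ufw allow 8000/tcp
# firewalld (add --permanent to keep the rule across reboots)
sudo firewall-cmd --add-port=8000/tcp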
First, I would scan the open ports of your "server" PC from the other PC; you can do that with tools like Nmap (example below). Make sure the port is actually open on the "server" PC itself; opening ports at your router only matters if you want access from outside your LAN. Another option could be launching the Django app in a Docker container. Here's the link to the official Docker image on Docker Hub:
https://hub.docker.com/_/django
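A minimal check from the client PC, using the server address from the question (adjust the IP to your own):

# reports whether TCP port 8000 on the server PC is open, closed, or filtered
nmap -p 8000 192.168.1.138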
I'd like to open up my django app to other machines in the office during development.
I understand that it's a bad idea to run the Django development server as root. The recommended way to serve a Django app on port 80, even during development, appears to be Django plus Gunicorn plus nginx. This seems super complicated to me. I got the first two pieces working, but am now staring at nginx in utter confusion. There's no Mac build on the site. Do I really have to build it from source?
One alternative I've come across is localtunnel. But this seems sketchy to me, and involves setting up public keys and whatnot. Is there any simpler way to serve a django app on a mac from port 80 without running it as root?
Also, just what are the risks of running a django development server on port 80 as root, vs not as root? What are the chances that someone could, say, gain total access to my file system? And, given the default user settings on a mac, is this more likely if I'm running my django dev server as root than if I'm running it as not-root?
Since you mentioned you don't want to run the Django server as root and you are on a mac, you could forward traffic from port 80 to port 8000:
sudo ipfw add 100 fwd 127.0.0.1,8000 tcp from any to any 80 in
and then run the Django server as a normal user (by default it serves on port 8000)
./manage.py runserver
To remove the port forwarding, run:
sudo ipfw flush