Dockerized Django REST Framework / PostgreSQL - DisallowedHost - django

When I try to execute the following command:
sudo docker-compose up
I get a DisallowedHost error in my terminal (posted as a screenshot in the original question).
Here is what my Dockerfile looks like:
FROM python:3.6
ENV PYTHONUNBUFFERED 1
RUN mkdir /agent-technologies
WORKDIR /agent-technologies
COPY . /agent-technologies
RUN pip install -r requirements.txt
EXPOSE 8000
Here is what my docker-compose.yml looks like:
version: '3'
services:
  db:
    image: postgres
    environment:
      - POSTGRES_USER=stefan_radonjic
      - POSTGRES_PASSWORD=cepajecar995
      - POSTGRES_DB=agent_technologies_db
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/agent-technologies
    ports:
      - "8000:8000"
    links:
      - db
    depends_on:
      - db
And finally, here are the settings for the PostgreSQL database I previously created:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'agent_technologies_db',
        'USER': 'stefan_radonjic',
        'PASSWORD': 'cepajecar995',
        'HOST': 'db',
        'PORT': '',
    }
}

The error is quite self-explanatory: you didn't add your IP to the ALLOWED_HOSTS setting. Try adding one of these to your settings file:
ALLOWED_HOSTS = ['*'] # wildcard, allows all
Or if you want to be explicit:
ALLOWED_HOSTS = ['0.0.0.0'] # explicit

Your Django app is trying to connect to the database at localhost, but since the app is inside its own container, localhost is not the right address to reach the database.
Docker Compose will resolve the name for you if you refer to the database by its service name; in your case that is "db".
As the text in the picture reads, you also have to add "0.0.0.0" to the ALLOWED_HOSTS list in your settings module.
Be sure that you've read the Django documentation carefully; it is quite thorough.
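Putting both points together, a minimal settings.py sketch (values copied from the question; treat it as illustrative, not a drop-in file):

ALLOWED_HOSTS = ['0.0.0.0']  # or ['*'] while debugging

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'agent_technologies_db',
        'USER': 'stefan_radonjic',
        'PASSWORD': 'cepajecar995',
        'HOST': 'db',  # the Compose service name, not localhost
        'PORT': '',    # empty string means the default Postgres port
    }
}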

Related

Cannot connect django to postgres running inside docker container

My Django app is not dockerized, but I run Postgres inside a Docker container using a docker-compose.yml file. After docker-compose up I can connect to the database with DBeaver, but not from the Django app. Every time I get this error:
django.db.utils.OperationalError: could not translate host name "db" to address:
Temporary failure in name resolution
File docker-compose.yml:
version: "3.9"
services:
  db:
    image: postgres:13
    volumes:
      - postgres_data:/var/lib/postgresql/data/
    environment:
      - "POSTGRES_HOST_AUTH_METHOD=trust"
      - POSTGRES_USER="postgres"
      - POSTGRES_PASSWORD="postgres"
      - POSTGRES_DB="postgres"
    ports:
      - 5432:5432
volumes:
  postgres_data:
Django project file config/settings.py:
...
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': env.str("DB_NAME"),
        'USER': env.str("DB_USER"),
        'PASSWORD': env.str("DB_PASS"),
        'HOST': env.str("DB_HOST"),
        'PORT': env.decimal("DB_PORT")
    }
}
It turned out to be some kind of pipenv issue. When I restarted the terminal, ran pipenv shell again, and then python manage.py runserver, everything worked just fine.

could not translate host name "db" to address: Unknown host

I'm attempting to follow the guide at https://docs.docker.com/compose/django/.
Whenever I attempt to run makemigrations, it gives me the "Unknown host" error from the title. I'm trying to use PostgreSQL with Django and Wagtail as its CMS.
My docker-compose.yml looks like:
version: "3.9"
services:
  db:
    image: postgres
    environment:
      - POSTGRES_DB=postgres
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code
    ports:
      - "8000:8000"
    depends_on:
      - db
and my settings in the settings.py file look like:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'postgres',
        'USER': 'postgres',
        'PASSWORD': 'postgres',
        'HOST': 'db',
        'PORT': 5432,
    }
}
Am I missing anything?
Your code can run in two different environments, and hard-coding the connection information might not be correct.
You mention in a comment that you're running something like:
docker-compose up -d db
python manage.py makemigrations
In this environment python is running outside of Docker. If you add ports: [5432:5432] to the database configuration in the docker-compose.yml file, the database will be accessible via (probably) localhost. On the other hand, when you run docker-compose up, the application runs inside Docker and the database will be reachable at db.
You can use an environment variable to configure this. I find it useful to give these variables default values that would be useful for a developer, and set them to different values in my deployment setup (the docker-compose.yml).
DATABASES = {
    'default': {
        ...
        'HOST': os.getenv('DB_HOST', 'localhost'),
        ...
    }
}
version: "3.9"
services:
  db:
    ports:
      - '5432:5432' # makes this accessible from your development environment
    ...
  web:
    environment:
      - DB_HOST=db
    ...
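With this in place, docker-compose up gives the web container DB_HOST=db, while manage.py commands run directly on your machine fall back to the localhost default and reach the database through the published port 5432.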

Docker Compose Django + PostgreSQL access postgres host without using service name

So, basically I have this docker-compose.yml config:
services:
  postgres:
    container_name: youtube_manager_postgres
    restart: always
    image: postgres:alpine
    environment:
      - POSTGRES_HOST_AUTH_METHOD=trust
      - POSTGRES_USER=admin
      - POSTGRES_PASSWORD=qwerty123
      - POSTGRES_DB=ytmanager
    volumes:
      - postgres_data:/var/lib/postgresql/data/
    ports:
      - "5432:5432"
  django:
    container_name: youtube_manager_django
    restart: always
    build:
      context: ../
      dockerfile: deploy/django/Dockerfile
    command: sh -c "poetry run python3 manage.py migrate &&
      poetry run python3 manage.py collectstatic --no-input --clear &&
      poetry run uwsgi --ini /etc/uwsgi.ini"
    ports:
      - "8000:8000"
    volumes:
      - staticfiles:/code/static
      - mediafiles:/code/media
    depends_on:
      - postgres
My Django's database preferences are:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'ytmanager',
        'USER': 'admin',
        'HOST': '0.0.0.0',
        'PASSWORD': 'qwerty123',
        'PORT': '5432',
    }
}
I want to use it in two ways:
1. Running docker-compose up -d postgres and then python3 manage.py runserver (actually poetry run python3 manage.py runserver, but for now that doesn't matter) during development.
2. Running docker-compose up during deployment.
For now it works fine with option 1, but when I execute docker-compose up I get an error:
youtube_manager_django | django.db.utils.OperationalError: could not connect to server: Connection refused
youtube_manager_django | Is the server running on host "0.0.0.0" and accepting
youtube_manager_django | TCP/IP connections on port 5432?
If I change my Django database settings to this:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'ytmanager',
        'USER': 'admin',
        'HOST': '0.0.0.0',
        'PASSWORD': 'qwerty123',
        'PORT': '5432',
    }
}
Executing docker-compose up -d postgres and then python manage.py runserver causes an error:
django.db.utils.OperationalError: could not translate host name "postgres" to address: Temporary failure in name resolution
How can I properly configure docker-compose.yml so that the same HOST works in the Django settings for both options (for example, 'HOST': '0.0.0.0' or 'HOST': 'postgres' in both cases)?
I've tried using network_mode: host on my django and postgres services. It works fine, but is there any other way to solve the problem (for example, with networking settings)? I've read the Docker Compose documentation on their website, but I can't work out what's going on there.
I think you are mixing development and production environments (by the way, the second time you pasted your Django database settings you probably meant 'HOST': 'postgres').
So, if I'm not wrong:
In development you want 'HOST': '0.0.0.0' in your Django settings .py file, since I think you are executing python manage.py runserver outside Docker while keeping Postgres in Docker.
In production you would like to keep 'HOST': '0.0.0.0' as well, but to make it work you need 'HOST': 'postgres' (matching the name of the service in the compose file) and you need to run everything in Docker (executing the whole compose file as it is: docker-compose up). In this case Django can't reach the database at '0.0.0.0': it is running containerized, that IP doesn't bind to any service, so it needs the IP or the name of the 'postgres' service.
Possible solution:
In my opinion the solution is to have two .yml files to be passed to Docker Compose (e.g. docker-compose -f docker-compose-development.yml up):
docker-compose-development.yml
docker-compose-production.yml
In each .yml file you can use different env variables or settings to cover differences between development and production in a clean way.
You can have a look at https://github.com/pydanny/cookiecutter-django, a template Django project that uses Docker.
It follows the Twelve-Factor App methodology:
https://12factor.net/
In short:
Environment variables are set in files under the .envs folder.
In the compose .yml files you point to them in order to load the environment variables:
env_file:
  - ./.envs/.production/.postgres
The Django settings .py files then read those environment variables using the django-environ package.
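For illustration, a minimal sketch of what that looks like in a settings module (the variable names here are examples, not necessarily the exact ones cookiecutter-django uses):

# settings.py sketch using django-environ; assumes the env vars are supplied
# to the container through the env_file entries above
import environ

env = environ.Env()

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': env('POSTGRES_DB'),
        'USER': env('POSTGRES_USER'),
        'PASSWORD': env('POSTGRES_PASSWORD'),
        'HOST': env('POSTGRES_HOST'),                   # 'postgres' inside Compose
        'PORT': env('POSTGRES_PORT', default='5432'),
    }
}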
Before editing Django DB config, please ensure the following:
Both containers are running in the same network.
The Postgres service is up and running in its container.
The 'postgres' service is reachable from the webapp container. To check this you can log in to the container and run a ping test:
docker exec -it containerID /bin/bash (in some cases /bin/sh) # to log in to the container
Solutions:
Similar:
Django connection to postgres by docker-compose
For you to connect to the DB using the service name in Django, set 'HOST' to that service name. Per the documentation, if the HOST and PORT keys are left out of the dictionary, Django falls back to its defaults (for PostgreSQL that means a local Unix socket), which will not reach a database running in another container.
Hope it helps

Struggling with connection between Django and Postgres within Docker containers

I've been hitting the following error for a while now and can't seem to fix it...
django.db.utils.OperationalError: could not connect to server: Connection refused
Is the server running on host "127.0.0.1" and accepting
TCP/IP connections on port 5432?
Multiple resources stated it was simply due to the HOST setting within my DATABASES, but the following is what I am working with and I still can't get it to work:
DATABASES = {
    'default': {
        'HOST': 'db',
        'ENGINE': 'django_postgrespool2',
        'NAME': os.environ.get('PROJECT_HEARSAY_DB_NAME'),
        'USER': os.environ.get('PROJECT_HEARSAY_DB_USER'),
        'PASSWORD': os.environ.get('PROJECT_HEARSAY_DB_PASSWORD'),
        'PORT': os.environ.get('PROJECT_HEARSAY_DB_PORT'),
    }
}
Here is the Dockerfile for my Django app:
FROM python:3
ENV PYTHONUNBUFFERED 1
COPY . /app
WORKDIR /app
RUN pip install --upgrade pip
RUN pip install -r requirements.txt
Here is the Dockerfile for my Postgresql DB:
FROM postgres
And here is the docker-compose.yml that I am working with:
version: "3"
services:
  postgresql:
    build:
      context: ./db
    container_name: db.postgresql
    ports:
      - 5432:5432
    environment:
      POSTGRES_DB: "db_name"
      POSTGRES_USER: "username"
      POSTGRES_PASSWORD: "password"
  django:
    restart: always
    build:
      context: ./api
    command: bash -c "./manage.py migrate && ./manage.py runserver 0.0.0.0:8000"
    container_name: api.django
    ports:
      - 8000:8000
    depends_on:
      - postgresql
I'm curious if anyone here could shed some light on what I am doing wrong.
Thank you.
Your container name is db.postgresql, so the host name in your connection settings should be db.postgresql (the container name) or postgresql (the service name).
DATABASES = {
    'default': {
        'HOST': 'db.postgresql',
        'ENGINE': 'django_postgrespool2',
        'NAME': os.environ.get('PROJECT_HEARSAY_DB_NAME'),
        'USER': os.environ.get('PROJECT_HEARSAY_DB_USER'),
        'PASSWORD': os.environ.get('PROJECT_HEARSAY_DB_PASSWORD'),
        'PORT': os.environ.get('PROJECT_HEARSAY_DB_PORT'),
    }
}
You can verify the connection:
docker exec api.django bash -c "nslookup db.postgresql"
That should return the Postgres container's IP address from inside your Django container.
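If the name resolves but you still want to confirm Django itself can reach the database, a quick check (a sketch; it assumes you can open a Django shell inside the api.django container):

# run inside the Django container, e.g.
#   docker exec -it api.django python manage.py shell
from django.db import connection

connection.ensure_connection()            # raises OperationalError if the DB is unreachable
print(connection.settings_dict['HOST'])   # should print db.postgresql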

Django REST Docker with MySQL

I am trying to dockerize my existing Django REST project. I am using a MySQL database instead of the default SQLite.
My Dockerfile looks like following:
FROM python:2.7
ENV PYTHONUNBUFFERED 1
RUN mkdir /code
WORKDIR /code
COPY . /code/
RUN pip install -r requirements.txt
and my docker-compose.yml:
version: '3'
services:
  db:
    image: mysql
    environment:
      MYSQL_ROOT_PASSWORD: docker
      MYSQL_DATABASE: docker
      MYSQL_USER: docker
      MYSQL_PASSWORD: docker
    ports:
      - "3306:3306"
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code
    ports:
      - "8000:8000"
I did not run docker-compose run web python manage.py migrate.
docker-compose build is successful.
However, docker-compose up eventually fails, saying Can't connect to local MySQL server. I am guessing that I need to install MySQL in my container as well, but I do not know how. What am I missing in my Dockerfile or docker-compose.yml?
UPDATE: My settings.py:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql', #django.db.backends.mysql
        'NAME': 'libraries',   #local: libraries #server:
        'USER': 'root',        #root #root
        'PASSWORD': 'root',    #local: root #server:
        'HOST': 'localhost',   #local: localhost #server:
        'PORT': '3306',
    }
}
Change the HOST in the database settings from localhost to db. There is no MySQL server inside the web container, so the Python code cannot connect via localhost; it has to reach the db service over the Compose network.
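A sketch of the adjusted settings, using the values the db service is configured with in the compose file above (illustrative only; the name, user, and password must match what the MySQL container actually creates from its MYSQL_* variables):

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'docker',      # MYSQL_DATABASE from docker-compose.yml
        'USER': 'docker',      # MYSQL_USER
        'PASSWORD': 'docker',  # MYSQL_PASSWORD
        'HOST': 'db',          # the Compose service name instead of localhost
        'PORT': '3306',
    }
}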