Docker together with Django and Postgres

I have the following configuration in settings.py:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',  # Add 'postgresql_psycopg2', 'mysql', 'sqlite3' or 'oracle'.
        'NAME': 'database1',  # Or path to database file if using sqlite3.
        # The following settings are not used with sqlite3:
        'USER': 'database1_role',
        'PASSWORD': 'database1_password',
        'HOST': 'database1',  # Empty for localhost through domain sockets or '127.0.0.1' for localhost through TCP.
        'PORT': '5432',  # Set to empty string for default.
        'ATOMIC_REQUESTS': True
    }
}
docker-compose.yml:
version: '3'

services:

  # web container, with django + gunicorn
  djangoapp:
    build: .
    environment:
      - DJANGO_SETTINGS_MODULE
    volumes:
      - .:/opt/services/djangoapp/src
      - static:/opt/services/djangoapp/static
      - media:/opt/services/djangoapp/media
    networks:
      - database1_network
      - nginx_network
    depends_on:
      - database1

  # reverse proxy container (nginx)
  nginx:
    image: nginx:1.13
    ports:
      - 8000:80
    volumes:
      - ./config/nginx/conf.d:/etc/nginx/conf.d
      - static:/opt/services/djangoapp/static
      - media:/opt/services/djangoapp/media
    networks:
      - nginx_network
    depends_on:
      - djangoapp

  # database containers, one for each db
  database1:
    image: postgres:10
    environment:
      POSTGRES_USER: database1_role
      POSTGRES_PASSWORD: database1_password
      POSTGRES_DB: database1
    ports:
      - "5432"
    volumes:
      - database1_volume:/var/lib/postgresql/data
    networks:
      - database1_network

networks:
  database1_network:
    driver: bridge
  nginx_network:
    driver: bridge

volumes:
  database1_volume:
  static:
  media:
I still receive the following error whenever I try to access my site in a web browser:
OperationalError at / FATAL: password authentication failed for user
"database1_role"
Previously I got a message that my database did not even exist / psycopg2 could not make any connection to it.
I solved that by making the database name match in both files (settings.py and POSTGRES_DB), but at this point I have no idea what to do next.
In order to run this setup I am using these two commands:
docker-compose build
docker-compose up
The command docker-compose ps clearly shows the three components:
Name                                Command                          State    Ports
------------------------------------------------------------------------------------
services_database1_1_f84f6d3c38e0   docker-entrypoint.sh postgres    Exit 0
services_djangoapp_1_da56c77d50ff   gunicorn -c config/gunicor ...   Exit 0
services_nginx_1_c6e0edb717c0       nginx -g daemon off;             Exit 0
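One thing worth checking in a setup like this: the postgres image only creates the role, password, and database from POSTGRES_USER, POSTGRES_PASSWORD, and POSTGRES_DB on its very first start, while database1_volume is still empty; credentials changed later are ignored until the volume is recreated (for example with docker-compose down -v). As a quick diagnostic, here is a minimal sketch (assuming psycopg2 is installed in the djangoapp image; the file name check_db.py is hypothetical) that can be run inside the container to see which error the database actually returns:
# check_db.py - hypothetical helper, run inside the djangoapp container with:
#   docker-compose run --rm djangoapp python check_db.py
import psycopg2

try:
    conn = psycopg2.connect(
        dbname="database1",
        user="database1_role",
        password="database1_password",
        host="database1",
        port=5432,
    )
    print("Connection OK:", conn.get_dsn_parameters())
    conn.close()
except psycopg2.OperationalError as exc:
    # "password authentication failed" here means the volume still holds the
    # old credentials; "could not translate host name" would mean a network issue.
    print("Connection failed:", exc)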

Related

How to access a postgres container through its name? Could not translate host name error

I have Django and Postgres containers. When it's time for Django to apply migrations, it doesn't see the Postgres container that I named pgdb, and I get this error:
django.db.utils.OperationalError: could not translate host name "pgdb"
to address: Temporary failure in name resolution
It appears that there is no Docker container named "pgdb". If I run "docker-compose run pgdb", it creates a Postgres container named "app_pgdb_run_23423423" under the "app" project. The odd thing is that I previously got this working with the same settings.py setup and the "pgdb" container name. What could be the underlying issue?
You can clone the full code from https://github.com/UberStreuner/mailing-service
Here is my settings.py setup; the environment variables are definitely not at fault.
DATABASES = {
    'default': {
        'ENGINE': os.environ.get('DB_ENGINE', 'django.db.backends.postgresql_psycopg2'),
        'NAME': os.environ.get('POSTGRES_DB'),
        'USER': os.environ.get('POSTGRES_USER'),
        'PASSWORD': os.environ.get('POSTGRES_PASSWORD'),
        'HOST': os.environ.get('POSTGRES_HOST', 'pgdb'),
        'PORT': os.environ.get('POSTGRES_PORT', '5432')
    }
}
docker-compose.yml
version: "3.8"
services:
django:
build: .
container_name: django
command: ./docker-entrypoint.sh
volumes:
- .:/usr/src/app/
ports:
- "8000:8000"
env_file:
- ./.dev.env
depends_on:
- pgdb
- redis
celery:
build: .
command: celery -A mailing worker -l INFO
volumes:
- .:/usr/src/app/
env_file:
- ./.dev.env
depends_on:
- django
- redis
- pgdb
celery-beat:
build: .
command: celery -A mailing beat -l INFO
volumes:
- .:/usr/src/app/
env_file:
- ./.dev.env
depends_on:
- django
- redis
- pgdb
pgdb:
image: postgres
container_name: pgdb
env_file:
- ./.dev.env
volumes:
- pgdata:/var/lib/postgresql/data
expose:
- "5432"
ports:
- "5432:5432"
redis:
image: "redis:alpine"
volumes:
pgdata:
.dev.env
SECRET_KEY=django-insecure-kh8ot+ag-^osyhg5itzj!iw_g#dklw3f75c0)^-7)*#^)#tc#x
DEBUG=1
DJANGO_ALLOWED_HOSTS=localhost 127.0.0.1
CELERY_BROKER=redis://redis:6379/0
CELERY_BACKEND=redis://redis:6379/0
POSTGRES_DB=django_postgres
POSTGRES_USER=postgres
POSTGRES_PASSWORD=password
docker-entrypoint.sh
#!/bin/bash
python manage.py migrate
python manage.py runserver 0.0.0.0:8000
I fixed the issue.
The problem was probably that I was importing the environment variables for Django both via python-dotenv (from the .dev.env file) and from docker-compose, where the same file is passed as env_file. I commented out the dotenv loading in settings.py, and it worked.
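For reference, here is a minimal sketch of how the dotenv load could have been kept for local runs without fighting docker-compose; this is an assumption on my part, not code from the repository. python-dotenv's override=False (its default) never replaces variables that are already present in the environment, so values injected through env_file keep precedence:
# settings.py (sketch): keep .dev.env as a fallback for local runs only.
from pathlib import Path

from dotenv import load_dotenv

BASE_DIR = Path(__file__).resolve().parent.parent

# override=False (python-dotenv's default) leaves variables that docker-compose
# already injected via env_file untouched, so the container values win.
load_dotenv(BASE_DIR / ".dev.env", override=False)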

docker-compose django unable to connect to postgres

I'm trying to dockerize an existing Django application. I've followed the example docker-compose.yml and Dockerfile at https://docs.docker.com/samples/django/
I've added the links and networks suggestions seen in other posts, to no avail.
I've brought up the db by itself with docker-compose up db, run docker ps, and confirmed it's up, accepting connections, and named db. I'm not sure what else to try.
My docker-compose:
version: '3'

services:
  db:
    image: postgres
    volumes:
      - ./data/db:/var/lib/postgresql/data
    environment:
      - POSTGRES_DB=postgres
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
    container_name: db
    networks:
      harpnetwork:

  harp:
    build:
      context: .
      dockerfile: docker/harp/Dockerfile
    volumes:
      - ./harp:/harp
    environment:
      - POSTGRES_DB=postgres
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
    ports:
      - 16765:16765
    depends_on:
      - db
    networks:
      harpnetwork:
    links:
      - db:db

networks:
  harpnetwork:
My Django db config:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': os.environ.get('POSTGRES_NAME'),
        'USER': os.environ.get('POSTGRES_USER'),
        'PASSWORD': os.environ.get('POSTGRES_PASSWORD'),
        'HOST': 'db',
        'PORT': 5432,
    }
}
And my Dockerfile to build the Django project:
FROM python:3.10
COPY ./harp harp
WORKDIR /harp
RUN pip install -r requirements.txt
RUN python manage.py migrate
CMD ["python", "manage.py", "runserver", "0.0.0.0:16765"]

Using a postgres docker as a database for django || Launching dockers before build of another one in the docker-compose.yml file

I know there have been similar questions on this site, but the answers to them are sadly outdated.
I have a Django application and I want to use Postgres as the underlying database.
Additionally, I want to separate the two programs into separate containers.
The Docker docs describe a way to do this, but sadly it seems to be outdated: link
The problem appears when I call manage.py migrate in the Docker build step, which is run by docker-compose.
I get the error that the host 'db' is unknown/invalid.
Compose excerpt:
services:
  db:
    image: postgres
    restart: always
    volumes:
      - DataBase:/var/lib/postgresql/data
    environment:
      - POSTGRES_DB=postgres
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
    ports:
      - "5432:5432"

  app:
    build: FehlzeitErfassungProject
    restart: always
    volumes:
      - Logs:/app/logs
      - Backups:/app/Backups
      - Media:/app/MEDIA
    ports:
      - "5432:5432"
    depends_on:
      - db
The app Dockerfile:
FROM ubuntu
WORKDIR /app
ADD ./requirements.txt ./
RUN apt-get update && \
apt-get upgrade -y
# getting tzdata to shutup
ARG DEBIAN_FRONTEND=noninteractive
ENV TZ=Europe/Berlin
RUN apt-get -y install tzdata
# installing all needed programs
RUN apt-get install -y uwsgi python3 python3-pip
RUN python3 -m pip install pip --upgrade
RUN python3 -m pip install -r requirements.txt
COPY ./ ./
RUN ./manage.py migrate
CMD ["uwsgi", "uwsgu.ini"]
PS: It appears that the other containers are only launched once the app container has finished building.
Edit:
the database_settings:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'postgres',
        'USER': 'postgres',
        'PASSWORD': 'postgres',
        'HOST': '127.0.0.1',
        'PORT': 5432,
    }
}
I think your problem is in settings.py, in the DB configuration: the HOST parameter needs the name of the postgresql service that you defined in the docker-compose file, which in this case is "db".
BTW, I think using hard-coded constants in settings.py is bad practice; it's better to define environment variables in the docker-compose file.
In your case that would be:
docker-compose.yml
services:
  db:
    image: postgres
    restart: always
    volumes:
      - DataBase:/var/lib/postgresql/data
    environment:
      - POSTGRES_DB=postgres
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
    ports:
      - "5432:5432"

  app:
    build: FehlzeitErfassungProject
    restart: always
    volumes:
      - Logs:/app/logs
      - Backups:/app/Backups
      - Media:/app/MEDIA
    ports:
      - "5432:5432"
    environment:
      - DB_HOST=db
      - DB_NAME=postgres
      - DB_USER=postgres
      - DB_PASSWORD=postgres
    depends_on:
      - db
settings.py
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': os.environ.get('DB_NAME'),
        'USER': os.environ.get('DB_USER'),
        'PASSWORD': os.environ.get('DB_PASSWORD'),
        'HOST': os.environ.get('DB_HOST'),
    }
}
Remember to import os in settings.py.
Binding both containers to the same host port will not work.
To access your database from the other container, you should link to it:
version: "3.0"
services:
db:
image: postgres
restart: always
volumes:
- DataBase:/var/lib/postgresql/data
environment:
- POSTGRES_DB=postgres
- POSTGRES_USER=postgres
- POSTGRES_PASSWORD=postgres
ports:
- "5432:5432"
app:
build: FehlzeitErfassungProject
restart: always
volumes:
- Logs:/app/logs
- Backups:/app/Backups
- Media:/app/MEDIA
ports:
- "8000:8000"
links:
- "db"
depends_on:
- db

Impossible to connect to my postgreSQL Database

I'm using Docker and I have started a new project.
I have a Django backend and I want to use a PostgreSQL database.
I can't connect to the database, either from Django or from pgAdmin.
This is what I get when I try to connect to my database (with Django or with pgAdmin).
Here is part of my docker-compose file:
services:
  db:
    image: postgres
    container_name: db_wishlist
    restart: always
    volumes:
      - ./postgres-data-wishlist:/var/lib/postgresql/data
    env_file:
      - env/base/db.env
      - env/dev/db.env
    ports:
      - "5432:5432"
    networks:
      wishlist:
        aliases:
          - db

  pgadmin:
    image: dpage/pgadmin4
    container_name: pgadmin
    depends_on:
      - db
    ports:
      - "5051:80"
    env_file:
      - env/base/db.env
      - env/dev/db.env
    environment:
      PGADMIN_DEFAULT_EMAIL: pgadmin4@pgadmin.org
      PGADMIN_DEFAULT_PASSWORD: root
    restart: always
    networks:
      wishlist:
        aliases:
          - pgadmin

  api:
    container_name: backend
    build:
      context: ..
      dockerfile: Docker/Dockerfiles/Django
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - ../Django/:/api
    ports:
      - "8000:8000"
    env_file:
      - env/base/db.env
      - env/dev/db.env
    depends_on:
      - db
    links:
      - db:db
    networks:
      wishlist:
        aliases:
          - api
Here is my db.env:
POSTGRES_DB=wishlist
POSTGRES_USER=maxdal1
POSTGRES_PASSWORD=demo
Here is my settings.py in Django:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'wishlist',
        'USER': 'maxdal1',
        'PASSWORD': 'demo',
        'HOST': 'db',
        'PORT': 5432,
    }
}
Here is an additional screenshot of what I get when I try to connect to my db with pgAdmin: [screenshot: Error with PGAdmin]
I checked every discussion about this error but I can't figure it out.
Can you help me please ?
Thanks.
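One way to narrow this down (a diagnostic sketch, not something from the original post) is to open a Django shell inside the backend container and force a connection; the exact OperationalError message distinguishes a name-resolution problem from a credentials problem, the latter usually meaning that ./postgres-data-wishlist was initialised with different credentials and needs to be recreated:
# Run inside the backend container, for example:
#   docker-compose exec api python manage.py shell
from django.db import connection
from django.db.utils import OperationalError

try:
    # Forces Django to open a connection using the DATABASES settings above.
    connection.ensure_connection()
    print("Connected to", connection.settings_dict["HOST"],
          "as", connection.settings_dict["USER"])
except OperationalError as exc:
    # "could not translate host name ..." -> networking / service-name problem;
    # "password authentication failed ..." -> the data volume holds other credentials.
    print("Connection failed:", exc)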
Have you tried passing your db user to postgres container via environment variable?
environment:
  POSTGRES_USER: "youruser"
The following setup works for me:
docker-compose.yaml
services:
  postgres:
    image: postgres:10-alpine
    volumes:
      - ./pgdata:/var/lib/postgresql/data/pgdata
    environment:
      - PGDATA=/var/lib/postgresql/data/pgdata
      - POSTGRES_DB=django_db
      - POSTGRES_USER=XXX
      - POSTGRES_PASSWORD=XXX
    ports:
      - 127.0.0.1:5432:5432
    networks:
      - backend

  django:
    build:
      context: .
      dockerfile: Dockerfile
    environment:
      - POSTGRES_HOST=postgres
      - POSTGRES_DB=django_db
      - POSTGRES_USER=XXX
      - POSTGRES_PASSWORD=XXX
    volumes:
      - ./staticfiles:/staticfiles
      - ./media:/media
    ports:
      - 8000:8000
    networks:
      - backend
      - frontend
settings.py
POSTGRES_HOST = os.environ.get('POSTGRES_HOST', 'localhost')
POSTGRES_DB = os.environ.get('POSTGRES_DB', 'django_db')
POSTGRES_USER = os.environ.get('POSTGRES_USER', 'XXX')
POSTGRES_PW = os.environ.get('POSTGRES_PASSWORD', 'XXX')

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': POSTGRES_DB,
        'HOST': POSTGRES_HOST,
        'USER': POSTGRES_USER,
        'PASSWORD': POSTGRES_PW,
        'CONN_MAX_AGE': 120,
    }
}
The Dockerfile for the django app specifies the following env vars:
ENV POSTGRES_DB=$POSTGRES_DB
ENV POSTGRES_HOST=$POSTGRES_HOST
ENV POSTGRES_USER=$POSTGRES_USER
ENV POSTGRES_PASSWORD=$POSTGRES_PASSWORD
You can run this either like this:
$ docker-compose up
or only run the postgres via docker and run Django locally:
$ docker-compose up postgres
in a different terminal:
$ ./manage.py runserver

Django docker - could not translate host name "db" to address: nodename nor servname provided, or not known

I'm relatively new to Django and Docker and am following a tutorial to build a mini application. However, I'm getting stuck with the following error:
django.db.utils.OperationalError: could not translate host name "db" to address: nodename nor servname provided, or not known
My docker-compose file looks as follows:
version: '3'

services:
  db:
    image: 'postgres'
    ports:
      - '5432'

  core:
    build:
      context: .
      dockerfile: Dockerfile
    command: python3 manage.py runserver 0.0.0.0:8000
    ports:
      - '8000:8000'
    volumes:
      - .:/code
    depends_on:
      - db
    links:
      - db:db
My settings.py file contains the database settings:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'postgres',
        'USER': 'postgres',
        'HOST': 'db',
        'PORT': 5432,
    }
}
I've seen the posts here and here; however, neither has fixed the issue.
Would appreciate some guidance. Thanks.
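"nodename nor servname provided, or not known" is the resolver wording you typically see on the host machine (macOS) rather than inside a Linux container, so it is worth checking whether manage.py was run on the host, where the service name db does not resolve. A quick check from inside the running web container, as a sketch (assuming the core service is up):
# Run from inside the core container, for example:
#   docker-compose exec core python3 -c "import socket; print(socket.gethostbyname('db'))"
import socket

# Prints the IP address that the compose network assigns to the db service.
# If this works inside the container but manage.py fails on the host,
# the settings are fine and the command was simply run in the wrong place.
print(socket.gethostbyname("db"))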
So you are trying to reach the db running in one container from another container? If yes, the following could potentially help; at least it helped me when I had similar issues.
Try defining a networks config in addition to links in your compose file: create a network with some name and declare it in both services, as described here; the docs on the links config recommend doing that.
Something like this for your case:
version: '3'

services:
  db:
    image: 'postgres'
    ports:
      - '5432'
    networks:
      some_network:

  core:
    build:
      context: .
      dockerfile: Dockerfile
    command: python3 manage.py runserver 0.0.0.0:8000
    ports:
      - '8000:8000'
    volumes:
      - .:/code
    depends_on:
      - db
    links:
      - db:db
    networks:
      some_network:

networks:
  some_network:
It helped me to resolve the host name to connect to the db.
I had to use docker-compose up to make sure my db container was running.
Instead of redefining the networks, use
docker-compose down -v
to stop all containers and remove all cached data.
Then
docker-compose up
to restart from scratch.
I've been searching for several days for the solution to this issue. Here's what I did:
1 - I copied the postgresql.conf.sample file from my postgres container to my project folder. You have to enter the container with docker exec -it yourcontainer bash and go to /usr/share/postgresql.
2 - I changed its name to postgresql.conf
3 - I modified these variables in postgresql.conf
listen_addresses = '*'
port = 5432
max_connections = 100
superuser_reserved_connections = 3
unix_socket_directories = '/tmp'
4 - I added it to my Dockerfile
COPY postgresql.conf /tmp/postgresql.conf
5 - settings.py
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'test',
        'USER': 'testing',
        'PASSWORD': 'tests',
        'HOST': 'localhost',
        'PORT': 5432,
    }
}
6 - I ran docker-compose up --build again
I came across this issue recently. I solved it by adding the Postgres environment variables to db in my docker-compose.yml file, like so:
version: '3'

services:
  db:
    image: 'postgres'
    ports:
      - '5432'
    environment:
      - POSTGRES_DB=booksdb
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres

  core:
    build:
      context: .
      dockerfile: Dockerfile
    command: python3 manage.py runserver 0.0.0.0:8000
    ports:
      - '8000:8000'
    volumes:
      - .:/code
    depends_on:
      - db
    links:
      - db:db
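Note that this compose file now creates a database named booksdb, so the DATABASES block from the question has to match it. A sketch of the matching settings (the PASSWORD entry is an addition; the original question's settings had none):
# settings.py (sketch) - values mirror the environment section of the db service above.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'booksdb',
        'USER': 'postgres',
        'PASSWORD': 'postgres',  # must match POSTGRES_PASSWORD in docker-compose.yml
        'HOST': 'db',
        'PORT': 5432,
    }
}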
In my specific case, I had postgres without any pinned version, so the postgres image got updated and the db (postgres) container wouldn't come up.
The following configuration, from the official documentation, works for me:
version: "3.8"
services:
db:
image: postgres
environment:
- POSTGRES_DB=postgres
- POSTGRES_USER=postgres
- POSTGRES_PASSWORD=postgres
web:
build: .
command: python /code/manage.py runserver 0.0.0.0:8000
volumes:
- .:/code
ports:
- "8000:8000"
depends_on:
- db
Then run these commands:
docker-compose up -d --build
docker-compose logs
docker exec -it "container_name" python3 manage.py migrate
This works for me. Make sure the path to the manage.py file is correct.