Struggling with connection between Django and Postgres within Docker containers

I've been hitting the following error for a while now and can't seem to fix it:
django.db.utils.OperationalError: could not connect to server: Connection refused
    Is the server running on host "127.0.0.1" and accepting
    TCP/IP connections on port 5432?
Multiple resources said it was simply a matter of the HOST setting within DATABASES, but here is what I am working with, and I still can't get it to work:
DATABASES = {
    'default': {
        'HOST': 'db',
        'ENGINE': 'django_postgrespool2',
        'NAME': os.environ.get('PROJECT_HEARSAY_DB_NAME'),
        'USER': os.environ.get('PROJECT_HEARSAY_DB_USER'),
        'PASSWORD': os.environ.get('PROJECT_HEARSAY_DB_PASSWORD'),
        'PORT': os.environ.get('PROJECT_HEARSAY_DB_PORT'),
    }
}
Here is the Dockerfile for my Django app:
FROM python:3
ENV PYTHONUNBUFFERED 1
COPY . /app
WORKDIR /app
RUN pip install --upgrade pip
RUN pip install -r requirements.txt
Here is the Dockerfile for my PostgreSQL DB:
FROM postgres
And here is the docker-compose.yml that I am working with:
version: "3"
services:
postgresql:
build:
context: ./db
container_name: db.postgresql
ports:
- 5432:5432
environment:
POSTGRES_DB: "db_name"
POSTGRES_USER: "username"
POSTGRES_PASSWORD: "password"
django:
restart: always
build:
context: ./api
command: bash -c "./manage.py migrate && ./manage.py runserver 0.0.0.0:8000"
container_name: api.django
ports:
- 8000:8000
depends_on:
- postgresql
I'm curious if anyone here could shed some light on what I am doing wrong.
Thank you.

Your service is named postgresql and its container name is db.postgresql, so the HOST in your connection settings should be db.postgresql or postgresql, not db.
DATABASES = {
    'default': {
        'HOST': 'db.postgresql',
        'ENGINE': 'django_postgrespool2',
        'NAME': os.environ.get('PROJECT_HEARSAY_DB_NAME'),
        'USER': os.environ.get('PROJECT_HEARSAY_DB_USER'),
        'PASSWORD': os.environ.get('PROJECT_HEARSAY_DB_PASSWORD'),
        'PORT': os.environ.get('PROJECT_HEARSAY_DB_PORT'),
    }
}
You can verify the connection with:
docker exec api.django bash -c "nslookup db.postgresql"
This will print the Postgres container's IP address as seen from inside your Django container.
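If you also want to check TCP reachability, not just DNS, a minimal sketch along these lines can be dropped into the Django container (the file name quick_db_check.py and the host value are assumptions; adjust them to your own service/container names):

# quick_db_check.py -- minimal connectivity sketch (hypothetical helper script).
# Confirms the Django container can reach Postgres by its Compose service name.
import socket

HOST = "postgresql"   # assumption: your Compose service name (or "db.postgresql")
PORT = 5432

try:
    with socket.create_connection((HOST, PORT), timeout=5):
        print(f"TCP connection to {HOST}:{PORT} succeeded")
except OSError as exc:
    print(f"Could not reach {HOST}:{PORT}: {exc}")

Run it with something like docker exec api.django python quick_db_check.py; if it prints the success line, the remaining problem is in the Django settings rather than the network.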

Related

Cannot connect django to postgres running inside docker container

My Django app is not dockerized, but I run Postgres inside a Docker container using a docker-compose.yml file. After docker-compose up I can connect to the DB with DBeaver, but not with the Django app. Every time I get the error:
django.db.utils.OperationalError: could not translate host name "db" to address:
    Temporary failure in name resolution
File docker-compose.yml:
version: "3.9"
services:
db:
image: postgres:13
volumes:
- postgres_data:/var/lib/postgresql/data/
environment:
- "POSTGRES_HOST_AUTH_METHOD=trust"
- POSTGRES_USER="postgres"
- POSTGRES_PASSWORD="postgres"
- POSTGRES_DB="postgres"
ports:
- 5432:5432
volumes:
postgres_data
Django project file config/settings.py:
...
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': env.str("DB_NAME"),
        'USER': env.str("DB_USER"),
        'PASSWORD': env.str("DB_PASS"),
        'HOST': env.str("DB_HOST"),
        'PORT': env.decimal("DB_PORT")
    }
}
It turned out to be some kind of pipenv issue. After I restarted the terminal, ran pipenv shell again, and then python manage.py runserver, everything worked just fine.
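For reference, since Django runs outside Docker here and only Postgres is containerized with its port published, DB_HOST has to point at the host machine (127.0.0.1), not at the Compose service name db. A minimal sketch of the settings, assuming the environs package that provides the env.str/env.decimal helpers used above and a .env file next to manage.py:

# config/settings.py excerpt -- sketch only; assumes the `environs` package and a .env file.
from environs import Env

env = Env()
env.read_env()  # load variables from .env

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': env.str("DB_NAME", "postgres"),
        'USER': env.str("DB_USER", "postgres"),
        'PASSWORD': env.str("DB_PASS", "postgres"),
        # Django runs on the host in this setup, so use the published port on
        # localhost rather than the Compose service name "db".
        'HOST': env.str("DB_HOST", "127.0.0.1"),
        'PORT': env.int("DB_PORT", 5432),
    }
}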

Django cannot find database when using docker-compose

Dockerfile
FROM python:3.8
ENV PYTHONUNBUFFERED 1
ENV WORKDIR /usr/src/app
WORKDIR ${WORKDIR}
RUN pip install --upgrade pip && pip install pipenv
COPY ./Pipfile* ${WORKDIR}/
RUN pipenv lock --requirements > requirements.txt
RUN pip install -r requirements.txt
ADD . ${WORKDIR}/
docker-compose.yml
version: '3'
services:
  database:
    image: postgres:latest
    environment:
      POSTGRES_PASSWORD: development
      POSTGRES_USER: development
      POSTGRES_DB: dev_db
    ports:
      - 5432:5432
  backend:
    build: .
    command: python /usr/src/app/manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/usr/src/app
    ports:
      - 8000:8000
    depends_on:
      - database
Django database configuration:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'dev_db',
        'USER': 'development',
        'PASSWORD': 'development',
        'HOST': 'database',
        'PORT': 5432,
    }
}
However, when I start docker compose, it gives the following message:
Is the server running on host "database" (172.29.0.2) and accepting
    TCP/IP connections on port 5432?
Can anyone provide some guidance on why I'm getting this error? As far as I know, both services are on the same network.
This is probably because Postgres is still starting up when the Django instance is ready and tries to connect.
A common solution is to use a script like wait-for to delay execution of a command until a service is accepting connections on a specific host/port.
Take a look at the Docker documentation about controlling the startup order of multiple containers.
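As a rough alternative to a shell wait-for script, a minimal Python sketch along these lines can do the same job (the file name wait_for_db.py and the host/port values are assumptions; adapt them to your compose service names):

# wait_for_db.py -- minimal startup-ordering sketch (hypothetical helper, not part of the project).
# Blocks until the database accepts TCP connections so migrate/runserver don't race it.
import socket
import sys
import time

def wait_for(host: str, port: int, timeout: float = 60.0) -> bool:
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=2):
                return True          # database is accepting connections
        except OSError:
            time.sleep(1)            # not ready yet; retry
    return False

if __name__ == "__main__":
    if not wait_for("database", 5432):
        sys.exit("database never became reachable")

With something like that in place, the backend command in docker-compose.yml could become python wait_for_db.py && python /usr/src/app/manage.py runserver 0.0.0.0:8000.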
I think you are missing the hostname in the yml file. See below:
services:
  database:
    image: postgres:latest
    hostname: database
    environment:
      POSTGRES_PASSWORD: development
      POSTGRES_USER: development
      POSTGRES_DB: dev_db
    ports:
      - 5432:5432

Running Django migrations on dockerized project

I have a Django project running in multiple Docker containers with the help of docker-compose. The source code is mounted from a directory on my local machine. Here's the compose configuration file:
version: '3'
services:
  db:
    image: 'postgres'
    ports:
      - '5432:5432'
  core:
    build:
      context: .
      dockerfile: Dockerfile
    command: python3 manage.py runserver 0.0.0.0:8000
    ports:
      - '8001:8000'
    volumes:
      - .:/code
    depends_on:
      - db
Although the application starts as it should, I can't run migrations, because every time I do manage.py makemigrations ... I receive:
django.db.utils.OperationalError: could not translate host name "db" to address: nodename nor servname provided, or not known
Obviously I can open a bash shell inside the core container and run makemigrations from there, but then the migration files are created inside the container, which is very inconvenient.
In my project's settings the database is configured as:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'postgres',
        'USER': 'postgres',
        'HOST': 'db',
        'PORT': '5432',
    }
}
Since the Postgres container is published at localhost:5432, I tried changing the database host in settings to:
'HOST': '0.0.0.0'
But then when firing up the containers with docker-compose up I'm receiving:
...
django.db.utils.OperationalError: could not connect to server:
Connection refused
Is the server running on host "0.0.0.0" and accepting
TCP/IP connections on port 5432?
...
How should I configure database in settings.py so that Django can access it to create migrations?
Your docker-compose configuration is not correct. You forgot to link the services:
version: '3'
services:
  db:
    image: 'postgres'
    ports:
      - '5432:5432'
  core:
    build:
      context: .
      dockerfile: Dockerfile
    command: python3 manage.py runserver 0.0.0.0:8000
    ports:
      - '8001:8000'
    volumes:
      - .:/code
    depends_on:
      - db
    links: # <- here
      - db
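It may also be worth noting that, because the compose file bind-mounts the project directory into the container (.:/code), any migration files generated inside the container appear in the same directory on the host, so running something like docker-compose run core python manage.py makemigrations avoids the "files stuck in the container" problem mentioned above.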

Docker container runs with docker-compose up but doesn't with docker run

This is my first attempt at Docker and I've hit a wall.
I don't understand how Docker works and when my local machine is "detached" from the container. In the end, the project runs with docker-compose run, but doesn't with docker run because it cannot connect to Postgres.
When I run docker-compose run with these settings everything works:
settings.py:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'postgres',
        'USER': 'user',
        'PASSWORD': 'password',
        'HOST': 'db',
        'PORT': '5432',
    }
}
docker-compose.yml
version: '2'
services:
  db:
    image: postgres
    environment:
      - POSTGRES_USER=user
      - POSTGRES_PASSWORD=password
    ports:
      - '5432:5432'
  redis:
    image: redis
    ports:
      - '6379:6379'
  celery:
    build:
      context: .
      dockerfile: Dockerfile
    env_file: common.env
    command: celery -A saleor worker --app=saleor.celeryconf:app --loglevel=info
    volumes:
      - .:/app:Z
    links:
      - redis
    depends_on:
      - redis
  search:
    image: elasticsearch:5.4.3
    mem_limit: 512m
    environment:
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
    ports:
      - '127.0.0.1:9200:9200'
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    env_file: common.env
    depends_on:
      - db
      - redis
      - search
    ports:
      - '8000:8000'
    volumes:
      - .:/app:Z
  makemigrations:
    build: .
    command: python manage.py makemigrations --noinput
    volumes:
      - .:/app:Z
  migration:
    build: .
    command: python manage.py migrate --noinput
    volumes:
      - .:/app:Z
However, if I run
docker build -t db_test .
docker run -p 4000:8000 db_test
Dockerfile:
FROM python:3.5
ENV PYTHONUNBUFFERED 1
RUN \
apt-get -y update && \
apt-get install -y gettext && \
apt-get clean
ADD requirements.txt /app/
RUN pip install -r /app/requirements.txt
ADD . /app
WORKDIR /app
EXPOSE 8000
ENV PORT 8000
CMD ["uwsgi", "/app/saleor/wsgi/uwsgi.ini"]
I am getting an exception:
OperationalError at / could not translate host name "db" to address: Name or service not known
If I change the DB settings to:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'postgres',
        'USER': 'user',
        'PASSWORD': 'password',
        'HOST': 'localhost',
        'PORT': '5432',
    }
}
then after build and run I get:
could not connect to server: Connection refused
    Is the server running on host "localhost" (127.0.0.1) and accepting
    TCP/IP connections on port 5432?
could not connect to server: Cannot assign requested address
    Is the server running on host "localhost" (::1) and accepting
    TCP/IP connections on port 5432?
This is the Postgres port on my machine. Even if I turn off Postgres so the port is free, nothing changes.
Any help will be highly appreciated!

Django rest Docker with MySQL

I am trying to dockerize my existing Django REST project. I am using a MySQL database instead of the default SQLite.
My Dockerfile looks like following:
FROM python:2.7
ENV PYTHONUNBUFFERED 1
RUN mkdir /code
WORKDIR /code
COPY . /code/
RUN pip install -r requirements.txt
and my docker-compose.yml:
version: '3'
services:
  db:
    image: mysql
    environment:
      MYSQL_ROOT_PASSWORD: docker
      MYSQL_DATABASE: docker
      MYSQL_USER: docker
      MYSQL_PASSWORD: docker
    ports:
      - "3306:3306"
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code
    ports:
      - "8000:8000"
I did not run docker-compose run web python manage.py migrate.
docker-compose build is successful.
However, docker-compose up eventually fails, saying Can't connect to local MySQL server. I am guessing that I need to install MySQL in my container as well, but I don't know how. What am I missing in my Dockerfile or docker-compose?
UPDATE: My settings.py:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql', #django.db.backends.mysql
        'NAME': 'libraries', #local: libraries #server:
        'USER': 'root', #root #root
        'PASSWORD': 'root', #local: root #server:
        'HOST': 'localhost', #local: localhost #server:
        'PORT': '3306',
    }
}
Change the HOST in the database settings from localhost to db. There is no MySQL server inside the web container, so the Python code cannot reach the database via localhost.
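Put concretely, a minimal sketch of that change, mirroring the credentials from the docker-compose.yml above (adjust NAME/USER/PASSWORD to whatever you actually use):

# settings.py excerpt -- sketch of the suggested fix, using the compose environment above.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'docker',      # MYSQL_DATABASE in docker-compose.yml
        'USER': 'docker',      # MYSQL_USER
        'PASSWORD': 'docker',  # MYSQL_PASSWORD
        'HOST': 'db',          # the Compose service name, not localhost
        'PORT': '3306',
    }
}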