Failed to connect to Postgres container in Django - django

django.db.utils.OperationalError: connection to server at "db" (172.18.0.2), port 5432 failed: FATAL: the database system is starting up
I have an issue connecting to the Postgres container. I have tried different approaches, such as setting the passwords only in the docker-compose file, but I am still stuck.
docker-compose.yml
version: '3'
services:
  db:
    image: postgres:alpine
    environment:
      - POSTGRES_PASSWORD=password
      - POSTGRES_USER=user
      - POSTGRES_DB=userdb
    volumes:
      - django_db:/var/lib/postgresql/data
    container_name: db
  singup:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    ports:
      - 8000:8000
    container_name: singup
    depends_on:
      - db
volumes:
  django_db:
Database settings:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'userdb',
        'USER': 'user',
        'PASSWORD': 'password',
        'HOST': 'db',
    }
}

This is a timing issue. Even though you have the lines below,
singup:
  depends_on:
    - db
Compose only waits for the db container to be up and running, not necessarily for Postgres itself to be ready.
To avoid this, use a tool such as wait-for-it, dockerize, sh-compatible wait-for, or the RelayAndContainers template. These are small wrapper scripts which you can include in your application's image to poll a given host and port until it's accepting TCP connections.
For example, to use wait-for-it.sh or wait-for to wrap your service's command:
singup:
  depends_on:
    - db
  command: ["./wait-for-it.sh", "db:5432", "--", "python", "app.py"]
Repo of wait-for-it: https://github.com/vishnubob/wait-for-it
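If you prefer not to ship a shell script, the same polling idea can be sketched in plain Python with only the standard library. This is an illustrative sketch, not from the original answer; `wait_for_tcp` is a hypothetical helper name, and note that, like wait-for-it, it only verifies that the port accepts TCP connections, not that Postgres has fully finished starting up:

```python
import socket
import time

def wait_for_tcp(host, port, timeout=30.0, interval=1.0):
    """Poll host:port until it accepts TCP connections; give up after `timeout` seconds."""
    deadline = time.monotonic() + timeout
    while True:
        try:
            # create_connection raises OSError while the port is unreachable or refusing
            with socket.create_connection((host, port), timeout=interval):
                return True
        except OSError:
            if time.monotonic() >= deadline:
                return False
            time.sleep(interval)
```

You could run such a check at the top of a small wrapper script before exec'ing `python manage.py runserver`, mirroring what wait-for-it does.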

Related

docker-compose django unable to connect to postgres

I'm trying to dockerize an existing Django application. I've followed the example docker-compose and Docker files at https://docs.docker.com/samples/django/
I've added the links and networks suggestions, as seen in other posts, to no avail.
I've brought up the db by itself with docker-compose up db, run docker ps, and confirmed it's up, accepting connections, and has the name db. I'm not sure what else to try.
My docker-compose:
version: '3'
services:
  db:
    image: postgres
    volumes:
      - ./data/db:/var/lib/postgresql/data
    environment:
      - POSTGRES_DB=postgres
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
    container_name: db
    networks:
      harpnetwork:
  harp:
    build:
      context: .
      dockerfile: docker/harp/Dockerfile
    volumes:
      - ./harp:/harp
    environment:
      - POSTGRES_DB=postgres
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
    ports:
      - 16765:16765
    depends_on:
      - db
    networks:
      harpnetwork:
    links:
      - db:db
networks:
  harpnetwork:
My django db config:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': os.environ.get('POSTGRES_NAME'),
        'USER': os.environ.get('POSTGRES_USER'),
        'PASSWORD': os.environ.get('POSTGRES_PASSWORD'),
        'HOST': 'db',
        'PORT': 5432,
    }
}
and my Dockerfile to build the django project
FROM python:3.10
COPY ./harp harp
WORKDIR /harp
RUN pip install -r requirements.txt
RUN python manage.py migrate
CMD ["python", "manage.py", "runserver", "0.0.0.0:16765"]

How to connect to local network database from docker container

I'm trying to connect, from a Dockerized Django container, to a SQL Server Express database accessible at 192.168.0.130:1433 on the local network.
I'm on a Mac, and from the local host I can ping it:
$ ping 192.168.0.130
64 bytes from 192.168.0.130: icmp_seq=0 ttl=128 time=5.796 ms
64 bytes from 192.168.0.130: icmp_seq=1 ttl=128 time=2.234 ms
But inside the Docker container I get a timeout error.
docker-compose.yml:
version: '3.7'
services:
...
django:
container_name: djangonoguero_django_ctnr
build:
context: .
dockerfile: Dockerfile-django
restart: unless-stopped
env_file: ./project/project/settings/.env
command: python manage.py runserver 0.0.0.0:8000
volumes:
- ./project:/djangonoguero
depends_on:
- postgres
ports:
- 8000:8000
networks:
- djangonoguero-ntwk
networks:
djangonoguero-ntwk:
driver: bridge
Could anybody help me, please? Thanks in advance.
Resetting Docker to its factory defaults solved the problem.
For example, say you have a postgres container:
db:
  image: postgres
  environment:
    # Example values only; do not use these credentials in production!
    POSTGRES_USER: developer
    POSTGRES_DB: db
    POSTGRES_PASSWORD: Passw0rd33
...
Then, in your Django project settings, you should write:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'db',
        'USER': 'developer',
        'PASSWORD': 'Passw0rd33',
        'HOST': 'db',
    }
}
Be careful: these are example settings! Best practice is to store this data in a .env file!
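A minimal sketch of that practice, reading the credentials from environment variables (for example, populated from a .env file via docker-compose's env_file option). The variable names and fallback defaults here are assumptions for illustration, not part of the original answer:

```python
import os

# Credentials come from the environment; the fallbacks mirror the example above.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': os.environ.get('POSTGRES_DB', 'db'),
        'USER': os.environ.get('POSTGRES_USER', 'developer'),
        'PASSWORD': os.environ.get('POSTGRES_PASSWORD', ''),
        'HOST': os.environ.get('POSTGRES_HOST', 'db'),
        'PORT': int(os.environ.get('POSTGRES_PORT', '5432')),
    }
}
```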

could not translate host name "db" to address: Unknown host

I'm attempting to follow the guide provided by: https://docs.docker.com/compose/django/
Whenever I attempt to run makemigrations, it gives me the "Unknown host" error from the title. I'm trying to use PostgreSQL with Django, and Wagtail as its CMS.
My docker-compose.yml looks like:
version: "3.9"
services:
  db:
    image: postgres
    environment:
      - POSTGRES_DB=postgres
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code
    ports:
      - "8000:8000"
    depends_on:
      - db
and my settings in the settings.py file look like:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'postgres',
        'USER': 'postgres',
        'PASSWORD': 'postgres',
        'HOST': 'db',
        'PORT': 5432,
    }
}
Am I missing anything?
Your code can run in two different environments, and hard-coding the connection information might not be correct for both.
You mention in a comment that you're running something like:
docker-compose up -d db
python manage.py makemigrations
In this setup, python runs outside of Docker. If you add ports: ["5432:5432"] to the db service in the docker-compose.yml file, the database will (probably) be accessible via localhost. On the other hand, when you run docker-compose up, the application runs inside Docker and the database is reachable at db.
You can use an environment variable to configure this. I find it useful to give these variables default values that are useful for a developer, and to set them to different values in my deployment setup (the docker-compose.yml).
DATABASES = {
    'default': {
        ...
        'HOST': os.getenv('DB_HOST', 'localhost'),
        ...
    }
}
version: "3.9"
services:
  db:
    ports:
      - '5432:5432' # makes this accessible from your development environment
    ...
  web:
    environment:
      - DB_HOST=db
    ...

Django docker - could not translate host name "db" to address: nodename nor servname provided, or not known

I'm relatively new to Django and Docker and am following a tutorial to build a mini application. However, I'm getting stuck with the following error:
django.db.utils.OperationalError: could not translate host name "db" to address: nodename nor servname provided, or not known
My docker-compose file looks as follows:
version: '3'
services:
  db:
    image: 'postgres'
    ports:
      - '5432'
  core:
    build:
      context: .
      dockerfile: Dockerfile
    command: python3 manage.py runserver 0.0.0.0:8000
    ports:
      - '8000:8000'
    volumes:
      - .:/code
    depends_on:
      - db
    links:
      - db:db
My settings.py file contains the database:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'postgres',
        'USER': 'postgres',
        'HOST': 'db',
        'PORT': 5432,
    }
}
I've seen the posts here and here; however, neither fixed the issue.
Would appreciate some guidance. Thanks.
So you are trying to reach the db, which is running in one container, from another container? If yes, the following could potentially help; at least it helped me when I had similar issues.
Try defining a networks config in addition to links in your compose file: create a network with some name and define it in both services, as described here and as the docs on the links config recommend.
Something like this for your case:
version: '3'
services:
  db:
    image: 'postgres'
    ports:
      - '5432'
    networks:
      some_network:
  core:
    build:
      context: .
      dockerfile: Dockerfile
    command: python3 manage.py runserver 0.0.0.0:8000
    ports:
      - '8000:8000'
    volumes:
      - .:/code
    depends_on:
      - db
    links:
      - db:db
    networks:
      some_network:
networks:
  some_network:
This helped me to resolve the host name and connect to the db.
I had to use docker-compose up to make sure my db container was running.
Instead of redefining the networks, use
docker-compose down -v
to stop all containers and remove all cached data.
Then
docker-compose up
to restart from scratch.
I've been searching for several days for the solution to this issue. Here's what I did:
1 - I copied the postgresql.conf.sample file from my postgres container to my project folder.
To reach it, open a shell in the container with docker exec -it yourcontainer bash and cd /usr/share/postgresql.
2 - I renamed it to postgresql.conf
3 - I modified these variables in postgresql.conf
listen_addresses = '*'
port = 5432
max_connections = 100
superuser_reserved_connections = 3
unix_socket_directories = '/tmp'
4 - I added it to my Dockerfile
COPY postgresql.conf /tmp/postgresql.conf
5 - settings.py
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.postgresql',
'NAME': 'test',
'USER': 'testing',
'PASSWORD': 'tests',
'HOST': 'localhost',
'PORT': 5432,
}
}
6 - I ran docker-compose up --build again
I came across this issue recently; I solved it by adding the Postgres environment variables to db in my docker-compose.yml file, like so:
version: '3'
services:
  db:
    image: 'postgres'
    ports:
      - '5432'
    environment:
      - POSTGRES_DB=booksdb
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
  core:
    build:
      context: .
      dockerfile: Dockerfile
    command: python3 manage.py runserver 0.0.0.0:8000
    ports:
      - '8000:8000'
    volumes:
      - .:/code
    depends_on:
      - db
    links:
      - db:db
In my specific case, I had used the postgres image without pinning a version, so the image got updated and the db (postgres) container wouldn't come up.
The following configuration, from the official documentation, works for me:
version: "3.8"
services:
  db:
    image: postgres
    environment:
      - POSTGRES_DB=postgres
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
  web:
    build: .
    command: python /code/manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code
    ports:
      - "8000:8000"
    depends_on:
      - db
Then run these commands:
docker-compose up -d --build
docker-compose logs
docker exec -it "container_name" python3 manage.py migrate
This works for me. Make sure the path to the manage.py file is correct.

Running Django migrations on dockerized project

I have a Django project running in multiple Docker containers with the help of docker-compose. The source code is mounted from a directory on my local machine. Here's the compose configuration file:
version: '3'
services:
  db:
    image: 'postgres'
    ports:
      - '5432:5432'
  core:
    build:
      context: .
      dockerfile: Dockerfile
    command: python3 manage.py runserver 0.0.0.0:8000
    ports:
      - '8001:8000'
    volumes:
      - .:/code
    depends_on:
      - db
Although the application starts as it should, I can't run migrations, because every time I run manage.py makemigrations ... I receive:
django.db.utils.OperationalError: could not translate host name "db" to address: nodename nor servname provided, or not known
Obviously I can open bash inside the core container and run makemigrations from there, but then the migration files are created inside the container, which is very inconvenient.
In my project's settings the database is configured as:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'postgres',
        'USER': 'postgres',
        'HOST': 'db',
        'PORT': '5432',
    }
}
As the docker postgres image is accessible at localhost:5432, I tried changing the database host in settings to:
'HOST': '0.0.0.0'
But then when firing up the containers with docker-compose up I'm receiving:
...
django.db.utils.OperationalError: could not connect to server:
Connection refused
Is the server running on host "0.0.0.0" and accepting
TCP/IP connections on port 5432?
...
How should I configure database in settings.py so that Django can access it to create migrations?
Your docker-compose configuration is not correct: you forgot to link the services.
version: '3'
services:
  db:
    image: 'postgres'
    ports:
      - '5432:5432'
  core:
    build:
      context: .
      dockerfile: Dockerfile
    command: python3 manage.py runserver 0.0.0.0:8000
    ports:
      - '8001:8000'
    volumes:
      - .:/code
    depends_on:
      - db
    links: # <- here
      - db