How to connect to local network database from docker container - django

I'm trying to connect from a Docker Django container to a SQL Server Express database that is reachable at 192.168.0.130:1433 on my local network.
I'm on a Mac, and from the host the ping works:
$ ping 192.168.0.130
64 bytes from 192.168.0.130: icmp_seq=0 ttl=128 time=5.796 ms
64 bytes from 192.168.0.130: icmp_seq=1 ttl=128 time=2.234 ms
But from inside the Docker container I get a timeout error.
docker-compose.yml:
version: '3.7'
services:
  ...
  django:
    container_name: djangonoguero_django_ctnr
    build:
      context: .
      dockerfile: Dockerfile-django
    restart: unless-stopped
    env_file: ./project/project/settings/.env
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - ./project:/djangonoguero
    depends_on:
      - postgres
    ports:
      - 8000:8000
    networks:
      - djangonoguero-ntwk
networks:
  djangonoguero-ntwk:
    driver: bridge
Could anybody help me, please?
Thanks in advance.
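One way to narrow down where the timeout comes from is a raw TCP reachability check run from inside the container. A minimal sketch (the host and port are the LAN SQL Server address from the question; the function name is made up for illustration):

```python
# Minimal TCP reachability check; run it inside the Django container,
# e.g. with `docker-compose exec django python`.
import socket

def can_connect(host, port, timeout=5.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example: can_connect("192.168.0.130", 1433) from inside the container.
```

If this returns False inside the container but the same check succeeds on the host, the problem is container networking rather than the database itself.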

Resetting Docker to its factory defaults solved the problem.

For example, say you have a postgres container:
db:
  image: postgres
  environment:
    # Example values only; do not use these in production!
    POSTGRES_USER: developer
    POSTGRES_DB: db
    POSTGRES_PASSWORD: Passw0rd33
...
Then, in your Django project settings, you should write:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'db',
        'USER': 'developer',
        'PASSWORD': 'Passw0rd33',
        'HOST': 'db',
    }
}
Be careful, these are example settings! Best practice is to store this data in a .env file!
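The settings above can also read the credentials from the environment, which keeps secrets out of source control. A sketch, assuming variables named POSTGRES_* are exported into the container (e.g. via docker-compose's env_file); the variable names and fallback values are assumptions, not part of the original answer:

```python
# Sketch: read DB credentials from the environment instead of hard-coding
# them in settings.py. The POSTGRES_* names and defaults are illustrative.
import os

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': os.environ.get('POSTGRES_DB', 'db'),
        'USER': os.environ.get('POSTGRES_USER', 'developer'),
        'PASSWORD': os.environ.get('POSTGRES_PASSWORD', ''),
        'HOST': os.environ.get('POSTGRES_HOST', 'db'),  # compose service name
        'PORT': os.environ.get('POSTGRES_PORT', '5432'),
    }
}
```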

Related

docker-compose django unable to connect to postgres

I'm trying to dockerize an existing Django application. I've followed the example docker-compose and Dockerfile on https://docs.docker.com/samples/django/
I've added the links and networks suggestions seen in other posts, to no avail.
I've brought up the db by itself with docker-compose up db, then run docker ps and confirmed it's up, accepting connections, and has the name db. I'm not sure what else to try.
My docker-compose:
version: '3'
services:
  db:
    image: postgres
    volumes:
      - ./data/db:/var/lib/postgresql/data
    environment:
      - POSTGRES_DB=postgres
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
    container_name: db
    networks:
      harpnetwork:
  harp:
    build:
      context: .
      dockerfile: docker/harp/Dockerfile
    volumes:
      - ./harp:/harp
    environment:
      - POSTGRES_DB=postgres
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
    ports:
      - 16765:16765
    depends_on:
      - db
    networks:
      harpnetwork:
    links:
      - db:db
networks:
  harpnetwork:
My django db config:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': os.environ.get('POSTGRES_NAME'),
        'USER': os.environ.get('POSTGRES_USER'),
        'PASSWORD': os.environ.get('POSTGRES_PASSWORD'),
        'HOST': 'db',
        'PORT': 5432,
    }
}
And my Dockerfile to build the Django project:
FROM python:3.10
COPY ./harp harp
WORKDIR /harp
RUN pip install -r requirements.txt
RUN python manage.py migrate
CMD ["python", "manage.py", "runserver", "0.0.0.0:16765"]

failed to connect with Postgres container in Django

django.db.utils.OperationalError: connection to server at "db" (172.18.0.2), port 5432 failed: FATAL: the database system is starting up
I have an issue connecting to the Postgres container. I have tried different approaches, such as setting the password only in the docker-compose file, but I am still stuck.
docker-compose.yml
version: '3'
services:
  db:
    image: postgres:alpine
    environment:
      - POSTGRES_PASSWORD=password
      - POSTGRES_USER=user
      - POSTGRES_DB=userdb
    volumes:
      - django_db:/var/lib/postgresql/data
    container_name: db
  singup:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    ports:
      - 8000:8000
    container_name: singup
    depends_on:
      - db
volumes:
  django_db:
Database settings:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'userdb',
        'USER': 'user',
        'PASSWORD': 'password',
        'HOST': 'db',
    }
}
This is a timing issue: even though you have the lines below
singup:
  depends_on:
    - db
they only wait for the db container to be up and running, not necessarily for Postgres itself to be ready.
To avoid this, use a tool such as wait-for-it, dockerize, sh-compatible wait-for, or RelayAndContainers template. These are small wrapper scripts which you can include in your application’s image to poll a given host and port until it’s accepting TCP connections.
For example, to use wait-for-it.sh or wait-for to wrap your service’s command:
singup:
  depends_on:
    - db
  command: ["./wait-for-it.sh", "db:5432", "--", "python", "app.py"]
Repo of wait-for-it: https://github.com/vishnubob/wait-for-it
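The core of those wrapper scripts can be sketched in a few lines of Python: poll the host and port until a TCP connection succeeds, then run the real command. This is an illustration of the idea, not a drop-in replacement for wait-for-it.sh:

```python
# Sketch of the wait-for-it idea: poll host:port until it accepts TCP
# connections, giving up after `timeout` seconds.
import socket
import time

def wait_for(host, port, timeout=30.0, interval=1.0):
    """Return True once host:port accepts a TCP connection, False on timeout."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=interval):
                return True
        except OSError:
            time.sleep(interval)
    return False

# e.g. call wait_for("db", 5432) before running migrate / runserver
```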

Impossible to connect to my postgreSQL Database

I'm using Docker and I have started a new project.
I have a backend in Django and I want to use a PostgreSQL database.
I can't connect to the database with Django or with PGAdmin.
This is what I got when I tried to connect to my database (with Django or with PGAdmin).
Here is part of my docker-compose file:
services:
  db:
    image: postgres
    container_name: db_wishlist
    restart: always
    volumes:
      - ./postgres-data-wishlist:/var/lib/postgresql/data
    env_file:
      - env/base/db.env
      - env/dev/db.env
    ports:
      - "5432:5432"
    networks:
      wishlist:
        aliases:
          - db
  pgadmin:
    image: dpage/pgadmin4
    container_name: pgadmin
    depends_on:
      - db
    ports:
      - "5051:80"
    env_file:
      - env/base/db.env
      - env/dev/db.env
    environment:
      PGADMIN_DEFAULT_EMAIL: pgadmin4@pgadmin.org
      PGADMIN_DEFAULT_PASSWORD: root
    restart: always
    networks:
      wishlist:
        aliases:
          - pgadmin
  api:
    container_name: backend
    build:
      context: ..
      dockerfile: Docker/Dockerfiles/Django
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - ../Django/:/api
    ports:
      - "8000:8000"
    env_file:
      - env/base/db.env
      - env/dev/db.env
    depends_on:
      - db
    links:
      - db:db
    networks:
      wishlist:
        aliases:
          - api
Here is my db.env:
POSTGRES_DB=wishlist
POSTGRES_USER=maxdal1
POSTGRES_PASSWORD=demo
Here is my settings.py in Django:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'wishlist',
        'USER': 'maxdal1',
        'PASSWORD': 'demo',
        'HOST': 'db',
        'PORT': 5432,
    }
}
Here is an additional screenshot when I try to connect to my db with PGAdmin:
Error with PGAdmin
I checked every discussion about this error but I can't figure it out.
Can you help me, please?
Thanks.
Have you tried passing your db user to the postgres container via an environment variable?
environment:
  POSTGRES_USER: "youruser"
The following setup works for me:
docker-compose.yaml
services:
  postgres:
    image: postgres:10-alpine
    volumes:
      - ./pgdata:/var/lib/postgresql/data/pgdata
    environment:
      - PGDATA=/var/lib/postgresql/data/pgdata
      - POSTGRES_DB=django_db
      - POSTGRES_USER=XXX
      - POSTGRES_PASSWORD=XXX
    ports:
      - 127.0.0.1:5432:5432
    networks:
      - backend
  django:
    build:
      context: .
      dockerfile: Dockerfile
    environment:
      - POSTGRES_HOST=postgres
      - POSTGRES_DB=django_db
      - POSTGRES_USER=XXX
      - POSTGRES_PASSWORD=XXX
    volumes:
      - ./staticfiles:/staticfiles
      - ./media:/media
    ports:
      - 8000:8000
    networks:
      - backend
      - frontend
settings.py
POSTGRES_HOST = os.environ.get('POSTGRES_HOST', 'localhost')
POSTGRES_DB = os.environ.get('POSTGRES_DB', 'django_db')
POSTGRES_USER = os.environ.get('POSTGRES_USER', 'XXX')
POSTGRES_PW = os.environ.get('POSTGRES_PASSWORD', 'XXX')
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': POSTGRES_DB,
        'HOST': POSTGRES_HOST,
        'USER': POSTGRES_USER,
        'PASSWORD': POSTGRES_PW,
        'CONN_MAX_AGE': 120,
    }
}
The Dockerfile for the django app specifies the following env vars:
ENV POSTGRES_DB=$POSTGRES_DB
ENV POSTGRES_HOST=$POSTGRES_HOST
ENV POSTGRES_USER=$POSTGRES_USER
ENV POSTGRES_PASSWORD=$POSTGRES_PASSWORD
You can run this either like this:
$ docker-compose up
or run only Postgres via Docker and run Django locally:
$ docker-compose up postgres
in a different terminal:
$ ./manage.py runserver

Django docker - could not translate host name "db" to address: nodename nor servname provided, or not known

I'm relatively new to Django and Docker and am following a tutorial to build a mini application. However, I'm getting stuck with the following error:
django.db.utils.OperationalError: could not translate host name "db" to address: nodename nor servname provided, or not known
My docker-compose file looks as follows:
version: '3'
services:
  db:
    image: 'postgres'
    ports:
      - '5432'
  core:
    build:
      context: .
      dockerfile: Dockerfile
    command: python3 manage.py runserver 0.0.0.0:8000
    ports:
      - '8000:8000'
    volumes:
      - .:/code
    depends_on:
      - db
    links:
      - db:db
My settings.py file contains the database:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'postgres',
        'USER': 'postgres',
        'HOST': 'db',
        'PORT': 5432,
    }
}
I've seen the posts here and here; however, neither has fixed the issue.
Would appreciate some guidance. Thanks.
So you are trying to reach the db, which is running in one container, from another container? If yes, the following could potentially help; at least it helped me when I had similar issues.
Try defining a networks config in addition to links in your compose file: create a network with some name and reference it in both services, as described here and as the docs on the links config recommend.
Something like this for your case:
version: '3'
services:
  db:
    image: 'postgres'
    ports:
      - '5432'
    networks:
      some_network:
  core:
    build:
      context: .
      dockerfile: Dockerfile
    command: python3 manage.py runserver 0.0.0.0:8000
    ports:
      - '8000:8000'
    volumes:
      - .:/code
    depends_on:
      - db
    links:
      - db:db
    networks:
      some_network:
networks:
  some_network:
It helped me to resolve the host name to connect to the db.
I had to use docker-compose up to make sure my db container was running.
Instead of redefining the networks, use
docker-compose down -v
to stop all containers and remove all cached data.
Then
docker-compose up
to restart from scratch.
I've been searching for several days for the solution to this issue. Here's what I did:
1 - I copied the postgresql.conf.sample file from my postgres container to my project folder
(to reach it, cd into /usr/share/postgresql after entering the container with docker exec -it yourcontainer bash)
2 - I renamed it to postgresql.conf
3 - I modified these variables in postgresql.conf
listen_addresses = '*'
port = 5432
max_connections = 100
superuser_reserved_connections = 3
unix_socket_directories = '/tmp'
4 - I added it to my Dockerfile
COPY postgresql.conf /tmp/postgresql.conf
5 - settings.py
5 - settings.py
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'test',
        'USER': 'testing',
        'PASSWORD': 'tests',
        'HOST': 'localhost',
        'PORT': 5432,
    }
}
6 - I ran docker-compose up --build again
I came across this issue recently; I solved it by adding the postgres environment variables to db in my docker-compose.yml file, like so:
version: '3'
services:
  db:
    image: 'postgres'
    ports:
      - '5432'
    environment:
      - POSTGRES_DB=booksdb
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
  core:
    build:
      context: .
      dockerfile: Dockerfile
    command: python3 manage.py runserver 0.0.0.0:8000
    ports:
      - '8000:8000'
    volumes:
      - .:/code
    depends_on:
      - db
    links:
      - db:db
In my specific case, I had postgres without any version pin, so the postgres image got updated and the db (postgres) container wouldn't come up.
The following configuration, from the official documentation, works for me:
version: "3.8"
services:
  db:
    image: postgres
    environment:
      - POSTGRES_DB=postgres
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
  web:
    build: .
    command: python /code/manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code
    ports:
      - "8000:8000"
    depends_on:
      - db
Then run these commands:
docker-compose up -d --build
docker-compose logs
docker exec -it "container_name" python3 manage.py migrate
This works for me. Make sure the path to the manage.py file is correct.

Running Django migrations on dockerized project

I have a Django project running in multiple Docker containers with help of docker-compose. The source code is attached from directory on my local machine. Here's the compose configuration file:
version: '3'
services:
  db:
    image: 'postgres'
    ports:
      - '5432:5432'
  core:
    build:
      context: .
      dockerfile: Dockerfile
    command: python3 manage.py runserver 0.0.0.0:8000
    ports:
      - '8001:8000'
    volumes:
      - .:/code
    depends_on:
      - db
Although the application starts as it should, I can't run migrations, because every time I do manage.py makemigrations ... I receive:
django.db.utils.OperationalError: could not translate host name "db" to address: nodename nor servname provided, or not known
Obviously I can open a bash shell inside the core container and run makemigrations from there, but then the migration files are created inside the container, which is very inconvenient.
In my project's settings the database is configured as:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'postgres',
        'USER': 'postgres',
        'HOST': 'db',
        'PORT': '5432',
    }
}
As the docker postgres image is accessible at localhost:5432, I tried changing the database host in settings to:
'HOST': '0.0.0.0'
But then when firing up the containers with docker-compose up I'm receiving:
...
django.db.utils.OperationalError: could not connect to server:
Connection refused
Is the server running on host "0.0.0.0" and accepting
TCP/IP connections on port 5432?
...
How should I configure database in settings.py so that Django can access it to create migrations?
Your docker-compose configuration is not correct. You forgot to link the services:
version: '3'
services:
  db:
    image: 'postgres'
    ports:
      - '5432:5432'
  core:
    build:
      context: .
      dockerfile: Dockerfile
    command: python3 manage.py runserver 0.0.0.0:8000
    ports:
      - '8001:8000'
    volumes:
      - .:/code
    depends_on:
      - db
    links: # <- here
      - db