I'm trying to connect Django to the PostgreSQL server, but I still haven't managed to figure out what the problem is.
The Docker containers seem to be up and running correctly, but Django fails to authenticate, and pgdb logs this error:
pgdb | 2021-06-09 23:49:00.871 UTC [35] FATAL: password authentication failed for user "django"
pgdb | 2021-06-09 23:49:00.871 UTC [35] DETAIL: Role "django" does not exist.
pgdb | Connection matched pg_hba.conf line 99: "host all all all md5"
Dockerfile
FROM python:3
ENV PYTHONUNBUFFERED=1
WORKDIR /usr/src/app
COPY requirements.txt ./
RUN pip3 install --upgrade pip
RUN pip3 install -r requirements.txt
docker-compose.yml
version: '3.8'
services:
  django:
    build: .
    container_name: django
    command: >
      sh -c "python manage.py makemigrations &&
             python manage.py runserver 0.0.0.0:8000"
    volumes:
      - .:/usr/src/app/
    ports:
      - '8000:8000'
    environment:
      DB_NAME: django
      DB_USER: django
      DB_HOST: pgdb
      DB_PORT: 5432
      DB_PASSWORD: django
      CELERY_BROKER: redis://redis:6379/0
    depends_on:
      - pgdb
      - redis
  celery:
    build: .
    command: celery -A conf worker -l INFO
    volumes:
      - .:/usr/src/app/
    depends_on:
      - django
      - redis
  pgdb:
    image: postgres
    container_name: pgdb
    environment:
      POSTGRES_USER: django
      POSTGRES_PASSWORD: django
    ports:
      - '5432:5432'
    volumes:
      - pgdata:/var/lib/postgresql/data/
  redis:
    image: 'redis:alpine'
volumes:
  pgdata:
My Django settings include:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': os.environ.get('DB_NAME'),
        'USER': os.environ.get('DB_USER'),
        'HOST': os.environ.get('DB_HOST'),
        'PORT': os.environ.get('DB_PORT'),
        'PASSWORD': os.environ.get('DB_PASSWORD'),
    },
}
What's the problem? Am I missing something?
Related
I have Django and Postgres containers. When it's time for Django to apply migrations, it can't see the Postgres container that I named pgdb, and I get this error:
django.db.utils.OperationalError: could not translate host name "pgdb"
to address: Temporary failure in name resolution
It appears that there is no Docker container named "pgdb". If I run "docker-compose run pgdb", it creates a Postgres container named something like "app_pgdb_run_23423423" under the "app" project. The curious thing is that I previously made this work with the same settings.py setup and the "pgdb" container name. What could be the underlying issue?
You can clone the full code from https://github.com/UberStreuner/mailing-service
My settings.py setup is below; the environment variables themselves definitely aren't at fault.
DATABASES = {
    'default': {
        'ENGINE': os.environ.get('DB_ENGINE', 'django.db.backends.postgresql_psycopg2'),
        'NAME': os.environ.get('POSTGRES_DB'),
        'USER': os.environ.get('POSTGRES_USER'),
        'PASSWORD': os.environ.get('POSTGRES_PASSWORD'),
        'HOST': os.environ.get('POSTGRES_HOST', 'pgdb'),
        'PORT': os.environ.get('POSTGRES_PORT', '5432')
    }
}
docker-compose.yml
version: "3.8"
services:
django:
build: .
container_name: django
command: ./docker-entrypoint.sh
volumes:
- .:/usr/src/app/
ports:
- "8000:8000"
env_file:
- ./.dev.env
depends_on:
- pgdb
- redis
celery:
build: .
command: celery -A mailing worker -l INFO
volumes:
- .:/usr/src/app/
env_file:
- ./.dev.env
depends_on:
- django
- redis
- pgdb
celery-beat:
build: .
command: celery -A mailing beat -l INFO
volumes:
- .:/usr/src/app/
env_file:
- ./.dev.env
depends_on:
- django
- redis
- pgdb
pgdb:
image: postgres
container_name: pgdb
env_file:
- ./.dev.env
volumes:
- pgdata:/var/lib/postgresql/data
expose:
- "5432"
ports:
- "5432:5432"
redis:
image: "redis:alpine"
volumes:
pgdata:
.dev.env
SECRET_KEY=django-insecure-kh8ot+ag-^osyhg5itzj!iw_g#dklw3f75c0)^-7)*#^)#tc#x
DEBUG=1
DJANGO_ALLOWED_HOSTS=localhost 127.0.0.1
CELERY_BROKER=redis://redis:6379/0
CELERY_BACKEND=redis://redis:6379/0
POSTGRES_DB=django_postgres
POSTGRES_USER=postgres
POSTGRES_PASSWORD=password
docker-entrypoint.sh
#!/bin/bash
python manage.py migrate
python manage.py runserver 0.0.0.0:8000
I fixed the issue.
The problem was most likely that I was loading the environment variables for Django twice: once via python-dotenv from the .dev.env file, and once from docker-compose, where the same file is passed as env_file. So I commented out the dotenv loading in settings.py, and it worked.
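For illustration, a minimal sketch of what that change in settings.py amounts to; the exact load_dotenv call and path are assumptions, only the idea matters:

# settings.py (sketch)
import os
from pathlib import Path
# from dotenv import load_dotenv   # no longer needed inside the container

BASE_DIR = Path(__file__).resolve().parent.parent

# docker-compose already injects the variables from .dev.env via env_file,
# so loading the same file again with python-dotenv caused the conflict;
# the call is simply commented out:
# load_dotenv(BASE_DIR / '.dev.env')

# DATABASES then reads os.environ.get(...) exactly as shown above.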
I'm trying to dockerize an existing Django application. I've followed the example docker-compose and Dockerfile at https://docs.docker.com/samples/django/
I've added the links and networks suggestions seen in other posts, to no avail.
I've brought up the db by itself with docker-compose up db, run docker ps, and confirmed it's up, accepting connections, and has the name db. I'm not sure what else to try.
My docker-compose:
version: '3'
services:
  db:
    image: postgres
    volumes:
      - ./data/db:/var/lib/postgresql/data
    environment:
      - POSTGRES_DB=postgres
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
    container_name: db
    networks:
      harpnetwork:
  harp:
    build:
      context: .
      dockerfile: docker/harp/Dockerfile
    volumes:
      - ./harp:/harp
    environment:
      - POSTGRES_DB=postgres
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
    ports:
      - 16765:16765
    depends_on:
      - db
    networks:
      harpnetwork:
    links:
      - db:db
networks:
  harpnetwork:
My django db config:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': os.environ.get('POSTGRES_NAME'),
        'USER': os.environ.get('POSTGRES_USER'),
        'PASSWORD': os.environ.get('POSTGRES_PASSWORD'),
        'HOST': 'db',
        'PORT': 5432,
    }
}
and my Dockerfile to build the django project
FROM python:3.10
COPY ./harp harp
WORKDIR /harp
RUN pip install -r requirements.txt
RUN python manage.py migrate
CMD ["python", "manage.py", "runserver", "0.0.0.0:16765"]
I know there have been similar questions on this site, but the answers to them are sadly outdated.
I have a Django application and I want to use Postgres as the underlying database. Additionally, I want to separate the two programs into separate containers.
The Docker docs have a way to do this, but sadly it seems to be outdated: link
The problem appears when I call manage.py migrate during the Docker build, which is run by the docker-compose file: I get an error that the host 'db' is unknown/invalid.
Compose excerpt:
services:
  db:
    image: postgres
    restart: always
    volumes:
      - DataBase:/var/lib/postgresql/data
    environment:
      - POSTGRES_DB=postgres
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
    ports:
      - "5432:5432"
  app:
    build: FehlzeitErfassungProject
    restart: always
    volumes:
      - Logs:/app/logs
      - Backups:/app/Backups
      - Media:/app/MEDIA
    ports:
      - "5432:5432"
    depends_on:
      - db
app dockerfile:
FROM ubuntu
WORKDIR /app
ADD ./requirements.txt ./
RUN apt-get update && \
    apt-get upgrade -y
# keep tzdata from prompting interactively
ARG DEBIAN_FRONTEND=noninteractive
ENV TZ=Europe/Berlin
RUN apt-get -y install tzdata
# install all needed programs
RUN apt-get install -y uwsgi python3 python3-pip
RUN python3 -m pip install pip --upgrade
RUN python3 -m pip install -r requirements.txt
COPY ./ ./
RUN ./manage.py migrate
CMD ["uwsgi", "uwsgu.ini"]
PS: It appears that the other containers are only launched once the app container has finished building.
Edit:
The database settings:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'postgres',
        'USER': 'postgres',
        'PASSWORD': 'postgres',
        'HOST': '127.0.0.1',
        'PORT': 5432,
    }
}
I think your problem is in the DB configuration in settings.py: the HOST parameter needs to be the name of the PostgreSQL service you defined in the docker-compose file, which in this case is "db".
By the way, I think hard-coding constants in settings.py is bad practice; it's better to define environment variables in the docker-compose file.
In your case that would be:
docker-compose.yml
services:
  db:
    image: postgres
    restart: always
    volumes:
      - DataBase:/var/lib/postgresql/data
    environment:
      - POSTGRES_DB=postgres
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
    ports:
      - "5432:5432"
  app:
    build: FehlzeitErfassungProject
    restart: always
    volumes:
      - Logs:/app/logs
      - Backups:/app/Backups
      - Media:/app/MEDIA
    ports:
      - "5432:5432"
    environment:
      - DB_HOST=db
      - DB_NAME=postgres
      - DB_USER=postgres
      - DB_PASSWORD=postgres
    depends_on:
      - db
Settings.py
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': os.environ.get('DB_NAME'),
        'USER': os.environ.get('DB_USER'),
        'PASSWORD': os.environ.get('DB_PASSWORD'),
        'HOST': os.environ.get('DB_HOST'),
    }
}
Remember to import os in settings.py.
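As a small illustrative sketch (the fallback values here are assumptions, not requirements), the variables can also be read with defaults:

import os  # needed for os.environ.get below

# fall back to the Compose service name and the default Postgres port if unset
DB_HOST = os.environ.get('DB_HOST', 'db')
DB_PORT = os.environ.get('DB_PORT', '5432')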
Binding both containers to the same host port will not work.
To access the database from the other container, you should link to it:
version: "3.0"
services:
db:
image: postgres
restart: always
volumes:
- DataBase:/var/lib/postgresql/data
environment:
- POSTGRES_DB=postgres
- POSTGRES_USER=postgres
- POSTGRES_PASSWORD=postgres
ports:
- "5432:5432"
app:
build: FehlzeitErfassungProject
restart: always
volumes:
- Logs:/app/logs
- Backups:/app/Backups
- Media:/app/MEDIA
ports:
- "8000:8000"
links:
- "db"
depends_on:
- db
Dockerfile:
FROM python:3.6
WORKDIR /usr/src/jobsterapi
COPY ./ ./
RUN pip install -r requirements.txt
CMD ["/bin/bash"]
docker-compose:
version: '3.6'
services:
  # Backend API
  jobsterapi:
    container_name: jobsterapi
    build: .
    command: python src/manage.py runserver 0.0.0.0:8000
    working_dir: /usr/src/jobsterapi
    links:
      - mongodb
    depends_on:
      - mongodb
    ports:
      - "8000:8000"
    volumes:
      - ./:/usr/src/facerecognition-api
  mongodb:
    restart: always
    image: mongo:latest
    container_name: "mongodb"
    environment:
      - MONGO_INITDB_ROOT_USERNAME=${soubhagya}
      - MONGO_INITDB_ROOT_PASSWORD=${Thinkonce}
      - MONGODB_USERNAME='soubhagya'
      - MONGODB_PASSWORD='Thinkonce'
      - MONGODB_DATABASE=='jobster'
    volumes:
      - ./data/db:/var/micro-data/mongodb/data/db
    ports:
      - 27017:27017
    command: mongod --smallfiles --logpath=/dev/null # --quiet
Django database settings:
DATABASES = {
    'default': {
        'ENGINE': 'djongo',
        'NAME': 'jobster',
        'user': 'soubhagya',
        'password': 'Thinkonce',
        'port': 27017
    }
}
I don't actually know how to set up Django with MongoDB inside docker-compose; I'm piecing this together from some blog posts, but it is not working.
Please have a look at my code.
You have errors in your environment section:
- MONGO_INITDB_ROOT_USERNAME=${soubhagya}
- MONGO_INITDB_ROOT_PASSWORD=${Thinkonce}
- MONGO_INITDB_DATABASE=='jobster'
${soubhagya} and ${Thinkonce} are Compose variable substitutions, so they expand to empty strings unless variables with those names are set on the host; you presumably meant the literal values. Also note the double equals sign on the database line, and that the official mongo image expects MONGO_INITDB_DATABASE rather than MONGODB_DATABASE.
I have a docker-compose file for a Django project that is trying to use a database located on the host machine.
My Dockerfile is currently:
FROM python:3-slim
ENV PYTHONUNBUFFERED 1
RUN mkdir /code.
WORKDIR /code
ADD . /code/
RUN pip install -r requirements.txt
RUN export dockerhost=$(docker-machine ip)
docker-compose.yml:
version: "2"
networks:
workernetwork:
webnetwork:
services:
static:
volumes:
- /static:/static
- /media:/media
image: alpine:latest
web:
build: .
command: bash -c "SECRET_KEY=temp_value python /code/manage.py collectstatic --noinput && python /code/manage.py migrate && /code/run_gunicorn.sh"
volumes:
- .:/code
volumes_from:
- static
env_file:
- secrets.env
ports:
- 443:443
networks:
- webnetwork
extra_hosts:
- "dockerhost:${dockerhost}"
DATABASES in settings.py:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'revolution',
        'USER': get_env_setting('POSTGRES_USER'),
        'PASSWORD': get_env_setting('POSTGRES_PASSWORD'),
        'HOST': 'dockerhost',
        'PORT': 5432,
    }
}
What am I doing wrong?
Thanks for your attention!
Finally resolved with a Docker volume.
First, create the volume:
docker volume create --name=coredb
docker-compose.yml
version: "2"
services:
...
web:
build:
context: .
command: bash -c "python /code/manage.py collectstatic --noinput && python /code/manage.py migrate && /code/run_gunicorn.sh"
volumes:
- /static:/data/web/static
- /media:/data/web/media
- .:/code
env_file:
- ../.env
depends_on:
- db
db:
restart: always
image: postgres
env_file:
- ../.env
volumes:
- pgdata: /var/lib/postgresql/data
volumes:
pgdata:
external:
name: coredb
.env variables:
POSTGRES_DB={hidden}
POSTGRES_USER={hidden}
POSTGRES_PASSWORD={hidden}
Important:
You need to create the database and user manually via docker exec -it backend_db_1 bash, following these instructions up to the "Install Django within a Virtual Environment" chapter.
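For example, a rough sketch of that interactive session (the user name and password below are placeholders, not the hidden .env values; the database name comes from the settings.py above):

# open a shell in the db container, then start psql as the postgres superuser
docker exec -it backend_db_1 bash
psql -U postgres

-- inside psql: create the role and database that Django expects
CREATE USER myuser WITH PASSWORD 'mypassword';
CREATE DATABASE revolution OWNER myuser;
\q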