Can't create database with default postgresql docker image - django

The official site says that if you want to build a PostgreSQL database you just need to set the variables POSTGRES_USER, POSTGRES_PASSWORD and POSTGRES_DB. I put them into a .env file and added env_file: ./.env to my docker-compose file, but then Django says: django.db.utils.OperationalError: FATAL: database "languages" does not exist. It seems I need to write a .sql script with the commands. Questions:
Is writing a .sql script the only way to make it work?
How do I write it, and where do I put it, if I don't want to explicitly
write sensitive info like the password? I want to use the .env file, since I will store the .sql script in a public repo.
My .env file:
DEBUG=True
SECRET_KEY=...
POSTGRESQL_DATABASE=languages
POSTGRESQL_USERNAME=admin
POSTGRESQL_PASSWORD=pass
POSTGRESQL_ADDRESS=postgres
POSTGRES_DB=languages
My docker-compose file:
version: '3'
services:
  web:
    build: ./web
    env_file: ./web/microservice/.env
    volumes:
      - ./web:/code/
    depends_on:
      - postgres
    command: python manage.py runserver 0.0.0.0:8000
  nginx:
    build: ./nginx/
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - ./web/static:/code/static
    depends_on:
      - web
  postgres:
    image: postgres:latest
    ports:
      - "5432:5432"
    env_file: ./web/microservice/.env
    volumes:
      - /var/lib/postgresql/data/
My Dockerfile:
FROM python:3.5
ENV PYTHONUNBUFFERED 1
RUN mkdir /code
WORKDIR /code
ADD ./requirements.txt /code/
RUN pip install --no-cache-dir -r requirements.txt
ADD . /code/
EXPOSE 8000
CMD ["/usr/local/bin/gunicorn", "translation_microservice.wsgi", "-w", "2", "-b", ":8000", "--access-logfile", "-", "--capture-output"]

It seems like you are using incorrect variable names. Change POSTGRESQL_USERNAME to POSTGRES_USER, and so on, in your .env file.
You might have to clear the Docker volume containing your database if it has already been initialized with a different environment. The default is to create a user and a database named postgres.
Do not add the .env file to a public repo.
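For reference, the Django side can read the same renamed variables, so one .env file serves both the postgres container and the app. A minimal settings.py sketch, assuming the names above plus a POSTGRES_HOST/POSTGRES_PORT pair of my own choosing (the host defaults to the postgres service name from the compose file):
# settings.py (sketch): read the same POSTGRES_* variables the postgres image uses.
# POSTGRES_HOST and POSTGRES_PORT are illustrative names, not required by the image.
import os

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": os.environ["POSTGRES_DB"],
        "USER": os.environ["POSTGRES_USER"],
        "PASSWORD": os.environ["POSTGRES_PASSWORD"],
        "HOST": os.environ.get("POSTGRES_HOST", "postgres"),  # compose service name
        "PORT": os.environ.get("POSTGRES_PORT", "5432"),
    }
}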

Related

Seed Django + Postgres with .sql file

I have a .sql file which I'd like to use to seed my Django + Postgres app on build, so that I have some data in the app during development.
Here's my docker stuff:
Dockerfile
FROM python:3
ENV PYTHONUNBUFFERED=1
WORKDIR /code
COPY requirements.txt /code/
RUN pip3 install -r requirements.txt
COPY . /code/
docker-compose.yml
version: "3.9"
services:
db:
image: postgres
volumes:
- ./data/db:/var/lib/postgresql/data
environment:
- POSTGRES_DB=postgres
- POSTGRES_USER=postgres
- POSTGRES_PASSWORD=postgres
web:
build: .
command: python manage.py runserver 0.0.0.0:8000
volumes:
- .:/code
ports:
- "8000:8000"
depends_on:
- db
Django supports this use case with fixtures: https://docs.djangoproject.com/en/3.2/howto/initial-data/
You can serialize your .sql database into a JSON fixture and load it during the build with the loaddata command.
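One way to wire loaddata in automatically (here at migrate time rather than image-build time) is a data migration. A minimal sketch, assuming a placeholder app label "myapp" and a fixture dumped to myapp/fixtures/initial_data.json:
# myapp/migrations/0002_seed_data.py (sketch; app label and fixture name are placeholders)
from django.core.management import call_command
from django.db import migrations


def load_fixture(apps, schema_editor):
    # Load the JSON fixture via the loaddata management command.
    call_command("loaddata", "initial_data.json")


class Migration(migrations.Migration):

    dependencies = [
        ("myapp", "0001_initial"),
    ]

    operations = [
        migrations.RunPython(load_fixture, migrations.RunPython.noop),
    ]
The fixture itself can be produced once from a locally seeded database with python manage.py dumpdata myapp --indent 2 > myapp/fixtures/initial_data.json.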

Getting postgres connected in docker-compose

So I have a successfully built image where I have Postgres mounted as a volume, but I keep getting django.db.utils.OperationalError: could not connect to server: Connection refused when I run docker-compose up. My Dockerfile and docker-compose.yml look like this:
# syntax=docker/dockerfile:1
FROM python:3.10-slim-buster
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
WORKDIR /track
COPY requirements.txt .
RUN apt-get update \
&& apt-get -y install libpq-dev gcc \
&& pip install --upgrade pip \
&& pip install --default-timeout=100 -r requirements.txt
COPY . .
ENV DJANGO_SETTINGS_MODULE track.settings
EXPOSE 8000
version: "3.9"
services:
django:
build: .
command: python manage.py runserver 0.0.0.0:8000
stdin_open: true
tty: true
volumes:
- .:/track
ports:
- "8000:8000"
depends_on:
- db
db:
image: postgres:9.6.10-alpine
restart: always
user: postgres
volumes:
- track_db:/var/lib/postgresql/data
environment:
POSTGRES_USER: postgres
POSTGRES_PASS: postgres
POSTGRES_DB: postgres
volumes:
track_db: {}
Please, what am I doing wrong?
The Django service must be linked to db:
services:
  django:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    stdin_open: true
    tty: true
    volumes:
      - .:/track
    ports:
      - "8000:8000"
    depends_on:
      - db
    links:
      - db
Now you can connect to the database using the container hostname, db.
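If the error persists, a common cause is that Django starts before Postgres is ready to accept connections; depends_on only orders container startup, it does not wait for the database itself. A minimal wait-loop sketch using only the Python standard library (the host name db and port 5432 come from the compose file above), to run before manage.py:
# wait_for_db.py (sketch): block until the db service accepts TCP connections.
import socket
import time

HOST, PORT = "db", 5432  # compose service name and default Postgres port

while True:
    try:
        with socket.create_connection((HOST, PORT), timeout=2):
            break  # the port is open, so Postgres is at least listening
    except OSError:
        print("Waiting for postgres...")
        time.sleep(1)
print("PostgreSQL started")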

Unable to communicate with postgres using docker and django

I can't get my app to communicate with my Postgres database using Docker and Django. Here is my docker-compose.yml:
version: '3'
services:
  web:
    container_name: web
    build:
      context: ./web
      dockerfile: Dockerfile
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - ./web/:/usr/src/web/
    ports:
      - 8000:8000
      - 3000:3000
      - 35729:35729
    env_file:
      - database.env
    stdin_open: true
    depends_on:
      - database
  database:
    container_name: database
    image: postgres
    volumes:
      - database-data:/var/lib/postgresql/data/
    ports:
      - 5432:5432
volumes:
  database-data:
Here is my database.env:
# database.env
POSTGRES_USERNAME=admin
POSTGRES_PASSWORD=pass
POSTGRES_DBNAME=db
POSTGRES_HOST=database
POSTGRES_PORT=5432
PGUSER=admin
PGPASSWORD=pass
PGDATABASE=db
PGHOST=database
PGPORT=5432
DATABASE=db
SQL_HOST=database
SQL_PORT=5432
And here is my Dockerfile:
# pull official base image
FROM python:3.8.3-alpine
# set work directory
WORKDIR /usr/src/web
# set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
# install psycopg2 dependencies
RUN apk update \
&& apk add postgresql-dev gcc python3-dev musl-dev
RUN apk add zlib-dev jpeg-dev gcc musl-dev
# install nodejs
RUN apk add --update nodejs nodejs-npm
# copy project
ADD . .
# install dependencies
RUN pip install --upgrade pip
RUN pip install -r requirements.txt
# run entrypoint.sh
ENTRYPOINT ["sh", "/usr/src/web/entrypoint.sh"]
And here is my entrypoint.sh:
#!/bin/sh
if [ "$DATABASE" = "db" ]
then
    echo "Waiting for postgres..."
    while ! nc -z $SQL_HOST $SQL_PORT; do
      sleep 10
    done
    echo "PostgreSQL started"
fi
exec "$@"
I build the containers with: docker-compose up -d --build
Then I run: docker-compose exec web npm start --prefix ./front/
I can access the frontend at http://localhost:3000
But when I run docker logs database I get this:
2021-01-18 06:31:49.207 UTC [1] LOG: database system is ready to accept connections
2021-01-18 06:31:51.640 UTC [32] FATAL: password authentication failed for user "admin"
2021-01-18 06:31:51.640 UTC [32] DETAIL: Role "admin" does not exist.
Connection matched pg_hba.conf line 99: "host all all all md5"
Here is the status:
37ee3e314d52 web "sh /usr/src/web/ent…" About a minute ago Up About a minute 0.0.0.0:3000->3000/tcp, 0.0.0.0:8000->8000/tcp, 5432/tcp web
65dfeae57a94 postgres "docker-entrypoint.s…" About a minute ago Up About a minute 0.0.0.0:5432->5432/tcp database
Could you help me?
Thank you very much!
It seems like the Postgres user you are using doesn't exist. You can add some environment variables to the database service in docker-compose to create it (you probably need to create the database, too), or you can write a script to create them on first run.
version: '3'
services:
  web:
    container_name: web
    build:
      context: ./web
      dockerfile: Dockerfile
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - ./web/:/usr/src/web/
    ports:
      - 8000:8000
      - 3000:3000
      - 35729:35729
    env_file:
      - database.env
    stdin_open: true
    depends_on:
      - database
  database:
    container_name: database
    image: postgres
    volumes:
      - database-data:/var/lib/postgresql/data/
    ports:
      - 5432:5432
    environment:
      - POSTGRES_USER=admin
      - POSTGRES_PASSWORD=pass
      - POSTGRES_DB=db
volumes:
  database-data:
For the Postgres image environment variables, you can check this link.
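For the script route mentioned above, here is a rough one-off bootstrap sketch run from the web container. It assumes psycopg2 is installed and that the superuser password was set via POSTGRES_PASSWORD when the volume was first initialized (both assumptions); the role and database names mirror database.env:
# bootstrap_db.py (sketch): create the role and database the app expects.
import psycopg2

# Connect as the default superuser; the password here is an assumption.
conn = psycopg2.connect(host="database", user="postgres", password="postgres", dbname="postgres")
conn.autocommit = True  # CREATE DATABASE cannot run inside a transaction block
with conn.cursor() as cur:
    cur.execute("CREATE ROLE admin WITH LOGIN PASSWORD 'pass'")
    cur.execute("CREATE DATABASE db OWNER admin")
conn.close()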

docker-compose django app cannot find postgresql db

I'm trying to create a Django app in a Docker container. The app would use a Postgres db with the PostGIS extension, which I have in another container. I'm trying to solve this using docker-compose but cannot get it working.
I can get the app working outside a container with the database containerized just fine. I can also get the app working in a container using a SQLite db (a file included in the image, with no external container dependencies). But whatever I do, the containerized app can't find the database.
My docker-compose file:
version: '3.7'
services:
  postgis:
    image: kartoza/postgis:12.1
    volumes:
      - postgres_data:/var/lib/postgresql/data/
    ports:
      - "${POSTGRES_PORT}:5432"
    environment:
      - POSTGRES_USER=${POSTGRES_USER}
      - POSTGRES_PASSWORD=${POSTGRES_PASSWORD}
      - POSTGRES_DB=${POSTGRES_DB}
    env_file:
      - .env
  web:
    build: .
    # command: sh -c "/wait && python manage.py migrate --no-input && python /code/app/manage.py runserver 0.0.0.0:${APP_PORT}"
    command: sh -c "python manage.py migrate --no-input && python /code/app/manage.py runserver 0.0.0.0:${APP_PORT}"
    # restart: on-failure
    ports:
      - "${APP_PORT}:8000"
    volumes:
      - .:/code
    depends_on:
      - postgis
    env_file:
      - .env
    environment:
      WAIT_HOSTS: 0.0.0.0:${POSTGRES_PORT}
volumes:
  postgres_data:
    name: ${POSTGRES_VOLUME}
My Dockerfile (of the app):
# Pull base image
FROM python:3.7
LABEL maintainer="yb.leeuwen@portofrotterdam.com"
# Set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
# install dependencies
# RUN pip install pipenv
RUN pip install pipenv
RUN mkdir /code/
COPY . /code
WORKDIR /code/
RUN pipenv install --system
# RUN pipenv install pygdal
RUN apt-get update &&\
apt-get install -y binutils libproj-dev gdal-bin python-gdal python3-gdal postgresql-client
## Add the wait script to the image
ADD https://github.com/ufoscout/docker-compose-wait/releases/download/2.7.3/wait /wait
RUN chmod +x /wait
# set work directory
WORKDIR /code/app
# RUN python manage.py migrate --no-input
# CMD ["python", "manage.py", "migrate", "--no-input"]
# RUN cd ${WORKDIR}
# If we want to run docker by itself we need to use below
# but if we want to run from docker-compose we'll set it there
EXPOSE 8000
# CMD /wait && python manage.py migrate --no-input
# CMD ["python", "manage.py", "migrate", "--no-input"]
# CMD ["python", "manage.py", "runserver", "0.0.0.0:8000"]
My .env file:
# POSTGRES
POSTGRES_PORT=25432
POSTGRES_USER=username
POSTGRES_PASSWORD=pass
POSTGRES_DB=db
POSTGRES_VOLUME=data
POSTGRES_HOST=localhost
# GEOSERVER
# DJANGO
APP_PORT=8000
And finally, in the settings.py of the Django app:
DATABASES = {
    'default': {
        'ENGINE': 'django.contrib.gis.db.backends.postgis',
        'NAME': os.getenv('POSTGRES_DBNAME'),
        'USER': os.getenv('POSTGRES_USER'),
        'PASSWORD': os.getenv('POSTGRES_PASS'),
        'HOST': os.getenv("POSTGRES_HOST", "localhost"),
        'PORT': os.getenv('POSTGRES_PORT')
    }
}
I've tried quite a lot of things (as you can see in some of the comments). I realized that docker-compose doesn't seem to wait until Postgres is fully up and accepting requests, so I tried to build in a wait function (as suggested on the website). I first had migrations and runserver inside the Dockerfile (migrations in the build process and runserver as the startup command), but that requires Postgres, and since nothing was waiting for it, it didn't work. I finally moved it all into the docker-compose.yml file but still can't get it working.
The error I get:
web_1 | Is the server running on host "localhost" (127.0.0.1) and accepting
web_1 | TCP/IP connections on port 25432?
web_1 | could not connect to server: Cannot assign requested address
web_1 | Is the server running on host "localhost" (::1) and accepting
web_1 | TCP/IP connections on port 25432?
Does anybody have an idea why this isn't working?
I see that in the settings.py of your Django app, you are connecting to Postgres via
'HOST': os.getenv("POSTGRES_HOST", "localhost"),
while in .env you are setting the value of POSTGRES_HOST to localhost. This means the web container is trying to reach the Postgres server postgis at localhost, which should not be the case.
In order to solve this problem, simply update your .env file to be like this:
POSTGRES_PORT=5432
...
POSTGRES_HOST=postgis
...
The reason is that in your case docker-compose brings up two containers, postgis and web, inside the same Docker network, and they can reach each other via their DNS names, i.e. postgis and web respectively.
Regarding the port: the web container can reach postgis at port 5432 but not 25432, while your host machine can reach the database at port 25432 but not 5432.
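This difference is easy to verify from inside the web container. A quick check, as a sketch using only the standard library, with host names and ports taken from the compose file and the port mapping above:
# connectivity_check.py (sketch): run inside the web container.
import socket

def is_open(host: str, port: int) -> bool:
    # True if a TCP connection to host:port succeeds within 2 seconds.
    try:
        with socket.create_connection((host, port), timeout=2):
            return True
    except OSError:
        return False

print("postgis:5432    ->", is_open("postgis", 5432))      # reachable inside the compose network
print("localhost:25432 ->", is_open("localhost", 25432))   # only reachable from the host machine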
You cannot use localhost from the Docker containers; it points to the container itself, not to the host of the containers. Instead, switch to using the service name.
To fix the issue, change your .env to:
# POSTGRES
POSTGRES_PORT=5432
POSTGRES_USER=username
POSTGRES_PASSWORD=pass
POSTGRES_DB=db
POSTGRES_VOLUME=data
POSTGRES_HOST=postgis
# DJANGO
APP_PORT=8000
and your compose file to:
version: '3.7'
services:
  postgis:
    image: kartoza/postgis:12.1
    volumes:
      - postgres_data:/var/lib/postgresql/data/
    environment:
      - POSTGRES_USER=${POSTGRES_USER}
      - POSTGRES_PASSWORD=${POSTGRES_PASSWORD}
      - POSTGRES_DB=${POSTGRES_DB}
    env_file:
      - .env
  web:
    build: .
    # command: sh -c "/wait && python manage.py migrate --no-input && python /code/app/manage.py runserver 0.0.0.0:${APP_PORT}"
    command: sh -c "python manage.py migrate --no-input && python /code/app/manage.py runserver 0.0.0.0:${APP_PORT}"
    # restart: on-failure
    ports:
      - "${APP_PORT}:8000"
    volumes:
      - .:/code
    depends_on:
      - postgis
    env_file:
      - .env
    environment:
      WAIT_HOSTS: postgis:${POSTGRES_PORT}
volumes:
  postgres_data:
    name: ${POSTGRES_VOLUME}

"error readlink /var/lib/docker/overlay2 invalid argument" when I run docker-compose up

I'm trying to dockerize a project written in Django with PostgreSQL. What I've already done:
I have environment variables stored in env_file:
SECRET_KEY=value
DEBUG=value
ALLOWED_HOSTS=['192.168.99.100']
DB_NAME=postgres
DB_USER=postgres
DB_PASSWORD=postgres
DB_HOST=db
DB_PORT=5432
My Dockerfile:
FROM python:3.7-stretch
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
COPY requirements.txt /code/
WORKDIR /code/
RUN pip install -r requirements.txt
COPY . /code/
My docker-compose.yml file:
version: '3'
services:
  db:
    image: postgres:11-alpine
    restart: always
    volumes:
      - postgres_data:/var/lib/postgresql/data/
    environment:
      POSTGRES_PASSWORD: postgres
    container_name: "my_postgres"
    ports:
      - "5432:5432"
  web:
    build: .
    command: python /code/cameo/manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code/
    env_file:
      - env_file
    ports:
      - "8000:8000"
    depends_on:
      - db
volumes:
  postgres_data:
Please help, I don't know what I'm doing wrong.
If you are running Docker on Windows, some files may have been corrupted.
The following command worked for me:
docker-compose down --rmi all