I have a .sql file which I'd like to seed my django + postgres app with on build. That way I'll have some data in my app during development.
Here's my docker stuff:
Dockerfile
FROM python:3
ENV PYTHONUNBUFFERED=1
WORKDIR /code
COPY requirements.txt /code/
RUN pip3 install -r requirements.txt
COPY . /code/
docker-compose.yml
version: "3.9"
services:
db:
image: postgres
volumes:
- ./data/db:/var/lib/postgresql/data
environment:
- POSTGRES_DB=postgres
- POSTGRES_USER=postgres
- POSTGRES_PASSWORD=postgres
web:
build: .
command: python manage.py runserver 0.0.0.0:8000
volumes:
- .:/code
ports:
- "8000:8000"
depends_on:
- db
Django supports this use case with fixtures: https://docs.djangoproject.com/en/3.2/howto/initial-data/
Load your .sql dump into a development database once, serialize it to a JSON fixture with the dumpdata command, and then feed that fixture in during build with the loaddata command.
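For example, a minimal round-trip sketch, assuming the .sql dump has already been loaded into a dev database once; the app label myapp and the fixture path are placeholders for your own:
python manage.py dumpdata myapp --indent 2 > fixtures/seed.json
python manage.py loaddata fixtures/seed.json
The loaddata step can then run in the web container's startup command, right after migrate.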
Related
I have been trying to containerise a project of mine that uses postgresql, but for some reason the commands do not complete. When I follow the web container on docker-desktop, I can see that it performs the migrations but then stops with the following error:
2023-01-17 19:51:29 Apply all migrations: admin, auth, company, contenttypes, oauth2_provider, product, sessions, social_django, users, warehouse
2023-01-17 19:51:29 Running migrations:
2023-01-17 19:51:29 No migrations to apply.
2023-01-17 19:51:29 sh: manage.py: not found
For some reason it does not see manage.py.
Yet when I run docker-compose run web sh and check the directory with ls, manage.py is there.
What could I be doing wrong here?
The Dockerfile:
# pull official base image
FROM python:3.9-alpine
# set work directory
WORKDIR /app
# set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
# install psycopg2 dependencies
RUN apk update \
&& apk add postgresql-dev gcc python3-dev musl-dev
# install python dependencies
RUN apk add -u zlib-dev jpeg-dev libffi-dev gcc musl-dev
RUN python3 -m pip install --upgrade pip
COPY requirements.txt /app/requirements.txt
RUN pip install --no-cache-dir -r requirements.txt
# copy project
COPY . .
The docker-compose.yml:
version: '3.9'
services:
  nginx:
    container_name: core_web
    restart: on-failure
    image: nginx:stable
    volumes:
      - ./nginx/nginx.dev.conf:/etc/nginx/conf.d/default.conf
      - static_volume:/app/static
    ports:
      - "80:80"
    depends_on:
      - web
  web:
    container_name: web
    build: .
    env_file: .env
    ports:
      - "5000:5000"
    command: >
      sh -c " python manage.py makemigrations && python manage.py migrate &&
      manage.py runserver 0.0.0.0:5000"
    volumes:
      - .:/app
      - static_volume:/app/static
    depends_on:
      - db
  db:
    container_name: core_db
    image: postgres:15
    env_file: .env
    environment:
      - POSTGRES_USER=${DB_USER}
      - POSTGRES_PASSWORD=${DB_PASS}
      - POSTGRES_DB=${DB_NAME}
    volumes:
      - postgres_data:/var/lib/postgresql/data/
volumes:
  static_volume:
  postgres_data:
I changed the web service in my docker-compose.yml file:
web:
  tty: true
  container_name: web
  build: .
  restart: always
  env_file: .env
  ports:
    - "5000:5000"
  command: >
    sh -c "python3 /app/manage.py makemigrations && python3 /app/manage.py migrate && python3 /app/manage.py runserver 0.0.0.0:5000"
  volumes:
    - .:/app
    - static_volume:/app/static
  depends_on:
    - db
Basically, I added tty: true and changed command to:
sh -c "python3 /app/manage.py makemigrations && python3 /app/manage.py migrate && python3 /app/manage.py runserver 0.0.0.0:5000"
What I believe was happening:
The original command ran manage.py runserver without python in front of it, so the shell tried to execute manage.py as a standalone program and failed with sh: manage.py: not found. Spelling out the interpreter and the full path to manage.py (it lives under /app, the mount point I set in volumes) fixes both problems; feel free to correct me on this.
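To see the difference, here is a quick sketch of the two invocations inside the container (open a shell with docker-compose run web sh first):
manage.py runserver 0.0.0.0:5000                  # sh: manage.py: not found
python3 /app/manage.py runserver 0.0.0.0:5000     # works: the interpreter executes the file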
This made it functional and allowed me to access the backend and its postgres database.
So I have a successfully built image with postgres mounted as a volume, but I keep getting django.db.utils.OperationalError: could not connect to server: Connection refused when I run docker-compose up. My Dockerfile and docker-compose.yml look like this:
# syntax=docker/dockerfile:1
FROM python:3.10-slim-buster
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
WORKDIR /track
COPY requirements.txt .
RUN apt-get update \
&& apt-get -y install libpq-dev gcc \
&& pip install --upgrade pip \
&& pip install --default-timeout=100 -r requirements.txt
COPY . .
ENV DJANGO_SETTINGS_MODULE track.settings
EXPOSE 8000
version: "3.9"
services:
django:
build: .
command: python manage.py runserver 0.0.0.0:8000
stdin_open: true
tty: true
volumes:
- .:/track
ports:
- "8000:8000"
depends_on:
- db
db:
image: postgres:9.6.10-alpine
restart: always
user: postgres
volumes:
- track_db:/var/lib/postgresql/data
environment:
POSTGRES_USER: postgres
POSTGRES_PASS: postgres
POSTGRES_DB: postgres
volumes:
track_db: {}
Please, what am I doing wrong?
The django service must be linked to db:
services:
  django:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    stdin_open: true
    tty: true
    volumes:
      - .:/track
    ports:
      - "8000:8000"
    depends_on:
      - db
    links:
      - db
Now you can connect to the db container by its hostname.
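For instance, Django's database host (or the env file it is read from) would then point at the service name rather than localhost; a minimal sketch, reusing the variable names that appear elsewhere in this thread:
DB_HOST=db
DB_PORT=5432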
I'm trying to dockerize a project written in django and postgresql. Here is what I've already done:
I have environment variables stored in an env_file:
SECRET_KEY=value
DEBUG=value
ALLOWED_HOSTS=['192.168.99.100']
DB_NAME=postgres
DB_USER=postgres
DB_PASSWORD=postgres
DB_HOST=db
DB_PORT=5432
My Dockerfile:
FROM python:3.7-stretch
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
COPY requirements.txt /code/
WORKDIR /code/
RUN pip install -r requirements.txt
COPY . /code/
My docker-compose.yml file:
version: '3'
services:
  db:
    image: postgres:11-alpine
    restart: always
    volumes:
      - postgres_data:/var/lib/postgresql/data/
    environment:
      POSTGRES_PASSWORD: postgres
    container_name: "my_postgres"
    ports:
      - "5432:5432"
  web:
    build: .
    command: python /code/cameo/manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code/
    env_file:
      - env_file
    ports:
      - "8000:8000"
    depends_on:
      - db
volumes:
  postgres_data:
Please help, I don't know what I'm doing wrong.
If you are running Docker on Windows, some files may have been corrupted. The following command worked for me:
docker-compose down --rmi all
The official site says that if you want to build a postgresql database, you just set the variables POSTGRES_USER, POSTGRES_PASSWORD and POSTGRES_DB. I put them into a .env file and wrote env_file: ./.env in my docker-compose file, but then django says: django.db.utils.OperationalError: FATAL: database "languages" does not exist. It seems I should write a .sql script with commands. Questions:
Is writing a .sql script the only way to make it work?
How can I write it, and where do I put it, if I don't want to explicitly write sensitive info like passwords? I want to use the .env file, since I will store the .sql script in a public repo.
My .env file:
DEBUG=True
SECRET_KEY=...
POSTGRESQL_DATABASE=languages
POSTGRESQL_USERNAME=admin
POSTGRESQL_PASSWORD=pass
POSTGRESQL_ADDRESS=postgres
POSTGRES_DB=languages
My docker-compose file:
version: '3'
services:
  web:
    build: ./web
    env_file: ./web/microservice/.env
    volumes:
      - ./web:/code/
    depends_on:
      - postgres
    command: python manage.py runserver 0.0.0.0:8000
  nginx:
    build: ./nginx/
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - ./web/static:/code/static
    depends_on:
      - web
  postgres:
    image: postgres:latest
    ports:
      - "5432:5432"
    env_file: ./web/microservice/.env
    volumes:
      - /var/lib/postgresql/data/
My Dockerfile:
FROM python:3.5
ENV PYTHONUNBUFFERED 1
RUN mkdir /code
WORKDIR /code
ADD ./requirements.txt /code/
RUN pip install --no-cache-dir -r requirements.txt
ADD . /code/
EXPOSE 8000
CMD ["/usr/local/bin/gunicorn", "translation_microservice.wsgi", "-w", "2", "-b", ":8000", "--access-logfile", "-", "--capture-output"]
Seems like you are using incorrect variable names. Change POSTGRESQL_USERNAME to POSTGRES_USER, and so on, in your .env file.
You might also have to clear the docker volume containing your database if it was already initialized under another environment. The default is to create a user and a database both named postgres.
Do not add the .env file to a public repo.
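For reference, a corrected .env might look like this (values taken from the file above):
POSTGRES_DB=languages
POSTGRES_USER=admin
POSTGRES_PASSWORD=pass
And to clear a volume that was initialized with the old variables, so postgres re-initializes with the new ones:
docker-compose down -v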
I have a docker-compose file with a django project that is trying to use a database on the host machine.
My Dockerfile is currently:
FROM python:3-slim
ENV PYTHONUNBUFFERED 1
RUN mkdir /code
WORKDIR /code
ADD . /code/
RUN pip install -r requirements.txt
RUN export dockerhost=$(docker-machine ip)
docker-compose.yml:
version: "2"
networks:
workernetwork:
webnetwork:
services:
static:
volumes:
- /static:/static
- /media:/media
image: alpine:latest
web:
build: .
command: bash -c "SECRET_KEY=temp_value python /code/manage.py collectstatic --noinput && python /code/manage.py migrate && /code/run_gunicorn.sh"
volumes:
- .:/code
volumes_from:
- static
env_file:
- secrets.env
ports:
- 443:443
networks:
- webnetwork
extra_hosts:
- "dockerhost:${dockerhost}"
DATABASES in settings.py:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'revolution',
        'USER': get_env_setting('POSTGRES_USER'),
        'PASSWORD': get_env_setting('POSTGRES_PASSWORD'),
        'HOST': 'dockerhost',
        'PORT': 5432,
    }
}
What am I doing wrong?
Thanks for your attention!
I finally resolved this with a docker volume.
First, create the volume:
docker volume create --name=coredb
docker-compose.yml
version: "2"
services:
...
web:
build:
context: .
command: bash -c "python /code/manage.py collectstatic --noinput && python /code/manage.py migrate && /code/run_gunicorn.sh"
volumes:
- /static:/data/web/static
- /media:/data/web/media
- .:/code
env_file:
- ../.env
depends_on:
- db
db:
restart: always
image: postgres
env_file:
- ../.env
volumes:
- pgdata: /var/lib/postgresql/data
volumes:
pgdata:
external:
name: coredb
.env variables:
POSTGRES_DB={hidden}
POSTGRES_USER={hidden}
POSTGRES_PASSWORD={hidden}
Important:
You need to create the database and user manually via docker exec -it backend_db_1 bash, following these instructions up to the Install Django within a Virtual Environment chapter.
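A minimal sketch of that manual step (the database, user and password below are placeholders; use the values hidden in your .env):
docker exec -it backend_db_1 psql -U postgres
CREATE DATABASE mydb;
CREATE USER myuser WITH PASSWORD 'mypassword';
GRANT ALL PRIVILEGES ON DATABASE mydb TO myuser;
\q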