How to deploy a django-q worker with Docker?

Assume this simple docker-compose file.
version: "3.9"
services:
redis:
image: redis:alpine
ports:
- 6379:6379
web:
build: .
command: python manage.py runserver 0.0.0.0:8000
volumes:
- .:/code
ports:
- "8000:8000"
depends_on:
- redis
How can I add a django-q worker to handle tasks from the web container?
I could probably build the same image with a different command, such as python manage.py qcluster, but I don't think that solution is elegant. Could you suggest an approach for doing this?

Probably the easiest thing you can do is to add a new service for qcluster.
Something like this:
version: "3.9"
services:
redis:
image: redis:alpine
ports:
- 6379:6379
web:
build: .
command: python manage.py runserver 0.0.0.0:8000
volumes:
- .:/code
ports:
- "8000:8000"
depends_on:
- redis
djangoq:
build: .
command: python manage.py qcluster
volumes:
- .:/code
ports:
- "8000:8001"
depends_on:
- redis
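If duplicating the web block still feels inelegant, YAML anchors via Compose extension fields (supported since file format 3.4) let both services share one base definition. A minimal sketch, assuming the same image and code mount suit both services:

version: "3.9"

x-django: &django
  build: .
  volumes:
    - .:/code
  depends_on:
    - redis

services:
  redis:
    image: redis:alpine
  web:
    <<: *django
    command: python manage.py runserver 0.0.0.0:8000
    ports:
      - "8000:8000"
  djangoq:
    <<: *django
    command: python manage.py qcluster

Provided Q_CLUSTER in settings.py points both containers at the redis service as the broker, the worker picks up the tasks the web container enqueues.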

Related

How to keep table data even after containers are deleted?

My purpose is to keep the data even after deleting and rebuilding the containers.
More specifically, I want to keep the data even after running docker-compose down or docker-compose up -d --build.
My environment is
Docker
Django
PostgreSQL
Nginx
Gunicorn
docker-compose.prod.yml
version: '3.8'
services:
  web:
    build:
      context: ./app
      dockerfile: Dockerfile.prod
    command: gunicorn ecap.wsgi:application --bind 0.0.0.0:8000
    volumes:
      - static_volume:/home/app/web/staticfiles
      - media_volume:/home/app/web/mediafiles
    expose:
      - 8000
    env_file:
      - ./.env.prod
    depends_on:
      - db
  db:
    image: postgres:13.0-alpine
    volumes:
      - db_data:/var/lib/postgresql/data/
    env_file:
      - ./.env.prod.db
  nginx:
    build: ./nginx
    volumes:
      - ./static_volume:/home/app/web/staticfiles
      - ./media_volume:/home/app/web/mediafiles
    ports:
      - 80
    depends_on:
      - web
volumes:
  db_data:
  static_volume:
  media_volume:
.env.prod.db
POSTGRES_USER=nita
POSTGRES_PASSWORD=*******
POSTGRES_DB=ecap_prod
I assume that the problem is how the volumes are written in the yml file.
Although I followed the shown way, I cannot keep the data.

The problem is in the entrypoint.sh file.
That file contains the line python manage.py flush --no-input, and flush empties every table in the database on each container start.
When I removed that line, the data persisted.
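For reference, a minimal sketch of what the corrected entrypoint.sh might look like (the actual script is not shown in the post, so the surrounding lines are assumptions):

#!/bin/sh
# apply schema changes without deleting existing rows
python manage.py migrate --no-input
# python manage.py flush --no-input   # removed: flush wipes all data on every start
exec "$@"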

How to set a delay for restarting a container in docker-compose?

I have a problem with the initial launch of docker-compose up: the DB is not initialized yet and Django throws an error.
I tried to use restart_policy, but it didn't help; the web service restarted almost immediately, before the DB service was ready, regardless of the delay period I set.
version: "3.9"
services:
web:
build: .
command: bash -c "python manage.py makemigrations && python manage.py migrate && python manage.py runserver 0.0.0.0:8000"
volumes:
- .:/app
ports:
- "8000:8000"
deploy:
restart_policy:
condition: on-failure
delay: 15s
environment:
- POSTGRES_NAME=${POSTGRES_DB}
- POSTGRES_USER=${POSTGRES_USER}
- POSTGRES_PASSWORD=${POSTGRES_PASSWORD}
- POSTGRES_HOST=db
depends_on:
- db
db:
container_name: db_pg
image: postgres
hostname: postgres
environment:
POSTGRES_DB: ${POSTGRES_DB}
POSTGRES_USER: ${POSTGRES_USER}
POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
POSTGRES_HOST: db
volumes:
- ./data/db:/var/lib/postgresql/data
restart: unless-stopped
pgadmin:
image: dpage/pgadmin4
depends_on:
- db
ports:
- "5555:80"
environment:
PGADMIN_DEFAULT_EMAIL: pgadmin4#pgadmin.org
PGADMIN_DEFAULT_PASSWORD: admin
volumes:
- ./data/pgadmin:/var/lib/pgadmin/data
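The usual fix for this (a sketch, not from the original post) is not a restart delay but a healthcheck on the db service plus the long form of depends_on, so web only starts once Postgres actually accepts connections:

services:
  db:
    image: postgres
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER} -d ${POSTGRES_DB}"]
      interval: 5s
      timeout: 5s
      retries: 5
  web:
    depends_on:
      db:
        condition: service_healthy

Note that condition: service_healthy is honored by docker compose but ignored by swarm's docker stack deploy, which skips depends_on entirely.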

How to import a Postgres database into a docker container?

I'm trying to import a PostgreSQL dump into a Docker container, but it doesn't work.
Dockerfile:
FROM postgres
COPY postgres.sql /docker-entrypoint-initdb.d/
version: "3.9"
docker-compose.yml
services:
db:
build: ./DB
volumes:
- ./data/db:/var/lib/postgresql/data
environment:
- POSTGRES_DB=gamenews
- POSTGRES_USER=postgres
- POSTGRES_PASSWORD=321678
web:
build: .
command: python manage.py runserver 0.0.0.0:8000
volumes:
- .:/code
ports:
- "8000:8000"
environment:
- POSTGRES_NAME=gamenews
- POSTGRES_USER=postgres
- POSTGRES_PASSWORD=321678
depends_on:
- db
[The project structure and the output of docker compose up were attached as screenshots.]
I would suggest that you use a proper Postgres image:
postgres:
  image: postgres:13
  volumes:
    - '.:/app:rw'
    - 'postgres:/var/lib/postgresql/data'
Here's a list of all the tags you can use: https://hub.docker.com/_/postgres
Just spin that up; the volume maps the data directory to your host disk, so it isn't ephemeral inside the container. Then you can run pg_restore on your dump file.
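For example (a sketch: the dump file name and paths are assumptions based on the volumes above):

# a custom-format archive is restored with pg_restore
docker compose exec postgres pg_restore -U postgres -d gamenews /app/postgres.dump
# a plain SQL dump is piped through psql instead
docker compose exec -T postgres psql -U postgres -d gamenews < postgres.sql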

Django live reload doesn't work with a second docker-compose file

I want to create two environments: test and the standard dev environment. I need to run the Django test server in the test environment and the regular server, manage.py runserver, in the other. The main dev environment will use docker-compose.yml and the test environment will use test.yml. When I run docker-compose up, live reload works normally. When I run docker-compose -f test.yml up, the test server runs but Docker does not do live reloads. I added the same services to both files to shorten the CLI syntax.
docker-compose.yml
version: "3.9"
services:
web:
build:
dockerfile: ./compose/django/Dockerfile
context: .
container_name: main_app_django
env_file:
- ./.local/.env
command: compose/django/start.sh
volumes:
- .:/code
ports:
- "8000:8000"
redis:
container_name: main_app_redis
image: "redis:alpine"
command: redis-server
ports:
- "6379:6379"
celeryworker:
build:
dockerfile: ./compose/celery/Dockerfile
context: .
container_name: main_app_celery
command: celery -A app worker -l INFO
env_file:
- ./.local/.env
volumes:
- .:/code
depends_on:
- redis
test.yml
version: "3.9"
services:
web:
build:
dockerfile: ./compose/django/Dockerfile
context: .
container_name: test_main_app_django
env_file:
- ./.local/.env
command: >
sh -c "python manage.py makemigrations &&
python manage.py migrate && python manage.py testserver cypress/fixtures/user.json cypress/fixtures/tracks.json --addrport 0.0.0.0:8000"
volumes:
- .:/code
ports:
- "8000:8000"
redis:
container_name: test_main_app_redis
image: "redis:alpine"
command: redis-server
ports:
- "6379:6379"
celeryworker:
build:
dockerfile: ./compose/celery/Dockerfile
context: .
container_name: test_main_app_celery
command: celery -A appworker -l INFO
env_file:
- ./.local/.env
volumes:
- .:/code
depends_on:
- redis
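A likely cause: manage.py testserver starts the development server with the autoreloader disabled, so code changes are not picked up. One workaround (a sketch, not from the original post) is to load the fixtures with loaddata and use the regular runserver, which keeps live reload:

  web:
    command: >
      sh -c "python manage.py migrate &&
             python manage.py loaddata cypress/fixtures/user.json cypress/fixtures/tracks.json &&
             python manage.py runserver 0.0.0.0:8000"

Unlike testserver, loaddata writes the fixtures into the regular database rather than a throwaway test database, so this only suits an environment whose database is disposable.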

SQL script in docker-entrypoint-initdb.d not executed

I am trying to initialize my PostgreSQL database with an SQL script in the docker-entrypoint-initdb.d folder.
There is no error message, but the database is not initialized, even if I remove the container and rebuild it.
What is wrong with my docker-compose.yml file?
docker-compose.dev.yml
version: '3.7'
services:
  web:
    restart: always
    container_name: coverage-africa-web-dev
    build:
      context: ./app
      dockerfile: Dockerfile.dev
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - ./app:/usr/src/app
      - ./docker-entrypoint-initdb.d/initdata.dev.sql:/docker-entrypoint-initdb.d/initdata.dev.sql
    ports:
      - 8000:8000
    env_file:
      - ./.env.dev
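Two details matter here (not from the original post, but standard behavior of the postgres image): docker-entrypoint-initdb.d is read by the postgres container, not by a Django container, and the scripts in it run only when the database's data directory is empty. A minimal sketch of mounting the script into a db service (the service name and image are assumptions; the file above defines no db service):

  db:
    image: postgres:13-alpine
    env_file:
      - ./.env.dev
    volumes:
      - ./docker-entrypoint-initdb.d/initdata.dev.sql:/docker-entrypoint-initdb.d/initdata.dev.sql

Because init scripts run only against a fresh data directory, the old database volume (or bind-mounted data folder) has to be removed before the script will execute again.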