How to scale a Heroku Django Celery app with Docker

I'm trying to deploy my Django app to Heroku, and I have several doubts about dynos, workers, and deploy configuration.
In my heroku.yml file I have two process types, one for the web and the other for Celery. I would like each of them to have only one dyno, but with several workers, and to be scalable if necessary.
heroku.yml:
build:
  docker:
    web: Dockerfile-django
    celery: Dockerfile-django
run:
  web: gunicorn project.wsgi --log-file -
  celery: celery -A project worker -B --loglevel=INFO
docker-compose.yml:
version: '3.7'

services:
  web:
    container_name: dilains_django_ctnr
    build:
      context: .
      dockerfile: Dockerfile-django
    restart: always
    env_file: ./project/project/.env
    command: python manage.py check
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - ./project:/dilains
    depends_on:
      - postgres
      - redis
    ports:
      - 8000:8000
    networks:
      - dilains-ntwk

  redis:
    container_name: dilains_redis_ctnr
    build:
      context: .
      dockerfile: Dockerfile-redis
    volumes:
      - ./redis-data:/data
    ports:
      - 3679:3679
    networks:
      - dilains-ntwk

  celery:
    container_name: dilains_celery_ctnr
    build:
      context: .
      dockerfile: Dockerfile-django
    restart: always
    env_file: ./project/project/.env
    command: celery -A project worker -B --loglevel=INFO
    volumes:
      - ./project:/dilains
    depends_on:
      - redis
      - web
      - postgres
    networks:
      - dilains-ntwk

networks:
  dilains-ntwk:
    driver: bridge
Dockerfile-django:
FROM python:3.7-alpine
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
RUN apk update && apk add --no-cache bash postgresql postgresql-dev gcc python3-dev musl-dev jpeg-dev zlib-dev libjpeg
RUN mkdir /dilains
COPY ./project /dilains/
COPY ./requirements.txt /dilains/
WORKDIR /dilains
RUN pip install -r requirements.txt
EXPOSE 8000
I tried to scale each process type to 4 workers with these commands:
$ heroku ps -a app_name
=== celery (Standard-1X): /bin/sh -c celery\ -A\ project\ worker\ -B\ --loglevel\=INFO (1)
celery.1: up 2020/10/23 08:05:31 +0200 (~ 41m ago)
=== web (Standard-1X): /bin/sh -c gunicorn\ project.wsgi\ --log-file\ - (1)
web.1: up 2020/10/23 08:05:40 +0200 (~ 41m ago)
$ heroku ps:scale web=1 worker=4 -a app_name
$ heroku ps:scale celery=1 worker=4 -a app_name
I'm paying for Standard-1X dynos, and the plan says: number of process types - unlimited; horizontal scaling - yes.
Could anybody help, please?
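For reference, a hedged sketch of what the scaling commands could look like (assuming the process types keep the names web and celery declared in heroku.yml): Heroku scales dynos per declared process type, and no worker type is declared here, so worker=4 has nothing to act on.
$ # Scale dyno counts for the process types that actually exist
$ heroku ps:scale web=1 celery=1 -a app_name
Parallelism inside each dyno would then come from the run commands themselves, e.g.:
web: gunicorn project.wsgi --workers 4 --log-file -
celery: celery -A project worker -B --concurrency=4 --loglevel=INFO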

Related

Django + Celery + Nginx

I'm trying to run my Django application in Docker with Celery and Nginx.
Docker-compose
version: '3'

services:
  helpdesk_web:
    build:
      context: ./
      dockerfile: Dockerfile
    container_name: helpdesk_web
    volumes:
      - ./static:/usr/src/app/static
      - media:/usr/src/app/media
    ports:
      - "8000:8000"
      - "5678:5678"
    env_file:
      - ./.env
    restart: always
    depends_on:
      - helpdesk_db
      - helpdesk_redis

  helpdesk_db:
    image: postgres
    container_name: helpdesk_db
    volumes:
      - postgres_data:/var/lib/postgresql/data/
    env_file:
      - ./.env
    ports:
      - "5432:5432"
    environment:
      POSTGRES_DB: helpdesk_db
      POSTGRES_PASSWORD: itds
      POSTGRES_USER: itds

  nginx:
    build:
      context: ./docker/nginx
      dockerfile: Dockerfile
    container_name: helpdesk_nginx
    restart: on-failure
    depends_on:
      - helpdesk_web
      - helpdesk_db
    ports:
      - "80:80"
    volumes:
      - ./static:/usr/src/app/static
      - media:/usr/src/app/media

  helpdesk_redis:
    image: redis
    ports:
      - "6379:6379"

  helpdesk_celery:
    build:
      context: .
      dockerfile: Dockerfile
    command: celery -A helpdesk worker -l INFO --pool=solo
    depends_on:
      - helpdesk_web
      - helpdesk_redis

  helpdesk_celery-beat:
    build:
      context: .
      dockerfile: Dockerfile
    command: celery -A helpdesk beat -l INFO --scheduler django_celery_beat.schedulers:DatabaseScheduler
    depends_on:
      - helpdesk_web
      - helpdesk_redis

volumes:
  postgres_data:
  media:
Dockerfile
FROM python:3.10
WORKDIR /usr/src/app
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
RUN pip install --upgrade pip
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
RUN chmod +x entrypoint.sh
ENTRYPOINT ["/usr/src/app/entrypoint.sh"]
entrypoint.sh
#!/bin/sh

if [ "$DATABASE" = "postgres" ]
then
    echo "Waiting for postgres..."
    while ! nc -z $SQL_HOST $SQL_PORT; do
        sleep 0.1
    done
    echo "PostgreSQL started"
fi

python manage.py migrate --no-input
exec gunicorn helpdesk.wsgi:application -b 0.0.0.0:8000 --workers=$WORKERS_COUNT
When I run docker-compose up, I get an error (the original post showed it only as a screenshot, which is not recoverable here).
I have a dev docker-compose file without nginx and it works fine. It seems there is some problem between Celery and nginx. I am a beginner with Docker, so I don't know what to do.
What am I missing?
EDIT №1:
It looks like celery and celery-beat are being run as gunicorn.
The Compose command: overrides the Dockerfile CMD and is passed as arguments to the ENTRYPOINT. A typical setup is to have your entrypoint script do first-time setup, like waiting for a database, but then end with the special shell command exec "$@" to run whatever command it was passed as arguments.
#!/bin/sh
# Wait for the database, run migrations, set environment variables, ...
if [ "$DATABASE" = "postgres" ]; then ...; fi
python manage.py migrate --no-input
# Run the command passed as arguments, not a fixed command here
exec "$@"
In your Dockerfile, declare the default CMD you want the container to run, such as gunicorn.
ENTRYPOINT ["/usr/src/app/entrypoint.sh"]
CMD gunicorn helpdesk.wsgi:application -b 0.0.0.0:8000 --workers=$WORKERS_COUNT
Now it will honor the Compose command:, or any other command you pass to docker-compose run.
version: '3.8'
services:
  web:
    build: .
    ports: ['8000:8000', '5678:5678']
    env_file: [.env]
    restart: always
    depends_on: [db, redis]
    # and the default image CMD
  celery:
    build: .
    command: celery -A helpdesk worker -l INFO --pool=solo
    depends_on: [redis]
  celery-beat:
    build: .
    command: celery -A helpdesk beat -l INFO --scheduler django_celery_beat.schedulers:DatabaseScheduler
    depends_on: [redis]
  db: { ... }
  redis: { image: redis }
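To make the override behaviour concrete, a small usage sketch (assuming the service names above): a one-off command replaces the default CMD but still passes through the entrypoint's setup steps.
# Runs entrypoint.sh, which waits for the DB and migrates, then exec's this command
docker-compose run --rm web python manage.py createsuperuser
# With no override, the same entrypoint exec's the image's default CMD (gunicorn)
docker-compose up web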

Django/Docker: migration not detected and not applied

Stack: Django/Docker/Docker-compose/Postgresql (not in container)
I made some modifications, including model updates, committed them, and pushed them to my remote GitLab repository.
Then I pulled the modifications from my GitLab repo onto the preprod server, and I can see that the modified version is on the server.
But when I stop and restart the container, it does not detect any changes and does not apply the migrations.
I also checked: the entrypoint.preprod.sh file contains the makemigrations and migrate commands.
I tried rebuilding with docker-compose build and then running it, but that doesn't work either.
I tried connecting directly to my container (docker exec -it web sh), but makemigrations detects nothing, so no migrations are applied.
I must be missing something, but what?
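As a first check, a diagnostic sketch (assuming the service and file names shown below) to see which code and migrations the container actually has:
# Compare the code inside the container against what was pulled on the host
docker-compose -f docker-compose-preprod.yml exec web ls -l /usr/src/app
# Ask Django which migrations it can see and which are applied
docker-compose -f docker-compose-preprod.yml exec web python manage.py showmigrations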
docker-compose-preprod.yml
version: '3.7'

services:
  web:
    restart: always
    container_name: virage_web
    build:
      context: ./app
      dockerfile: Dockerfile.preprod
    restart: always
    command: gunicorn core.wsgi:application --bind 0.0.0.0:8000
    volumes:
      - app_volume:/usr/src/app
      - static_volume:/usr/src/app/static
      - media_volume:/usr/src/app/media
    expose:
      - 8000
    env_file:
      - ./.env.preprod
    entrypoint: [ "/usr/src/app/entrypoint.preprod.sh" ]
    depends_on:
      - redis
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8000/"]
      interval: 30s
      timeout: 10s
      retries: 50

  redis:
    container_name: virage_redis
    image: "redis:alpine"

  celery:
    container_name: virage_celery
    build:
      context: ./app
      dockerfile: Dockerfile.preprod
    command: celery -A core worker -l info
    volumes:
      - app_volume:/usr/src/app
    env_file:
      - ./.env.preprod
    depends_on:
      - web
      - redis

  celery-beat:
    container_name: virage_celery-beat
    build:
      context: ./app
      dockerfile: Dockerfile.preprod
    command: celery -A core beat -l info
    volumes:
      - app_volume:/usr/src/app
    env_file:
      - ./.env.preprod
    depends_on:
      - web
      - redis

  nginx:
    container_name: virage_nginx
    build: ./nginx
    restart: always
    volumes:
      - static_volume:/usr/src/app/static
      - media_volume:/usr/src/app/media
    ports:
      - 1350:80
    depends_on:
      - web

volumes:
  static_volume:
  media_volume:
  app_volume:
Dockerfile.preprod
# Pull the official base image
FROM python:3.8.3-alpine
# Set a work directory
WORKDIR /usr/src/app
# Set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
# install psycopg2 dependencies
RUN apk update && apk add postgresql-dev gcc g++ python3-dev musl-dev
RUN apk --update add libxml2-dev libxslt-dev libffi-dev musl-dev libgcc openssl-dev curl postgresql-client
RUN apk add jpeg-dev zlib-dev freetype-dev lcms2-dev openjpeg-dev tiff-dev tk-dev tcl-dev nano
RUN pip3 install psycopg2 psycopg2-binary
# install xgettext for i18n
RUN apk add gettext
# Install dependencies
COPY requirements/ requirements/
RUN pip install --upgrade pip && pip install -r requirements/preprod.txt
# Copy the entrypoint.sh file
COPY entrypoint.preprod.sh .
# Copy the initdata sql file
COPY initdata.preprod.sql .
# Copy the project's files
COPY . .
RUN chmod +x entrypoint.preprod.sh
entrypoint.preprod.sh
#!/bin/sh

if [ "$DATABASE" = "postgres" ]
then
    echo "Waiting for postgres..."
    while ! nc -z $SQL_HOST $SQL_PORT; do
        sleep 0.1
    done
    echo "PostgreSQL started"
fi

# python manage.py flush --no-input
python manage.py makemigrations
python manage.py migrate
python manage.py collectstatic --no-input
python manage.py makemessages -l fr
python manage.py compilemessages
exec "$@"

Unable to Run Celery and celery beat using docker in django application (Unable to load celery application)

When I run my application without Docker it works perfectly, but in docker-compose I get this error:
| Error: Invalid value for '-A' / '--app':
| Unable to load celery application.
| The module sampleproject was not found.
my docker-compose file
app:
  container_name: myapp
  hostname: myapp
  build:
    context: .
    dockerfile: Dockerfile
  image: sampleproject
  tty: true
  command: >
    bash -c "
    python manage.py migrate &&
    python manage.py runserver 0.0.0.0:8000
    "
  env_file: .env
  ports:
    - "8000:8000"
  volumes:
    - .:/project
  depends_on:
    - database
    - redis

redis:
  image: redis:alpine

celery:
  build:
    context: ./
    dockerfile: Dockerfile
  command: celery -A sampleproject worker -l info
  depends_on:
    - database
    - redis

celery-beat:
  build: .
  command: celery -A sampleproject beat -l info
  depends_on:
    - database
    - redis
    - celery
my Dockerfile
FROM python:3.8
COPY requirements.txt requirements.txt
RUN pip install -r requirements.txt --no-cache-dir \
&& rm -rf requirements.txt
RUN mkdir /project
WORKDIR /project
my folder structure is something like this (the original screenshot of the project tree is not included):
I had the same problem, and here is the solution we found. Celery is right to complain that it cannot run: it needs an instance of the application, and in this setup the project code never reaches the celery containers.
The fix is simply to add a volumes directive pointing at the project folder to the celery and celery-beat services in the docker-compose.yaml file.
Example:
app:
  container_name: myapp
  hostname: myapp
  build:
    context: .
    dockerfile: Dockerfile
  image: sampleproject
  tty: true
  command: >
    bash -c "
    python manage.py migrate &&
    python manage.py runserver 0.0.0.0:8000
    "
  env_file: .env
  ports:
    - "8000:8000"
  volumes:
    - .:/project
  depends_on:
    - database
    - redis

redis:
  image: redis:alpine

celery:
  build:
    context: ./
    dockerfile: Dockerfile
  command: celery -A sampleproject worker -l info
  volumes:
    - .:/project
  depends_on:
    - database
    - redis

celery-beat:
  build: .
  command: celery -A sampleproject beat -l info
  volumes:
    - .:/project
  depends_on:
    - database
    - redis
    - celery
So when the celery container starts, it finds the project code in the mounted volume and runs without any problems.
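A side note on the design choice (my addition, not part of the original answer): the Dockerfile above only creates /project and never copies the code into the image, which is why the containers depend on a bind mount; baking the code in with a COPY instruction would be the self-contained alternative. Either way, a quick sanity check that the module is importable (assuming the service names above):
# Should print the package location instead of "module not found"
docker-compose run --rm celery python -c "import sampleproject; print(sampleproject.__file__)"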

Issue Dockerising Django + Celery... Unable to load celery application

I'm having a slight issue spinning up a Dockerised stack of Django, Redis, and Celery on a server backbone of nginx and gunicorn. The Django application connects to an external PostgreSQL database.
I have the following docker-compose.yaml file:
version: '3.7'

services:
  web:
    build:
      context: ./django
      dockerfile: Dockerfile
    expose:
      - "8000"
    volumes:
      - django-static:/usr/src/app/static
      - django-uploads:/usr/src/app/uploads
    depends_on:
      - redis
    links:
      - redis:redis
    environment:
      ...
    command: /usr/src/app/wave.sh

  nginx:
    build:
      context: nginx
      dockerfile: Dockerfile
    restart: on-failure
    ports:
      - 80:80
      - 443:443
    volumes:
      - ssl_data:/etc/resty-auto-ssl
    environment:
      ALLOWED_DOMAINS: "${STAGING_ALLOWED_DOMAINS}"
      SITES: "${STAGING_SITES}"

  redis:
    restart: always
    image: redis:latest
    ports:
      - '6379:6379'

  celery:
    build: ./django
    command: celery --app=Wave.celery.app worker --loglevel=DEBUG
    environment:
      ...
    volumes:
      - './django:/usr/src/app/'
    links:
      - redis:redis
    depends_on:
      - web
      - redis
      - nginx
    restart: on-failure

  celery-beat:
    build: ./django
    command: celery --app=Wave.celery.app beat --loglevel=DEBUG
    environment:
      ...
    volumes:
      - './django:/usr/src/app/'
    links:
      - redis:redis
    depends_on:
      - web
      - redis
      - nginx
    restart: on-failure

volumes:
  ssl_data:
  django-static:
  django-uploads:
However, when I spin this up with:
docker-compose -f docker-compose.wave-api-staging.yml up --build --force-recreate
the celery and celery-beat services fail with the errors shown further below.
The wave.sh script is as follows:
#!/bin/sh
rm -rf celerybeat.pid
ln -s /run/shm /dev/shm
python3 manage.py collectstatic --no-input
python3 manage.py migrate
python3 manage.py shell < createsuperuser.py
pip3 freeze
/usr/local/bin/gunicorn Wave.wsgi:application --timeout 3600 --log-level=DEBUG -w 5 -b :8000
Which replaces the following command instruction in the web service:
command: bash -c "rm -rf celerybeat.pid && python3 manage.py makemigrations && python3 manage.py migrate --run-syncdb && python3 manage.py shell < createsuperuser.py && python3 manage.py runserver 0.0.0.0:8000"
celery.py:
import os
import datetime
from celery import Celery
from celery.schedules import crontab
from django.conf import settings

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'Wave.settings')

app = Celery('Wave', include=['amq.v1.SurfTaskRunner.demand', 'amq.v1.SurfTaskRunner.periodic'])
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

@app.on_after_configure.connect
def setup_periodic_tasks(sender, **kwargs):
    sender.add_periodic_task(
        crontab(minute='6,16,26,36,46,56', hour='*', day_of_week='*', day_of_month='*'),
        dispatch_surfers_waivers_periodic_task.s(),
    )

@app.task(bind=True)
def dispatch_surfers_waivers_periodic_task(self):
Dockerfile:
FROM python:3.7.2-slim
RUN apt-get update && apt-get -y install cron && apt-get -y install nano
ENV PYTHONUNBUFFERED 1
RUN mkdir /usr/src/app
WORKDIR /usr/src/app
COPY ./requirements.txt /usr/src/app/requirements.txt
RUN pip install -r requirements.txt
COPY ./wave.sh /usr/src/app/wave.sh
RUN ["chmod", "+x", "/usr/src/app/wave.sh"]
COPY . /usr/src/app
I receive the following errors from the celery and celery beat services:
celery_1 | Error:
celery_1 | Unable to load celery application.
celery_1 | The module Wave was not found.
celery-beat_1 | Error:
celery-beat_1 | Unable to load celery application.
celery-beat_1 | The module Wave was not found.
I have read and googled extensively, but so far have found zero clues as to what could be going wrong here... I'm at the point of completely ditching Celery if I can't get this to work.
However, the same process can be spun up locally without nginx, and with the standard python manage.py runserver command.
Update: here is my local development docker-compose.yml file:
version: '3.7'

services:
  django:
    build:
      context: ./django
      dockerfile: Dockerfile
    ports:
      - 8982:8982
    depends_on:
      - postgres
      - redis
    links:
      - postgres:postgres
      - redis:redis
    command: bash -c "rm -rf celerybeat.pid && python3 manage.py makemigrations && python3 manage.py migrate --run-syncdb && python3 manage.py shell < createsuperuser.py && python3 manage.py runserver 0.0.0.0:8982"
    volumes: ['./django:/usr/src/app/']
    environment:
      ...

  redis:
    restart: always
    image: redis:latest
    ports:
      - '6379:6379'

  celery:
    build: ./django
    command: celery -A Wave worker --loglevel=DEBUG
    environment:
      ...
    volumes:
      - './django:/usr/src/app/'
    links:
      - postgres:postgres
      - redis:redis
    depends_on:
      - django
      - postgres
      - redis
    restart: on-failure

  celery-beat:
    build: ./django
    command: celery -A Wave beat --loglevel=DEBUG
    environment:
      ...
    volumes:
      - './django:/usr/src/app/'
    links:
      - postgres:postgres
      - redis:redis
    depends_on:
      - django
      - postgres
      - redis
    restart: on-failure

  postgres:
    build:
      context: ./postgres
      dockerfile: Dockerfile
    ports:
      - 5433:5432
    expose:
      - 5432
    environment:
      - POSTGRES_DB=conveyor
      - POSTGRES_USER=conveyor
      - POSTGRES_PASSWORD=89$r55cP%fSDDghypoAsdd
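One difference that stands out between the two files (an observation, not a confirmed fix): staging starts the worker with celery --app=Wave.celery.app while local uses celery -A Wave, and staging bind-mounts './django:/usr/src/app/' over whatever the image copied in. A sketch for inspecting what the celery container actually sees:
# List what ends up in the working directory of the celery container
docker-compose -f docker-compose.wave-api-staging.yml run --rm celery ls /usr/src/app
# Try importing the module Celery says is missing
docker-compose -f docker-compose.wave-api-staging.yml run --rm celery python3 -c "import Wave"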

Set Up RabbitMQ on Docker with Django

I'm trying to set up a Docker container with Django. My Dockerfile is:
FROM python:3.7-alpine
MAINTAINER Freshness Productions
ENV PYTHONUNBUFFERED 1
COPY ./requirements.txt /requirements.txt
RUN apk add --update --no-cache postgresql-client
RUN apk add --update --no-cache --virtual .tmp-build-deps \
gcc libc-dev linux-headers postgresql-dev
RUN pip install -r /requirements.txt
RUN apk del .tmp-build-deps
RUN mkdir /app
WORKDIR /app
COPY ./app /app
RUN adduser -D user
USER user
RUN rabbitmqctl add_user test testpass1
RUN rabbitmqctl add_vhost myvhost
RUN rabbitmqctl set_permissions -p myvhost test ".*" ".*" ".*"
RUN rabbitmq-server
And my docker-compose.yml is:
version: "3"
services:
app:
build:
context: .
image: &app app
ports:
- "8000:8000"
volumes:
- ./app:/app
command: >
sh -c "python manage.py wait_for_db &&
python manage.py migrate &&
python manage.py runserver 0.0.0.0:8000
export C_FORCE_ROOT='true'
celery -A app beat -l info --scheduler django_celery_beat.schedulers:DatabaseScheduler"
env_file: &envfile
- env.env
depends_on:
- db
- broker
db:
image: postgres:10-alpine
environment:
- POSTGRES_DB=app
- POSTGRES_USER=postgres
- POSTGRES_PASSWORD=supersecretpassword
worker:
build: .
image: *app
restart: "no"
env_file: *envfile
command: sh -c "celery -A app worker --loglevel=info"
volumes:
- ./app:/app
depends_on:
- broker
broker:
image: rabbitmq:3
env_file: *envfile
ports:
- 5672:5672
I don't see what I need to do - other than that I'm using two different images in my docker-compose. Does that matter? I get the error:
[ERROR/MainProcess] consumer: Cannot connect to amqp://guest:**@127.0.0.1:5672//: [Errno 111] Connection refused.
when I try to run docker-compose up.
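Connection refused on 127.0.0.1 usually means Celery is still using its default broker URL: inside a container, localhost is the container itself, not the broker service. A sketch of the env.env entry that would point it at RabbitMQ (CELERY_BROKER_URL is an assumption about how the settings read it; adjust to whatever variable your settings actually use):
# env.env - reach RabbitMQ by its Compose service name, not localhost
CELERY_BROKER_URL=amqp://guest:guest@broker:5672//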
Here you can find a configuration that works perfectly with RabbitMQ, Django, MySQL, Celery, and Celery Beat.
docker-compose.yml
version: "3.7"
services:
django:
container_name: django_python
image: python:3.6
command: bash -c "pip3 install -r requirements.txt && python manage.py runserver 0.0.0.0:8000"
volumes:
- ${APP_PATH}var/www/api.mylaser.fr/mylazer_backend:/app
- ${APP_PATH}var/www/static.mylaser.fr:/app/static
depends_on:
- mysql
working_dir: /app
environment:
- PYTHONUNBUFFERED=1
- MYSQL_HOST=mysql
- MYSQL_USER=${MYSQL_USER}
- MYSQL_PASSWORD=${MYSQL_PASSWORD}
- MYSQL_DATABASE=${MYSQL_DATABASE}
- CELERY_BROKER=${CELERY_BROKER}
mysql:
container_name: python_mysql
image: mariadb:latest
#ports:
# - "3306:3306"
volumes:
- ${APP_PATH}var/mysql:/var/lib/mysql
- ${APP_PATH}etc/mysql:/etc/mysql
environment:
- MYSQL_ROOT_PASSWORD=${MYSQL_ROOT_PASSWORD}
- MYSQL_DATABASE=${MYSQL_DATABASE}
- MYSQL_USER=${MYSQL_USER}
- MYSQL_PASSWORD=${MYSQL_PASSWORD}
- TZ=Europe/Paris
rabbitmq:
image: rabbitmq:3.8.2-management
container_name: rabbitmq
volumes:
- ${APP_PATH}etc/rabbitmq:/etc/rabbitmq/:rw
- ${APP_PATH}var/rabbitmq:/var/lib/rabbitmq/:rw
- ${APP_PATH}var/log/rabbitmq:/var/log/rabbitmq/:rw
environment:
HOSTNAME: ${RABBITMQ_HOSTNAME}
RABBITMQ_ERLANG_COOKIE: ${RABBITMQ_ERLANG_COOKIE}
RABBITMQ_DEFAULT_USER: ${RABBITMQ_DEFAULT_USER}
RABBITMQ_DEFAULT_PASS: ${RABBITMQ_DEFAULT_PASS}
ports:
- ${RABBITMQ_PORT}:5672
- ${RABBITMQ_MGT_PORT}:15672
worker:
image: python:3.6
container_name: worker
command: bash -c "pip3 install -r requirements.txt && celery -A ${PROJECT_NAME} worker -l info"
working_dir: /app
volumes:
- ${APP_PATH}var/www/api.mylaser.fr/mylazer_backend:/app
environment:
- PYTHONUNBUFFERED=1
- MYSQL_HOST=mysql
- MYSQL_USER=${MYSQL_USER}
- MYSQL_PASSWORD=${MYSQL_PASSWORD}
- MYSQL_DATABASE=${MYSQL_DATABASE}
- CELERY_BROKER=${CELERY_BROKER}
depends_on:
- mysql
- rabbitmq
beat-worker:
image: python:3.6
container_name: beat-worker
command: bash -c "pip3 install -r requirements.txt && celery -A ${PROJECT_NAME} beat -l info"
working_dir: /app
volumes:
- ${APP_PATH}var/www/api.mylaser.fr/mylazer_backend:/app
environment:
- PYTHONUNBUFFERED=1
- MYSQL_HOST=mysql
- MYSQL_USER=${MYSQL_USER}
- MYSQL_PASSWORD=${MYSQL_PASSWORD}
- MYSQL_DATABASE=${MYSQL_DATABASE}
- CELERY_BROKER=${CELERY_BROKER}
depends_on:
- mysql
- rabbitmq
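Since this file leans entirely on ${...} substitutions, a sketch of the matching .env (every value below is an illustrative placeholder, not from the original answer):
# .env - placeholder values for the variables used above
APP_PATH=/srv/
PROJECT_NAME=mylazer_backend
MYSQL_ROOT_PASSWORD=change-me
MYSQL_DATABASE=mylaser_db
MYSQL_USER=mylaser
MYSQL_PASSWORD=change-me
CELERY_BROKER=amqp://user:pass@rabbitmq:5672//
RABBITMQ_HOSTNAME=rabbitmq
RABBITMQ_ERLANG_COOKIE=change-me
RABBITMQ_DEFAULT_USER=user
RABBITMQ_DEFAULT_PASS=pass
RABBITMQ_PORT=5672
RABBITMQ_MGT_PORT=15672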