Django - set up path to celerybeat scheduler in supervisor

In my Django settings.py file I have the following code for the celerybeat scheduler:
from datetime import timedelta

CELERYBEAT_SCHEDULE = {
    'call-every-30-seconds': {
        'task': 'myapp.tasks.update_value',
        'schedule': timedelta(minutes=30),
    },
}
How would I set the path to my CELERYBEAT_SCHEDULE in my supervisord.conf file, which looks like this:
[program:celerybeat]
command=celery beat -A RPF1 --schedule path/to/celerybeat/schedule --loglevel=INFO
Any information will be appreciated. Thank you.

Drop the --schedule argument. It's unnecessary: Celery will pick up the CELERYBEAT_SCHEDULE from the Django environment and use that.
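A minimal sketch of the resulting supervisord.conf entry (the directory value is a placeholder; the assumption is that beat runs from the project root so the RPF1 settings module is importable):
[program:celerybeat]
; assumption: run from the Django project root so RPF1 and its settings are importable
directory=/path/to/your/project
command=celery beat -A RPF1 --loglevel=INFO
autostart=true
autorestart=true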

How to route tasks to different queues with Celery and Django

I am using the following stack:
Python 3.6
Celery v4.2.1 (Broker: RabbitMQ v3.6.0)
Django v2.0.4.
According to Celery's documentation, running scheduled tasks on different queues should be as easy as defining the corresponding queues for the tasks in CELERY_ROUTES; nonetheless, all tasks seem to be executed on Celery's default queue.
This is the configuration in my_app/settings.py:
CELERY_BROKER_URL = "amqp://guest:guest@localhost:5672//"

CELERY_ROUTES = {
    'app1.tasks.*': {'queue': 'queue1'},
    'app2.tasks.*': {'queue': 'queue2'},
}

CELERY_BEAT_SCHEDULE = {
    'app1_test': {
        'task': 'app1.tasks.app1_test',
        'schedule': 15,
    },
    'app2_test': {
        'task': 'app2.tasks.app2_test',
        'schedule': 15,
    },
}
The tasks are just simple scripts for testing routing:
File app1/tasks.py:
from my_app.celery import app
import time

@app.task()
def app1_test():
    print('I am app1_test task!')
    time.sleep(10)
File app2/tasks.py:
from my_app.celery import app
import time

@app.task()
def app2_test():
    print('I am app2_test task!')
    time.sleep(10)
When I run Celery with all the required queues:
celery -A my_app worker -B -l info -Q celery,queue1,queue2
RabbitMQ will show that only the default queue "celery" is running the tasks:
sudo rabbitmqctl list_queues
# Tasks executed by each queue:
# - celery 2
# - queue1 0
# - queue2 0
Does somebody know how to fix this unexpected behavior?
Regards,
I have got it working; there are a few things to note here:
According to Celery 4.2.0's documentation, CELERY_ROUTES should be the variable to define queue routing, but it only worked for me using CELERY_TASK_ROUTES instead. The task routing seems to be independent of Celery Beat, therefore this will only work for tasks triggered manually:
app1_test.delay()
app2_test.delay()
or
app1_test.apply_async()
app2_test.apply_async()
To make it work with Celery Beat, we just need to define the queues explicitly in the CELERY_BEAT_SCHEDULE variable. The final setup of the file my_app/settings.py would be as follows:
CELERY_BROKER_URL = "amqp://guest:guest@localhost:5672//"

CELERY_TASK_ROUTES = {
    'app1.tasks.*': {'queue': 'queue1'},
    'app2.tasks.*': {'queue': 'queue2'},
}

CELERY_BEAT_SCHEDULE = {
    'app1_test': {
        'task': 'app1.tasks.app1_test',
        'schedule': 15,
        'options': {'queue': 'queue1'}
    },
    'app2_test': {
        'task': 'app2.tasks.app2_test',
        'schedule': 15,
        'options': {'queue': 'queue2'}
    },
}
And to run Celery listening on those two queues:
celery -A my_app worker -B -l INFO -Q queue1,queue2
Where:
-A: the name of the project or app.
-B: starts the Celery beat scheduler along with the worker.
-l: defines the logging level.
-Q: defines the queues handled by this worker.
I hope this saves some time for other developers.
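As a side note, the CELERY_-prefixed names above are only picked up if the project's Celery app is configured from Django's settings with the CELERY namespace. A minimal sketch of the assumed my_app/celery.py (the file name and layout are assumptions, following the standard Django/Celery setup):
# my_app/celery.py
import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'my_app.settings')

app = Celery('my_app')
# read every setting whose name starts with CELERY_ from Django's settings module
app.config_from_object('django.conf:settings', namespace='CELERY')
# find tasks.py modules in all installed apps (app1, app2, ...)
app.autodiscover_tasks()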
Adding the queue parameter to the decorator may help you:
@app.task(queue='queue1')
def app1_test():
    print('I am app1_test task!')
    time.sleep(10)
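If the queue is fixed on the decorator like this, a plain call should already be routed to queue1 without any routing setting (assuming a worker is consuming queue1), and an explicit queue passed to apply_async still overrides it:
app1_test.delay()                      # routed to queue1 via the task's default queue
app1_test.apply_async(queue='queue2')  # an explicit option overrides the default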
Okay, I tried the same command that you used to run the worker, and I found that you just have to remove the "celery" queue after the -Q parameter and that will be fine too.
So the old command is
celery -A my_app worker -B -l info -Q celery,queue1,queue2
And the new command is
celery -A my_app worker -B -l info -Q queue1,queue2

Python Django Celery is taking too much memory

I am running a Celery server which has 5-6 tasks to run periodically. Celery is taking too much memory after 5-6 days of continuous execution.
Celery's documentation is very confusing. I am using the following settings.
# celeryconfig.py
import os
os.environ['DJANGO_SETTINGS_MODULE'] = 'xxx.settings'

# default RabbitMQ broker
BROKER_URL = "amqp://guest:guest@localhost:5672//"

from celery.schedules import crontab

# default RabbitMQ backend
CELERY_RESULT_BACKEND = None

# 4 concurrent processes are running.
CELERYD_CONCURRENCY = 4

# specify location of log files
CELERYD_LOG_FILE = "/var/log/celery/celery.log"

CELERY_ALWAYS_EAGER = True

CELERY_IMPORTS = (
    'xxx.celerydir.cron_tasks.deprov_cron_script',
)

CELERYBEAT_SCHEDULE = {
    'deprov_cron_script': {
        'task': 'xxx.celerydir.cron_tasks.deprov_cron_script.check_deprovision_vms',
        'schedule': crontab(minute=0, hour=17),
        'args': ''
    }
}
I am running the Celery service using the nohup command (this will run it in the background).
nohup celery beat -A xxx.celerydir &
After going through the documentation, I came to know that DEBUG was True in settings. Just change the value of DEBUG in settings.
REF: https://github.com/celery/celery/issues/2927
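For reference, the change meant here is just flipping that flag in the Django settings module (a minimal sketch; with DEBUG = True Django keeps every executed SQL query in memory, which grows without bound in long-running worker processes):
# xxx/settings.py
DEBUG = False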

django-celery as a daemon: not working

I have a website project written with Django, Celery and RabbitMQ. A '.delay' task (the task creates a new folder) is called when a button is clicked.
Everything works fine with Celery (the .delay task is called, and a new folder is created) when I run Celery with manage.py like:
python manage.py celeryd
However, when I ran Celery as a daemon, even though there was no error, the task was not executed (no folder was created).
I was kind of following the tutorial: http://www.arruda.blog.br/programacao/django-celery-in-daemon/
My settings are:
/etc/default/celeryd:
# Name of nodes to start, here we have a single node
CELERYD_NODES="w1"
# Where to chdir at start.
CELERYD_CHDIR="/var/www/myproject"
# How to call "manage.py celeryd_multi"
CELERYD_MULTI="$CELERYD_CHDIR/manage.py celeryd_multi"
# How to call "manage.py celeryctl"
CELERYCTL="$CELERYD_CHDIR/manage.py celeryctl"
# Extra arguments to celeryd
CELERYD_OPTS=""
# Name of the celery config module.
CELERY_CONFIG_MODULE="myproject.settings"
# %n will be replaced with the nodename.
CELERYD_LOG_FILE="/var/log/celery/w1.log"
CELERYD_PID_FILE="/var/run/celery/w1.pid"
# Workers should run as an unprivileged user.
#CELERYD_USER="root"
#CELERYD_GROUP="root"
# Name of the projects settings module.
export DJANGO_SETTINGS_MODULE="myproject.settings"
The corresponding folders are created too.
For the '/etc/init.d/celeryd' file, I used this version:
https://raw.github.com/ask/celery/1da3aa43d1e6de525beeda398d0acb8841d5b4d2/contrib/generic-init.d/celeryd
For /var/www/myproject/myproject/settings.py, I have:
import djcelery
djcelery.setup_loader()
BROKER_HOST = "127.0.0.1"
BROKER_PORT = 5672
BROKER_VHOST = "/"
BROKER_USER = "guest"
BROKER_PASSWORD = "guest"
INSTALLED_APPS = (
    'djcelery',
    ...
)
There was no error when I started Celery by using:
/etc/init.d/celeryd start
and no results either. Does someone know how to fix the problem?
Celery's docs have a daemon troubleshooting section that might be helpful. Celery has a flag that lets you run your init script without actually daemonizing, and that should show what's going wrong:
C_FAKEFORK=1 sh -x /etc/init.d/celeryd start
Newer versions of that init script have a dryrun command that's an easier-to-remember way to run the start command without daemonizing.
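Assuming you have one of those newer scripts installed, that would be simply:
/etc/init.d/celeryd dryrun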

Celery: why does my task work ONLY if I run it manually in the shell (manage.py shell)?

>>> from app.tasks import SendSomething
>>> eager_result = SendSomething().apply()
Why does my task work ONLY if I run it manually in the shell (manage.py shell)?
settings.py
from datetime import timedelta

CELERYBEAT_SCHEDULE = {
    'send-something': {
        'task': 'app.tasks.SendSomething',
        'schedule': timedelta(seconds=300),
    },
}
I run:
python manage.py celeryd
and I have:
[Tasks]
. app.tasks.SendSomething
[2013-05-01 18:44:22,895: WARNING/MainProcess] celery@aaa ready.
but it is not working.
celeryd is the worker process. By default it does not schedule the periodic tasks. You can either run with the -B option to run the beat process along with the worker
python manage.py celeryd -B
or you can run an additional celerybeat process
python manage.py celerybeat
See http://celery.readthedocs.org/en/latest/userguide/periodic-tasks.html#starting-the-scheduler

Use Django's call_command to start a celery worker with celerybeat

I'd like to start a Django celery worker from a Python script, with celerybeat. On the command line, I would do:
python manage.py celery worker --beat --schedule celerybeat-schedule.db
I tried this from a script, but it threw an exception:
from django.core.management import call_command
call_command("cerlery", "worker", "--beat", "--schedule", "celerybeat-schedule.db")
I worked around it by doing this:
from djcelery.management.commands import celery

args = ['manage.py', 'celery', 'worker', '--beat', '--schedule',
        'celerybeat-schedule']
command = celery.Command()
command.run_from_argv(args)
But if it's possible to use call_command, I'd like to know how.
I tend to do this:
./manage.py celeryd --event --beat --loglevel=INFO --logfile=./celeryd.log
Then to run the camera:
./manage.py celeryev --camera=djcelery.snapshot.Camera --logfile=./celeryev.log
Hope this helps.