Django, Celery, RabbitMQ - Tasks are not being executed - django

I have set up my Django env like this:
INSTALLED_APPS = (
    ....
    'djcelery',
)
BROKER_URL = "amqp://guest:guest#localhost:5672//"
CELERY_IMPORTS = ('bulksms.tasks', 'premiumsms.tasks', 'reports.tasks')
CELERYBEAT_SCHEDULER = "djcelery.schedulers.DatabaseScheduler"
import djcelery
djcelery.setup_loader()
In the Django admin I add the task I want to run to Periodic tasks (I can see all the tasks listed there), scheduled to run every minute for testing purposes, but the task never runs.
I run Django and then:
python manage.py celeryd -E --loglevel=DEBUG
python manage.py celerycam
In the admin site under Djcelery there is also no option to add Tasks (not sure if it used to be there).

If you want to run periodic tasks with Celery then you need to run the celerybeat process with either
python manage.py celery beat
or run it as a thread inside the worker process with the -B flag
python manage.py celery worker -E --loglevel=DEBUG -B
See the docs on starting the scheduler http://celery.readthedocs.org/en/latest/userguide/periodic-tasks.html#starting-the-scheduler
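For a quick sanity check that beat is actually firing, a code-defined periodic task can be dropped into one of the modules already listed in CELERY_IMPORTS. This is only a sketch using the djcelery-era decorator API; the heartbeat task is an illustrative name, not part of the original setup:
# bulksms/tasks.py (or any module listed in CELERY_IMPORTS)
from celery.task import periodic_task
from celery.schedules import crontab

@periodic_task(run_every=crontab())  # crontab() with no arguments fires every minute
def heartbeat():
    # if this shows up in the worker log, beat is running and the
    # problem is in the admin-defined schedule entry instead
    print("beat heartbeat")
If the heartbeat fires but the admin-defined task does not, check the DatabaseScheduler entry itself (its enabled flag, interval, and task name).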

Related

Call celery tasks from different server

I have a django app + redis on one server and celery on another server. I want to call a celery task from the django app.
My tasks.py on the Celery server:
from celery import Celery

app = Celery('tasks')
app.conf.broker_url = 'redis://localhost:6379/0'

@app.task(bind=True)
def test(self):
    print('Testing')
Calling the celery task from the Django server:
from celery import Celery
celery = Celery()
celery.conf.broker_url = 'redis://localhost:6379/0'
celery.send_task('tasks.test')
I am running the celery worker using this command:
celery -A tasks worker --loglevel=INFO
When I call the celery task from django, it pings the celery server but I get the following error:
Received unregistered task of type 'tasks.test'. The message has been
ignored and discarded.
Did you remember to import the module containing this task? Or maybe
you're using relative imports?
How do I fix this, or is there another way to call the task?
Your task should be a shared task within celery as follows:
tasks.py
from celery import Celery, shared_task

app = Celery('tasks')
app.conf.broker_url = 'redis://localhost:6379/0'

@shared_task(name="tasks.test")  # name must match what send_task() is called with
def test():
    print('Testing')
and start celery as normal:
celery -A tasks worker --loglevel=INFO
Your application can then call your test task:
main.py
from celery import Celery
celery = Celery()
celery.conf.broker_url = 'redis://localhost:6379/0'
celery.send_task('tasks.test')
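One thing worth verifying is that the name passed to send_task matches what the worker actually registered. A quick check from the calling side (a sketch, assuming the same redis broker as above):
from celery import Celery

celery = Celery()
celery.conf.broker_url = 'redis://localhost:6379/0'

# ask running workers which task names they know about;
# the output should include 'tasks.test'
print(celery.control.inspect().registered())
If 'tasks.test' is missing from the output, the worker was started without importing the module that defines the task.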

Celery task to process data from database Postgres and django

I'm trying to use celery and celery beat to run a scheduled task that processes data from the database, but when I try to run the task I get this error: "django.db.utils.OperationalError: FATAL: role "tanaka" does not exist". The code for the scheduled task is shown below.
settings.py
CELERY_BEAT_SCHEDULE = {
    'task-number-one': {
        'task': 'loans.tasks.update_loan_book',
        'schedule': 60,
    },
}
tasks.py
@shared_task
def update_loan_book():
    tenants = Tenant.objects.all()
    for tenant in tenants:
        ...  # logic to update tenant object
The code works when I run the task using the "celery -A proj worker -l info -B" command but does not work when I daemonize celery and celery beat. Config files for celery and celery beat are shown below. I am using supervisord.
[program:projworker]
command=/home/tanaka/microfinance/bin/celery -A cloud_based_microfinance worker -l info
directory=/home/tanaka/Repositories/microfinance_project
user=tanaka
numprocs=1
stdout_logfile=/var/log/celery/proj_worker.log
stderr_logfile=/var/log/celery/proj_worker.log
autostart=true
autorestart=true
startsecs=10
stopwaitsecs = 600
killasgroup=true
priority=998
[program:projbeat]
command=/home/tanaka/microfinance/bin/celery -A cloud_based_microfinance beat -l info
directory=/home/tanaka/Repositories/microfinance_project
user=tanaka
numprocs=1
stdout_logfile=/var/log/celery/proj_beat.log
stderr_logfile=/var/log/celery/proj_beat.log
autostart=true
autorestart=true
startsecs=10
priority=999
When I try to run the task as a daemon I get "django.db.utils.OperationalError: FATAL: role "tanaka" does not exist" in the proj_worker.log file.
Solved the problem: if your task is going to access the database, you first need a db connection.
from django.db import connection
from celery import shared_task

@shared_task
def update_loan_book():
    with connection.cursor() as cursor:
        tenants = Tenant.objects.all()
        for tenant in tenants:
            ...  # logic to update tenant object
Celery and django run as two different processes, so you have to import the db connection from django for celery to access the db. If you are not using the django ORM, this is a helpful resource
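A related pattern, which is my addition rather than part of the original answer: long-lived workers can also hold on to stale database connections, and Django provides django.db.close_old_connections() to discard them so the ORM reconnects on the next query. A sketch (assuming Tenant lives in loans.models, as the task path 'loans.tasks.update_loan_book' suggests):
from django.db import close_old_connections
from celery import shared_task
from loans.models import Tenant  # assumed location, based on the task path above

@shared_task
def update_loan_book():
    # drop stale or broken connections before touching the ORM
    close_old_connections()
    for tenant in Tenant.objects.all():
        ...  # logic to update tenant object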

Django+Celery in Heroku not executing async task

I have Django+Celery in Heroku, and Celery is set up as:
import djcelery
djcelery.setup_loader()
BROKER_URL = "django://" # tell kombu to use the Django database as the message queue
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'
CELERY_RESULT_BACKEND = 'djcelery.backends.database:DatabaseBackend'
CELERY_ALWAYS_EAGER = False
CELERY_TIMEZONE = 'Europe/Madrid'
I have 2 tasks defined in tasks.py, one periodic and another that is executed on asynchronous calls:
from celery import task
from celery.task import periodic_task
from celery.schedules import crontab

@task
def test_one_shot():
    print "One shot"

@periodic_task(run_every=crontab(minute="*/5"))
def test_periodic():
    print "Periodic"
Heroku is configured with a main web worker and an auxiliary worker:
web: gunicorn config.wsgi:application
worker: python manage.py celery worker -B -l info
With this setup, I run the test_one_shot task as follows:
test_one_shot.apply_async(eta=datetime.now()+timedelta(minutes=2))
And although it appears as registered in the heroku logs:
Received task: test.tasks.test_one_shot[f29c609d-b6e8-45d4-808d-2ca690f029af] eta:[2016-08-07 00:09:30.262362+02:00]
It never executes. On the other hand, the periodic task test_periodic is executed as expected. What am I doing wrong?
Thanks!
EDIT: The task was executed but was not appearing in the logs due to a datetime timezone-awareness issue. However, when the task is called programmatically, it is never executed.
I ended up changing the celery broker to RabbitMQ on Heroku following this guide, and the problem got solved.
Basically, I installed RabbitMQ on Heroku:
$ heroku addons:add cloudamqp
And set the new configuration for it:
import djcelery
djcelery.setup_loader()
CELERY_TIMEZONE = 'Europe/Madrid'
BROKER_URL = env("CLOUDAMQP_URL", default="django://")
BROKER_POOL_LIMIT = 1
BROKER_CONNECTION_MAX_RETRIES = None
CELERY_TASK_SERIALIZER = "json"
CELERY_ACCEPT_CONTENT = ["json", "msgpack"]
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'
CELERY_ALWAYS_EAGER = False
if BROKER_URL == "django://":
    INSTALLED_APPS += ("kombu.transport.django",)
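Regarding the timezone-awareness issue mentioned in the question's edit: with CELERY_TIMEZONE set, scheduling with a naive datetime.now() can shift the eta. A hedged fix is to build the eta from Django's timezone utilities instead:
from datetime import timedelta
from django.utils import timezone
from test.tasks import test_one_shot  # module path taken from the log line above

# timezone.now() returns an aware datetime, so the eta is interpreted
# consistently with CELERY_TIMEZONE
test_one_shot.apply_async(eta=timezone.now() + timedelta(minutes=2))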

Celery executing scheduled task a hundred times

I've configured celery in my django app in order to run a task every morning. The task simply sends an email to a group of users. The problem is that the same email is being sent a few hundred times!!
This is my celery config:
BROKER_URL = 'redis://127.0.0.1:6379/0'
BROKER_TRANSPORT = 'redis'
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'
from celery.schedules import crontab
CELERYBEAT_SCHEDULE = {
    'alert_user_is_not_buying-everyday-at-7': {
        'task': 'opti.tasks.alert_users_not_buying',
        'schedule': crontab(hour=7, minute=0),
    },
}
and the task is:
@app.task(bind=True)
def alert_user_is_not_buying(self):
    send_mail_to_users()
And I use these commands to start the worker and beat (I use supervisor for that):
exec celery --app=opti beat --loglevel=INFO
exec celery --app=opti worker --loglevel=INFO
I believe there's no problem with my send_mail_to_users() method. It looks like the emails are sent every 30 seconds...
What is missing?
Your CELERYBEAT_SCHEDULE setting is likely going unused, as you have CELERYBEAT_SCHEDULER set to use the DatabaseScheduler. How is that scheduler configured? I would guess that's where the problem is coming from.
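If the static CELERYBEAT_SCHEDULE is what you actually want beat to use, one possible fix (a sketch, not a confirmed diagnosis) is to drop the DatabaseScheduler line so beat falls back to its default file-based scheduler, which reads CELERYBEAT_SCHEDULE directly:
BROKER_URL = 'redis://127.0.0.1:6379/0'
# CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'  # removed so the default scheduler is used

from celery.schedules import crontab
CELERYBEAT_SCHEDULE = {
    'alert_user_is_not_buying-everyday-at-7': {
        'task': 'opti.tasks.alert_users_not_buying',
        'schedule': crontab(hour=7, minute=0),
    },
}
Also make sure only one beat process is running; two beat instances (for example, one from supervisor and one started by hand) will enqueue every entry twice or more.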

Celery, Django, Heroku -- ImportError: No module named tasks

I am trying to run celery with IronMQ and cache in a Django project on Heroku but I am receiving the following:
2013-04-14T22:29:17.479887+00:00 app[celeryd.1]: ImportError: No module named tasks
What am I doing wrong? The following is my relevant code; djcelery and my app are both in INSTALLED_APPS:
REQUIREMENTS (Rabbit AMQP is in there because I tried that before IronMQ):
Django==1.5.1
amqp==1.0.11
anyjson==0.3.3
billiard==2.7.3.27
boto==2.8.0
celery==3.0.18
dj-database-url==0.2.1
django-celery==3.0.17
django-storages==1.1.8
gunicorn==0.17.2
iron-cache==0.2.0
iron-celery==0.3.1
iron-core==1.0.2
iron-mq==0.4
iso8601==0.1.4
kombu==2.5.10
psycopg2==2.4.6
python-dateutil==2.1
pytz==2013b
requests==1.2.0
six==1.3.0
wsgiref==0.1.2
PROCFILE:
web: gunicorn myapp.wsgi
celeryd: celery -A tasks worker --loglevel=info -E
SETTINGS:
BROKER_URL = 'ironmq://'
CELERY_RESULT_BACKEND = 'ironcache://'
import djcelery
import iron_celery
djcelery.setup_loader()
TASKS:
from celery import task
@task()
def batchAdd(result_length, result_amount):
    ...
VIEWS:
from app import tasks
r = batchAdd.delay(result_length, result_amount)
return HttpResponse(r.task_id)
ALSO TRIED (in VIEWS):
from tasks import batchAdd
r = batchAdd.delay(result_length, result_amount)
return HttpResponse(r.task_id)
AND TRIED THIS AS WELL (in VIEWS):
from app.tasks import batchAdd
r = batchAdd.delay(result_length, result_amount)
return HttpResponse(r.task_id)
Also here is my structure:
projectname
--app
----__init__.py
----__init__.pyc
----admin.py
----admin.pyc
----forms.py
----forms.pyc
----models.py
----models.pyc
----tasks.py
----tests.py
----views.py
----views.pyc
--manage.py
--Procfile
--projectname
----__init__.py
----__init__.pyc
----settings.py
----settings.pyc
----static
----templates
----urls.py
----urls.pyc
----wsgi.py
----wsgi.pyc
--requirements.txt
Have you tried to load celery via manage.py?
python manage.py celery worker --loglevel=info
You can't just run your celery using:
celery -A tasks worker --loglevel=info -E
Celery requires a celeryconfig file when given the -A option. You should run your celery as described in the djcelery docs:
python manage.py celery worker --loglevel=info
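In Heroku terms that means the Procfile entry should go through manage.py as well. A sketch, keeping the process names and project layout shown above:
web: gunicorn projectname.wsgi
celeryd: python manage.py celery worker --loglevel=info -E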
Also, you should fix your views.py as:
from app.tasks import batchAdd