Django+Celery in Heroku not executing async task

I have Django+Celery in Heroku, and Celery is set up as:
import djcelery
djcelery.setup_loader()
BROKER_URL = "django://" # tell kombu to use the Django database as the message queue
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'
CELERY_RESULT_BACKEND = 'djcelery.backends.database:DatabaseBackend'
CELERY_ALWAYS_EAGER = False
CELERY_TIMEZONE = 'Europe/Madrid'
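(One thing not shown above: for the django:// transport, djcelery and kombu's Django transport presumably also need to be listed in INSTALLED_APPS, roughly:)
INSTALLED_APPS += ("djcelery", "kombu.transport.django")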
I have 2 tasks defined in tasks.py, one periodic and another that is executed on asynchronous calls:
from celery.task import task, periodic_task
from celery.schedules import crontab

@task
def test_one_shot():
    print "One shot"

@periodic_task(run_every=crontab(minute="*/5"))
def test_periodic():
    print "Periodic"
Heroku is configured with a main web worker and an auxiliary worker:
web: gunicorn config.wsgi:application
worker: python manage.py celery worker -B -l info
With this setup, I run the test_one_shot task as follows:
test_one_shot.apply_async(eta=datetime.now()+timedelta(minutes=2))
And although it appears as received in the Heroku logs:
Received task: test.tasks.test_one_shot[f29c609d-b6e8-45d4-808d-2ca690f029af] eta:[2016-08-07 00:09:30.262362+02:00]
It never executes. On the other hand, the periodic task test_periodic is executed as expected. What am I doing wrong?
Thanks!
EDIT: The task execution was not appearing in the logs due to a datetime timezone-awareness issue. However, when the task is called programmatically, it is still never executed.
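For reference, a minimal sketch of dispatching with a timezone-aware eta (this uses django.utils.timezone, which is an assumption, not something from the original question):
from datetime import timedelta
from django.utils import timezone

# an aware "now" keeps the eta consistent with CELERY_TIMEZONE / USE_TZ
test_one_shot.apply_async(eta=timezone.now() + timedelta(minutes=2))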

I ended up changing the Celery broker to RabbitMQ on Heroku following this guide, and the problem was solved.
Basically, I installed RabbitMQ on Heroku:
$ heroku addons:add cloudamqp
And set the new configuration for it:
import djcelery
djcelery.setup_loader()
CELERY_TIMEZONE = 'Europe/Madrid'
BROKER_URL = env("CLOUDAMQP_URL", default="django://")
BROKER_POOL_LIMIT = 1
BROKER_CONNECTION_MAX_RETRIES = None
CELERY_TASK_SERIALIZER = "json"
CELERY_ACCEPT_CONTENT = ["json", "msgpack"]
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'
CELERY_ALWAYS_EAGER = False
if BROKER_URL == "django://":
    INSTALLED_APPS += ("kombu.transport.django",)

Related

Celery task is pending in the browser but succeeded in the python shell of Django

I'm using Django, Celery, and RabbitMQ for simple tasks on Ubuntu, but Celery gives no response.
I can't figure out why the task stays pending when triggered from the browser, while it completes when I use the shell (python3 manage.py shell).
Here is my tasks.py file:
from celery import shared_task, task
@shared_task
def createContainer(container_data):
    print(container_data, "create")
    return "created"

@shared_task
def destroyContainer(container_data):
    print(container_data, "destroy")
    return "destroyed"
Here is my views.py file:
def post(self, request):
    if str(request.data["process"]) == "create":
        postdata = {
            "image_name": request.data["image_name"],
            "image_tag": request.data["image_tag"],
            "owner": request.user.id
        }
        # I tried to print the postdata variable before the task and it is working
        createContainer.delay(postdata)
    elif str(request.data["process"]) == "destroy":
        postdata = {
            "cont_id": request.data["cont_id"]
        }
        # I tried to print the postdata variable before the task and it is working
        destroyContainer.delay(postdata)
    # I tried to print anything here, but it was not reachable and never executed
Here is the code I tried in the shell:
>>> from dockerapp.tasks import create_container
>>> create_container.delay("fake data")
>>> <AsyncResult: c37c47f3-6965-4f2e-afcd-01de60f82565>
Also, I can see the logs of celery here in another terminal by executing celery -A dockerproj worker -l info
It results in these lines when I used the shell:
Received task: dockerapp.tasks.create_container[c37c47f3-6965-4f2e-afcd-01de60f82565]
fake data #print
create #print
Task dockerapp.tasks.create_container[c37c47f3-6965-4f2e-afcd-01de60f82565] succeeded in 0.003456833990640007s
but it shows nothing when the task is triggered from the browser via a POST request.
I saw many suggested solutions that add some Celery configuration lines to settings.py.
I tried all of these lines:
CELERY_BROKER_URL = 'amqp://127.0.0.1'
CELERY_TIMEZONE = 'UTC'
CELERY_TRACK_STARTED = True
CELERY_TASK_TRACK_STARTED = True
CELERY_CACHE_BACKEND = 'amqp'
CELERY_TASK_TIME_LIMIT = 30 * 60
CELERY_IGNORE_RESULT = False
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TASK_SERIALIZER = 'json'
CELERY_ACCEPT_CONTENT = ['json']
I even tried these variations of the worker command:
celery -A dockerproj worker -l info -P threads
celery -A dockerproj worker -l info --pool=solo (people said it fixed the issue on Windows, but I tried it anyway)
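For what it's worth, the CELERY_* settings above only take effect if the Celery app is configured to read them; a minimal sketch of the usual dockerproj/celery.py wiring (the module layout here is an assumption, not taken from the question):
# dockerproj/celery.py -- hypothetical layout, adapt the project name as needed
import os
from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "dockerproj.settings")

app = Celery("dockerproj")
# the CELERY namespace means settings such as CELERY_BROKER_URL are picked up
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()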

Celery executing scheduled task a hundred times

I've configured celery in my django app in order to run a task every morning. The task simply sends an email to a group of users. The problem is that the same email is being sent a few hundred times!!
This is my celery config:
BROKER_URL = 'redis://127.0.0.1:6379/0'
BROKER_TRANSPORT = 'redis'
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'
from celery.schedules import crontab
CELERYBEAT_SCHEDULE = {
    'alert_user_is_not_buying-everyday-at-7': {
        'task': 'opti.tasks.alert_users_not_buying',
        'schedule': crontab(hour=7, minute=0),
    },
}
and the task is:
@app.task(bind=True)
def alert_user_is_not_buying(self):
    send_mail_to_users()
And I use these commands to start the worker and beat (via supervisor):
exec celery --app=opti beat --loglevel=INFO
exec celery --app=opti worker --loglevel=INFO
I believe there's no problem with my send_mail_to_users() method; it looks like the emails are sent every 30 seconds...
What is missing?
Your CELERYBEAT_SCHEDULE setting is likely going unused, as you have CELERYBEAT_SCHEDULER set to use the DatabaseScheduler. How is that scheduler configured? I would guess that's where the problem is coming from.
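A hedged illustration: with DatabaseScheduler, beat reads the schedule from djcelery's tables, so a stray or misconfigured entry there (for example one with a 30-second interval) would run independently of CELERYBEAT_SCHEDULE. The intended crontab entry could be created explicitly along these lines (the model names are djcelery's; the task path mirrors the question's settings):
from djcelery.models import CrontabSchedule, PeriodicTask

# 07:00 every day, matching the schedule intended in settings.py
schedule, _ = CrontabSchedule.objects.get_or_create(minute='0', hour='7')
PeriodicTask.objects.get_or_create(
    name='alert_user_is_not_buying-everyday-at-7',
    defaults={'task': 'opti.tasks.alert_users_not_buying', 'crontab': schedule},
)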

celery + django - how to write task state to database

I'm running Celery with Django and RabbitMQ and want to see the task states in the database table. Unfortunately no entries are written into the table djcelery_taskstate and I can't figure out why.
My settings:
CELERY_ENABLE_UTC = True
BROKER_URL = "amqp://guest:guest#localhost:5672/"
CELERY_RESULT_BACKEND = "database"
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'
CELERY_TRACK_STARTED = True
CELERY_SEND_EVENTS = True
CELERY_IMPORTS = ("project_management.tasks", "accounting.tasks", "time_tracking.tasks", )
CELERY_ALWAYS_EAGER = False
import djcelery
djcelery.setup_loader()
My Task:
class TestTask(Task):
    def run(self, po_id):
        self.update_state(state=states.STARTED, meta={'total': 0, 'done': False})
        # do something..
        self.update_state(state=states.SUCCESS, meta={'total': 100, 'done': True})
I'm starting the task as follows in a view:
TestTask.apply_async(args=[], kwargs={})
I'm starting celery workers as follows.
python manage.py celeryd -v 1 -B -s celery -E -l INFO
Console gives me the following output:
[2013-05-19 11:10:03,774: INFO/MainProcess] Task accounting.tasks.TestTask[5463b2ed-0eba-451d-b828-7a89fcd36348] succeeded in 0.0538640022278s: None
Any idea what is wrong with my setup?
You need to start up the snapshot camera as well in order to see the results in the database.
python manage.py celerycam
Once you have that running, you will be able to see entries in the djcelery tables.
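Once celerycam is capturing events, the states should also be visible directly via the ORM; a quick hedged check:
from djcelery.models import TaskState

# most recent task snapshots written by celerycam into djcelery_taskstate
print(TaskState.objects.order_by('-tstamp')[:10])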

Django, Celery, RabbitMQ - Tasks are not being executed

I have set up my Django env like this:
INSTALLED_APPS = (
    ....
    'djcelery',
)
BROKER_URL = "amqp://guest:guest@localhost:5672//"
CELERY_IMPORTS = ('bulksms.tasks', 'premiumsms.tasks', 'reports.tasks')
CELERYBEAT_SCHEDULER = "djcelery.schedulers.DatabaseScheduler"
import djcelery
djcelery.setup_loader()
In the Django admin I add the task I want to Periodic tasks (I can see all the tasks there), set to run every minute for testing purposes, but the task never runs.
I run Django, then:
python manage.py celeryd -E --loglevel=DEBUG
python manage.py celerycam
In the admin site under Djcelery there is also no option to add Tasks (not sure if it used to be there).
If you want to run periodic tasks with Celery then you need to run the celerybeat process with either
python manage.py celery beat
or using another thread in the worker process
python manage.py celery worker -E --loglevel=DEBUG -B
See the docs on starting the scheduler http://celery.readthedocs.org/en/latest/userguide/periodic-tasks.html#starting-the-scheduler
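If the periodic task still doesn't fire, it may be worth confirming the worker has actually registered it; with django-celery a quick check should be (output format varies by version):
python manage.py celery inspect registered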

Using MongoDB as message queue for Celery

I'm trying to use MongoDB as the message queue for Celery (in a Django app). The current development version of Celery (2.2.0rc2) is supposed to let you do this, but I can't seem to get any workers to pick up tasks I'm creating.
Versions:
celery v2.2.0rc3
mongodb 1.6.5
pymongo 1.9
django-celery 2.2.0rc2
In my settings, I have:
CELERY_RESULT_BACKEND = "mongodb"
CELERY_MONGODB_BACKEND_SETTINGS = {
# Shouldn't need these - defaults are correct.
"host": "localhost",
"port": 27017,
"database": "celery",
"taskmeta_collection": "messages",
}
BROKER_BACKEND = 'mongodb'
BROKER_HOST = "localhost"
BROKER_PORT = 27017
BROKER_USER = ""
BROKER_PASSWORD = ""
BROKER_VHOST = ""
import djcelery
djcelery.setup_loader()
I've created a test tasks.py file as follows:
from celery.decorators import task
@task()
def add(x, y):
    return x + y
If I fire up celeryd in the background, it appears to start normally. I then open a python shell and run the following:
>>> from myapp.tasks import add
>>> result = add.delay(5,5)
>>> result
<AsyncResult: 7174368d-288b-4abe-a6d7-aeba987fa886>
>>> result.ready()
False
The problem is that no workers ever pick up the tasks. Am I missing a setting or something? How do I point Celery at the message queue?
We had this same issue. While the docs say all tasks should be registered in Celery by calling
import djcelery
djcelery.setup_loader()
it wasn't working properly. So we still used the
CELERY_IMPORTS = ('YOUR_APP.tasks',)
setting in settings.py. Also, make sure you restart Celery if you add a new task, because Celery has to register the tasks when it first starts.
See also: Django, Celerybeat and Celery with MongoDB as the Broker
Remember that Kombu only works with MongoDB 1.3+ because it needs the findAndModify functionality.
If you are on Ubuntu, the latest version in the repository is 1.2, which doesn't work.
You may also have to set
BROKER_VHOST = "dbname"
Keep me posted if it works.
Be sure to add this to your settings, or the workers can't find the task and will fail silently.
CELERY_IMPORTS = ("namespace", )
I had the same issue but when I upgraded to celery 2.3.3 everything worked like a charm.