django celery SQS "No result backend is configured."

This is not a duplicate of "Celery: No Result Backend Configured?" because SQS is used here.
I keep getting the following error:
No result backend is configured. Please see the documentation for more
information.
My production settings are the following:
CELERY_BROKER_URL = 'sqs://%s:%s@' % (
    urllib.parse.quote(env.str('TASK_QUEUE_USER_ID'), safe=''),
    urllib.parse.quote(env.str('TASK_QUEUE_USER_SECRET'), safe=''))
BROKER_URL = CELERY_BROKER_URL
CELERY_ENABLE_REMOTE_CONTROL = False
CELERY_RESULT_BACKEND = None # Disabling the results backend
RESULT_BACKEND = None # Disabling the results backend
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_DEFAULT_QUEUE = 'async_tasks'
SQS_QUEUE_NAME = 'async_tasks'
CELERY_ENABLE_REMOTE_CONTROL = False
CELERY_SEND_EVENTS = False
CELERY_BROKER_TRANSPORT_OPTIONS = {
    'region': 'eu-west-2',
    'polling_interval': 3,
    'visibility_timeout': 3600,
}
CELERY_SEND_TASK_ERROR_EMAILS = True
#
# https://stackoverflow.com/questions/8048556/celery-with-amazon-sqs#8567665
#
CELERY_BROKER_TRANSPORT = 'sqs'
BROKER_TRANSPORT = 'sqs'
Running celery from the command line:
DJANGO_ENV=production celery -A async_tasks worker -l info
connects to SQS and polls, but when I try a demo call from the command line with DJANGO_ENV=production python manage.py check_async:
from django.core.management.base import BaseCommand, CommandError

import async_tasks.tasks as tasks


class Command(BaseCommand):
    help = 'Check if infrastructure for async tasks has been set up correctly.'

    def handle(self, *args, **options):
        try:
            print('Sending async request.')
            t = tasks.add.apply_async((2, 4))
            out = t.get(timeout=1)
            print(out)
            print(t.status)
        except Exception as e:
            print(e)
            raise CommandError('Error occurred')
I get the error above. I have tried this on my development machine with Redis and everything works fine.
Any ideas?

You need a Celery result backend configured to be able to store and collect task results. Using Celery with an SQS broker and no result backend is fine for "fire and forget" patterns, but it is not enough if you want to access the results of your tasks through methods like get().
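A minimal sketch of the distinction, reusing the add task from the question's management command (no new names introduced):
# With only a broker (SQS) and no result backend, dispatching works,
# but anything that needs to read the result back does not.
t = tasks.add.apply_async((2, 4))  # fine: the message is queued on SQS ("fire and forget")
print(t.id)                        # fine: the task id is generated client-side
out = t.get(timeout=1)             # raises "No result backend is configured."
So either drop the get()/status calls, or configure a result backend as described in the next answer.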

Maybe this will help someone. The answer above is correct, but if you still want to use Django, SQS and Celery and still see the task results, you can use Django's ORM or cache framework as a result backend via the django-celery-results library.
django-celery-results
Celery documentation - ORM / Cache framework result backends
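A hypothetical settings sketch of that approach, assuming django-celery-results is installed and its migrations have been applied (python manage.py migrate django_celery_results):
INSTALLED_APPS = [
    # ...
    'django_celery_results',
]

CELERY_RESULT_BACKEND = 'django-db'      # store task results through Django's ORM
# CELERY_RESULT_BACKEND = 'django-cache' # or use Django's cache framework instead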

Related

Celery task is pending in the browser but succeeds in the Django python shell

I'm using Django, Celery, and RabbitMQ for simple tasks on Ubuntu, but Celery gives no response.
I can't figure out why the task stays pending when triggered from the browser, while it completes when I run it from the shell (python3 manage.py shell).
Here is my tasks.py file:
from celery import shared_task, task

@shared_task
def createContainer(container_data):
    print(container_data, "create")
    return "created"

@shared_task
def destroyContainer(container_data):
    print(container_data, "destroy")
    return "destroyed"
Here is my views.py file:
def post(self, request):
    if str(request.data["process"]) == "create":
        postdata = {
            "image_name": request.data["image_name"],
            "image_tag": request.data["image_tag"],
            "owner": request.user.id
        }
        # I tried to print the postdata variable before the task and it is working
        createContainer.delay(postdata)
    elif str(request.data["process"]) == "destroy":
        postdata = {
            "cont_id": request.data["cont_id"]
        }
        # I tried to print the postdata variable before the task and it is working
        destroyContainer.delay(postdata)
    # I tried to print anything here, but it was not reachable and never executed
Here is the code I tried in the shell:
>>> from dockerapp.tasks import create_container
>>> create_container.delay("fake data")
<AsyncResult: c37c47f3-6965-4f2e-afcd-01de60f82565>
Also, I can see the Celery logs in another terminal by running celery -A dockerproj worker -l info.
When I use the shell, it logs these lines:
Received task: dockerapp.tasks.create_container[c37c47f3-6965-4f2e-afcd-01de60f82565]
fake data #print
create #print
Task dockerapp.tasks.create_container[c37c47f3-6965-4f2e-afcd-01de60f82565] succeeded in 0.003456833990640007s
but nothing is logged when I trigger the task from the browser with a POST request.
I saw many solutions that add Celery configuration lines to settings.py, and I tried all of these:
CELERY_BROKER_URL = 'amqp://127.0.0.1'
CELERY_TIMEZONE = 'UTC'
CELERY_TRACK_STARTED = True
CELERY_TASK_TRACK_STARTED = True
CELERY_CACHE_BACKEND = 'amqp'
CELERY_TASK_TIME_LIMIT = 30 * 60
CELERY_IGNORE_RESULT = False
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TASK_SERIALIZER = 'json'
CELERY_ACCEPT_CONTENT = ['json']
I even tried these worker commands:
celery -A dockerproj worker -l info -P threads
celery -A dockerproj worker -l info --pool=solo (people said this fixes the issue on Windows, but I tried it anyway)

Django process running tasks instead of Celery

I'm running an app with Celery+redis for asynchronous tasks.
I managed to get Celery to see the list of tasks. However, my tasks aren't executed by Celery workers but by the Django process instead.
I tried invoking the tasks with .delay() and .apply_async() without success (actually in these cases the call to the task gets blocked indefinitely and nothing is shown in the logs).
I might be missing something very basic but cannot see where.
Relevant settings follow:
settings.py
CELERY_REDIS_DB = os.environ.get("REDIS_DB_CELERY", 0)
CELERY_REDIS_HOST = os.getenv("REDIS_HOSTNAME", "redis")
CELERY_REDIS_PORT = 6379
CELERY_RESULT_BACKEND = BROKER_URL = (
    f'redis://{CELERY_REDIS_HOST}:{CELERY_REDIS_PORT}/{CELERY_REDIS_DB}'
)
CELERY_TIMEZONE = TIME_ZONE
CELERY_RESULT_EXPIRES = 5 * 60 * 60
celery.py
import os

from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'maat.settings')

app = Celery('maat')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
tasks.py
import logging

from celery import shared_task

LOGGER = logging.getLogger(__name__)


@shared_task
def remove_task(
    environment=None,
    payload=None,
    **kwargs,
):
    LOGGER.info('Received task, now removing')
    ...
views.py
class MyClass(
    LoginRequiredMixin,
    PermissionRequiredMixin,
    SingleObjectMixin,
    APIView
):
    return_403 = True
    model = models.Environment
    slug_field = 'name'
    slug_url_kwarg = 'environment'

    def delete(self, request, environment):
        tasks.remove_task(environment=environment, payload=request.data)
Django 3.2
celery[redis] 5.1.2
redis 3.5.3
django-celery-results 2.0.1
Assuming celery.py lives somewhere that is accessible to views.py, the problem is likely that, because you are using a shared task, Django does not know how to communicate with the Celery broker without more information.
To fix that, we can follow the advice from our helpful friends at Stack Overflow and do something like the following in your view:
from path.to.celery import app
app.set_default()
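A fuller, hypothetical sketch of where those two lines could go in views.py, assuming the Celery app from celery.py above is importable as maat.celery, and keeping the .delay() call the question says was attempted:
from maat.celery import app
from myapp import tasks  # module holding remove_task; import path assumed

app.set_default()  # make this the default Celery app so @shared_task binds to it


def delete_environment(request, environment):
    # Dispatch to a Celery worker instead of running the task in the Django process.
    tasks.remove_task.delay(environment=environment, payload=request.data)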

Django+Celery in Heroku not executing async task

I have Django+Celery in Heroku, and Celery is set up as:
import djcelery
djcelery.setup_loader()
BROKER_URL = "django://" # tell kombu to use the Django database as the message queue
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'
CELERY_RESULT_BACKEND = 'djcelery.backends.database:DatabaseBackend'
CELERY_ALWAYS_EAGER = False
CELERY_TIMEZONE = 'Europe/Madrid'
I have 2 tasks defined in tasks.py, one periodic and another that is executed on asynchronous calls:
@task
def test_one_shot():
    print "One shot"


@periodic_task(run_every=crontab(minute="*/5"))
def test_periodic():
    print "Periodic"
Heroku is configured with a main web process and an auxiliary worker:
web: gunicorn config.wsgi:application ON
worker: python manage.py celery worker -B -l info ON
With this setup, I run the test_one_shot task as follows:
test_one_shot.apply_async(eta=datetime.now()+timedelta(minutes=2))
And although it appears as received in the Heroku logs:
Received task: test.tasks.test_one_shot[f29c609d-b6e8-45d4-808d-2ca690f029af] eta:[2016-08-07 00:09:30.262362+02:00]
It never executes. On the other hand, the periodic task test_periodic is executed as expected. What am I doing wrong?
Thanks!
EDIT: The task was executed but was not appearing in the logs due to a datetime timezone-awareness issue. However, when the task is called programmatically, it is never executed.
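For reference, a quick sketch of what a timezone-aware eta looks like (hypothetical; the task path is taken from the log line above, and django.utils.timezone is assumed to be available):
from datetime import timedelta
from django.utils import timezone
from test.tasks import test_one_shot

# timezone.now() returns an aware datetime, so the eta is interpreted consistently
# with CELERY_TIMEZONE instead of as a naive local time.
test_one_shot.apply_async(eta=timezone.now() + timedelta(minutes=2))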
I ended up changing the Celery broker to RabbitMQ on Heroku following this guide, and the problem got solved.
Basically, I installed RabbitMQ on Heroku:
$ heroku addons:add cloudamqp
And set the new configuration for it:
import djcelery
djcelery.setup_loader()
CELERY_TIMEZONE = 'Europe/Madrid'
BROKER_URL = env("CLOUDAMQP_URL", default="django://")
BROKER_POOL_LIMIT = 1
BROKER_CONNECTION_MAX_RETRIES = None
CELERY_TASK_SERIALIZER = "json"
CELERY_ACCEPT_CONTENT = ["json", "msgpack"]
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'
CELERY_ALWAYS_EAGER = False
if BROKER_URL == "django://":
    INSTALLED_APPS += ("kombu.transport.django",)

celery + django - how to write task state to database

I'm running Celery with Django and RabbitMQ and want to see the task states in the database table. Unfortunately no entries are written into the table djcelery_taskstate and I can't figure out why.
My settings:
CELERY_ENABLE_UTC = True
BROKER_URL = "amqp://guest:guest#localhost:5672/"
CELERY_RESULT_BACKEND = "database"
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'
CELERY_TRACK_STARTED = True
CELERY_SEND_EVENTS = True
CELERY_IMPORTS = ("project_management.tasks", "accounting.tasks", "time_tracking.tasks", )
CELERY_ALWAYS_EAGER = False
import djcelery
djcelery.setup_loader()
My Task:
from celery import Task, states


class TestTask(Task):
    def run(self, po_id):
        self.update_state(state=states.STARTED, meta={'total': 0, 'done': False})
        # do something..
        self.update_state(state=states.SUCCESS, meta={'total': 100, 'done': True})
I'm starting the task as follows in a view:
TestTask.apply_async(args=[], kwargs={})
I'm starting celery workers as follows.
python manage.py celeryd -v 1 -B -s celery -E -l INFO
Console gives me the following output:
[2013-05-19 11:10:03,774: INFO/MainProcess] Task accounting.tasks.TestTask[5463b2ed-0eba-451d-b828-7a89fcd36348] succeeded in 0.0538640022278s: None
Any idea what is wrong with my setup?
You need to start up the snapshot camera as well in order to see the results in the database.
python manage.py celerycam
Once you have that running, you will be able to see entries in the djcelery tables.
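Putting the two together, you end up with two processes: the worker publishing events (the -E flag, or CELERY_SEND_EVENTS = True, which the question already has) and the camera persisting them. The --frequency option below is assumed from django-celery's monitoring docs and is optional:
# worker with events enabled, as in the question
python manage.py celeryd -v 1 -B -s celery -E -l INFO
# snapshot camera writing task state into the djcelery tables
python manage.py celerycam --frequency=10.0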

Using MongoDB as message queue for Celery

I'm trying to use MongoDB as the message queue for Celery (in a Django app). The current development version of Celery (2.2.0rc2) is supposed to let you do this, but I can't seem to get any workers to pick up tasks I'm creating.
Versions:
celery v2.2.0rc3
mongodb 1.6.5
pymongo 1.9
django-celery 2.2.0rc2
In my settings, I have:
CELERY_RESULT_BACKEND = "mongodb"
CELERY_MONGODB_BACKEND_SETTINGS = {
    # Shouldn't need these - defaults are correct.
    "host": "localhost",
    "port": 27017,
    "database": "celery",
    "taskmeta_collection": "messages",
}
BROKER_BACKEND = 'mongodb'
BROKER_HOST = "localhost"
BROKER_PORT = 27017
BROKER_USER = ""
BROKER_PASSWORD = ""
BROKER_VHOST = ""
import djcelery
djcelery.setup_loader()
I've created a test tasks.py file as follows:
from celery.decorators import task
@task()
def add(x, y):
    return x + y
If I fire up celeryd in the background, it appears to start normally. I then open a python shell and run the following:
>>> from myapp.tasks import add
>>> result = add.delay(5,5)
>>> result
<AsyncResult: 7174368d-288b-4abe-a6d7-aeba987fa886>
>>> result.ready()
False
Problem is that no workers ever pick up the tasks. Am I missing a setting or something? How do I point celery to the message queue?
We had this same issue. While the docs say all tasks should be registered in Celery by calling
import djcelery
djcelery.setup_loader()
it wasn't working properly. So, we still used the
CELERY_IMPORTS = ('YOUR_APP.tasks',)
setting in settings.py. Also, make sure you restart Celery if you add a new task because Celery has to register the tasks when it first starts.
Django, Celerybeat and Celery with MongoDB as the Broker
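A sketch of that workaround applied to this question's layout, where the test task lives in myapp/tasks.py (app name taken from the shell example above):
import djcelery
djcelery.setup_loader()

# Register the task module explicitly so workers can find it.
CELERY_IMPORTS = ('myapp.tasks',)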
Remember that Kombu works only with MongoDB 1.3+ because it needs the findAndModify command.
If you are on Ubuntu, the latest version in the repositories is 1.2, which doesn't work.
You may also have to set
BROKER_VHOST = "dbname"
Keep me posted if it works
Be sure to add this to your settings, or the workers can't find the task and will fail silently.
CELERY_IMPORTS = ("namespace", )
I had the same issue, but when I upgraded to Celery 2.3.3 everything worked like a charm.