Celery task remains 'pending' - django

I configured a Celery instance like this:

from __future__ import absolute_import, unicode_literals
import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project.settings')

app = Celery(
    'project',
    backend='rpc://',
    broker='pyamqp://',
    result_backend='rpc://',
)
app.autodiscover_tasks()

@app.task(bind=True)
def debug_task(self):
    print(f'Request: {self.request!r}')
I have a task:

import time

from celery import Celery

app = Celery('spotifycluster', broker='pyamqp://guest@localhost//')

@app.task
def some_task(X):
    time.sleep(2)
    return sum(X)
When I call the task and check its state, it's always 'PENDING':

from celery.result import AsyncResult

task = some_task.delay(features)
task_id = task.task_id
state = AsyncResult(id=task_id).state
But the terminal shows:
[2021-06-30 16:11:00,072: INFO/MainProcess] Task spotify.tasks.AffinityPropagation_task[09a1812b-1044-480a-a0fb-49be2e5cdc94] received
[2021-06-30 16:11:02,077: INFO/ForkPoolWorker-2] Task spotify.tasks.AffinityPropagation_task[09a1812b-1044-480a-a0fb-49be2e5cdc94] succeeded in 2.002569585999993s: 4
Which is confusing to me. I read other questions, but those were mostly related to a bug on Windows, and I'm running on a Mac. What am I missing here? Suggestions are much appreciated.

Try changing to another broker. In my case nothing helped until I moved entirely from RabbitMQ to Redis. The settings below are for Django running on Heroku.
CELERY_RESULT_BACKEND = os.environ.get('REDIS_URL', 'redis://localhost:6379/0')
CELERY_BROKER_URL = os.environ.get('REDIS_URL', 'redis://localhost:6379/0')
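
If you do switch to the Redis backend, a minimal sketch of how the state check from the question would then behave (features, some_task and the timeout value are carried over from the question; the key point is querying through the app that knows about the result backend):

from celery.result import AsyncResult

result = some_task.delay(features)
# query through the same app the task is bound to
state = AsyncResult(result.task_id, app=some_task.app).state
# or simply block until the worker stores the result
value = result.get(timeout=10)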

Related

Celery beat sends scheduled task only once and then doesn't send task when restarted

I am using Celery beat to perform a task that is supposed to be executed at a specific time. To test it, I tried changing the scheduled time to just a few minutes from now. What I have noticed is that beat sends the task correctly when I run a fresh command, i.e. celery -A jgs beat -l INFO, but if I then change the time in the schedule section to two or three minutes from now and run the command again, beat does not send the task. Then I noticed something strange: if I go to the admin area, delete all the old tasks that were created in the crontab table, and run the command again, it sends the task to the worker again.
The tasks are received and traced by the worker correctly, and the Celery worker itself is working fine. Below is the code I wrote to perform the task.
celery.py
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from celery.schedules import crontab

# Set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'jgs.settings')

app = Celery('jgs')

app.conf.enable_utc = False
app.conf.update(timezone='Asia/Kolkata')
# app.conf.update(BROKER_URL=os.environ['REDIS_URL'],
#                 CELERY_RESULT_BACKEND=os.environ['REDIS_URL'])

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Celery beat settings
app.conf.beat_schedule = {
    'send-expiry-email-everyday': {
        'task': 'control.tasks.send_expiry_mail',
        'schedule': crontab(hour=1, minute=5),
    }
}

# Load task modules from all registered Django apps.
app.autodiscover_tasks()

@app.task(bind=True)
def debug_task(self):
    print(f'Request: {self.request!r}')
control/tasks.py
from celery import shared_task
from django.conf import settings
from django.core.mail import EmailMessage
from django.template.loader import render_to_string

@shared_task
def send_expiry_mail():
    template = render_to_string('expiry_email.html')
    email = EmailMessage(
        'Registration Successful',   # subject
        template,                    # body
        settings.EMAIL_HOST_USER,    # from email
        ['emaiid@gmail.com'],        # recipient list
    )
    email.fail_silently = False
    email.content_subtype = 'html'  # without this the HTML gets rendered as plain text
    email.send()
    return "Done"
settings.py
############# CELERY SETTINGS #######################
CELERY_BROKER_URL = 'redis://127.0.0.1:6379'
# CELERY_BROKER_URL = os.environ['REDIS_URL']
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TASK_SERIALIZER = 'json'
CELERY_TIMEZONE = 'Asia/Kolkata'
CELERY_RESULT_BACKEND = 'django-db'
# CELERY BEAT CONFIGURATIONS
CELERY_BEAT_SCHEDULER = 'django_celery_beat.schedulers:DatabaseScheduler'
Commands that I am using:

for the worker:
celery -A jgs.celery worker --pool=solo -l info

for beat:
celery -A jgs beat -l INFO
Please correct me where I am going wrong or what I am writing wrong; I am completely in the beginner phase of this async part.
I am really sorry if my sentences were confusing above.
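
One possible explanation, offered as a guess: with CELERY_BEAT_SCHEDULER pointing at django_celery_beat's DatabaseScheduler, the schedule is read from PeriodicTask rows in the database rather than re-read from the beat_schedule dict on every run, which would fit the observation that deleting the old rows in the admin makes beat send the task again. A sketch of updating the row programmatically instead of deleting it by hand (the task path and schedule mirror the question's own):

# a sketch using django_celery_beat's documented models
from django_celery_beat.models import CrontabSchedule, PeriodicTask

schedule, _ = CrontabSchedule.objects.get_or_create(
    hour='1', minute='5', timezone='Asia/Kolkata',
)
PeriodicTask.objects.update_or_create(
    name='send-expiry-email-everyday',
    defaults={'crontab': schedule, 'task': 'control.tasks.send_expiry_mail'},
)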

Stop Celery caching Django template

I have Django 3.2.4, Celery 4.4.7 and django-celery-beat 2.2.0 set up via the RabbitMQ broker.
I have a Celery task that renders a Django template and then sends it to a number of email recipients.
The Django template is dynamic, in as much as changes can be made to its content at any time, which in turn rewrites the template file. The trouble is that, on occasion, I have to restart Celery to get it to re-read the template.
My question is: is there any way of forcing Celery to re-read the template file, without requiring a full Celery restart?
celery.py
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'backoffice.settings')

app = Celery('backoffice')
default_config = 'backoffice.celery_config'
app.config_from_object(default_config)
app.autodiscover_tasks()

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
celery_config.py
from django.conf import settings

broker_url = "amqp://someusername:somepassword@webserver:5672/backoffice"
worker_send_task_events = False
task_ignore_result = True
task_time_limit = 60
task_soft_time_limit = 50
task_acks_late = True
worker_prefetch_multiplier = 10
worker_cancel_long_running_tasks_on_connection_loss = True
celery command
celery -A backoffice worker -B -l info --without-heartbeat --without-gossip --without-mingle
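
One plausible cause, offered as a guess rather than a confirmed diagnosis: with DEBUG off, Django wraps the template loaders in its cached loader by default, so a long-running worker process keeps serving the copy it loaded first. A sketch of opting out of the cached loader in settings.py (Django requires APP_DIRS to be omitted when 'loaders' is set explicitly):

# sketch: list plain loaders so each render re-reads the template file
TEMPLATES = [
    {
        'BACKEND': 'django.template.backends.django.DjangoTemplates',
        'DIRS': [],  # project template dirs go here
        'OPTIONS': {
            'loaders': [
                'django.template.loaders.filesystem.Loader',
                'django.template.loaders.app_directories.Loader',
            ],
        },
    },
]

Alternatively, the worker_max_tasks_per_child setting recycles worker processes after a set number of tasks, which would also pick up template changes without a manual restart, at the cost of periodic process restarts.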

celery .delay freezes for this task but runs for others

I am trying to send notifications using Celery.

@shared_task(name='send_notifis')
def send_notifs(device_ids, title, message):
    from pills_reminder.models import UserNotifications, UserDevice
    devices = UserDevice.objects.filter(id__in=device_ids)
    print(devices)
    device_tokens = []
    for device in devices:
        UserNotifications.objects.create(
            uid=device.device_id,
            title=title,
            message=message,
        )
        print(UserNotifications)
        device_tokens.append(device.registration_token)
    # push_service is defined elsewhere at module level
    if len(device_tokens) > 1:
        device_tokens = ["".join(token.split()) for token in device_tokens]
        response = push_service.notify_multiple_devices(registration_ids=device_tokens,
                                                        message_title=title,
                                                        message_body=message)
    elif len(device_tokens) == 1:
        registration_id = "".join(device_tokens[0].split())
        response = push_service.notify_single_device(registration_id=registration_id,
                                                     message_title=title,
                                                     message_body=message)
    else:
        response = None  # avoid a NameError below when there are no devices
    print(response)
    return True
This works without .delay(), and when running it via python manage.py shell:
>>> send_notifs.delay(devices, title='title', message='message')
<AsyncResult: f54188f8-cec6-42dd-a840-b097abffd7f4>
but it freezes when I call it from a Django model post_save signal:

@receiver(post_save, sender=Notification)
def Notification_post_save_handler(sender, instance, **kwargs):
    print('hello from post_save signal')
    devices = instance.get_devices()
    # send_notifs(devices)
    if len(devices) > 0:
        send_notifs.delay(devices,
                          title=instance.title,
                          message=instance.message)

The code above freezes execution, but without .delay() it works fine.
UPDATE 1:
The above task with .delay() runs from python manage.py shell but not from runserver, so the problem is with the Celery and Django settings. Hence I dug deeper and found the following.
While running from the shell I get:

>>> add.app.conf  # (add is an async task here)
{'broker_url': 'redis://localhost:6379/1'}, ...

but running from runserver gives:

{'broker_url': None}

Now I am looking for how to set the settings properly. I am using django-configurations, with celery.py as:
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'core.settings')
os.environ.setdefault('DJANGO_CONFIGURATION', 'Development')

import configurations
configurations.setup()

app = Celery('core')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
Any help is appreciated. Thank you for your time and patience.
Use this:

from configurations import importer
importer.install()

in place of:

import configurations
configurations.setup()
Turns out the setup specified in the django-configurations docs for including Celery was causing the problem. I modified celery.py as below, and now it is working fine.
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'core.settings')
os.environ.setdefault('DJANGO_CONFIGURATION', 'Development')

from configurations import importer
importer.install()
# import configurations
# configurations.setup()

app = Celery('core')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
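
A quick way to verify a fix like this, sketched against the question's own module layout:

# before the fix, app.conf under runserver reported broker_url None;
# after it, shell and runserver should report the same broker URL
from core.celery import app
print(app.conf.broker_url)  # expect 'redis://localhost:6379/1'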

Celery doesn't run task

Help me, please, to understand what I am doing wrong. Celery doesn't run my task.
settings.py
CELERY_BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = TIME_ZONE
proj/celery.py
from __future__ import absolute_import
import os
from celery import Celery
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')
app = Celery('proj')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()
__init__.py

from __future__ import absolute_import, unicode_literals

from .celery import app as celery_app

__all__ = ['celery_app']
Code
@shared_task
def generate(instance, sender, **kwargs):
    # CK_PROGRAM_NAME, Program and foodprogram_generator come from elsewhere in the project
    for i in CK_PROGRAM_NAME:
        program_kf = i[0]
        ck = instance.dk * program_kf
        program_name = i[1]
        program_obj = Program.objects.get(name=program_name)
        foodprogram_generator(instance, ck, program_kf, program_obj, sender, **kwargs)
    return

@receiver(post_save, sender=LeadUser)
def leaduser_foodprogram_post_save(instance, sender, **kwargs):
    generate.delay(instance, sender, **kwargs)
    return
Worker is run by: celery -A proj worker --loglevel=INFO
The logic is: after a client object is created, the post_save signal triggers leaduser_foodprogram_post_save, which adds generate() to the queue.
I can't see a result, so I think it is not run.
Without Celery everything works properly.
Thanks for your answers!
A couple of things:

* config_from_object with a namespace strips that prefix from the variable names, so you might not be getting the configuration you expect;
* when you use shared_task you need to make sure you are calling the task from the configured Celery app, as the main point of shared_task is to share tasks between different apps. Have a look at the set_default function on the Celery app object; just calling that during the Celery setup should make a difference.

Anyway, the best way to check is to put rdb (the Celery remote debugger) in there and inspect the Celery app: check the configuration, and if the broker is not set then the second point above should get you going.
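
To make the second point concrete, a small sketch of where a set_default call would go, following the celery.py layout already shown in the question:

app = Celery('proj')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.set_default()  # make this app the default so @shared_task binds to it
app.autodiscover_tasks()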
Thanks guys for your answers. There was no specific problem, but I rechecked everything and, following an article I found, got my task running.

django celery rabbitmq issue: "WARNING/MainProcess] Received and deleted unknown message. Wrong destination"

My settings.py
CELERY_ACCEPT_CONTENT = ['json', 'msgpack', 'yaml', 'pickle', 'application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_RESULT_BACKEND = 'djcelery.backends.cache:CacheBackend'
celery.py code
from __future__ import absolute_import
import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'webapp.settings')

from django.conf import settings

app = Celery('webapp')
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
tasks.py code
from __future__ import absolute_import
import datetime

from celery import shared_task
from celery.utils.log import get_task_logger

logger = get_task_logger(__name__)

@shared_task
def sample_code():
    logger.info("Run time: " + str(datetime.datetime.now().strftime("%Y-%m-%d %H:%M")))
    return None
In the shell I am importing the task and running it as "sample_code.delay()".
Full error stack:
[2016-02-12 00:28:56,331: WARNING/MainProcess] Received and deleted unknown message. Wrong destination?!?
The full contents of the message body was: body: '\x80\x02}q\x01(U\x07expiresq\x02NU\x03utcq\x03\x88U\x04argsq\x04]q\x05U\x05chordq\x06NU\tcallbacksq\x07NU\x08errbacksq\x08NU\x07tasksetq\tNU\x02idq\nU$f02e662e-4eda-4180-9af4-2c8a1ceb57c4q\x0bU\x07retriesq\x0cK\x00U\x04taskq\rU$app.tasks.sample_codeq\x0eU\ttimelimitq\x0fNN\x86U\x03etaq\x10NU\x06kwargsq\x11}q\x12u.' (232b)
{content_type:u'application/x-python-serialize' content_encoding:u'binary'
delivery_info:{'consumer_tag': u'None4', 'redelivered': False, 'routing_key': u'celery', 'delivery_tag': 8, 'exchange': u'celery'} headers={}}
Please let me know where I am going wrong.

The way it was solved for me was a change in the command for running Celery.
It was giving the issue with:
celery -A <app_path> worker --loglevel=DEBUG
But it runs without issue if we use:
celery -A <app_path> worker -l info
It may be helpful for others if they face the same issue.
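
If the warning keeps appearing after the command change, messages already sitting on the broker in the old serialization format may keep triggering it; purging the queue is one documented way to clear them, though it discards every waiting task, so use it with care:

celery -A <app_path> purge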