I have been trying for a while to create a task that takes a sample (muestra) of a specimen every 5 hours. I managed to configure Celery with Redis and run the example task from the documentation, but when I try something more complex that involves a queryset, it does not execute: the task disappears from the list when the queue restarts.
This is the structure of the project:

proj:
    Muestras:
        - views.py
        - tasks.py
        - models.py
    Servicios:
        - models.py
    proj:
        - celery.py
        - settings.py
In settings.py:
CELERY_BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = 'Europe/London'
CELERY_BEAT_SCHEDULE = {
    'generar-muestras': {  # name of the schedule entry
        'task': 'Muestras.tasks.crear_muestras_tarea',
        'schedule': 30.0,  # period of running, in seconds
    },
}
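(The 30.0 above is just for testing; since the goal is a sample every 5 hours, the real entry would be something like the following sketch, with the interval given in seconds:)

CELERY_BEAT_SCHEDULE = {
    'generar-muestras': {
        'task': 'Muestras.tasks.crear_muestras_tarea',
        'schedule': 5 * 60 * 60,  # every 5 hours
    },
}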
This is a function inside Muestras/views.py:

from .models import Muestra
from backend.Servicios.models import Servicio

# this works in the console
def generar_muestras():
    services = Servicio.objects.all()
    for i in services:
        muestra = Muestra(servicio_id=i.id)
        muestra.save()
In Muestras/tasks.py:

from __future__ import absolute_import, unicode_literals
from celery import task
from .views import generar_muestras

@task
def crear_muestras_task():
    print('hola esto tiene una funcion')
    # generar_muestras()
This is what I have in celery.py:
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from django.conf import settings

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')
app = Celery('proj')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
And when I execute

celery -A proj worker -l info -B

everything works well and the task executes. But when I enable this line in Muestras/tasks.py (importing the view from .views):

generar_muestras()

the task disappears from the list and I get this error:
[2018-11-04 22:31:37,734: INFO/MainProcess] celery@linux-z6z3 ready.
[2018-11-04 22:31:37,876: ERROR/MainProcess] Received unregistered task of type 'Muestras.tasks.crear_muestras_tarea'.
The message has been ignored and discarded.

Did you remember to import the module containing this task?
Or maybe you're using relative imports?

Please see http://docs.celeryq.org/en/latest/internals/protocol.html for more information.

The full contents of the message body was:
b'[[], {}, {"callbacks": null, "errbacks": null, "chain": null, "chord": null}]' (77b)
Traceback (most recent call last):
  File "/home/wecbxxx/PycharmProjects/porj/venv/lib64/python3.6/site-packages/celery/worker/consumer/consumer.py", line 558, in on_task_received
    strategy = strategies[type_]
KeyError: 'Muestras.tasks.crear_muestras_tarea'
You didn't share your settings.py or how you run the Celery worker, so I am taking a wild guess.
Your task should be listed under the imports setting of Celery. See here.
Your task should be decorated with @app.task(). See here.
I suggest you go through Celery's user guide. I think it could use some structural improvement, but it should be enough to understand the basics.
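For the first point, a minimal sketch of the imports setting in settings.py (the module path is taken from the beat schedule above; with namespace='CELERY' the key carries the CELERY_ prefix):

# Import this module explicitly so the worker registers its tasks,
# independently of autodiscovery.
CELERY_IMPORTS = ('Muestras.tasks',)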
To expand on @gokhan's answer, there are two things that you should make sure of:
Decorate your task with @app.task:

from __future__ import absolute_import, unicode_literals
from proj.celery import app
from .views import generar_muestras

@app.task
def crear_muestras_task():
    print('hola esto tiene una funcion')
    # generar_muestras()
Make sure that Muestras appears in settings.INSTALLED_APPS. This will allow the autodiscover to discover your tasks:
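For example (a sketch; keep whatever app labels the project already uses):

INSTALLED_APPS = [
    # ... Django and third-party apps ...
    'Muestras',
    'Servicios',
]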
Next, a common practice for reusable apps is to define all tasks in a separate tasks.py module, and Celery does have a way to auto-discover these modules:
app.autodiscover_tasks()
With the line above Celery will automatically discover tasks from all of your installed apps, following the tasks.py convention:
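Applied to this project, that convention looks like this (assuming the usual __init__.py files are present):

Muestras/
    __init__.py
    models.py
    tasks.py  # discovered automatically once 'Muestras' is in INSTALLED_APPS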
Related
I have a doubt regarding the implementation of Celery with RabbitMQ, since only the first function (debug_task()) that I have defined in celery.py is executed.
The problem is that send_user_mail(randomNumber, email) is not working. debug_task is working, so it is registered.
This is the Celery console:

[2022-10-08 22:28:48,081: ERROR/MainProcess] Received unregistered task of type 'callservices.celery.send_proveedor_mail_new_orden'. The message has been ignored and discarded.
Did you remember to import the module containing this task? Or maybe you are using relative imports?

Why is it unregistered?
celery.py
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from django.conf import settings
from django.core.mail import EmailMultiAlternatives, send_mail

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'callservices.settings')
app = Celery('tasks', broker='pyamqp://guest@localhost//')
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(settings.INSTALLED_APPS)

@app.task()
def debug_task():
    print("hi all")

@app.task()
def send_user_mail(randomNumber, email):
    subject = 'email validation - ServicesYA'
    cuerpo = "Your number is: " + str(randomNumber)
    send_mail(subject, cuerpo, 'xxx.ssmtp@xxx.com', [email], fail_silently=False)
    return 1
This is __init__.py:

# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app

__all__ = ('celery_app',)
And in settings.py I added this line:

BROKER_URL = "amqp://guest:guest@localhost:5672//"
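(For reference, "Did you remember to import the module containing this task?" points at the tasks.py convention that app.autodiscover_tasks(settings.INSTALLED_APPS) relies on. A sketch with a hypothetical app named orders; the task would live in that app's tasks.py rather than in celery.py:)

# orders/tasks.py  (hypothetical app; any app listed in INSTALLED_APPS works)
from celery import shared_task
from django.core.mail import send_mail

@shared_task
def send_user_mail(randomNumber, email):
    subject = 'email validation - ServicesYA'
    cuerpo = "Your number is: " + str(randomNumber)
    send_mail(subject, cuerpo, 'xxx.ssmtp@xxx.com', [email], fail_silently=False)
    return 1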
I am using Celery beat to perform a task that is supposed to execute at a specific time. To check that it works, I tried moving the scheduled time to two or three minutes from now. What I noticed is that beat sends the task correctly when I run a fresh command, i.e. celery -A jgs beat -l INFO, but if I then change the time in the schedule section and run the command again, beat does not send the task. Then I noticed something strange: if I go to the admin area, delete all the old entries that were created in the crontab table, and run the command again, it sends the task to the worker again.
The tasks are picked up by the worker correctly, and the Celery worker itself works fine. Below is the code I wrote to perform the task.
celery.py
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from django.conf import settings
from celery.schedules import crontab

# Set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'jgs.settings')
app = Celery('jgs')

app.conf.enable_utc = False
app.conf.update(timezone='Asia/Kolkata')
# app.conf.update(BROKER_URL=os.environ['REDIS_URL'],
#                 CELERY_RESULT_BACKEND=os.environ['REDIS_URL'])

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Celery beat settings
app.conf.beat_schedule = {
    'send-expiry-email-everyday': {
        'task': 'control.tasks.send_expiry_mail',
        'schedule': crontab(hour=1, minute=5),
    }
}

# Load task modules from all registered Django apps.
app.autodiscover_tasks()

@app.task(bind=True)
def debug_task(self):
    print(f'Request: {self.request!r}')
control/tasks.py
from celery import shared_task
from django.core.mail import message, send_mail, EmailMessage
from django.conf import settings
from django.template.loader import render_to_string
from datetime import datetime, timedelta
from account.models import CustomUser
from home.models import Contract

@shared_task
def send_expiry_mail():
    template = render_to_string('expiry_email.html')
    email = EmailMessage(
        'Registration Successfull',   # subject
        template,                     # body
        settings.EMAIL_HOST_USER,     # from email
        ['emaiid@gmail.com'],         # recipient list
    )
    email.fail_silently = False
    email.content_subtype = 'html'  # without this the HTML gets rendered as plain text
    email.send()
    return "Done"
settings.py
############# CELERY SETTINGS #######################
CELERY_BROKER_URL = 'redis://127.0.0.1:6379'
# CELERY_BROKER_URL = os.environ['REDIS_URL']
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TASK_SERIALIZER = 'json'
CELERY_TIMEZONE = 'Asia/Kolkata'
CELERY_RESULT_BACKEND = 'django-db'
# CELERY BEAT CONFIGURATIONS
CELERY_BEAT_SCHEDULER = 'django_celery_beat.schedulers:DatabaseScheduler'
Commands that I am using:
For the worker:
celery -A jgs.celery worker --pool=solo -l info
For beat:
celery -A jgs beat -l INFO
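(Note: CELERY_BEAT_SCHEDULER above selects django_celery_beat's DatabaseScheduler, which stores schedule entries in the database instead of re-reading beat_schedule on every start; that is consistent with the observation that deleting the old crontab rows in the admin makes beat send the task again. The scheduler can also be passed explicitly on the command line:)

celery -A jgs beat -l INFO --scheduler django_celery_beat.schedulers:DatabaseScheduler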
Please correct me where I am going wrong or what I am writing wrong; I am completely in the beginner phase of this async part.
I am really sorry if my sentences above were confusing.
I have never used Celery before and am trying to configure it correctly. I am using Redis as the broker and hosting on Heroku. This is my first time trying to run asynchronous tasks and I'm struggling. I have a management command that I would like to run periodically.
celery.py
from __future__ import absolute_import, unicode_literals
import os
import celery
from celery import Celery
import django
from django.conf import settings

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'coffee.settings')
app = Celery('coffee')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))

app.conf.beat_schedule = {
    'add-every-30-seconds': {
        'task': 'inventory.tasks.boarshead',
        'schedule': 30.0,
        'args': (),
    },
}
settings.py
CACHES = {
    "default": {
        "BACKEND": "redis_cache.RedisCache",
        "LOCATION": os.environ.get('REDIS_URL'),
    }
}
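(Note that CACHES only configures Django's cache, not the Celery broker. With namespace='CELERY' in celery.py, the broker is read from a CELERY_BROKER_URL setting, which is not shown here; on Heroku it would presumably be something like the sketch below, using the REDIS_URL config var:)

import os
CELERY_BROKER_URL = os.environ.get('REDIS_URL', 'redis://localhost:6379')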
tasks.py
from celery import shared_task
import celery
import time
from django.core import management

@shared_task
def boarshead():
    """Clear expired sessions by using a Django management command."""
    try:
        print("in celery module")
        management.call_command("clearsessions", verbosity=0)
        # CreateBoarsHeadList.py -- presumably another management command to run here
        return "success"
    except Exception as e:
        print(e)
__init__.py
from __future__ import absolute_import, unicode_literals
from .celery import app as celery_app
Procfile
worker: celery worker --app=tasks.inventory.app
On Celery + RabbitMQ (and Redis, though I have not used it as a backend for years) you will need a Procfile entry for the "web" (Django) process and one for the worker, which I did not see listed. The worker / dyno allocation is what gives you use of and access to the management functionality. Here is the Procfile from one of my apps:

web: gunicorn SOME_APP.wsgi --log-file -
worker: celery worker -A QUEUE_APP_NAME -l info --without-gossip --without-mingle --without-heartbeat

QUEUE_APP_NAME is the name of the module (app) where I have all my Celery work and code. The worker is invoked via the Procfile against the QUEUE_APP_NAME module (dir), with code similar to your Celery file. This may not solve your problem, but getting Celery working is a slow battle.
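Applied to this question (assuming the Celery app module is coffee, as your celery.py suggests), the worker line would presumably be closer to:

worker: celery worker -A coffee -l info -B

where -B embeds the beat scheduler in the worker so the beat_schedule entries actually fire; that is fine for a single worker, otherwise run a separate beat process.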
I have to use Celery 4.0.2 with RabbitMQ 3.6.10 to handle an async task, and I followed this tutorial: https://www.codementor.io/uditagarwal/asynchronous-tasks-using-celery-with-django-du1087f5k
However, I have a slight problem with my task: it is impossible to get a result, and the task always remains in the "PENDING" state.
My question is: what do I have to do to get a result?
Thank you in advance for your answer.
Here is my code:
Here is my __init__.py:
from __future__ import absolute_import, unicode_literals
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app
__all__ = ['celery_app']
Here is part of my settings.py:

BROKER_URL = 'amqp://guest:guest@localhost//'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
Here is my celery.py:
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'mysite.settings')
app = Celery('mysite',
             backend='amqp',
             broker='amqp://guest@localhost//')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
And my tasks.py:

# Create your tasks here
from __future__ import absolute_import, unicode_literals
from celery import shared_task, current_task

@shared_task
def add(x, y):
    test = "ok"
    current_task.update_state(state='PROGRESS',
                              meta={'test': test})
    return x + y
And here is my Django shell:
>>> from blog.tasks import *
>>> job = add.delay(2,3)
>>> job.state
'PENDING'
>>> job.result
>>>
I also have a picture of my RabbitMQ interface.
You need to start a worker that will process the tasks you add to the queue. From your virtualenv run:
celery worker -A blog -l info
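Once a worker is consuming the queue, the same shell session should see the task complete, for example:

>>> job = add.delay(2, 3)
>>> job.get(timeout=10)  # waits for the worker to store the result
5
>>> job.state
'SUCCESS'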
My settings.py
CELERY_ACCEPT_CONTENT = ['json', 'msgpack', 'yaml', 'pickle', 'application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_RESULT_BACKEND = 'djcelery.backends.cache:CacheBackend'
celery.py code
from __future__ import absolute_import
import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'webapp.settings')
from django.conf import settings

app = Celery('webapp')
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
tasks.py code
from __future__ import absolute_import
from celery.utils.log import get_task_logger
from celery import shared_task
import datetime

logger = get_task_logger(__name__)

@shared_task
def sample_code():
    logger.info("Run time:" + str(datetime.datetime.now().strftime("%Y-%m-%d %H:%M")))
    return None
On the shell I am importing the task and running sample_code.delay().
Full error stack:
[2016-02-12 00:28:56,331: WARNING/MainProcess] Received and deleted unknown message. Wrong destination?!?
The full contents of the message body was: body: '\x80\x02}q\x01(U\x07expiresq\x02NU\x03utcq\x03\x88U\x04argsq\x04]q\x05U\x05chordq\x06NU\tcallbacksq\x07NU\x08errbacksq\x08NU\x07tasksetq\tNU\x02idq\nU$f02e662e-4eda-4180-9af4-2c8a1ceb57c4q\x0bU\x07retriesq\x0cK\x00U\x04taskq\rU$app.tasks.sample_codeq\x0eU\ttimelimitq\x0fNN\x86U\x03etaq\x10NU\x06kwargsq\x11}q\x12u.' (232b)
{content_type:u'application/x-python-serialize' content_encoding:u'binary'
delivery_info:{'consumer_tag': u'None4', 'redelivered': False, 'routing_key': u'celery', 'delivery_tag': 8, 'exchange': u'celery'} headers={}}
Please let me know where I am going wrong.
The way it was solved for me was a change in the command used to run Celery.
It was giving the issue with:

celery -A <app_path> worker --loglevel=DEBUG

But it runs without issue if we use:

celery -A <app_path> worker -l info

This may be helpful for others if they face the same issue.