Error in attempt to use tasks.py in Django Celery

I'm trying to do something very basic with Celery and Django. Specifically, in my app's tasks.py:

from __future__ import absolute_import, unicode_literals
from celery import Celery

app = Celery(broker='redis://localhost:6379/0')

@app.task
def add(x, y):
    return x + y
and in views.py of this app:

from django.http import HttpResponse  # import needed for the view below
from .tasks import *

def index(request):
    add.delay(4, 4)
    return HttpResponse("you are in index")
When I start Celery and visit the index view, the Celery process raises an error. I run:

celery -A tasks worker
and the error is:
[2017-04-14 00:53:12,404: ERROR/MainProcess] Received unregistered task
of type 'rango.tasks.add'.
The message has been ignored and discarded.
Did you remember to import the module containing this task?
Or maybe you're using relative imports?
What am I doing wrong?

Related

Celery second unregistered task

I have a question about my Celery implementation with RabbitMQ: only the first function (debug_task()) that I have defined in celery.py is executed.
The problem is that send_user_mail(randomNumber, email) is not working. debug_task is working, so it's registered.
This is the Celery console:
[2022-10-08 22:28:48,081: ERROR/MainProcess] Received unregistered
task of type 'callservices.celery.send_proveedor_mail_new_orden'. The
message has been ignored and discarded.
Did you remember to import the module containing this task? Or maybe
you are using relative imports?
Why is it unregistered?
celery.py:

from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from django.conf import settings
from django.core.mail import EmailMultiAlternatives, send_mail

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'callservices.settings')

app = Celery('tasks', broker='pyamqp://guest@localhost//')
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(settings.INSTALLED_APPS)

@app.task()
def debug_task():
    print("hi all")

@app.task()
def send_user_mail(randomNumber, email):
    subject = 'email validation - ServicesYA'
    cuerpo = "Your number is: " + str(randomNumber)
    send_mail(subject, cuerpo, 'xxx.ssmtp@xxx.com', [email], fail_silently=False)
    return 1
This is __init__.py:

# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app

__all__ = ('celery_app',)
and in settings.py I add this line:

BROKER_URL = "amqp://guest:guest@localhost:5672//"

When I run my Celery server, the cursor just blinks before the ready message (Solved)

I'm having a bizarre problem: when I run Celery it loads but never prints the ready message. If I run any task, the function executes but no success message (with the elapsed seconds) appears. I even put a print statement in the function; the return value never shows up in the Celery console the way it normally would with the success message.
this is my celery.py:

from __future__ import absolute_import, unicode_literals
import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'ontym.settings')

app = Celery('ontym')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()
[screenshot: my Celery server output, stuck before the banner] but it should look like this: [screenshot: Celery server with the ready message]
tasks.py:

from celery import shared_task

@shared_task
def hello_task(name):
    print("hello from task", name)

Django celery register periodic task

I'm using Celery 4.4 with Django 2.2.
I have to create a periodic task, so I'm extending PeriodicTask as:
from celery.schedules import crontab
from celery.task import PeriodicTask

class IncompleteOrderHandler(PeriodicTask):
    run_every = crontab(
        minute='*/{}'.format(getattr(settings, 'INCOMPLETE_ORDER_HANDLER_PULSE', 5))
    )

    def run(self, *args, **kwargs):
        # Task definition
        eligible_users, slot_begin, slot_end = self.get_users_in_last_slot()
        for user in eligible_users:  # a plain loop: map() is lazy in Python 3
            self.process_user(user, slot_begin, slot_end)
Earlier, to register the above task, I used to call:

from celery.registry import tasks
tasks.register(IncompleteOrderHandler)

But now there is no registry module in Celery. How can I register the above periodic task?
I had the same problem with class-based Celery tasks. This should work, but it doesn't!
By accident, my problem was solved by each of these two changes:
I imported one of the class-based tasks from tasks.py into viewsets.py, and suddenly I noticed that after doing that, Celery found all of the tasks in tasks.py.
This was my base Celery settings file:

from __future__ import absolute_import
import os
from celery import Celery
from django.conf import settings

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'picha.settings')

app = Celery('picha')
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
I changed the last line to app.autodiscover_tasks(lambda: settings.CELERY_TASKS), added a CELERY_TASKS list to settings.py containing the paths of all the tasks.py modules, and then Celery found the tasks.
I hope one of these works for you.
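Concretely, the second change described above looks something like this (the module names are hypothetical, and this is a non-runnable settings fragment, not a complete file):

```python
# settings.py
CELERY_TASKS = [
    'app1.tasks',   # hypothetical app modules containing task definitions
    'app2.tasks',
]

# celery.py -- point autodiscovery at that explicit list
app.autodiscover_tasks(lambda: settings.CELERY_TASKS)
```

The lambda defers the settings lookup until Django is configured, which is why the docs use this pattern.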

celery 4.4.2 and django 3.0.2 raising self.notregistered error but calling the function in python manage.py shell works fine

I am trying to add Celery to my Django project.
Everything works fine when I run it from the manage.py shell; my Celery worker returns the result of the add() function.
When I call this function from my view, it raises a NotRegistered error and a "'module' object is not callable" error in celery/app/base.py, line 1253:
celery/app/base.py", line 1253, in loader
    return get_loader_cls(self.loader_cls)(app=self)
TypeError: 'module' object is not callable

raise self.NotRegistered(key)
celery.exceptions.NotRegistered: 'home.tasks.add'

Both are from the Apache error log.
My home/tasks.py looks like this:

from celery import shared_task
from celery.schedules import crontab
from django.db import connection
from django.utils import timezone, translation
# import pandas as pd
from home.models import *
from residence_management.models import *
from sqlalchemy import *
from django.conf import settings
from django.contrib.auth.models import User
from decimal import Decimal

@shared_task
def add(x, y):
    return x + y
My view where I call the task has the import from home.tasks import add.
In the view itself I just call add(9, 5), and it fails with the above error (without the @shared_task decorator it works fine).
When calling the function in the shell it works fine even with the @shared_task decorator, and I can start the Celery worker without problems as well:
[2020-05-13 08:47:23,862: INFO/MainProcess] Received task: home.tasks.add[f7e50f7d-4e3d-4372-bf3e-e1c7175c7a2a]
[2020-05-13 08:47:23,879: INFO/ForkPoolWorker-8] Task home.tasks.add[f7e50f7d-4e3d-4372-bf3e-e1c7175c7a2a] succeeded in 0.015277621999999713s: 14
Any ideas where the problem might be? I use Redis for the broker and result backend.
You can check the docs and comments for shared_task on GitHub: https://github.com/celery/celery/blob/9d49d90074445ff2c550585a055aa222151653aa/celery/app/init.py
I think for some reason you do not run the creation of the Celery app. It is better in this case to use an explicit app:
from .celery import app

@app.task()
def add(x, y):
    return x + y

celery not working in django and just waiting (pending)

I'm trying to find out how Celery works. I have a project with about 10 apps, and now I want to use Celery.
settings.py:

CELERY_BROKER_URL = 'amqp://rabbitmq:rabbitmq@localhost:5672/rabbitmq_vhost'
CELERY_RESULT_BACKEND = 'redis://localhost'
I created a user in RabbitMQ with username rabbitmq and password rabbitmq, then created a vhost named rabbitmq_vhost and gave the rabbitmq user permission on it. All is fine, I think, because all the RabbitMQ errors disappeared.
here is my test.py:

from .task import when_task_expiration

def test_celery():
    result = when_task_expiration.apply_async((2, 2), countdown=3)
    print(result.get())
task.py:

from __future__ import absolute_import, unicode_literals
import logging
from celery import shared_task
from proj.celery import app

@app.task(bind=True)  # bind=True makes the task instance available as self
def when_task_expiration(self, x, y):
    print(self.request.id, 'task done')
    return True
celery.py:

from __future__ import absolute_import, unicode_literals
import os
from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('proj')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
Now, when I call test_celery() in the Python shell, it stays pending. I tried replacing @shared_task with @app.task(bind=True), but nothing changed. I even tried .delay() instead of apply_async((2, 2), countdown=3), and again nothing happened.
I'm trying to use Celery to call a function at a specific time, as in a question I asked in the past. Thank you.
You most likely forgot to run at least one Celery worker process. To do so, execute the following in the shell: celery worker -A proj.celery -c 4 -l DEBUG (here I assumed your Celery application is defined in proj/celery.py, as you have Celery('proj') in there).