Celery 4.4.2 and Django 3.0.2 raising self.NotRegistered error, but calling the function in python manage.py shell works fine - django

I am trying to add Celery to my Django project.
Everything works fine when I run it from the manage.py shell: my Celery worker returns the result of the add() function.
When I call this function from my view, it raises a NotRegistered error and a "'module' object is not callable" error in celery/app/base.py, line 1253:
celery/app/base.py", line 1253, in loaderreturn get_loader_cls(self.loader_cls)(app=self)
TypeError: 'module' object is not callable
raise self.NotRegistered(key)
celery.exceptions.NotRegistered: 'home.tasks.add'
Both are from the Apache error log.
My home/tasks.py looks like this:
from celery import shared_task
from celery.schedules import crontab
from django.db import connection
from django.utils import timezone, translation
#import pandas as pd
from home.models import *
from residence_management.models import *
from sqlalchemy import *
from django.conf import settings
from django.contrib.auth.models import User
from decimal import Decimal
@shared_task
def add(x, y):
    return x + y
My view where I call the task has the import: from home.tasks import add
In the view itself I just call add(9, 5) and it fails with the above error (without the @shared_task decorator it works fine).
When calling the function in the shell, even with the @shared_task decorator, it works fine; I can also start the Celery worker without problems.
[2020-05-13 08:47:23,862: INFO/MainProcess] Received task: home.tasks.add[f7e50f7d-4e3d-4372-bf3e-e1c7175c7a2a]
[2020-05-13 08:47:23,879: INFO/ForkPoolWorker-8] Task home.tasks.add[f7e50f7d-4e3d-4372-bf3e-e1c7175c7a2a] succeeded in 0.015277621999999713s: 14
Any ideas where the problem might be? I use Redis for the broker and result backend.

You can check the docs and comments for shared_task on GitHub: https://github.com/celery/celery/blob/9d49d90074445ff2c550585a055aa222151653aa/celery/app/init.py
I think for some reason the creation of the Celery app never runs. In this case it is better to use the explicit app:
from .celery import app
@app.task()
def add(x, y):
    return x + y
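
For context, here is a minimal sketch of the celery.py module that the relative import above assumes sits next to home/tasks.py; the project name myproject, the Redis URLs, and the CELERY_ settings namespace are placeholders, not taken from the question.

# home/celery.py -- hypothetical; adjust names and URLs to your project
import os

from celery import Celery

# make sure Django settings are loaded before the app is configured
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')

app = Celery('home',
             broker='redis://localhost:6379/0',
             backend='redis://localhost:6379/0')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()

With the task bound to that app, the view would normally queue the work with add.delay(9, 5) (or add.apply_async((9, 5))) rather than calling add(9, 5) directly, which just runs the function synchronously in the web process.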

Related

Celery second unregistered task

I have a doubt regarding the implementation of Celery with RabbitMQ, since only the first function (debug_task()) that I have defined in celery.py is executed.
The problem is that send_user_mail(randomNumber, email) is not working. debug_task is working, so it's registered.
This is the Celery console:
[2022-10-08 22:28:48,081: ERROR/MainProcess] Received unregistered task of type 'callservices.celery.send_proveedor_mail_new_orden'. The message has been ignored and discarded.
Did you remember to import the module containing this task? Or maybe you are using relative imports?
Why is it unregistered?
celery.py
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from django.conf import settings
from django.core.mail import EmailMultiAlternatives, send_mail
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'callservices.settings')
app = Celery('tasks', broker='pyamqp://guest@localhost//')
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(settings.INSTALLED_APPS)
@app.task()
def debug_task():
    print("hi all")

@app.task()
def send_user_mail(randomNumber, email):
    subject = 'email validation - ServicesYA'
    cuerpo = "Your number is: " + str(randomNumber)
    send_mail(subject, cuerpo, 'xxx.ssmtp@xxx.com', [email], fail_silently=False)
    return 1
This is __init__.py:
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from celery import app as celery_app
__all__ = ('celery_app',)
and in settings.py I add this line:
BROKER_URL = "amqp://guest:guest#localhost:5672//"

Django celery register periodic task

I'm using Celery 4.4 with Django 2.2
I have to create a periodic task; I'm extending PeriodicTask as follows:
from celery.schedules import crontab
from celery.task import PeriodicTask
from django.conf import settings  # needed for the getattr() call below


class IncompleteOrderHandler(PeriodicTask):
    run_every = crontab(
        minute='*/{}'.format(getattr(settings, 'INCOMPLETE_ORDER_HANDLER_PULSE', 5))
    )

    def run(self, *args, **kwargs):
        # Task definition
        eligible_users, slot_begin, slot_end = self.get_users_in_last_slot()
        map(lambda user: self.process_user(user, slot_begin, slot_end), eligible_users)
Earlier, to register the above task, I used to call:
from celery.registry import tasks
tasks.register(IncompleteOrderHandler)
But now there is no registry module in Celery. How can I register the above periodic task?
I had the same problem with class-based Celery tasks. This is supposed to work, but it doesn't!
My problem was accidentally solved by each of these two changes:
I imported one of the class-based tasks from tasks.py into viewsets.py, and suddenly Celery found all of the tasks in tasks.py.
This was my base Celery settings file:
from __future__ import absolute_import
import os
from celery import Celery
from django.conf import settings
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'picha.settings')
app = Celery('picha')
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
I changed the last line to app.autodiscover_tasks(lambda: settings.CELERY_TASKS), added a CELERY_TASKS list to settings.py with all the tasks.py module paths in it, and then Celery found the tasks.
I hope one of these works for you.
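
For the second change, a short sketch of what that might look like; CELERY_TASKS is the answerer's own setting name (not a built-in Celery or Django setting) and the module paths are placeholders.

# settings.py
CELERY_TASKS = [
    'orders.tasks',   # hypothetical app task modules
    'billing.tasks',
]

# celery.py -- same file as shown above, with only the last line changed
app.autodiscover_tasks(lambda: settings.CELERY_TASKS)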

error in attempt to use tasks.py in django celery

I am trying to do something very basic with Celery and Django. Specifically, in my app's tasks.py:
from __future__ import absolute_import, unicode_literals
from celery import Celery
app = Celery(broker='redis://localhost:6379/0')

@app.task
def add(x, y):
    return x + y
and in views.py of this app:
from django.http import HttpResponse

from .tasks import *


def index(request):
    add.delay(4, 4)
    return HttpResponse("you are in index")
When I start running Celery and go to index, the Celery process raises the error.
I run:
celery -A tasks worker
and the error is:
[2017-04-14 00:53:12,404: ERROR/MainProcess] Received unregistered task of type 'rango.tasks.add'. The message has been ignored and discarded.
Did you remember to import the module containing this task? Or maybe you're using relative imports?
What am I doing wrong?
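
One hedged reading of the error: the worker is started with celery -A tasks worker, so it imports the module as plain tasks and registers the task as tasks.add, while the Django view imports it as rango.tasks and therefore sends rango.tasks.add. A sketch of one way to keep the two names consistent, by pinning the task name explicitly (the name below is simply taken from the error message):

# rango/tasks.py -- sketch; an explicit name makes registration independent
# of whether the module is imported as 'tasks' or 'rango.tasks'
from celery import Celery

app = Celery(broker='redis://localhost:6379/0')


@app.task(name='rango.tasks.add')
def add(x, y):
    return x + y

Starting the worker with the same dotted path the views use (for example celery -A rango.tasks worker from the project root) should have the same effect without the explicit name.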

Celery ImportError: No module named tasks

I'm creating a test scenario for Celery/RabbitMQ/Django. After browsing/reading the various posts similar to mine, I found this one, the closest, but it still does not help me. I'm getting the "ImportError: No module named tasks" error when executing the Celery worker.
Celery: 3.1.5 (not dj-celery)
Django: 1.5.5
Project structure:
testcele/ (project name)
    mycelery/ (my app)
        __init__.py
        tasks.py
    testcele/
        __init__.py
        celery_task.py
        settings.py
testcele/testcele/celery_task.py:
from __future__ import absolute_import
import os
from celery import Celery, task, current_task
from django.conf import settings
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'testcele.settings')
app = Celery('testcele', backend='amqp', broker='amqp://guest@localhost//',
             include=['tasks'])

if __name__ == '__main__':
    app.start()
# Using a string here means the worker will not have to
# pickle the object when using Windows.
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
testcele/testcele/__init__.py:
from __future__ import absolute_import
from .celery_task import app as celery_app
mycelery/tasks.py:
from __future__ import absolute_import
from celery import Celery, task, current_task, shared_task
@shared_task()
def create_models():
    .
    .
    .
I'm running: "celery worker -A testcele -l INFO", at the "testcele/" sub-dir. I have also tried running from testcele/testcel sub-dir, from testcele/mycelery, replacing "testcele" on the celery worker command with "tasks" or "mycelery". Obviously, this gives other errors.
What I am missing?
Thanks, Ricardo
Try adding an __init__.py file in your mycelery folder to make it a module. If that doesn't work, specify the tasks when defining your app, like so:
app = Celery('testcele', backend='amqp', broker='amqp://guest@localhost//',
             include=['mycelery.tasks'])
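
As a quick sanity check, assuming the include above and the worker started from the project root as in the question, the registered task name should then match what the caller sends:

# run from testcele/ (where manage.py lives); sketch only
from mycelery.tasks import create_models

result = create_models.delay()   # should be queued as 'mycelery.tasks.create_models'
print(result.id)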

Tests for Django app produce a "template not found" error for render_to_string when executed via Fabric

When I run tests on my remote server using Fabric, I get an error saying:
File "/usr/local/lib/python2.7/dist-packages/django/template/loader.py", line 138, in find_template
raise TemplateDoesNotExist(name)
TemplateDoesNotExist: index.html
I am trying to render the template as a string using render_to_string().
If I log in to the server and run the tests manually (python manage.py test app), it works properly. The error only occurs when running through Fabric.
Here is my Fabric code:
from __future__ import with_statement
from fabric.api import local
import os
from fabric.api import *
env.hosts = ['server.com']
production_project_path = '/path/to/production/app/'
def run_remote_test():
    run("python %s/manage.py test app" % production_project_path)
Did I miss something?
Note: I am not using a virtual environment.
Then let's make this official. ;)
In this case, the problem was that manage.py expects to be run from the project directory, so rewriting the above as:
from __future__ import with_statement
from fabric.api import local
import os
from fabric.api import *
env.hosts = ['server.com']
production_project_path = '/path/to/production/app/'
def run_remote_test():
    with cd(production_project_path):
        run("python manage.py test app")
has fixed the issue.
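
For comparison, cd() simply prefixes each run() call inside the with block with the directory change, so an equivalent one-liner (reusing the fabfile's imports and production_project_path; the _alt name is just for illustration) would be:

def run_remote_test_alt():
    # same effect as the cd() version above: manage.py runs from the project
    # directory, so relative template paths resolve as in a manual test run
    run("cd %s && python manage.py test app" % production_project_path)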