Celery second unregistered task - django

I have a question about my Celery + RabbitMQ setup, since only the first function (debug_task()) that I have defined in celery.py is executed.
The problem is that send_user_mail(randomNumber, email) is not working. debug_task works, so it is registered.
This is the Celery console output:
[2022-10-08 22:28:48,081: ERROR/MainProcess] Received unregistered
task of type 'callservices.celery.send_proveedor_mail_new_orden'. The
message has been ignored and discarded.
Did you remember to import the module containing this task? Or maybe
you are using relative imports?
Why is it unregistered?
celery.py
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from django.conf import settings
from django.core.mail import EmailMultiAlternatives, send_mail

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'callservices.settings')

app = Celery('tasks', broker='pyamqp://guest@localhost//')
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(settings.INSTALLED_APPS)

@app.task()
def debug_task():
    print("hi all")

@app.task()
def send_user_mail(randomNumber, email):
    subject = 'email validation - ServicesYA'
    cuerpo = "Your number is: " + str(randomNumber)
    send_mail(subject, cuerpo, 'xxx.ssmtp@xxx.com', [email], fail_silently=False)
    return 1
This is __init__.py
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app
__all__ = ('celery_app',)
and in settings.py I added this line:
BROKER_URL = "amqp://guest:guest@localhost:5672//"

Related

Django Celery, App import error only on production

I have a file structure like this:
myapp/
    artist_applications/
        tasks.py
    tasks/
        celery.py
# settings.py
INSTALLED_APPS = [
    'myapp.artist_application',
    ...
]
# celery.py
import os
from celery import Celery
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myapp.settings.production')
app = Celery("tasks")
app.config_from_object('django.conf:settings', namespace="CELERY")
app.autodiscover_tasks()
# tasks.py
from tasks.celery import app as celery_esns
from django.apps import apps
from django.conf import settings
import requests

@celery_esns.task(name='sample_task')
def sample_task():
    print('TESTING CELERY')

@celery_esns.task(name='publish_artist_task')
def publish_artist_task(payload, artist_id):
    r = requests.post(settings.PUBLISH_URL, json=payload)
    if r.status_code == 200:
        apps.get_model('artist_application', 'Artist').objects.filter(unique_id=artist_id).update(published=True)
    else:
        raise Exception("Error publishing artist with id: " + artist_id)
In development everything runs fine when I start Celery with:
celery -A myapp.tasks worker -Q celery -l info
But on production I run the command (in a virtualenv) and I get the error:
django.core.exceptions.ImproperlyConfigured: Cannot import 'artist_application'. Check that 'myapp.artist_application.apps.ArtistApplication.name' is correct.
Any ideas where to look? I don't get how runserver loads the apps differently than WSGI does.
I had to put the full dotted path in apps.py
from django.apps import AppConfig

class ArtistApplicationConfig(AppConfig):
    name = 'myapp.artist_application'
    verbose_name = "Artist Application"
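In other words, the dotted path in AppConfig.name and the entry in INSTALLED_APPS must agree exactly. A minimal sketch of that consistency check (values taken from this question; not Django's actual loading code):

```python
# settings.py and apps.py must agree on the full dotted path, otherwise
# app loading fails under WSGI with ImproperlyConfigured.
INSTALLED_APPS = ["myapp.artist_application"]   # entry from settings.py
APP_CONFIG_NAME = "myapp.artist_application"    # ArtistApplicationConfig.name

assert APP_CONFIG_NAME in INSTALLED_APPS, "AppConfig.name must match INSTALLED_APPS"
print("app config name matches INSTALLED_APPS entry")
```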

celery .delay freezes for this task but runs for others

I am trying to send notifications using celery.
@shared_task(name='send_notifis')
def send_notifs(device_ids, title, message):
    from pills_reminder.models import UserNotifications, UserDevice
    devices = UserDevice.objects.filter(id__in=device_ids)
    print(devices)
    device_tokens = []
    for device in devices:
        UserNotifications.objects.create(
            uid=device.device_id,
            title=title,
            message=message,
        )
        print(UserNotifications)
        device_tokens.append(device.registration_token)
    if len(device_tokens) > 1:
        device_tokens = ["".join(token.split()) for token in device_tokens]
        response = push_service.notify_multiple_devices(registration_ids=device_tokens,
                                                        message_title=title,
                                                        message_body=message)
    elif len(device_tokens) == 1:
        registration_id = "".join(device_tokens[0].split())
        response = push_service.notify_single_device(registration_id=registration_id,
                                                     message_title=title,
                                                     message_body=message)
    else:
        response = None
    print(response)
    return True
This works without .delay(), and also when run from
python manage.py shell
>>> send_notifs.delay(devices, title='title', message='message')
<AsyncResult: f54188f8-cec6-42dd-a840-b097abffd7f4>
but it freezes when I call it from a Django model post_save signal.
@receiver(post_save, sender=Notification)
def Notification_post_save_handler(sender, instance, **kwargs):
    print('hello from post_save signal')
    devices = instance.get_devices()
    # send_notifs(devices)
    if len(devices) > 0:
        send_notifs.delay(devices,
                          title=instance.title,
                          message=instance.message)
The code above freezes execution, but without .delay() it works fine.
UPDATE 1:
The above task with .delay runs from python manage.py shell but not from runserver, so the problem is with Celery and the Django settings. I dug deeper and found that while running from the shell I get:
>>> add.app.conf  # (add is an async task here)
{'broker_url': 'redis://localhost:6379/1'}, ...
but running from runserver gives:
{'broker_url': None}
Now I am looking for how to set the settings properly. I am using django-configurations, with celery.py as:
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'core.settings')
os.environ.setdefault('DJANGO_CONFIGURATION', 'Development')

import configurations
configurations.setup()

app = Celery('core')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
Any help is appreciated. Thank you for your time and patience.
Use this:
from configurations import importer
importer.install()
in place of
import configurations
configurations.setup()
It turns out the setup specified in the django-configurations docs for including Celery was causing the problem. I modified celery.py, and now it is working fine.
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'core.settings')
os.environ.setdefault('DJANGO_CONFIGURATION', 'Development')

from configurations import importer
importer.install()
# import configurations
# configurations.setup()

app = Celery('core')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))

Django scheduled tasks

I want to display the current exchange rate of the USD/Bitcoin pair on my website.
Therefore I set up Celery and a small periodic_task.
I'm currently not really able to understand how to call this periodic_task or how to display the JSON data it returns.
This is what my Celery setup looks like:
__init__.py
from __future__ import absolute_import, unicode_literals
from .celery import app as celery_app
__all__ = ('celery_app',)
celery.py
from __future__ import absolute_import, unicode_literals
from celery import Celery
import os

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')

app = Celery('myproject')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
tasks.py
from celery import Celery
from celery.schedules import crontab
from celery.task import periodic_task
from celery.utils.log import get_task_logger
import requests

logger = get_task_logger(__name__)

app = Celery('tasks', broker='redis://127.0.0.1')

@app.task
def test():
    return "Test Successful"

@periodic_task(run_every=(crontab(minute='*/15')), name="get_btc_exchange_rate", ignore_result=True)
def get_exchange_rate():
    api_url = "https://api.coinmarketcap.com/v1/ticker/?limit=1"
    try:
        exchange_rate = requests.get(api_url).json()
        logger.info("BTC Exchange rate updated.")
    except Exception as e:
        print(e)
        exchange_rate = dict()
    return exchange_rate
I'm currently starting Celery with this script:
https://gist.github.com/psych0der/44a8994495abee1b4e832420c1c2974d
So my question is: how can I trigger that periodic_task and display the returned JSON field "price_usd" in a template?
Thanks in advance.
You'll need to start a celerybeat instance. It will schedule and send off events that you can set on an interval.
http://docs.celeryproject.org/en/latest/userguide/periodic-tasks.html
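Since the question's tasks.py already attaches the schedule to the task via @periodic_task(run_every=...), beat only needs to be running. An equivalent sketch using an explicit beat schedule setting instead (the task name get_btc_exchange_rate comes from the decorator above; this is an illustration, not the asker's code):

```python
from datetime import timedelta

# Sketch of a beat schedule entry: celery beat reads this setting and
# sends the named task to the worker queue every 15 minutes.
CELERYBEAT_SCHEDULE = {
    "get_btc_exchange_rate": {
        "task": "get_btc_exchange_rate",  # the name= given in the decorator
        "schedule": timedelta(minutes=15).total_seconds(),  # 900.0 seconds
    },
}
print(CELERYBEAT_SCHEDULE["get_btc_exchange_rate"]["schedule"])
```

For development you can run the worker and beat together with `celery -A tasks worker -B -l info`; in production they are usually separate processes.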

Can't import models to celery tasks.py file

In my tasks.py file I want to import models from the polls app, but when starting the worker I get django.core.exceptions.AppRegistryNotReady: Apps aren't loaded yet.
tasks.py
from __future__ import absolute_import
import sys, os
from polls.models import User
from .celery import app

@app.task
def add_user(user):
    # for user in users:
    print('urra')
    # user = User(user.first_name, user.last_name, user.email)
    # user.save()
celery.py:
from __future__ import absolute_import, unicode_literals
from celery import Celery
import os, sys
from task import celery_config
import dotenv
from os.path import dirname, join

app = Celery('task',
             broker='amqp://root:lusine_admin@localhost/task',
             backend='amqp://',
             include=['task.tasks'])

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "task.settings")
app.config_from_object(celery_config)
# app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

if __name__ == '__main__':
    app.start()
Actually I got the error that the polls module was not found, but then from bash I added it to PYTHONPATH, and now I get this error.
Your error is in your config. If you want to connect Celery with your Django project, you have to initialize the Celery config from the Django settings. In your celery.py, replace this line:
app.config_from_object(celery_config)
with
app.config_from_object('django.conf:settings', namespace='CELERY')
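With namespace='CELERY', Celery picks up any Django setting that starts with the CELERY_ prefix and lowercases the remainder to form its own config key. A rough sketch of that mapping (an illustrative helper, not Celery's actual implementation):

```python
def to_celery_key(django_key, namespace="CELERY"):
    # CELERY_BROKER_URL -> broker_url, CELERY_RESULT_BACKEND -> result_backend
    prefix = namespace + "_"
    if not django_key.startswith(prefix):
        raise ValueError("settings outside the namespace are ignored by Celery")
    return django_key[len(prefix):].lower()

print(to_celery_key("CELERY_BROKER_URL"))  # broker_url
```

This is why, with the namespace in place, the broker must be configured as CELERY_BROKER_URL in settings.py rather than in a separate config module.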

django celery import function not working

I have been trying for a while to create a task that creates a sample of a specimen every 5 hours. I have managed to configure Celery with Redis and execute the example task from the documentation, but when I try something more complex that includes a queryset, it does not execute; the task disappears from the list when restarting the queue.
this is the structure of the project:
proj/
    Muestras/
        views.py
        tasks.py
        models.py
    Servicios/
        models.py
    proj/
        celery.py
        settings.py
In settings.py:
CELERY_BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = 'Europe/London'
CELERY_BEAT_SCHEDULE = {
    'generar-muestras': {  # name of the scheduler
        'task': 'Muestras.tasks.crear_muestras_tarea',
        'schedule': 30.0,  # set the period of running
    },
}
This is a view that is within Muestras.views
from .models import Muestra
from backend.Servicios.models import Servicio

# this works in the console
def generar_muestras():
    services = Servicio.objects.all()
    for i in services:
        muestra = Muestra(servicio_id=i.id)
        muestra.save()
In Muestras.tasks.py
from __future__ import absolute_import, unicode_literals
from celery import task
from .views import generar_muestras

@task
def crear_muestras_task():
    print('hola esto tiene una funcion')
    # generar_muestras()
This is what I have in celery.py:
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from django.conf import settings

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('proj')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
And when I execute
celery -A proj worker -l info -B
everything works well and the task executes. But when I uncomment this line in Muestras.tasks.py, importing the view from .views:
generar_muestras()
the task disappears from the list and I get this error:
[2018-11-04 22:31:37,734: INFO/MainProcess] celery#linux-z6z3 ready.
[2018-11-04 22:31:37,876: ERROR/MainProcess] Received unregistered task of
type 'Muestras.tasks.crear_muestras_tarea'.
The message has been ignored and discarded.
Did you remember to import the module containing this task?
Or maybe you're using relative imports?
Please see
http://docs.celeryq.org/en/latest/internals/protocol.html
for more information.
The full contents of the message body was:
b'[[], {}, {"callbacks": null, "errbacks": null, "chain": null, "chord":
null}]' (77b)
Traceback (most recent call last):
File "/home/wecbxxx/PycharmProjects/porj/venv/lib64/python3.6/site-
packages/celery/worker/consumer/consumer.py", line 558, in
on_task_received
strategy = strategies[type_]
KeyError: 'Muestras.tasks.crear_muestras_tarea'
You didn't share your settings.py or how you run the celery worker so I am taking a wild guess.
Your task should be listed under the imports setting of Celery. See here.
Your task should be decorated with @app.task(). See here.
I suggest you go through celery's user guide. I think it can use some structural improvement but should be enough to understand the basics.
To expand on @gokhan's answer, there are two things that you should make sure of:
Decorate your task with #app.task
from __future__ import absolute_import, unicode_literals
from proj.celery import app
from .views import generar_muestras

@app.task
def crear_muestras_task():
    print('hola esto tiene una funcion')
    # generar_muestras()
Make sure that Muestras appears in settings.INSTALLED_APPS. This will allow the autodiscover to discover your tasks:
Next, a common practice for reusable apps is to define all tasks in a separate tasks.py module, and Celery does have a way to auto-discover these modules:
app.autodiscover_tasks()
With the line above Celery will automatically discover tasks from all of your installed apps, following the tasks.py convention:
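Roughly, autodiscover_tasks() tries to import a tasks submodule from every entry in INSTALLED_APPS; a simplified sketch of which module names it looks for (not Celery's real implementation):

```python
def candidate_task_modules(installed_apps, related_name="tasks"):
    # For each installed app, Celery attempts to import "<app>.tasks";
    # any @app.task / @shared_task functions found there get registered
    # with the worker, which is what makes the task name resolvable.
    return [f"{app}.{related_name}" for app in installed_apps]

print(candidate_task_modules(["Muestras", "Servicios"]))
# ['Muestras.tasks', 'Servicios.tasks']
```

So as long as Muestras is in INSTALLED_APPS and its tasks live in Muestras/tasks.py with the proper decorator, the worker will register them and the "unregistered task" error goes away.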