I want to display the current exchange rate of the USD/Bitcoin price pair on my website.
For that I set up Celery and a small periodic_task.
I'm currently not really able to understand how to call this periodic_task or how to display the JSON data it returns.
This is what my Celery setup looks like:
__init__.py
from __future__ import absolute_import, unicode_literals
from .celery import app as celery_app
__all__ = ('celery_app',)
celery.py
from __future__ import absolute_import, unicode_literals
from celery import Celery
import os

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')

app = Celery('myproject')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
tasks.py
from celery import Celery
from celery.schedules import crontab
from celery.task import periodic_task
from celery.utils.log import get_task_logger
import requests

logger = get_task_logger(__name__)

app = Celery('tasks', broker='redis://127.0.0.1')

@app.task
def test():
    return "Test Successful"

@periodic_task(run_every=(crontab(minute='*/15')), name="get_btc_exchange_rate", ignore_result=True)
def get_exchange_rate():
    api_url = "https://api.coinmarketcap.com/v1/ticker/?limit=1"
    try:
        exchange_rate = requests.get(api_url).json()
        logger.info("BTC Exchange rate updated.")
    except Exception as e:
        print(e)
        exchange_rate = dict()
    return exchange_rate
I'm currently starting Celery with this script:
https://gist.github.com/psych0der/44a8994495abee1b4e832420c1c2974d
So my question is: how can I trigger that periodic_task and display the returned JSON field "price_usd" in a template?
Thanks in advance.
You'll need to start a celery beat instance alongside your worker. It will schedule and send off the periodic tasks at whatever interval you set.
http://docs.celeryproject.org/en/latest/userguide/periodic-tasks.html
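To make this concrete, here is one way to wire it up end to end. This is only a sketch: it assumes Django's cache framework is available, and the cache key btc_price_usd plus the view/template names are placeholders I chose, not part of the original code. Start a worker and a beat scheduler (assuming the Django project is named myproject as in the celery.py above):

celery -A myproject worker -l info
celery -A myproject beat -l info

Then have the task cache the fetched rate, and read it back from a plain view:

# tasks.py -- cache the fetched rate instead of only returning it
from django.core.cache import cache

@periodic_task(run_every=(crontab(minute='*/15')), name="get_btc_exchange_rate", ignore_result=True)
def get_exchange_rate():
    api_url = "https://api.coinmarketcap.com/v1/ticker/?limit=1"
    try:
        data = requests.get(api_url).json()
        # the v1 ticker returns a list; the first entry carries "price_usd"
        cache.set('btc_price_usd', data[0]['price_usd'], timeout=None)
        logger.info("BTC exchange rate updated.")
    except Exception as e:
        logger.warning("Could not fetch exchange rate: %s", e)

# views.py -- pass the cached value to a template
from django.core.cache import cache
from django.shortcuts import render

def btc_rate(request):
    price_usd = cache.get('btc_price_usd')  # None until the task has run once
    return render(request, 'btc_rate.html', {'price_usd': price_usd})

In btc_rate.html you would then simply render {{ price_usd }}.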
Related
I have a question regarding my Celery implementation with RabbitMQ, since only the first function (debug_task()) that I have defined in celery.py is executed.
The problem is that send_user_mail(randomNumber, email) is not working. debug_task is working, so that one is registered.
This is the celery console
[2022-10-08 22:28:48,081: ERROR/MainProcess] Received unregistered task of type 'callservices.celery.send_proveedor_mail_new_orden'. The message has been ignored and discarded.

Did you remember to import the module containing this task? Or maybe you are using relative imports?
Why is it unregistered?
celery.py
from __future__ import absolute_import, unicode_literals
import os

from celery import Celery
from django.conf import settings
from django.core.mail import EmailMultiAlternatives, send_mail

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'callservices.settings')

app = Celery('tasks', broker='pyamqp://guest@localhost//')
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(settings.INSTALLED_APPS)

@app.task()
def debug_task():
    print("hi all")

@app.task()
def send_user_mail(randomNumber, email):
    subject = 'email validation - ServicesYA'
    cuerpo = "Your number is: " + str(randomNumber)
    send_mail(subject, cuerpo, 'xxx.ssmtp@xxx.com', [email], fail_silently=False)
    return 1
This is __init__.py:
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from celery import app as celery_app
__all__ = ('celery_app',)
and in settings.py I add this line:
BROKER_URL = "amqp://guest:guest@localhost:5672//"
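One thing worth double-checking against the Celery/Django documentation: the project-level __init__.py documented there imports the app from the project's own celery.py with a relative import (note the leading dot), so that the app and its registered tasks are loaded whenever Django starts. For reference:

from .celery import app as celery_app

__all__ = ('celery_app',)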
I am trying to send notifications using celery.
from celery import shared_task

# push_service (an FCM client) is assumed to be defined elsewhere in this module
@shared_task(name='send_notifis')
def send_notifs(device_ids, title, message):
    from pills_reminder.models import UserNotifications, UserDevice

    devices = UserDevice.objects.filter(id__in=device_ids)
    print(devices)
    device_tokens = []
    for device in devices:
        UserNotifications.objects.create(
            uid=device.device_id,
            title=title,
            message=message,
        )
        print(UserNotifications)
        device_tokens.append(device.registration_token)
    if len(device_tokens) > 1:
        device_tokens = ["".join(token.split()) for token in device_tokens]
        response = push_service.notify_multiple_devices(registration_ids=device_tokens,
                                                        message_title=title,
                                                        message_body=message)
    elif len(device_tokens) == 1:
        registration_id = "".join(device_tokens[0].split())
        response = push_service.notify_single_device(registration_id=registration_id,
                                                     message_title=title,
                                                     message_body=message)
    else:
        return True  # nothing to notify; response would be unbound below
    print(response)
    return True
This works without .delay(), and also with .delay() when running it from python manage.py shell:
>>> send_notifs.delay(devices, title='title', message='message')
<AsyncResult: f54188f8-cec6-42dd-a840-b097abffd7f4>
but it freezes when I call it via a Django model post_save signal.
@receiver(post_save, sender=Notification)
def Notification_post_save_handler(sender, instance, **kwargs):
    print('hello from post_save signal')
    devices = instance.get_devices()
    # send_notifs(devices)
    if len(devices) > 0:
        send_notifs.delay(devices,
                          title=instance.title,
                          message=instance.message)
The code above freezes execution, but without .delay() it works fine.
UPDATE 1:
The above task with .delay() runs from python manage.py shell but not from runserver, so the problem is with the Celery and Django settings. Hence I dug deeper and found out that
while running from the shell I get:
>>> add.app.conf  # (add is an async task here)
{'broker_url': 'redis://localhost:6379/1'}, ...
but running from runserver gives:
`{'broker_url': None}`
Now I am looking into how to set the settings properly. I am using django-configurations, with celery.py as follows:
from __future__ import absolute_import, unicode_literals
import os

from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'core.settings')
os.environ.setdefault('DJANGO_CONFIGURATION', 'Development')

import configurations
configurations.setup()

app = Celery('core')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
Any help is appreciated. Thank you for your time and patience.
Use this:

from configurations import importer
importer.install()

in place of:

import configurations
configurations.setup()
Turns out the setup specified in the django-configurations docs for including Celery was causing the problem. I modified celery.py, and now it is working fine.
from __future__ import absolute_import, unicode_literals
import os

from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'core.settings')
os.environ.setdefault('DJANGO_CONFIGURATION', 'Development')

from configurations import importer
importer.install()
# import configurations
# configurations.setup()

app = Celery('core')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
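A usage note on the fix above: importer.install() is the hook that lets django-configurations resolve the settings class before Celery reads the configuration. Since celery.py only setdefault()s the environment variables, you can also pin the configuration explicitly when starting the worker, e.g. (assuming the project package is core):

DJANGO_CONFIGURATION=Development celery -A core worker -l info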
In my tasks.py file I want to import models from the polls app, but when starting the worker I get django.core.exceptions.AppRegistryNotReady: Apps aren't loaded yet.
tasks.py
from __future__ import absolute_import
import sys, os

from polls.models import User
from .celery import app

@app.task
def add_user(user):
    # for user in users:
    print('urra')
    # user = User(user.first_name, user.last_name, user.email)
    # user.save()
celery.py:
from __future__ import absolute_import, unicode_literals
from celery import Celery
import os, sys
from task import celery_config
import dotenv
from os.path import dirname, join

app = Celery('task',
             broker='amqp://root:lusine_admin@localhost/task',
             backend='amqp://',
             include=['task.tasks'])

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "task.settings")
app.config_from_object(celery_config)
# app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

if __name__ == '__main__':
    app.start()
Actually I first got the error that the polls module was not found, but then I added it to the PYTHONPATH from bash, and now I get this error.
Your error is in your config. If you want to connect Celery with your Django project, you have to initialize the Celery config from the Django settings. In your celery.py, replace this line:
app.config_from_object(celery_config)
with
app.config_from_object('django.conf:settings', namespace='CELERY')
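For reference, a minimal Django-aware celery.py along the lines of the Celery docs would look like the sketch below (assuming the project package is named task as in the question, and that the Celery options in settings.py carry a CELERY_ prefix). Importing models inside the task function, rather than at module import time, is another common way to sidestep AppRegistryNotReady:

from __future__ import absolute_import, unicode_literals
import os

from celery import Celery

# Django settings must be configured before Celery reads them.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'task.settings')

app = Celery('task')
app.config_from_object('django.conf:settings', namespace='CELERY')

# Discover tasks.py modules in all INSTALLED_APPS (e.g. polls.tasks).
app.autodiscover_tasks()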
settings.py
# crontab is needed for the beat schedule below
from celery.schedules import crontab

CELERY_BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TASK_SERIALIZER = 'json'
CELERY_TIMEZONE = 'Asia/Makassar'

# Other Celery settings
CELERY_BEAT_SCHEDULE = {
    'add_task': {
        'task': 'data_loader.tasks.add_task',
        'schedule': crontab(minute=55, hour=13),
    },
    # 'task-number-two': {
    #     'task': 'data_loader.tasks.demo',
    #     'schedule': crontab(minute=2, hour='18'),
    # }
}
celery.py
..............
from __future__ import absolute_import, unicode_literals
import os

from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'resolution_validator.settings')

app = Celery('resolution_validator')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
tasks.py
............
from __future__ import absolute_import, unicode_literals
from celery.task import task

@task()
def add_task():
    print("hello world")
    a = 10
    return True
__init__.py
...............
from __future__ import absolute_import, unicode_literals
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app
__all__ = ['celery_app']
I am also getting the following error: "RecursionError: maximum recursion depth exceeded", together with a KeyError.
I am using Django==2.0, celery==4.2.0, python==3.5.2.
I am not able to find a solution.
There is a problem with your tasks.py: you should use the @shared_task decorator instead of the @task decorator.
from __future__ import absolute_import, unicode_literals
from celery import shared_task

@shared_task
def add_task():
    print("hello world")
    a = 10
    return True
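Note that celery.task, used in the original tasks.py, was deprecated in Celery 4.x and removed in 5.x, so from celery import shared_task is also the more future-proof import. To actually run the CELERY_BEAT_SCHEDULE you would start both a worker and a beat process, e.g. (assuming the project package is resolution_validator):

celery -A resolution_validator worker -l info
celery -A resolution_validator beat -l info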
I have to use Celery 4.0.2 with RabbitMQ 3.6.10 to handle an async task, and I have followed this tutorial: https://www.codementor.io/uditagarwal/asynchronous-tasks-using-celery-with-django-du1087f5k
However, I have a slight problem with my task, because it's impossible to get a result: my task always remains in the "PENDING" state.
My question is: what do I have to do to get a result?
Thank you in advance for your answer.
Here is my code.
Here is my __init__.py:
from __future__ import absolute_import, unicode_literals
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app
__all__ = ['celery_app']
Here is a part of my settings.py:
BROKER_URL = 'amqp://guest:guest@localhost//'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
Here is my celery.py:
from __future__ import absolute_import, unicode_literals
import os

from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'mysite.settings')

app = Celery('mysite',
             backend='amqp',
             broker='amqp://guest@localhost//')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
And my tasks.py
# Create your tasks here
from __future__ import absolute_import, unicode_literals
from celery import shared_task, current_task

@shared_task
def add(x, y):
    test = "ok"
    current_task.update_state(state='PROGRESS',
                              meta={'test': test})
    return x + y
And here is my Django shell session:
>>> from blog.tasks import *
>>> job = add.delay(2,3)
>>> job.state
'PENDING'
>>> job.result
>>>
With a picture of my RabbitMQ interface:
You need to start a worker that will process the tasks you add to the queue. From your virtualenv, run:
celery worker -A blog -l info
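Once a worker is consuming from the queue, the same shell session should show the task leave PENDING (a sketch; the exact states depend on your result backend):

>>> from blog.tasks import add
>>> job = add.delay(2, 3)
>>> job.get(timeout=10)  # blocks until the worker has finished the task
5
>>> job.state
'SUCCESS'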