I am wondering what the correct approach is here; I cannot find anything about this while googling:
Assuming I have the class:
class OkPayPaymentChecker(BasePaymentChecker):
    pass
in 'nexchange/tasks.py'
what would be the correct CELERY_BEAT_SCHEDULE entry for this task?
is it:
CELERY_BEAT_SCHEDULE = {
    'check_okpay_payments': {
        'task': 'nexchange.tasks.OkPayPaymentChecker',
        'schedule': timedelta(seconds=60),
    },
}
Or should I create an instance of it first?
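For context, a hedged sketch (not verified against your Celery version): in Celery 4+, class-based tasks are no longer auto-registered, so one common approach is to register an instance explicitly and reference the registered task *name* (a string) in the beat entry. Here `app` is assumed to be your Celery instance and `BasePaymentChecker` a `Task` subclass:

```python
# nexchange/tasks.py -- a sketch; `app` and the payment logic are assumptions
class OkPayPaymentChecker(BasePaymentChecker):
    name = 'nexchange.tasks.OkPayPaymentChecker'

# Register an instance so beat can resolve the name above
app.register_task(OkPayPaymentChecker())
```

With the instance registered, the `'task': 'nexchange.tasks.OkPayPaymentChecker'` string in the schedule entry above resolves to it, so the schedule dict itself does not need to change.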
Unanswered related questions:
Celery beat with method tasks not working
Celery Beat: How to define periodic tasks defined as classes (class based tasks)
I am wondering if it is possible to have my end users dynamically adjust the schedule of a periodic task.
So something along these lines:
# celery.py
def get_schedule():
    config = get_user_config()  # returns a model object of sorts
    return config.frequency_in_seconds

app.conf.beat_schedule = {
    'my_periodic_task': {
        'task': 'my_periodic_task',
        'schedule': get_schedule,  # schedule updated based on `get_schedule` function
    },
}
This way, if a user were to change the frequency_in_seconds field in their user config setting, it would dynamically update the beat schedule.
My preference would be to do this outside of the Django Admin site and without any additional packages (e.g. django-celery-beat).
Any thoughts or ideas would be much appreciated.
Thanks
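One direction worth sketching: beat consults a schedule object's `is_due` on every tick, so an object that re-reads the interval at check time picks up config changes without a restart. The idea can be shown without Celery itself; here a plain dict stands in for the user-config lookup (all names are hypothetical):

```python
import datetime

# Hypothetical stand-in for get_user_config(); the real app would read the DB.
USER_CONFIG = {"frequency_in_seconds": 60}

class DynamicSchedule:
    """Mimics the is_due contract of celery.schedules.schedule, but
    re-reads the interval from user config on every beat tick."""

    def is_due(self, last_run_at):
        interval = datetime.timedelta(seconds=USER_CONFIG["frequency_in_seconds"])
        elapsed = datetime.datetime.utcnow() - last_run_at
        if elapsed >= interval:
            return True, interval.total_seconds()  # due now; check again in one interval
        return False, (interval - elapsed).total_seconds()
```

In real Celery this would subclass `celery.schedules.schedule`; the point is only that the interval is looked up at check time, so editing `frequency_in_seconds` takes effect on the next tick.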
If you're using Django, you can use django-celery-beat to let end users control the schedule through the Django admin panel.
If you're using Redis, you can use this library: https://github.com/parad0x96/django-redbeat
Create the dynamic periodic task:
from django_redbeat import PeriodicTaskEntry

task = PeriodicTaskEntry.objects.create(
    name="The verbose name of the task",
    task="yourapp.tasks.task_name",
    args=[arg1, arg2],
    schedule=10,  # the schedule in seconds
)
This will create a dynamic periodic task, giving you control over both its creation and its schedule.
Run Celery beat like this:
celery -A your_app_name beat -l INFO -S redbeat.RedBeatScheduler --max-interval 10
I have a Django model which has a column called celery_task_id. I am using RabbitMQ as the broker. There's a Celery function called test_celery which takes a model object as a parameter. Now I have the following code, which creates a Celery task.
def create_celery_task():
    celery_task_id = test_celery.apply_async((model_obj,), eta='Future Datetime Object')
    model_obj.celery_task_id = celery_task_id
    model_obj.save()
Now inside the celery function I am verifying if the task id is same as of the one stored in the DB or not.
@app.task
def test_celery(model_obj):
    if model_obj.celery_task_id == test_celery.request.id:
        ## Do something
My problem is that in many cases I can see the task being received and succeeding in the log, but the code inside the if condition is not executed.
Is it possible that the Celery task id changes after redistribution? Or are there other possible reasons?
One of the recommendations is not to pass database/ORM objects to Celery tasks, because they may contain stale data. Try rewriting the task as:
@app.task
def test_celery(model_obj_id):
    # .filter().first() returns None when missing; .get() would raise DoesNotExist
    model_obj = YourModel.objects.filter(id=model_obj_id).first()
    if model_obj:
        if model_obj.celery_task_id == test_celery.request.id:
            ## Do something
I have a pluggable app I'm developing for a Django system. In it, I have a task for creating notifications that looks something like so:
installable_app.tasks
@app.task(name='tasks.generate_notifications')
def generate_notifications(...):
    clients = get_list_of_clients()
    for client in clients:
        client_generate_notification.delay(client['name'], client['id'])
    return "Notification Generation Complete"

@app.task
def client_generate_notification(client_name, client_id):
    ...
    return result
Now I want this to run periodically, which can be accomplished with Celery Beat via settings. I also want it to run on its own queue:
settings.py:
CELERYBEAT_SCHEDULE = {
    'generate_schedule_notifications': {
        'task': 'tasks.generate_notifications',
        'schedule': crontab(hour=6, minute=0),
        'options': {'queue': 'notification_gen'},
        'args': ('schedule', 'Equipment', 'HVAC'),
    },
}
The first task, generate_notifications is run correctly on the queue notification_gen but the client_generate_notification subtasks are run on the default queue.
I know I can specify the queues explicitly in the @task decorator, but since this is a Django app, I would rather they be specified where the task is actually run.
I've looked into using the CELERY_ROUTES option but when I tried it, it seemed to overwrite the queues for other tasks I was running.
Is the best practice to define all the possible queues in CELERY_ROUTES or is there a better way to set up my task so that they will both run on the same queue?
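One way to sketch the routing option (assuming the task names shown above; not verified against your settings): route both tasks to the queue in settings, keeping in mind that entries for any other routed tasks must be kept alongside these rather than replaced, which may be what overwrote your other queues:

```python
# settings.py -- a sketch; merge with existing route entries, don't replace them
CELERY_ROUTES = {
    'tasks.generate_notifications': {'queue': 'notification_gen'},
    'installable_app.tasks.client_generate_notification': {'queue': 'notification_gen'},
}
```

The alternative, if you want to avoid global routing entirely, is to pass `queue='notification_gen'` to `apply_async` at the call site inside the parent task.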
Do you want something like this?
I have practiced with django-celery:
settings.py
#import datetime
#CELERYBEAT_SCHEDULE = {
# 'hello_task': {
# 'task': 'hello_task',
# 'schedule': datetime.timedelta(seconds=20),
# },
#}
import datetime
CELERYBEAT_SCHEDULE = {
    'add-every-30-seconds': {
        'task': 'app1.tasks.myfunc',
        'schedule': datetime.timedelta(seconds=30),
    },
}
I tried the hello_task schedule first, then commented it out and tried add-every-30-seconds.
But hello_task still executes when its scheduled time comes.
So I checked the database and found the record had been saved in it.
Why wasn't it deleted when I commented it out?
Is there a command or a normal way to delete it?
Or is it fine if I just delete it from the database?
If you're using the django-celery database scheduler those periodic tasks in CELERYBEAT_SCHEDULE dict will be added to Django's database, as you found out. django-celery's scheduler then reads its settings primarily from the database. Removing entries from the dict just means that django-celery has nothing to add to the database.
To delete the task properly, remove it from the Django admin page (Djcelery > Periodic Tasks).
When you change the code that Celery runs, you should restart Celery so it picks up the changes, just as you need to restart the Django server when you change your Python code.
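For completeness, the same deletion can be done from a Django shell instead of the admin page; a sketch, assuming django-celery's database scheduler models and the task name from your original settings:

```python
# run inside `python manage.py shell` -- a sketch using django-celery's model
from djcelery.models import PeriodicTask

PeriodicTask.objects.filter(name='hello_task').delete()
```

This removes the stale row directly, which is equivalent to deleting it through Djcelery > Periodic Tasks in the admin.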
I'm using CloudSearch for searching on my website. When I try to add/remove items from the index, it works, as long as I don't do it as an async task in Celery. If I do it outside of Celery, it works perfectly. Can someone point me in the right direction?
My code is below:
from boto.cloudsearch2.layer2 import Layer2
@task
def add_employer(emp_id):
    conn = Layer2(aws_access_key_id=AWS_ACCESS_KEY_ID,
                  aws_secret_access_key=AWS_SECRET_ACCESS_KEY)
    domain = conn.lookup("mydomain")
    emp = # the value
    emp_map = {
        "name": emp.name,
        "id": emp.id,
        "url": emp.url
    }
    domain.layer1.sign_request = True
    doc_service = domain.get_document_service()
    doc_service.add(emp.id, emp_map)
    doc_service.commit()
Is there something wrong with this code? I called the exact same method with and without Celery. It works consistently without Celery and never with Celery.