Django celery register periodic task

I'm using Celery 4.4 with Django 2.2.
I have to create a periodic task, so I'm extending PeriodicTask as follows:
from celery.schedules import crontab
from celery.task import PeriodicTask
from django.conf import settings

class IncompleteOrderHandler(PeriodicTask):
    run_every = crontab(
        minute='*/{}'.format(getattr(settings, 'INCOMPLETE_ORDER_HANDLER_PULSE', 5))
    )

    def run(self, *args, **kwargs):
        # Task definition
        eligible_users, slot_begin, slot_end = self.get_users_in_last_slot()
        for user in eligible_users:  # a plain loop; map() is lazy on Python 3
            self.process_user(user, slot_begin, slot_end)
Earlier, to register the above task, I used to call:
from celery.registry import tasks
tasks.register(IncompleteOrderHandler)
But now there is no registry module in Celery. How can I register the above periodic task?

I had the same problem with class-based Celery tasks. This should work, but it didn't!
Accidentally, my problem was solved by each of these two changes:
I imported one of the class-based tasks from tasks.py into viewsets.py, and suddenly I realized that after doing that, Celery found all of the tasks in tasks.py.
This was my base Celery settings file:
from __future__ import absolute_import
import os
from celery import Celery
from django.conf import settings
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'picha.settings')
app = Celery('picha')
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
I changed the last line to app.autodiscover_tasks(lambda: settings.CELERY_TASKS), added a CELERY_TASKS list to settings.py, and wrote all the tasks.py module paths in it; then Celery found the tasks.
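A sketch of what that could look like, with hypothetical app names; note that autodiscover_tasks() appends '.tasks' to each package by default, so listing full module paths needs related_name=None:
# settings.py -- hypothetical explicit list of task modules
CELERY_TASKS = [
    'app1.tasks',
    'app2.tasks',
]

# celery.py -- related_name=None imports each entry as written instead
# of appending '.tasks' to it (the default behaviour)
app.autodiscover_tasks(lambda: settings.CELERY_TASKS, related_name=None)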
I hope one of these works for you.
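Alternatively, the class-based task from the question can be registered explicitly: in Celery 4.x the old celery.registry module is gone, but the app instance has a register_task() method that replaces it. A minimal sketch, assuming hypothetical module paths:
# placeholder import paths -- adjust to wherever your app and task live
from proj.celery import app
from orders.tasks import IncompleteOrderHandler

# Class-based tasks are no longer auto-registered in Celery 4.x;
# register_task() binds an instance to the app and returns it.
incomplete_order_handler = app.register_task(IncompleteOrderHandler())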

Related

Django: How to run a celery task only on start up?

I have a django app that uses celery to run tasks.
Sometimes, I have a "hard shutdown" and a bunch of models aren't cleaned up.
I created a task called clean_up that I want to run on start up.
Here is the tasks.py
from my_app.core import utils
from celery import shared_task

@shared_task
def clean_up():
    f_name = clean_up.__name__
    utils.clean_up()
Here is what celery.py looks like:
import os
from celery import Celery
from celery.schedules import crontab
from datetime import timedelta

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "my_app.settings")
app = Celery("proj")
app.config_from_object("django.conf:settings", namespace="CELERY")

# Load task modules from all registered Django apps.
app.autodiscover_tasks()

app.conf.beat_schedule = {
    "runs-on-startup": {
        "task": "my_app.core.tasks.clean_up",
        "schedule": timedelta(days=1000),
    },
}

@app.task(bind=True)
def debug_task(self):
    print(f"Request: {self.request!r}")
How can I change celery.py to run clean_up only on start up?
Extra info:
This is in a Docker Compose setup, so by "hard shutdown" I mean docker compose down
By "on start up" I mean docker compose up
You were looking for a signal. Add the following code to tasks.py:
from celery import signals

@signals.worker_ready.connect
def clean(**kwargs):
    ...
    logger.info('worker_ready')
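A minimal sketch of how the pieces could fit together in my_app/core/tasks.py, assuming the clean_up task from the question; worker_ready fires once in each worker process as soon as it is ready to accept tasks:
from celery import shared_task, signals
from my_app.core import utils

@shared_task
def clean_up():
    utils.clean_up()

@signals.worker_ready.connect
def run_clean_up_on_startup(sender=None, **kwargs):
    # enqueue clean_up as soon as this worker comes online; the
    # 1000-day beat interval above then never has to fire in practice
    clean_up.delay()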

Celery not working in Django and just waiting (pending)

I'm trying to find out how Celery works. I have a project that has about 10 apps, and now I want to use Celery.
settings.py:
CELERY_BROKER_URL = 'amqp://rabbitmq:rabbitmq@localhost:5672/rabbitmq_vhost'
CELERY_RESULT_BACKEND = 'redis://localhost'
I created a user in RabbitMQ with this info: username rabbitmq and password rabbitmq. Then I created a vhost named rabbitmq_vhost and gave the rabbitmq user permissions on it. I think that part is fine, because all the errors about RabbitMQ disappeared.
Here is my test.py:
from .task import when_task_expiration

def test_celery():
    result = when_task_expiration.apply_async((2, 2), countdown=3)
    print(result.get())
task.py:
from __future__ import absolute_import, unicode_literals
import logging
from celery import shared_task
from proj.celery import app

@app.task
def when_task_expiration(task, x):
    print(task.id, 'task done')
    return True
celery.py:
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')
app = Celery('proj')
# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
# should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')
# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
Now when I call test_celery() in the Python shell, the task is just pending. I tried replacing @shared_task with @app.task(bind=True), but nothing changed. I even tried .delay() instead of apply_async((2, 2), countdown=3), and again nothing happened.
I'm trying to use Celery to call a function at a specific time, as in this question I asked in the past. Thank you.
You most likely forgot to run at least one Celery worker process. To do so, execute the following in the shell: celery worker -A proj.celery -c 4 -l DEBUG (here I assumed your Celery application is defined in proj/celery.py, as you have Celery('proj') in there).
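If you are on Celery 5.x, note that the worker subcommand now comes before the options; an equivalent invocation, assuming the same proj package:
celery -A proj worker -c 4 -l DEBUG
Also, result.get() blocks until the task finishes, which is why the shell appears to hang forever while no worker is consuming the queue.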

If a task is called with delay(), when exactly will it execute?

I'm new to Celery and I want to use it, but I don't know exactly when a task called with delay() will execute, or what I must do after adding a new task so that it works correctly. I'm extending an existing project; the old task works correctly, but my new one doesn't.
Existing app1/task.py:
from __future__ import absolute_import, unicode_literals
import logging
from celery import shared_task

logger = logging.getLogger('notification')

@shared_task
def send_message_to_users(users, client_type=None, **kwargs):
    .
    .
    # doing something here
    .
    .
    logger.info(
        'notification_log',
        exc_info=False,
    )
and this is my code in app2/task.py:
@shared_task
def update_students_done_count(homework_id):
    homework_students = HomeworkStudent.objects.filter(homework_id=homework_id)
    students_done_count = 0
    for homework_student in homework_students:
        if homework_student.student_homework_status:
            students_done_count += 1
    homework = get_object_or_404(HomeWork, homework_id=homework_id)
    homework.students_done_count = students_done_count
    homework.save()
    logger.info(
        'update_students_done_count_log : with id {id}'.format(id=homework_id),
        exc_info=False,
    )
Sample of how both tasks are called:
send_message_to_users.delay(users=SomeUserList)
update_students_done_count.delay(homework_id=SomeHomeWorkId)
project/celery.py:
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'hamclassy.settings')
app = Celery('hamclassy')
# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
# should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')
# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
project/__init__.py:
from __future__ import absolute_import, unicode_literals
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app
__all__ = ['celery_app']
After any change to a task, I restart the workers with this command:
celery -A project worker -l info
When I call update_students_done_count.delay(homework_id=1) in an endpoint, the log shows that the task was received, but however long I wait, the task never executes. Any idea? Thank you.
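For reference, delay() never delays anything by itself: it is shorthand for apply_async() with default options, so the message is queued immediately and executes as soon as a worker consuming that queue picks it up. A sketch of the equivalent call and a registration check, assuming the names above:
# delay(...) is shorthand for this; both enqueue the task immediately
update_students_done_count.apply_async(kwargs={'homework_id': SomeHomeWorkId})

# after restarting the worker, confirm it actually registered the new task:
#   celery -A project inspect registered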

Django-Celery: importing from another app into project's celery.py file

I am following the tutorial here to get periodic tasks defined in my Django project working.
The article suggests having a celery.py file of the form:
from celery import Celery
from celery.schedules import crontab

app = Celery()

@app.on_after_configure.connect
def setup_periodic_tasks(sender, **kwargs):
    # Calls my_task('hello') every 10 seconds.
    sender.add_periodic_task(10.0, my_task.s('hello'), name='add every 10')

@app.task
def my_task(arg):
    print(arg)
which works. Now this is good, but I don't want to define my tasks locally. My question is: how can I add tasks from other apps?
I have created a blank project called my_proj, and it has two apps: my_proj and app_with_tasks. The celery.py file above is at the root level in the my_proj app's directory, and I want to add periodic tasks from app_with_tasks's tasks.py file.
I do have app_with_tasks listed in INSTALLED_APPS in my_proj's settings file, but I still can't import anything from one app into another.
My understanding is that I should use:
from app_with_tasks.tasks import task1
but my_proj will then show as an unresolved reference in PyCharm.
I'll tell you what I'm using; maybe it helps you.
my_proj/celery.py
import os
import celery
from django.conf import settings

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'my_proj.settings')
app = celery.Celery('app_django')
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
Then in app_with_tasks, add a tasks.py file:
from my_proj.celery import app
from django.apps import apps

@app.task(bind=False)
def your_task(some_arg):
    A_Model = apps.get_model('my_proj', 'A_Model')
    ....
Command to start the Celery worker (restart it every time you change a task so the tasks.py files are reloaded):
/path/to/virtualenv/bin/celery --app=my_proj.celery:app --loglevel=INFO --concurrency=4 -n default_worker worker
To call the task (this is where you would use your add_periodic_task code):
from app_with_tasks.tasks import your_task
your_task.apply_async(args=[123], kwargs=None)
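To wire this back to the periodic setup from the question, the on_after_configure handler can schedule the imported task. A minimal sketch, assuming the names above; the import happens inside the handler because app_with_tasks/tasks.py itself imports app from this module:
# my_proj/celery.py -- sketch of scheduling a task from another app
@app.on_after_configure.connect
def setup_periodic_tasks(sender, **kwargs):
    # imported here rather than at module level to avoid the circular
    # import (tasks.py does `from my_proj.celery import app`)
    from app_with_tasks.tasks import your_task
    sender.add_periodic_task(10.0, your_task.s(123), name='your_task every 10')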

Celery ImportError: No module named tasks

I'm creating a test scenario for Celery/RabbitMQ/Django. After browsing and reading the various posts similar to mine, I found this one, which is the closest, but it still does not help me. I'm getting the "ImportError: No module named tasks" error when executing celery worker.
Celery: 3.1.5 (not dj-celery)
Django: 1.5.5
Project structure:
testcele/ (project name)
    mycelery/ (my app)
        __init__.py
        tasks.py
    testcele/
        __init__.py
        celery_task.py
        settings.py
testcele/testcele/celery_task.py:
from __future__ import absolute_import
import os
from celery import Celery, task, current_task
from django.conf import settings

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'testcele.settings')

app = Celery('testcele', backend='amqp', broker='amqp://guest@localhost//',
             include=['tasks'])

if __name__ == '__main__':
    app.start()

# Using a string here means the worker will not have to
# pickle the object when using Windows.
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
testcele/testcele/__init__.py:
from __future__ import absolute_import
from .celery_task import app as celery_app
mycelery/tasks.py:
from __future__ import absolute_import
from celery import Celery, task, current_task, shared_task

@shared_task()
def create_models():
    .
    .
    .
I'm running: "celery worker -A testcele -l INFO", at the "testcele/" sub-dir. I have also tried running from testcele/testcel sub-dir, from testcele/mycelery, replacing "testcele" on the celery worker command with "tasks" or "mycelery". Obviously, this gives other errors.
What I am missing?
Thanks, Ricardo
Try adding an __init__.py file in your mycelery folder to make it a package. If that doesn't work, specify the tasks module explicitly when defining your app, like so:
app = Celery('testcele', backend='amqp', broker='amqp://guest@localhost//',
             include=['mycelery.tasks'])
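Alternatively, since celery_task.py already calls autodiscover_tasks(), a sketch of dropping include= entirely, assuming 'mycelery' is listed in INSTALLED_APPS so that mycelery/tasks.py is found by autodiscovery:
# testcele/testcele/celery_task.py -- autodiscovery variant (assumes
# 'mycelery' appears in INSTALLED_APPS)
app = Celery('testcele', backend='amqp', broker='amqp://guest@localhost//')
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)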