My celery beat works locally but not in production (Django)

my task:

from datetime import timedelta

from celery import shared_task

from project.celery import app  # import locations assumed; not shown in the original snippet
from .models import UserStatisticStatus

@shared_task
def my_test():
    # Executed every 10 seconds via the beat schedule below
    UserStatisticStatus.objects.filter(id=1).update(loot_boxes=+234)
    print('Hello from celery')

app.conf.beat_schedule = {
    'my-task-every-10-seconds': {
        'task': 'user_statistic_status.tasks.my_test',
        'schedule': timedelta(seconds=10),
    },
}
my settings:
if 'RDS_DB_NAME' in os.environ:
    CELERY_BROKER_URL = 'redis://<myurl>/0'
    CELERY_ACCEPT_CONTENT = ['application/json']
    CELERY_RESULT_SERIALIZER = 'json'
    CELERY_TASK_SERIALIZER = 'json'
    CELERY_BROKER_TRANSPORT_OPTIONS = {
        'region': 'eu-central-1',
        'polling_interval': 20,
    }
    CELERY_RESULT_BACKEND = 'redis://<myurl>/1'
    CELERY_ENABLE_REMOTE_CONTROL = False
    CELERY_SEND_EVENTS = False
    CELERY_TASK_ROUTES = {
        'my_test': {'queue': 'default'},
    }
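One thing to check: Celery matches task_routes keys against the registered task name, which for a shared_task defaults to the full dotted path, so a bare 'my_test' key may never match 'user_statistic_status.tasks.my_test'. A hedged sketch of a key that should match, assuming the default task naming:

CELERY_TASK_ROUTES = {
    # exact registered name; a glob such as 'user_statistic_status.tasks.*' would also work
    'user_statistic_status.tasks.my_test': {'queue': 'default'},
}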
my celery.py:

import os

from celery import Celery
from project import settings

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project.settings')

app = Celery('celery_app')
app.conf.task_routes = {
    'my_test': {'queue': 'default'},
}
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
[2023-01-06 09:13:32,581: DEBUG/MainProcess] | Worker: Starting Beat
[2023-01-06 09:13:32,584: DEBUG/MainProcess] ^-- substep ok
[2023-01-06 09:13:32,585: DEBUG/MainProcess] | Worker: Starting Hub
[2023-01-06 09:13:32,585: DEBUG/MainProcess] ^-- substep ok
[2023-01-06 09:13:32,586: DEBUG/MainProcess] | Worker: Starting Pool
[2023-01-06 09:13:32,856: DEBUG/MainProcess] ^-- substep ok
[2023-01-06 09:13:32,864: DEBUG/MainProcess] | Worker: Starting Consumer
[2023-01-06 09:13:32,864: DEBUG/MainProcess] | Consumer: Starting Connection
[2023-01-06 09:13:32,901: INFO/Beat] beat: Starting...
[2023-01-06 09:13:32,972: DEBUG/Beat] Current schedule:
<ScheduleEntry: my-task-every-10-seconds user_statistic_status.tasks.my_test() <freq: 10.00 seconds>
[2023-01-06 09:13:32,972: DEBUG/Beat] beat: Ticking with max interval->5.00 minutes
[2023-01-06 09:13:32,973: DEBUG/Beat] beat: Waking up in 9.99 seconds.
[2023-01-06 09:13:42,969: DEBUG/Beat] beat: Synchronizing schedule...
It gets stuck here and the task never executes.
The worker connects and lists the tasks correctly, and beat starts too, but nothing happens. I've tested it against a local Redis server and everything works fine.
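One hedged thing to verify (an assumption about the setup, not confirmed by the logs above): because the settings route the task to a 'default' queue, the production worker must subscribe to that queue; a worker started without -Q only consumes the built-in 'celery' queue. For example, assuming the app module is 'project' as in celery.py above:

celery -A project worker -Q default,celery --loglevel=info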
Any help will be much appreciated
Thank you

Related

Django Celery task doesn't fire in development

I'm trying to make use of periodic tasks but can't make them work.
I have this test task
# handler/tasks.py
from celery import Celery

app = Celery()

@app.on_after_configure.connect
def setup_periodic_tasks(sender, **kwargs):
    # Calls test('hello') every 2 seconds.
    sender.add_periodic_task(2, test.s('hello'), name='add every 2')

@app.task
def test(arg):
    print(arg)
Celery is configured:

# project dir
# salaryx_django/celery.py
from __future__ import absolute_import
import os

from celery import Celery
from django.conf import settings

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'salaryx_django.settings')

app = Celery('salaryx_django')

# Using a string here means the worker will not have to
# pickle the object when using Windows.
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
# salaryx_django/settings.py
# CELERY STUFF
BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = 'Europe/London'
The worker is initiated:
[2022-04-25 14:57:55,424: INFO/MainProcess] Connected to redis://localhost:6379//
[2022-04-25 14:57:55,426: INFO/MainProcess] mingle: searching for neighbors
[2022-04-25 14:57:56,433: INFO/MainProcess] mingle: all alone
[2022-04-25 14:57:56,452: WARNING/MainProcess] /Users/jonas/Desktop/salaryx_django/venv/lib/python3.8/site-packages/celery/fixups/django.py:203: UserWarning: Using settings.DEBUG leads to a memory
leak, never use this setting in production environments!
warnings.warn('''Using settings.DEBUG leads to a memory
[2022-04-25 14:57:56,453: INFO/MainProcess] celery@Air-von-Jonas ready.
and Redis is waiting for connections, but nothing happens at all.
Celery Beat
(venv) jonas@Air-von-Jonas salaryx_django % celery -A salaryx_django beat
celery beat v5.2.6 (dawn-chorus) is starting.
LocalTime -> 2022-04-26 05:38:27
Configuration ->
. broker -> redis://localhost:6379//
. loader -> celery.loaders.app.AppLoader
. scheduler -> celery.beat.PersistentScheduler
. db -> celerybeat-schedule
. logfile -> [stderr]@%WARNING
. maxinterval -> 5.00 minutes (300s)
You also have to run beat.
From the beat entries documentation:
"The add_periodic_task() function will add the entry to the beat_schedule setting behind the scenes."
Simply running celery -A salaryx_django beat in another process should get you going. Read the docs for more info.
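For the on_after_configure approach specifically, a minimal sketch (an assumption about the intended wiring, not the poster's confirmed fix): the signal has to be connected on the same app instance that beat and the worker run, i.e. the one in salaryx_django/celery.py, rather than on a second bare Celery() created in handler/tasks.py:

# salaryx_django/celery.py (sketch)
@app.on_after_configure.connect
def setup_periodic_tasks(sender, **kwargs):
    # imported inside the handler to avoid a circular import (assumption)
    from handler.tasks import test
    sender.add_periodic_task(2.0, test.s('hello'), name='add every 2')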

Celery beat sends tasks regularly, but celery only processes them from time to time in production

I have a Django project with Celery integrated using Redis.
My Celery worker works perfectly in local development, and now I'm deploying to production.
Before daemonizing the process I want to see how Celery behaves on the server. The thing is, Celery beat sends the tasks correctly every minute (as scheduled), but the worker does not seem to receive them every time. Sometimes it takes 4-5 minutes until the task is received and processed. How is that possible? I have tried debugging, but there is very little information.
See my setup:
settings.py
from celery.schedules import crontab  # import needed for the schedule below

CELERY_TIMEZONE = 'Europe/Warsaw'
CELERY_BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TASK_SERIALIZER = 'json'

# Other Celery settings
CELERY_BEAT_SCHEDULE = {
    'task-number-one': {
        'task': 'predict_assistance.alerts.tasks.check_measures',
        'schedule': crontab(minute='*/1'),
    },
}
tasks.py
from __future__ import absolute_import, unicode_literals

from celery import shared_task

@shared_task()
def check_measures():
    print('doing something')
celery.py
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'config.settings.local')
app = Celery('predict_assistance')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()
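One hedged observation, separate from the broker question: the default settings module above is 'config.settings.local'; if DJANGO_SETTINGS_MODULE is not overridden on the server, both worker and beat will run with local settings in production. A one-line sketch (the 'config.settings.production' name is an assumption):

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'config.settings.production')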
Here are my logs in production:
[2020-03-11 16:09:00,028: INFO/Beat] Scheduler: Sending due task task-number-one (predict_assistance.alerts.tasks.check_measures)
[2020-03-11 16:09:00,038: INFO/MainProcess] Received task: predict_assistance.alerts.tasks.check_measures[86f5c999-a53c-44dc-b568-00d924b5da9e]
[2020-03-11 16:09:00,046: WARNING/ForkPoolWorker-3] doing something
[2020-03-11 16:09:00,047: INFO/ForkPoolWorker-3] predict_assistance.alerts.tasks.check_measures[86f5c999-a53c-44dc-b568-00d924b5da9e]: doing something logger
[2020-03-11 16:09:00,204: INFO/ForkPoolWorker-3] Task predict_assistance.alerts.tasks.check_measures[86f5c999-a53c-44dc-b568-00d924b5da9e] succeeded in 0.16194193065166473s: None
[2020-03-11 16:10:00,049: INFO/Beat] Scheduler: Sending due task task-number-one (predict_assistance.alerts.tasks.check_measures)
[2020-03-11 16:10:00,062: INFO/MainProcess] Received task: predict_assistance.alerts.tasks.check_measures[c7786f38-793f-45e6-abb2-1c901e345e8f]
[2020-03-11 16:10:00,072: WARNING/ForkPoolWorker-3] doing something
[2020-03-11 16:10:00,073: INFO/ForkPoolWorker-3] predict_assistance.alerts.tasks.check_measures[c7786f38-793f-45e6-abb2-1c901e345e8f]: doing something logger
[2020-03-11 16:10:00,242: INFO/ForkPoolWorker-3] Task predict_assistance.alerts.tasks.check_measures[c7786f38-793f-45e6-abb2-1c901e345e8f] succeeded in 0.17491870187222958s: None
[2020-03-11 16:11:00,054: INFO/Beat] Scheduler: Sending due task task-number-one (predict_assistance.alerts.tasks.check_measures)
[2020-03-11 16:12:00,032: INFO/Beat] Scheduler: Sending due task task-number-one (predict_assistance.alerts.tasks.check_measures)
[2020-03-11 16:13:00,035: INFO/Beat] Scheduler: Sending due task task-number-one (predict_assistance.alerts.tasks.check_measures)
[2020-03-11 16:14:00,046: INFO/Beat] Scheduler: Sending due task task-number-one (predict_assistance.alerts.tasks.check_measures)
[2020-03-11 16:14:00,053: INFO/MainProcess] Received task: predict_assistance.alerts.tasks.check_measures[e0b3ef2b-ba15-421c-9a0f-0ef9f3ebb22a]
[2020-03-11 16:14:00,065: WARNING/ForkPoolWorker-3] doing something
[2020-03-11 16:14:00,066: INFO/ForkPoolWorker-3] predict_assistance.alerts.tasks.check_measures[e0b3ef2b-ba15-421c-9a0f-0ef9f3ebb22a]: doing something logger
[2020-03-11 16:14:00,247: INFO/ForkPoolWorker-3] Task predict_assistance.alerts.tasks.check_measures[e0b3ef2b-ba15-421c-9a0f-0ef9f3ebb22a] succeeded in 0.1897202990949154s: None
Do you have any idea why this is happening?
Thanks in advance
As suggested in the comments, the solution was to switch from the redis-server service to the rabbitmq-server service.
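For reference, a minimal sketch of that change (assuming a stock local RabbitMQ install with the default guest account; the result backend can stay as it is):

CELERY_BROKER_URL = 'amqp://guest:guest@localhost:5672//'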

Celery/Django is running tasks every millisecond instead of on schedule when timezone has been set

I am running into the issue that, when a timezone is set inside my celery.py, tasks run every chance they get, rather than according to schedule.
This is the output:
scheduler_1 | [2018-11-29 11:00:09,186: INFO/MainProcess] Scheduler: Sending due task Suppliers (biko.supplier.tasks.pull_supplier_data)
scheduler_1 | [2018-11-29 11:00:09,199: INFO/MainProcess] Scheduler: Sending due task Suppliers (biko.supplier.tasks.pull_supplier_data)
scheduler_1 | [2018-11-29 11:00:09,204: INFO/MainProcess] Scheduler: Sending due task Suppliers (biko.supplier.tasks.pull_supplier_data)
scheduler_1 | [2018-11-29 11:00:09,210: INFO/MainProcess] Scheduler: Sending due task Suppliers (biko.supplier.tasks.pull_supplier_data)
scheduler_1 | [2018-11-29 11:00:09,220: INFO/MainProcess] Scheduler: Sending due task Suppliers (biko.supplier.tasks.pull_supplier_data)
scheduler_1 | [2018-11-29 11:00:09,228: INFO/MainProcess] Scheduler: Sending due task Suppliers (biko.supplier.tasks.pull_supplier_data)
scheduler_1 | [2018-11-29 11:00:09,231: INFO/MainProcess] Scheduler: Sending due task Suppliers (biko.supplier.tasks.pull_supplier_data)
scheduler_1 | [2018-11-29 11:00:09,236: INFO/MainProcess] Scheduler: Sending due task Suppliers (biko.supplier.tasks.pull_supplier_data)
scheduler_1 | [2018-11-29 11:00:09,239: INFO/MainProcess] Scheduler: Sending due task Suppliers (biko.supplier.tasks.pull_supplier_data)
scheduler_1 | [2018-11-29 11:00:09,247: INFO/MainProcess] Scheduler: Sending due task Suppliers (biko.supplier.tasks.pull_supplier_data)
scheduler_1 | [2018-11-29 11:00:09,250: INFO/MainProcess] Scheduler: Sending due task Suppliers (biko.supplier.tasks.pull_supplier_data)
My celery.py:
import os

from celery import Celery
from celery.schedules import crontab
from django.conf import settings

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "config.settings")

# app = Celery('biko')
task_apps = [
    'biko.supplier.tasks',
    'biko.common.tasks',
    'biko.commerce.tasks',
    'biko.shop.tasks',
]

app = Celery('biko', include=task_apps)
app.config_from_object('django.conf:settings')
app.conf.timezone = 'Europe/Amsterdam'
app.autodiscover_tasks()

app.conf.ONCE = {
    'backend': 'celery_once.backends.Redis',
    'settings': {
        'url': 'redis://' + os.getenv('REDIS_HOST'),
        'blocking': True,
        'default_timeout': 60 * 60,
        'blocking_timeout': 86400,
    },
}
When I remove the app.conf.timezone line everything works fine.
My Django settings regarding the timezone:
LANGUAGE_CODE = 'en-us'
TIME_ZONE = 'UTC'
USE_I18N = True
USE_L10N = True
USE_TZ = True
Any ideas what causes these issues?
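One hedged avenue, not a confirmed fix: the default PersistentScheduler stores last-run times in the celerybeat-schedule file, and entries recorded under a different timezone can look perpetually overdue afterwards, which would match the flood of "Sending due task" lines above. Deleting the celerybeat-schedule file and configuring the timezone where Celery reads its settings, instead of mutating app.conf after config_from_object(), is worth trying:

# settings.py (sketch; old-style names to match config_from_object without a namespace)
CELERY_TIMEZONE = 'Europe/Amsterdam'
CELERY_ENABLE_UTC = True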

Django Celery Beat scheduler sends next task with current one

I have trouble configuring Celery Beat to reliably send tasks at the scheduled time. It sometimes sends the next scheduled task at the same time as the current one, which then messes up the whole schedule.
Here are the 4 tasks I need to run, configured in celery.py:
from __future__ import absolute_import, unicode_literals
import os

from celery import Celery
from celery.schedules import crontab

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'oust.settings.local')

app = Celery('oust')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()
app.conf.timezone = 'Europe/Zurich'

app.conf.beat_schedule = {
    'Send pickup reminders': {
        'task': 'send_pickup_reminders',
        'schedule': crontab(hour=17, minute=0),
    },
    'Decrease pickup counters': {
        'task': 'decrease_pickup_counters',
        'schedule': crontab(hour=23, minute=55),
    },
    'Update all pickups': {
        'task': 'update_all_pickups',
        'schedule': crontab(hour=2, minute=0),
    },
    'Generate new invoices': {
        'task': 'generate_new_invoices',
        'schedule': crontab(hour=0, minute=15),
    },
}
And here is an example of a task sent along with the previous one:
Apr 07 19:00:00 oust-prod app/beat.1: [2018-04-08 04:00:00,010: INFO/MainProcess] Scheduler: Sending due task celery.backend_cleanup (celery.backend_cleanup)
Apr 07 19:00:00 oust-prod app/worker.1: [2018-04-08 04:00:00,017: INFO/MainProcess] Received task: celery.backend_cleanup[d3d30f13-0244-4582-8944-80fc56ae7b94]
Apr 07 19:00:00 oust-prod app/worker.1: [2018-04-08 04:00:00,022: DEBUG/MainProcess] Task accepted: celery.backend_cleanup[d3d30f13-0244-4582-8944-80fc56ae7b94] pid:9
Apr 07 19:00:00 oust-prod app/worker.1: [2018-04-08 04:00:00,029: INFO/MainProcess] Received task: send_pickup_reminders[e01ea890-86b2-4192-9238-0ee358e2ca1b]
Apr 07 19:00:00 oust-prod app/worker.1: [2018-04-08 04:00:00,080: INFO/ForkPoolWorker-1] Task celery.backend_cleanup[d3d30f13-0244-4582-8944-80fc56ae7b94] succeeded in 0.05823612492531538s: None
Apr 07 19:00:00 oust-prod app/beat.1: [2018-04-08 04:00:00,013: DEBUG/MainProcess] celery.backend_cleanup sent. id->d3d30f13-0244-4582-8944-80fc56ae7b94
Apr 07 19:00:00 oust-prod app/beat.1: [2018-04-08 04:00:00,021: INFO/MainProcess] Scheduler: Sending due task send_pickup_reminders (send_pickup_reminders)
Apr 07 19:00:00 oust-prod app/beat.1: [2018-04-08 04:00:00,031: DEBUG/MainProcess] send_pickup_reminders sent. id->e01ea890-86b2-4192-9238-0ee358e2ca1b
Apr 07 19:00:00 oust-prod app/beat.1: [2018-04-08 04:00:00,034: DEBUG/MainProcess] beat: Waking up in 5.00 seconds.
Apr 07 19:00:00 oust-prod app/worker.1: [2018-04-08 04:00:00,086: DEBUG/MainProcess] Task accepted: send_pickup_reminders[e01ea890-86b2-4192-9238-0ee358e2ca1b] pid:9
Apr 07 19:00:00 oust-prod app/worker.1: [2018-04-08 04:00:00,142: INFO/ForkPoolWorker-1] Task send_pickup_reminders[e01ea890-86b2-4192-9238-0ee358e2ca1b] succeeded in 0.059483697172254324s: None
Apr 07 19:00:05 oust-prod app/beat.1: [2018-04-08 04:00:05,043: DEBUG/MainProcess] beat: Waking up in 5.00 seconds.
Here the scheduler sends celery.backend_cleanup as normal at 04:00, but then also sends the next scheduled task, send_pickup_reminders, which should be executed at 17:00.
This has happened with multiple different tasks and schedules. I just can't find what causes this problem.
Here are the relevant settings I used:
settings.py
INSTALLED_APPS = [
    ...
    'django_celery_beat',
    'django_celery_results',
    ...
]
TIME_ZONE = 'Europe/Zurich'
USE_I18N = True
USE_L10N = True
USE_TZ = True
Production settings
CELERY_BROKER_URL = os.environ['REDISCLOUD_URL']
CELERY_RESULT_BACKEND = 'django-db'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
Version info
Django (2.0)
django-celery-beat (1.1.1)
celery (4.1.0)
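A hedged diagnostic sketch, not a confirmed fix for this post: crontab entries fire by comparing against the scheduler's idea of "now", so making that clock explicit via nowfun can rule the timezone handling in or out (pytz is assumed available, as was typical for this Django/Celery generation):

# celery.py (sketch): pin the clock the crontab comparison uses
from datetime import datetime

import pytz  # assumption: installed alongside Django 2.0
from celery.schedules import crontab

def zurich_now():
    return datetime.now(pytz.timezone('Europe/Zurich'))

app.conf.beat_schedule = {
    'Send pickup reminders': {
        'task': 'send_pickup_reminders',
        'schedule': crontab(hour=17, minute=0, nowfun=zurich_now),
    },
    # ... remaining entries unchanged
}

Since django_celery_beat is installed, running beat with --scheduler django_celery_beat.schedulers:DatabaseScheduler (so the schedule lives in the database rather than the celerybeat-schedule file) may also be worth a try.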

Celery periodic tasks do not execute

I am using the Cloud9 online IDE for Python development.
Here's my code:
from datetime import timedelta

from celery import Celery
from celery.schedules import crontab

RESULT_URL = 'mongodb://********'
BROKER_URL = 'redis://*********'

app = Celery('tasks', backend=RESULT_URL, broker=BROKER_URL)

CELERY_TIMEZONE = 'UTC'

CELERYBEAT_SCHEDULE = {
    'add-every-30-seconds': {
        'task': 'tasks.add',
        'schedule': timedelta(seconds=30),
        'args': (16, 16),
    },
}

@app.task
def add(x, y):
    print(x + y)
    return x + y
And I am starting it with the command:
celery -A tasks worker --loglevel=info --beat
Celery starts OK, but stops all activity there. Manually invoked tasks work fine.
Here's the console log:
[2015-04-16 07:53:30,954: INFO/Beat] beat: Starting...
[2015-04-16 07:53:32,696: INFO/MainProcess] Connected to redis://*******
[2015-04-16 07:53:34,722: INFO/MainProcess] mingle: searching for neighbors
[2015-04-16 07:53:37,685: INFO/MainProcess] mingle: all alone
[2015-04-16 07:53:40,343: WARNING/MainProcess] celery@*****-demo-project-563148 ready.
I am using the RedisLabs free version as the broker and a self-hosted Mongo instance as the backend storage. Where am I going wrong?
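A hedged sketch of one likely cause (the original thread does not confirm it): CELERY_TIMEZONE and CELERYBEAT_SCHEDULE are plain module-level globals here, and nothing hands them to the app, so beat has no schedule to run. Passing them explicitly, with the old-style uppercase keys matching the Celery version in the post:

# tasks.py (sketch): hand the schedule to the app instead of leaving it in globals
app.conf.update(
    CELERY_TIMEZONE='UTC',
    CELERYBEAT_SCHEDULE={
        'add-every-30-seconds': {
            'task': 'tasks.add',
            'schedule': timedelta(seconds=30),
            'args': (16, 16),
        },
    },
)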