Django celery beat scheduler not working with the Asia/Calcutta timezone

I am using django-celery-beat with Celery.
The problem I am facing: when I use the scheduler provided by django-celery-beat, beat doesn't run my tasks, but a plain beat without that scheduler works, as shown by the commands below.
Doesn't work:
celery worker --app=my_project.celery_app -l info --beat --scheduler django_celery_beat.schedulers:DatabaseScheduler
Works:
celery worker --app=my_project.celery_app -l info --beat
One thing I noticed: when I change TIME_ZONE to 'UTC', the django-celery-beat scheduler starts working, but I don't want to change the timezone settings for Django. How do I fix this?
Please find my settings below:
USE_TZ = False
TIME_ZONE = 'Asia/Kolkata'
CELERY_TIMEZONE = 'Asia/Kolkata'

Can you try overriding the DatabaseScheduler, in particular its schedule method?
celery worker --app=my_project.celery_app -l info --beat --scheduler django_celery_beat.schedulers:DatabaseScheduler
I made that change, and the schedule data now shows up in the log file.
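For illustration, here is a minimal sketch of such an override that only logs what the scheduler loads from the database; the module path my_project/schedulers.py and the class name are hypothetical, not part of django-celery-beat itself:
# my_project/schedulers.py
import logging

from django_celery_beat.schedulers import DatabaseScheduler

logger = logging.getLogger(__name__)

class LoggingDatabaseScheduler(DatabaseScheduler):
    @property
    def schedule(self):
        # load the entries from the database exactly as the stock scheduler does
        entries = super().schedule
        logger.info('beat loaded %d entries: %s', len(entries), list(entries))
        return entries
You would then point beat at it with:
celery worker --app=my_project.celery_app -l info --beat --scheduler my_project.schedulers:LoggingDatabaseScheduler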

You could try this:
celery -A my_app.celery:app beat -l info --scheduler django_celery_beat.schedulers:DatabaseScheduler
Also, make sure you have django_celery_beat installed and added to your settings.py:
INSTALLED_APPS = (
...,
'django_celery_beat',
)
Then run python manage.py migrate.
The django-celery-beat documentation has more on the beat configuration.
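Once the migration has run, the schedule lives in database rows rather than in code. As a rough sketch (the task path my_app.tasks.my_task and the entry name are placeholders), an entry can be created from a shell or a data migration:
from django_celery_beat.models import CrontabSchedule, PeriodicTask

schedule, _ = CrontabSchedule.objects.get_or_create(
    minute='0',
    hour='7',
    day_of_week='*',
    day_of_month='*',
    month_of_year='*',
    timezone='Asia/Kolkata',  # CrontabSchedule carries its own timezone
)
PeriodicTask.objects.get_or_create(
    name='my-daily-task',
    task='my_app.tasks.my_task',
    crontab=schedule,
)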

I am sure there must be a way to do this, but I would recommend never using USE_TZ = False in Django: every standard application is timezone aware and uses UTC as the default, no matter whether you are in India or any other country.
I hope this makes your application better; if you face any error with UTC, feel free to ask here.
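For reference, the timezone-aware setup suggested here would look like this (a sketch; the local zone is kept only for display and for Celery's own schedule conversion):
USE_TZ = True                     # store datetimes in UTC
TIME_ZONE = 'Asia/Kolkata'        # used when rendering datetimes to users
CELERY_TIMEZONE = 'Asia/Kolkata'  # Celery converts schedules from this zone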
PS: even if you want to check out another library, do look into the one below:
https://pypi.org/project/django-celery/

Related

Celery app ignoring beat_schedule option specified in settings.py file

I am trying to extend my Django app with Celery crontab functionality. For this purpose I created a celery.py file, where I put the code as described in the official documentation.
Here is the code from project/project/celery.py:
import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project.settings')
app = Celery('project')
app.config_from_object('django.conf:settings', namespace='CELERY')
Then, inside my project/settings.py file, I specified the Celery-related configs as follows:
from celery.schedules import crontab

CELERY_TIMEZONE = 'Europe/Moscow'
CELERY_BEAT_SCHEDULE = {
    'test_beat_tasks': {
        'task': 'webhooks.tasks.adding',
        'schedule': crontab(minute='*/1'),
    },
}
Then I ran the worker and celery beat in the same terminal with:
celery -A project worker -B
But nothing happened: I didn't see my celery beat task print any output, while I expected my task webhooks.tasks.adding to execute.
Then I decided to check that the Celery configs were applied. For this purpose I ran python manage.py shell and examined the celery.app.conf object:
# import the app from the project/celery.py module
from project import celery
# then examine the app config
celery.app.conf
Inside the huge output of Celery config values I saw that the timezone was set to None.
As I understand it, my problem is that the app initiated in project/celery.py is ignoring the CELERY_TIMEZONE and CELERY_BEAT_SCHEDULE configs from project/settings.py. But why? What am I doing wrong? Please guide me.
After spending so much time researching this problem, I found that my mistake was in how I ran the worker and celery beat. Running the worker the way I did, it wouldn't show tasks executing in the terminal. To see whether a task is executing, run it as follows: celery -A project worker -B -l INFO (or, if you want more detailed output, use DEBUG instead of INFO). I hope this helps someone.
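As a side note, a quick way to confirm that the namespaced settings were actually picked up is to inspect the app directly in python manage.py shell (assuming the app object lives in project/celery.py as above):
from project.celery import app

print(app.conf.timezone)       # should print 'Europe/Moscow'
print(app.conf.beat_schedule)  # should contain the 'test_beat_tasks' entry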

Am I executing celery shared tasks correctly?

Here is how I start Celery periodic tasks. First I execute this command:
celery worker -A my_project.celery
And after that this command:
celery -A my_project beat -l info -S django
After executing these two commands in two different terminal tabs, my celery beat periodic tasks start running. If I don't run one of the described commands, my periodic tasks do not run. My question is: is there any way to start Celery with a single command, or even better, with the runserver command?
Your method of using Celery is correct. You can use the -B, --beat parameter to start beat and the worker with a single command:
# This will start worker AND beat process
celery worker --app=my_project -l INFO --beat -S django
But do not use this in production; see this note in the Celery docs (http://docs.celeryproject.org/en/latest/reference/celery.bin.worker.html):
-B is meant to be used for development purposes. For production environment, you need to start celery beat separately.
A few notes: 1) I think there is no way to run Celery and runserver together (and I honestly think it's not a good idea); 2) I see the django-celery tag in your question. This is an old and deprecated way of integrating Django and Celery:
THIS PROJECT IS ONLY REQUIRED IF YOU WANT TO USE DJANGO RESULT BACKEND
AND ADMIN INTEGRATION (Source: https://github.com/celery/django-celery)

Run celery with Django start

I am using Django 1.11 and Celery 4.0.2.
We are using a PaaS (OpenShift 3) which runs on Kubernetes / Docker.
I am using a Python image; it only knows how to run one command on start (and it watches the exit code, restarting on failure).
How can I run a Celery worker at the same time as Django, making sure that a failure of either one kills both processes (worker and Django)?
I am using WSGI and gevent to start Django.
Thank you!
You could use circus (supervisord is an alternative, but it doesn't support Python 3 at the moment).
In circus you create a circus.ini in your project directory.
Something like:
[watcher:celery]
working_dir = /var/www/your_app
virtualenv = virtualenv
cmd = celery
args = worker --app=your_app --loglevel=DEBUG -E
[watcher:django]
working_dir = /var/www/your_app
virtualenv = virtualenv
cmd = python
args = manage.py runserver
Then you start both with:
virtualenv/bin/circusd circus.ini
It should start both processes. I think this is a good way to create a "start" plan for your project. Maybe you will want to add celery beat or use Channels (websockets in Django); then you can just add a new watcher to your circus.ini, as sketched below. It's pretty dynamic.
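For instance, a beat watcher along the lines of the existing two might look like this (a sketch reusing the same placeholder paths as above):
[watcher:celerybeat]
working_dir = /var/www/your_app
virtualenv = virtualenv
cmd = celery
args = beat --app=your_app --loglevel=DEBUG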

Django-Celery in production?

So I've been trying to figure out how to make scheduled tasks. I've found Celery and have been able to make simple scheduled tasks. To do this I need to open up a command line and run celery -A proj beat for the tasks to happen. This works fine in a development environment, but when putting this into production it will be an issue.
So how can I get Celery to work without manual command line use? When my production server is online, how can I make sure my scheduler comes up with it? Can Celery do this, or do I need to take another approach?
We use Celery in our production environment, which happens to be on Heroku. We are in the process of moving to AWS. In both environments, Celery hums along nicely.
It would be helpful to understand what your production environment will look like. I'm slightly confused as to why you would be worried about turning off your computer, as using Django implies that you are serving up a website... Are you serving your website from your laptop?
Anyway, assuming that you are going to run your production server from a cloud platform, all you have to do is send whatever command lines you need to run Django AND the command lines for Celery (as you have already noted in your question).
In terms of configuration, you say that you have 'scheduled' tasks, which implies you have set up a beat schedule in your config.py file. If not, it should look something like this (this assumes you have a module called tasks.py which holds your Celery task definitions):
from celery.schedules import crontab

beat_schedule = {
    'task1': {
        'task': 'tasks.task_one',
        'schedule': 3600
    },
    'task2': {
        'task': 'tibController.tasks.update_old_retail',
        'schedule': crontab(hour=12, minute=0, day_of_week='mon-fri')
    }
}
Then in your tasks.py, load that config file like this:
from celery import Celery
import config
app = Celery('tasks')
app.config_from_object(config)
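For completeness, a task registered on that app (continuing the snippet above) might look like this; the body is just a placeholder:
@app.task
def task_one():
    # referenced by the 'tasks.task_one' entry in beat_schedule above
    print('task_one executed')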
You can find more on crontab in the docs. You can also check out this repo for a simple Celery example.
In summary:
Create a config file that identifies which tasks to run and when.
Load the config file into your Celery app.
Get a cloud platform to run your code on.
Run Celery exactly as you have already identified.
Hope that helps.

Celery daemon picks up task but it never gets executed

I have a Celery instance running on Heroku on a worker dyno. I use it with Django and djcelery, and I use the following command to start it:
python manage.py celery worker -B -E --loglevel=info --autoscale=50,8 --without-gossip
I added a task to the queue but it never got executed. The logs show this:
Scaling down -7 processes. (This appeared a while before the following lines.)
Received task: myapp.tasks.my_task[75f095cb-9652-4cdb-87d4-dc31cecaadff]
Scaling up 1 processes.
But that's it; after this, the job didn't get done at all. I fired other tasks after this as well, and all of them were executed successfully. The same issue repeated a couple of hours later.
I have a few questions:
1. The "Received task ..." log line means that the celery daemon picked the job off of the queue, right? Why did the worker process become inactive?
2. Since my autoscale is set to 50,8, shouldn't there always be 8 workers? Why were all workers scaled down?
3. Is there any way to know that a task is stuck? What is the recommended failure recovery for such a case if the task being fired is mission critical?
My Celery version is 3.1.9, django-celery version is 3.1.9, and redis version is 2.6.16.
Settings in my Django settings file:
CELERY_SEND_TASK_ERROR_EMAILS = True
import djcelery
djcelery.setup_loader()
BROKER_URL = 'redis://:password@host:port/0'
BROKER_BACKEND = 'django'
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'