Celery app ignoring beat_schedule option specified in settings.py file - django

I am trying to extend my Django app with Celery crontab functionality. For this purpose I created a celery.py file where I put the code from the official documentation.
Here is the code from project/project/celery.py:
import os
from celery import Celery
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project.settings')
app = Celery('project')
app.config_from_object('django.conf:settings', namespace='CELERY')
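One thing worth checking: the celery.py in the official documentation also calls autodiscover_tasks() at the end, so that task modules from all installed Django apps are registered with the worker:
# load task modules from all registered Django apps
app.autodiscover_tasks()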
Then inside my project/settings.py file I specify the Celery-related settings as follows:
CELERY_TIMEZONE = "Europe/Moscow"
CELERYBEAT_SHEDULE = {
'test_beat_tasks':{
'task':'webhooks.tasks.adding',
'schedule':crontab(minute='*/1),
},
}
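For reference, the task referenced as 'webhooks.tasks.adding' would live in webhooks/tasks.py; a minimal sketch of what it might look like (the body and signature are assumptions, since the question doesn't show it):
# webhooks/tasks.py -- hypothetical sketch, not the asker's actual code
from celery import shared_task

@shared_task
def adding(x=1, y=1):
    # print so each beat-triggered run is visible in the worker log
    print(f'adding({x}, {y}) = {x + y}')
    return x + y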
Then I run the worker and celery beat in the same terminal with
celery -A project worker -B
But nothing happened; I didn't see my celery beat task print any output, while I expected my task webhooks.tasks.adding to execute.
Then I decided to check that the Celery configs were applied. For this purpose I ran python manage.py shell and examined the celery.app.conf object:
# import the app from the project.celery module
from project import celery
# then examine the app configs
celery.app.conf
Inside the huge output of Celery configs I saw that the timezone was set to None.
As I understand it, my problem is that the app initialized in project/celery.py is ignoring my project/settings.py CELERY_TIMEZONE and CELERY_BEAT_SCHEDULE configs. But why? What am I doing wrong? Please guide me.

After spending so much time researching this problem, I found that my mistake was in how I ran the worker and celery beat. Running the worker the way I did, it wouldn't print task output in the terminal. To see whether the task is executing, I should run it as follows: celery -A project worker -B -l INFO, or with DEBUG instead of INFO if you want more detailed output. Hope it helps someone.

Related

Active Django settings file from Celery worker (how to set DJANGO_SETTINGS_MODULE dynamically)

So I already looked around a lot for this but couldn't find a good answer. I'm using Celery and Django 3.2.13, without the django-celery package, since newer versions of Celery don't require it anymore. I managed to set up tasks and execute them using Redis. Everything is working as it should there. However, I am integrating this into an existing, quite large Django project. There we have several Django settings files, not just one, and we run a different one depending on the environment, for instance one for local machines and one for the server. My problem is that I can't seem to track down which settings file is "active" from the celery worker, which runs the celery.py file in my project root (as the documentation specifies). There the documentation requires specifying the Django settings file like this:
os.environ.setdefault('DJANGO_SETTINGS_MODULE', "celery_test.settings.development")
Now this works, but if I move the code locally I need to change it to settings.local to make it work, every time. Reading the settings object at runtime, like I do in standard Django files, didn't work since the celery worker executes in a different process. So, given this situation, does anyone have an idea how to dynamically fetch the active Django settings file from the celery worker? Or perhaps pass it in as a variable when starting the worker (like Django's --settings=project.settings.local)? Thanks!
I found the command-line solution.
When initializing the celery worker on the command line, just set the environment variable prior to the celery command:
DJANGO_SETTINGS_MODULE='proj.settings' celery -A proj worker -l info
But I'm getting an error. My command line:
DJANGO_SETTINGS_MODULE='celery_test.settings.development' celery -A celery_test worker -l info --pool=solo
DJANGO_SETTINGS_MODULE=celery_test.settings.development : The term 'DJANGO_SETTINGS_MODULE=celery_test.settings.development' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
At line:1 char:1
DJANGO_SETTINGS_MODULE='celery_test.settings.development' celery -A c ...
+ CategoryInfo : ObjectNotFound: (DJANGO_SETTINGS...ngs.development:String) [], CommandNotFoundException
+ FullyQualifiedErrorId : CommandNotFoundException
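That error output is from Windows PowerShell, where the POSIX-style VAR=value command prefix isn't supported. A sketch of the PowerShell equivalent (same settings path as above):
$env:DJANGO_SETTINGS_MODULE = 'celery_test.settings.development'
celery -A celery_test worker -l info --pool=solo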

Django celery beat scheduler not working at Asia/Calcutta timezone

I am using django-celery-beat with Celery.
The problem I am facing is that when I use the scheduler provided by django-celery-beat it doesn't work, but normal beat without the django-celery-beat scheduler works, with the commands below:
Doesn't work => celery worker --app=my_project.celery_app -l info --beat --scheduler django_celery_beat.schedulers:DatabaseScheduler
Works => celery worker --app=my_project.celery_app -l info --beat
One thing I noticed: when I change TIME_ZONE to 'UTC', the django-celery-beat scheduler starts working, but I don't want to change the timezone settings for Django. How do I fix this?
Please find my settings below:
USE_TZ = False
TIME_ZONE = 'Asia/Kolkata'
CELERY_TIMEZONE = 'Asia/Kolkata'
Can you try overriding the DatabaseScheduler and its schedule method?
celery worker --app=my_project.celery_app -l info --beat --scheduler django_celery_beat.schedulers:DatabaseScheduler
I have changed it and data seems to show up in the log file.
You could try this:
celery -A my_app.celery:app beat -l info --scheduler django_celery_beat.schedulers:DatabaseScheduler
Also, make sure you have django_celery_beat installed and added to your settings.py:
INSTALLED_APPS = (
...,
'django_celery_beat',
)
Then run python manage.py migrate.
Here you can find more on the beat configuration.
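Note that with the DatabaseScheduler the schedule lives in the database rather than in settings.py; a minimal sketch of creating an entry through the django-celery-beat models (the task path and name are placeholders):
from django_celery_beat.models import CrontabSchedule, PeriodicTask

# create (or reuse) a crontab entry in the question's timezone
schedule, _ = CrontabSchedule.objects.get_or_create(
    minute='0', hour='12', timezone='Asia/Kolkata',
)
# attach a task to it; the dotted path is a placeholder
PeriodicTask.objects.get_or_create(
    crontab=schedule,
    name='my periodic task',
    task='my_project.tasks.my_task',
)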
I am sure there must be a way to do this, but I would recommend never using USE_TZ = False in Django, as every standard application is timezone-aware and uses UTC by default; it doesn't matter whether you are in India or any other country.
Hope it makes your application better. If you face any errors with UTC, feel free to ask here.
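Concretely, the timezone-aware version of the settings above would look something like this (a sketch of the recommendation, not a tested drop-in):
USE_TZ = True
TIME_ZONE = 'Asia/Kolkata'
CELERY_TIMEZONE = 'Asia/Kolkata'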
P.S. Even if you want to check out another library, do look into the library below:
https://pypi.org/project/django-celery/

Am I executing celery shared tasks correctly?

Here is how I start celery periodic tasks. First I execute this command:
celery worker -A my_project.celery
And after that this command:
celery -A my_project beat -l info -S django
After executing these two commands in two different terminal tabs, my celery beat periodic tasks start running. If I don't run one of the described commands, my periodic tasks do not run. My question is: is there any way to start Celery with a single command, or even better, with the runserver command?
Your method of using Celery is correct. You can use the -B, --beat parameter to start beat and the worker with a single command:
# This will start worker AND beat process
celery worker --app=my_project -l INFO --beat -S django
But do not use this in production; see this note in the Celery docs (http://docs.celeryproject.org/en/latest/reference/celery.bin.worker.html):
-B is meant to be used for development purposes. For production environment, you need to start celery beat separately.
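In practice that means two separate processes in production; a sketch using the same project and scheduler as above:
# process 1: the worker
celery worker --app=my_project -l INFO
# process 2: the beat scheduler
celery beat --app=my_project -l INFO -S django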
A few notes: 1) I think there is no way to run Celery and runserver together (and I honestly think it's not a good idea); 2) I see the django-celery tag in your question. This is an old and deprecated way of integrating Django and Celery:
THIS PROJECT IS ONLY REQUIRED IF YOU WANT TO USE DJANGO RESULT BACKEND
AND ADMIN INTEGRATION (Source: https://github.com/celery/django-celery)

Django-Celery in production?

So I've been trying to figure out how to make scheduled tasks. I found Celery and have been able to make simple scheduled tasks. To do this I need to open up a command line and run celery -A proj beat for the tasks to happen. This works fine in a development environment, but when putting this into production it will be an issue.
So how can I get celery to work without command-line use? When my production server is online, how can I make sure my scheduler goes up with it? Can Celery do this, or do I need to go down another route?
We use Celery in our production environment, which happens to be on Heroku. We are in the process of moving to AWS. In both environments, Celery hums along nicely.
It would be helpful to understand what your production environment will look like. I'm slightly confused as to why you would be worried about turning off your computer, as using Django implies that you are serving up a website... Are you serving your website from your laptop??
Anyway, assuming that you are going to run your production server from a cloud platform, all you have to do is send whatever command lines you need to run Django AND the command lines for Celery (as you have already noted in your question).
In terms of configuration, you say that you have 'scheduled' tasks, so that implies you have set up a beat schedule in your config.py file. If not, it should look something like this (this assumes you have a module called tasks.py which holds your celery task definitions):
from celery.schedules import crontab

beat_schedule = {
    'task1': {
        'task': 'tasks.task_one',
        'schedule': 3600,
    },
    'task2': {
        'task': 'tibController.tasks.update_old_retail',
        'schedule': crontab(hour=12, minute=0, day_of_week='mon-fri'),
    },
}
Then in your tasks.py, to load that config file you just do this:
from celery import Celery
import config
app = Celery('tasks')
app.config_from_object(config)
You can find more on crontab in the docs. You can also check out this repo for a simple Celery example.
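For the 'tasks.task_one' entry in the schedule to resolve, the same tasks.py also needs the task definition itself; continuing the module above (the body is a placeholder, not from the question):
@app.task
def task_one():
    # hypothetical body for the 'task1' schedule entry
    print('task_one executed')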
In summary:
Create a config file that identifies which tasks to run and when
Load the config file into your Celery app
Get a cloud platform to run your code on
Run celery exactly as you have already identified
Hope that helps.

Permission problems prevent celery from running as daemon?

I'm currently having some trouble running celery as a daemon. I use Apache to serve my Django application, so I set the uid and gid in the celery settings to "www-data". There are two places I know of so far that need access permission: /var/log/celery/*.log and /var/run/celery/*.pid, and I have already set them to be owned by "www-data". However, celery wouldn't start when I ran sudo service celeryd start. If I get rid of the --uid and --gid options for the command, celery starts as user "root".
One other thing I noticed is that if I start celery as "root", it puts some files (celery.bak, celery.dat, celery.dir) in my CELERYD_CHDIR, which is my Django application directory. I also changed the application directory to be owned by "www-data"; celery still wouldn't start. I copied all the settings files from another machine on which celery runs fine, so I suppose it's not a settings problem. Does anyone have any clue? Thanks.
Su to the celery user and start celery from the command line. Most likely it's an app log, not celery, that you need permission for.
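For example, a sketch of reproducing the failure in the foreground under the same account (proj is a placeholder for the actual app name):
# run the worker in the foreground as www-data to surface permission errors
sudo -u www-data celery worker -A proj -l DEBUG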