Print out celery settings for a Django project?

I'd like to know whether the celery settings specified in my settings.py file are actually being recognized.
How can I tell if celery picked up the options?

You can inspect the conf object on the Celery app instance, after it has been initialised and configured:
from celery import Celery

app = Celery("yourproject")
app.config_from_object("django.conf:settings", namespace="CELERY")
print(app.conf)
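If the full dump is too noisy, the Settings object also has a humanize() helper that can restrict the output to the keys you changed, with sensitive values masked (a quick sketch; "yourproject" above is a placeholder name):
# print only the settings that differ from Celery's defaults
print(app.conf.humanize(with_defaults=False, censored=True))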

Related

Celery app ignoring beat_schedule option specified in settings.py file

I am trying to extend my Django app with celery crontab functionality. For this purpose I created a celery.py file, where I put the code as described in the official documentation.
Here is the code from project/project/celery.py:
import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project.settings')
app = Celery('project')
app.config_from_object('django.conf:settings', namespace='CELERY')
Then inside my project/settings.py file I specify the celery-related configs as follows:
from celery.schedules import crontab

CELERY_TIMEZONE = "Europe/Moscow"
CELERY_BEAT_SCHEDULE = {
    'test_beat_tasks': {
        'task': 'webhooks.tasks.adding',
        'schedule': crontab(minute='*/1'),
    },
}
Then I run the worker and celery beat in the same terminal with:
celery -A project worker -B
But nothing happened: I didn't see my celery beat task printing any output, while I expected that my task webhooks.tasks.adding would execute.
Then I decided to check that the celery configs were applied. For this purpose, in python manage.py shell I inspected the celery.app.conf object:
# import app from the project.celery module
from project import celery
# then examine the app configs
celery.app.conf
Inside the huge output of celery configs I saw that the timezone was set to None.
As I understand it, the app initialised in project/celery.py is ignoring the CELERY_TIMEZONE and CELERY_BEAT_SCHEDULE configs in my project/settings.py, but why? What am I doing wrong? Please guide me.
After spending so much time researching this problem, I found that my mistake was in how I ran the worker and celery beat. Running the worker the way I did, it wouldn't print task output to the terminal. To see whether a task is executing, run it as follows: celery -A project worker -B -l INFO (or use DEBUG instead of INFO for more detailed output). Hope it will help anyone.
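As a quick sanity check that the namespaced settings really were applied, you can also inspect the relevant keys directly in python manage.py shell (a minimal sketch, assuming the project layout from the question):
from project.celery import app

print(app.conf.timezone)       # should print 'Europe/Moscow'
print(app.conf.beat_schedule)  # should contain the 'test_beat_tasks' entry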

Celery can't find static files in Django app during development

My Django web app loads static files without problems in DEBUG mode, thanks to the magic of django.contrib.staticfiles.
However, my celery service can't find the files! This makes sense, because the Django docs say that django.contrib.staticfiles works automagically when the runserver command is used, but celery doesn't use runserver. So how do I get celery to access those files in development?
Here is how one of my models gets a static image to use as a default icon:
from django.templatetags.static import static

def some_method():
    return static('img/default_icon.png')
But when the same code is run by celery, celery can't find the image. Celery says it's looking for the image at "/static/img/default_icon.png", which makes sense because my Django settings have:
STATIC_URL = '/static/'
Are you using docker, docker-compose or something like that?
If so, you need to add a static volume to your containers in docker-compose.yml, as in the sketch below.
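A minimal sketch of what that could look like; the service names, image and paths here are placeholders, not taken from the question. The point is that the web and celery containers mount the same volume, so the worker sees the same files:
services:
  web:
    build: .
    volumes:
      - static_data:/app/static
  celery:
    build: .
    command: celery -A project worker -l INFO
    volumes:
      - static_data:/app/static  # same mount as the web service

volumes:
  static_data: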

.env not working with Django with supervisor

I have a Django 2.2 project and all secrets are in a .env file.
I'm using the dotenv library to load the .env into the Django application in the manage.py file:
import os
import dotenv

def main():
    # Read from the .env file next to manage.py
    env_file = os.path.join(os.path.dirname(os.path.realpath(__file__)), '.env')
    dotenv.read_dotenv(env_file)
    ....
The environment file is loaded fine when running locally.
On the server, I'm using supervisor to run the application with the following configuration:
[supervisord]
[program:myapp]
command=/var/www/html/app/start_gunicorn.sh
directory=/var/www/html/app/
autostart=true
autorestart=true
stopasgroup=true
stopsignal=QUIT
logfile=/home/ubuntu/log/supervisor/supervisor.log
logfile_maxbytes=5MB
logfile_backups=10
loglevel = info
stderr_logfile=/home/ubuntu/log/supervisor/qcg-backend.err.log
stdout_logfile_maxbytes=5MB
stdout_logfile_backups=10
stdout_logfile=/home/ubuntu/log/supervisor/qcg-backend.out.log
stderr_logfile_maxbytes=5MB
stderr_logfile_backups=10
But the environment variables are not loaded and do not work in Django.
Running the following from an SSH console works:
python manage.py shell
import os
os.environ.get('DEBUG')
> True
But when running the application, the environment variables are not accessible and are not applied.
manage.py is not invoked when running Django in production. The dotenv docs say that you should add the loader code to the top of wsgi.py as well.
I think putting it in settings.py is more convenient: no need to add it to both manage.py and wsgi.py.
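For reference, a minimal sketch of the wsgi.py variant, using the same read_dotenv() call as in the question ('myproject' is a placeholder module name):
import os
import dotenv

# Load the .env sitting next to manage.py, before settings are imported
dotenv.read_dotenv(os.path.join(
    os.path.dirname(os.path.dirname(os.path.abspath(__file__))), '.env'))

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')

from django.core.wsgi import get_wsgi_application
application = get_wsgi_application()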

Django: Load production settings for celery

My Django project has multiple settings files for development, production and testing, and I am using supervisor to manage the celery worker. My question is how to load the settings file for celery based on the environment I am in.
By using environment variables. Let's say you have the following settings files at the root of your repository:
config/settings/development.py
config/settings/production.py
...
The recommended way is to define your Celery instance in a config/celery.py module:
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery

# Set the default Django settings module for the 'celery' program.
# os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'config.settings.production')
app = Celery('proj')
# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')
# Load task modules from all registered Django app configs.
app.autodiscover_tasks()

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
Instead of setting the DJANGO_SETTINGS_MODULE variable within the module (I have commented that out), make sure that it is present in the environment at the time supervisord is started.
To set the variable on your staging, testing, and production systems you can execute the following bash commands, e.g. on your production system:
$ export DJANGO_SETTINGS_MODULE=config.settings.production
$ echo $DJANGO_SETTINGS_MODULE
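Alternatively, supervisor itself can inject the variable via the environment key of a program section. A sketch; the program name and paths are placeholders:
[program:celeryworker]
command=/path/to/venv/bin/celery -A proj worker -l INFO
directory=/path/to/project/
environment=DJANGO_SETTINGS_MODULE="config.settings.production"
autostart=true
autorestart=true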
I would also suggest you load them from an .env file; in my opinion that's more convenient. You could do that with, for example, python-dotenv.
Update
The .env file is usually unique to each of your systems and is usually not under source/version control. By unique I mean that for development you may have a more verbose LOG_LEVEL or a different SECRET_KEY, because you don't want those to show up in your source control, or you want to be able to adjust them without modifying your source files.
So, in your base.py (which production.py and development.py inherit from) you can load the variables from the file, for example:
import os
from dotenv import load_dotenv

load_dotenv()  # the .env file has to be in the same directory
# ...
DJANGO_SETTINGS_MODULE = os.getenv("DJANGO_SETTINGS_MODULE")
print(DJANGO_SETTINGS_MODULE)
# ...
I personally don't use the package since I use docker, which has a declarative way of defining an .env file, but the code above should give you an idea of how it could work. There are similar packages out there, like django-environ, which is featured in the book Two Scoops of Django; I would tend to use that instead of python-dotenv, but it's a matter of taste.
You likely want to configure different settings files. From here you have two options. You can use the django-admin --settings param at runtime:
django-admin runserver --settings=thecelery.settings
Alternatively, you can select the settings in the settings module itself. If you currently have one settings file, this requires you to set up additional settings files and set environment variables on the instance. Then, in your base settings file, you can do something like this:
import os

your_env = os.environ["environment"]
if your_env == "celery":
    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "thecelerysettings")
else:
    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "defaultsettings")

Custom celery settings file for django-celery

Django-celery seems to use the current django project's settings.py file to read the configuration for celery, and, correct me if I am wrong, there is no way to override this behavior. The problem is that my celery configuration is in a different file outside of the django project, and I need to somehow tell django-celery to use that file instead. How do I do this?
Generally, django-celery is used for one project (but many applications). Of course, you can make a symlink to the settings.py from one project to the other and run django-celery as:
./manage.py celeryd --settings=path.to.symlink
but in my opinion a better solution would be to run celery as a daemon with common settings and use CELERY_IMPORTS to pull in tasks from any Django project, as sketched below.
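A minimal sketch of such a standalone config; the module names and broker URL are placeholders, and the old-style uppercase setting names match what django-celery expects:
# celeryconfig.py -- standalone celery settings shared by several projects
BROKER_URL = 'amqp://guest:guest@localhost:5672//'
CELERY_IMPORTS = (
    'project_a.tasks',
    'project_b.tasks',
)
The worker can then be started with something like celeryd --config=celeryconfig, so it never touches any single project's settings.py.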