Celery-Django as Daemon: ImportError: No module named django.conf

I am working on a Django project which uses Celery. In development, Celery works fine and my tasks get scheduled properly. For the daemon, I created /etc/init.d/celeryd and /etc/default/celeryd as per the documentation. When I enter the command bash -x /etc/init.d/celeryd start, I get the error No module named django.conf.
Here is my celery.py:
from __future__ import absolute_import
import os
from celery import Celery
from django.conf import settings
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'axonatorprj.settings')
app = Celery('axonatorprj')
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
app.conf.update(
    CELERY_RESULT_BACKEND='djcelery.backends.database:DatabaseBackend',
)

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
Here is my /etc/default/celeryd:
# Names of nodes to start
# most will only start one node:
CELERYD_NODES="worker1"
# but you can also start multiple and configure settings
# for each in CELERYD_OPTS (see `celery multi --help` for examples).
CELERYD_NODES="worker1 worker2 worker3"
# Absolute or relative path to the 'celery' command:
CELERY_BIN="/usr/local/bin/celery"
#CELERY_BIN="/virtualenvs/def/bin/celery"
# App instance to use
# comment out this line if you don't use an app
CELERY_APP="axonatorprj"
# or fully qualified:
#CELERY_APP="proj.tasks:app"
# Where to chdir at start.
CELERYD_CHDIR="/home/projects/axonator"
# Extra command-line arguments to the worker
CELERYD_OPTS="--time-limit=300 --concurrency=8"
# %N will be replaced with the first part of the nodename.
CELERYD_LOG_FILE="/var/log/celery/%N.log"
CELERYD_PID_FILE="/var/run/celery/%N.pid"
# Workers should run as an unprivileged user.
# You need to create this user manually (or you can choose
# a user/group combination that already exists, e.g. nobody).
CELERYD_USER="celery"
CELERYD_GROUP="celery"
# If enabled pid and log directories will be created if missing,
# and owned by the userid/group configured.
CELERY_CREATE_DIRS=1
export DJANGO_SETTINGS_MODULE="axonatorprj.settings"
export PYTHONPATH=$PYTHONPATH:/home/projects/axonator
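A common cause of this particular ImportError with init scripts is that CELERY_BIN points at a celery binary whose Python environment cannot import Django at all (the daemon does not inherit the virtualenv from a login shell). As a minimal sketch, assuming the project normally runs from a virtualenv (the env path below is hypothetical), the relevant lines would be:
# point CELERY_BIN at the celery inside the project's virtualenv, not the system-wide one
CELERY_BIN="/home/projects/axonator/env/bin/celery"   # hypothetical virtualenv path
# keep the project and its settings visible to the daemonized worker
CELERYD_CHDIR="/home/projects/axonator"
export DJANGO_SETTINGS_MODULE="axonatorprj.settings"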

Related

Can't run Celery task in Django - I either get "AppRegistryNotReady: Apps aren't loaded yet" or "RuntimeError: populate() isn't reentrant"

I'm trying to set up a task with Celery in Django to run every day at 23:00.
app = Celery('App.tasks', broker='redis://localhost')
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "App.settings")
django.setup() <== PROBLEM
@app.on_after_configure.connect
def setup_periodic_tasks(sender, **kwargs):
    sender.add_periodic_task(
        crontab(hour=23),
        calc_average_rating.s(),
    )
@app.task
def calc_average_rating(final_content_id):
The problem is that in this function I have Rating = apps.get_model(app_label='App', model_name='Rating'), and if I don't call django.setup() then I get django.core.exceptions.AppRegistryNotReady: Apps aren't loaded yet.
However, if I call django.setup(), the tasks run fine but I can't do manage.py runserver as I get RuntimeError: populate() isn't reentrant.
Any solutions?
I'm not sure exactly how to reproduce the environment you're in, so here are some observations from my environment; I hope they help.
The only place I have a Celery() object is in a standalone file, kept within the "manage.py startproject" generated package.
I think the way I lay out a Django app is unusual compared to most Django users, so to describe it:
# .git/ # top folder is my vcs
# setup.py # packaging for exampleapp
# env/ # python venv created to this service
# exampleapp/ # package generated from startapp
# exampleapp/tasks.py # package generated from startapp
# exampleproject/ # folder generated from startproject
# exampleproject/exampleproject/ # package generated by startproject
# exampleproject/exampleproject/settings.py # generated
# exampleproject/exampleproject/celery.py # created based on celery docs
# exampleproject/exampleproject/celery.py
import os
from celery import Celery
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'exampleproject.settings')
app = Celery('exampleproject')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()
@app.task(bind=True)
def debug_task(self):
    print('Request: {self.request!r}'.format(self=self))

if __name__ == '__main__':
    app.start()
and I start the celery jobs as follows, where my Python virtualenv folder 'env' is a sibling of the generated exampleproject package:
(
cd exampleproject
../env/bin/python3 -m celery -A exampleproject worker -l INFO
# or
../env/bin/python3 -m celery -A exampleproject beat -l INFO --scheduler django_celery_beat.schedulers:DatabaseScheduler
)
# and for django
./env/bin/python3 exampleproject/manage.py runserver
Maybe of interest as well:
# exampleapp/tasks.py
from celery import shared_task
@shared_task
def add(x, y):
    return x + y
# exampleproject/exampleproject/settings.py
# suffixed to end of generated file
INSTALLED_APPS.extend([
    'django_celery_results',
    'django_celery_beat',
])
CELERY_TASK_TRACK_STARTED = True
CELERY_TASK_TIME_LIMIT = 30 * 60
CELERY_RESULT_BACKEND = 'django-db'
#CELERY_RESULT_BACKEND = 'django-cache'
With these parts, I haven't noticed any issues loading the entry points
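For the original 23:00 requirement, a minimal sketch (assuming the layout above, with the task living in exampleapp/tasks.py; the task name and dotted path are placeholders) is to declare the schedule in settings.py next to the other CELERY_* keys instead of wiring it up in celery.py:
# exampleproject/exampleproject/settings.py (sketch)
from celery.schedules import crontab

CELERY_BEAT_SCHEDULE = {
    'calc-average-rating-nightly': {
        'task': 'exampleapp.tasks.calc_average_rating',  # adjust to wherever the task actually lives
        'schedule': crontab(hour=23, minute=0),          # every day at 23:00
    },
}
Because the schedule is plain settings data and any apps.get_model() lookup stays inside the task body, neither django.setup() nor the on_after_configure hook is needed in celery.py; the worker's Django fixup sets the apps up on startup.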

Keyerror from celery task delay

Following the celery getting started with Django instructions, I am able to run tasks, but not run the same task asynchronously using delay().
I added the following requirements to my Django project:
celery==4.3.0
redis==3.3.11
django-celery-results==1.1.2
psycopg2==2.7.3.1
django-cors-headers~=3.1.0
Created this celery.py in the pop_model project directory:
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'pop_model.settings.local')
app = Celery('pop_model')
# namespace='CELERY' means all celery-related configuration keys
# should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')
# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
Inserted this code in the project __init__.py:
from __future__ import absolute_import, unicode_literals
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app
__all__ = ('celery_app',)
Configured CORS in the project settings and added these settings:
CELERY_BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'django-db' # defined in django_celery_results
CELERY_ACCEPT_CONTENT = ['json']
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TASK_SERIALIZER = 'json'
I can start redis, then run celery using these commands:
export DJANGO_SETTINGS_MODULE=pop_model.settings.local
celery worker -A pop_model --loglevel=info
In a python3 shell, I get these results:
>>> from pop_model.celery import debug_task
>>> debug_task()
Request: <Context: {'args': (), 'kwargs': {}}>
>>> task=debug_task.delay()
Traceback (most recent call last):
File "/Users/janee/.virtualenvs/pop-model/lib/python3.6/site-packages/kombu/utils/objects.py", line 42, in __get__
return obj.__dict__[self.__name__]
KeyError: 'backend'
I don't know how to resolve the missing backend key as CELERY_RESULT_BACKEND is defined in the settings file.
The only difference between a normal Python shell and manage.py shell is that it exports your settings module (project_name.settings) in the DJANGO_SETTINGS_MODULE environment variable.
If you run the same interpreter with the proper environment variable, you should see no difference. If you do, it may be that your pop_model.settings.local path is not returning a proper settings module for your app to latch onto, or you're using a modified manage.py script (for development environment separation, I suppose) where the settings module is loaded correctly.
You should be able to call your function using
./manage.py shell
from your project directory, using the same interpreter from your virtual environment. This matters because DJANGO_SETTINGS_MODULE needs a path that is present in the interpreter's search path (more on that here), and you could be calling the interpreter from another directory.
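In other words, a quick way to verify (a sketch, run from the directory that contains manage.py):
./manage.py shell
>>> from pop_model.celery import debug_task
>>> result = debug_task.delay()
>>> result.status   # 'PENDING' until a running worker picks it up, then 'SUCCESS'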

celery not working in django and just waiting (pending)

I'm trying to understand how Celery works. I have a project with about 10 apps, and now I want to use Celery.
settings.py:
CELERY_BROKER_URL = 'amqp://rabbitmq:rabbitmq@localhost:5672/rabbitmq_vhost'
CELERY_RESULT_BACKEND = 'redis://localhost'
I created a user in RabbitMQ with this info: username rabbitmq and password rabbitmq. Then I created a vhost named rabbitmq_vhost and gave the rabbitmq user permissions on it. I think that part is fine, because all the errors about RabbitMQ have disappeared.
Here is my test.py:
from .task import when_task_expiration
def test_celery():
    result = when_task_expiration.apply_async((2, 2), countdown=3)
    print(result.get())
task.py:
from __future__ import absolute_import, unicode_literals
import logging
from celery import shared_task
from proj.celery import app
@app.task
def when_task_expiration(task, x):
    print(task.id, 'task done')
    return True
celery.py:
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')
app = Celery('proj')
# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
# should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')
# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
Now when I call test_celery() in the Python shell, the task just stays pending. I tried swapping between @shared_task and @app.task(bind=True), but nothing changed. I even tried .delay() instead of apply_async((2, 2), countdown=3), and again nothing happened.
I'm trying to use Celery to call a function at a specific time, as in this question I asked in the past. Thank you.
You most likely forgot to run at least one Celery worker process. To do so, execute the following in the shell: celery worker -A proj.celery -c 4 -l DEBUG (here I assume your Celery application is defined in proj/celery.py, as you have Celery('proj') in there).
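Once that worker is running, the round trip can be checked from the Django shell (a sketch; 'yourapp' below is a placeholder for whichever app contains task.py):
python manage.py shell
>>> from yourapp.task import when_task_expiration   # placeholder import path
>>> result = when_task_expiration.apply_async((2, 2), countdown=3)
>>> result.get(timeout=10)   # returns the task's result, or re-raises any exception it hit
Note that with the task defined as above, the first positional argument is just the integer 2, so print(task.id, ...) will raise AttributeError inside the worker; the usual way to get at the task instance is @app.task(bind=True) with self as the first parameter.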

How to set max task per child in Celery with Django?

I am trying to set some settings for Celery in my Django setup, but wherever I put this:
CELERYD_MAX_TASKS_PER_CHILD=1
it always allows multiple tasks to start at the same time. I tried putting it in settings.py and proj.settings. My celery.py is as follows:
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')
app = Celery('proj', backend='redis://', broker='redis://localhost')
# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
# should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')
# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
The place where it should go is settings.py:
CELERY_WORKER_CONCURRENCY = 1 # this is the one I was actually looking for
CELERY_WORKER_MAX_TASKS_PER_CHILD = 1  # new-style worker_max_tasks_per_child under the CELERY_ namespace
There is no limit by default.
Try
from celery import conf
conf.CELERYD_MAX_TASKS_PER_CHILD = 1 #max_tasks_per_child
Also, you can pass it on the command line as a parameter when starting the worker (depending on the version):
--maxtasksperchild=1
or
--max-tasks-per-child=1
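For example, a worker restricted to one task per child process can be started like this (a sketch; proj is the app name from the snippet above):
celery worker -A proj --concurrency=1 --max-tasks-per-child=1 --loglevel=info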

celery daemon production local config file without django

I am a newbie to Celery. I created a project as per the instructions provided by the Celery 4.1 docs. Below are my project folders and files:
mycelery
|
test_celery
|
celery_app.py
tasks.py
__init__.py
1-celery_app.py
from __future__ import absolute_import
import os
from celery import Celery
from kombu import Queue, Exchange
from celery.schedules import crontab
import datetime
app = Celery('test_celery',
             broker='amqp://jimmy:jimmy123@localhost/jimmy_v_host',
             backend='rpc://',
             include=['test_celery.tasks'])
# Optional configuration, see the application user guide.
app.conf.update(
    result_expires=3600,
)
if __name__ == '__main__':
    app.start()
app.name
2-tasks.py
from __future__ import absolute_import
from test_celery.celery_app import app
import time
from kombu import Queue, Exchange
from celery.schedules import crontab
import datetime
app.conf.beat_schedule = {
    'planner_1': {
        'task': 'test_celery.tasks.printTask',
        'schedule': crontab(minute='*/1'),
    },
}

@app.task
def longtime_add(x, y):
    print 'long time task begins'
    # sleep 5 seconds
    time.sleep(5)
    print 'long time task finished'
    return x + y

@app.task
def printTask():
    print 'Hello i am running'
    time = str(datetime.datetime.now())
    file = open('/home/hub9/mycelery/data.log', 'ab')
    file.write(time)
    file.close()
I copied the celeryd and celerybeat files from the Celery GitHub project into /etc/init.d/ and made them executable. Then I created celeryd and celerybeat files in /etc/default/.
I- /etc/default/celeryd
# Names of nodes to start
# most will only start one node:
#CELERYD_NODES="worker1"
# but you can also start multiple and configure settings
# for each in CELERYD_OPTS (see `celery multi --help` for examples).
CELERYD_NODES="worker1 worker2 worker3"
# Absolute or relative path to the 'celery' command:
CELERY_BIN="/usr/local/bin/celery"
#CELERY_BIN="/virtualenvs/def/bin/celery"
# Where to chdir at start. path to folder containing task
CELERYD_CHDIR="/home/hub9/mycelery/test_celery/"
# App instance to use
# comment out this line if you don't use an app
#CELERY_APP="file/location/of/app"
# or fully qualified:
CELERY_APP="test_celery.celery_app:app"
# Extra command-line arguments to the worker
CELERYD_OPTS="--time-limit=3000 --concurrency=3 --config=celeryconfig"
# %N will be replaced with the first part of the nodename.
CELERYD_LOG_FILE="/var/log/celery/%N.log"
CELERYD_PID_FILE="/var/run/celery/%N.pid"
# Workers should run as an unprivileged user.
# You need to create this user manually (or you can choose
# a user/group combination that already exists, e.g. nobody).
CELERYD_USER="celery"
CELERYD_GROUP="celery"
# If enabled pid and log directories will be created if missing,
# and owned by the userid/group configured.
CELERY_CREATE_DIRS=1
II- /etc/default/celerybeat
# Names of nodes to start
# most will only start one node:
#CELERYD_NODES="worker1"
# but you can also start multiple and configure settings
# for each in CELERYD_OPTS (see `celery multi --help` for examples).
CELERYD_NODES="worker1 worker2 worker3"
# Absolute or relative path to the 'celery' command:
CELERY_BIN="/usr/local/bin/celery"
#CELERY_BIN="/virtualenvs/def/bin/celery"
# Where to chdir at start. path to folder containing task
CELERYD_CHDIR="/home/hub9/mycelery/test_celery/"
# App instance to use
# comment out this line if you don't use an app
#CELERY_APP="file/location/of/app"
# or fully qualified:
CELERY_APP="test_celery.celery_app:app"
# Extra command-line arguments to the worker
CELERYD_OPTS="--time-limit=3000 --concurrency=3 --config=celeryconfig"
# %N will be replaced with the first part of the nodename.
CELERYD_LOG_FILE="/var/log/celery/%N.log"
CELERYD_PID_FILE="/var/run/celery/%N.pid"
# Workers should run as an unprivileged user.
# You need to create this user manually (or you can choose
# a user/group combination that already exists, e.g. nobody).
CELERYD_USER="celery"
CELERYD_GROUP="celery"
# If enabled pid and log directories will be created if missing,
# and owned by the userid/group configured.
CELERY_CREATE_DIRS=1
After that, I created the celery user and group.
Here is my problem: I can successfully run this project using the celery -A test_celery.celery_app worker -l info --beat command, but when I start it using sudo service celeryd start OR sudo service celerybeat start,
it gives me an import error: no module named test_celery.celery_app.
Please give me a hint about what I am doing wrong.
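One thing worth checking (a sketch of a likely cause, not a confirmed fix): CELERY_APP="test_celery.celery_app:app" requires the directory that contains the test_celery package, i.e. /home/hub9/mycelery, to be on the import path, while CELERYD_CHDIR currently points inside the package itself. The corresponding lines in both /etc/default files would look like:
# Where to chdir at start: the parent of the test_celery package
CELERYD_CHDIR="/home/hub9/mycelery"
# or, keeping the current chdir, put the parent directory on the import path instead:
#export PYTHONPATH="$PYTHONPATH:/home/hub9/mycelery"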