"module not found" when running Celery with supervisor - django

I'm trying to run celery with django using supervisor.
supervisor_celery.conf
[program:supervisor-celery]
command=/home/user/project/virtualenvironment/bin/celery worker -A project --loglevel=INFO
directory=/home/user/project/project
user=nobody
numprocs=1
stdout_logfile=/home/user/project/logs/celery.log
stderr_logfile=/home/user/project/logs/celery.log
autostart=true
autorestart=true
startsecs=10
stopwaitsecs = 600
stopasgroup=true
priority=1000
On running supervisor, I got the following error in the log file:
Unable to load celery application.
The module project was not found.
The project structure is:
project
|- project
|  |- settings
|  |  |- production.py
|  |- __init__.py
|  |- celery.py
|  |- urls.py
|  |- wsgi.py
|- app
The contents of __init__.py are:
from __future__ import absolute_import, unicode_literals
from .celery import app as celery_app
__all__ = ('celery_app',)
The content of celery.py is:
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project.settings.production')
app = Celery('project')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()
It would be helpful if anyone could tell me why it's not working.

It seems that your directory is wrong (in the supervisor conf); it should be:
directory=/home/user/project
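With -A project, the worker has to be started from the directory that contains the project package so that Python can import it. For reference, a corrected stanza might look like the sketch below, built entirely from the question's own config; only the directory line changes:

[program:supervisor-celery]
command=/home/user/project/virtualenvironment/bin/celery worker -A project --loglevel=INFO
directory=/home/user/project
user=nobody
numprocs=1
stdout_logfile=/home/user/project/logs/celery.log
stderr_logfile=/home/user/project/logs/celery.log
autostart=true
autorestart=true
startsecs=10
stopwaitsecs=600
stopasgroup=true
priority=1000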

Related

django standalone script: cannot import name 'Celery' from 'celery'

I am trying to run a standalone Django script:
import os, sys, django
proj_path = "/path/to/django-project"
import ipdb; ipdb.set_trace()
# This is so Django knows where to find stuff.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "boiler.settings")
sys.path.append(proj_path)
django.setup()
When I run it, it says:
ImportError: cannot import name 'Celery' from 'celery' (/path/to/django-poject/boiler/celery.py)
My folder structure:
django-poject
-- boiler
   -- __init__.py
   -- settings.py
   -- celery.py
-- manage.py
__init__.py
from .celery import app as celery_app
__all__ = ['celery_app']
celery.py
import os
from celery import Celery
import django
import sys
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'boiler.settings')
#This defines the celery app instance
redis = 'redis://:pass@localhost:6379/0'
app = Celery(dirname,
             broker=redis,
             backend=redis
             )
I am able to run celery using
activate virtualenv
cd to django-poject
celery -A boiler worker --loglevel=debug
without any problems
But in standalone mode it's creating problems.
You have to name your celery.py something else, like django_celery.py, otherwise it won't work. Celery works fine on its own that way, but you want to integrate it with Django, and as Santhosh said, the absolute import of itself is giving you issues.
In your project's __init__.py you'll need something like:
from __future__ import absolute_import, unicode_literals
from your_path_to.django_celery import app as celery_app
__all__ = ('celery_app',)
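For illustration, a minimal sketch of the renamed module itself, assembled only from the pieces in the question (the broker URL is the question's placeholder and 'boiler' stands in for the undefined dirname):

boiler/django_celery.py
import os
from celery import Celery  # now resolves to the installed celery package, not this file

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'boiler.settings')

redis = 'redis://:pass@localhost:6379/0'  # placeholder credentials from the question
app = Celery('boiler', broker=redis, backend=redis)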

celery + redis Cannot connect to amqp://guest:**@127.0.0.1:5672//: [Errno 111] Connection refused

I cannot run the celery worker + redis + django. If I run this command to check that the celery worker is ready to receive tasks:
celery -A car_rental worker -l info
I got this error:
[2020-02-24 00:14:42,188: ERROR/MainProcess] consumer: Cannot connect to amqp://guest:**@127.0.0.1:5672//: [Errno 111] Connection refused.
Trying again in 2.00 seconds...
In my settings.py I have this:
BROKER_URL = 'redis://localhost:6379'
requirements.txt:
amqp==2.5.2
asgiref==3.2.3
billiard==3.6.2.0
celery==4.4.0
redis==3.4.1
celery.py:
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'car_rental.settings')
app = Celery('car_rental')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()
@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
car_rental/__init__.py:
from __future__ import absolute_import, unicode_literals
from .celery import app as celery_app
__all__ = ('celery_app',)
and the structure of my project is like this:
car_rental
    /car_rental
        __init__.py
        celery.py
        settings.py
What I don't understand is that I am using BROKER_URL = 'redis://localhost:6379' in my settings, but the error says: Cannot connect to amqp://guest:**@127.0.0.1:5672//
In this case it should work if you change the setting from BROKER_URL to CELERY_BROKER_URL. When you gave the app its namespace here:
app.config_from_object('django.conf:settings', namespace='CELERY')
you told Celery to read only settings that carry that prefix, so BROKER_URL has to be renamed to CELERY_BROKER_URL.
CELERY_BROKER_URL = 'redis://localhost:6379'
Another example:
app.config_from_object('django.conf:settings', namespace='CAR')
CAR_BROKER_URL = 'redis://localhost:6379'
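In other words, the settings side would look like this minimal sketch (the result backend line is only an illustration of the same prefix rule, not something from the question):

# settings.py, with namespace='CELERY'
CELERY_BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'  # optional; the same CELERY_ prefix rule applies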
I had configured Celery exactly as Dantheman91 explained, and still I faced the same issue.
Instead of having one settings.py in my project, I was using base.py, development.py, production.py, and test.py inside a module named settings.
So, if you are switching between these roles/instances, do not forget to change the same in celery.py.
In a similar scenario, use:
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj_name.settings.development')
It should be the same as you use in the manage.py of your project.
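As a sketch of how that maps onto the split-settings layout described above (the module names are the ones listed in this answer, applied to this question's car_rental project):

# car_rental/settings/ contains base.py, development.py, production.py, test.py
# manage.py and celery.py must both point at the same concrete module:
import os
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'car_rental.settings.development')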

Running Management command with celery?

I have never used Celery before and am trying to configure it correctly. I am using redis as the broker and hosting on heroku. This is my first time trying to run asynchronous tasks and I'm struggling. I have a Management command that I would like to run periodically.
celery.py
from __future__ import absolute_import, unicode_literals
import os
import celery
from celery import Celery
import django
from django.conf import settings
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'coffee.settings')
app = Celery('coffee')
app.config_from_object('django.conf:settings', namespace = 'CELERY')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
app.conf.beat_schedule = {
    'add-every-30-seconds': {
        'task': 'inventory.tasks.boarshead',
        'schedule': 30.0,
        'args': ()
    },
}
settings.py
CACHES = {
    "default": {
        "BACKEND": "redis_cache.RedisCache",
        "LOCATION": os.environ.get('REDIS_URL'),
    }
}
tasks.py
from celery import shared_task
import celery
import time
from django.core import management
@celery.task
def boarshead():
    try:
        print("in celery module")
        """Boarshead expired sessions by using Django Management Command."""
        management.call_command("clearsessions", verbosity=0)
        # CreateBoarsHeadList.py
        return "success"
    except Exception as e:
        print(e)
__init__.py
from __future__ import absolute_import, unicode_literals
from .celery import app as celery_app
Procfile
worker: celery worker --app=tasks.inventory.app
With Celery + RabbitMQ (and Redis, which I haven't used as a backend for years) you will need a Procfile entry for the "web" (Django) process and one for the worker, which I did not see listed. The worker/dyno allocation is what gives the worker access to the manage functionality. Here is the Procfile from one of my apps:
web: gunicorn SOME_APP.wsgi --log-file -
worker: celery worker -A QUEUE_APP_NAME -l info --without-gossip --without-mingle --without-heartbeat
QUEUE_APP_NAME is the name of a module (app) where I keep all my Celery work and code. The worker is started via the Procfile against the QUEUE_APP_NAME module (dir), which contains code similar to your Celery file. This may not solve your problem, but getting Celery working is a slow battle.
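Since the question is about running a management command periodically, here is a minimal sketch of the task side (the task path mirrors the question's inventory.tasks.boarshead, and clearsessions is the built-in command already used above). The beat_schedule entry from the question then refers to this task by its dotted name, and the schedule itself needs a celery beat process (or the worker's -B flag) running alongside the worker:

# inventory/tasks.py -- a sketch, assuming an app named "inventory"
from celery import shared_task
from django.core import management

@shared_task
def boarshead():
    # call_command runs the management command in-process inside the worker
    management.call_command("clearsessions", verbosity=0)
    return "success"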

Celery doesn't see tasks

Here is my celery config:
config/celery.py
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
import sys
from django.conf import settings
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'config.settings')
app = Celery('config',
             backend=os.getenv('REDIS_URL'),
             broker=os.getenv('CLOUDAMQP_URL'))
app.conf.update(BROKER_URL=os.getenv('CLOUDAMQP_URL', 'redis://localhost'),
                CELERY_RESULT_BACKEND=os.getenv('REDIS_URL', 'redis://localhost'))
app.config_from_object('django.conf:settings', namespace='CELERY')
sys.path.append(os.path.join(os.getcwd(), "applications"))
app.autodiscover_tasks()
TASK_SERIALIZER = 'json'
Celery can't find tasks in the following structure:
project_name/
    apps/
        users/
            tasks.py
    config/
        celery.py
All my apps are registered in INSTALLED_APPS and I'm using app registration via apps.py files.
According to Celery's documentation, your file config/__init__.py should have something similar to this:
from __future__ import absolute_import, unicode_literals
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app
__all__ = ('celery_app',)
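For completeness, a minimal sketch of a task module that autodiscovery should then pick up (the apps/users/tasks.py path comes from the question's layout; the task name and body are purely hypothetical):

# apps/users/tasks.py
from celery import shared_task

@shared_task
def sync_users():
    # hypothetical task body, just to show what autodiscover_tasks() registers
    print('syncing users')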

Celery ImportError: No module named tasks

I'm creating a test scenario for Celery/RabbitMQ/Django. After browsing/reading the various posts similar to mine, I found this one, which is the closest, but it still does not help me. I'm getting the "ImportError: no module named tasks" error when executing celery worker.
Celery: 3.1.5 (not dj-celery)
Django: 1.5.5
Project structure:
testcele/ (project name)
    mycelery/ (my app)
        __init__
        tasks
    testcele/
        __init__
        celery_task
        settings
testcele/testcele/celery_task.py:
from __future__ import absolute_import
import os
from celery import Celery, task, current_task
from django.conf import settings
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'testcele.settings')
app = Celery('testcele', backend='amqp', broker='amqp://guest@localhost//',
             include=['tasks'])
if __name__ == '__main__':
    app.start()
# Using a string here means the worker will not have to
# pickle the object when using Windows.
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
testcele/testcele/__init__.py:
from __future__ import absolute_import
from .celery_task import app as celery_app
mycelery/tasks.py:
from __future__ import absolute_import
from celery import Celery, task, current_task, shared_task
@shared_task()
def create_models():
.
.
.
I'm running "celery worker -A testcele -l INFO" from the "testcele/" sub-dir. I have also tried running it from the testcele/testcele sub-dir and from testcele/mycelery, and replacing "testcele" in the celery worker command with "tasks" or "mycelery". Obviously, this gives other errors.
What am I missing?
Thanks, Ricardo
Try adding an __init__.py file in your mycelery folder to make it a package. If that doesn't work, specify the tasks explicitly when defining your app, like so:
app = Celery('testcele', backend='amqp', broker='amqp://guest@localhost//',
             include=['mycelery.tasks'])
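As a usage sketch (assuming the include fix above, a reachable broker, and a worker started from the testcele/ project root), the task can then be queued through its normal import path:

# e.g. from a Django shell opened at the testcele/ project root
from mycelery.tasks import create_models

result = create_models.delay()  # sends the task to the broker; the running worker executes it
print(result.id)                # AsyncResult id, useful for tracking state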