Celery ImportError: No module named tasks - django

I'm creating a test scenario for Celery/RabbitMQ/Django. After browsing and reading the various posts similar to mine, I found this one, the closest, but it still does not help me. I'm getting an "ImportError: No module named tasks" error when executing celery worker.
Celery: 3.1.5 (not dj-celery)
Django: 1.5.5
Project structure:
testcele/ (project name)
    mycelery/ (myapp)
        __init__.py
        tasks.py
    testcele/
        __init__.py
        celery_task.py
        settings.py
testcele/testcele/celery_task.py:
from __future__ import absolute_import
import os
from celery import Celery, task, current_task
from django.conf import settings

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'testcele.settings')

app = Celery('testcele', backend='amqp', broker='amqp://guest@localhost//',
             include=['tasks'])

if __name__ == '__main__':
    app.start()

# Using a string here means the worker will not have to
# pickle the object when using Windows.
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
testcele/testcele/__init__.py:
from __future__ import absolute_import
from .celery_task import app as celery_app
mycelery/tasks.py:
from __future__ import absolute_import
from celery import Celery, task, current_task, shared_task

@shared_task()
def create_models():
    .
    .
    .
I'm running "celery worker -A testcele -l INFO" from the "testcele/" sub-dir. I have also tried running it from the testcele/testcele sub-dir and from testcele/mycelery, and tried replacing "testcele" in the celery worker command with "tasks" or "mycelery". Obviously, these give other errors.
What am I missing?
Thanks, Ricardo

Try adding an __init__.py file to your mycelery folder to make it a package. If that doesn't work, specify the tasks module explicitly when defining your app, like so:
app = Celery('testcele', backend='amqp', broker='amqp://guest@localhost//',
             include=['mycelery.tasks'])
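Putting it together, a minimal sketch of what testcele/testcele/celery_task.py might look like with that change (assuming 'mycelery' is listed in INSTALLED_APPS, so autodiscovery can also find it):

from __future__ import absolute_import
import os
from celery import Celery
from django.conf import settings

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'testcele.settings')

# 'mycelery.tasks' is the fully qualified module path, so the worker
# can import it regardless of the directory it is started from.
app = Celery('testcele', backend='amqp', broker='amqp://guest@localhost//',
             include=['mycelery.tasks'])
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)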

Related

django standalone script: cannot import name 'Celery' from 'celery'

I am trying to run a standalone Django script:
import os, sys, django
proj_path = "/path/to/django-project"
import ipdb; ipdb.set_trace()
# This is so Django knows where to find stuff.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "boiler.settings")
sys.path.append(proj_path)
django.setup()
When I run it, it says:
ImportError: cannot import name 'Celery' from 'celery' (/path/to/django-poject/boiler/celery.py)
My folder structure:
django-poject/
    boiler/
        __init__.py
        settings.py
        celery.py
    manage.py
__init__.py
from .celery import app as celery_app
__all__ = ['celery_app']
celery.py
import os
from celery import Celery
import django
import sys

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'boiler.settings')

# This defines the celery app instance
redis = 'redis://:pass@localhost:6379/0'
app = Celery(dirname,
             broker=redis,
             backend=redis
             )
I am able to run celery by activating the virtualenv, cd-ing into django-poject, and running:
celery -A boiler worker --loglevel=debug
without any problems. But in the standalone script it creates problems.
You have to name your celery.py something else, like django_celery.py, otherwise it won't work. Celery works fine that way on its own, but you want to integrate it with Django, and as Santhosh said, the module's absolute import of itself is giving you issues.
In your project's __init__.py you'll need something like:
from __future__ import absolute_import, unicode_literals
from your_path_to.django_celery import app as celery_app
__all__ = ('celery_app',)
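After the rename, a minimal sketch of the standalone script (the project path and module names are taken from the question):

import os
import sys

# This is so Python can find the 'boiler' package.
proj_path = "/path/to/django-project"
sys.path.append(proj_path)

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "boiler.settings")

import django
django.setup()

# The app now lives in django_celery.py, so importing it no longer
# collides with the real 'celery' package:
from boiler.django_celery import app as celery_app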

Django celery register periodic task

I'm using Celery 4.4 with Django 2.2
I have to create a periodic task, and I'm extending PeriodicTask as:
from celery.schedules import crontab
from celery.task import PeriodicTask
from django.conf import settings

class IncompleteOrderHandler(PeriodicTask):
    run_every = crontab(
        minute='*/{}'.format(getattr(settings, 'INCOMPLETE_ORDER_HANDLER_PULSE', 5))
    )

    def run(self, *args, **kwargs):
        # Task definition
        eligible_users, slot_begin, slot_end = self.get_users_in_last_slot()
        map(lambda user: self.process_user(user, slot_begin, slot_end), eligible_users)
Earlier, to register the above task, I used to call:
from celery.registry import tasks
tasks.register(IncompleteOrderHandler)
But now there is no registry module in Celery. How can I register the above periodic task?
I had the same problem with class-based Celery tasks. They should work, but they don't!
By accident, my problem was solved by each one of these two changes:
I imported one of the class-based tasks from tasks.py into viewsets.py, and suddenly realized that, after doing that, Celery found all of the tasks in tasks.py.
This was my base celery setting file:
from __future__ import absolute_import
import os
from celery import Celery
from django.conf import settings
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'picha.settings')
app = Celery('picha')
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
I changed the last line to app.autodiscover_tasks(lambda: settings.CELERY_TASKS), added a CELERY_TASKS list to settings.py containing all the tasks.py module paths, and then Celery found the tasks.
I hope one of these works for you.
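Alternatively, since celery.registry is gone, the pattern the Celery 4.x docs recommend is a plain function task plus a beat_schedule entry instead of a PeriodicTask subclass. A rough sketch adapted from the question (the module path myapp.tasks and the function body are placeholders):

from celery.schedules import crontab
from django.conf import settings

from picha.celery import app  # your Celery app instance

@app.task
def handle_incomplete_orders():
    # former IncompleteOrderHandler.run() body goes here
    pass

app.conf.beat_schedule = {
    'incomplete-order-handler': {
        'task': 'myapp.tasks.handle_incomplete_orders',  # hypothetical path
        'schedule': crontab(
            minute='*/{}'.format(getattr(settings, 'INCOMPLETE_ORDER_HANDLER_PULSE', 5))
        ),
    },
}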

celery not working in django and just waiting (pending)

I'm trying to figure out how Celery works. I have a project with about 10 apps, and now I want to use Celery.
settings.py:
CELERY_BROKER_URL = 'amqp://rabbitmq:rabbitmq@localhost:5672/rabbitmq_vhost'
CELERY_RESULT_BACKEND = 'redis://localhost'
I created a user in RabbitMQ with username rabbitmq and password rabbitmq, then created a vhost named rabbitmq_vhost and gave the rabbitmq user permission on it. Everything is fine, I think, because all the errors about RabbitMQ disappeared.
Here is my test.py:
from .task import when_task_expiration

def test_celery():
    result = when_task_expiration.apply_async((2, 2), countdown=3)
    print(result.get())
task.py:
from __future__ import absolute_import, unicode_literals
import logging
from celery import shared_task
from proj.celery import app

@app.task
def when_task_expiration(task, x):
    print(task.id, 'task done')
    return True
celery.py:
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')
app = Celery('proj')
# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
# should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')
# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
Now when I call test_celery() in the Python shell, the task just stays pending. I tried replacing @app.task with @shared_task and @app.task(bind=True), but nothing changed. I even tried .delay() instead of apply_async((2, 2), countdown=3), and again nothing happened.
I'm trying to use Celery to call a function at a specific time, as in this question that I asked in the past. Thank you.
You most likely forgot to run at least one Celery worker process. To do so, execute the following in the shell: celery worker -A proj.celery -c 4 -l DEBUG (here I assume your Celery application is defined in proj/celery.py, since you have Celery('proj') in there).
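As a quick sanity check once a worker is up, you can inspect the task state before blocking on the result. A sketch (the import path is hypothetical; it depends on where your task module actually lives):

from yourapp.task import when_task_expiration  # hypothetical import path

result = when_task_expiration.apply_async((2, 2), countdown=3)
print(result.state)            # stays PENDING until a worker picks it up
print(result.get(timeout=10))  # raises celery.exceptions.TimeoutError if no worker runs it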

Creating the first Celery task - Django. Error - "ERROR/MainProcess] consumer: Cannot connect to amqp://guest:**@127.0.0.1:5672//:"

I'm trying to create my first Celery task. The task will send the same e-mail to the same person every minute.
Following the documentation, I created my first task in my project:
from __future__ import absolute_import, unicode_literals
from celery import shared_task
from django.core.mail import send_mail

@shared_task
def send_message():
    to = ['test@test.com', ]
    send_mail('TEST TOPIC',
              'TEST MESSAGE',
              'test@test.com',
              to)
Then, in my project's folder, I added the celery.py file, which looks like this:
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from django.conf import settings
from celery.schedules import crontab

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'app_rama.settings')

app = Celery('app_rama')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks(settings.INSTALLED_APPS)

app.conf.beat_schedule = {
    'send-message-every-single-minute': {
        'task': 'app.tasks.send_message',
        'schedule': crontab(),  # change to `crontab(minute=0, hour=0)` if you want it to run daily at midnight
    },
}
Then in the __init__.py file of my project I added:
from __future__ import absolute_import, unicode_literals
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app
__all__ = ('celery_app',)
And the last thing I try to do is run the command:
celery -A app_rama worker -l info
And then I receive the following error:
[2019-06-27 16:01:26,750: ERROR/MainProcess] consumer: Cannot connect to amqp://guest:**@127.0.0.1:5672//: [WinError 10061]
I tried many solutions from the forum, but I did not find the correct one.
Adding the following setting to my settings.py file did not help either:
CELERY_BROKER_URL = 'amqp://guest:guest@localhost:5672//'
How can I solve this error so that my task works in the background of the application?
Your Celery broker is probably misconfigured. Read the "Using RabbitMQ" document to find out how to set up RabbitMQ properly (I assume you want to use RabbitMQ, as you have the "amqp" protocol in your example).
I recommend learning Celery with Redis, as it is easier to set up and manage. Then, once you learn the basics, you may decide to move to RabbitMQ or some other supported broker...
Also, verify that your RabbitMQ server is running properly. If you use Windows, make sure no software on it prevents user processes from connecting to localhost:5672.
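If you try the Redis route first, a minimal sketch of the configuration (assuming a default Redis install on localhost, and noting that the CELERY_-prefixed names are only read when the namespace is set):

# settings.py -- assumes Redis is running locally on the default port
CELERY_BROKER_URL = 'redis://localhost:6379/0'
CELERY_RESULT_BACKEND = 'redis://localhost:6379/1'

# celery.py -- the snippet in the question omits the namespace argument,
# without which the CELERY_* settings above are not picked up
app.config_from_object('django.conf:settings', namespace='CELERY')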

Running Management command with celery?

I have never used Celery before and am trying to configure it correctly. I am using redis as the broker and hosting on heroku. This is my first time trying to run asynchronous tasks and I'm struggling. I have a Management command that I would like to run periodically.
celery.py
from __future__ import absolute_import, unicode_literals
import os
import celery
from celery import Celery
import django
from django.conf import settings

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'coffee.settings')

app = Celery('coffee')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))

app.conf.beat_schedule = {
    'add-every-30-seconds': {
        'task': 'inventory.tasks.boarshead',
        'schedule': 30.0,
        'args': ()
    },
}
settings.py
CACHES = {
    "default": {
        "BACKEND": "redis_cache.RedisCache",
        "LOCATION": os.environ.get('REDIS_URL'),
    }
}
tasks.py
from celery import shared_task
import celery
import time
from django.core import management

@celery.task
def boarshead():
    try:
        print("in celery module")
        """Boarshead expired sessions by using Django Management Command."""
        management.call_command("clearsessions", verbosity=0)
        CreateBoarsHeadList.py
        return "success"
    except:
        print(e)
__init__.py
from __future__ import absolute_import, unicode_literals
from .celery import app as celery_app
Procfile:
worker: celery worker --app=tasks.inventory.app
With Celery + RabbitMQ (and Redis, though I have not used it as a backend for years) you will need a Procfile entry for the "web" process (Django) and one for the worker; I did not see a web entry listed. The worker/dyno allocation is what gives you access to the management functionality. Here is the Procfile from one of my apps:
web: gunicorn SOME_APP.wsgi --log-file -
worker: celery worker -A QUEUE_APP_NAME -l info --without-gossip --without-mingle --without-heartbeat
QUEUE_APP_NAME is the name of a module (app) where I have all my Celery work and code. The worker is invoked via the Procfile against the QUEUE_APP_NAME module (dir), with code similar to your Celery file. This may not solve your problem, but getting Celery working is a slow battle.
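Adapted to the project in the question, a sketch of what that Procfile might look like ('coffee' is the Celery app defined in the celery.py above; the gunicorn line assumes a standard coffee/wsgi.py exists):

web: gunicorn coffee.wsgi --log-file -
worker: celery worker -A coffee -l info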