No module named 'project_name' with Django + Celery?

Day 2 of debugging this and I have to turn to Stack Overflow; I'm on the edge.
I used cookiecutter-django to generate my project months ago.
project_name/
    config/
        settings/...
    src/
        app_name/
            __init__.py
    manage.py
When I create celery.py inside config/, I get an error because the file is named after the celery package itself and shadows it. I googled and tried using absolute_import, but in the end I just named the file celery_app.py, like the cookiecutter-django folks did.
celery_app.py
import os
from celery import Celery
# set the default Django settings module for the 'celery' program.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "config.settings.base")
app = Celery("my_awesome_project")  # I tried this with 10 different names; it doesn't make any difference
# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
# should have a `CELERY_` prefix.
app.config_from_object("django.conf:settings", namespace="CELERY")
# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
__init__.py inside config/:
#from __future__ import absolute_import, unicode_literals
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery_app import app as celery_app
__all__ = ('celery_app',)
Last lines of the traceback:
File "<frozen importlib._bootstrap>", line 1030, in _gcd_import
File "<frozen importlib._bootstrap>", line 1007, in _find_and_load
File "<frozen importlib._bootstrap>", line 984, in _find_and_load_unlocked
ModuleNotFoundError: No module named '<project_name>'
There was a bug with Python 3.7 and Celery where you had to downgrade importlib-metadata with pip3 install importlib-metadata==4.13.0. I am using Python 3.9.6 and have tried both downgrading and installing the latest importlib-metadata.
I tried running all variations of celery -A config.celery_app worker -l info and celery -A config.celery_app:app worker -l DEBUG from the root folder.
From cookiecutter-django README.md:
To run a celery worker:
cd my_awesome_project
celery -A config.celery_app worker -l info
Please note: For Celery's import magic to work, it is important where the celery commands are run. If you are in the same folder with manage.py, you should be right.
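To take the guesswork out of that, here is a throwaway snippet (hypothetical, not part of cookiecutter-django) that mimics the CLI resolving its -A target; run it from the same folder you start the worker in:

# debug_import.py - mimic `celery -A config.celery_app` resolving its target
import importlib
import os
import sys

sys.path.insert(0, os.getcwd())  # the celery CLI effectively has the cwd on sys.path
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "config.settings.base")

# Raises the same ModuleNotFoundError the worker shows if run from the wrong folder.
module = importlib.import_module("config.celery_app")
print("imported OK:", module.app)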
I am also using Redis: redis-server is running on localhost, and inside redis-cli, ping returns PONG, so it works.
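The same check can be done from Python; a minimal sketch, assuming the redis-py client is installed and the default port:

# Equivalent of `redis-cli ping` (sketch)
import redis

r = redis.Redis(host="localhost", port=6379)
print(r.ping())  # True when the broker is reachable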
Where I started:
https://realpython.com/asynchronous-tasks-with-django-and-celery/
Posts with similar problems:
(Django-Celery Error) ImportError: No module named myproject
Celery ImportError: No module named proj
KeyError / frozen importlib._bootstrap error on second library import in spyder
Error while running celery worker : ModuleNotFoundError: No module named 'mysite'
https://www.reddit.com/r/django/comments/vcqr5e/celery_does_not_discovers_tasks_inside_django/
"Working" example:
https://github.com/stuartmaxwell/django-celery-example
If anybody has had a similar error, please leave some feedback; I would really appreciate it.

Added

import sys
from pathlib import Path

ROOT_DIR = Path(__file__).resolve(strict=True).parent.parent
sys.path.append(str(ROOT_DIR / "src"))

to celery_app.py, just like in the regular settings, and now it works.
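For context, here is the whole file with the fix applied; a sketch of my setup, assuming the apps live under src/ as in the tree above:

# celery_app.py, with the src/ path fix applied before anything imports the apps
import os
import sys
from pathlib import Path

from celery import Celery

# config/ sits one level below the project root, hence two .parent hops.
ROOT_DIR = Path(__file__).resolve(strict=True).parent.parent
sys.path.append(str(ROOT_DIR / "src"))  # make the src/-based apps importable

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "config.settings.base")

app = Celery("my_awesome_project")
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()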

Related

Error: Unable to load celery application. The module tasks was not found

I'm trying to get celery up and running on Heroku as per instructions here
When I try to run "heroku local", however, it gives me the following error:
10:05:42 PM worker.1 | Error:
10:05:42 PM worker.1 | Unable to load celery application.
10:05:42 PM worker.1 | The module tasks was not found.
Any help is much appreciated.
EDIT: It should be noted that I have a module tasks.py in my root directory with the following code in it:
import celery

app = celery.Celery('example')

@app.task
def add(x, y):
    return x + y
Based on the comments, I think you need to populate the __init__.py in your project folder (the same folder as celery.py). You can follow the Celery official documentation.
This is what you should add to __init__.py:
from __future__ import absolute_import, unicode_literals
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app
__all__ = ('celery_app',)
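Once that import is in place, tasks defined with shared_task anywhere in the project bind to this app automatically. A minimal sketch, with hypothetical names:

# myapp/tasks.py (hypothetical app) - no direct import of the Celery app needed
from celery import shared_task

@shared_task
def add(x, y):
    # works because __init__.py imported celery_app at startup
    return x + y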
Hope this helps.

KeyError from celery task delay

Following the celery getting started with Django instructions, I am able to run tasks, but not run the same task asynchronously using delay().
I added the following requirements to my Django project:
celery==4.3.0
redis==3.3.11
django-celery-results==1.1.2
psycopg2==2.7.3.1
django-cors-headers~=3.1.0
Created this celery.py in the pop_model project directory:
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'pop_model.settings.local')
app = Celery('pop_model')
# namespace='CELERY' means all celery-related configuration keys
# should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')
# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
Inserted this code in the project __init__.py:
from __future__ import absolute_import, unicode_literals
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app
__all__ = ('celery_app',)
Configured cors in the project settings and added these settings:
CELERY_BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'django-db' # defined in django_celery_results
CELERY_ACCEPT_CONTENT = ['json']
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TASK_SERIALIZER = 'json'
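With namespace='CELERY', each of these keys should land on app.conf with the prefix stripped and lowercased; a quick way to check, from ./manage.py shell:

# Sanity check (sketch) that the CELERY_-prefixed settings reached the app
from pop_model.celery import app

print(app.conf.result_backend)  # expect 'django-db'
print(app.conf.broker_url)      # expect 'redis://localhost:6379'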
I can start redis, then run celery using these commands:
export DJANGO_SETTINGS_MODULE=pop_model.settings.local
celery worker -A pop_model --loglevel=info
In a python3 shell, I get these results:
>>> from pop_model.celery import debug_task
>>> debug_task()
Request: <Context: {'args': (), 'kwargs': {}}>
>>> task=debug_task.delay()
Traceback (most recent call last):
File "/Users/janee/.virtualenvs/pop-model/lib/python3.6/site-packages/kombu/utils/objects.py", line 42, in __get__
return obj.__dict__[self.__name__]
KeyError: 'backend'
I don't know how to resolve the missing backend key as CELERY_RESULT_BACKEND is defined in the settings file.
The only difference between a normal Python shell and manage.py shell is that the latter exports your settings module (project_name.settings) in the DJANGO_SETTINGS_MODULE environment variable.
If you run the same interpreter with the proper environment variable set, you should see no change. So either your pop_model.settings.local path is not returning a proper settings module for your app to latch onto, or you're using a modified manage.py script (for development environment separation, I suppose) where the settings module is properly loaded.
You should be able to call your function using
./manage.py shell
from your project directory, using the same interpreter of your virtual environment. This could also work because DJANGO_SETTINGS_MODULE needs a path that is present in the interpreter's search path (more on that here), and you could be calling the interpreter from another directory.
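If you would rather stay in a plain python3 shell, you can reproduce what manage.py does by hand; a sketch, assuming the same settings path as above:

# Reproduce manage.py shell's environment in a plain interpreter (sketch)
import os

os.environ["DJANGO_SETTINGS_MODULE"] = "pop_model.settings.local"

import django
django.setup()  # load settings and the app registry (the django-db backend needs it)

from pop_model.celery import debug_task
result = debug_task.delay()  # the KeyError: 'backend' should be gone
print(result.id)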

Django + Heroku celery module not found

I'm trying to push the latest version of my webapp live to Heroku. I'm pretty comfortable pushing to Heroku and this just doesn't seem right. For some reason I feel like Heroku is skipping my requirements.txt. Just in case, I manually installed celery on my Heroku app.
I'm getting this specific problem with celery, but if Heroku is skipping my requirements.txt this might be a bigger problem.
1. If I run:
heroku run pip install celery
This lets me install the package over and over; shouldn't it kick back a "requirement already met" error?
2. When I try to push to Heroku, I keep getting a
File "/app/config/_celery.py", line 4, in <module>
from celery import Celery
ModuleNotFoundError: No module named 'celery'
For the life of me I can't figure out why. I've uninstalled celery and reinstalled it locally. It's in my requirements.txt (Heroku should install it upon push to the remote), and celery also seems to work locally just fine.
I'll include what I think is necessary, but let me know if I'm missing something that might provide the answer.
Here's my projects file structure:
POTRTMS (overall project folder)
|
+--config (holds settings)
   |
   +--settings
   |  |
   |  +--__init__.py
   |  +--production.py
   |  +--local.py
   |  +--base.py
   |
   +--_celery.py
   +--__init__.py (this is the __init__.py I'm referencing below)
_celery.py
from __future__ import absolute_import, unicode_literals
import sys
import os
from celery import Celery
from celery.schedules import crontab
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'config.settings.production')
app = Celery('POTRTMS')
# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
# should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')
# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
app.conf.beat_schedule = {
    'insurance_date': {
        'task': 'insurance_date',
        'schedule': crontab(minute=0, hour=8),
    },
}
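For reference, the schedule refers to the task purely by the name 'insurance_date', so a task has to register under exactly that name. The real body isn't shown in the question; a sketch of the shape:

# tasks.py in some app (sketch; only name= must match the beat schedule)
from celery import shared_task

@shared_task(name="insurance_date")
def insurance_date():
    # check policies, send reminders, etc. (hypothetical body)
    pass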
__init__.py
from __future__ import absolute_import
from ._celery import app as celery_app
I was able to figure out the problem. Inside my project root folder I had a Pipfile and Pipfile.lock, which are a newer alternative to requirements.txt.
I'm not sure how these files got into my project, but after deleting them all is working. (When a Pipfile is present, Heroku's Python buildpack uses it instead of requirements.txt, so celery was never installed on the dyno.)

unable to migrate djcelery in django 1.5.4

I am working on Django 1.5.4 and want to implement celery in my project. I installed celery (latest version) and then the django-celery package.
In my settings.py, I have included the following lines:
import djcelery
djcelery.setup_loader()
INSTALLED_APPS = (
# other apps,
'djcelery',
)
When I try to migrate djcelery (using South), it gives the following error:
lib/python2.7/site-packages/django/conf/__init__.py", line 134, in __init__
    raise ImportError("Could not import settings '%s' (Is it on sys.path?): %s" % (self.SETTINGS_MODULE, e))
ImportError: Could not import settings 'settings' (Is it on sys.path?): cannot import name Celery
I can't understand what is causing this error, because I have installed both celery and django-celery properly in my project's virtual environment.
I too had a similar problem, which was caused by a stale celery.pyc file within my project folder. Try removing celery.pyc; that might solve your problem.
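If you want to clear out every stale bytecode file at once instead of hunting down that one celery.pyc, a quick one-off sketch, run from the project root:

# remove_pyc.py - hypothetical cleanup of stale bytecode files
import pathlib

for pyc in pathlib.Path(".").rglob("*.pyc"):
    pyc.unlink()
    print("removed", pyc)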

Celery ImportError: No module named tasks

I'm creating a test scenario for Celery/RabbitMQ/Django. After browsing/reading the various posts similar to mine, I found this one, which is the closest, but it still does not help me. I'm getting the "ImportError: No module named tasks" error when executing celery worker.
Celery: 3.1.5 (not dj-celery)
Django: 1.5.5
Project structure:
testcele/ (project name)
    mycelery/ (my app)
        __init__.py
        tasks.py
    testcele/
        __init__.py
        celery_task.py
        settings.py
testcele/testcele/celery_task.py:
from __future__ import absolute_import
import os
from celery import Celery, task, current_task
from django.conf import settings

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'testcele.settings')

app = Celery('testcele', backend='amqp', broker='amqp://guest@localhost//',
             include=['tasks'])

if __name__ == '__main__':
    app.start()

# Using a string here means the worker will not have to
# pickle the object when using Windows.
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
testcele/testcele/__init__.py:
from __future__ import absolute_import
from .celery_task import app as celery_app
mycelery/tasks.py:
from __future__ import absolute_import
from celery import Celery, task, current_task, shared_task

@shared_task()
def create_models():
    ...
I'm running "celery worker -A testcele -l INFO" from the testcele/ sub-directory. I have also tried running it from the testcele/testcele sub-directory and from testcele/mycelery, and replacing "testcele" in the celery worker command with "tasks" or "mycelery". Obviously, this gives other errors.
What am I missing?
Thanks, Ricardo
Try adding an __init__.py file in your mycelery folder to make it a package. If that doesn't work, specify the tasks module when defining your app, like so:
app = Celery('testcele', backend='amqp', broker='amqp://guest@localhost//',
             include=['mycelery.tasks'])
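Alternatively, since the app already calls autodiscover_tasks(lambda: settings.INSTALLED_APPS), you could drop include= entirely and let autodiscovery find mycelery/tasks.py; a sketch, assuming 'mycelery' is listed in INSTALLED_APPS:

# celery_task.py variant relying on autodiscovery instead of include=[...]
from django.conf import settings
from celery import Celery

app = Celery('testcele', backend='amqp', broker='amqp://guest@localhost//')
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)  # finds mycelery.tasks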