Celery No hostname was supplied. Reverting to default 'localhost' - django

I have this in my /var/log/celery/w1.log
I'm following the steps for Celery here.
I have this in my celery.py
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery

# Set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'sample.settings')

app = Celery('sample2',
             broker='amqp://',
             include=['sample2.tasks'])

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

if __name__ == '__main__':
    app.start()
what can I do to fix this? Thanks in advance.

Full document in this link.
You should add the following lines to the __init__.py that sits next to settings.py:
from .celery import app as celery_app
__all__ = ['celery_app']
Project structure:

- proj/
  - manage.py
  - proj/
    - __init__.py
    - settings.py
    - urls.py
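The reason this fix works is plain Python import mechanics: when Django imports the project package, Python executes the package's __init__.py, which in turn imports celery.py and instantiates the app. A stand-alone sketch of that mechanism, using a throwaway package built in a temp directory (the string 'celery-app-instance' stands in for the real Celery('proj') object):

```python
import os
import sys
import tempfile

# Build a throwaway package mimicking proj/__init__.py importing proj/celery.py
root = tempfile.mkdtemp()
pkg = os.path.join(root, "proj")
os.makedirs(pkg)

with open(os.path.join(pkg, "celery.py"), "w") as f:
    f.write("app = 'celery-app-instance'\n")  # stands in for Celery('proj')

with open(os.path.join(pkg, "__init__.py"), "w") as f:
    # The two lines the answer tells you to add:
    f.write("from .celery import app as celery_app\n")
    f.write("__all__ = ['celery_app']\n")

sys.path.insert(0, root)
import proj  # importing the package runs __init__.py, which loads celery.py

print(proj.celery_app)  # -> celery-app-instance
```

Because the app is created as a side effect of importing the package, any `celery -A proj ...` invocation (and any @shared_task registration) sees a fully configured app instead of falling back to defaults like the 'localhost' hostname.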


django import-export-celery cannot import resource

I'm following this repo but I got this error:
Error: Import error cannot import name 'ProfileResource' from 'crowdfunding.models' (C:\_\_\_\_\_\crowdfunding\models.py)
The import is supposed to happen asynchronously, but it cannot find my ProfileResource. I have specified in my settings.py that the resource should be retrieved from admin.py:
def resource():
    from crowdfunding.admin import ProfileResource
    return ProfileResource

IMPORT_EXPORT_CELERY_MODELS = {
    "Profile": {
        'app_label': 'crowdfunding',
        'model_name': 'Profile',
        'resource': resource,
    }
}
but it can't seem to do that.
My celery.py is this:
from __future__ import absolute_import, unicode_literals
import os
import sys
from celery import Celery

# sys.path.append("../")
# Set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'mainapp.settings')
from django.conf import settings

app = Celery('mainapp',
             broker='amqp://guest:guest@localhost:15672//',
             # broker='localhost',
             # backend='rpc://',
             backend='db+sqlite:///db.sqlite3',
             # include=['crowdfunding.tasks']
             )

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()
and the broker and backend are working fine so it's just the config not being recognized. What could be the problem?
I believe the problem is that code changes are not picked up by Celery automatically. Every time you change the source code, you need to restart the Celery worker manually so it picks up the new import path you configured in settings.py.
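The callable stored under 'resource' in IMPORT_EXPORT_CELERY_MODELS is also worth understanding: keeping a function instead of the class defers the import until the resource is actually needed, which avoids import-order problems at startup - but the worker only ever runs the version of that function it loaded at boot, which is why a restart is required. A stand-alone sketch of the deferred-import idea (json.JSONDecoder stands in for crowdfunding.admin.ProfileResource; the lookup code is conceptual, not the library's actual implementation):

```python
def resource():
    # Imported only when the function is called, not when the
    # settings module is loaded (avoids circular imports).
    from json import JSONDecoder  # stands in for ProfileResource
    return JSONDecoder

IMPORT_EXPORT_CELERY_MODELS = {
    "Profile": {
        "app_label": "crowdfunding",
        "model_name": "Profile",
        "resource": resource,   # a callable, resolved later
    }
}

# What the consuming side does, conceptually: resolve the callable at use time.
entry = IMPORT_EXPORT_CELERY_MODELS["Profile"]["resource"]
resource_cls = entry() if callable(entry) else entry
print(resource_cls.__name__)  # -> JSONDecoder
```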

django with celery error:No result backend is configured

my version:
Django==3.2
celery==5.1.2
my settings.local:
CELERY_RESULT_BACKEND = 'redis://127.0.0.1:6379/1'
celery.py:
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from django.conf import settings

# Set the environment variable
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'cmdb.settings.local')

# Instantiate the app
# app = Celery('celeryPro', include=['message.tasks'])
app = Celery('celeryPro', backend='redis://127.0.0.1:6379/1')
# app = Celery('cmdb')

# namespace='CELERY' lets you configure Celery from the Django settings file,
# but every Celery setting must then start with CELERY_ to avoid clashes.
app.config_from_object('django.conf:settings', namespace='CELERY')
# app.config_from_object(config, namespace='CELERY')

# Automatically discover tasks in Django's registered apps
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
I always get the error.
Your setup is incorrect in two ways.
First, you are setting the backend only when creating the Celery instance, and then also calling config_from_object; per the docs, that call resets any previous configuration.
Second, you are passing the wrong config source to config_from_object. You need to pass the settings Celery should use, not the ones Django uses. You can find more info in the configuration docs.
As an example, you can have your celery.py file configured as below:
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from django.conf import settings

# Set the environment variable
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'cmdb.settings.local')

# Instantiate the app
# app = Celery('celeryPro', include=['message.tasks'])
# app = Celery('celeryPro', backend='redis://127.0.0.1:6379/1')
app = Celery('cmdb')

# Load the dedicated config module; its keys use the plain lowercase
# setting names, so no namespace prefix is involved here.
app.config_from_object('celery_config')

# Automatically discover tasks in Django's registered apps
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
and your celery_config.py file could be something like below:
broker_url = 'redis://localhost:6379/1'
result_backend = 'redis://localhost:6379/1'
Having your configuration for celery in a different file allows for more flexibility when you want to extend the configuration.
NOTE - you should keep the celery_config.py file in the root directory of the project i.e., in the same location as the manage.py file.
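For reference, namespace='CELERY' roughly means: read only the CELERY_-prefixed attributes of the given object, strip the prefix, and lowercase the remainder to get the Celery setting name. That is also why a dedicated config file with plain lowercase names like broker_url should be loaded without a namespace. A simplified sketch of the mapping (my own illustration, not Celery's actual implementation):

```python
def strip_namespace(settings: dict, namespace: str = "CELERY") -> dict:
    """Roughly what config_from_object(..., namespace='CELERY') does:
    keep only NAMESPACE_-prefixed keys and lowercase the remainder."""
    prefix = namespace + "_"
    return {
        key[len(prefix):].lower(): value
        for key, value in settings.items()
        if key.startswith(prefix)
    }

django_settings = {
    "CELERY_RESULT_BACKEND": "redis://127.0.0.1:6379/1",
    "CELERY_BROKER_URL": "redis://127.0.0.1:6379/1",
    "DEBUG": True,  # ignored: no CELERY_ prefix
}
print(strip_namespace(django_settings))
# -> {'result_backend': 'redis://127.0.0.1:6379/1',
#     'broker_url': 'redis://127.0.0.1:6379/1'}
```

Writing CELERY_RESULT_BACKEND without the prefix (or with a namespace it doesn't have) means Celery silently sees no backend at all, which is exactly the "No result backend is configured" error above.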

celery not working in django and just waiting (pending)

I'm trying to find out how Celery works. I have a project with about 10 apps, and now I want to use Celery.
setting.py:
CELERY_BROKER_URL = 'amqp://rabbitmq:rabbitmq@localhost:5672/rabbitmq_vhost'
CELERY_RESULT_BACKEND = 'redis://localhost'
I created a user in RabbitMQ with username rabbitmq and password rabbitmq, then created a vhost named rabbitmq_vhost and gave the rabbitmq user permissions on it. I think that part is fine, because all the RabbitMQ errors disappeared.
here is my test.py:
from .task import when_task_expiration

def test_celery():
    result = when_task_expiration.apply_async((2, 2), countdown=3)
    print(result.get())
task.py:
from __future__ import absolute_import, unicode_literals
import logging
from celery import shared_task
from proj.celery import app

@app.task
def when_task_expiration(task, x):
    print(task.id, 'task done')
    return True
celery.py:
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('proj')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
Now when I call test_celery() in the Python shell, it stays pending. I tried replacing @shared_task with @app.task(bind=True), but nothing changed. I even tried .delay() instead of apply_async((2, 2), countdown=3), and again nothing happened.
I'm trying to use Celery to call a function at a specific time, as in this question I asked before. Thank you.
You most likely forgot to run at least one Celery worker process. To do so, execute the following in the shell: celery worker -A proj.celery -c 4 -l DEBUG (on Celery 5.x the argument order changed: celery -A proj worker -c 4 -l DEBUG). Here I assumed your Celery application is defined in proj/celery.py, since you have Celery('proj') in there.
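This also explains why the state is PENDING rather than an error: apply_async only publishes a message to the broker, and nothing computes a result until a worker consumes it. A toy stand-in using plain threads and a queue (no Celery involved; apply_async, state, and the worker function here are my own simplified stand-ins) shows the result only materializing once a consumer runs:

```python
import queue
import threading

broker = queue.Queue()   # stands in for RabbitMQ
results = {}             # stands in for the result backend

def apply_async(task_id, args):
    # Publishing only enqueues a message; nothing executes it yet.
    broker.put((task_id, args))
    return task_id

def state(task_id):
    return "SUCCESS" if task_id in results else "PENDING"

tid = apply_async("t1", (2, 2))
print(state(tid))  # -> PENDING: no worker has consumed the message yet

def worker():
    task_id, (a, b) = broker.get()
    results[task_id] = a + b  # the "task body" runs here

t = threading.Thread(target=worker)
t.start()
t.join()
print(state(tid), results[tid])  # -> SUCCESS 4
```

With no worker process running, the message sits in the queue forever and result.get() blocks, which matches the behavior described in the question.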

Celery doesn't see tasks

Here is my celery config:
config/celery.py:
from __future__ import absolute_import, unicode_literals
import os
import sys
from celery import Celery
from django.conf import settings

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'config.settings')

app = Celery('config',
             backend=os.getenv('REDIS_URL'),
             broker=os.getenv('CLOUDAMQP_URL'))

app.conf.update(BROKER_URL=os.getenv('CLOUDAMQP_URL', 'redis://localhost'),
                CELERY_RESULT_BACKEND=os.getenv('REDIS_URL', 'redis://localhost'))

app.config_from_object('django.conf:settings', namespace='CELERY')
sys.path.append(os.path.join(os.getcwd(), "applications"))
app.autodiscover_tasks()

TASK_SERIALIZER = 'json'
Celery can't find tasks with the following structure:

project_name/
    apps/
        users/
            tasks.py
    config/
        celery.py
All my apps are registered in INSTALLED_APPS, and I'm using app registration via apps.py files.
According to Celery's documentation, your file config/__init__.py should have something similar to this:
from __future__ import absolute_import, unicode_literals
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app
__all__ = ('celery_app',)
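One more thing worth checking in this setup (my observation, not part of the original answer): the question's celery.py appends "applications" to sys.path, but the directory shown in the tree is apps/. autodiscover_tasks() can only import a tasks.py from a package Python can actually locate, so the parent directory of the app packages must be on sys.path under its real name. A stand-alone sketch of that requirement, rebuilding the question's layout in a temp directory (package and task names are illustrative):

```python
import importlib
import os
import sys
import tempfile

# Recreate the question's layout: project_name/apps/users/tasks.py
root = tempfile.mkdtemp()
apps_dir = os.path.join(root, "apps")
users_pkg = os.path.join(apps_dir, "users")
os.makedirs(users_pkg)
open(os.path.join(users_pkg, "__init__.py"), "w").close()
with open(os.path.join(users_pkg, "tasks.py"), "w") as f:
    f.write("TASK_NAMES = ['users.tasks.do_work']\n")

# 'users.tasks' only becomes importable once the apps/ directory
# (the real one, not a misspelled path) is on sys.path:
sys.path.insert(0, apps_dir)
tasks = importlib.import_module("users.tasks")
print(tasks.TASK_NAMES)  # -> ['users.tasks.do_work']
```

If the appended path doesn't match the directory on disk, autodiscovery fails silently and the worker simply starts with no tasks registered.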

How to set max task per child in Celery with Django?

I am trying to set some settings for Celery in my Django setup, but wherever I put this:
CELERYD_MAX_TASKS_PER_CHILD=1
it always allows to start multiple tasks at the same time. I tried putting it in settings.py and proj.settings. My celery.py is as follows:
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('proj', backend='redis://', broker='redis://localhost')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
The place where it should go is settings.py:

CELERY_WORKER_CONCURRENCY = 1  # this is the one I was actually looking for
CELERY_WORKER_MAX_TASKS_PER_CHILD = 1

There is no limit by default. Note that with namespace='CELERY', the Django setting name is the upper-cased Celery setting prefixed with CELERY_, so worker_max_tasks_per_child becomes CELERY_WORKER_MAX_TASKS_PER_CHILD, not CELERY_MAX_TASKS_PER_CHILD.
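The naming is confusing because Celery 4 renamed the old CELERYD_* settings to lowercase names, and the Django namespace integration adds a prefix on top of that. A small lookup sketch (the rename entries follow the Celery 4 renaming table, worth verifying against your installed version; django_name is my own helper, not a Celery API):

```python
# Old (pre-4.0) name -> new lowercase setting name
RENAMES = {
    "CELERYD_CONCURRENCY": "worker_concurrency",
    "CELERYD_MAX_TASKS_PER_CHILD": "worker_max_tasks_per_child",
    "CELERY_RESULT_BACKEND": "result_backend",
}

def django_name(new_name: str, namespace: str = "CELERY") -> str:
    # With config_from_object('django.conf:settings', namespace='CELERY'),
    # the settings.py key is NAMESPACE_ + the upper-cased setting name.
    return f"{namespace}_{new_name.upper()}"

print(django_name(RENAMES["CELERYD_MAX_TASKS_PER_CHILD"]))
# -> CELERY_WORKER_MAX_TASKS_PER_CHILD
```

So the value CELERYD_MAX_TASKS_PER_CHILD from the question is ignored twice over: it is neither the current setting name nor prefixed the way the namespace expects.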
Try (note: this API only exists on very old Celery versions):

from celery import conf
conf.CELERYD_MAX_TASKS_PER_CHILD = 1  # max_tasks_per_child
You can also pass it on the command line when starting the worker (the flag name depends on the Celery version):
--maxtasksperchild=1
or
--max-tasks-per-child=1