Flask integration with Celery

I am trying to use Celery in my Flask example application. Because I create the app instance in a factory function, I cannot use the example from the documentation (http://flask.pocoo.org/docs/0.10/patterns/celery/).
__init__.py
from celery import Celery
from flask import Flask

from config import config


def create_app():
    app = Flask(__name__)
    app.debug = True
    app.config.from_object(config)

    from .main import main as main_blueprint
    app.register_blueprint(main_blueprint)

    return app


def make_celery(app=None):
    app = app or create_app()
    celery = Celery('app', backend=app.config['CELERY_RESULT_BACKEND'],
                    broker=app.config['CELERY_BROKER_URL'])
    celery.conf.update(app.config)
    TaskBase = celery.Task

    class ContextTask(TaskBase):
        abstract = True

        def __call__(self, *args, **kwargs):
            with app.app_context():
                return TaskBase.__call__(self, *args, **kwargs)

    celery.Task = ContextTask
    return celery
tasks.py
from app import make_celery

celery = make_celery()


@celery.task
def add(a, b):
    return a + b
views.py
from flask import render_template

from app.main import main
from ..tasks import add


@main.route('/', methods=['GET', 'POST'])
def index():
    add.delay(5, 3)
    return render_template('index.html')
I am getting an error:
$ celery -A app.tasks worker
Traceback (most recent call last):
  File "...lib/python3.4/site-packages/celery/app/utils.py", line 229, in find_app
    sym = symbol_by_name(app, imp=imp)
  File "...lib/python3.4/site-packages/celery/bin/base.py", line 488, in symbol_by_name
    return symbol_by_name(name, imp=imp)
  File "...lib/python3.4/site-packages/kombu/utils/__init__.py", line 97, in symbol_by_name
    return getattr(module, cls_name) if cls_name else module
AttributeError: 'module' object has no attribute 'tasks'

The -A option should point to the Celery instance to use, not the module (see http://docs.celeryproject.org/en/latest/reference/celery.bin.celery.html#cmdoption-celery-a). In this case:
celery -A app.tasks.celery worker
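Here tasks.py creates the instance with celery = make_celery(), so the dotted path app.tasks.celery names that object. Once the worker is up, you can double-check that it picked up your task, for example with:
celery -A app.tasks.celery inspect registered
which should list app.tasks.add among the registered tasks.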

Related

Flask AssertionError: View function mapping is overwriting an existing endpoint function: home

I'm trying to code a social network with Flask on PythonAnywhere. Everything was working fine before, and without touching the imports I started to receive this error when I run routes.py:
Traceback (most recent call last):
  File "/home/OurHub/mysite/routes.py", line 13, in <module>
    def home():
  File "/usr/local/lib/python3.9/site-packages/flask/scaffold.py", line 433, in decorator
    self.add_url_rule(rule, endpoint, f, **options)
  File "/usr/local/lib/python3.9/site-packages/flask/scaffold.py", line 54, in wrapper_func
    return f(self, *args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/flask/app.py", line 1084, in add_url_rule
    raise AssertionError(
AssertionError: View function mapping is overwriting an existing endpoint function: home
I tried putting everything in a single file, and I don't have two functions with the same name. Here is the start of my routes.py code:
import os
import secrets
from PIL import Image
from flask import render_template, url_for, flash, redirect, request, abort
from __init__ import app, db, bcrypt
from forms import FormCreerCompte, FormConnecter, ModifierCompte, FormPoste
from modelsdb import Profil, Poste
from flask_login import login_user, current_user, logout_user, login_required


@app.route("/")
@app.route("/home")
def home():
    page = request.args.get('page', 1, type=int)
    posts = Poste.query.paginate(page=page, per_page=5)
    return render_template('page1.html', posts=posts)
and the code from the __init__.py file:
from flask import Flask
from flask_sqlalchemy import SQLAlchemy
from flask_bcrypt import Bcrypt
from flask_login import LoginManager

app = Flask(__name__)
app.config['SECRET_KEY'] = '6dfde280ba245'
app.config['SQLALCHEMY_DATABASE_URI'] = 'mysql+mysqlconnector://OurHub:ninjQ@OurHub.mysql.pythonanywhere-services.com/OurHub$default'

db = SQLAlchemy(app)
bcrypt = Bcrypt(app)
login_manager = LoginManager(app)
login_manager.login_view = 'connecter'
login_manager.login_message_category = 'primary'

import routes
After a lot of research, I tried deleting a piece of code that I had commented out in my HTML file, and it worked! That is probably because an HTML comment "<!-- -->" does not stop Jinja from evaluating the "{% %}" template tags inside it. The error still appears when I run routes.py, but the application works fine; I was looking for the error in the wrong place after all.
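For illustration, a minimal standalone sketch (not part of the app above) showing that Jinja still evaluates template tags inside an HTML comment, while a Jinja comment {# ... #} really hides them:
from flask import Flask, render_template_string

demo = Flask(__name__)

with demo.app_context():
    # An HTML comment is just text to Jinja, so the tags inside it are still evaluated:
    print(render_template_string("<!-- {% set x = 40 + 2 %}{{ x }} -->"))   # prints "<!-- 42 -->"
    # A Jinja comment is stripped before parsing, so even broken tag syntax inside it is ignored:
    print(render_template_string("{# {% this is not valid Jinja %} #}"))    # prints an empty string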

How do I test that my Celery worker actually works in Django

(code at bottom)
Context: I'm working on a Django project where I need to give the user feedback on a task that takes 15-45 seconds. In comes Celery to the rescue! I can see that Celery performs as expected when I run celery -A my_project worker -l info & python manage.py runserver.
Problem: I can't figure out how to run a Celery worker in my tests. When I run python manage.py test, I get the following error:
Traceback (most recent call last):
  File "/Users/pbrockman/coding/t1v/lib/python3.8/site-packages/django/test/utils.py", line 387, in inner
    return func(*args, **kwargs)
  File "/Users/pbrockman/coding/tcommerce/tcommerce/tests.py", line 58, in test_shared_celery_task
    self.assertEqual(result.get(), 6)
  File "/Users/pbrockman/coding/t1v/lib/python3.8/site-packages/celery/result.py", line 224, in get
    return self.backend.wait_for_pending(
  File "/Users/pbrockman/coding/t1v/lib/python3.8/site-packages/celery/backends/base.py", line 756, in wait_for_pending
    meta = self.wait_for(
  File "/Users/pbrockman/coding/t1v/lib/python3.8/site-packages/celery/backends/base.py", line 1087, in _is_disabled
    raise NotImplementedError(E_NO_BACKEND.strip())
NotImplementedError: No result backend is configured.
Please see the documentation for more information.
Attempted solution:
I tried various combinations of @override_settings with CELERY_TASK_ALWAYS_EAGER=True, CELERY_TASK_EAGER_PROPOGATES=True, and BROKER_BACKEND='memory'.
I tried both the @app.task decorator and the @shared_task decorator (a sketch of the eager-mode approach is shown below).
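A minimal sketch of that eager-mode attempt, assuming the Celery app actually reads Django settings (for example via app.config_from_object('django.conf:settings', namespace='CELERY')); if it does not, @override_settings has nothing to change:
from django.test import TestCase, override_settings

from tcommerce.tasks import shared_add


@override_settings(CELERY_TASK_ALWAYS_EAGER=True,
                   CELERY_TASK_EAGER_PROPAGATES=True)
class EagerCeleryTests(TestCase):

    def test_shared_add_runs_eagerly(self):
        # With task_always_eager, .delay() runs the task synchronously and returns
        # an EagerResult, so .get() works without any result backend configured.
        result = shared_add.delay(2, 4)
        self.assertEqual(result.get(), 6)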
How do I check that Celery behaves as expected in my tests?
Code
Celery Settings: my_project/celery.py
import os

from dotenv import load_dotenv

load_dotenv()

from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'my_project.settings')

app = Celery(f'my_project-{os.environ.get("ENVIRONMENT")}',
             broker=os.environ.get('REDISCLOUD_URL'),
             include=['my_project.tasks'])

from django.conf import settings

app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

if __name__ == '__main__':
    app.start()
Testing: my_project/tests.py
from django.test import TestCase

from tcommerce.celery import app
from tcommerce.tasks import shared_add
from tcommerce.tasks import app_add


class CeleryTests(TestCase):

    def test_shared_celery_task(self):
        '@shared_task'
        result = shared_add.delay(2, 4)
        self.assertEqual(result.get(), 6)

    def test_app_celery_task(self):
        '@app.task'
        result = app_add.delay(2, 4)
        self.assertEqual(result.get(), 6)
Defining tasks: my_project/tasks.py
from .celery import app
from celery import shared_task


@shared_task
def shared_add(x, y):
    return x + y


@app.task
def app_add(x, y):
    return x + y

Celery beat using Flask task issue

I would like to run cron jobs from Flask using Celery, but I have an issue with the Celery beat schedule: it seems that my task is never loaded, and I don't know how to check where the issue is.
This is where I define my Flask app, in views.py:
from celery.schedules import crontab
from flask import Flask

app = Flask(__name__)
app.config.update(
    CELERY_BROKER_URL='redis://localhost:6379',
    CELERY_RESULT_BACKEND='redis://localhost:6379',
    CELERY_BEAT_SCHEDULE={
        'task-number-one': {
            'task': 'app.tasks.test',
            'schedule': crontab(minute="*"),
        }
    },
    CELERY_IMPORTS=('app.tasks'),
    CELERY_TASK_RESULT_EXPIRES=30,
    CELERY_TIMEZONE='UTC',
)
and this is where my Celery object is created, in tasks.py:
from celery import Celery
from celery import shared_task

from app.views import app


def make_celery(app):
    celery = Celery(app.import_name, backend=app.config['CELERY_RESULT_BACKEND'],
                    broker=app.config['CELERY_BROKER_URL'])
    celery.conf.update(app.config)
    TaskBase = celery.Task

    class ContextTask(TaskBase):
        abstract = True

        def __call__(self, *args, **kwargs):
            with app.app_context():
                return TaskBase.__call__(self, *args, **kwargs)

    celery.Task = ContextTask
    return celery


celery_app = make_celery(app)


@celery_app.task()
def test():
    logger = test.get_logger()
    logger.info("Hello")
views.py and tasks.py are in the same directory, which is called app.
This is what I get when launching the celery worker (everything seems normal here).
But when launching celery beat, it seems that my task is never sent by my schedule, and I don't know where to check.
Can you help me with this?
Best
I believe Celery beat tasks need to be configured at least after the @app.on_after_configure.connect signal is sent. You should be able to do the following in your tasks.py file:
celery_app.conf.CELERYBEAT_SCHEDULE = {
    "test-every-minute": {
        "task": "tasks.test",
        "schedule": crontab(minute="*"),
    },
}
Alternatively, you can use this decorator syntax if your task is defined in the same file as your Celery application instance:
@celery_app.on_after_configure.connect
def setup_periodic_tasks(sender, **kwargs):
    sender.add_periodic_task(5, test_two.s(), name='test-every-5')
If your tasks are defined in a separate module, you can use the @app.on_after_finalize.connect decorator after importing your tasks:
@app.on_after_finalize.connect
def setup_periodic_tasks(sender, **kwargs):
    from app.tasks import test
    sender.add_periodic_task(10.0, test.s(), name='test-every-10')
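Whichever variant you use, the schedule entries are only dispatched while a beat process is running. Assuming the layout above (celery_app defined in tasks.py inside the app package), something like this starts a worker with an embedded beat scheduler:
celery -A app.tasks.celery_app worker --beat -l info
or run celery -A app.tasks.celery_app beat -l info alongside a plain worker.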
Celery Beat Entry Docs

Flask to autobahn and twisted: RuntimeError: working outside of request context

I would like to send session data from the Flask part to the WebSocket part of my app. A simplified example is:
import uuid

from flask import Flask, render_template, session
from autobahn.twisted.websocket import WebSocketServerProtocol


class EchoServerProtocol(WebSocketServerProtocol):
    def __init__(self):
        self.user = session.get('user')

    def onMessage(self, payload, isBinary):
        _user = self.user
        self.sendMessage(payload, isBinary, _user)


app = Flask(__name__)
app.secret_key = str(uuid.uuid4())


@app.route('/')
def page_home():
    return render_template('index.html')
(adapted from here).
I received the error:
  File "app.py", line 171, in __init__
    if session.get('user'):
  File "/usr/local/lib/python2.7/dist-packages/werkzeug/local.py", line 338, in __getattr__
    return getattr(self._get_current_object(), name)
  File "/usr/local/lib/python2.7/dist-packages/werkzeug/local.py", line 297, in _get_current_object
    return self.__local()
  File "/usr/local/lib/python2.7/dist-packages/flask/globals.py", line 20, in _lookup_req_object
    raise RuntimeError('working outside of request context')
exceptions.RuntimeError: working outside of request context
The temporary solution I found was to save the user name as a pickle object on the Flask side and to load the pickle object on the WebSocket side, like:
from flask import session
import pickle


class EchoServerProtocol(WebSocketServerProtocol):
    def __init__(self):
        self.user = pickle.load(open('../static/uploads/userfile.p', 'rb'))

    def onMessage(self, payload, isBinary):
        _user = self.user
        self.sendMessage(payload, isBinary, _user)


app = Flask(__name__)
app.secret_key = str(uuid.uuid4())


@app.route('/')
def page_home():
    _user = session.get('user')
    pickle.dump(_user, open('../static/uploads/userfile.p', 'wb'))
    return render_template('index.html')

Raising Error: NotRegistered when I use Flask with Celery

Description
Hi, I'm learning Celery, and I read a blog post: Celery and the Flask Application Factory Pattern - miguelgrinberg.com.
So I wrote a small program to run Flask with Celery.
Code
app.__init__.py
from flask import Flask
from celery import Celery

celery = Celery(__name__, broker='amqp://127.0.0.1:5672/')


def create_app():
    app = Flask(__name__)

    @celery.task
    def add(x, y):
        print x + y

    @app.route('/')
    def index():
        add.delay(1, 3)
        return 'Hello World!'

    return app
manage.py
from app import create_app

app = create_app()

if __name__ == '__main__':
    app.run()
celery_worker_1.py
from app import celery, create_app

f_app = create_app()
f_app.app_context().push()
celery_worker_2.py
from app import celery, create_app


@celery.task
def foo():
    print 'Balabala...'


f_app = create_app()
f_app.app_context().push()
Problem
When I run the Flask server and Celery using:
celery -A celery_worker_1 worker -l info
Celery raises a NotRegistered error:
Traceback (most recent call last):
  File "D:\Python27\lib\site-packages\billiard\pool.py", line 363, in workloop
    result = (True, prepare_result(fun(*args, **kwargs)))
  File "D:\Python27\lib\site-packages\celery\app\trace.py", line 349, in _fast_trace_task
    return _tasks[task].__trace__(uuid, args, kwargs, request)[0]
  File "D:\Python27\lib\site-packages\celery\app\registry.py", line 26, in __missing__
    raise self.NotRegistered(key)
NotRegistered: 'app.add'
But when using celery_worker_2 instead:
celery -A celery_worker_2 worker -l info
the task runs correctly:
[2015-11-28 15:45:56,299: INFO/MainProcess] Received task: app.add[cbe5e1d6-c5df-4141-9db1-e6313517c202]
[2015-11-28 15:45:56,302: WARNING/Worker-1] 4
[2015-11-28 15:45:56,371: INFO/MainProcess] Task app.add[cbe5e1d6-c5df-4141-9db1-e6313517c202] succeeded in 0.0699999332428s: None
Why doesn't Celery run correctly with the code in celery_worker_1?
PS: I'm not good at English; if anything is unclear, please point it out and I'll describe it again. Thanks!