Djcelery not working with xhtml2pdf - Mingle: all alone - django

I'm trying to import xhtml2pdf into my tasks.py file. Whenever I import it, no tasks get done; as soon as I remove the import line, it works again. There is no error, it just does nothing.
tasks.py:

from celery import task
import xhtml2pdf.pisa as pisa

@task
def some_task():
    return "HELLO"
python manage.py celeryd -E -B -l DEBUG
It just stays like this:
. recurring_invoice.tasks.send_email
. recurring_invoice.tasks.send_invoice
[2015-06-10 20:54:14,549: INFO/Beat] beat: Starting...
[2015-06-10 20:54:14,555: INFO/Beat] Writing entries...
[2015-06-10 20:54:14,588: INFO/MainProcess] Connected to amqp://guest:**@127.0.0.1:5672//
[2015-06-10 20:54:14,613: INFO/MainProcess] mingle: searching for neighbors
[2015-06-10 20:54:15,630: INFO/MainProcess] mingle: all alone
/Users/hermanstander/apps/afriapps/stem/env/lib/python2.7/site-packages/djcelery/loaders.py:136: UserWarning: Using settings.DEBUG leads to a memory leak, never use this setting in production environments!
warn('Using settings.DEBUG leads to a memory leak, never '
[2015-06-10 20:34:13,870: WARNING/MainProcess] /Users/hermanstander/apps/afriapps/stem/env/lib/python2.7/site-packages/djcelery/loaders.py:136: UserWarning: Using settings.DEBUG leads to a memory leak, never use this setting in production environments!
warn('Using settings.DEBUG leads to a memory leak, never '
[2015-06-10 20:34:13,870: WARNING/MainProcess] celery@Hermans-MacBook-Air.local ready.
[2015-06-10 20:34:13,870: DEBUG/MainProcess] | Worker: Hub.register Pool...
[2015-06-10 20:34:13,871: DEBUG/MainProcess] basic.qos: prefetch_count->16
I'm using Django 1.8 and Python 2.7.
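One experiment worth trying (an assumption on my part, not a confirmed fix) is to defer the xhtml2pdf import into the task body, so that whatever pisa does at import time happens inside the worker process instead of at module load:

from celery import task

@task
def some_task():
    # Lazy import: if importing xhtml2pdf.pisa at module level is what hangs
    # the worker, moving it here narrows the problem down to import time.
    import xhtml2pdf.pisa as pisa
    return "HELLO"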

Related

Run periodic task defined from the view | Flask & Celery

I'm trying to run a simple periodic task every 10 seconds using flask and celery with the following code in my controllers.py:
@celery.task()
def print_hello(word):
    print(f"Hello {word} !")

@celery.on_after_configure.connect
def add_periodic(word, **kwargs):
    celery.add_periodic_task(10.0, print_hello.s(word), name='add every 10')
add_periodic is called from my view.py this way:

from .controllers import add_periodic

@api.route("scheduler")
class DataSchedule(Resource):
    def get(self):
        word = request.args.get("word")
        return add_periodic(word)
My celeryconfig.py is the following:
CELERY_IMPORTS = ('api.controllers')
CELERY_TASK_RESULT_EXPIRES = 30
CELERY_TIMEZONE = 'UTC'
Redis is running OK.
Celery beat is started with the following command, celery beat -A app.celery:
celery beat v4.4.7 (cliffs) is starting.
The Celery worker starts fine using the command celery worker -A app.celery --loglevel=info --pool=solo:
[2022-05-13 14:30:49,848: INFO/MainProcess] Connected to redis://localhost:6379/1
[2022-05-13 14:30:49,858: INFO/MainProcess] mingle: searching for neighbors
[2022-05-13 14:30:50,906: INFO/MainProcess] mingle: all alone
[2022-05-13 14:30:50,952: INFO/MainProcess] celery@W10P-WORK ready.
But there is no output for the scheduling. Any help please? How can I fix this, and what is wrong?
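For reference, the pattern in the Celery docs registers periodic tasks when the app is configured, not from a request handler; the connected function receives the app as sender and calls add_periodic_task on it. A minimal sketch (the broker URL and the "world" argument are placeholders):

from celery import Celery

celery = Celery('api', broker='redis://localhost:6379/1')

@celery.task()
def print_hello(word):
    print(f"Hello {word} !")

@celery.on_after_configure.connect
def setup_periodic_tasks(sender, **kwargs):
    # sender is the configured Celery app; the schedule is registered once
    # at start-up rather than per HTTP request.
    sender.add_periodic_task(10.0, print_hello.s("world"), name='add every 10')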

Function call in another file not working in Django rest framework

I am using Django signals to trigger a task (sending mass emails to subscribers using the Celery package) when a blog post is created from the Django admin. The signal is triggered, but the task function in the tasks file is not called. I know this because a print call I put inside the task function never prints.
My signals.py file:
from apps.blogs.celery_files.tasks import send_mails
from apps.blogs.models import BlogPost, Subscribers
from django.db.models.signals import post_save
from django.dispatch import receiver

def email_task(sender, instance, created, **kwargs):
    if created:
        print("#signals.py")
        send_mails.delay(5)

post_save.connect(email_task, sender=BlogPost, dispatch_uid="email_task")
My tasks.py file:
from __future__ import absolute_import, unicode_literals
from celery import shared_task
# from celery.decorators import task
from apps.blogs.models import BlogPost, Subscribers
from django.core.mail import send_mail
from travel_crm.settings import EMAIL_HOST_USER
from time import sleep

@shared_task
def send_mails(duration, *args, **kwargs):
    print("#send_mails.py")
    subscribers = Subscribers.objects.all()
    blog = BlogPost.objects.latest('date_created')
    for abc in subscribers:
        sleep(duration)
        print("i am inside loop")
        emailad = abc.email
        send_mail('New Blog Post ', f" Checkout our new blog with title {blog.title} ",
                  EMAIL_HOST_USER, [emailad],
                  fail_silently=False)
Here, print("#send_mails.py") is not executed, but print("#signals.py") in the signals.py file is executed. Hence, the signal is received after the BlogPost model object is created, but the send_mails function inside tasks.py is not executed.
I have installed both Celery and the Redis server, and both are working fine.
The main thing is that if I remove .delay(5) from the signals file and instead just call send_mails() inside email_task, it works perfectly and I get the emails. But as soon as I use .delay(), the function inside the tasks file is not called. What is the issue?
My traceback when I run the worker:
-------------- celery@DESKTOP-AQPSFR9 v5.1.2 (sun-harmonics)
--- ***** -----
-- ******* ---- Windows-10-10.0.18362-SP0 2021-07-18 11:06:10
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app: travel_crm:0x15c262afcd0
- ** ---------- .> transport: redis://localhost:6379//
- ** ---------- .> results: redis://localhost:6379/
- *** --- * --- .> concurrency: 4 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
-------------- [queues]
.> celery exchange=celery(direct) key=celery
[tasks]
. apps.blogs.celery_files.celery.debug_task
. apps.blogs.celery_files.tasks.send_mails
. travel_crm.celery.debug_task
[2021-07-18 11:06:11,465: INFO/SpawnPoolWorker-1] child process 9276 calling self.run()
[2021-07-18 11:06:11,475: INFO/SpawnPoolWorker-2] child process 8792 calling self.run()
[2021-07-18 11:06:11,496: INFO/SpawnPoolWorker-4] child process 1108 calling self.run()
[2021-07-18 11:06:11,506: INFO/SpawnPoolWorker-3] child process 7804 calling self.run()
[2021-07-18 11:06:13,145: INFO/MainProcess] Connected to redis://localhost:6379//
[2021-07-18 11:06:17,206: INFO/MainProcess] mingle: searching for neighbors
[2021-07-18 11:06:24,287: INFO/MainProcess] mingle: all alone
[2021-07-18 11:06:32,396: WARNING/MainProcess] c:\users\user\desktop\travelcrm\myvenv\lib\site-packages\celery\fixups\django.py:203: UserWarning: Using settings.DEBUG leads to a memory
leak, never use this setting in production environments!
warnings.warn('''Using settings.DEBUG leads to a memory
[2021-07-18 11:06:32,396: INFO/MainProcess] celery@DESKTOP-AQPSFR9 ready.
[2021-07-18 11:06:32,596: INFO/MainProcess] Task apps.blogs.celery_files.tasks.send_mails[6bbac0ae-8146-4fb0-b64b-a07755123e1d] received
[2021-07-18 11:06:32,612: INFO/MainProcess] Task apps.blogs.celery_files.tasks.send_mails[25d3b32a-f223-4ae4-812b-fa1cfaedaddd] received
[2021-07-18 11:06:34,633: ERROR/MainProcess] Task handler raised error: ValueError('not enough values to unpack (expected 3, got 0)')
Traceback (most recent call last):
File "c:\users\user\desktop\travelcrm\myvenv\lib\site-packages\billiard\pool.py", line 362, in workloop
result = (True, prepare_result(fun(*args, **kwargs)))
File "c:\users\user\desktop\travelcrm\myvenv\lib\site-packages\celery\app\trace.py", line 635, in fast_trace_task
tasks, accept, hostname = _loc
ValueError: not enough values to unpack (expected 3, got 0)
[2021-07-18 11:06:34,633: ERROR/MainProcess] Task handler raised error: ValueError('not enough values to unpack (expected 3, got 0)')
Traceback (most recent call last):
File "c:\users\user\desktop\travelcrm\myvenv\lib\site-packages\billiard\pool.py", line 362, in workloop
result = (True, prepare_result(fun(*args, **kwargs)))
File "c:\users\user\desktop\travelcrm\myvenv\lib\site-packages\celery\app\trace.py", line 635, in fast_trace_task
tasks, accept, hostname = _loc
ValueError: not enough values to unpack (expected 3, got 0)
The stack trace helps identify that the issue has to do with how the Celery task is called.
ValueError: not enough values to unpack (expected 3, got 0)
Which is this part of the code:
send_mails.delay(5)
Try calling the function using apply_async instead.
send_mails.apply_async(args=(5, ))
If that doesn't work, remove *args and **kwargs from def send_mails(duration):. I don't see why those parameters are necessary.
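Putting both suggestions together, a minimal sketch (same task body as before, just with the unused parameters dropped and the call switched to apply_async):

@shared_task
def send_mails(duration):
    # ... same body as above ...
    ...

# in signals.py, instead of send_mails.delay(5):
send_mails.apply_async(args=(5,))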
More information can be found in this answer: https://stackoverflow.com/a/48910727/7838574
Or in the Docs here: https://docs.celeryproject.org/en/latest/userguide/calling.html#basics

celery task not processed by celery

Total Celery and Django noob here, so sorry if the problem is trivial. Basically, any function decorated with @app.task is not being processed by Celery; it just runs normally as if Celery isn't there.
My celery_app.py file is -
from __future__ import absolute_import
import os
from celery import Celery
from django.conf import settings

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project.settings')

app = Celery(broker=settings.CELERY_BROKER_URL)
app.config_from_object('django.conf:settings')
app.autodiscover_tasks()

if __name__ == '__main__':
    app.start()
While my tasks.py file is -
from project.celery_app import app

@app.task
def mytask():
    ...
I get the following output on running celery in the terminal -
-------------- celery#LAPTOP v4.1.0 (latentcall)
---- **** -----
--- * *** * -- Windows-10-10.0.16299-SP0 2017-12-20 19:27:24
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app: __main__:0x229ce2884e0
- ** ---------- .> transport: amqp://user:**@localhost:5672/myvhost
- ** ---------- .> results: disabled://
- *** --- * --- .> concurrency: 8 (solo)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
-------------- [queues]
.> celery exchange=celery(direct) key=celery
[tasks]
. account.tasks.mytask
[2017-12-20 19:27:24,085: INFO/MainProcess] Connected to amqp://user:**@127.0.0.1:5672/myvhost
[2017-12-20 19:27:24,101: INFO/MainProcess] mingle: searching for neighbors
[2017-12-20 19:27:25,126: INFO/MainProcess] mingle: all alone
[2017-12-20 19:27:25,141: WARNING/MainProcess] c:\programdata\anaconda2\envs\myenv\lib\site-packages\celery\fixups\django.py:202: UserWarning: Using settings.DEBUG leads to a memory leak, never use this setting in production environments!
warnings.warn('Using settings.DEBUG leads to a memory leak, never '
[2017-12-20 19:27:25,141: INFO/MainProcess] celery@LAPTOP- ready.
So my task is known to Celery, but it doesn't do anything about it. The task runs on a button click, and with --loglevel=debug it is clear that Celery never receives it. I am using RabbitMQ as the broker, Celery 4.1.0, Python 3, and Django 1.10.5. Any help would be greatly appreciated!
As I had thought, a simple mistake. Just needed to change mytask() to mytask.delay() and celery started receiving it.
.delay() is actually a shortcut method. If you want to provide additional options you have to use .apply_async().
official doc can be found here:
http://docs.celeryproject.org/en/latest/userguide/calling.html
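For example, both of these enqueue mytask, but only apply_async accepts execution options (the option values below are purely illustrative):

mytask.delay()                             # shortcut, no extra options
mytask.apply_async(countdown=10)           # execute no earlier than 10 seconds from now
mytask.apply_async(queue='priority.high')  # route to a specific queue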

setting up django & celery: location for celery path

I'm setting up celery based on an example and at this point...
$ export PYTHONPATH=/webapps/hello_django/hello:$PYTHONPATH
$ /webapps/hello_django/bin/celery --app=hello.celery:app worker --loglevel=INFO
which on my end I set as:
samuel@samuel-pc:~/Documents/code/revamp$ export PYTHONPATH=/home/samuel/Documents/code/revamp/gallery:$PYTHONPATH
samuel@samuel-pc:~/Documents/code/revamp$ /home/samuel/Documents/code/revamp/revamp/celery --app=revamp.celery:app worker --loglevel=INFO
bash: /home/samuel/Documents/code/revamp/revamp/celery: No such file or directory
I'm not sure what it did to the path; this is what the result should be:
-------------- celery@django v3.1.11 (Cipater)
---- **** -----
--- * *** * -- Linux-3.2.0-4-amd64-x86_64-with-debian-7.5
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app: hello_django:0x15ae410
- ** ---------- .> transport: redis://localhost:6379/0
- ** ---------- .> results: disabled
- *** --- * --- .> concurrency: 2 (prefork)
-- ******* ----
--- ***** ----- [queues]
-------------- .> celery exchange=celery(direct) key=celery
[tasks]
. testapp.tasks.test
[2014-05-20 13:53:59,740: INFO/MainProcess] Connected to redis://localhost:6379/0
[2014-05-20 13:53:59,748: INFO/MainProcess] mingle: searching for neighbors
[2014-05-20 13:54:00,756: INFO/MainProcess] mingle: all alone
[2014-05-20 13:54:00,769: WARNING/MainProcess] celery@django ready.
My guess is that I need to set the path to the Celery installation; if so, can anyone tell me what that path is?
I had lots of headaches with the Celery tutorials out there. Try this:
First, you need a virtual environment for your project, so you don't need to set up paths.
bash:
sudo pip3 install virtualenv
virtualenv env
source env/bin/activate
Then you need a django project and an app.
bash:
pip install django
django-admin startproject myproject
cd myproject
python manage.py startapp myapp
Then you should pip install celery
Next, make a tasks.py at the same level as your views.py in the myapp directory:
tasks.py

from celery import Celery
from celery.decorators import task

app = Celery('tasks', broker='pyamqp://guest@localhost//')

@task(bind=True, name="my_task")
def my_task(self):
    print('hello')
    return 1+1
Install your broker (rabbitmq)
bash:
sudo apt-get install rabbitmq-server
sudo service rabbitmq-server restart
Go to your app directory, the one that has tasks.py, and run celery -A tasks worker --loglevel=info. This only works from the directory where tasks.py is defined. Then you should have your worker up and running. When you print or return something from your task, it should appear here.
Finally, you should use your task. Set up a view (set up a URL, make a template, etc.) and call your task from the view:
views.py

from django.shortcuts import render
from .tasks import my_task

def index(request):
    my_task.delay()
    return render(request, 'index.html')
The magic is that the delay call is asynchronous and non-blocking.
From this minimalistic example I hope you can better understand the paths your tutorials talk about and the more complicated things they make you do, like placing Celery settings in settings.py, calling the worker from other directories, and putting things on the path... This was a real pain when I was trying to learn this from the official docs.
Good luck!

Tasks not executing (Django + Heroku + Celery + RabbitMQ)

I'm using RabbitMQ for the first time and I must be misunderstanding some simple configuration settings. Note that I am encountering this issue while running the app locally right now; I have not yet attempted to launch to production via Heroku.
For this app, every 20 seconds I want to look for some unsent messages in the database, and send them via Twilio. Apologies in advance if I've left some relevant code out of my examples below. I've followed all of the Celery setup/config instructions. Here is my current setup:
BROKER_URL = 'amqp://VflhnMEP:8wGLOrNBP.........Bhshs'  # Truncated URL string

from datetime import timedelta

CELERYBEAT_SCHEDULE = {
    'send_queued_messages_every_20_seconds': {
        'task': 'comm.tasks.send_queued_messages',
        'schedule': timedelta(seconds=20),
        # 'schedule': crontab(seconds='*/20')
    },
}

CELERY_TIMEZONE = 'UTC'
I am pretty sure that the tasks are piling up in RabbitMQ; the dashboard shows all of the accumulated messages.
The function 'send_queued_messages' should be called every 20 seconds.
comm/tasks.py
import datetime
from celery.decorators import periodic_task
from comm.utils import get_user_mobile_number
from comm.api import get_twilio_connection, send_message
from dispatch.models import Message

@periodic_task
def send_queued_messages(run_every=datetime.timedelta(seconds=20)):
    unsent_messages = Message.objects.filter(sent_success=False)
    connection = get_twilio_connection()
    for message in unsent_messages:
        mobile_number = get_user_mobile_number(message=message)
        try:
            send_message(
                connection=connection,
                mobile_number=mobile_number,
                message=message.raw_text
            )
            message.sent_success = True
            message.save()
        except BaseException as e:
            raise e
        pass
I'm pretty sure that I have something misconfigured with RabbitMQ or in my Heroku project settings, but I'm not sure how to continue troubleshooting. When I run 'celery -A myproject beat' everything appears to be running smoothly.
(venv)josephs-mbp:myproject josephfusaro$ celery -A myproject beat
celery beat v3.1.18 (Cipater) is starting.
__ - ... __ - _
Configuration ->
. broker -> amqp://VflhnMEP:**@happ...Bhshs
. loader -> celery.loaders.app.AppLoader
. scheduler -> celery.beat.PersistentScheduler
. db -> celerybeat-schedule
. logfile -> [stderr]@%INFO
. maxinterval -> now (0s)
[2015-05-27 03:01:53,810: INFO/MainProcess] beat: Starting...
[2015-05-27 03:02:13,941: INFO/MainProcess] Scheduler: Sending due task send_queued_messages_every_20_seconds (comm.tasks.send_queued_messages)
[2015-05-27 03:02:34,036: INFO/MainProcess] Scheduler: Sending due task send_queued_messages_every_20_seconds (comm.tasks.send_queued_messages)
So why aren't the tasks executing as they do without Celery being involved*?
My Procfile:
web: gunicorn myproject.wsgi --log-file -
worker: celery -A myproject beat
*I have confirmed that my code executes as expected without Celery being involved!
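For context, beat only schedules tasks; a separate worker process has to consume and execute them. A Procfile that starts both (process names here are illustrative) could look like this:

web: gunicorn myproject.wsgi --log-file -
beat: celery -A myproject beat
worker: celery -A myproject worker --loglevel=info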
Special thanks to @MauroRocco for pushing me in the right direction on this. The pieces that I was missing were best explained in this tutorial: https://www.rabbitmq.com/tutorials/tutorial-one-python.html
Note: I needed to modify some of the code in the tutorial to use URLParameters, passing in the resource URL defined in my settings file.
The key change in both send.py and receive.py is this line:
connection = pika.BlockingConnection(pika.URLParameters(BROKER_URL))
and of course we need to import the BROKER_URL variable from settings.py
from settings import BROKER_URL
settings.py
BROKER_URL = 'amqp://VflhnMEP:8wGLOrNBP...4.bigwig.lshift.net:10791/sdklsfssd'
send.py

import pika
from settings import BROKER_URL

connection = pika.BlockingConnection(pika.URLParameters(BROKER_URL))
channel = connection.channel()

channel.queue_declare(queue='hello')

channel.basic_publish(exchange='',
                      routing_key='hello',
                      body='Hello World!')
print " [x] Sent 'Hello World!'"
connection.close()
receive.py

import pika
from settings import BROKER_URL

connection = pika.BlockingConnection(pika.URLParameters(BROKER_URL))
channel = connection.channel()

channel.queue_declare(queue='hello')

print ' [*] Waiting for messages. To exit press CTRL+C'

def callback(ch, method, properties, body):
    print " [x] Received %r" % (body,)

channel.basic_consume(callback,
                      queue='hello',
                      no_ack=True)

channel.start_consuming()
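To sanity-check the broker connection with these scripts, they can be run in two terminals (assuming both files sit next to settings.py):

python receive.py   # terminal 1: waits and prints incoming messages
python send.py      # terminal 2: publishes "Hello World!" to the 'hello' queue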