RabbitMQ listener using pika in Django

I have a Django application and I want to consume messages from RabbitMQ. I want the listener to start consuming when I start the Django server. I am using the pika library to connect to RabbitMQ. Providing some code example will really help.

First, you need to run your consumer when the Django project starts. AppConfig.ready() is the place for that:
https://docs.djangoproject.com/en/2.0/ref/applications/#django.apps.AppConfig.ready
def ready(self):
    if not settings.IS_ACCEPTANCE_TESTING and not settings.IS_UNITTESTING:
        consumer = AMQPConsuming()
        consumer.daemon = True
        consumer.start()
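For context, the ready() hook above lives on an AppConfig subclass; a minimal sketch of the full wiring might look like this (the app name, module path, and the two settings flags are illustrative assumptions, not from the original answer):

# my_app/apps.py -- a sketch; class, module, and setting names are assumptions
from django.apps import AppConfig
from django.conf import settings


class MyAppConfig(AppConfig):
    name = 'my_app'

    def ready(self):
        # skip the consumer thread during tests (flags assumed from the answer)
        if not settings.IS_ACCEPTANCE_TESTING and not settings.IS_UNITTESTING:
            from my_app.consumers import AMQPConsuming  # the thread class below
            consumer = AMQPConsuming()
            consumer.daemon = True
            consumer.start()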
Then, in any convenient place:
import threading

import pika
from django.conf import settings


class AMQPConsuming(threading.Thread):
    def callback(self, ch, method, properties, body):
        # do something
        pass

    @staticmethod
    def _get_connection():
        parameters = pika.URLParameters(settings.RABBIT_URL)
        return pika.BlockingConnection(parameters)

    def run(self):
        connection = self._get_connection()
        channel = connection.channel()
        channel.queue_declare(queue='task_queue6')
        print('Hello world! :)')
        channel.basic_qos(prefetch_count=1)
        channel.basic_consume(self.callback, queue='task_queue6')
        channel.start_consuming()
This tutorial will also help:
http://www.rabbitmq.com/tutorials/tutorial-six-python.html
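To verify the consumer works, you can publish a test message with pika. A minimal producer sketch, assuming the same RABBIT_URL setting and the task_queue6 queue declared above:

# a hypothetical test producer, not part of the original answer
import pika
from django.conf import settings

parameters = pika.URLParameters(settings.RABBIT_URL)
connection = pika.BlockingConnection(parameters)
channel = connection.channel()
channel.queue_declare(queue='task_queue6')  # declaring again is idempotent
channel.basic_publish(exchange='', routing_key='task_queue6', body='Hello world!')
connection.close()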


Issue with using django-celery when Django signals are being used to send email

I have used the default Django admin panel as my backend, and I have a Blogpost model. Whenever an admin user saves a blog post in the Django admin, I need to send an email to the newsletter subscribers notifying them that there is a new blog on the website.
I have to send mass emails, so I am using django-celery. I am also using Django signals to trigger the send-email function.
Right now I am sending the emails without Celery, but it is too slow.
from django.conf import settings
from django.core.mail import send_mail
from django.db import models
from django.db.models.signals import post_save
from django.dispatch import receiver


class Subscribers(models.Model):
    email = models.EmailField(unique=True)
    date_subscribed = models.DateField(auto_now_add=True)

    def __str__(self):
        return self.email

    class Meta:
        verbose_name_plural = "Newsletter Subscribers"


# binding signal:
@receiver(post_save, sender=BlogPost)
def send_mails(sender, instance, created, **kwargs):
    subscribers = Subscribers.objects.all()
    if created:
        blog = BlogPost.objects.latest('date_created')
        for subscriber in subscribers:
            send_mail('New Blog Post', f"Checkout our new blog with title {blog.title}",
                      settings.EMAIL_HOST_USER, [subscriber.email],
                      fail_silently=False)
    else:
        return
Using the Celery documentation, I have written the following files.
My celery.py:
from __future__ import absolute_import
import os
from celery import Celery
from django.conf import settings

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'travel_crm.settings')
app = Celery('travel_crm')
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)


@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
My __init__.py file:
from __future__ import absolute_import, unicode_literals
from .celery import app as celery_app
__all__ = ('celery_app',)
My tasks file, from the docs:
from time import sleep

from celery import shared_task
from celery.utils.log import get_task_logger

logger = get_task_logger(__name__)


@shared_task
def my_first_task(duration):
    subject = 'Celery'
    message = 'My task done successfully'
    receivers = ['receiver_mail@gmail.com']
    is_task_completed = False
    error = ''
    try:
        sleep(duration)
        is_task_completed = True
    except Exception as err:
        error = str(err)
        logger.error(error)
    if is_task_completed:
        # send_mail_to is a mail helper assumed to be defined elsewhere
        send_mail_to(subject, message, receivers)
    else:
        send_mail_to(subject, error, receivers)
    return 'first_task_done'
This task doesn't work because I am using a Django signal to trigger the send-email function. How do I employ this in tasks.py?
I think I understand your question... I was recently faced with a similar challenge which included the complexity of a multi-tenant [schema] database [which proved to be an issue with Redis]. I also tried django-celery, but it depends on a much older version of Celery. In addition, I wanted to send mass mail initiated by the model signal post_save, using EmailMultiAlternatives with 'bcc' and 'reply-to' features.
So now I am using the latest [as of this post] Django and the latest Celery with Redis, running on macOS localhost with Poetry as virtual env & package manager. The following worked for me:
Celery: I spent several hours net searching for tutorials and advice; among others, this one added value for me: Celery w Django. It is good practice to dig deeper into Celery anyway if you have not done so already.
Redis: This will depend on your OS and whether you are developing locally or remotely. The Redis website will guide you through the setup. I also tried RabbitMQ but found it [personally] more complex to set up.
The code fractions: There are four fractions: myapp/signals.py, myapp/tasks.py, myproj/celery.py, and myproj/settings.py. Disclaimer: I'm a hobby programmer; more experienced engineers may well improve on my code. I've done some minor testing and all seems to work.
# myapp/signals.py
from django.contrib.auth import get_user_model
from django.db.models.signals import post_save
from django.dispatch import receiver

from myapp.tasks import task_send_mail


@receiver(post_save, sender=MyModel)
def post_save_handler(sender, instance, **kwargs):
    if instance.some_field == True:
        recipient_list = list(get_user_model().objects.filter('some filters'))
        from_email = SomeModel.objects.first().site_email
        to_email = SomeModel.objects.first().admin_email
        # call the Celery task asynchronously
        task_send_mail.delay(instance.some_field, instance.someother_field, from_email, to_email, recipient_list)
# myapp/tasks.py
from smtplib import SMTPException

from celery import shared_task
from django.core.mail import EmailMultiAlternatives, get_connection
from django.template.loader import render_to_string
from django.utils.html import strip_tags


@shared_task(name='task_sendmail')
def task_send_mail(some_field, someother_field, from_email, to_email, recipient_list):
    template_name = 'emails/sometemplate.html'
    html_message = render_to_string(template_name, {'body': some_field})  # these variables are passed to the email template
    plain_message = strip_tags(html_message)
    subject = f'Do Not Reply : {someother_field}'
    connection = get_connection()
    connection.open()
    message = EmailMultiAlternatives(
        subject,
        plain_message,
        from_email,
        [to_email],
        bcc=recipient_list,
        reply_to=[to_email],
    )
    message.attach_alternative(html_message, "text/html")
    try:
        message.send()
        connection.close()
    except SMTPException as e:
        print('There was an error sending email: ', e)
        connection.close()
# myproj/celery.py
import os
from celery import Celery

# Set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproj.settings')

app = Celery('myproj')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django apps.
app.autodiscover_tasks()


@app.task(bind=True)
def debug_task(self):
    print(f'Request: {self.request!r}')
# myproj/settings.py
...
# Celery configuration options
CELERY_BROKER_URL = 'redis://localhost:6379/0'
CELERY_TIMEZONE = "Africa/SomeCity"
CELERY_TASK_TRACK_STARTED = True
CELERY_TASK_TIME_LIMIT = 30 * 60
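Tying this back to the original question: below is a minimal sketch of moving the send_mails loop into a Celery task, assuming the Subscribers and BlogPost models from the question and a worker started with something like celery -A travel_crm worker -l info. It is an illustration, not the answer author's code.

# myapp/tasks.py -- a sketch; task and module names are assumptions
from celery import shared_task
from django.conf import settings
from django.core.mail import send_mail
from django.db.models.signals import post_save
from django.dispatch import receiver

from .models import BlogPost, Subscribers


@shared_task
def send_new_post_mails(blog_id):
    # the slow loop now runs on the worker, not in the request/admin save
    blog = BlogPost.objects.get(pk=blog_id)
    for subscriber in Subscribers.objects.all():
        send_mail('New Blog Post',
                  f"Checkout our new blog with title {blog.title}",
                  settings.EMAIL_HOST_USER, [subscriber.email],
                  fail_silently=False)


# the signal handler only queues the task and returns immediately
@receiver(post_save, sender=BlogPost)
def send_mails(sender, instance, created, **kwargs):
    if created:
        send_new_post_mails.delay(instance.pk)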

Tornado with pika consumer can't run

I want to build a monitoring system using RabbitMQ and Tornado. I can run the producer, and my consumer can consume the data on the queue, but the data can't be shown on the website.
This is just my experiment before I use the sensor.
import pika
import tornado.ioloop
import tornado.web
import tornado.websocket
import logging
from threading import Thread

logging.basicConfig(lvl=logging.INFO)

clients = []

credentials = pika.credentials.PlainCredentials('ayub', 'ayub')
connection = pika.BlockingConnection(pika.ConnectionParameters('192.168.43.101',
                                                               5672,
                                                               '/',
                                                               credentials))
channel = connection.channel()

def threaded_rmq():
    channel.basic_consume('Queue',
                          on_message_callback=consumer_callback,
                          auto_ack=True,
                          exclusive=False,
                          consumer_tag=None,
                          arguments=None)
    channel.start_consuming()

def disconect_rmq():
    channel.stop_consuming()
    Connection.close()
    logging.info('Disconnected from broker')

def consumer_callback(ch, method, properties, body):
    for itm in clients:
        itm.write_message(body)

class SocketHandler(tornado.websocket.WebSocketHandler):
    def open(self):
        logging.info('websocket open')
        clients.remove(self)

    def close(self):
        logging.info('websocket closed')
        clients.remove(self)

class MainHandler(tornado.web.RequestHandler):
    def get(self):
        self.render("websocket.html")

application = tornado.web.Application([
    (r'/ws', SocketHandler),
    (r"/", MainHandler),
])

def startTornado():
    application.listen(8888)
    tornado.ioloop.IOLoop.instance().start()

def stopTornado():
    tornado.ioloop.IOLoop.instance().stop()

if __name__ == "__main__":
    logging.info('starting thread RMQ')
    threadRMQ = Thread(target=threaded_rmq)
    threadRMQ.start()
    logging.info('starting thread tornado')
    threadTornado = Thread(target=startTornado)
    threadTornado.start()
    try:
        raw_input("server ready")
    except SyntaxError:
        pass
    try:
        logging.info('disconnected')
        disconnect_rmq()
    except Exception, e:
        pass
    stopTornado()
But I got this error:
WARNING:tornado.access:404 GET /favicon.ico (192.168.43.10) 0.98ms
Please help me.
In your SocketHandler.open function you need to add the client, not remove it.
Also consider using a set for clients instead of a list, because the remove operation will be faster:
clients = set()
...

class SocketHandler(tornado.websocket.WebSocketHandler):
    def open(self):
        logging.info('websocket open')
        clients.add(self)

    def on_close(self):  # Tornado calls on_close, not close
        logging.info('websocket closed')
        clients.remove(self)
The message you get regarding favicon.ico is actually a warning and it's harmless (the browser is requesting an icon to show for the web application, but won't complain if none is available).
You might also run into threading issues, because Tornado and pika are running in different threads, so you will have to synchronize them; you can use Tornado's IOLoop.add_callback method for that.
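For example, a sketch of that synchronization: keep a reference to the IOLoop and hand each message over with add_callback, which is the IOLoop method documented as safe to call from other threads (the broadcast helper is an assumed name, adapted to the question's code):

# a sketch, not part of the original answer
import tornado.ioloop

io_loop = tornado.ioloop.IOLoop.instance()  # obtained on the main thread

def consumer_callback(ch, method, properties, body):
    # runs on the pika thread; schedule the broadcast on the Tornado loop
    io_loop.add_callback(broadcast, body)

def broadcast(body):
    # runs on the Tornado thread, so touching the handlers is safe
    for client in clients:
        client.write_message(body)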

Django manage.py runserver graceful reloading

I am working on a Django project which integrates a webcam and OpenCV. For the webcam access, I use the following code. The webcam is released if I use Ctrl+C to end a running server, but if the server reloads itself after a code change, the webcam is not released properly and therefore becomes unavailable. How can I detect hot reloading so I can close the webcam properly?
I am aware of the option of disabling hot reloading, but this is rather inconvenient. Is there any option I can use programmatically?
import cv2


class VideoCamera(object):
    def __init__(self):
        self.video = None

    def __del__(self):
        if self.video is not None and self.video.isOpened():
            self.video.release()

    def get_frame(self):
        try:
            self.video = cv2.VideoCapture(0)
            success, image = self.video.read()
            self.video.release()
            ret, jpeg = cv2.imencode('.jpg', image)
            return jpeg.tobytes()
        except (SystemExit, KeyboardInterrupt, Exception) as e:
            self.video.release()
            raise e
A few ideas:
1) You could connect to the file_changed signal that is triggered on file change, before the reload takes place:
https://github.com/django/django/blob/9386586f31b8a0bccf59a1bff647cd829d4e79aa/django/utils/autoreload.py#L24
def notify_file_changed(self, path):
    results = file_changed.send(sender=self, file_path=path)
    logger.debug('%s notified as changed. Signal results: %s.', path, results)
    if not any(res[1] for res in results):
        trigger_reload(path)
2) Simply monkey-patch the trigger_reload function and inject the webcam-closing code (a sketch follows below):
https://github.com/django/django/blob/9386586f31b8a0bccf59a1bff647cd829d4e79aa/django/utils/autoreload.py#L221
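A rough sketch of idea 2; the camera.release_webcam() call is a hypothetical hook on your own camera object, not a Django API:

# a sketch of monkey-patching django.utils.autoreload.trigger_reload
from django.utils import autoreload

_original_trigger_reload = autoreload.trigger_reload

def patched_trigger_reload(filename):
    camera.release_webcam()  # hypothetical cleanup hook on your camera object
    _original_trigger_reload(filename)

autoreload.trigger_reload = patched_trigger_reload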
If your class is a singleton, which it appears to be, create an instance of your class in its file and then add Django signal handlers to perform the clean-up. As it is generally not recommended to use __del__, I would recommend you add a handler for system signals (like SIGINT/Ctrl+C) which forwards them as a Django signal. That means: listen on file_changed from django.utils.autoreload, and on SIGINT.
To use the singleton in other files, import it with from my_app.singleton import my_singleton.
# my_app/singleton.py
class MySingleton:
    def __init__(self):
        """Init code here"""

    def close(self):
        """Shutdown code here"""

my_singleton = MySingleton()
Then, register your signal handlers for your app. This is the standard way, AFAIK.
# my_app/apps.py
from django.apps import AppConfig


class MyAppConfig(AppConfig):
    name = "my_app"

    def ready(self):
        import my_app.signals
In the signal handler file itself, import the singleton and close it on shutdown. The handler is called on file_changed and on SIGINT.
# my_app/signals.py
from django.dispatch import receiver
from django.utils.autoreload import file_changed

from my_app.signal_definitions import system_shutdown_signal
from my_app.singleton import my_singleton


@receiver(file_changed)
@receiver(system_shutdown_signal)
def my_shutdown_handler(sender, **kwargs):
    my_singleton.close()
Register a handler to catch the SIGINT signal from the system.
# my_app/__init__.py
import signal
import sys

from my_app.signal_definitions import system_shutdown_signal


def _forward_to_django_shutdown_signal(signal, frame):
    print(f"Shutting down Django {sys.argv}")
    system_shutdown_signal.send("system")
    sys.exit(0)

signal.signal(signal.SIGINT, _forward_to_django_shutdown_signal)
The Django signal used to propagate the system SIGINT signal.
# my_app/signal_definitions.py
from django.dispatch import Signal

system_shutdown_signal = Signal()

Emit/Broadcast Messages on REST Call in Python With Flask and Socket.IO

Background
The purpose of this project is to create an SMS-based kill switch for a program I have running locally. The plan is to create a web socket connection between the local program and an app hosted on Heroku. Using Twilio, receiving an SMS will trigger a POST request to this app. If it comes from a number on my whitelist, the application should send a command to the local program to shut down.
Problem
What can I do to find a reference to the namespace so that I can broadcast a message to all connected clients from a POST request?
Right now I am simply creating a new web socket client, connecting it and sending the message, because I can't seem to figure out how to get access to the namespace object in a way that lets me call an emit or broadcast.
Server Code
from gevent import monkey
from flask import Flask, Response, render_template, request
from socketio import socketio_manage
from socketio.namespace import BaseNamespace
from socketio.mixins import BroadcastMixin
from time import time
import twilio.twiml
from socketIO_client import SocketIO  # only necessary because of the hack solution
import socketIO_client

monkey.patch_all()

application = Flask(__name__)
application.debug = True
application.config['PORT'] = 5000

# White list
callers = {
    "+15555555555": "John Smith"
}

# Part of 'hack' solution
stop_namespace = None
socketIO = None

# Part of 'hack' solution
def on_connect(*args):
    global stop_namespace
    stop_namespace = socketIO.define(StopNamespace, '/chat')

# Part of 'hack' solution
class StopNamespace(socketIO_client.BaseNamespace):
    def on_connect(self):
        self.emit("join", 'server@email.com')
        print '[Connected]'

class ChatNamespace(BaseNamespace, BroadcastMixin):
    stats = {
        "people": []
    }

    def initialize(self):
        self.logger = application.logger
        self.log("Socketio session started")

    def log(self, message):
        self.logger.info("[{0}] {1}".format(self.socket.sessid, message))

    def report_stats(self):
        self.broadcast_event("stats", self.stats)

    def recv_connect(self):
        self.log("New connection")

    def recv_disconnect(self):
        self.log("Client disconnected")
        if self.session.has_key("email"):
            email = self.session['email']
            self.broadcast_event_not_me("debug", "%s left" % email)
            self.stats["people"] = filter(lambda e: e != email, self.stats["people"])
            self.report_stats()

    def on_join(self, email):
        self.log("%s joined chat" % email)
        self.session['email'] = email
        if not email in self.stats["people"]:
            self.stats["people"].append(email)
        self.report_stats()
        return True, email

    def on_message(self, message):
        message_data = {
            "sender": self.session["email"],
            "content": message,
            "sent": time() * 1000  # ms
        }
        self.broadcast_event_not_me("message", {"sender": self.session["email"], "content": message})
        return True, message_data

@application.route('/stop', methods=['GET', 'POST'])
def stop():
    '''Right here SHOULD simply be Namespace.broadcast("stop") or something.'''
    global socketIO
    if socketIO == None or not socketIO.connected:
        socketIO = SocketIO('http://0.0.0.0:5000')
        socketIO.on('connect', on_connect)
    global stop_namespace
    if stop_namespace == None:
        stop_namespace = socketIO.define(StopNamespace, '/chat')
    stop_namespace.emit("join", 'server@bayhill.com')
    stop_namespace.emit('message', 'STOP')
    return "Stop being processed."

@application.route('/', methods=['GET'])
def landing():
    return "This is Stop App"

@application.route('/socket.io/<path:remaining>')
def socketio(remaining):
    try:
        socketio_manage(request.environ, {'/chat': ChatNamespace}, request)
    except:
        application.logger.error("Exception while handling socketio connection",
                                 exc_info=True)
    return Response()
I borrowed code heavily from the chatzilla project, which is admittedly pretty different because I am not really working with a browser.
Perhaps Socket.IO was a bad choice for web sockets and I should have used Tornado, but this seemed like it would work well, and this setup helped me easily separate the REST and web socket pieces.
I just use Flask-SocketIO for that.
from gevent import monkey
monkey.patch_all()

from flask import Flask
from flask.ext.socketio import SocketIO

app = Flask(__name__)
socketio = SocketIO(app)


@app.route('/trigger')
def trigger():
    socketio.emit('response',
                  {'data': 'someone triggered me'},
                  namespace='/global')
    return 'message sent via websocket'

if __name__ == '__main__':
    socketio.run(app)
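On the local program's side, a client can then listen for that event. A minimal sketch using the socketIO_client package; the event and namespace names match the trigger route above, while the host, port, and shutdown logic are assumptions:

# a hypothetical local client, not part of the original answer
from socketIO_client import SocketIO, BaseNamespace


class GlobalNamespace(BaseNamespace):
    def on_response(self, data):
        print('received:', data)
        # shut the local program down here if data signals a stop


socket = SocketIO('localhost', 5000)
socket.define(GlobalNamespace, '/global')
socket.wait()  # block and process incoming events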

Scale Gevent Socketio

I currently have a site set up using Django. I have added gevent-socketio to provide a chat function. I need to scale it, as there are quite a few users already on the site, and I can't find a way to do so.
I tried https://github.com/abourget/gevent-socketio/tree/master/examples/django_chat/chat
I am using Gunicorn with the socketio.sgunicorn.GeventSocketIOWorker worker class, so at first I thought of increasing the worker count. Unfortunately this seems to fail intermittently. I have started rewriting it to use Redis, based on a few sources I found, and now have one worker on each server, load balanced. However, this seems to have the same problem. I am wondering if there is some issue in the gevent-socketio code itself which does not allow it to scale.
Here is how I have started; this is just the submit-message code.
from django.conf import settings
from redis import Redis
import simplejson


def redis_client():
    """Get a redis client."""
    return Redis(settings.REDIS_HOST, settings.REDIS_PORT, settings.REDIS_DB)


class PubSub(object):
    """
    Very simple Pub/Sub pattern wrapper
    using simplified Redis Pub/Sub functionality.

    Usage (publisher)::

        import redis
        r = redis.Redis()
        q = PubSub(r, "channel")
        q.publish("test data")

    Usage (listener)::

        import redis
        r = redis.Redis()
        q = PubSub(r, "channel")

        def handler(data):
            print "Data received: %r" % data

        q.subscribe(handler)
    """
    def __init__(self, redis, channel="default"):
        self.redis = redis
        self.channel = channel

    def publish(self, data):
        self.redis.publish(self.channel, simplejson.dumps(data))

    def subscribe(self, handler):
        redis = self.redis.pubsub()
        redis.subscribe(self.channel)
        for data_raw in redis.listen():
            if data_raw['type'] != "message":
                continue
            data = simplejson.loads(data_raw["data"])
            handler(data)
from socketio.namespace import BaseNamespace
from socketio.sdjango import namespace
from supremo.utils import redis_client, PubSub
from gevent import Greenlet


@namespace('/chat')
class ChatNamespace(BaseNamespace):
    nicknames = []
    r = redis_client()
    q = PubSub(r, "channel")

    def initialize(self):
        # Set up a redis listener in a greenlet
        def handler(data):
            self.emit('receive_message', data)

        greenlet = Greenlet.spawn(self.q.subscribe, handler)

    def on_submit_message(self, msg):
        self.q.publish(msg)
I used parts of the code from https://github.com/fcurella/django-push-demo and gevent-socketio 0.3.5rc1 instead of rc2, and it is now working with multiple workers and load balancing.