I am looking for a way to publish messages to a RabbitMQ server from my Django application. This is not for task offloading, so I don't want to use Celery. The purpose is to publish to the exchange from the Django application and have a sister (non-Django) application in a Docker container consume from that queue.
This all seems very straightforward; however, I can't seem to publish to the exchange without establishing and closing a connection each time, even without explicitly calling for that to happen.
In an attempt to solve this, I have defined a class with a nested singleton class that maintains a connection to the RabbitMQ server using Pika. The idea was that the nested singleton would be instantiated only once, establishing the connection at that time. Any time something is to be published to the queue, the singleton handles it.
import logging
import os

import pika

logger = logging.getLogger('django')

# rmq_host, rmq_port, rmq_vhost, rmq_username and rmq_password are assumed to be
# defined elsewhere (e.g. read from the environment via os.environ).

class PikaChannelSingleton:

    class __Singleton:
        channel = pika.adapters.blocking_connection.BlockingChannel

        def __init__(self):
            self.initialize_connection()

        def initialize_connection(self):
            logger.info('Attempting to establish RabbitMQ connection')
            credentials = pika.PlainCredentials(rmq_username, rmq_password)
            parameters = pika.ConnectionParameters(rmq_host, rmq_port, rmq_vhost, credentials, heartbeat=0)
            connection = pika.BlockingConnection(parameters)
            con_chan = connection.channel()
            con_chan.exchange_declare(exchange='xchng', exchange_type='topic', durable=True)
            self.channel = con_chan

        def send(self, routing_key, message):
            if self.channel.is_closed:
                PikaChannelSingleton.instance.initialize_connection()
            self.channel.basic_publish(exchange='xchng', routing_key=routing_key,
                                       body=message)

    instance = None

    def __init__(self, *args, **kwargs):
        if not PikaChannelSingleton.instance:
            logger.info('Creating channel singleton')
            PikaChannelSingleton.instance = PikaChannelSingleton.__Singleton()

    @staticmethod
    def send(routing_key, message):
        PikaChannelSingleton.instance.send(routing_key, message)

rmq_connection = PikaChannelSingleton()
I then import rmq_connection where needed in the Django application. Everything works in toy applications and in the Python REPL, but a new connection is established every time the send function is called in the Django application. The connection then immediately closes with the message 'client unexpectedly closed TCP connection'. The message does get published to the exchange correctly.
So I am sure there is something going on with Django and how it handles processes and such. The question still remains: how do I publish numerous messages to a queue without re-establishing a connection each time?
If I understand correctly, connections cannot be kept alive like that in a single-threaded context. As your Django app continues executing, the AMQP client is not sending heartbeats on the channel, and the connection will die.
You could use SelectConnection instead of BlockingConnection, though that is probably not easy in the context of Django.
A good compromise could be to simply collect messages in your singleton and then send them all at once with a BlockingConnection at the very end of your Django request.
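A minimal sketch of that compromise, reusing the rmq_* settings and the 'xchng' exchange from the question; the buffer class, the middleware class and their names are hypothetical:

import pika

# rmq_host, rmq_port, rmq_vhost, rmq_username and rmq_password as in the question.

class PikaMessageBuffer:
    # Collects (routing_key, body) pairs and publishes them in one batch.

    def __init__(self):
        self.messages = []

    def send(self, routing_key, message):
        # Queue the message in memory instead of publishing immediately.
        self.messages.append((routing_key, message))

    def flush(self):
        # Open one connection, publish everything, close it again.
        if not self.messages:
            return
        credentials = pika.PlainCredentials(rmq_username, rmq_password)
        parameters = pika.ConnectionParameters(rmq_host, rmq_port, rmq_vhost, credentials)
        connection = pika.BlockingConnection(parameters)
        channel = connection.channel()
        channel.exchange_declare(exchange='xchng', exchange_type='topic', durable=True)
        for routing_key, body in self.messages:
            channel.basic_publish(exchange='xchng', routing_key=routing_key, body=body)
        self.messages = []
        connection.close()

rmq_buffer = PikaMessageBuffer()

class PikaFlushMiddleware:
    # Django middleware that flushes the buffered messages after each request.

    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        response = self.get_response(request)
        rmq_buffer.flush()
        return response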
In my Django application, I need to connect to an MQTT broker from several locations.
It would be great if I could create some kind of MQTT worker that runs in the background (or in a different thread), so I could use this worker to publish and subscribe to messages without creating a separate MQTT connection in each function.
Example:
Create an MQTT worker with connection details. On startup, this connection is started and handled, restarted if the connection is lost, etc... (maybe use Celery for this?)
Create functions that are available inside my Django project for publishing and subscribing. Publishing seems more straightforward, but I'm not sure about the subscribe part.
My current implementation:
@shared_task(
    bind=True,
    name="tasks.send_command",
)
def send_command(self, username):
    pedestals = User.objects.filter(username=username)
    client = MqttClient("scheduled")
    client.connect()
    ...

@shared_task(
    bind=True,
    name="tasks.toggle_switch",
)
def toggle_switch(self, switch):
    from django.core.cache import cache
    client = MqttClient("toggle-switch")
    client.connect()
    ...
As you can see, I need to create the client in every task. I'm also using it multiple times elsewhere in Django, not just in Celery tasks.
How can I create a worker for this? Something like:
@shared_task(
    bind=True,
    name="tasks.toggle_switch",
)
def toggle_switch(self, switch):
    from django.core.cache import cache
    from mqtt.worker import mqtt_worker
    mqtt_worker.publish()
    ...
That way I could simplify my codebase and I would not have to wait for the client to connect every time the task runs.
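Roughly, what I have in mind for mqtt/worker.py is something like this sketch (paho-mqtt, with a placeholder broker host; the MqttWorker class and its methods are only an illustration):

# mqtt/worker.py -- sketch of a process-wide MQTT worker.
import paho.mqtt.client as mqtt

class MqttWorker:
    def __init__(self, host, port=1883, keepalive=60):
        self.client = mqtt.Client()
        self.client.on_message = self._on_message
        self.client.connect(host, port, keepalive)
        # loop_start() runs the network loop in a background thread,
        # so the connection stays alive without blocking Django.
        self.client.loop_start()

    def publish(self, topic, payload=None, qos=0):
        return self.client.publish(topic, payload, qos=qos)

    def subscribe(self, topic, callback=None):
        # Optionally register a per-topic callback, then subscribe.
        if callback is not None:
            self.client.message_callback_add(topic, callback)
        self.client.subscribe(topic)

    def _on_message(self, client, userdata, msg):
        # Fallback handler for messages without a per-topic callback.
        print("MQTT message on %s: %s" % (msg.topic, msg.payload))

# Created once when the module is first imported, then reused everywhere.
mqtt_worker = MqttWorker("localhost")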
I have found mqttasgi but I don't know if it will fit my needs.
I would like a way to integrate Django with MQTT, and the first thing that came to my mind was using django-channels and an MQTT broker that supports MQTT over websockets, so I could communicate directly between the broker and django-channels.
However, I did not find a way to start a websocket client from Django, and according to this link it's not possible.
Since I'm also starting to study task queues, I wonder if it would be good practice to start an MQTT client using paho-mqtt and run it in a separate process using Celery. This process would then forward the messages received from the broker to django-channels through websockets; this way I could also communicate with the client process, to publish data or stop the MQTT client when needed, all directly from Django.
I'm a little skeptical about this idea since I also read that tasks run in Celery should not take too long to complete, and in this case that's exactly what I want to do.
So my question is, how much of a bad idea is that? Is there any other option to directly integrate Django with MQTT?
*Note: I don't want a separate process running on the server; I want to be able to start and stop the process from Django, in order to have full control over the MQTT client from the web GUI.
I found a better way that does not need Celery.
I simply start an MQTT client in app/apps.py in the ready method, so a client is started every time I run the application. From there I can communicate with other parts of the system using django-channels or signals.
apps.py:
from django.apps import AppConfig
from threading import Thread
import paho.mqtt.client as mqtt

class MqttClient(Thread):
    def __init__(self, broker, port, timeout, topics):
        super(MqttClient, self).__init__()
        self.client = mqtt.Client()
        self.broker = broker
        self.port = port
        self.timeout = timeout
        self.topics = topics
        self.total_messages = 0

    # run method override from Thread class
    def run(self):
        self.connect_to_broker()

    def connect_to_broker(self):
        self.client.on_connect = self.on_connect
        self.client.on_message = self.on_message
        self.client.connect(self.broker, self.port, self.timeout)
        self.client.loop_forever()

    # The callback for when a PUBLISH message is received from the server.
    def on_message(self, client, userdata, msg):
        self.total_messages = self.total_messages + 1
        print(str(msg.payload) + "Total: {}".format(self.total_messages))

    # The callback for when the client receives a CONNACK response from the server.
    def on_connect(self, client, userdata, flags, rc):
        # Subscribe to the list of topics
        for topic in self.topics:
            client.subscribe(topic)

class CoreConfig(AppConfig):
    default_auto_field = 'django.db.models.BigAutoField'
    name = 'core'

    def ready(self):
        MqttClient("192.168.0.165", 1883, 60, ["teste/01"]).start()
If you are using ASGI in your Django application, you can use MQTTAsgi. Full disclosure: I'm the author of MQTTAsgi.
It's a complete protocol server for Django and MQTT.
To utilize the MQTT protocol server to run your application, you first need to create an MQTT consumer:
from mqttasgi.consumers import MqttConsumer

class MyMqttConsumer(MqttConsumer):

    async def connect(self):
        await self.subscribe('my/testing/topic', 2)

    async def receive(self, mqtt_message):
        print('Received a message at topic:', mqtt_message['topic'])
        print('With payload', mqtt_message['payload'])
        print('And QOS:', mqtt_message['qos'])
        pass

    async def disconnect(self):
        await self.unsubscribe('my/testing/topic')
Then you should add this protocol to the protocol router:
application = ProtocolTypeRouter({
    'websocket': AllowedHostsOriginValidator(URLRouter([
        url('.*', WebsocketConsumer)
    ])),
    'mqtt': MyMqttConsumer,
    ....
})
Then you can run the mqtt protocol server with*:
mqttasgi -H localhost -p 1883 my_application.asgi:application
*Assuming the broker is in localhost and port 1883.
I wanted to solve this problem too but found no good solutions out there that really fit the Channels architecture (MQTTAsgi came close, but it uses paho-mqtt and doesn't fully use the Channels layer system).
I created: https://pypi.org/project/chanmqttproxy/
(src at https://github.com/lbt/channels-mqtt-proxy)
Essentially it's a fully async Channels 3 proxy to MQTT that allows publishing and subscribing. The documentation shows how to extend the standard Channels tutorial so that chat messages are seen on MQTT topics - and can be sent from MQTT topics to all websocket browser clients.
I don't know if this is what the OP wants as far as listening to MQTT topics goes, but for the general case I think this is a good solution.
I am using websockets with Redis in Django. Django was running fine on a macOS server, but I started running it on a Red Hat Linux server and now the server gives me this error whenever I send a message over websockets:
ERROR - server - HTTP/WS send decode error:
Cannot dispatch message on channel
u'daphne.response.fzdRCEVZkh!nqhIpaLfWb' (unknown)
Note: even though I get the error, the message is received correctly.
I couldn't find any resources for this error.
I followed the official instructions for channels.
According to Andrew Godwin (the developer of the channels package), this message is logged when you have a channel that was disconnected, but not removed from channel group(s):
Ah yes, that's Daphne being a little bit more verbose than before, I need to remove that. Don't worry about it - it's perfectly normal after you disconnect a channel that's still in a group. You might want to add a Group.discard call in a disconnect handler to stop it, though.
Source.
I had the same error, using a custom implementation of channels.generic.websockets.WebsocketConsumer. After cleaning up channels from groups in the disconnect callback, the message disappeared.
A short example with a class-based consumer, assuming you add clients to a broadcast group named foo when the connection is established. Then, on client disconnect, remove its channel from the group:
from channels import Group
from channels.generic.websockets import JsonWebsocketConsumer

class MyConsumer(JsonWebsocketConsumer):
    groupname = 'foo'

    def connect(self, message, **kwargs):
        # send an accept or the connection will be dropped automatically
        self.message.reply_channel.send({"accept": True})
        # add the channel to the broadcast group
        Group(self.groupname).add(message.reply_channel)
        # do the rest of logic that should happen on connection established
        ...

    def disconnect(self, message, **kwargs):
        Group(self.groupname).discard(message.reply_channel)
        # do the rest of logic that should happen on disconnect
        ...
I am trying to move part of the messaging system to Redis. I have a question regarding connection management from Django to Redis. The following is taken from Quora:
When talking to Redis from Django (or indeed any other web framework, I imagine) an interesting challenge is deciding when to connect and disconnect.
If you make a new connection for every query to Redis, that's a ton of unnecessary overhead considering a single page request might make hundreds of Redis requests.
If you keep one connection open in the thread / process, you end up with loads of unclosed connections which can lead to problems. I've also seen the Redis client library throw the occasional timeout error, which is obviously bad.
The best result I've had has been from opening a single Redis connection at the start of the request, then closing it at the end - which can be achieved with Django middleware. It feels a bit dirty though having to add a piece of middleware just to get this behaviour.
Has anybody had a chance to create such Redis middleware? I am always in favor of not reinventing the wheel, but I didn't find anything on Google related to this topic.
I implemented the middleware:
import redis
from redis_sessions import settings

# Avoid new redis connection on each request
if settings.SESSION_REDIS_URL is not None:
    redis_server = redis.StrictRedis.from_url(settings.SESSION_REDIS_URL)
elif settings.SESSION_REDIS_UNIX_DOMAIN_SOCKET_PATH is None:
    redis_server = redis.StrictRedis(
        host=settings.SESSION_REDIS_HOST,
        port=settings.SESSION_REDIS_PORT,
        db=settings.SESSION_REDIS_DB,
        password=settings.SESSION_REDIS_PASSWORD
    )
else:
    redis_server = redis.StrictRedis(
        unix_socket_path=settings.SESSION_REDIS_UNIX_DOMAIN_SOCKET_PATH,
        db=settings.SESSION_REDIS_DB,
        password=settings.SESSION_REDIS_PASSWORD,
    )

class ReddisMiddleWare(object):
    def process_request(self, request):
        request.redisserver = redis_server
Then in the view I just use request.redisserver.get(key).
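For comparison, here is a rough sketch of the "open at the start of the request, close at the end" variant described in the quote, written as a new-style Django middleware; the class name and connection settings are placeholders, and only request.redisserver is kept from the answer above:

import redis

class PerRequestRedisMiddleware(object):
    # Opens a Redis client per request and releases its connections afterwards.
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        # Placeholder connection settings.
        client = redis.StrictRedis(host='localhost', port=6379, db=0)
        request.redisserver = client
        try:
            return self.get_response(request)
        finally:
            # redis-py keeps connections in a pool; disconnect() releases them.
            client.connection_pool.disconnect()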
How can I access the result of a Celery task in my main Django application process? Or, how can I publish to an existing socket connection from a separate process?
I have an application in which users receive scores. When a score is recorded, calculations are made (progress towards goals, etc), and based on those calculations notifications are sent to interested users. The calculations may take 30s+, so to avoid sluggish UI those operations are performed in a background process via a Celery task, invoked by the post_save signal of my Score model.
Ideally the post_save signal on my Notification model would publish a message to subscribed clients (I'm using django-socketio, a wrapper for gevent-socketio). This seems straightforward...
Create a Score
Do some calculations on the new Score instance in a background process
Based on those calculations, create a Notification
On Notification save, grab the instance and publish to subscribed clients via socket connection
However after trying the following I'm not sure this is possible:
passing gevent's SocketIOServer instance to the callback method invoked by the task, but this requires pickling the passed object, which isn't possible
storing the socket's session_id (different from Django's session_id) in memcache and retrieving that in the Celery task process
using Redis pub/sub, so methods called by post_save signals on models created in a background process could simply publish to a Redis channel, but listening to that channel in the main application process (which has access to the socket connection) blocks the rest of the application
I've also tried spawning new threads for each Redis client, one per socket subscriber. As far as I can tell this requires spawning a new gevent.greenlets.Greenlet, and gevent can't be used in multiple threads.
Surely this is a solved problem. What am I missing?
You already have django-socketio; writing a pub/sub with Redis would be a pity :)
client side:
var socket = new io.Socket();
socket.connect();
socket.on('connect', function() {
    socket.subscribe({{ score_update_channel }});
});
server side:
from django_socketio import broadcast_channel

def user_score_update(user):
    return 'score_updates_user_%s' % user.pk

channel = user_score_update(user)
broadcast_channel(score_result_data, channel)
You need to run the broadcast on the django-socketio process; if you run it from a different process (the Celery worker) it will not work, because channels are referenced in memory by the django-socketio process. You can solve this by wrapping the broadcast in a view that Celery calls (making a real HTTP request) when the task is complete.
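A rough sketch of that workaround, reusing the channel naming and the broadcast_channel call from above; the URL, view name, payload and the use of the requests library are assumptions:

# views.py -- runs inside the django-socketio process.
from django.http import HttpResponse
from django_socketio import broadcast_channel

def notify_score_update(request, user_id):
    # Broadcast from the process that actually holds the socket channels.
    channel = 'score_updates_user_%s' % user_id
    broadcast_channel({'user_id': user_id}, channel)
    return HttpResponse('ok')

# tasks.py -- runs inside the Celery worker.
import requests
from celery import shared_task

@shared_task
def publish_score_update(user_id):
    # A real HTTP request, so the broadcast happens in the
    # django-socketio process instead of the worker.
    requests.get('http://localhost:8000/notify-score-update/%s/' % user_id)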