Is there a way of consuming events from Event Hubs from Ruby?
If not, how would one connect a Ruby application that needs to consume events published into Event Hubs?
e.g. using SendGrid and a webhook? Are there more streamlined options?
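For what it's worth: as far as I know there is no official Event Hubs client for Ruby, but a namespace on the Standard tier or above can expose a Kafka-compatible endpoint, so any Kafka client library - including Ruby gems such as rdkafka - can consume from it. A minimal sketch of the connection settings involved, shown with Python's kafka-python purely to illustrate (the namespace, hub name, and connection string are placeholders; a Ruby Kafka client would use the same SASL values):

```python
# Sketch: reading an Event Hub through its Kafka-compatible endpoint.
# <namespace>, the hub name, and the connection string are placeholders;
# a Ruby Kafka client (e.g. the rdkafka gem) would use the same settings.
from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "my-event-hub",  # the Event Hub name acts as the Kafka topic
    bootstrap_servers="<namespace>.servicebus.windows.net:9093",
    security_protocol="SASL_SSL",
    sasl_mechanism="PLAIN",
    sasl_plain_username="$ConnectionString",
    sasl_plain_password="Endpoint=sb://<namespace>.servicebus.windows.net/;...",
    group_id="my-consumer-group",
)

for message in consumer:
    print(message.value)
```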
I am working on an IoT project that uses the MQTT protocol to transport sensor data from embedded devices to an App. For this I have created:
An MQTT broker to send the data from the devices.
A custom bridge that pushes data from the MQTT broker to my Kafka broker.
A Django server to push the messages to the App via WebSocket.
Right now, what I need is to consume the Kafka messages from Django, save them to the DB, and then push the data to the client via WebSockets. But I don't have much of an idea how to consume Kafka messages from Django.
So far the solution in my mind is to use a custom management command that starts a Kafka consumer, pushes the data to the DB, and then on to the WebSockets.
Is this a good approach? If not, what would be a good solution?
You can add a periodic task to consume the topic and bulk insert (or update) into the database (mind the performance impact).
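If it helps, here is a rough sketch of the management-command approach from the question, combining the consumer, the DB write, and the websocket push. It assumes kafka-python for the consumer and Django Channels for the websocket layer; the model, topic, and group names are made up:

```python
# myapp/management/commands/consume_sensors.py
# Sketch only: "SensorReading", the topic, and the group names are made up,
# and the websocket push assumes Django Channels is already configured.
import json

from asgiref.sync import async_to_sync
from channels.layers import get_channel_layer
from django.core.management.base import BaseCommand
from kafka import KafkaConsumer  # pip install kafka-python

from myapp.models import SensorReading  # hypothetical domain model


class Command(BaseCommand):
    help = "Consume sensor messages from Kafka, persist them, and push them to clients"

    def handle(self, *args, **options):
        consumer = KafkaConsumer(
            "sensor-data",
            bootstrap_servers="localhost:9092",
            group_id="django-consumer",
            value_deserializer=lambda v: json.loads(v.decode("utf-8")),
        )
        channel_layer = get_channel_layer()
        for message in consumer:
            # Assumes the payload's keys line up with the model's fields.
            SensorReading.objects.create(**message.value)
            # Fan the reading out to websocket clients in the "sensors" group.
            async_to_sync(channel_layer.group_send)(
                "sensors",
                {"type": "sensor.message", "data": message.value},
            )
```

You would run this with python manage.py consume_sensors under a process manager. To get closer to the bulk insert suggested above, collect messages into a list and flush it with bulk_create every N messages or seconds, trading a little latency for throughput.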
Some of the Azure services we're using (e.g. APIM and Function Apps) are sending logs to Azure Event Hubs.
How do I consume these logs in Fluentd?
From the Fluentd plugins page, I can't see any input plugin specifically for Azure Event Hubs. There is, however, a Kafka plugin that might work - I'm not sure.
There is also an Azure Event Hubs output plugin - see here - but I'm looking for an input plugin.
Logstash (which is an alternate log forwarding solution) has an Azure Event Hubs input plugin - see here - but we're looking to use Fluentd for a few other reasons.
Has anyone done this before?
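One data point: Event Hubs namespaces on the Standard tier and above can expose a Kafka-compatible endpoint, so the generic fluent-plugin-kafka input should in principle be able to consume these logs. A rough sketch of what the source block might look like - the namespace and connection string are placeholders, and the parameter names should be double-checked against the plugin's README:

```
<source>
  @type kafka_group
  brokers <namespace>.servicebus.windows.net:9093
  consumer_group fluentd
  topics <your-event-hub>
  format json
  username $ConnectionString
  password "Endpoint=sb://<namespace>.servicebus.windows.net/;..."
  ssl_ca_certs_from_system true
  sasl_over_ssl true
</source>
```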
I can't find any definitive answer here. My IoT service needs to tolerate flaky connections. Currently, I manage a local cache myself and retry a cloud-blob transfer as often as required. Could I replace this with an Azure Event Hubs service? i.e. will the Event Hubs client (on IoT Core) buffer events until the connection is available? If so, where is the documentation on this?
It doesn't seem so according to:
https://azure.microsoft.com/en-us/documentation/articles/event-hubs-programming-guide/
You are responsible for sending and caching, it seems:
Send asynchronously and send at scale
You can also send events to an Event Hub asynchronously. Sending asynchronously can increase the rate at which a client is able to send events. Both the Send and SendBatch methods are available in asynchronous versions that return a Task object. While this technique can increase throughput, it can also cause the client to continue to send events even while it is being throttled by the Event Hubs service and can result in the client experiencing failures or lost messages if not properly implemented. In addition, you can use the RetryPolicy property on the client to control client retry options.
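In other words, the client will retry an individual send according to its retry policy, but it does not durably buffer events across outages - that caching stays on your side. A minimal sketch of the cache-and-retry pattern, illustrated with the Python azure-eventhub SDK rather than the .NET client the guide describes (connection string and names are placeholders):

```python
# Sketch: caller-side buffering around Event Hubs sends. The SDK retries an
# individual send per its retry settings, but durable buffering across long
# outages is left to the caller, as the guide above implies.
import collections
import time

from azure.eventhub import EventData, EventHubProducerClient

producer = EventHubProducerClient.from_connection_string(
    "Endpoint=sb://<namespace>.servicebus.windows.net/;...",  # placeholder
    eventhub_name="telemetry",
    retry_total=3,  # per-send retries handled inside the SDK
)

pending = collections.deque()  # swap for an on-disk queue to survive restarts


def enqueue(payload: bytes) -> None:
    pending.append(payload)


def flush() -> None:
    while pending:
        try:
            batch = producer.create_batch()
            batch.add(EventData(pending[0]))
            producer.send_batch(batch)
            pending.popleft()  # drop only after a confirmed send
        except Exception:
            time.sleep(5)  # connection still flaky; keep the event cached
            break
```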
I'm a newbie to AWS and trying to work with SQS for the first time. I have an Oracle Service Bus (OSB) in a non-cloud environment and would like to configure OSB to consume messages from Amazon SQS. The documentation mentions using the REST API and polling repeatedly for messages. I also read about the client library for JMS, so that OSB could treat SQS as a JMS provider. What is the best approach to achieve this? I appreciate your inputs.
The easiest (though not necessarily the purest) way would be to create a Java EE app that imports the SQS libraries, pulls messages from AWS, and puts them on a local queue for OSB to process. The example code snippets are in Java, so it should be relatively straightforward.
The purest way would be to set SQS up as a remote JMS provider. However, how to do that is not so clear - you may end up writing most of the code that went into option #1 above, but as a JMS client library instead of an MDB.
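For reference, the receive side of option #1 is a small long-polling loop. A sketch with Python's boto3 just to show the pattern - the queue URL and region are placeholders, hand_off_to_local_queue() is a hypothetical hook for whatever hands the message to OSB, and the Java SDK version is analogous:

```python
# Sketch of the long-polling loop at the heart of option #1. The queue URL
# and region are placeholders; hand_off_to_local_queue() is hypothetical.
import boto3

sqs = boto3.client("sqs", region_name="us-east-1")
queue_url = "https://sqs.us-east-1.amazonaws.com/<account-id>/<queue-name>"

while True:
    resp = sqs.receive_message(
        QueueUrl=queue_url,
        MaxNumberOfMessages=10,
        WaitTimeSeconds=20,  # long polling: block up to 20s instead of busy-polling
    )
    for msg in resp.get("Messages", []):
        hand_off_to_local_queue(msg["Body"])  # hypothetical hand-off to OSB
        # Delete only after a successful hand-off so the message isn't lost.
        sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"])
```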
I've got a Grails app (version 2.2.4) with a controller method that "logs" all requests to an external web service (JSON over HTTP; one-way message, no response needed). I want to decouple the controller method from calling the web service directly and synchronously, and provide a simple "queue" that can store the calls if the web service is unavailable and then send them through once the service is back up.
This sounds like a good fit for some sort of JMS solution, but I have no experience with JMS (so the learning curve could be an issue). Should I be using one of the available messaging plugins, or is that overkill for my simple requirements? I don't want a separate messaging app; it has to be embedded in my webapp, and I'd prefer something small and simple over something more complicated and robust (so advice on which plugin would be welcome).
The alternative is to implement an async service myself and queue the "messages" in the database (reading them via a Quartz job) or in memory with something like java.util.concurrent.ConcurrentLinkedQueue?
EDIT: Another approach could be to use log4j with a custom appender set up as an AsyncAppender.
The alternative is to implement an async service myself and queue the "messages" in the database (reading them via a Quartz job)
I went ahead and tried this approach. It was very straightforward and came to only a "screen" length of code in the end. I tested it with a failing web service endpoint as well as an app restart (crash), and it handled both. I used a single service class both to persist the messages (as a Grails domain class) and to flush the queue (triggered by the Quartz scheduler): it reads the DB and fires off the web service calls, removing the DB entity when the web service returns a 200 status code.
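For anyone following the same route, the shape of the solution is essentially a database-backed outbox. A rough transliteration, shown in Python for brevity rather than the Grails/Groovy service actually used (the table name, endpoint URL, and db handle are all made up):

```python
# Sketch of the DB-backed queue described above, transliterated to Python
# (the original is a Grails service + domain class + Quartz trigger; the
# table name, endpoint URL, and db handle here are all made up).
import requests


def enqueue(db, payload):
    # Called from the controller instead of hitting the web service directly.
    db.execute("INSERT INTO queued_message (payload) VALUES (?)", (payload,))


def flush_queue(db):
    # Triggered on a schedule (Quartz in the original implementation).
    for msg_id, payload in db.execute("SELECT id, payload FROM queued_message"):
        try:
            resp = requests.post(
                "https://example.com/log",  # placeholder endpoint
                data=payload,  # payload stored as a JSON string
                headers={"Content-Type": "application/json"},
                timeout=10,
            )
            if resp.status_code == 200:
                # Delete only once the web service has accepted the message, so
                # a crash or an outage simply leaves it queued for the next run.
                db.execute("DELETE FROM queued_message WHERE id = ?", (msg_id,))
        except requests.RequestException:
            pass  # endpoint down; leave the row for the next scheduled run
```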