How to send message to event hub via python - azure-eventhub

I am trying to connect lots of IoT devices to an Event Hub and save their messages to blob storage (and also to an SQL database). I want to do this with Python (and I am not sure whether that is a recommended practice). The documentation about Python was confusing. I tried a few examples, but they only create entries in blob storage that seem to be irrelevant.
Things like this:
Objavro.codecnullavro.schema\EC{"type":"record","name":"EventData","namespace":"Microsoft.ServiceBus.Messaging","fields":[{"name":"SequenceNumber","type":"long"}...
which is not what I send. How can I solve this?

You could use the azure-eventhub Python SDK to send messages to Event Hub which is available on pypi.
And there is a send sample showing how to send messages:
import os
from azure.eventhub import EventHubProducerClient, EventData

# Connection string and event hub name, taken here (for example) from environment variables
CONNECTION_STR = os.environ["EVENT_HUB_CONN_STR"]
EVENTHUB_NAME = os.environ["EVENT_HUB_NAME"]

producer = EventHubProducerClient.from_connection_string(
    conn_str=CONNECTION_STR,
    eventhub_name=EVENTHUB_NAME
)

with producer:
    # build a batch, add events to it, then send the whole batch
    event_data_batch = producer.create_batch()
    event_data_batch.add(EventData('Single message'))
    producer.send_batch(event_data_batch)
I'm interested in your comment that the documentation about Python was confusing and that the entries written to blob storage seem irrelevant.
Could you share your code with me? I'm wondering what the input/output for Event Hub and Storage Blob is, and how the data flows through your processing.
By the way, for Azure Storage Blob Python SDK usage, you can check the repo and the blob samples for more information.
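As a side note: entries starting with "Objavro..." like the one in the question look like files written by Event Hubs Capture, which stores events in Apache Avro format; the payload you actually sent sits in the "Body" field of each record. A minimal sketch for inspecting such a file locally (the file name is hypothetical, and it assumes the avro package is installed):

from avro.datafile import DataFileReader
from avro.io import DatumReader

# "capture.avro" is a blob downloaded from the capture container
with open("capture.avro", "rb") as f:
    for record in DataFileReader(f, DatumReader()):
        print(record["Body"])  # the original event payload, as bytes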

This is the connection setup for publishing new messages to an Event Hub using kafka-python. If you were already using Kafka and want to switch over, you just have to change these connection settings.
import json
import ssl
from kafka import KafkaProducer

KAFKA_HOST = "{your_eventhub}.servicebus.windows.net:9093"
KAFKA_ENDPOINT = "Endpoint=sb://{your_eventhub}.servicebus.windows.net/;SharedAccessKeyName=RootSendAccessKey;SharedAccessKey={your_access_key}"

# Event Hubs requires TLS 1.2, so disable the older TLS versions
context = ssl.create_default_context()
context.options &= ssl.OP_NO_TLSv1
context.options &= ssl.OP_NO_TLSv1_1

producer = KafkaProducer(
    bootstrap_servers=KAFKA_HOST, security_protocol='SASL_SSL', sasl_mechanism='PLAIN',
    sasl_plain_username='$ConnectionString', sasl_plain_password=KAFKA_ENDPOINT,
    value_serializer=lambda v: json.dumps(v).encode('utf-8'),
    connections_max_idle_ms=5400000, api_version=(0, 10), retries=5, ssl_context=context)
You can find the values for KAFKA_HOST and KAFKA_ENDPOINT in the Azure portal.
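With the producer above, publishing then works like any other kafka-python send; the topic name is simply the event hub (entity) name. A small sketch, where the hub name and payload are made up:

producer.send('{your_eventhub_name}', {"temperature": 21.5})  # dict goes through the JSON value_serializer
producer.flush()  # block until the message has actually been handed off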

Related

How to use ConversationHandler for Telegram bot in AWS Lambda

I'm currently writing a Telegram bot using python-telegram-bot as a wrapper. I want to try and host this on AWS Lambda. However, so far the examples I've seen are simple, dumb bots that are unable to continue a conversation. I'm leveraging ConversationHandler to run the bot's conversations but this doesn't work well on AWS Lambda. I'm not sure how to fix this issue.
bot = MyBot()

def lambda_handler(event=None, context=None):
    try:
        dispatcher = bot.updater.dispatcher
        message = json.loads(event['body'])
        print("Incoming:", message)
        dispatcher.process_update(Update.de_json(message, bot.updater.bot))
    except Exception as e:
        print(e)
        return {"statusCode": 500}

    bot.updater.idle()
    return {"statusCode": 200}
How can I get the bot to hold a conversation state throughout?
ConversationHandler stores the state internally, i.e. in memory. I don't know how AWS handles initialization of variables, but if the ConversationHandler is initialized anew on each incoming update, it won't remember which state each conversation was in. If you can use some sort of database/file storage on AWS, you can try to use PTB's persistence setup to store the conversation states and reload them for each incoming update.
Disclaimer: I'm currently the maintainer of python-telegram-bot
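For reference, a minimal sketch of that persistence setup, based on python-telegram-bot v13; the token, file path, and conversation name are placeholders, and on Lambda the pickle file would have to live somewhere that actually persists between invocations rather than the ephemeral filesystem:

from telegram.ext import Updater, PicklePersistence, CommandHandler, ConversationHandler

def start(update, context):
    update.message.reply_text("Hi! Send /done to finish.")
    return 0  # the single state of this toy conversation

def done(update, context):
    update.message.reply_text("Bye!")
    return ConversationHandler.END

# Keep conversation states on disk so they survive re-initialization
persistence = PicklePersistence(filename='/tmp/conversation_states.pickle')
updater = Updater('123456:YOUR-BOT-TOKEN', persistence=persistence)

conv_handler = ConversationHandler(
    entry_points=[CommandHandler('start', start)],
    states={0: [CommandHandler('done', done)]},
    fallbacks=[],
    name='my_conversation',  # persistent handlers must have a name
    persistent=True)         # states are saved/restored via the persistence object
updater.dispatcher.add_handler(conv_handler)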

Post from GCP Pub Sub to on-prem Kafka

I have a requirement to publish messages from Google Pub/Sub topic to a Kafka running on my on-prem infrastructure. I stumbled on this link.
https://docs.confluent.io/current/connect/kafka-connect-gcp-pubsub/index.html
This should work. I wanted to know whether you've used any other alternative solution to achieve this.
If you need to integrate Pub/Sub and Kafka, I suggest that you create a script for this purpose. In Python, for example, we have libraries for both Pub/Sub and Kafka.
Based on that, you could create a script more or less like the one below and run it on some processing resource such as Compute Engine, or on your on-premises server:
from google.cloud import pubsub_v1
from kafka import KafkaProducer

# Change these for your real parameters
producer = KafkaProducer(bootstrap_servers='localhost:1234')
subscription_name = "projects/<your-project>/subscriptions/<your-subscription>"

def callback(message):
    # forward each Pub/Sub message payload to the Kafka topic, then acknowledge it
    print(message.data)
    producer.send('<your-topic>', message.data)
    message.ack()

subscriber = pubsub_v1.SubscriberClient()
future = subscriber.subscribe(subscription_name, callback)
future.result()  # block so the script keeps pulling messages

Sending UID / Device ID via Greengrass

I'm running several Greengrass Cores and they send data to an MQTT stream.
I deployed a Lambda on the GGC that reads the incoming serial port data and pushes it to the stream.
But now I want to check which device is sending the data - I tried this to read the hostname:
import socket
host = socket.gethostname()
but the core sends the value "sandbox", so I think the Lambda isn't authorized to read the host name.
The SDK has no documentation for this:
https://github.com/aws/aws-greengrass-core-sdk-python
I want to push the data to an MQTT stream like this:
response = client.publish(
    topic='customer/events/{DEVICE-ID or UID or ARN}/',
    payload=jsonData.encode())
I found something useful in another AWS Python example - thing names are registered in the system environment, so you can import os and get the thing name like this:
import os
device = os.environ['AWS_IOT_THING_NAME']
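Putting the two together, a rough sketch of publishing with the thing name in the topic, assuming the Greengrass Core SDK's IoT data client and a jsonData string as in the question (the helper function name is made up):

import os
import greengrasssdk

client = greengrasssdk.client('iot-data')
device = os.environ['AWS_IOT_THING_NAME']  # thing name of this Greengrass core

def publish_reading(jsonData):
    # use the customer/events/<thing-name>/ layout from the question so the
    # consumer can tell the cores apart
    return client.publish(
        topic='customer/events/{}/'.format(device),
        payload=jsonData.encode())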

Is it possible to connect to the Google IOTCore MQTT Bridge via Javascript?

I've been trying to use the JavaScript version of the Eclipse Paho MQTT client to access the Google IoT Core MQTT Bridge, as suggested here:
https://cloud.google.com/iot/docs/how-tos/mqtt-bridge
However, whatever I do, any attempt to connect with known good credentials (working with other clients) results in this connection error:
errorCode: 7, errorMessage: "AMQJS0007E Socket error:undefined."
Not much to go on there, so I'm wondering if anyone has ever been successful connecting to the MQTT Bridge via Javascript with Eclipse Paho, the client implementation suggested by Google in their documentation.
I've gone through their troubleshooting steps, and things seem to be on the up and up, so no help there either.
https://cloud.google.com/iot/docs/troubleshooting
I have noticed that in their docs they have sample code for Java, Python, etc., but not JavaScript, so I'm wondering if it's simply not supported and their documentation just fails to mention as much.
I've simplified my code to just use the 'Hello World' example in the Paho documentation, and as far as I can tell I've done things correctly (including using my device path as the ClientID, the JWT token as the password, specifying an 'unused' userName field and explicitly requiring MQTT v3.1.1).
In the meantime I'm falling back to polling via their HTTP bridge, but that has obvious latency and network traffic shortcomings.
// Create a client instance
client = new Paho.MQTT.Client("mqtt.googleapis.com", Number(8883), "projects/[my-project-id]/locations/us-central1/registries/[my registry name]/devices/[my device id]");

// set callback handlers
client.onConnectionLost = onConnectionLost;
client.onMessageArrived = onMessageArrived;

// connect the client
client.connect({
    mqttVersion: 4, // maps to MQTT V3.1.1, required by IOTCore
    onSuccess: onConnect,
    onFailure: onFailure,
    userName: 'unused', // suggested by Google for this field
    password: '[My Confirmed Working JWT Token]' // working JWT token
});

function onFailure(resp) {
    console.log(resp);
}
// called when the client connects
function onConnect() {
    // Once a connection has been made, make a subscription and send a message.
    console.log("onConnect");
    client.subscribe("World");
    message = new Paho.MQTT.Message("Hello");
    message.destinationName = "World";
    client.send(message);
}

// called when the client loses its connection
function onConnectionLost(responseObject) {
    if (responseObject.errorCode !== 0) {
        console.log("onConnectionLost:" + responseObject.errorMessage);
    }
}

// called when a message arrives
function onMessageArrived(message) {
    console.log("onMessageArrived:" + message.payloadString);
}
I'm a Googler (but I don't work in Cloud IoT).
Your code looks good to me and it should work. I will try it for myself this evening or tomorrow and report back to you.
I've spent the past day working on a Golang version of the samples published in Google's documentation. Like you, I was disappointed not to see all of Google's usual languages covered by samples.
Are you running the code from a browser or is it running on Node.JS?
Do you have a package.json (if Node) that you would share too please?
Update
Here's a Node.JS (JavaScript, but non-browser) sample that connects to Cloud IoT, subscribes to /devices/${DEVICE}/config and publishes to /devices/${DEVICE}/events.
https://gist.github.com/DazWilkin/65ad8890d5f58eae9612632d594af2de
Place all the files in the same directory.
In index.js, replace the values for the location of Google's CA certificate and your key.
Replace the [[YOUR-X]] values in config.json.
Run "npm install" to pull the packages.
Run "node index.js".
You should be able to pull messages from the Pub/Sub subscription and you should be able to send config messages to the device.
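On the Pub/Sub side, a small hedged sketch (in Python, since that is what the other samples here use) of pulling those telemetry messages; the subscription path is a placeholder and assumes a subscription attached to the registry's telemetry topic:

from google.cloud import pubsub_v1

subscription_name = "projects/[my-project-id]/subscriptions/[my-telemetry-subscription]"

def callback(message):
    print(message.data)  # payload published by the device
    message.ack()

subscriber = pubsub_v1.SubscriberClient()
future = subscriber.subscribe(subscription_name, callback)
try:
    future.result(timeout=30)  # listen for 30 seconds
except Exception:
    future.cancel()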
Short answer is no. Google Cloud IoT Core doesn't support WebSockets.
All the JavaScript MQTT libraries use WebSockets because, in the browser, JavaScript is restricted to making HTTP requests and WebSocket connections only.

Need qpid-proton publish/subscribe amqp sample program for python to access Azure topic

I am using Azure Service Bus to send and receive messages using the AMQP protocol. I have installed the proton-c libraries on my Debian Linux machine. I tried the program below to send and receive messages from a queue. My requirement is to use topics instead of queues. Could anyone please give me a sample program for using topics in Azure?
from proton import Messenger, Message

messenger = Messenger()
message = Message()
message.address = "amqps://owner:<<key>>@namespace.servicebus.windows.net/queuename"
message.body = "sending message to the queue"
messenger.put(message)
messenger.send()
If I give the topic name instead of queuename in the above URL, the program runs forever. Please, someone help me; I am new to Python programming.
I found the solution to this problem myself. I guess very few people are working with Azure cloud, so I didn't get any answers.
Here is the solution:
When we create topics in the Azure Service Bus portal, the "Enable Partitioning" checkbox is selected by default. AMQP doesn't support partitioned topics/queues, so I got stuck with the above issue. Once I deleted the topic and recreated the same topic without selecting the "Enable Partitioning" checkbox, it worked fine. :)
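For completeness, here is roughly what the topic version looks like once the topic is recreated without partitioning; the topic name, subscription name ("mysub") and key are placeholders, and the receive part assumes the usual Service Bus topicname/Subscriptions/subscriptionname path:

from proton import Messenger, Message

address = "amqps://owner:<<key>>@namespace.servicebus.windows.net/mytopic"

messenger = Messenger()
messenger.start()

# sending to the topic works exactly like sending to a queue
message = Message()
message.address = address
message.body = "sending message to the topic"
messenger.put(message)
messenger.send()

# receiving goes through a subscription on the topic
messenger.subscribe(address + "/Subscriptions/mysub")
messenger.recv(1)
incoming = Message()
messenger.get(incoming)
print(incoming.body)
messenger.stop()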