Does anyone know how to query and create consumer groups in Azure Event Hubs using the .NET SDK? I've googled a lot and can only find a way through the REST API (which I can do, but it would be nicer if I could do it through the SDK).
Thanks in advance
You could try installing this NuGet package; as Sreeram said, we can use the NamespaceManager class to create a consumer group.
var manager = NamespaceManager.CreateFromConnectionString("{your connection string}");
manager.CreateConsumerGroupIfNotExists("{eventHubPath}", "{consumergroup Name}");
After you execute this code, you will find that the consumer group has been created.
To get an existing consumer group, you could call the EventHubClient.GetConsumerGroup method.
var factory = MessagingFactory.CreateFromConnectionString("{your connection string}");
var client = factory.CreateEventHubClient("{eventHubPath}");
EventHubConsumerGroup group = client.GetConsumerGroup("{consumergroup Name}");
There is also NamespaceManager.CreateConsumerGroupIfNotExistsAsync(...); the synchronous form looks like this:
ConsumerGroupDescription realtimeCG = nsMgr.CreateConsumerGroupIfNotExists("PartitionedStream_AKA_EventHub_Name", "{consumergroup Name}");
My application creates a new EventHubClientBuilder() every time it sends an event to EventHub.
EventHubProducerClient producer = new EventHubClientBuilder()
    .connectionString(connectionString, eventHubName)
    .buildProducerClient();
I followed this quick start guide when building this application. I don't think it is best practice to create a new client when the application is expected to publish an event regularly (every 2-3 minutes); however, I cannot find any documentation that explains how to keep the client connection open, aside from this .NET guide, but my application is using Java.
Can someone explain how I could use a single producer client for the duration of the application? The application consumes messages from another application and needs to publish each message to Event Hub.
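A minimal sketch of one way to do this, assuming the same connection string and event hub name as in the snippet above; the Publisher wrapper and its method names are illustrative. EventHubProducerClient is safe to share across threads, so it can be created once at startup, reused for every message, and closed on shutdown:

import com.azure.messaging.eventhubs.EventData;
import com.azure.messaging.eventhubs.EventDataBatch;
import com.azure.messaging.eventhubs.EventHubClientBuilder;
import com.azure.messaging.eventhubs.EventHubProducerClient;

public class Publisher implements AutoCloseable {
    // Created once and reused for every send; the client maintains its
    // own AMQP connection and recovers it transparently if it drops.
    private final EventHubProducerClient producer;

    public Publisher(String connectionString, String eventHubName) {
        this.producer = new EventHubClientBuilder()
            .connectionString(connectionString, eventHubName)
            .buildProducerClient();
    }

    public void publish(String message) {
        EventDataBatch batch = producer.createBatch();
        if (!batch.tryAdd(new EventData(message))) {
            throw new IllegalArgumentException("Event too large for an empty batch");
        }
        producer.send(batch);
    }

    @Override
    public void close() {
        producer.close(); // release the connection on application shutdown
    }
}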
I am using the EventHubConsumerClient.ReadEventsAsync method to read events from an event hub. It works perfectly when I use the default event hub. However, when I route to a new event hub I get an EventHubsException (ConsumerDisconnected) from time to time. According to the documentation, this happens when "a client was forcefully disconnected from an Event Hub instance. This typically occurs when another consumer with higher OwnerLevel asserts ownership over the partition and consumer group." I get this exception almost every time; only occasionally does it work. Does anyone know how to resolve this, or is there a better way to read messages from an event hub? I don't want to use EventProcessorClient, since it requires a BlobContainerClient.
For the code, I followed the sample:
await using var consumerClient = new EventHubConsumerClient(
    EventHubConsumerClient.DefaultConsumerGroupName,
    eventHubConnectionString,
    eventHubName);

await foreach (PartitionEvent partitionEvent in consumerClient.ReadEventsAsync(cancelToken))
{
    ...
}
The error that you're seeing is very specific to a single scenario: another client has opened an AMQP link to one of the partitions you're reading from and has requested that the Event Hubs service give it exclusive access. This results in the Event Hubs service terminating your link with an AMQP error code of Stolen, which the Event Hubs SDK translates into the form that you're seeing. (source)
These requests for exclusive access are enforced on a consumer group level. In your snippet, you're using the default consumer group, which is apparently also used by other consumers. As a best practice, I'd recommend that you create a unique consumer group for each application that is reading from the Event Hub - unless you specifically want them to interact.
In your case, your client is not requesting exclusive access, so anyone that is will take precedence. If you were to create a new consumer group and use that to configure your client, I would expect your disconnect errors to stop.
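As a sketch of that change, assuming a consumer group named "my-app" has already been created for this application (in the portal, with the CLI, or via the management API), only the first constructor argument differs from the snippet above:

await using var consumerClient = new EventHubConsumerClient(
    "my-app", // a dedicated consumer group instead of $Default
    eventHubConnectionString,
    eventHubName);

await foreach (PartitionEvent partitionEvent in consumerClient.ReadEventsAsync(cancelToken))
{
    // process partitionEvent as before
}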
I am trying to connect lots of IoT devices to an event hub and save their messages to blob storage (and also a SQL database). I want to do this with Python (and I am not sure whether this is a recommended practice). The documentation about Python was confusing. I tried a few examples, but they create entries in blob storage whose contents seem to be irrelevant.
Things like this:
Objavro.codecnullavro.schema\EC{"type":"record","name":"EventData","namespace":"Microsoft.ServiceBus.Messaging","fields":[{"name":"SequenceNumber","type":"long"}...
which is not what I send. How can I solve this?
You could use the azure-eventhub Python SDK to send messages to Event Hub; it is available on PyPI.
And there is a send sample showing how to send messages:
import os

from azure.eventhub import EventHubProducerClient, EventData

# The environment variable names here are illustrative.
CONNECTION_STR = os.environ["EVENT_HUB_CONN_STR"]
EVENTHUB_NAME = os.environ["EVENT_HUB_NAME"]

producer = EventHubProducerClient.from_connection_string(
    conn_str=CONNECTION_STR,
    eventhub_name=EVENTHUB_NAME,
)

with producer:
    event_data_batch = producer.create_batch()
    event_data_batch.add(EventData('Single message'))
    producer.send_batch(event_data_batch)
I'm interested in "The documentation about Python was confusing. I tried a few examples, but they create entries in blob storage whose contents seem to be irrelevant."
Could you share your code with me? I'm wondering what the input/output is for Event Hub and Storage Blob, and what the data processing flow looks like.
By the way, for Azure Storage Blob Python SDK usage, you could check the repo and the blob samples for more information.
This is the connection string format for publishing new messages to Event Hub using kafka-python. If you were already using Kafka and want to switch over, you only have to change this connection string.
import json
import ssl

from kafka import KafkaProducer

KAFKA_HOST = "{your_eventhub}.servicebus.windows.net:9093"
KAFKA_ENDPOINT = "Endpoint=sb://{your_eventhub}.servicebus.windows.net/;SharedAccessKeyName=RootSendAccessKey;SharedAccessKey={youraccesskey}"

# Disable legacy TLS versions.
context = ssl.create_default_context()
context.options |= ssl.OP_NO_TLSv1
context.options |= ssl.OP_NO_TLSv1_1

producer = KafkaProducer(
    bootstrap_servers=KAFKA_HOST,
    connections_max_idle_ms=5400000,
    security_protocol='SASL_SSL',
    value_serializer=lambda v: json.dumps(v).encode('utf-8'),
    sasl_mechanism='PLAIN',
    sasl_plain_username='$ConnectionString',  # this literal string, not a placeholder
    sasl_plain_password=KAFKA_ENDPOINT,       # the full connection string acts as the password
    api_version=(0, 10),
    retries=5,
    ssl_context=context,
)
You can find the values for KAFKA_HOST and KAFKA_ENDPOINT in the Azure portal.
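Once the producer is configured, sending works like any other kafka-python producer; the Kafka topic name is simply your event hub's name (the payload below is illustrative):

# The event hub name doubles as the Kafka topic name.
producer.send('{your_eventhub_name}', {'deviceId': 'sensor-1', 'temperature': 22.5})
producer.flush()  # block until the message has actually been delivered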
I am using Azure Service Bus to send and receive messages using the AMQP protocol. I have installed the proton-c libraries on my Debian Linux machine. I tried the program below to send and receive messages from a queue. My requirement is to use topics instead of queues. Could anyone give me a sample program that uses topics in Azure?
from proton import Messenger, Message

messenger = Messenger()
message = Message()

# Note the '@' between the credentials and the host.
message.address = "amqps://owner:<<key>>@namespace.servicebus.windows.net/queuename"
message.body = "sending message to the queue"

messenger.put(message)
messenger.send()
If I give a topic name instead of queuename in the URL above, the program runs forever. Can someone please help? I am new to Python programming.
I found the solution to this problem myself. I guess very few people are working with Azure, so I didn't get any answers.
Here is the solution:
When you create a topic in Azure Service Bus, the portal selects the "Enable Partitioning" checkbox by default. The AMQP protocol doesn't support partitioned topics/queues, so I was stuck with the above issue. Once I deleted the topic and recreated it without selecting "Enable Partitioning", it worked fine. :)
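For reference, once the topic is recreated without partitioning, the original Messenger snippet works unchanged; only the entity name at the end of the address changes (topicname is a placeholder):

# Same credentials and namespace as before; point at the topic instead of the queue.
message.address = "amqps://owner:<<key>>@namespace.servicebus.windows.net/topicname"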
I have a Ruby on Rails 4.0 and PostgreSQL app hosted on an Ubuntu VPS. In this application I want to send email based on data in the database: for example, a background job checks a table's contents every hour and, depending on the content, does or doesn't send an email to the user. I decided to do this with Resque.
How can I do that?
Should I do it in the Rails app or in an independent service?
And how can I schedule this job?
There are a couple more options I'd advise you to try:
1. Cron: one of the most popular approaches for any Unix developer to run a task on a fixed interval.
FYI: if you have trouble understanding cron settings, there is a gem available that handles them for you, called whenever.
2. Resque-Scheduler: you may have missed the Resque plugin that provides exactly the feature you need, called resque-scheduler. It too provides cron-like settings for you to work with; see the sketch after this answer.
Please check the links above for more info.
Hope this helps.
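As a minimal sketch of the resque-scheduler route for the hourly email check, assuming the gem is installed and Resque workers are running; the job, mailer, and column names below are illustrative, not from the question:

# app/jobs/email_alert_job.rb -- a plain Resque job
class EmailAlertJob
  @queue = :mailers

  def self.perform
    # Check the table contents and email each user who needs an alert.
    User.where(needs_alert: true).find_each do |user|
      AlertMailer.alert(user).deliver
    end
  end
end

The job is then triggered by a cron-style entry in config/resque_schedule.yml:

email_alert_job:
  cron: "0 * * * *"   # top of every hour
  class: EmailAlertJob
  queue: mailers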
I ended up not using Resque, because I wanted a process on the Ubuntu server that runs on a schedule (hourly): for example, every hour it checks the table contents and sends an alert email to the users.
I built the process as a daemon and used rufus-scheduler for the scheduling:
require 'rufus-scheduler'

class TaskTest
  def task
    scheduler = Rufus::Scheduler.new

    # Every hour: check the table and send the alert email.
    scheduler.every '1h' do
      msg = "Message"
      mailer = MailerProcess.new
      mailer.send_mail('email-address', 'password', 'to-email', 'Subject', msg)
      puts Time.now
    end

    scheduler.join # keep the process alive so the scheduler keeps firing
  end
end

# Detach from the terminal, then run the scheduler in a forked child.
Process.daemon(true)

task_test = TaskTest.new
pid = Process.fork do
  task_test.task
end