How to create an Azure Service Bus (topic) subscription and list existing subscriptions via code?

A third party supplies data via Azure Service Bus, but the namespace lives in their Azure subscription, so we can't create topic subscriptions via the portal.
They have given us a connection string and a list of topic names.
We need to list existing topic subscriptions and create new ones via code, from either a console app or a function app. Unfortunately, our devs are all Java and JavaScript, so we are new to C#.
We have found tutorials on sending and receiving messages, but none on creating a subscription given a connection string and topic name, and none on listing all existing subscriptions (or even deleting old ones).
Basically, we need to manage an Azure Service Bus namespace without access to it in the Azure portal.

The SDK you want to use is Azure.Messaging.ServiceBus. It is the latest SDK for managing Service Bus at the time of writing this answer and works with .NET Core.
The namespace for managing entities (queues, topics, subscriptions, etc.) is Azure.Messaging.ServiceBus.Administration.
To list topics in the namespace, first create an instance of ServiceBusAdministrationClient and then call its GetTopicsAsync method; the same client exposes GetSubscriptionsAsync to list the subscriptions on a given topic.
Similarly, CreateTopicAsync and CreateSubscriptionAsync create entities, and DeleteSubscriptionAsync removes subscriptions you no longer need. Note that these management operations require a connection string with Manage rights.
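Since your devs are Java developers, here is a minimal sketch of the same operations using the equivalent Java SDK (azure-messaging-servicebus) and its ServiceBusAdministrationClient. The topic and subscription names are placeholders:

```java
import com.azure.messaging.servicebus.administration.ServiceBusAdministrationClient;
import com.azure.messaging.servicebus.administration.ServiceBusAdministrationClientBuilder;
import com.azure.messaging.servicebus.administration.models.SubscriptionProperties;

public class ManageSubscriptions {
    public static void main(String[] args) {
        // Connection string supplied by the third party; must include Manage rights.
        String connectionString = System.getenv("SERVICEBUS_CONNECTION_STRING");
        String topicName = "example-topic"; // one of the topic names they gave you

        ServiceBusAdministrationClient adminClient = new ServiceBusAdministrationClientBuilder()
                .connectionString(connectionString)
                .buildClient();

        // List the subscriptions that already exist on the topic.
        for (SubscriptionProperties sub : adminClient.listSubscriptions(topicName)) {
            System.out.println("Existing subscription: " + sub.getSubscriptionName());
        }

        // Create a new subscription on the topic.
        adminClient.createSubscription(topicName, "my-new-subscription");

        // Delete an old subscription that is no longer needed.
        adminClient.deleteSubscription(topicName, "stale-subscription");
    }
}
```

If the connection string the third party gave you only carries Send or Listen claims, these management calls will fail with an authorization error, so check which rights the shared access policy grants.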

Related

When to use Firestore vs Pub/Sub

Can you elaborate on the differences between Pub/Sub and Firestore and provide some scenarios or use cases for choosing one over the other?
I'm not sure which to use for building a food delivery app such as UberEats, which serves real-time updates reflected as soon as they are added to or changed in the database, ensuring that customers and drivers know when food is ready for pickup and when it is in transit to its destination.
The difference is quite simple:
Firestore (like the Realtime Database) is for backend-to-frontend (customer/user) communication and realtime updates.
Pub/Sub is a backend-to-backend message bus for async processing.
In your use case, you won't use Pub/Sub to send notifications to your users; use the realtime database features to deliver these updates.
Pub/Sub is like a notification system wherein you receive updates when something is added, changed or removed.
Firestore, on the other hand, is a NoSQL database for mobile (Android, iOS) and web apps that can be accessed directly via native SDKs. It supports many data types, from simple strings to complex objects, as well as whatever data structure works best for your app.
It is best to use Firestore for your app, as it provides realtime updates.
You can check the detailed documentation for Pub/Sub and Firestore; the Firestore documentation also covers its benefits and key features.
For Firestore, you can use either the mobile/web client libraries or the server client libraries.
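To make the recommendation concrete, here is a minimal sketch of a realtime listener using the Firestore server client library for Java; the collection, document ID, and status field are hypothetical:

```java
import com.google.cloud.firestore.DocumentSnapshot;
import com.google.cloud.firestore.EventListener;
import com.google.cloud.firestore.Firestore;
import com.google.cloud.firestore.FirestoreException;
import com.google.cloud.firestore.FirestoreOptions;

public class OrderStatusListener {
    public static void main(String[] args) throws InterruptedException {
        Firestore db = FirestoreOptions.getDefaultInstance().getService();

        // Watch a single order document; the listener fires on every write.
        db.collection("orders").document("order-123")
            .addSnapshotListener(new EventListener<DocumentSnapshot>() {
                @Override
                public void onEvent(DocumentSnapshot snapshot, FirestoreException error) {
                    if (error != null) {
                        System.err.println("Listen failed: " + error);
                        return;
                    }
                    if (snapshot != null && snapshot.exists()) {
                        // e.g. "READY_FOR_PICKUP" -> "IN_TRANSIT" -> "DELIVERED"
                        System.out.println("Order status: " + snapshot.getString("status"));
                    }
                }
            });

        Thread.sleep(60_000); // keep the process alive so the listener can fire
    }
}
```

In the delivery app itself you would use the Android/iOS/web client SDKs, which expose the same snapshot-listener pattern, so customers and drivers see status changes as soon as the backend writes them.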

Is Pub/Sub suitable for use by client desktop applications?

For a client desktop application, I'm trying to find a reliable way to notify client applications of new data that needs to be queried from the server. Would Pub/Sub be a good fit for this? Most of the documentation I see for it is focused on server-to-server communication, and it's a bit ambiguous whether it would work well for server-to-client notifications.
If it would work, could I properly authenticate subscribers to limit the topics they can subscribe to? This application would potentially be downloadable by anyone, and I would need to ensure that information intended for one client can't end up in the hands of another.
Cloud Pub/Sub is not going to be a good choice for this use case. First, each topic and project is limited to 10,000 subscriptions, so if you intend to have more subscribers than that, you will run out. Second, a subscription only receives messages published after the subscription is created; if you only need messages that were published after the user connected, this may be okay. With these two issues combined, you'll need to consider the lifetime of your subscriptions: do they get deleted when a user logs out? If not, when a user comes back, do you expect them to get all of the messages published since their last visit?
Additionally, as discussed in the comments, there is the issue of authentication. Your client-side app would have to have the credentials to subscribe. This would require you to essentially leak those credentials into your client-side code, which could be a vulnerability in your application.
The service designed to deliver notifications of this nature is Firebase Cloud Messaging.
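For illustration, a minimal sketch of that approach with the Firebase Admin SDK for Java, sending a data-only message that tells one specific client to re-query the server (the registration token is a placeholder the client app would report after registering):

```java
import com.google.auth.oauth2.GoogleCredentials;
import com.google.firebase.FirebaseApp;
import com.google.firebase.FirebaseOptions;
import com.google.firebase.messaging.FirebaseMessaging;
import com.google.firebase.messaging.Message;

public class NotifyClient {
    public static void main(String[] args) throws Exception {
        // Server-side credentials stay on the server; clients only hold device tokens.
        FirebaseOptions options = FirebaseOptions.builder()
                .setCredentials(GoogleCredentials.getApplicationDefault())
                .build();
        FirebaseApp.initializeApp(options);

        // Device registration token obtained from the client app (placeholder).
        String registrationToken = "DEVICE_REGISTRATION_TOKEN";

        // Data-only message: the client reacts by querying the server for new data.
        Message message = Message.builder()
                .putData("event", "new-data-available")
                .setToken(registrationToken)
                .build();

        String messageId = FirebaseMessaging.getInstance().send(message);
        System.out.println("Sent message: " + messageId);
    }
}
```

Note how this sidesteps the credential concern above: the service account key never ships with the desktop app, and each message is addressed to a single device token.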
If you want to open the application to anyone on the internet, you can't rely on the IAM service, which only works with Google identities: you can't require every user to have a Google account, as the user experience would be bad.
Thus, you can't use IAM to secure Pub/Sub access, and therefore you can't use Pub/Sub, because anyone could access it.
In your use case, the first step is to ask the user to register (create an account, validate an email address, perhaps add a payment method, and so on). Then you have an identity, but one managed by you, not by IAM, and you know which messages are for each user and which aren't.
If you want to notify users "in real time", I propose using long polling or streaming to push data to the user. Cloud Run is now capable of this, and I recommend you have a look at it.
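As an illustration of the streaming idea, here is a minimal Server-Sent Events sketch using the JDK's built-in HTTP server; the endpoint path and payloads are made up, and a real service (e.g. on Cloud Run) would emit events when the backend has new data for the authenticated user rather than on a timer:

```java
import com.sun.net.httpserver.HttpExchange;
import com.sun.net.httpserver.HttpServer;

import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

public class SseServer {
    public static void main(String[] args) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
        server.createContext("/events", SseServer::handleEvents);
        server.start();
    }

    private static void handleEvents(HttpExchange exchange) throws IOException {
        // Server-Sent Events: one long-lived response the client keeps open.
        exchange.getResponseHeaders().add("Content-Type", "text/event-stream");
        exchange.sendResponseHeaders(200, 0); // length 0 = streamed/chunked body
        try (OutputStream out = exchange.getResponseBody()) {
            for (int i = 0; i < 5; i++) {
                out.write(("data: update " + i + "\n\n").getBytes(StandardCharsets.UTF_8));
                out.flush();
                try { Thread.sleep(1000); } catch (InterruptedException ignored) {}
            }
        }
    }
}
```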

Java API to read the Azure Event Hub partition list using the EventProcessorClient (latest SDK)

I need to get the list of partitions for an Event Hub. I am trying to use EventProcessorClient from the latest SDK, but it does not seem to have a getRuntimeInformation method.
Is there any way to get the list of partitions for an Event Hub using any API available alongside the EventProcessorClient?
The processor is intended to manage partitions on your behalf, without any explicit action on your part, so it does not expose any means to inspect the Event Hub or its properties.
To inspect your Event Hub and its partitions, you'll want to use the EventHubClientBuilder to create either a producer or a consumer client. I'd recommend taking a peek at GetEventHubMetadata.java from the client library samples, which demonstrates doing so.
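A minimal sketch along the lines of that sample, using a producer client from the azure-messaging-eventhubs library to read the partition IDs (the connection string and hub name are placeholders):

```java
import com.azure.messaging.eventhubs.EventHubClientBuilder;
import com.azure.messaging.eventhubs.EventHubProducerClient;

public class ListPartitions {
    public static void main(String[] args) {
        String connectionString = System.getenv("EVENTHUB_CONNECTION_STRING");
        String eventHubName = "example-hub";

        // A lightweight producer client is enough to inspect Event Hub metadata.
        EventHubProducerClient producer = new EventHubClientBuilder()
                .connectionString(connectionString, eventHubName)
                .buildProducerClient();

        // Print the partition IDs for the Event Hub.
        producer.getPartitionIds().forEach(id -> System.out.println("Partition: " + id));

        producer.close();
    }
}
```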

AWS service for managing state data: DynamoDB, Step Functions, or SQS?

I am building a Desktop-on-Demand solution using the AWS WorkSpaces product, and I am trying to understand which AWS service best fits my requirements for managing state data for new users.
In a nutshell, the solution will create a new AWS WorkSpace (virtual desktop instance) for a user when multiple conditions are met and checks are satisfied. These tasks would be handled by multiple Lambda functions.
DynamoDB would be used as a central point for storing configuration details such as user data, user group data, and deployed virtual desktop data.
The logic for desktop creation would be implemented using Step Functions as follows:
An event hook comes from the identity management system, firing a Lambda function that checks whether the user's desktop already exists in the DynamoDB table
If it does not exist, another Lambda creates an AWS AD Connector
Once this is done, another Lambda builds a custom image for the new desktop if needed
Another Lambda pulls the latest data from the identity management system and updates the DynamoDB table for users and groups
Other Lambda functions may be fired as dependencies
To ensure we have a transactional mechanism, we only deploy a new desktop when all conditions are met. I can think of a few ways of implementing this check:
Use a DynamoDB table to keep the state data. When all attributes on the item are in the expected state, the desktop can be created; if any Lambda fails or produces data that does not fit, don't create the desktop.
Just use Step Functions and design its logic flow so that all conditions must be satisfied before the desktop is created
Someone suggested using an SQS queue, but I don't see how it could serve my purpose.
What is the best way to keep this data?
Step Functions is the method I would use for this. The DynamoDB solution would also work, but this seems like exactly the sort of thing Step Functions was designed to handle.
I agree that SQS would not be a correct solution.
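That said, if you do keep the state flags in DynamoDB (the first option above), a conditional update is the usual way to make the final "all conditions met" check atomic. A minimal sketch with the AWS SDK for Java v2; the table, key, and attribute names are hypothetical:

```java
import software.amazon.awssdk.services.dynamodb.DynamoDbClient;
import software.amazon.awssdk.services.dynamodb.model.AttributeValue;
import software.amazon.awssdk.services.dynamodb.model.ConditionalCheckFailedException;
import software.amazon.awssdk.services.dynamodb.model.UpdateItemRequest;

import java.util.Map;

public class DesktopStateCheck {
    public static void main(String[] args) {
        DynamoDbClient dynamo = DynamoDbClient.create();

        try {
            // Flip the item to DEPLOYING only if every prerequisite flag is already set.
            dynamo.updateItem(UpdateItemRequest.builder()
                    .tableName("UserDesktops") // hypothetical table
                    .key(Map.of("userId", AttributeValue.builder().s("user-123").build()))
                    .updateExpression("SET desktopState = :deploying")
                    .conditionExpression(
                        "adConnectorReady = :true AND imageReady = :true AND idmSynced = :true")
                    .expressionAttributeValues(Map.of(
                        ":deploying", AttributeValue.builder().s("DEPLOYING").build(),
                        ":true", AttributeValue.builder().bool(true).build()))
                    .build());
            System.out.println("All conditions met; desktop deployment started.");
        } catch (ConditionalCheckFailedException e) {
            System.out.println("Conditions not yet met; skipping deployment.");
        } finally {
            dynamo.close();
        }
    }
}
```

Each Lambda in the flow sets its own flag; whichever one finishes last wins the conditional update, so the desktop is deployed exactly once and only when everything is ready.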

How to write a Kafka connector to integrate with the Facebook API?

I am trying to write a Kafka connector to fetch data from Facebook. The problems are:
How do I fetch data from Facebook through their API without exceeding the rate limit Facebook imposes? The connector should call the Facebook API only after a specific time interval so that the number of hits stays within the limit.
Each user hits the Facebook API with their own access token, so users can't share the same topic partition. How do we handle this scenario? Do we have to create one partition per user?
I read a few guides and blogs to understand Kafka Connect and how to write a connector.
Confluent- https://docs.confluent.io/current/connect/index.html
Kafka Documentation- https://kafka.apache.org/documentation/#connect
Conceptually, they gave me an idea of what Kafka Connect is, how it works, and which classes are important for writing a connector. But I am still confused about how to actually write and run one. I tried to find a step-by-step development guide but couldn't.
Can you suggest any tutorial or PDF with a detailed, step-by-step guide to writing and running a Kafka connector?
The only "official guide" is in those links you have
https://docs.confluent.io/current/connect/devguide.html#developing-a-simple-connector
I personally have no experience with the Facebook API, but I assume it uses REST, so you could start by forking the kafka-connect-rest project. The simplest answer to not exceeding the limit is to not send more requests than you are allowed within a given time period: add a timer to the code that waits between requests, as in the sketch below.
Also, one connector would only have one set of access keys. How you create the ConnectRecord objects to ultimately partition the records is up to you, but I don't think having an access key per user will scale very well. It might make more sense to have one key tied to one application; each user then accepts that the application has access to read certain details from their account.
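Putting both points together, here is a minimal sketch of a source task that throttles its poll loop and builds SourceRecords. The config key, topic name, and the Facebook fetch are hypothetical stubs:

```java
import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.source.SourceRecord;
import org.apache.kafka.connect.source.SourceTask;

import java.util.Collections;
import java.util.List;
import java.util.Map;

public class FacebookSourceTask extends SourceTask {
    private long pollIntervalMs;
    private long lastPoll = 0L;

    @Override
    public void start(Map<String, String> props) {
        // e.g. poll.interval.ms=60000 to stay under the API rate limit (hypothetical key)
        pollIntervalMs = Long.parseLong(props.getOrDefault("poll.interval.ms", "60000"));
    }

    @Override
    public List<SourceRecord> poll() throws InterruptedException {
        // Throttle: wait until the configured interval has elapsed since the last call.
        long wait = lastPoll + pollIntervalMs - System.currentTimeMillis();
        if (wait > 0) {
            Thread.sleep(wait);
        }
        lastPoll = System.currentTimeMillis();

        String payload = fetchFromFacebook(); // hypothetical REST call to the Graph API

        SourceRecord record = new SourceRecord(
                Collections.singletonMap("source", "facebook"), // source partition
                Collections.singletonMap("offset", lastPoll),   // source offset
                "facebook-topic", Schema.STRING_SCHEMA, payload);
        return Collections.singletonList(record);
    }

    private String fetchFromFacebook() {
        return "{\"stub\": true}"; // placeholder; the real API call is omitted
    }

    @Override
    public void stop() {}

    @Override
    public String version() {
        return "0.1";
    }
}
```

Blocking inside poll() is allowed in the Connect framework, which is why a simple sleep-based timer works here; a production connector would also track real offsets (e.g. Graph API cursors) in the source offset map so it can resume after a restart.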