Using EventHub RBAC feature to publish events - azure-eventhub

I am writing a publisher service that wants to publish events to a client event hub. Is it correct that by using RBAC, I can achieve this without having to worry about SAS keys? If so, is there any sample code I can look at?
I have been looking at the github samples at https://github.com/Azure/azure-event-hubs/tree/master/samples/DotNet/Rbac/EventHubsSenderReceiverRbac/ but they seem to be using individual user accounts for RBAC.
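For what it's worth, publishing with RBAC generally means authenticating with Azure AD via a service principal or managed identity instead of a SAS key. Here is a minimal sketch in Java (azure-messaging-eventhubs plus azure-identity; the namespace and hub names are placeholders, not from any official sample):

    import com.azure.identity.DefaultAzureCredentialBuilder;
    import com.azure.messaging.eventhubs.EventData;
    import com.azure.messaging.eventhubs.EventDataBatch;
    import com.azure.messaging.eventhubs.EventHubClientBuilder;
    import com.azure.messaging.eventhubs.EventHubProducerClient;

    public class RbacPublisher {
        public static void main(String[] args) {
            // DefaultAzureCredential resolves a service principal (from the
            // AZURE_CLIENT_ID / AZURE_TENANT_ID / AZURE_CLIENT_SECRET
            // environment variables) or a managed identity, so no SAS key
            // ever appears in the code.
            EventHubProducerClient producer = new EventHubClientBuilder()
                .credential("<namespace>.servicebus.windows.net", "<event-hub-name>",
                    new DefaultAzureCredentialBuilder().build())
                .buildProducerClient();

            EventDataBatch batch = producer.createBatch();
            batch.tryAdd(new EventData("hello"));
            producer.send(batch);
            producer.close();
        }
    }

The identity used (service principal or managed identity) needs an RBAC role such as "Azure Event Hubs Data Sender" assigned on the namespace or hub for the send to succeed.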


Is there any equivalent feature to BPMN UserTask available in AWS Step Functions?

We have an old Camunda-based Spring Boot application, which we currently deploy to Kubernetes running on an AWS EC2 instance. This application acts as a backend for an Angular-based UI application.
Now we need to develop a new application similar to the above, which also needs to interact with a UI.
Our process flow will contain some UserTasks (BPMN) which wait until a human user performs a manual interaction via the Angular UI.
We are evaluating the possibility of using AWS Step Functions instead of Camunda, if possible.
I googled but was unable to find a concrete answer.
Does AWS Step Functions have any feature similar to BPMN/Camunda's UserTask?
Short answer: No.
====================================
Long answer:
After a whole day of study, I decided to continue with Camunda BPM for the reasons below.
1. AWS Step Functions has no equivalent of BPMN's UserTask. It supports minimal human intervention by sending emails/messages using AWS SQS (Simple Queue Service) and AWS SNS (Simple Notification Service); refer to this link for a full example. This manual interaction is based on a 'task token', so it is limited to a basic conversational style (see the sketch after this list).
2. Step Functions does not come with built-in database and data-management support; the developer has to design the database schema, create tables, define their relationships, and so on. Camunda, on the other hand, takes care of creating tables and their relationships and of saving and fetching data.
3. No GUI modeler is available for Step Functions; instead you describe the workflow in a JSON-based language, which becomes very difficult once your workflow grows complex. Drawing a workflow in Camunda is just drag-and-drop using its Modeler.
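To make the task-token interaction concrete, here is a minimal sketch using the AWS SDK for Java v2 (the class name, token delivery path, and output JSON are illustrative assumptions): while the state machine waits at a step invoked with .waitForTaskToken, some out-of-band process hands the token to a human-facing app, which then reports the decision back.

    import software.amazon.awssdk.services.sfn.SfnClient;
    import software.amazon.awssdk.services.sfn.model.SendTaskSuccessRequest;

    public class CompleteHumanTask {
        public static void main(String[] args) {
            // The task token arrives out-of-band (e.g. inside an SQS message
            // or an email link) while the state machine is paused.
            String taskToken = args[0];

            try (SfnClient sfn = SfnClient.create()) {
                sfn.sendTaskSuccess(SendTaskSuccessRequest.builder()
                    .taskToken(taskToken)
                    .output("{\"approved\": true}") // the human's decision
                    .build());
            }
        }
    }

This is workable for simple approvals, but as noted above it is nowhere near the form-based task lifecycle that Camunda's UserTask provides.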

Java API to read the Azure Event Hub partition list using EventProcessorClient (latest SDK)

I need to get the list of partitions for an Event Hub. I am trying to use EventProcessorClient from the latest SDK, but it does not seem to have a getRuntimeInformation method.
Is there any way I can get the list of partitions for an Event Hub using any API available to an EventProcessorClient?
The processor is intended to manage partitions on your behalf, without any explicit action on your part, so it does not expose any means to inspect the Event Hub or its properties.
To inspect your Event Hub and its partitions, you'll want to use the EventHubClientBuilder to create either a producer or consumer client to do so. I'd recommend taking a peek at GetEventHubMetadata.java from the client library samples, which demonstrates doing so.
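In the same spirit as that sample, here is a minimal sketch that lists partitions with a producer client (the connection string and hub name are placeholders):

    import com.azure.messaging.eventhubs.EventHubClientBuilder;
    import com.azure.messaging.eventhubs.EventHubProducerClient;

    public class ListPartitions {
        public static void main(String[] args) {
            EventHubProducerClient client = new EventHubClientBuilder()
                .connectionString("<connection-string>", "<event-hub-name>")
                .buildProducerClient();

            // getPartitionIds() queries the service for the hub's partitions.
            for (String partitionId : client.getPartitionIds()) {
                System.out.println("Partition: " + partitionId);
            }
            client.close();
        }
    }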

How do I fetch a log of user update changes in Amazon Cognito?

I have been working on getting some information on changes made against a certain Amazon Cognito user.
However, I'm only able to fetch details specific to the authentication flow. Now I want to fetch user update details as well: for example, whether a user was disabled and then enabled again, or whether a user updated their phone number. As an admin, I want to see what was changed, and at what time, for auditing purposes. I'm not able to find a way to achieve this.
Could someone please help me out with this?
I believe you can achieve this using Cognito Streams with Amazon Kinesis: once a data change happens, Cognito Streams publishes the change to a Kinesis stream, and you can analyse the data from there.
More info: https://docs.aws.amazon.com/cognito/latest/developerguide/cognito-streams.html
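As a rough illustration, reading those change records off the Kinesis stream with the AWS SDK for Java v2 could look like this (the stream name is a placeholder, and a real consumer would enumerate shards and checkpoint its position rather than read one shard once):

    import software.amazon.awssdk.services.kinesis.KinesisClient;
    import software.amazon.awssdk.services.kinesis.model.GetRecordsRequest;
    import software.amazon.awssdk.services.kinesis.model.GetRecordsResponse;
    import software.amazon.awssdk.services.kinesis.model.GetShardIteratorRequest;
    import software.amazon.awssdk.services.kinesis.model.Record;
    import software.amazon.awssdk.services.kinesis.model.ShardIteratorType;

    public class ReadCognitoChanges {
        public static void main(String[] args) {
            KinesisClient kinesis = KinesisClient.create();

            // TRIM_HORIZON starts from the oldest record in the shard.
            String iterator = kinesis.getShardIterator(GetShardIteratorRequest.builder()
                    .streamName("my-cognito-stream") // placeholder
                    .shardId("shardId-000000000000")
                    .shardIteratorType(ShardIteratorType.TRIM_HORIZON)
                    .build())
                .shardIterator();

            GetRecordsResponse response = kinesis.getRecords(
                GetRecordsRequest.builder().shardIterator(iterator).build());

            for (Record record : response.records()) {
                // Each record's payload is a JSON document describing the change.
                System.out.println(record.data().asUtf8String());
            }
            kinesis.close();
        }
    }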

How to write Kafka connector to integrate with Facebook API?

I am trying to write a Kafka connector to fetch data from Facebook. The problems are:
How do I fetch data from Facebook through their API without exceeding the API rate limit? The connector should call the Facebook API at a specific interval so that the number of hits stays within the limit.
Each user hits the Facebook API with their own access token, so users can't share the same topic partition. How should this scenario be handled? Do we have to create one partition for each user?
I read a few guides and blogs to understand Kafka connect and write a connector.
Confluent- https://docs.confluent.io/current/connect/index.html
Kafka Documentation- https://kafka.apache.org/documentation/#connect
Conceptually, they gave me an idea of what Kafka Connect is, how it works, and what the important classes for writing a connector are. But I am still confused about how to practically write and run a connector. I tried to find a step-by-step development guide but couldn't.
Could you suggest any tutorial or PDF with a detailed, step-by-step guide to writing and running a Kafka connector?
The only "official guide" is in those links you have
https://docs.confluent.io/current/connect/devguide.html#developing-a-simple-connector
I personally have no experience with the Facebook API, but I assume it uses REST, so you could start by forking the kafka-connect-rest project. The simplest answer to not exceeding the limit is to not send more requests than you are allowed within a given time period: add a timer to the code that waits between requests (see the sketch below).
Also, one connector would only have one set of access keys. How you create the ConnectRecord objects to ultimately partition the records is up to you, but I don't think having an access key per user will scale very well. It might make more sense to have one key tied to one application; each user then accepts that the application has access to read certain details from their account.
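To illustrate the timer idea, here is a minimal, hypothetical SourceTask (the class name, config key, topic, and fetchFromFacebook stub are all assumptions, not part of any real connector):

    import org.apache.kafka.connect.data.Schema;
    import org.apache.kafka.connect.source.SourceRecord;
    import org.apache.kafka.connect.source.SourceTask;

    import java.util.Collections;
    import java.util.List;
    import java.util.Map;

    public class FacebookSourceTask extends SourceTask {
        private long pollIntervalMs;

        @Override
        public void start(Map<String, String> props) {
            // Hypothetical config key; tune it to stay under the API rate limit.
            pollIntervalMs = Long.parseLong(
                props.getOrDefault("poll.interval.ms", "60000"));
        }

        @Override
        public List<SourceRecord> poll() throws InterruptedException {
            // The framework calls poll() again as soon as it returns, so
            // sleeping here throttles the upstream API calls.
            Thread.sleep(pollIntervalMs);

            String payload = fetchFromFacebook(); // stub for the real API call
            Map<String, ?> sourcePartition = Collections.singletonMap("source", "facebook");
            Map<String, ?> sourceOffset = Collections.singletonMap("ts", System.currentTimeMillis());
            return Collections.singletonList(new SourceRecord(
                sourcePartition, sourceOffset, "facebook-topic",
                Schema.STRING_SCHEMA, payload));
        }

        private String fetchFromFacebook() {
            return "{}"; // a real task would call the Graph API here
        }

        @Override
        public void stop() {}

        @Override
        public String version() {
            return "0.1";
        }
    }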

AWS DynamoDB Stream example for App?

I used Firebase to create a chat app, but I plan to move the backend from Google to AWS. On the AWS website, I found that DynamoDB supports a Streams feature that looks very similar to Firebase:
http://docs.aws.amazon.com/amazondynamodb/latest/developerguide/Streams.html
If someone adds a new message, Firebase and DynamoDB will notify clients of the change. That is the basic feature of a chat app, so DynamoDB Streams looks like it should be an alternative to Firebase.
But I notice that documentation and examples for DynamoDB Streams are very rare, and strangest of all, the tutorial has been removed:
https://aws.amazon.com/getting-started/projects/build-mobile-messaging-app-ios/?nc1=h_ls
Clicking Get Started redirects to Mobile Hub.
I wonder whether DynamoDB Streams will stop being supported in the future and whether this service will be replaced by Mobile Hub.
I have used another Mobile Hub feature in a different app, but I could not figure out how to use Mobile Hub to build a chat function.
If I want to use AWS to build a chat app, what is the best solution? Is there any useful example or tutorial for DynamoDB Streams? Thanks very much.
This is how I would solve it. These pieces are not wired up for you yet; here is how you can do it:
DynamoDB (Streams) --> Lambda --> SNS
Subscribe to the stream: changes will be delivered to your Lambda, where you can customize what the message should look like and send the notification with SNS. A minimal sketch of the Lambda follows.
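Assuming the table stores chat messages with a string attribute called message, the stream is configured to include new images, and clients subscribe to an SNS topic (all assumptions), the Lambda handler in Java could look roughly like this:

    import com.amazonaws.services.lambda.runtime.Context;
    import com.amazonaws.services.lambda.runtime.RequestHandler;
    import com.amazonaws.services.lambda.runtime.events.DynamodbEvent;
    import software.amazon.awssdk.services.sns.SnsClient;
    import software.amazon.awssdk.services.sns.model.PublishRequest;

    public class NewMessageNotifier implements RequestHandler<DynamodbEvent, Void> {
        private final SnsClient sns = SnsClient.create();
        // Hypothetical topic ARN; chat clients subscribe to this topic.
        private static final String TOPIC_ARN =
            "arn:aws:sns:us-east-1:123456789012:chat-messages";

        @Override
        public Void handleRequest(DynamodbEvent event, Context context) {
            for (DynamodbEvent.DynamodbStreamRecord record : event.getRecords()) {
                // Only react to newly inserted items (new chat messages).
                if (!"INSERT".equals(record.getEventName())) {
                    continue;
                }
                // Assumes each item carries a string attribute named "message".
                String text = record.getDynamodb().getNewImage().get("message").getS();
                sns.publish(PublishRequest.builder()
                    .topicArn(TOPIC_ARN)
                    .message(text)
                    .build());
            }
            return null;
        }
    }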
Hope it helps.