Consuming azure event hub events in AWS - amazon-web-services

I'm trying to consume events that are published to an Azure Event Hub hosted by a third party; the service that consumes these events is part of an AWS-based architecture.
What is the best way to consume Event Hub messages from AWS services?
Would my only options be:
Set up an EC2 instance running an application that continuously processes the Event Hub messages
Have a scheduled Lambda that processes messages, picking up where the last invocation left off
Both options would follow this guide:
https://learn.microsoft.com/en-us/azure/event-hubs/event-hubs-node-get-started-send
Is there any other way to accomplish this? The goal would be to somehow have Event Hubs trigger a service like Lambda directly.
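The scheduled-Lambda option above could be sketched roughly like this in Python, assuming the azure-eventhub SDK and a DynamoDB table for checkpoints. Every name here (the `CHECKPOINT_TABLE` table, the environment variables, the single-checkpoint simplification) is a hypothetical illustration, not part of any official example:

```python
import os

# Rough sketch of the scheduled-Lambda option. All names and env vars
# here are hypothetical; adapt to your own checkpoint storage.

def next_starting_position(checkpoint):
    """Map a stored offset to azure-eventhub's starting_position.
    "-1" means "beginning of the partition" when no checkpoint exists."""
    return checkpoint if checkpoint else "-1"

def handler(event, context):
    # SDK imports live inside the handler so the module imports cleanly
    # even where boto3 / azure-eventhub are not installed.
    import boto3
    from azure.eventhub import EventHubConsumerClient

    table = boto3.resource("dynamodb").Table(os.environ["CHECKPOINT_TABLE"])
    item = table.get_item(Key={"pk": "eventhub"}).get("Item") or {}
    state = {"offset": None}

    client = EventHubConsumerClient.from_connection_string(
        os.environ["EVENTHUB_CONN_STR"],
        consumer_group="$Default",
        eventhub_name=os.environ["EVENTHUB_NAME"],
    )

    def on_event(partition_context, evt):
        if evt is None:      # max_wait_time elapsed: the stream is quiet
            client.close()   # lets receive() return so the Lambda can exit
            return
        # ... your processing of evt.body_as_str() goes here ...
        state["offset"] = evt.offset

    with client:
        client.receive(
            on_event=on_event,
            starting_position=next_starting_position(item.get("offset")),
            max_wait_time=10,
        )

    if state["offset"] is not None:
        table.put_item(Item={"pk": "eventhub", "offset": str(state["offset"])})
```

Note that a timer-driven consumer like this trades latency for simplicity; for near-real-time delivery, a long-running consumer on EC2/ECS that forwards each event into Lambda, SQS, or EventBridge is the more direct route.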

Related

What is the low latency event sourcing service in AWS?

I am using EventBridge as the event bus in our application. Based on its docs (https://aws.amazon.com/eventbridge/faqs/), the latency between sending and receiving an event is about half a second, which is unacceptable in my application.
I am thinking about alternatives. Kinesis has a problem with filtering events: once a consumer attaches to a stream, it has to supply its own logic to filter out uninteresting events. Since I am using Lambda as the consumer, many uninteresting events would trigger my Lambda, which would lead to a high AWS bill.
AWS SNS can only target AWS services.
Another option is Kafka, but I can't find latency figures for the AWS managed Kafka service.
What is the lowest-latency event sourcing solution on AWS?
Kinesis is probably the best way to go now, thanks to the newly released "event filtering" feature. This lets you configure an event source mapping that filters Kinesis (or SQS, DynamoDB Streams) events.
Doing this means you can use Kinesis as an event bus without having to invoke a Lambda for every event.
See: https://aws.amazon.com/about-aws/whats-new/2021/11/aws-lambda-event-filtering-amazon-sqs-dynamodb-kinesis-sources/
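The filter itself is a JSON pattern attached to the event source mapping; for Kinesis sources, patterns under the data key are matched against the deserialized record payload. A hypothetical filter that only invokes the function for order_created events (the field and value names are illustrative):

```json
{
  "Filters": [
    { "Pattern": "{ \"data\": { \"type\": [ \"order_created\" ] } }" }
  ]
}
```

Records that don't match any filter are discarded before the Lambda is invoked, so you are not billed for invocations on uninteresting events.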

Using Apache Kafka in place of SQS

I have an application that uses AWS SQS with Lambda to process messages pushed onto the queue. The Lambda keeps polling the queue, and when a new message appears it processes it.
For this scenario, is it possible to replace SQS with Kafka on AWS? In other words, can we use Kafka as a queue for this use case?
You absolutely can. Have a look at Amazon Managed Streaming for Apache Kafka (Amazon MSK). It's a managed service for Apache Kafka.
As for Lambda triggers, unfortunately it's not a built-in trigger. You can replicate the behaviour with a periodically triggered Lambda function that checks whether messages are available and then either invokes the function that will process them or processes them directly. For some direction, you can refer to this official guide, which sets up a similar pipeline, but for Amazon MQ.
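A sketch of what that periodic polling function might look like, assuming the kafka-python package and hypothetical broker/topic names; `consumer_timeout_ms` makes the iteration stop once the topic goes quiet so the Lambda can finish:

```python
import json

def decode_record(raw_value: bytes):
    """Kafka values arrive as raw bytes; this sketch assumes JSON payloads."""
    return json.loads(raw_value.decode("utf-8"))

def handler(event, context):
    # Import inside the handler so the module loads without kafka-python.
    from kafka import KafkaConsumer  # pip install kafka-python

    consumer = KafkaConsumer(
        "my-topic",                         # hypothetical topic name
        bootstrap_servers=["broker:9092"],  # your MSK bootstrap brokers
        group_id="lambda-poller",
        enable_auto_commit=True,            # offsets tracked by the group
        consumer_timeout_ms=5000,           # stop iterating after 5s of silence
    )
    for record in consumer:
        message = decode_record(record.value)
        # ... process the message, or invoke another Lambda with it ...
    consumer.close()
```

The consumer group's committed offsets give you the "pick up where you left off" behaviour between scheduled invocations.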

Is there a scheduling service in AWS?

I need something like SQS that would call my API on a schedule. I also need to be able to create new "schedules".
AWS CloudWatch Events is what you are looking for:
Events—An event indicates a change in your AWS environment. AWS resources can generate events when their state changes. For example, Amazon EC2 generates an event when the state of an EC2 instance changes from pending to running, and Amazon EC2 Auto Scaling generates events when it launches or terminates instances. AWS CloudTrail publishes events when you make API calls. You can generate custom application-level events and publish them to CloudWatch Events. You can also set up scheduled events that are generated on a periodic basis. For a list of services that generate events, and sample events from each service, see CloudWatch Events Event Examples From Supported Services.
https://docs.aws.amazon.com/AmazonCloudWatch/latest/events/WhatIsCloudWatchEvents.html
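As a config sketch, a scheduled rule that fires every five minutes and targets a (hypothetical) Lambda function could be created like this; the rule name, function name, region, and account ID are placeholders:

```shell
# Hypothetical names throughout; rate(...) and cron(...) expressions
# are both supported as schedule expressions.
aws events put-rule --name call-my-api --schedule-expression "rate(5 minutes)"
aws events put-targets --rule call-my-api \
  --targets "Id"="1","Arn"="arn:aws:lambda:us-east-1:123456789012:function:call-my-api"
```

The target Lambda also needs a resource-based policy allowing events.amazonaws.com to invoke it (via aws lambda add-permission).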

AWS gives us Amazon MQ but how can I trigger a Lambda?

Superb news about the Amazon MQ service, but now the question arises of how I can trigger a Lambda function (Node.js) when a message arrives on a specific queue.
I was thinking I could somehow have an SNS topic published to when a message is PUT, or some other trigger that can fire a Lambda to consume the message from the queue...
Any suggestions?
There isn't a native way to do this. Amazon's managed ActiveMQ service is simply a managed deployment of ActiveMQ running in EC2. It has no integration with other services.
You'd need to write a queue consumer and have it running on a server and listening to the queue on ActiveMQ and publishing those messages to SNS or invoking the Lambda function directly via the Lambda API, etc.
(At least for now.)
AWS recently announced
https://aws.amazon.com/about-aws/whats-new/2020/11/aws-lambda-supports-amazon-mq-apache-activemq-event-source/
We can now add Amazon MQ as a trigger in Lambda, then configure:
Broker URL
Batch size
Queue name
Here is one approach that AWS describes - https://aws.amazon.com/blogs/compute/invoking-aws-lambda-from-amazon-mq/
Basically, have a CloudWatch trigger for the Lambda, which starts polling for MQ messages and processes them.
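With the native trigger, the Lambda receives batches of messages whose bodies are base64-encoded. A minimal handler sketch, assuming the event shape of the ActiveMQ integration (a "messages" list with a "data" field per message):

```python
import base64

def decode_body(message: dict) -> str:
    """Amazon MQ (ActiveMQ) delivers each message body base64-encoded
    in the record's "data" field."""
    return base64.b64decode(message["data"]).decode("utf-8")

def handler(event, context):
    bodies = [decode_body(m) for m in event.get("messages", [])]
    for body in bodies:
        print(body)  # replace with your real processing
    return {"processed": len(bodies)}
```

Because the trigger delivers batches, a partial failure fails the whole batch, so keep the per-message processing idempotent.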

AWS IoT: How to use an application service on EC2?

I'd like to use AWS IoT to manage a grid of devices. Data from each device must be sent to a queue service (RabbitMQ) hosted on an EC2 instance, which is the starting point for a real-time control application. I read how to make a rule to write data to another service: Here
However, there isn't an example for EC2. Using the AWS IoT service, how can I connect to a service on EC2?
Edit:
I have a real-time application developed with Storm that consumes data from RabbitMQ and puts the results of its computation into another RabbitMQ queue. RabbitMQ and Storm are on EC2. I have devices producing data and connected to IoT. Data produced by the devices must be redirected to the queue on EC2 that is the starting point of my application.
I'm sorry if I was not clear.
AWS IoT supports pushing data directly to other AWS services. As you have probably figured out by now, publishing to third-party APIs isn't directly supported.
Of the choices AWS offers, Lambda, SQS, SNS, and Kinesis would probably work best for you.
With Lambda you could directly forward the incoming message using one of RabbitMQ's client libraries.
With SQS you would put it into an AWS queue first and then poll that queue, transferring the messages to RabbitMQ.
Kinesis would allow more sophisticated processing, but is probably too complex here.
I suggest you write a Lambda in the programming language of your choice, using one of the numerous RabbitMQ client libraries.
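For the Lambda route, a minimal sketch using the pika client; the host, queue name, and the assumption that the IoT rule delivers the device payload as the Lambda event are all illustrative:

```python
import json

def to_amqp_body(iot_event: dict) -> bytes:
    """An IoT rule action delivers the (JSON) device payload as the
    Lambda event; serialize it back to bytes for RabbitMQ."""
    return json.dumps(iot_event).encode("utf-8")

def handler(event, context):
    # Import inside the handler so the module loads without pika installed.
    import pika  # pip install pika

    # Hypothetical host and queue; the EC2 instance's security group must
    # allow the Lambda to reach the AMQP port (5672).
    connection = pika.BlockingConnection(pika.ConnectionParameters(host="ec2-host"))
    channel = connection.channel()
    channel.queue_declare(queue="iot-ingest", durable=True)
    channel.basic_publish(exchange="", routing_key="iot-ingest",
                          body=to_amqp_body(event))
    connection.close()
```

Opening a fresh AMQP connection per invocation is the simplest correct approach for a sketch like this; at higher message rates you would move the connection outside the handler to reuse it across warm invocations.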