What is the role of AWS EventBridge in a Kafka consumer Lambda?

I'm trying to understand the implementation flow while designing the blueprint for one of our use cases. According to existing articles/blogs, AWS Lambda now supports self-hosted Kafka as an event source. Scheduled Lambdas also exist. But does anyone know where EventBridge stands here?
Basically, I want to trigger a Lambda every time there's a new event in the topic it's subscribed to. So should the Lambda act as a consumer that listens to topics? If so, since it's serverless, something has to tell it there's a change. Would CloudWatch be the one to do so?
And if so, does CloudWatch also need to act as a consumer and listen to topics?
Please help me understand. This might sound like an opinion question, but I couldn't find the correct answer anywhere.
P.S. I know there's MSK and Kinesis, but we're advised to use only Lambda, EventBridge, SQS, SNS, S3, etc. The goal is to read the data from the topics and send out emails to recipients.

The Lambda service manages the integration with Kafka itself. You configure how it interacts, but ultimately your function will receive an event just like any other integration, and that event will include the messages from Kafka.
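To make that concrete, here is a minimal handler sketch in Python. It assumes the self-managed Kafka event source, whose documented batch payload carries base64-encoded record values; the topic name and payload below are made up:

```python
import base64

def handler(event, context):
    """Decode records from a (self-managed) Kafka event-source batch."""
    decoded = []
    # event["records"] maps "topic-partition" keys to lists of records
    for _batch_key, records in event.get("records", {}).items():
        for record in records:
            payload = base64.b64decode(record["value"]).decode("utf-8")
            decoded.append((record["topic"], record["offset"], payload))
            # here you would format and send the email (e.g. via SES)
    return decoded

# Sample event shaped like the documented Kafka integration payload:
sample_event = {
    "eventSource": "SelfManagedKafka",
    "records": {
        "mytopic-0": [
            {"topic": "mytopic", "partition": 0, "offset": 15,
             "value": "aGVsbG8="}  # base64 for "hello"
        ]
    },
}
```

No EventBridge or CloudWatch consumer is involved: the Lambda service polls the brokers for you and invokes the function with batches like this.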

Related

How to create a topic in Amazon SQS/SNS

I have a process which publishes some data (JSON) onto a queue on AWS SQS. Another process reads from this queue. All of this is working fine.
Now I want to create a topic which can be listened to by multiple processes, with the data delivered to all of them. For example, ActiveMQ and many other messaging servers have the capability to create a topic. I could not find any such thing on AWS. The closest I could find is AWS SNS.
From what I understand, AWS SNS allows multiple clients to subscribe to a topic, but the subscription types are Email, HTTP, SMS, and so on. This does not really serve my purpose. I want to receive JSON data in all my clients, just like with SQS.
Is that achievable? If so, how?
You can subscribe multiple SQS queues to a single SNS topic: https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-subscribe-queue-sns-topic.html
The message will then be delivered to all of them.
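Wiring this up takes two pieces: the subscription itself, and an access policy on each queue so SNS is allowed to deliver to it. A sketch in Python (the ARNs are hypothetical, and the boto3 call in the comment assumes configured credentials):

```python
import json

def sns_to_sqs_policy(queue_arn: str, topic_arn: str) -> str:
    """Queue access policy allowing the given SNS topic to send messages."""
    return json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"Service": "sns.amazonaws.com"},
            "Action": "sqs:SendMessage",
            "Resource": queue_arn,
            "Condition": {"ArnEquals": {"aws:SourceArn": topic_arn}},
        }],
    })

# The subscription itself would then be (boto3, not run here):
#   sns.subscribe(TopicArn=topic_arn, Protocol="sqs", Endpoint=queue_arn,
#                 Attributes={"RawMessageDelivery": "true"})
# RawMessageDelivery delivers the original JSON body instead of the SNS
# envelope, which matches the "receive JSON just like SQS" requirement.
```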
The other option is to use Kinesis (https://aws.amazon.com/kinesis/), but it is more difficult to set up. There, multiple clients can also read from the stream.
Amazon MQ is a managed ActiveMQ service. Maybe this will help with your needs?

How to use AWS SQS as an event source and invoke different Lambda functions based on event attributes

Guys, I need some help. I have a use case where I want to set up a communication service using SQS. The queue is going to receive different types of events to be communicated. We have a single Lambda function per communication type: say an email Lambda, a Slack Lambda, etc.
How can I invoke a different Lambda based on queue attributes? I was planning to use SQS as an event source, with something like this architecture: link to sample architecture.
Here, we can handle rate limiting and concurrency at the Lambda service level.
Simplified: if the event type is A, invoke Lambda A; if the event type is B, invoke Lambda B, with both event types in the same SQS queue.
All suggestions are welcome.
Your problem is that an SQS message can only be read by one service at a time; while it is being read, it is invisible to everyone else. You can only have one Lambda consumer, and there isn't any partitioning or routing in SQS beyond setting up another queue. Multiple consumers are supported by Kinesis or Amazon MSK (Kafka).
What you are trying to accomplish is called a fan-out, which is a common cloud architecture pattern. What you probably want to do is publish initially to SNS. With SNS you can filter and route to a separate SQS queue for each message type, and each queue is then consumed by its own Lambda.
Check out a tutorial here:
https://docs.aws.amazon.com/sns/latest/dg/sns-common-scenarios.html
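The routing in that fan-out is done with SNS subscription filter policies, which match against message attributes. A simplified local model of that matching in Python (the attribute name `event_type` and the policies are made up for illustration):

```python
def matches_filter(policy: dict, attributes: dict) -> bool:
    """Simplified model of SNS attribute-based filtering: every key in the
    policy must appear in the message attributes with an allowed value."""
    return all(attributes.get(key) in allowed
               for key, allowed in policy.items())

# One filter policy per SQS subscription, e.g.:
email_policy = {"event_type": ["email"]}
slack_policy = {"event_type": ["slack"]}

msg = {"event_type": "email"}
routed_to_email = matches_filter(email_policy, msg)   # True
routed_to_slack = matches_filter(slack_policy, msg)   # False
```

In the real service, the policy is attached to each subscription and SNS evaluates it server-side, so only the matching queue (and its Lambda) ever sees the message.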

AWS SQS: required or not?

I am building a service which will use data coming from another service, so I am thinking of the following pipeline:
Other service ----> SNS topic ----> SQS ----> AWS Lambda ----> DynamoDB
In this flow, the other service pushes data to an SNS topic, with an SQS queue as a subscriber. A Lambda is triggered by this queue, reads the messages, and writes them to DynamoDB. This looks okay, but now I'm wondering whether I really need SQS, or whether I can avoid it and have the Lambda triggered by SNS directly. One case concerns me if I drop SQS: how would a DynamoDB failure be handled? I think with only SNS I would lose messages while DynamoDB is in a failed state, whereas with SQS those messages would be stored in the queue.
Please let me know if my understanding is correct.
Thanks a lot for your help.
I couldn't say as much in the comments, so I'll try here.
The architecture you linked to is pretty common. The two biggest downsides are that you're going to be billed for Lambda usage even when there is nothing to do, and your data may be delayed by the polling interval, which has a minimum of 1 minute. Neither of these may matter for your problem, though.
SQS could be used as a temporary store for data in the event of a DynamoDB failure. But what exactly are you going to do if it fails? What if SQS fails and loses your messages? What if Lambda fails and never runs your code? DynamoDB is a hosted service just like SQS and Lambda - Amazon is going to work very hard to keep it running just like their other services. Trying to architect around every possible failure scenario will mean you never deliver code. I'd focus on the simplest architecture you can and put some trust in the services you're paying for.
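That said, if the DynamoDB-failure case still worries you, one cheap safeguard is to give the queue a dead-letter queue, so messages that repeatedly fail processing are parked instead of lost. A sketch of the SQS RedrivePolicy attribute in Python (the DLQ ARN below is hypothetical):

```python
import json

def redrive_policy(dlq_arn: str, max_receives: int) -> dict:
    """SQS queue attributes enabling a dead-letter queue: after a message
    has been received (and not deleted) max_receives times, SQS moves it
    to the DLQ instead of redelivering it forever."""
    return {
        "RedrivePolicy": json.dumps({
            "deadLetterTargetArn": dlq_arn,
            "maxReceiveCount": str(max_receives),
        })
    }

# Applied with boto3 (not run here):
#   sqs.set_queue_attributes(QueueUrl=queue_url,
#                            Attributes=redrive_policy(dlq_arn, 5))
```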

AWS SNS vs AWS Step Functions

What's the better option to coordinate tasks between microservices?
For example, if I have a microservice that handles customer information and needs to notify other microservices, is it better to create a workflow (AWS Step Functions) between the microservices or to use SNS?
I think AWS Step Functions will couple my Lambda functions, while SNS will not.
AWS Step Functions is a state machine service that executes AWS Lambda functions. If your task involves "do this, then this" activities, then Step Functions could be a good option. It includes logic to determine the next step and automatically handles retries. It's the modern version of Amazon Simple Workflow (SWF).
Amazon Simple Notification Service (SNS) can also trigger Lambda functions, but it does not handle the logic nor the retries. It's a good fit for decoupled services, especially for fan-out, where multiple subscribers receive the same message from a topic -- for example, triggering multiple Lambda functions or sending multiple notifications. It's basically a publish/subscribe service, of which Lambda is one of the subscriber types.
The choice will depend upon your particular use-case. If you don't want to redesign things to use Step Functions, then send notifications via SNS. If you sometimes send notifications (eg emails) rather than just trigger Lambda functions, use SNS.
Currently, Step Functions is not available in every region, while SNS is everywhere so that might also influence your choice.
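To make the "do this, then this, with retries" point concrete, here is a sketch of a two-step state machine definition in Amazon States Language, built as a Python dict (the step names and function ARNs are hypothetical):

```python
# Hypothetical workflow: update a customer record, then notify billing,
# with automatic retries on the first task.
state_machine = {
    "StartAt": "UpdateCustomer",
    "States": {
        "UpdateCustomer": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:UpdateCustomer",
            "Retry": [{"ErrorEquals": ["States.TaskFailed"],
                       "IntervalSeconds": 2, "MaxAttempts": 3}],
            "Next": "UpdateCustomer" == "" or "NotifyBilling",
        },
        "NotifyBilling": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:NotifyBilling",
            "End": True,
        },
    },
}
```

The ordering ("Next") and retry behavior live in the definition itself, which is exactly the coupling the asker is worried about: the workflow must know every participating function.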
It depends on what type of coordination you want: synchronous or asynchronous.
If it is synchronous and you really want coordination between them, then Amazon Simple Notification Service (SNS) would not help, and AWS Step Functions would be the way to go.
But if the requirement is asynchronous, and you just want to notify/invoke the microservices, then SNS would be a better fit.
Since your question says you "need to notify other microservices", I assume it is just about notifying them (as opposed to coordinating them), and each one knows what to do next without relying on the others. If that is true, then SNS is a good fit.

I want to use Amazon SQS to save the messages and use Lambda to read the queue data and dump it into MySQL

I am working with PHP.
I have a program that writes messages to Amazon SQS.
Can anybody tell me how I can use Lambda to get data from SQS and push it into MySQL? The Lambda should be triggered whenever a new record is added to the queue.
Can somebody share the steps or code that will help me get through this task?
There isn't any official way to link SQS and Lambda at the moment. Have you looked into using an SNS topic instead of an SQS queue?
Agree with Mark B.
Ways to get events over to Lambda:
Use SNS: http://docs.aws.amazon.com/sns/latest/dg/sns-lambda.html
Use SNS -> SQS, and have the Lambda launched by the SNS notification simply load whatever is in the SQS queue.
Use Kinesis.
Alternatively, have the Lambda run on a cron schedule to read SQS. It depends on the latency you need: if messages must be processed immediately, this is not the solution, because you would be running the Lambda all the time.
An important note for using SQS: you are charged per request even if no messages are waiting, so do not poll rapidly, even in your Lambdas. It's easy to run up a huge bill doing nothing. That's also a good reason to set up CloudWatch on the account to monitor usage and charges.
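Long polling is the usual way to keep that request count down: `ReceiveMessage` with `WaitTimeSeconds` up to 20 holds the connection open instead of returning immediately. Some rough arithmetic on the difference (Python; the queue URL in the comment is assumed):

```python
SECONDS_PER_MONTH = 30 * 24 * 3600

# Short polling once a second vs. long polling with WaitTimeSeconds=20:
short_poll_requests = SECONDS_PER_MONTH          # one request per second
long_poll_requests = SECONDS_PER_MONTH // 20     # one request per 20 seconds

# The long-poll call itself would be (boto3, not run here):
#   sqs.receive_message(QueueUrl=queue_url, WaitTimeSeconds=20,
#                       MaxNumberOfMessages=10)
```

That is a 20x reduction in billed requests for an idle queue, at the cost of up to 20 seconds of added latency per empty poll.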