I read a few articles, but either I couldn't fully understand the solutions or I felt they didn't answer the problem at hand. So I am asking here again.
I need to make HTTP POST, PUT, and GET calls to an external service.
I would like to batch up my POST calls in AWS SQS and execute them.
Currently, the POST calls via a Lambda function are working fine, but I am not sure how to leverage AWS SQS for these callouts, or how to store the API responses and/or webhook responses.
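Roughly what I have in mind for the SQS consumer side (a minimal sketch only; the table name and target URL below are placeholders, not my actual setup):

```python
# Sketch of a Lambda consuming an SQS batch of POST callouts and storing each
# response in DynamoDB. The table name and target URL are placeholders.
import os
import urllib.request

import boto3

TABLE_NAME = os.environ.get("RESPONSE_TABLE", "callout-responses")    # placeholder
TARGET_URL = os.environ.get("TARGET_URL", "https://example.com/api")  # placeholder

table = boto3.resource("dynamodb").Table(TABLE_NAME)


def lambda_handler(event, context):
    # Each SQS record body is assumed to hold one POST payload as JSON.
    for record in event["Records"]:
        req = urllib.request.Request(
            TARGET_URL,
            data=record["body"].encode("utf-8"),
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        with urllib.request.urlopen(req) as resp:
            # Key the stored response by the SQS message id so it can be looked up later.
            table.put_item(
                Item={
                    "messageId": record["messageId"],
                    "statusCode": resp.status,
                    "response": resp.read().decode("utf-8"),
                }
            )
    return {"processed": len(event["Records"])}
```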
I have created a microservice architecture which flows as follows:
API call -> API Gateway -> EventBridge -> SNS -> Lambda
The reason for this is to use SNS instead of SQS to decouple applications for truly serverless compute, without Lambda needing to continuously poll SQS (pub/sub over push/poll).
The trouble is that although the execution is fine and the Lambdas run as expected, the response received by the user or app is the EventBridge response. I can't find any docs on how EventBridge handles responses for HTTP requests made through API Gateway.
Does anyone have any ideas or docs to push me in the right direction?
Thanks!
In your setup it's not possible to have the Lambda response proxied back to the API request initiator, as your client is very much decoupled from the actual request processing.
An almost identical issue was experienced here.
You need to rethink the process as a whole:
What operation do you want to complete via the API request?
Does the processing of the request really need to be asynchronous (i.e., does it take a long time to complete)?
Can you handle the request with a Lambda function, delegate to SNS from there, and finally generate the desired response back to the client? (See the sketch below.)
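For that last option, a minimal sketch of the front Lambda (the topic ARN, the transaction ID scheme, and the response shape are just illustrative, not a prescribed solution):

```python
# Sketch of the third option: the API-facing Lambda accepts the request, hands the
# work off to SNS, and immediately returns an acknowledgement to the client.
# The topic ARN and response shape are illustrative only.
import json
import uuid

import boto3

sns = boto3.client("sns")
TOPIC_ARN = "arn:aws:sns:us-east-1:111122223333:process-requests"  # placeholder


def lambda_handler(event, context):
    # Assumes an API Gateway proxy integration, so the raw body is in event["body"].
    request = json.loads(event["body"])
    transaction_id = str(uuid.uuid4())
    sns.publish(
        TopicArn=TOPIC_ARN,
        Message=json.dumps({"transactionId": transaction_id, "request": request}),
    )
    # The client gets an immediate 202 plus an id it can use to check progress later.
    return {"statusCode": 202, "body": json.dumps({"transactionId": transaction_id})}
```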
So, as it turns out, the answer is yes and no, for anyone coming across this in the future.
With the current setup another database is required, and the responses can be inserted into it with a transaction ID. This transaction ID can be generated by the client during the request, so that a subsequent call can be made to look the response up in the table.
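Roughly, the pattern looks like this (a sketch only; the table and attribute names are examples, and do_work stands in for the real processing):

```python
# Sketch of the transaction-ID pattern: the processing Lambda writes its result keyed
# by the client-supplied transactionId; a second "status" endpoint lets the client
# poll for it. Table and attribute names are examples.
import json

import boto3

table = boto3.resource("dynamodb").Table("async-responses")  # example table name


def processing_handler(event, context):
    # Invoked asynchronously (e.g. via SNS); the original request carried a transactionId.
    message = json.loads(event["Records"][0]["Sns"]["Message"])
    result = do_work(message)
    table.put_item(Item={"transactionId": message["transactionId"], "result": json.dumps(result)})


def status_handler(event, context):
    # Backs a GET /status/{transactionId} route so the client can fetch the outcome.
    transaction_id = event["pathParameters"]["transactionId"]
    item = table.get_item(Key={"transactionId": transaction_id}).get("Item")
    if item is None:
        return {"statusCode": 404, "body": json.dumps({"status": "pending"})}
    return {"statusCode": 200, "body": item["result"]}


def do_work(message):
    # Placeholder for the actual business logic.
    return {"ok": True}
```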
Alternatively, WebSocket or GraphQL APIs would allow for asynchronous invocation; it really depends on your use case and how much complexity you're willing to accept.
Thanks for everyone’s inputs!
We are exposing an AWS API Gateway which needs to act as a proxy and push the request body as a message to AWS SQS. The body can be an array of objects, which we need to parse so that each object is sent to SQS as a separate message. Is there any way to achieve this without using Lambda?
To answer your question: for something as simple as parsing a message and posting to SQS, I don't believe you need Lambda, no. Lambdas are designed for serverless architecture, that is, for when you don't have a server and still want to run code. In this case, you do have a server behind your API Gateway, so you don't need Lambda (unless you want fancy branching error handling). You can use your API Gateway directly, yep. Here's a write-up I found:
https://dzone.com/articles/creating-aws-service-proxy-for-amazon-sqs
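If it helps, here's a rough sketch of what that service-integration setup could look like via boto3 (all IDs, ARNs, and names below are placeholders, not from your environment):

```python
# Hedged sketch: wiring an API Gateway method straight to SQS with an AWS service
# integration, so no Lambda sits in between. IDs, ARNs, and the IAM role are placeholders.
import boto3

apigw = boto3.client("apigateway")

apigw.put_integration(
    restApiId="abc123",              # placeholder REST API id
    resourceId="res456",             # placeholder resource id (e.g. for /messages)
    httpMethod="POST",
    type="AWS",                      # plain AWS service integration
    integrationHttpMethod="POST",
    # SQS Query API path for the target queue (account id / queue name are placeholders)
    uri="arn:aws:apigateway:us-east-1:sqs:path/111122223333/my-queue",
    credentials="arn:aws:iam::111122223333:role/apigw-sqs-send",  # role allowing sqs:SendMessage
    requestParameters={
        "integration.request.header.Content-Type": "'application/x-www-form-urlencoded'"
    },
    requestTemplates={
        # Forwards the whole request body as a single message. To fan an array out into
        # separate messages, switch the Action to SendMessageBatch and build the
        # SendMessageBatchRequestEntry.N.* parameters in a #foreach loop in the template.
        "application/json": "Action=SendMessage&MessageBody=$util.urlEncode($input.body)"
    },
)
```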
I am someone who is totally new to REST APIs, so pardon the newbie-ish mistakes.
My requirement is:
The source database team wants to send JSON data on an hourly basis to an API endpoint which I publish. I am not sure what I need to build to make sure this happens seamlessly. My target is to receive the data, create CSV files, and save them in AWS S3 for further downstream processing.
My plan is to create an AWS API Gateway endpoint which will accept POST requests; whenever anyone sends data through POST, API Gateway will trigger an AWS Lambda function which will run Python to parse the JSON data to CSV and store it in AWS S3. Is this thought valid? What am I missing? Are there best practices which need to be implemented?
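Roughly, I imagine the Lambda doing something like this (a sketch only; the bucket name and the assumption that the body is a JSON array of flat objects are mine):

```python
# Rough sketch of the Lambda I have in mind: API Gateway (proxy integration) hands
# over the POST body, the function flattens it to CSV and writes the file to S3.
# The bucket name and the assumed payload shape (a JSON array of flat objects) are mine.
import csv
import io
import json
from datetime import datetime, timezone

import boto3

s3 = boto3.client("s3")
BUCKET = "my-ingest-bucket"  # placeholder


def lambda_handler(event, context):
    records = json.loads(event["body"])  # assumed: a JSON array of objects

    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=sorted(records[0].keys()))
    writer.writeheader()
    writer.writerows(records)

    key = f"incoming/{datetime.now(timezone.utc):%Y/%m/%d/%H%M%S}.csv"
    s3.put_object(Bucket=BUCKET, Key=key, Body=buffer.getvalue().encode("utf-8"))

    return {"statusCode": 200, "body": json.dumps({"written": key, "rows": len(records)})}
```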
This architecture seems to be what you want to do.
You want to make sure that your API is secured with a key or via Cognito (more complex), and that your Lambda has the IAM permissions needed to access your bucket.
This post will help you understand the Lambda blueprint that is triggered when an object is uploaded to S3. Just change the Lambda trigger, tweak the Python code a little, and you're done.
Yes, this is a simple, typical serverless stack and it works perfectly fine.
Additionally, you may also want to focus on authentication on the API Gateway endpoint to make it secure.
I am trying to figure out the best way to read a JSON file from a specific endpoint and then save/post that object to AWS S3. I have created a mock endpoint with a mock response via https://www.mockable.io/ and I would like to know the best way to 'POST' it to an S3 bucket. New JSON files will be available weekly, and I was thinking that perhaps the way to go about this would be to use AWS Lambda and API Gateway. Is this a viable way? I would also like to explore the possibility of an event-triggered way to pull the data, or a scheduler. What would you recommend? I know that AWS SQS is an option, but how do you send the fetched JSON files to the queue?
Thank you, any resource or suggestion is more than welcome. I am looking for potential approaches.
There are quite a lot of different ways you could approach this, but if I understand correctly you want to retrieve a JSON response once a week from a fixed endpoint (which you have set up?), and then write that JSON to a file, or sequence of files, you store on S3.
If that is correct, then all you really need is CloudWatch Events (to set up a weekly recurring scheduled event in cron format) which triggers a Lambda function that makes the request and then writes the result to S3. You can also use the same Lambda function (or write another one triggered by the same CloudWatch Event) to post a message to SQS with the JSON.
Depending on which language you are most comfortable writing in, you can use the SDK to do all the things you want to do. Personally I like the Python library boto3; combined with a little file I/O to get the JSON into a text file of some kind, and the requests library to make the actual HTTP request to your endpoint, you should be able to do everything you need. The helpful functions in boto3 will be sending an SQS message and writing to S3.
I'm not sure why you'd necessarily need API Gateway here, unless, instead of triggering the Lambda via a scheduled event, you wanted to do it by making a separate HTTP request, but then you may as well just make the request to your original API!
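Putting those pieces together, a minimal sketch (the endpoint URL, bucket, and queue names are placeholders, and requests has to be bundled with the deployment package):

```python
# Sketch of the scheduled Lambda described above: fetch the JSON from the fixed
# endpoint, store it in S3, and drop a copy on an SQS queue. The URL, bucket, and
# queue names are placeholders; `requests` must be included in the deployment package.
from datetime import datetime, timezone

import boto3
import requests

ENDPOINT = "https://demo1234.mockable.io/weekly.json"  # placeholder
BUCKET = "weekly-json-dumps"                            # placeholder
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/111122223333/weekly-json"  # placeholder

s3 = boto3.client("s3")
sqs = boto3.client("sqs")


def lambda_handler(event, context):
    # Triggered by the weekly CloudWatch Events / EventBridge schedule rule.
    resp = requests.get(ENDPOINT, timeout=10)
    resp.raise_for_status()
    payload = resp.text

    key = f"weekly/{datetime.now(timezone.utc):%Y-%m-%d}.json"
    s3.put_object(Bucket=BUCKET, Key=key, Body=payload.encode("utf-8"))

    # Optional: also hand the JSON to downstream consumers via SQS.
    sqs.send_message(QueueUrl=QUEUE_URL, MessageBody=payload)

    return {"s3Key": key}
```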
Please consider using a Lambda function with NodeJS code to do the GET from the endpoint; to invoke the Lambda function, use a CloudWatch Events schedule:
https://docs.aws.amazon.com/AmazonCloudWatch/latest/events/RunLambdaSchedule.html
What would be the most efficient way to emit a POST request (a webhook, precisely) from AWS on a daily basis?
Right away, I started by simply creating a CloudWatch rule with a scheduled CRON event that would trigger an SNS publication "every day at 18h", then I created an SNS topic "AlertMyWebhook" with all POST endpoints as subscribers of the topic.
But SNS requires me to CONFIRM the subscription of each endpoint... which I can't, by definition, since the HTTPS endpoint is not mine (webhook = a HOOK into someone ELSE'S WEBapp).
So now I am starting to think of crazy stuff like having a DynamoDB table to store all the webhook endpoint URLs, coupled with a Lambda function to read the table, take each HTTPS endpoint, and send a POST request to it...
Frankly speaking: that doesn't make any sense to me.
Is there any way to avoid the SNS confirmation?
If not, how on earth would you "trigger a POST every day at 18h" without creating a monolithic-like architecture?
AWS SNS and Lambda are integrated with each other, so you can subscribe a Lambda function to your topic. When a message is posted to that topic, the subscribed Lambda function is invoked with the payload (the published message).
Using this payload as input, the Lambda function can trigger the POST requests to the endpoints. A good way to do this is to make all the HTTPS POST endpoints environment variables in Lambda, so there is no code change in the Lambda function whenever a new POST endpoint needs to be added as a subscription endpoint. For more on how to integrate AWS SNS and Lambda, look here:
Invoking Lambda functions using Amazon SNS notifications
Sample NodeJS code to make the POST request:
AWS Lambda HTTP POST Request
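If you prefer Python, a rough equivalent of the same idea looks like this (WEBHOOK_ENDPOINTS is my own naming for the environment variable; adjust as needed):

```python
# Rough Python sketch of the idea above: the Lambda is subscribed to the SNS topic,
# reads the target webhook URLs from an environment variable, and POSTs the published
# message to each of them. WEBHOOK_ENDPOINTS is an assumed naming convention.
import os
import urllib.request


def lambda_handler(event, context):
    # Comma-separated list of webhook URLs, e.g. "https://a.example/hook,https://b.example/hook"
    endpoints = [u.strip() for u in os.environ.get("WEBHOOK_ENDPOINTS", "").split(",") if u.strip()]
    message = event["Records"][0]["Sns"]["Message"]

    results = []
    for url in endpoints:
        req = urllib.request.Request(
            url,
            data=message.encode("utf-8"),
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        with urllib.request.urlopen(req) as resp:
            results.append({"url": url, "status": resp.status})
    return results
```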