Getting an error while invoking an API using AWS Lambda (AWS Lambda + AWS API Gateway + Postman) - amazon-web-services

I get an error while invoking an AWS SageMaker endpoint API from a Lambda function. When I call it using Postman, I get the following error:
{
"errorMessage": "module initialization error"
}

Just to make it clear, you can't call SageMaker endpoints directly using Postman (and even if you could, it would not be straightforward).
You will need to use an AWS SDK (e.g., Boto3) for that.
Ref: https://aws.amazon.com/blogs/machine-learning/call-an-amazon-sagemaker-model-endpoint-using-amazon-api-gateway-and-aws-lambda/
What I would suggest is to create a small HTTP server with Flask and use the AWS SDK (Boto3) to call the endpoint. Then you can call your Flask endpoint using Postman.
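A minimal sketch of such a Flask wrapper might look like the following; the endpoint name, region, port, and default content type are assumptions you would replace with your own values:

# Minimal Flask wrapper that forwards requests to a SageMaker endpoint.
# The endpoint name, region, and content type are placeholders.
import boto3
from flask import Flask, request, jsonify

app = Flask(__name__)
sm_runtime = boto3.client("sagemaker-runtime", region_name="us-east-1")

@app.route("/predict", methods=["POST"])
def predict():
    # Forward the raw request body to the SageMaker endpoint.
    response = sm_runtime.invoke_endpoint(
        EndpointName="your-endpoint-name",          # placeholder
        ContentType=request.content_type or "text/csv",
        Body=request.get_data(),
    )
    return jsonify({"prediction": response["Body"].read().decode()})

if __name__ == "__main__":
    app.run(port=5000)

You could then POST your payload to http://localhost:5000/predict from Postman and let Boto3 handle the SageMaker request signing.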

We recommend using an AWS SDK to invoke your endpoint. The SDK clients handle serialization and request signing for you; it would be hard to get that right manually with Postman.
SDK clients are available in many languages, including Java, Python, and JavaScript.
https://docs.aws.amazon.com/sagemaker/latest/dg/API_runtime_InvokeEndpoint.html#API_runtime_InvokeEndpoint_SeeAlso

Next time, please include more details in your question, e.g., the POST request data, headers, etc.
Anyway, to help you call the SageMaker endpoint using Postman:
1. In the 'Authorization' tab, select the type 'AWS Signature'.
2. Enter the access key and secret key of an IAM user that has permission to the SageMaker resources.
3. Enter the AWS region, e.g., us-east-1.
4. Enter 'Service Name' as 'sagemaker'.
5. Select the right content type. Some ML algorithms only accept 'text/csv'.
6. Select the request type 'POST'.
7. Enter the SageMaker invocation URL, e.g., 'https://runtime.sagemaker.us-east-1.amazonaws.com/endpoints/xgboost-xxxx-xx-xx-xx-xx-xx-xxx/invocations'.
Try it out and let me know if you have any issues.
Here is how your Postman request should look:
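If you prefer to script the same signed request rather than configuring it in Postman, here is a rough Python equivalent; it assumes the requests and requests-aws4auth packages are installed, and the keys, region, endpoint name, and payload are placeholders:

# Rough equivalent of the Postman setup above, scripted in Python.
import requests
from requests_aws4auth import AWS4Auth

auth = AWS4Auth(
    "YOUR_ACCESS_KEY_ID",       # IAM user with SageMaker invoke permission
    "YOUR_SECRET_ACCESS_KEY",
    "us-east-1",                # AWS region
    "sagemaker",                # service name, as in the Postman settings
)

url = ("https://runtime.sagemaker.us-east-1.amazonaws.com"
       "/endpoints/your-endpoint-name/invocations")

response = requests.post(
    url,
    data="1.0,2.0,3.0",                        # example CSV payload
    headers={"Content-Type": "text/csv"},
    auth=auth,
)
print(response.status_code, response.text)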

Related

AWS serverless url not found

I published an AWS Lambda .NET solution that responded with an AWS Serverless URL:
https://je7p92w69l.execute-api.us-east-2.amazonaws.com/Prod
When I go to this link, I get a 404.
Check the root path for your API: your API's URL is https://je7p92w69l.execute-api.us-east-2.amazonaws.com/Prod, which seems to be correct. However, have you mapped any resource to it? Say you created a /books resource; then you should invoke your API as https://je7p92w69l.execute-api.us-east-2.amazonaws.com/Prod/books.
If you're getting a 404, it just means that API Gateway is responding properly but this URI does not exist in your API. Figure out the path you created and it will work.
On the left-hand side of your API in the console you should see your resources and, inside them, your URIs.

How can I invoke AWS SageMaker endpoint to get inferences?

I want to get real-time predictions from my machine learning model with the help of SageMaker, and I want to get inferences directly on my website. How can I use the deployed model for predictions?
SageMaker endpoints are not publicly exposed to the Internet, so you'll need some way of creating a public HTTP endpoint that can route requests to your SageMaker endpoint. One way you can do this is with an AWS Lambda function fronted by API Gateway.
I created an example web app that takes webcam images and passes them on to a SageMaker endpoint for classification. This uses the API Gateway -> Lambda -> SageMaker endpoint strategy described above. You can see the whole example, including instructions for how to set up the Lambda function (and the code to put in it), at this GitHub repository: https://github.com/gabehollombe-aws/webcam-sagemaker-inference/
You can invoke the SageMaker endpoint through either Lambda or API Gateway.
Lambda:
Use the SageMaker runtime client in the AWS SDK and invoke the endpoint from your Lambda function (see the sketch below).
API Gateway:
Use API Gateway as an AWS service proxy and pass the request parameters straight through to the endpoint.
Documentation with example:
https://aws.amazon.com/blogs/machine-learning/call-an-amazon-sagemaker-model-endpoint-using-amazon-api-gateway-and-aws-lambda/
Hope it helps.
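For the Lambda option, a minimal handler sketch might look like this; it assumes an API Gateway proxy integration (the payload arrives in event["body"]) and an ENDPOINT_NAME environment variable set on the function:

# Lambda handler that forwards the incoming request body to a SageMaker endpoint.
import os
import boto3

sm_runtime = boto3.client("sagemaker-runtime")

def lambda_handler(event, context):
    # API Gateway proxy integration puts the request payload in event["body"].
    payload = event.get("body", "")
    response = sm_runtime.invoke_endpoint(
        EndpointName=os.environ["ENDPOINT_NAME"],   # set on the Lambda function
        ContentType="text/csv",
        Body=payload,
    )
    return {
        "statusCode": 200,
        "body": response["Body"].read().decode(),
    }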
Use the CLI like this:
aws sagemaker-runtime invoke-endpoint \
    --endpoint-name <endpoint-name> \
    --body '{"instances": [{"in0":[863],"in1":[882]}]}' \
    --content-type application/json \
    --accept application/json \
    results
Here, results is the output file the inference response is written to. (With AWS CLI v2 you may also need to add --cli-binary-format raw-in-base64-out so the JSON body is not interpreted as base64.)
I found it in a tutorial about accessing SageMaker via API Gateway.
As other answers have mentioned, your best option is fronting the SageMaker endpoint with a REST API in API Gateway. The API then lets you control authorization and 'hides' the backend SageMaker endpoint from API clients, lowering the coupling between API clients (your website) and your backend. (By the way, you don't need a Lambda function there; you can integrate the REST API directly with SageMaker as a backend.)
However, if you are simply testing the endpoint after deploying it and you want to quickly get some inferences using Python, there are two options:
After deploying your endpoint with predictor = model.deploy(...), if you still have the predictor object available in your Python scope, you can simply run predictor.predict(), as documented here. However, it's quite likely that you deployed the endpoint a while ago and no longer have access to the predictor object, and naturally you don't want to re-deploy the entire endpoint just to get it back.
If your endpoint already exists, you can invoke it using boto3 as follows, as documented here:
import boto3

# The payload and endpoint name below are placeholders.
payload = "string payload"
endpoint_name = "your-endpoint-name"

# "runtime.sagemaker" is the boto3 client for the SageMaker runtime API.
sm_runtime = boto3.client("runtime.sagemaker")

# Send the payload to the endpoint and read back the inference result.
response = sm_runtime.invoke_endpoint(
    EndpointName=endpoint_name,
    ContentType="text/csv",
    Body=payload,
)
response_str = response["Body"].read().decode()
Naturally, you can adjust the above invocation to your content type, for example to send JSON data instead. Just be aware of the (de)serializers the endpoint was deployed with, and set ContentType in the invoke_endpoint call accordingly.
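For example, a JSON variant of the same call might look like this (assuming the endpoint's inference container accepts application/json; the payload shape here is only illustrative):

# Variant of the example above that sends JSON instead of CSV.
import json
import boto3

sm_runtime = boto3.client("sagemaker-runtime")
payload = json.dumps({"instances": [{"in0": [863], "in1": [882]}]})

response = sm_runtime.invoke_endpoint(
    EndpointName="your-endpoint-name",    # placeholder
    ContentType="application/json",
    Accept="application/json",
    Body=payload,
)
print(json.loads(response["Body"].read().decode()))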

AWS API Gateway + Lambda: No Auth Header -> Error Message; Auth Header -> No Response

I'm trying to set up a very basic API, before taking the additional step of integrating Cognito. Testing the API endpoint within the AWS console works fine. However, I'm not able to get things working in Postman.
In API Gateway, I've connected an endpoint with no parameters to a Lambda function. The Lambda function returns a hard-coded, static JavaScript object. Clicking the test button here works as expected.
In the Method Execution screen, I've selected AWS_IAM for authorization and don't require an API key.
In IAM I created a user named postman and attached the AmazonAPIGatewayInvokeFullAccess policy (covering all ExecuteAPI resources, for now).
The API seems to be published correctly, since it complains when I try to access it without an Authorization header.
But when I use the AWS Signature authorization type and enter the postman user's access key and secret key, I get no response at all.
I'm trying to find access logs to debug, but I'm new to this part of AWS and haven't found anything yet... What am I missing? Thanks in advance.

How to make request to AWS API Gateway from Postman using temporary credentials

I followed the steps mentioned in this repo and created an API for users and pets.
I am able to get the temporary credentials after login and signup.
But for "/pets" (both POST and GET), I am getting "Internal Server Error".
I tested the functionality of pets API from AWS API Gateway test interface and was able to create and list pets.
I want to know which parameters and headers need to be set in Postman for an authorized request.
Based on the error message ("Execution failed due to configuration error: Invalid permissions on Lambda function"), it looks like API Gateway does not have permission to invoke your Lambda function. Make sure the function's resource-based policy allows invocation from your API.
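If that is the case, you can grant API Gateway permission to invoke the function with a resource-based policy statement, for example via Boto3; the function name, statement id, and source ARN below are placeholders you would replace with your own region, account id, API id, and resource path:

# Grant API Gateway permission to invoke the Lambda function behind /pets.
import boto3

lambda_client = boto3.client("lambda")

lambda_client.add_permission(
    FunctionName="your-pets-function",                                  # placeholder
    StatementId="apigateway-invoke-pets",
    Action="lambda:InvokeFunction",
    Principal="apigateway.amazonaws.com",
    SourceArn="arn:aws:execute-api:us-east-1:123456789012:abcdef1234/*/*/pets",
)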

Upload file to s3 with custom response to client

I am trying to upload a file to S3 and then have a Lambda function generate an id and date.
I then want to return this data back to the client.
I want to avoid generating the id and date on the client for security reasons.
Currently, I am trying to use API Gateway to invoke a Lambda function that uploads to S3. However, I am having problems setting this up, and I know this is not the preferred method.
Is there another way to do this without writing my own web server? (I would like to use Lambda.)
If not, how can I configure my API Gateway method to support file upload to Lambda?
You have a couple of options here:
Use API Gateway as an AWS Service Proxy to S3
Use API Gateway to invoke a Lambda function, which uses the AWS SDK to upload to S3
In either case, you will need to base64 encode the file content before calling API Gateway, and POST it in the request body.
We don't currently have any documentation on this exact use case but I would refer you to the S3 API and AWS SDK docs for more information. If you have any specific questions we'd be glad to help.
Thanks,
Ryan
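For the second option, a minimal sketch of the Lambda handler might look like this; the bucket name and key prefix are placeholders, and it assumes API Gateway passes the base64-encoded file through a proxy integration:

# Lambda handler: decode the base64 body, upload it to S3, and return a
# server-generated id and date to the client.
import base64
import json
import uuid
from datetime import datetime, timezone

import boto3

s3 = boto3.client("s3")
BUCKET = "your-upload-bucket"   # placeholder

def lambda_handler(event, context):
    # API Gateway delivers the binary payload as base64 in event["body"].
    file_bytes = base64.b64decode(event["body"])

    # Generate the id and date server-side so the client cannot tamper with them.
    file_id = str(uuid.uuid4())
    created_at = datetime.now(timezone.utc).isoformat()

    s3.put_object(
        Bucket=BUCKET,
        Key=f"uploads/{file_id}",
        Body=file_bytes,
        Metadata={"created-at": created_at},
    )

    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"id": file_id, "date": created_at}),
    }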