AWS Lambda Function assistance - amazon-web-services

I just started with AWS and I am creating my first Lambda functions. The first one was a success - no issues when creating and executing it.
Now I am trying to create a Lambda function (Python 3 based) with a couple of parameters. When I perform a test from API Gateway I can see it executes OK. When I try to execute it from the browser I see the following error:
{
  "errorMessage": "'foo2'",
  "errorType": "KeyError",
  "stackTrace": [
    "  File \"/var/task/lambda_function.py\", line 6, in lambda_handler\n    foo2 = event['foo2'];\n"
  ]
}
Here is the function and mapping templates:
import json
import sys

def lambda_handler(event, context):
    foo1 = event['foo1']
    foo2 = event['foo2']
    foo3 = event['foo3']
    foo = "This is Test!"
    # TODO implement
    return {
        'statusCode': 200,
        'body': json.dumps(event)
    }
Mapping template
#set($inputRoot = $input.path('$'))
{
  "foo1": "$input.params('foo1')",
  "foo2": "$input.params('foo2')",
  "foo3": "$input.params('foo3')"
}
I really wonder why this is happening.

I'm not an API Gateway wizard, but it looks like you are trying to read a part of the event (foo2) that doesn't exist when the function is invoked from the browser. Compare the structure of the event in both cases. It can help to add a json.dumps(event) right at the top of lambda_handler to see which parameters are missing.
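For example, a minimal sketch of that debugging step (using .get() so a missing key does not raise KeyError):

```python
import json

def lambda_handler(event, context):
    # Log the raw event so CloudWatch shows exactly what API Gateway delivered
    print(json.dumps(event))
    # .get() returns None instead of raising KeyError when the key is missing
    foo2 = event.get('foo2')
    return {
        'statusCode': 200,
        'body': json.dumps(event)
    }
```

Once you see the real event shape in CloudWatch, you can decide whether the mapping template or the invocation is at fault.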

Related

Why is AWS Lambda returning a Key Error when trying to upload an image to S3 and updating a DynamoDB Table with API Gateway?

I am trying to upload a binary Image to S3 and update a DynamoDB table in the same AWS Lambda Function. The problem is, whenever I try to make an API call, I get the following error in postman:
{
  "errorMessage": "'itemId'",
  "errorType": "KeyError",
  "requestId": "bccaead6-cb60-4a5e-9fc7-14ff25380451",
  "stackTrace": [
    "  File \"/var/task/lambda_function.py\", line 14, in lambda_handler\n    s3_upload = s3.put_object(Bucket=bucket, Key=event[\"itemId\"] + \".png\", Body=decode_content)\n"
  ]
}
My event takes in 3 strings, and whenever I try to access them, I get this error. However, if I access them without trying to upload to an S3 bucket, everything works fine. My Lambda function looks like this:
import json
import boto3
import base64

dynamoclient = boto3.resource("dynamodb")
s3 = boto3.client("s3")
table = dynamoclient.Table("Items")
bucket = "images"

def lambda_handler(event, context):
    get_file_content = event["content"]
    decode_content = base64.b64decode(get_file_content)
    s3_upload = s3.put_object(Bucket=bucket, Key=event["itemId"] + ".png", Body=decode_content)
    table.put_item(
        Item={
            'itemID': event["itemId"],
            'itemName': event['itemName'],
            'itemDescription': event['itemDescription']
        }
    )
    return {
        "code": 200,
        "message": "Item was added successfully"
    }
Again, if I remove everything about the S3 file upload, everything works fine and I am able to update the DynamoDB table successfully. As for the API Gateway side, I have added the image/png to the Binary Media Types section. Additionally, for the Mapping Templates section for AWS API Gateway, I have added the content type image/png. In the template for the content type, I have the following lines:
{
  "content": "$input.body"
}
For my Postman POST request, in the headers section, I have put this:
Finally, for the body section, I have added the raw event data with this:
{
  "itemId": "0fx170",
  "itemName": "Mouse",
  "itemDescription": "Smooth"
}
Lastly, for the binary section, I have uploaded my PNG file.
What could be going wrong?

CDK CustomResource attribute error: Vendor response doesn't contain key in object

Using CDK, I have an AWS custom resource from which I want to get a value out of its response. Unfortunately, I've been getting the error in the title. A simplified version of the response of the Lambda that is invoked by the resource is found below:
public class Response {
    private ResponseInfo info;
}
The lambda handler using this response is here
I have tested in AWS Lambda console that the lambda indeed returns json of the form:
{
  "info": {...}
}
but when I try to get it (from my custom resource that triggered the lambda) with:
flyway_resource.get_response_field("info")
I get the error in the title. Any ideas? How can I view what the custom resource's response actually looks like so that I can use the right keys?
You can view the custom resource definition here
The JSON object returned from your custom resource doesn't have the field "info". I would use boto3 to invoke the function and print the response in the console to see what it looks like:
Something like this:
client = boto3.client('lambda', region_name='ap-southeast-2')
response = client.invoke(
    FunctionName='string',
    InvocationType='Event'|'RequestResponse'|'DryRun',
    LogType='None'|'Tail',
    ClientContext='string',
    Payload=b'bytes'|file,
    Qualifier='string'
)
print(response)
The response of your custom resource seems to be something like this:
{
    'StatusCode': 123,
    'FunctionError': 'string',
    'LogResult': 'string',
    'Payload': StreamingBody(),
    'ExecutedVersion': 'string'
}
but you can verify it with the boto3 call
boto3 documentation: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/lambda.html?highlight=lambda#Lambda.Client.invoke

Error when calling Lambda UDF from Redshift

When calling a Python Lambda UDF from my Redshift stored procedure I am getting the following error. Any idea what could be wrong?
ERROR: Invalid External Function Response Detail:
-----------------------------------------------
error: Invalid External Function Response code:
8001 context: Extra rows in external function response query: 0
location: exfunc_data.cpp:330 process: padbmaster [pid=8842]
-----------------------------------------------
My Python Lambda UDF looks as follows.
import json

def lambda_handler(event, context):
    # ...
    result = DoJob()
    # ...
    ret = dict()
    ret['results'] = result
    ret_json = json.dumps(ret)
    return ret_json
The above Lambda function is associated to an external function in Redshift by the name send_email_lambda. The permissions and invocation work without any issues. I am calling the Lambda function as follows.
select send_email_lambda('sebder#company.com',
                         'recipient1#company.com',
                         'sample body',
                         'sample subject');
Edit:
As requested, adding the event payload passed from Redshift to Lambda.
{
  "user": "awsuser",
  "cluster": "arn:aws:redshift:us-central-1:dummy:cluster:redshift-test-cluster",
  "database": "sample",
  "external_function": "lambda_send_email",
  "query_id": 178044,
  "request_id": "20211b87-26c8-6d6a-a256-1a8568287feb",
  "arguments": [
    [
      "sender#company.com",
      "user1#company.com,user2#company.com",
      "<html><h1>Hello Therer</h1><p>A sample email from redshift. Take care and stay safe</p></html>",
      "Redshift email lambda UDF",
      "None",
      "None",
      "text/html"
    ]
  ],
  "num_records": 1
}
It looks like a UDF can be passed multiple rows of data, so it could receive a request to send multiple emails. The code needs to loop through each element of the top-level array, then extract the values from the inner array.
It then needs to return an array that is the same length as the input array.
For your code, create an array with one entry and then return the dictionary with that array inside it.
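A minimal sketch of that loop, assuming the positional argument order shown in the payload above (the actual email-sending step is left as a hypothetical placeholder):

```python
import json

def lambda_handler(event, context):
    results = []
    # One inner array per input row; produce exactly one result per row
    for args in event["arguments"]:
        sender, recipients, body, subject = args[0], args[1], args[2], args[3]
        # A real implementation would send the email here (e.g. via SES)
        results.append("sent")
    return json.dumps({"results": results})
```

Returning one entry per input row keeps the response array the same length as the request, which is what the "Extra rows in external function response" error is complaining about.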

Lambda Function working, but cannot work with API Gateway

I have a working Lambda function when I test it using a test event:
{
  "num1_in": 51.5,
  "num2_in": -0.097
}
import json
import Function_and_Data_List

# Parse out query string parameters
def lambda_handler(event, context):
    num1_in = event['num1_in']
    num2_in = event['num2_in']
    coord = {'num1': num1_in, 'num2': num2_in}
    output = func1(Function_and_Data_List.listdata, coord)
    return {
        "Output": output
    }
However, when I use API gateway to create a REST API I keep getting errors. My method for the REST API are:
1.) Build REST API
2.) Actions -> Create Resource
3.) Actions -> Create Method -> GET
4.) Integration type is Lambda Function, Use Lambda Proxy Integration
5.) Deploy
What am I missing for getting this API to work?
If you use Lambda proxy integration, your payload will be in the body. You also seem to have an incorrect return format.
Therefore, I would recommend trying the following version of your code:
import json
import Function_and_Data_List

# Parse out query string parameters
def lambda_handler(event, context):
    print(event)
    body = json.loads(event['body'])
    num1_in = body['num1_in']
    num2_in = body['num2_in']
    coord = {'num1': num1_in, 'num2': num2_in}
    output = func1(Function_and_Data_List.listdata, coord)
    return {
        "statusCode": 200,
        "body": json.dumps(output)
    }
In the above I also added print(event) so that in the CloudWatch Logs you can inspect the event object which should help debug the issue.
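To see why the payload moves, here is a simplified stand-in for the event a proxy integration delivers (real events carry many more fields, such as headers and path parameters):

```python
import json

def lambda_handler(event, context):
    # With proxy integration, the request payload arrives as a JSON string under 'body'
    body = json.loads(event['body'])
    return {"statusCode": 200, "body": json.dumps(body)}

# Simplified sketch of an API Gateway proxy event
event = {"body": json.dumps({"num1_in": 51.5, "num2_in": -0.097})}
print(lambda_handler(event, None))
```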

AWS API Gateway only passing 1st variable to function, but lambda test passes all

I'm finding an odd issue where AWS passes the URL string parameters to the Lambda function properly, but there is a breakdown in API Gateway only when the Lambda's Python handler calls
keywordSearch(keyword, page, RPP)
which passes 3 variables to keywordSearch. In the Lambda IDE test, it works without issue and prints out all 3 variables, as seen in the logs:
InsideKeywordSearch, Vars=:
keyword:
bombing
page:
1
RPP:
10
But when I run an API Gateway test, the log shows the variables are not passed into the function - there is no value for RPP or page.
Only the keyword is passed. Am I not defining the function correctly? It works in Lambda, so why not via API Gateway?
Here is a snippet of the code.
Function Call
def handler(event, context):
    print('Inside Handler Funciton')
    # Get event variables, if passed, and filter bad input
    keyword = event.get('search_keyword', None)
    id = event.get('id', None)
    RPP = event.get('RPP', 10)
    page = event.get('page', 1)
    print("keyword")
    print(keyword)
    print("id")
    print(id)
    print('RPP')
    print(RPP)
    print('page')
    print(page)
    if keyword is not None:
        return keywordSearch(keyword, page, RPP)
    elif id is not None:
        return idSearch(id)
    else:
        return ""
Function
def keywordSearch(keyword, page, RPP):
    print('InsideKeywordSearch, Vars=: ')
    print("keyword: ")
    print(keyword)
    print(" page: ")
    print(page)
    print(" RPP: ")
    print(RPP)
The Lambda logs show
Function Logs:
6d Version: $LATEST
Inside Handler Funciton
keyword
bombing
id
None
RPP
10
page
1
InsideKeywordSearch, Vars=:
keyword:
bombing
page:
1
RPP:
10
[INFO] 2018-06-30T03:04:56.240Z 5dc7a2cc-7c12-11e8-8f39-f5112d2e976d SUCCESS: Connection to RDS mysql instance succeeded
API Gateway call shows
{
  "errorMessage": "unsupported operand type(s) for -: 'str' and 'int'",
  "errorType": "TypeError",
  "stackTrace": [
    [
      "/var/task/app.py",
      144,
      "handler",
      "return keywordSearch(keyword,page,RPP)"
    ],
    [
      "/var/task/app.py",
      93,
      "keywordSearch",
      "sql = f\"SELECT attackid, SUM(MATCH(attack_fulltext) AGAINST('%{keyword}%' IN BOOLEAN MODE)) as score FROM search_index WHERE MATCH(attack_fulltext) AGAINST('%{keyword}%' IN BOOLEAN MODE) GROUP BY attackid ORDER BY score DESC Limit { ((page-1)*RPP) },{(RPP)};\""
    ]
  ]
}
This tells me that it's not passing the variables correctly, because the SQL string becomes invalid.
The issue was that API Gateway was passing the numbers in as strings. I had to fix the GET method mapping template in the integration request as follows.
In order to pass variables from API Gateway to the back-end as integers, the quotes around the variable values need to be removed. Please make sure the values are passed as integers from the client, otherwise you will get the same error mentioned above. In this case, I updated the body mapping template from:
{
  "search_keyword": "$input.params('search_keyword')",
  "page": "$input.params('page')",
  "RPP": "$input.params('RPP')"
}
to:
{
  "search_keyword": "$input.params('search_keyword')",
  "page": $input.params('page'),
  "RPP": $input.params('RPP')
}