Lambda function returns empty payload? - amazon-web-services

I have a very simple lambda that returns the same event object it receives.
import json

def lambda_handler(event, context):
    print('json.dumps(event)', json.dumps(event))
    return json.dumps(event)
I am invoking it with the following
import json
import boto3

lambda_client = boto3.client('lambda', region_name=REGION)

response = lambda_client.invoke(
    FunctionName='mylambdafunction',
    InvocationType='Event',
    Payload=json.dumps({'key111': 'value1', 'key2': 'value2', 'key3': 'value3'})
)
However, when I do response['Payload'].read() I get an empty bytestring b'' back. I can see in the logs that the function receives the event and returns it, but no matter what I return, the payload comes back empty (it is a botocore.response.StreamingBody object that I call .read() on). Does anyone know what might be the issue?
What is the issue here?
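One likely cause, based on the code shown: InvocationType='Event' is an asynchronous invocation, so Lambda queues the event and responds immediately with HTTP 202 and an empty payload; only 'RequestResponse' (the synchronous type) carries the function's return value back to the caller. Separately, returning json.dumps(event) makes Lambda serialize an already-serialized string. A local sketch of that second effect:

```python
import json

# Lambda JSON-encodes whatever the handler returns. If the handler returns
# json.dumps(event), a synchronous caller receives a double-encoded string.
event = {'key111': 'value1', 'key2': 'value2', 'key3': 'value3'}

wire = json.dumps(json.dumps(event))  # what a synchronous caller would get
decoded_once = json.loads(wire)
print(type(decoded_once).__name__)    # str, not dict

# Returning the dict itself round-trips cleanly:
wire_ok = json.dumps(event)
print(json.loads(wire_ok) == event)   # True
```

So even after switching to 'RequestResponse', returning the event dict directly (not json.dumps(event)) gives the caller a dict after one json.load.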

Related

Call another Lambda and pass through parameters a function that returns a set of information

I'm currently developing a Lambda that invokes another Lambda with Boto3. From one SQL statement I need to retrieve a group of results and send them through the invoke payload to the other Lambda, but I can't find how to pass the function's result set as a parameter.
I have implemented this method:
import pandas as pd
from sqlalchemy import text

from MysqlConnection import MysqlConnection

def make_dataframe(self):
    conn = MysqlConnection()
    query = text("""select * from queue WHERE estatus = 'PENDING' limit 4;""")
    df = pd.read_sql_query(query, conn.get_engine())
    return df.to_json()
This is the Lambda handler:
import json
import boto3
from MysqlConnection import MysqlConnection
from Test import Test

client = boto3.client('lambda')

def lambda_handler(event, context):
    mydb = MysqlConnection()
    print(mydb.get_engine)
    df = Test()
    object = json.loads(df.make_dataframe())
    response = client.invoke(
        FunctionName='arn:aws:lambda:',
        InvocationType='RequestResponse',  # or 'Event'
        Payload=json.dumps(object)
    )
    responseJson = json.load(response['Payload'])
    print('\n')
    print(responseJson)
    print('\n')
What you're doing is correct in terms of structuring your call.
I assume the problem is with your payload structure and whether it's stringified.
I would try invoking your Lambda with an empty payload and see what happens. If it works with an empty payload, the problem is your payload serialization; if it doesn't, it's something else.
In CloudWatch, what do the logs of both your "runner" Lambda and your "target" Lambda say?
It might also be a permissions issue: your runner Lambda's role needs permission to invoke the target function.
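The "is it stringified" check can be done locally before ever calling invoke. A hypothetical helper (not from the original post) that accepts either a dict or a pre-serialized string and always hands invoke a valid JSON string:

```python
import json

def ensure_json_payload(payload) -> str:
    """Return a JSON string suitable for invoke()'s Payload parameter."""
    if isinstance(payload, str):
        json.loads(payload)   # raises ValueError if it isn't valid JSON
        return payload
    return json.dumps(payload)

print(ensure_json_payload({'key': 'value'}))  # {"key": "value"}
print(ensure_json_payload('{}'))              # already valid, passed through
```

Validating locally like this separates serialization bugs from permissions or invocation-type issues.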
After days of refactoring and research I am sharing the answer. The key is to return the already-serialized JSON from the method and pass it straight to invoke as the Payload.
This is the method on the class:
class Test:
    def make_dataframe(self):
        conn = MysqlConnection()
        query = text("""select * from TEST WHERE status = 'PEN' limit 4;""")
        df = pd.read_sql_query(query, conn.get_engine())
        lst = df.values.tolist()
        obj = json.dumps(lst, cls=CustomJSONEncoder)  # CustomJSONEncoder handles non-JSON-native types
        return obj
client = boto3.client('lambda')

def lambda_handler(event, context):
    mydb = MysqlConnection()
    df = Test()
    response = client.invoke(
        FunctionName='arn:aws:lambda:',
        InvocationType='RequestResponse',
        Payload=df.make_dataframe()  # already a JSON string
    )
    responseJson = json.load(response['Payload'])
    print('\n')
    print(responseJson)
    print('\n')
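What makes the refactor work, as far as I can tell: invoke's Payload must already be bytes or a JSON string, and make_dataframe() returns json.dumps(...) output, so it can be passed straight through; wrapping it in another json.dumps (as in the earlier attempt) double-encodes it. A local illustration:

```python
import json

rows = [[1, 'PEN'], [2, 'PEN']]   # stand-in for df.values.tolist()
payload = json.dumps(rows)        # what make_dataframe() returns

# Passed straight through, the target Lambda's event is the list again:
print(json.loads(payload))        # [[1, 'PEN'], [2, 'PEN']]

# Encoding it a second time yields a JSON *string*, not the list:
double = json.dumps(payload)
print(json.loads(double) == payload)  # True: decodes back to a string
```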

Testing a HTTP API is working with a boto3 Lambda function

I'm not too sure how the formatting works when using json and boto3 in the same file. The function works as it should, but I don't know how to get a response from the API without an internal server error.
I don't know if it is permissions or the code is wrong.
import boto3
import json

def lambda_handler(event, context):
    client = boto3.resource('dynamodb')
    table = client.Table('Visit_Count')
    input = {'Visits': 1}
    table.put_item(Item=input)
    return {
        'statusCode': 200
        body: json.dumps("Hello, World!")
    }
Instead of body it should be 'body' (and you are missing a comma after 200). Other than that, check the CloudWatch logs for any further Lambda errors.
Anon and Marcin were right; I just tried it and it worked.
Your Lambda role also needs dynamodb:PutItem:
import boto3
import json

def lambda_handler(event, context):
    client = boto3.resource('dynamodb')
    table = client.Table('Visit_Count')
    input = {'Visits': 1}
    table.put_item(Item=input)
    return {
        'statusCode': 200,
        'body': json.dumps('Hello from Lambda!')
    }
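The dynamodb:PutItem permission mentioned in the answer is granted on the function's execution role. A minimal policy-statement sketch; the table ARN pattern is an assumption based on the table name in the question, not from the original post:

```python
import json

# Minimal IAM policy statement for the Lambda execution role. The wildcard
# region/account in the ARN are placeholders; scope them down in practice.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["dynamodb:PutItem"],
        "Resource": "arn:aws:dynamodb:*:*:table/Visit_Count",
    }],
}
print(json.dumps(policy, indent=2))
```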

Return HTTP response in Python lambda which performs multiple DynamoDB put items

I have a lambda which performs multiple dynamodb puts. This is the handler
import boto3

def lambda_handler(event, context):
    ddbclient = boto3.client('dynamodb')
    ddbclient.put_item(TableName='Tacticalble', Item={'xxx': ...})
    ddbclient.put_item(TableName='Tacticalble', Item={'yyy': ...})
    ddbclient.put_item(TableName='Tacticalble', Item={'zzz': ...})
    ddbclient.put_item(TableName='Tacticalble', Item={'aaa': ...})
    ddbclient.put_item(TableName='Tacticalble', Item={'bbb': ...})
Now I'm looking for the correct way to return an HTTP response.
Do I have to check every response like this and verify that they all return status code 200?
def lambda_handler(event, context):
    ddbclient = boto3.client('dynamodb')
    resp1 = ddbclient.put_item(TableName='Tacticalble', Item={'xxx': ...})
    resp2 = ddbclient.put_item(TableName='Tacticalble', Item={'yyy': ...})
    resp3 = ddbclient.put_item(TableName='Tacticalble', Item={'zzz': ...})
    resp4 = ddbclient.put_item(TableName='Tacticalble', Item={'aaa': ...})
    resp5 = ddbclient.put_item(TableName='Tacticalble', Item={'bbb': ...})
What is the correct way to return an HTTP 200 when all puts succeed and an error code when one of the puts fails?
Thanks
You can use DynamoDB transactions instead of multiple put requests.
In case of an error, an exception will be raised. See the examples.
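A sketch of the transactional alternative, assuming the question's table name: all puts go into one TransactItems list and succeed or fail together via a single transact_write_items call (the actual AWS call is commented out here):

```python
# Build one all-or-nothing batch of puts for transact_write_items.
def build_transact_items(table_name, items):
    return [{'Put': {'TableName': table_name, 'Item': item}} for item in items]

# 'pk' and its values are placeholders standing in for the question's items.
items = [{'pk': {'S': key}} for key in ('xxx', 'yyy', 'zzz', 'aaa', 'bbb')]
transact_items = build_transact_items('Tacticalble', items)
print(len(transact_items))  # 5

# ddbclient = boto3.client('dynamodb')
# ddbclient.transact_write_items(TransactItems=transact_items)
# A failed put raises TransactionCanceledException, so one try/except
# replaces checking five individual responses.
```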

Invoke a Lambda function with S3 payload from boto3

I need to invoke a Lambda function that accepts an S3 path. Below is sample code of the Lambda function.
import json

def lambda_handler(event, context):
    bucket = "mybucket"
    key = "mykey/output/model.tar.gz"
    model = load_model(bucket, key)
    somecalc = some_func(model)
    result = {'mycalc': json.dumps(somecalc)}
    return result
I need to invoke this handler from my client code using boto3. I know I can do a request like below
lambda_client = boto3.client('lambda')

response = lambda_client.invoke(
    FunctionName='mylambda_function',
    InvocationType='RequestResponse',
    LogType='Tail',
    ClientContext='myContext',
    Payload=b'bytes'|file,  # placeholder from the boto3 docs
    Qualifier='1'
)
But I am not sure how to specify an S3 path in the payload. It looks like it is expecting JSON.
Any suggestions?
You can specify a payload like so:
import json

payload = json.dumps({'bucket': 'myS3Bucket'})

lambda_client = boto3.client('lambda')
response = lambda_client.invoke(
    FunctionName='mylambda_function',
    InvocationType='RequestResponse',
    LogType='Tail',
    ClientContext='myContext',
    Payload=payload,
    Qualifier='1'
)
And access the payload properties in your lambda handler like so:
def lambda_handler(event, context):
    bucket = event['bucket']  # pulled from the 'event' argument
    key = "mykey/output/model.tar.gz"
    model = load_model(bucket, key)
    somecalc = some_func(model)
    result = {'mycalc': json.dumps(somecalc)}
    return result
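Extending the answer's idea slightly (the 'key' field is my addition, not in the original): both bucket and key can travel in the payload so the handler hardcodes neither, and the round trip is easy to check locally:

```python
import json

# What the caller sends as Payload:
payload = json.dumps({'bucket': 'myS3Bucket', 'key': 'mykey/output/model.tar.gz'})

# What the handler receives as `event` after Lambda decodes it:
event = json.loads(payload)
print(event['bucket'])  # myS3Bucket
print(event['key'])     # mykey/output/model.tar.gz
```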

AWS lambda function call another lambda function with parameter

I need to make two Lambda functions; one calls the other with a parameter, and the called function prints the parameter out. I'm having trouble making it work:
The first function:
from __future__ import print_function

import boto3
import json

lambda_client = boto3.client('lambda')

def lambda_handler(event, context):
    invoke_response = lambda_client.invoke(
        FunctionName="called-function",
        InvocationType='Event',
        Payload=json.dumps('hello Jenny')
    )
    print(invoke_response)
Please advise what code should I put in the called-function in order to receive the parameter 'hello Jenny'?
Thank you
The Payload supplied in the params will be available as the event of the Lambda being invoked.
def add(event, context):
    # event is 'hello Jenny'
    return event
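The handoff can be simulated locally: Lambda JSON-decodes the Payload string and passes the result as `event`. Note that with InvocationType='Event' (as in the question) the invocation is asynchronous, so the caller never sees add's return value; it only shows up in the called function's logs.

```python
import json

def add(event, context):
    # event is whatever JSON the caller put in Payload, already deserialized
    return event

# Simulate Lambda's side of the invoke: decode Payload, call the handler.
payload = json.dumps('hello Jenny')
event = json.loads(payload)
print(add(event, None))  # hello Jenny
```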