Asyncio is not working within my Python3.7 lambda - amazon-web-services

I am trying to create a Python 3.7 Lambda that correctly uses asyncio for concurrency.
I have tried many different code variations, but here is the latest block. I am using AWS X-Ray to look at the timing, and it is easy to verify that the async is not working correctly: all of these tasks and calls are being executed synchronously.
import json
import boto3
import asyncio
from botocore.exceptions import ClientError
from aws_xray_sdk.core import xray_recorder
from aws_xray_sdk.core import patch_all

# X-Ray
patch_all()

def lambda_handler(event, context):
    tasks = []
    dict_to_populate = {}
    for item in list:
        tasks.append(asyncio.ensure_future(do_work(item, dict_to_populate)))

    loop = asyncio.get_event_loop()
    loop.run_until_complete(asyncio.gather(*tasks))
    loop.close()

async def do_work(item, dict_to_populate):
    # assume regions are obtained
    for region in regions:
        response_vpcs = describe_vpcs(obj['Id'], session_assumed, region)
        if 'Vpcs' in response_vpcs:
            for vpc in response_vpcs['Vpcs']:
                # process
I expect to see the do_work functions start at essentially the same time (concurrently), but according to X-Ray they all run one after another. The processing itself works, and dict_to_populate is populated as expected.

This is how I have done it in my AWS Lambda. I wanted to make 4 POST requests and then collect all the responses. Hope this helps.
loop = asyncio.get_event_loop()
if loop.is_closed():
    loop = asyncio.new_event_loop()

# perform_traces is where all the POST requests are made
task = loop.create_task(perform_traces(payloads, message, contact_centre))
unique_match, error = loop.run_until_complete(task)
loop.close()
In the perform_traces method, this is how I have used asyncio.wait with the session:
future_dds_responses = []
async with aiohttp.ClientSession() as session:
    for payload in payloads:
        future_dds_responses.append(dds_async_trace(session, payload, contact_centre))
    dds_responses, pending = await asyncio.wait(future_dds_responses)
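One caveat with this pattern: passing bare coroutines to asyncio.wait is deprecated since Python 3.8 and rejected on 3.11+, where only tasks are accepted. A minimal, self-contained sketch of the safer shape (fake_trace is a hypothetical stand-in for dds_async_trace, since that function isn't shown):

```python
import asyncio

async def fake_trace(payload):
    # Hypothetical stand-in for dds_async_trace(session, payload, contact_centre)
    await asyncio.sleep(0)
    return payload * 2

async def main(payloads):
    # Wrapping each coroutine in a Task is required on Python 3.11+
    # (asyncio.wait no longer accepts bare coroutines) and also
    # schedules the work immediately.
    tasks = [asyncio.create_task(fake_trace(p)) for p in payloads]
    done, pending = await asyncio.wait(tasks)
    return sorted(t.result() for t in done), pending

results, pending = asyncio.run(main([1, 2, 3]))
```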
In dds_async_trace, this is how the POST is made using the aiohttp.ClientSession session:
async with session.post(pds_url,
                        data=populated_template_payload,
                        headers=PDS_HEADERS,
                        ssl=ssl_context) as response:
    status_code = response.status


Call another Lambda, passing as payload the results of a function that returns a set of information

I'm currently developing a Lambda that invokes another Lambda with Boto3. I need to retrieve a group of results from one statement and send them through the invoke payload to the other Lambda, but I can't work out how to pass the output of this function as the payload.
I have implemented this method:
import pandas as pd
from MysqlConnection import MysqlConnection
from sqlalchemy import text

def make_dataframe(self):
    conn = MysqlConnection()
    query = text("""select * from queue WHERE estatus = 'PENDING' limit 4;""")
    df = pd.read_sql_query(query, conn.get_engine())
    return df.to_json()
This is the Lambda handler:
import json
import boto3
from MysqlConnection import MysqlConnection
from Test import Test

client = boto3.client('lambda')

def lambda_handler(event, context):
    mydb = MysqlConnection()
    print(mydb.get_engine)

    df = Test()
    df.make_dataframe()
    object = json.loads(df.make_dataframe())

    response = client.invoke(
        FunctionName='arn:aws:lambda:',
        InvocationType='RequestResponse',  # or 'Event'
        Payload=json.dumps(object)
    )

    responseJson = json.load(response['Payload'])
    print('\n')
    print(responseJson)
    print('\n')
What you're doing is correct in terms of structuring your call.
I assume the problem is with your payload structure and whether it's stringified.
I would try invoking your Lambda with an empty payload and see what happens. If it works with an empty payload, the issue is your payload serialisation; if it doesn't work with an empty payload, it's something else.
In CloudWatch, what do the logs of both your "runner" Lambda and your "target" Lambda say?
It might also be a permissions issue - you will need to specify and grant invoke permissions on your runner Lambda.
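The "is it stringified" check can be done locally without touching AWS: Payload must be a JSON string (or bytes), so a quick round-trip on a stand-in for df.values.tolist() (hypothetical rows, since the real data isn't shown) confirms the shape before it goes to client.invoke:

```python
import json

rows = [[1, "PENDING"], [2, "PENDING"]]  # stand-in for df.values.tolist()

payload = json.dumps(rows)     # what client.invoke(Payload=...) expects: a JSON string
decoded = json.loads(payload)  # what the target Lambda then sees as its event
```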
After days of refactoring and research, I am sharing the answer. The trick is to pack the object with json.dumps and, inside the handler, call the method with the response already packed.
This is the method in the helper class:
import json
import pandas as pd
from sqlalchemy import text
from MysqlConnection import MysqlConnection

class Test:
    def make_dataframe(self):
        conn = MysqlConnection()
        query = text("""select * from TEST WHERE status = 'PEN' limit 4;""")
        df = pd.read_sql_query(query, conn.get_engine())
        lst = df.values.tolist()
        obj = json.dumps(lst, cls=CustomJSONEncoder)
        return obj
def lambda_handler(event, context):
    mydb = MysqlConnection()
    df = Test()
    response = client.invoke(
        FunctionName='arn:aws:lambda:',
        InvocationType='RequestResponse',
        Payload=df.make_dataframe()
    )
    responseJson = json.load(response['Payload'])
    print('\n')
    print(responseJson)
    print('\n')

Django async model update not being enacted

The SQL update does not seem to be taking effect, and no errors are being thrown either. Below is a simplified version of my code. For context, the "choice" field in the model is a BooleanField with a default of False, and a user may (ideally) change this by sending a JSON package with the "CHOICE" event and "Yes" message.
consumers.py
import json
from channels.generic.websocket import AsyncJsonWebsocketConsumer
from asgiref.sync import sync_to_async
from .models import Room

class Consumer(AsyncJsonWebsocketConsumer):
    async def connect(self):
        self.room_code = self.scope['url_route']['kwargs']['room_code']
        # WebSocket connection code

    async def disconnect(self):
        # WebSocket disconnect code

    async def receive(self, text_data):
        response = json.loads(text_data)
        event = response.get("event", None)
        message = response.get("message", None)
        if event == "CHOICE":
            room_set = await sync_to_async(Room.objects.filter)(room_code=self.room_code)
            room = await sync_to_async(room_set.first)()
            if (not room.choice) and message["choice"] == 'Yes':
                sync_to_async(room_set.update)(choice=True)  # line seems to not be working
            elif room.choice and message["choice"] == 'No':
                sync_to_async(room_set.update)(choice=False)
            # code to send message to group over WebSockets
        # code regarding other events

    async def send_message(self, res):
        # WebSocket send message code
I've tried to only include the relevant code here, but if more is needed please let me know. Thanks in advance!
I fixed this issue by adding await before the sync_to_async(room_set.update)(choice=True) lines. Without an await, the handler moves on to the next line of code before the SQL update has run, so the update never goes through.
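The reason the await matters: sync_to_async(...) returns an async callable, and calling it only creates a coroutine; nothing runs until that coroutine is awaited. A minimal sketch of the same mistake without Django or asgiref (a plain async function stands in for sync_to_async(room_set.update)):

```python
import asyncio
import warnings

updates = []

async def update(choice):
    # Stands in for sync_to_async(room_set.update)(choice=...)
    updates.append(choice)

async def receive_without_await():
    update(True)  # coroutine created but never scheduled: the update is lost

async def receive_with_await():
    await update(True)  # coroutine actually runs

warnings.simplefilter("ignore", RuntimeWarning)  # silence "never awaited"
asyncio.run(receive_without_await())
missed = len(updates)  # still 0
asyncio.run(receive_with_await())
done = len(updates)    # now 1
```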

Is there a way to pass the output of a Lambda function into another Lambda function to be used as a variable?

I have a Lambda function that prints email addresses to its function log, and another Lambda function that sends scheduled emails.
I am trying to pass the result of the email-address function into the second function, to be used as the recipients variable for the scheduled emails.
Here is the code for anyone wondering:
This function retrieves the email(s) from the database:
import pymysql

# RDS config
endpoint = '*******'
username = '*******'
password = '*******'
database_name = '******'

# connection config
connection = pymysql.connect(host=endpoint, user=username, passwd=password, db=database_name)

def handler(event, context):
    cursor = connection.cursor()
    cursor.execute('SELECT `Presenters`.Email FROM `Main` INNER JOIN `Presenters` ON `Main`.`PresenterID` = `Presenters`.`PresentersID` WHERE `Main`.`Read Day` ="Tuesday"')
    rows = cursor.fetchall()
    for row in rows:
        print("{0}".format(row[0]))
This second function sends the emails using Python:
import os
import smtplib
from email.message import EmailMessage

def lambda_handler(event, context):
    EMAIL_ADDRESS = "**********"
    EMAIL_PASSWORD = os.environ.get('EMAIL_PASSWORD')

    msg = EmailMessage()
    msg['Subject'] = "*********"
    msg['From'] = EMAIL_ADDRESS
    msg['To'] = ['************']
    msg.set_content('Hi everyone, a new read timetable has been posted for next week so be sure to check it and keep up to date on your reads, Thank you!')

    with smtplib.SMTP_SSL('smtp.gmail.com', 465) as smtp:
        smtp.login(EMAIL_ADDRESS, EMAIL_PASSWORD)
        smtp.send_message(msg)
There is no built-in way to do this in the Lambda service itself.
If this is something that has to run sequentially, which it most probably does, I would strongly recommend looking into Step Functions. Step Functions are state machines that orchestrate your workflows by calling your Lambda functions sequentially (other compute services are also supported), and the output of one function can be passed in as the input of the function executed after it.
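Whichever orchestrator is used (Step Functions, or a direct invoke), the data contract is the same: the first handler has to return the addresses rather than only print them, and the second handler reads them from its input event. A runnable sketch, with hypothetical addresses standing in for the database rows:

```python
def get_recipients_handler(event, context):
    # Stand-in for the pymysql query; returns the addresses instead of
    # only printing them, so a caller can actually use the result.
    rows = [("alice@example.com",), ("bob@example.com",)]
    return {"recipients": [row[0] for row in rows]}

def send_email_handler(event, context):
    # The second function reads the first one's output from its own event.
    recipients = event["recipients"]
    # ... build and send the EmailMessage to `recipients` here ...
    return {"sent_to": recipients}

# Chaining the two locally mimics what Step Functions does between states:
step_one_output = get_recipients_handler({}, None)
result = send_email_handler(step_one_output, None)
```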

SQS fifo trigger invoke Lambda Function (1 message - 1 invocation)

I have an SQS FIFO queue triggering a Lambda function.
I sent 10 messages (all different) and the Lambda was invoked just once.
Details:
SQS:
Visibility timeout: 30 min
Delivery delay: 0 secs
Receive Message Wait Time: 0 secs
Lambda:
Batch size: 1
Timeout: 3 secs
I don't see any errors in the Lambda invocations.
I don't want to touch the delivery delay, but if I increase it, things seem to work.
The average duration is less than 1.5 ms.
Any ideas how I can achieve this? Should I increase the delivery delay or the timeout?
The message is being sent from an ECS task with the following code:
from flask import Flask, request, redirect, url_for, send_from_directory, jsonify
from werkzeug.utils import secure_filename
import os
import random
import boto3

app = Flask(__name__)
s3 = boto3.client('s3')
sqs = boto3.client('sqs', region_name='eu-west-1')

@app.route('/', methods=['GET'])
def hello_world():
    return 'Hello World!'

@app.route('/upload', methods=['POST'])
def upload():
    print(str(random.randint(0, 9)))
    file = request.files['file']
    if file:
        filename = secure_filename(file.filename)
        file.save(filename)
        s3.upload_file(
            Bucket=os.environ['bucket'],
            Filename=filename,
            Key=filename
        )
        resp = sqs.send_message(
            QueueUrl=os.environ['queue'],
            MessageBody=filename,
            MessageGroupId=filename
        )
        return jsonify({
            'msg': "OK"
        })
    else:
        return jsonify({
            'msg': "NOT OK"
        })
Check if this helps:
The message deduplication ID is the token used for deduplication of sent messages. If a message with a particular message deduplication ID is sent successfully, any messages sent with the same message deduplication ID are accepted successfully but aren't delivered during the 5-minute deduplication interval.
https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/using-messagededuplicationid-property.html
At least it explains why it works when you increase the delivery delay.
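That quoted rule can be modelled locally. With content-based deduplication, the deduplication ID is derived from the message body (the filename here), so re-sending the same filename inside the 5-minute window is accepted but never delivered. A toy model of the interval behaviour (not the SQS API, just the rule from the docs):

```python
class DedupWindow:
    # Toy model of the FIFO deduplication rule quoted above: a message
    # whose dedup ID was already accepted within the last 5 minutes is
    # accepted but not delivered.
    INTERVAL = 300  # seconds

    def __init__(self):
        self.first_seen = {}  # dedup ID -> time of first accepted send

    def send(self, dedup_id, now):
        first = self.first_seen.get(dedup_id)
        if first is not None and now - first < self.INTERVAL:
            return "accepted-but-not-delivered"
        self.first_seen[dedup_id] = now
        return "delivered"

q = DedupWindow()
a = q.send("report.csv", now=0)    # first send: delivered
b = q.send("report.csv", now=60)   # same ID inside the window: dropped
c = q.send("report.csv", now=400)  # window expired: delivered again
```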

aws boto3 unittest function that invokes a Lambda function

I have a python function that invokes an AWS Lambda function.
# lambda.py
import boto3
import json
import os

client = boto3.client('lambda')
MY_LAMBDA = os.environ['MY_LAMBDA']

def invoke_function(input):
    response = client.invoke(
        FunctionName=MY_LAMBDA,
        InvocationType='RequestResponse',
        Payload=json.dumps(input)
    )
How can I create a Unit Test for this function? I have been using Moto for other AWS services, but haven't been able to make it work for Lambda.
My attempt at using moto:
# test_lambda.py
import unittest
from unittest.mock import MagicMock, patch
from unittest.mock import ANY
from moto import mock_lambda
import boto3
import os
import zipfile
import io

import lambda

class LambdaTest(unittest.TestCase):
    def get_test_zip_file(self):
        pfunc = '''
def lambda_handler(event, context):
    return event
'''
        zip_output = io.BytesIO()
        zip_file = zipfile.ZipFile(zip_output, 'w', zipfile.ZIP_DEFLATED)
        zip_file.writestr('lambda_function.py', pfunc)
        zip_file.close()
        zip_output.seek(0)
        return zip_output.read()

    @mock_lambda
    def test_invoke_requestresponse_function(self):
        conn = boto3.client('lambda', 'us-east-1')
        conn.create_function(
            FunctionName='test-func',
            Runtime='python3.8',
            Role='test-iam-role',
            Handler='lambda_function.lambda_handler',
            Code={
                'ZipFile': self.get_test_zip_file(),
            },
            Description='test lambda function',
            Timeout=3,
            MemorySize=128,
            Publish=True
        )

        sample_input = {'msg': 'Test Input'}
        result = lambda.invoke_function(sample_input)
This errors out with:
botocore.exceptions.ClientError: An error occurred (404) when calling the Invoke operation:
The boto3 client in lambda.py is initialized before any of the mocking takes place. As that client doesn't know it's being mocked, it probably tries to talk to AWS itself.
For your particular test case, there are a few solutions:
- Place import lambda inside the test itself, so that the boto3 client is created after the decorators have been initialized
- Override the client with the mocked version: lambda.client = conn
- Pass the mocked client as an argument: lambda.invoke_function(conn, sample_input)
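The second option works because the module-level client is just an attribute that the test can reassign before calling the function. A moto-free sketch of the same pattern (the stub classes here are hypothetical; in the real test the replacement would be the mocked conn):

```python
class RealClient:
    # Module-level client, created at import time (like boto3.client('lambda')).
    def invoke(self, **kwargs):
        raise RuntimeError("this would try to call AWS for real")

client = RealClient()

def invoke_function(payload):
    # Looks up `client` at call time, not import time, so the test can
    # swap it out after importing the module.
    return client.invoke(
        FunctionName="my-func",
        InvocationType="RequestResponse",
        Payload=payload,
    )

class FakeClient:
    # Stands in for the moto-backed `conn` from the test.
    def invoke(self, **kwargs):
        return {"StatusCode": 200, "Payload": kwargs["Payload"]}

# The test-side override, same idea as `lambda.client = conn`:
client = FakeClient()
result = invoke_function('{"msg": "Test Input"}')
```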