MongoDB - AWS Lambda serverless timeout after 6.01 seconds

In some of my routes the Lambda function times out after 6 seconds when it is reusing an existing MongoDB connection.
I am sure the function itself is not the issue: I have added lots of logs and it takes 0.1 seconds to finish, yet the Lambda does not return anything for 6+ seconds. In most of the tutorials online the DB connection lives in the same folder as the Lambda function, but I have moved it out into its own package since I have 5+ endpoints.
This is the most basic endpoint, and it times out every now and then:
module.exports.getAll = async (event, context, callback) => {
  context.callbackWaitsForEmptyEventLoop = false;
  try {
    await databaseModels.db();
    const x = await databaseModels.items.model.find({});
    callback(null, {
      statusCode: 200,
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(x)
    });
  } catch (error) {
    callback(error, {
      statusCode: 500,
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(error)
    });
  }
};
This is my connect-to-database function, which I call at the start of every Lambda function:
const mongoose = require('mongoose');

const url = ``;
let isConnected;

module.exports = connectToDatabase = () => {
  if (isConnected) {
    console.log('=> using existing database connection');
    return Promise.resolve();
  }
  console.log('=> using new database connection');
  return mongoose.connect(url).then(db => {
    isConnected = db.connections[0].readyState;
  });
};
In one of the tutorials I found, the author opened a connection for each request and closed it when the computation was done. Is this a good practice?
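Opening and closing a connection per request defeats container reuse and adds latency; the usual serverless pattern is to cache the connection across invocations. One subtle point with the helper above: caching a readyState flag can let two concurrent cold-start invocations each open a connection. A minimal sketch of caching the in-flight promise instead (fakeConnect below is a hypothetical stand-in; in real code it would be mongoose.connect(url)):

```javascript
// Cache the connection *promise*, not a readiness flag, so concurrent
// invocations during a cold start share one in-flight connection attempt.
let cachedPromise = null;
let calls = 0;

function connectOnce(connect) {
  if (cachedPromise === null) {
    cachedPromise = connect(); // first caller starts the connection
  }
  return cachedPromise; // later callers reuse the same in-flight promise
}

// Demo with a fake connect (hypothetical stand-in for mongoose.connect):
const fakeConnect = () => new Promise(resolve => {
  calls += 1;
  setTimeout(() => resolve('connected'), 10);
});

Promise.all([connectOnce(fakeConnect), connectOnce(fakeConnect)])
  .then(results => {
    console.log(calls);   // 1 — only one real connection was opened
    console.log(results); // [ 'connected', 'connected' ]
  });
```

Combined with `context.callbackWaitsForEmptyEventLoop = false`, this keeps one connection alive for the lifetime of the container without blocking the response on open sockets.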

Related

Lambda connect to Dynamodb

I have a Lambda function like the one below:
// Loads in the AWS SDK
const AWS = require('aws-sdk');

// Creates the document client, specifying the region
const ddb = new AWS.DynamoDB.DocumentClient({ region: 'us-east-1' });

exports.handler = async (event, context, callback) => {
  // Handle promise fulfilled/rejected states
  await readMessage().then(data => {
    data.Items.forEach(function (item) {
      console.log(item.message);
    });
    callback(null, {
      // If success return 200, and items
      statusCode: 200,
      body: data.Items,
      headers: {
        'Access-Control-Allow-Origin': '*',
      },
    });
  }).catch((err) => {
    // If an error occurs, write it to the console
    console.error(err);
  });
};

// Function readMessage
// Reads 10 messages from the DynamoDB table Message
// Returns a promise
function readMessage() {
  const params = {
    TableName: 'Message',
    Limit: 10
  };
  return ddb.scan(params).promise();
}
My DynamoDB table is in region us-east-1, but the above code is not running; it just times out. However, I am able to run a simple hello world inside the handler function, and it works fine.
I have enabled VPC with 6 subnets, and also tried disabling VPC; no matter what, it times out without any error logs. I even increased the timeout to 5 minutes, and it did not do anything. Should I install aws-sdk or something? Sorry, I am completely new to Lambda.
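A silent timeout with no error logs usually means the SDK call is hanging on the network; a Lambda inside a VPC with no NAT gateway or DynamoDB VPC endpoint cannot reach the public DynamoDB endpoint. One way to make the hang visible is to race the call against a deadline so it surfaces as a logged error instead of a silent timeout; a sketch (withTimeout is a hypothetical helper, not part of the question's code):

```javascript
// Reject if a promise takes longer than `ms`, so a networking problem
// shows up in the logs as a clear error rather than a Lambda timeout.
function withTimeout(promise, ms, label) {
  let timer;
  const deadline = new Promise((_, reject) => {
    timer = setTimeout(
      () => reject(new Error(label + ' timed out after ' + ms + ' ms')),
      ms
    );
  });
  // Whichever settles first wins; always clear the timer afterwards.
  return Promise.race([promise, deadline]).finally(() => clearTimeout(timer));
}

// Usage inside the handler (ddb is the DocumentClient from the question):
//   const data = await withTimeout(ddb.scan(params).promise(), 3000, 'DynamoDB scan');
```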

Scalable AWS API Gateway WebSocket broadcast?

In our scalable AWS serverless application (DynamoDB, Lambda, API Gateway WebSockets) we want to broadcast messages from a Lambda to very many connected browser sessions simultaneously.
We have many topics, and to limit the load on the system we want to limit each browser session to subscribing to at most two topics.
The AWS example app linked in this official tutorial does this by looping through the connectionIds and doing an await postToConnection for each of them, which doesn't scale well, as each postToConnection takes 40 ms on average:
const AWS = require('aws-sdk');
const ddb = new AWS.DynamoDB.DocumentClient();

exports.handler = async function (event, context) {
  let connections;
  try {
    connections = await ddb.scan({ TableName: process.env.table }).promise();
  } catch (err) {
    return {
      statusCode: 500,
    };
  }
  const callbackAPI = new AWS.ApiGatewayManagementApi({
    apiVersion: '2018-11-29',
    endpoint:
      event.requestContext.domainName + '/' + event.requestContext.stage,
  });

  const message = JSON.parse(event.body).message;

  const sendMessages = connections.Items.map(async ({ connectionId }) => {
    if (connectionId !== event.requestContext.connectionId) {
      try {
        await callbackAPI
          .postToConnection({ ConnectionId: connectionId, Data: message })
          .promise();
      } catch (e) {
        console.log(e);
      }
    }
  });

  try {
    await Promise.all(sendMessages);
  } catch (e) {
    console.log(e);
    return {
      statusCode: 500,
    };
  }
  return { statusCode: 200 };
};
How can we make this broadcast scalable?
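One common approach is to fan out: the broadcasting Lambda only splits the connection list into chunks and asynchronously invokes a worker Lambda per chunk, so the posts run in parallel across many invocations instead of one. A sketch, where broadcastWorker is an assumed worker function name and lambda an AWS.Lambda client:

```javascript
// Split an array into fixed-size chunks, one chunk per worker invocation.
function chunk(items, size) {
  const chunks = [];
  for (let i = 0; i < items.length; i += size) {
    chunks.push(items.slice(i, i + size));
  }
  return chunks;
}

// In the broadcasting Lambda (lambda, connectionIds, message as assumed above):
//   await Promise.all(chunk(connectionIds, 100).map(ids =>
//     lambda.invoke({
//       FunctionName: 'broadcastWorker',
//       InvocationType: 'Event', // fire-and-forget, runs in parallel
//       Payload: JSON.stringify({ ids, message }),
//     }).promise()
//   ));
```

Each worker then posts only to its 100 connections, so total broadcast latency stays roughly constant as the connection count grows.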

AWS Lambda function invoked from another Lambda function runs stale request in queue

I have a Lambda function (A) that calls another Lambda function (B) and returns while the other one is still executing. Everything works perfectly locally (using sls offline), but when I deploy it, the following behavior occurs:
I call A for the first time. I see in CloudWatch that A runs successfully and returns, but there is nothing for B, which apparently doesn't run.
I call A another time. CloudWatch now shows that both A and B run, but the logs show that the invocation of B had request parameters corresponding to the first run of A.
Ensuing runs of A result in runs of B with request params from the previous run of A (i.e. B always lags behind by one and never runs the current invocation like it is supposed to, almost as if each call of A pushes the previous B invocation out of a queue of size 1).
Some additional info:
No VPCs are involved (if that makes a difference?)
I am using sls
Does anyone have any idea why this is happening? I know that AWS async invocations go into a queue, but I'm pretty sure they should just run ASAP if there's nothing in the queue.
Here's the code:
module.exports.A = async (event, context) => {
  context.callbackWaitsForEmptyEventLoop = false;
  try {
    ... some code ...
    await connectToDatabase();
    ... some code ...
    const newReq = req;
    newReq['newSessionId'] = savedSession._id.toString();
    const params = {
      FunctionName: config.apiName + '-' + config.apiVersion + '-B',
      InvocationType: 'Event',
      Payload: JSON.stringify(newReq)
    };
    lambda.invoke(params, (err, res) => {
      if (err) {
        console.log(err);
      } else {
        console.log(res);
      }
    });
    return {
      statusCode: 200,
      headers: {
        'Access-Control-Allow-Origin': '*',
        'Access-Control-Allow-Credentials': true
      },
      body: JSON.stringify(savedSession)
    };
  } catch (err) {
    console.log(err);
    return {
      statusCode: err.statusCode || 500,
      headers: { 'Content-Type': 'text/plain' },
      body: 'Could not create session.'
    };
  }
};

module.exports.B = async (event, context) => {
  context.callbackWaitsForEmptyEventLoop = false;
  try {
    ... some code ...
    // Connect to Atlas
    await connectToDatabase();
    ... some code ...
    for (let i = 0; i < applicants.length; i++) {
      ... some code ...
      const savedResume = await DownloadResume(resumeId, accessToken, fileName, oauthSrcs.GSCRIPT);
      conversionPromises.push(ConvertAndExportToS3(fileName, savedResume, sessionKey));
    }
    await Promise.all(conversionPromises);
    ... some code ...
    return {
      statusCode: 200,
      headers: {
        'Access-Control-Allow-Origin': '*',
        'Access-Control-Allow-Credentials': true
      },
      body: JSON.stringify('Converted all resumes successfully!')
    };
  } catch (err) {
    console.log(err);
    return {
      statusCode: err.statusCode || 500,
      headers: { 'Content-Type': 'text/plain' },
      body: 'Could not create session.'
    };
  }
};
I see you are returning from A without waiting for lambda.invoke to finish. I think you should wait.
To clarify: you are indeed invoking an asynchronous Lambda (meaning you don't have to wait for the child Lambda to finish), but the invoke API call itself is an HTTP request that has to complete. If A returns (and the execution environment is frozen) before that request finishes, B may not actually be invoked until the next time the container is thawed, which would explain the lag-by-one behavior.
Also, did you wait long enough to be sure that the 2nd run of A triggers anything? A cold start on B could make it look that way if you don't wait long enough on the first call.
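In other words, A should await the dispatch of the Event invocation before returning. With aws-sdk v2 the simplest fix is `await lambda.invoke(params).promise();`. The sketch below shows the same callback-to-promise conversion generically (promisify is a hypothetical helper, equivalent to what `.promise()` does for you):

```javascript
// Wrap a Node-style callback API so the handler can await it.
function promisify(fn) {
  return (...args) => new Promise((resolve, reject) => {
    fn(...args, (err, res) => (err ? reject(err) : resolve(res)));
  });
}

// In A, instead of the fire-and-forget callback:
//   const res = await promisify(lambda.invoke.bind(lambda))(params);
// or, using the SDK's built-in support:
//   const res = await lambda.invoke(params).promise();
```

Awaiting the dispatch adds only the round trip to the Invoke API; B itself still runs asynchronously.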

Serverless AWS Lambda DynamoDB throws internal server error

I made a todo app using Lambda, deployed with the Serverless framework. I made one POST endpoint where a user can create a todo item, and I am testing it with Postman. My Lambda handler is an async function. I am able to make the POST request, and the item is stored in DynamoDB, but I get the response "Internal server error". My goal is to see in Postman what kind of POST request I made.
Here is my code:
'use strict';

const AWS = require('aws-sdk');
const uuid = require('uuid');

const dynamoDb = new AWS.DynamoDB.DocumentClient();

module.exports.createTodo = async (event) => {
  const datetime = new Date().toISOString();
  const data = JSON.parse(event.body);
  const params = {
    TableName: 'todos',
    Item: {
      id: uuid.v1(),
      task: data.task,
      done: false,
      createdAt: datetime,
      updatedAt: datetime
    }
  };
  try {
    let data = await dynamoDb.put(params).promise();
    console.log(data);
    return JSON.stringify(data); // this throws an internal server error
  } catch (error) {
    console.log(error);
  }
};
Since you are calling the Lambda function via API Gateway, you need to shape the response like this:
return {
  statusCode: 200,
  body: JSON.stringify(data),
  // headers: { 'Content-Type': 'application/json' }, // Uncomment if needed by your client
};
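With the (default) Lambda proxy integration, API Gateway requires an object with a numeric statusCode and a string body; returning a bare string or object yields the generic "Internal server error". A small helper sketch (toProxyResponse is a hypothetical name):

```javascript
// Wrap any payload in the shape the Lambda proxy integration expects.
function toProxyResponse(data, statusCode = 200) {
  return {
    statusCode, // must be a number
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(data), // must be a string, not an object
  };
}

const res = toProxyResponse({ ok: true });
console.log(typeof res.body); // 'string'
```

In the handler above, `return toProxyResponse(data);` would replace the bare `return JSON.stringify(data);`.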

Postman: how to send asynchronous request by script

I have two requests in Postman: A and B. I want to send request A first, then send request B while request A is still waiting for its response. It's quite easy to do this manually, because request A takes 15 s to finish.
But is there any way I can do this automatically? I am going to test this case a lot.
I have tried to use the runner in Postman, but it always waits for request A to finish before sending request B.
After that I found a document about sending asynchronous requests in Postman here.
I wrote a script that uses pm.sendRequest to send request B, and put that script in the pre-request script of request A:
let confirmRequest = {
  url: url + "/confirm",
  method: "POST",
  body: {
    "requestId": (new Date()).getTime(),
    "statusCode": "0",
  }
};

setTimeout(function () {
  pm.sendRequest(confirmRequest, function (err, res) {
    console.log(err ? err : res.json());
  });
}, 1500);
The problem is that even though I've wrapped it inside a setTimeout function, request A still waits for the pre-request script to finish first, so in the end request B is sent before request A.
Is there any solution to this problem?
I tried but could not get Postman or Newman to process requests asynchronously. I found it easier to write Node.js code using async/await/promise concepts. Here is sample code that works for me:
const fs = require('fs');
const request = require('request'); // the (deprecated) request package

var httpRequest = "your raw request body";
var headersOpt = {
  "content-type": "application/json",
};

const promisifiedRequest = function (options) {
  return new Promise((resolve, reject) => {
    request(options, (error, response, body) => {
      if (response) {
        return resolve(response);
      }
      if (error) {
        return reject(error);
      }
    });
  });
};

var output = '';

async function MyRequest(httpRequest, j, fileName) {
  var options = {
    uri: "url",
    method: "POST",
    body: httpRequest,
    json: true,
    time: true,
    headers: headersOpt
  };
  try {
    console.log('request posted!');
    let response = await promisifiedRequest(options);
    console.log('response received!');
    output = output + ',' + response.elapsedTime;
    console.log(response.elapsedTime);
    console.log(output);
    //return response;
  } catch (err) {
    console.log(err);
  } finally {
    // This part is optional; it saves the response time for each request.
    try {
      fileName = (j + 1) + '_' + fileName;
      fs.writeFile('/logs-async/scripts/output/' + fileName + '.csv', output, (err) => {
        // throws an error; you could also catch it here
        if (err) throw err;
      });
    } catch (err) {
      console.log(err);
    }
  }
}
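The core of the original question, firing B while A is still in flight, comes down to starting both promises before awaiting either. A minimal sketch with timed stand-ins for the two HTTP calls (delay and runBoth are hypothetical names; in real code each would be an HTTP request):

```javascript
// Resolve with `value` after `ms` milliseconds, standing in for an HTTP call.
const delay = (ms, value) => new Promise(resolve => setTimeout(() => resolve(value), ms));

async function runBoth() {
  const a = delay(150, 'A done'); // stands in for the slow request A
  const b = delay(20, 'B done');  // request B, started while A is in flight
  return Promise.all([a, b]);     // resolves once both have finished
}

runBoth().then(results => console.log(results)); // [ 'A done', 'B done' ]
```

The key is that `b` is created before `a` is awaited; awaiting `a` first would serialize the two requests, which is exactly the behavior the Postman runner exhibits.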