Step Function Triggered In a loop - amazon-web-services

I am starting a Step Function from a Lambda function, and the Lambda function is tied to an API Gateway. For some reason, when I try to test the Lambda function, I see hundreds of executions failing and running in a loop, even though I triggered the step function only once. I must be missing something here. Can you please advise?
const AWS = require("aws-sdk");
const uuidv4 = require("uuid/v4");

/*----------------------------------------------------------------------- */
/* Implementation */
/*----------------------------------------------------------------------- */
exports.handler = async event => {
    var _dt = await ExecuteStepFunction();
    return _dt;
}

function ExecuteStepFunction() {
    const stepFunctions = new AWS.StepFunctions();
    return new Promise((res, rej) => {
        var params = {
            stateMachineArn: 'arn:aws:states:us-east-1:xxxxxxxxxxxxx:stateMachine:xxTestSateMachine',
            input: JSON.stringify(''),
            name: uuidv4()
        };
        stepFunctions.startExecution(params, function (err, data) {
            if (err) {
                rej(err);
            } else {
                res(data);
            }
        });
    });
}
I also tried the approach described in this link (https://docs.aws.amazon.com/step-functions/latest/dg/tutorial-api-gateway.html), where API Gateway triggers the step function directly, but I received the following error. After trying to fix it, I moved to the above option of starting the execution from Lambda.
{
    "__type": "com.amazon.coral.service#UnrecognizedClientException",
    "message": "The security token included in the request is invalid"
}

Related

Issues while trying to invoke lambda, from another lambda (ending up Malformed proxy response on APIG)

I am trying to invoke lambda B from another lambda A. The call to lambda A is triggered via an APIG endpoint. Using curl, the request is made as below:
curl "$#" -L --cookie ~/.midway/cookie --cookie-jar ~/.midway/cookie -X GET -H "Content-Type: application/json" -s https://us-west-2.beta.api.ihmsignage.jihmcdo.com/api/getSignInstances
The above invokes lambda A, which handles the request and calls the main handler. Logic for the main handler:
const main = (event: any, context: any, lambdaCallback: Function) => {
    console.log(JSON.stringify(event, null, 2));
    console.log(JSON.stringify(process.env, null, 2));
    if (event.path.startsWith('/getUserInfo')) {
        const alias = event.headers['X-FORWARDED-USER'];
        const userData = JSON.stringify({ alias });
        console.info('UserData: ', userData);
        return sendResponse(200, userData, lambdaCallback); // This works perfectly fine, with API Gateway returning a proper response
    } else if (event.path.startsWith('/api')) {
        console.info('Invoke lambda initiate');
        invokeLambda(event, context, lambdaCallback); // This somehow invokes lambda B twice
    } else {
        return sendResponse(404, '{"message": "Resource not found"}', lambdaCallback);
    }
};
There is also a wrapper to ensure a proper response is sent back to the APIG:
export const handler = (event: any, context: any, lambdaCallback: Function) => {
    const wrappedCallback = (error: any, success: any) => {
        success.headers['Access-Control-Allow-Origin'] = getAllowedOrigin(event);
        success.headers['Access-Control-Allow-Credentials'] = true;
        success.headers['Access-Control-Allow-Methods'] = 'GET,PUT,DELETE,HEAD,POST,OPTIONS';
        success.headers['Access-Control-Allow-Headers'] =
            'Content-Type,X-Amz-Date,Authorization,X-Api-Key,X-Amz-Security-Token,Access-Control-Allow-Origin,Access-Control-Allow-Methods,X-PINGOVER';
        success.headers['Vary'] = 'Accept-Encoding, Origin';
        console.info('Logging success--', success);
        return lambdaCallback(error, success);
    };
    // Append headers
    return main(event, context, wrappedCallback);
};
And finally, this is the logic for how lambda B is invoked within lambda A:
const invokeLambda = async (event: any, context: any, lambdaCallback: Function) => {
    context.callbackWaitsForEmptyEventLoop = false;
    if (!process.env.INVOKE_ARN) {
        console.error('Missing environment variable INVOKE_ARN');
        return sendResponse(500, '{"message":"internal server error"}', lambdaCallback);
    }
    const params = {
        FunctionName: process.env.INVOKE_ARN,
        InvocationType: 'RequestResponse',
        Payload: JSON.stringify(event),
    };
    event.headers = event.headers || {}; // headers is an object, not an array
    const username = event.headers['X-FORWARDED-USER'];
    const token = event.headers['X-CLIENT-VERIFY'];
    if (!username || !token) {
        console.log('No username or token was found');
        return sendResponse(401, '{"message":"You shall not pass"}', lambdaCallback);
    }
    try {
        const data = await lambda.invoke(params).promise();
        console.info('Got Request router lambda data: ', data);
        const invocationResponse = data?.Payload;
        console.info('Got invocationResponse: ', invocationResponse);
        return lambdaCallback(null, JSON.parse(invocationResponse as string));
    } catch (err) {
        console.error('Error while running starlet: ', err);
        return sendResponse(500, '{"message":"internal server error"}', lambdaCallback);
    }
};
Lambda B:
const main = async (event: any = {}) => {
    // Log details
    console.log('Request router lambda invoked');
    console.log(JSON.stringify(event, null, 2));
    return {
        statusCode: 200,
        body: JSON.stringify({ message: 'Hello from RequestRouter Lambda!' }),
        headers: {
            'Content-Type': 'application/json',
        },
        isBase64Encoded: false,
    };
};

export const handler = main;
All of the above works fine (no error logs in CloudWatch for either lambda); however, it seems that lambda A's handler is invoked but never invokes lambda B's handler, ultimately returning a response to APIG that doesn't have the proper headers.
Any pointers are highly appreciated!! Thank you :)
AWS recommends that you don't orchestrate your Lambda functions in code (one function calling another function).
For that use case, you can use AWS Step Functions.
You can create a state machine, define API Gateway as the trigger, and pass the result from one Lambda function to the next.
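As a rough sketch (not the exact setup from the question), a state machine chaining two Lambda functions could be created like this with the Node.js aws-sdk; all ARNs, names, and the role below are placeholders:
const AWS = require("aws-sdk");
const stepFunctions = new AWS.StepFunctions();

// Amazon States Language definition chaining two Lambda tasks. By default,
// the output of CallLambdaA becomes the input of CallLambdaB.
const definition = {
    StartAt: "CallLambdaA",
    States: {
        CallLambdaA: {
            Type: "Task",
            Resource: "arn:aws:lambda:us-east-1:111111111111:function:LambdaA", // placeholder ARN
            Next: "CallLambdaB"
        },
        CallLambdaB: {
            Type: "Task",
            Resource: "arn:aws:lambda:us-east-1:111111111111:function:LambdaB", // placeholder ARN
            End: true
        }
    }
};

stepFunctions.createStateMachine({
    name: "LambdaAToLambdaB",
    definition: JSON.stringify(definition),
    roleArn: "arn:aws:iam::111111111111:role/StepFunctionsExecutionRole" // placeholder role
}, (err, data) => {
    if (err) console.error(err);
    else console.log("State machine ARN:", data.stateMachineArn);
});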

Lambda not able to execute the aws sdk API

I am trying to create a Lambda function which will check if a particular repository exists in CodeCommit.
The Lambda service role has admin privileges. Below is the code.
The Lambda is unable to call the getRepository method. It is neither throwing an exception nor succeeding. Any help on this?
console.log("Before calling cc") is the last printed statement; after that, I am not getting any success or error log.
const CloudFormation = require('aws-sdk/clients/cloudformation');
const Codecommit = require('aws-sdk/clients/codecommit');

exports.handler = async (event) => {
    try {
        console.log("event", event);
        console.log("event", JSON.stringify(event));
        var repositoryName = event.detail.repositoryName;
        var cfn = new CloudFormation({
            region: "ap-northeast-1"
        });
        var cc = new Codecommit({
            region: "ap-northeast-1"
        });
        const stackName = repositoryName + "-infra-stack";
        var cloneUrl;
        console.log("RepositoryName" + repositoryName);
        console.log("StackName" + stackName);
        var codeCommitParam = {
            repositoryName: repositoryName
        };
        try {
            console.log("Before calling cc")
            cc.getRepository(codeCommitParam, function (err, data) {
                if (err) {
                    console.log(err, err.stack);
                } else {
                    console.log(data.repositoryMetadata.cloneUrlHttp);
                    cloneUrl = data.repositoryMetadata.cloneUrlHttp;
                    console.log("Clone url " + cloneUrl);
                    checkStackDescription();
                }
            });
        } catch (error) {
            console.log(error);
        }
    } catch (error) {
        console.log(error);
    }
};
I believe this comes down to the JavaScript in the Lambda being invoked asynchronously, so the Lambda finishes before the callback has a chance to process the response.
Try making the call effectively synchronous by using the promise-based syntax below.
console.log("Before calling cc")
let result = await cc.getRepository(codeCommitParam).promise();
console.log(result);
Be aware that the promise will reject if the call fails, so wrap the await in a try/catch if you need to handle errors.
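Putting it together, a minimal sketch of what the handler could look like with the promise form (assuming the same region and event shape as in the question); the try/catch ensures a failure is logged rather than silently swallowed:
const Codecommit = require('aws-sdk/clients/codecommit');
const cc = new Codecommit({ region: "ap-northeast-1" });

exports.handler = async (event) => {
    const codeCommitParam = { repositoryName: event.detail.repositoryName };
    try {
        console.log("Before calling cc");
        // Awaiting the promise keeps the handler alive until the call completes.
        const result = await cc.getRepository(codeCommitParam).promise();
        const cloneUrl = result.repositoryMetadata.cloneUrlHttp;
        console.log("Clone url " + cloneUrl);
        return cloneUrl;
    } catch (err) {
        console.log(err, err.stack);
        throw err;
    }
};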

Self-invoking lambda invocation timing out

We're trying to develop a self-invoking lambda to process S3 files in chunks. The lambda role has the policies needed for the invocation attached.
Here's the code for the self-invoking lambda:
export const processFileHandler: Handler = async (
    event: S3CreateEvent,
    context: Context,
    callback: Callback,
) => {
    let bucket = loGet(event, 'Records[0].s3.bucket.name');
    let key = loGet(event, 'Records[0].s3.object.key');
    let totalFileSize = loGet(event, 'Records[0].s3.object.size');
    const lastPosition = loGet(event, 'position', 0);
    const nextRange = getNextSizeRange(lastPosition, totalFileSize);
    context.callbackWaitsForEmptyEventLoop = false;
    let data = await loadDataFromS3ByRange(bucket, key, nextRange);
    await database.connect();
    log.debug(`Successfully connected to the database`);
    const docs = await getParsedDocs(data, lastPosition);
    log.debug(`upserting ${docs.length} records to database`);
    if (docs.length) {
        try {
            // upserting logic
            log.debug(`total documents added: ${await docs.length}`);
        } catch (err) {
            await recurse(nextRange.end, event, context);
            log.debug(`error inserting docs: ${JSON.stringify(err)}`);
        }
    }
    if (nextRange.end < totalFileSize) {
        log.debug(`Last ${context.getRemainingTimeInMillis()} milliseconds left`);
        if (context.getRemainingTimeInMillis() < 10 * 10 * 10 * 6) {
            log.debug(`Less than 6000 milliseconds left`);
            log.debug(`Invoking next iteration`);
            await recurse(nextRange.end, event, context);
            callback(null, {
                message: `Lambda timed out processing file, please continue from LAST_POSITION: ${nextRange.start}`,
            });
        }
    } else {
        callback(null, { message: `Successfully completed the chunk processing task` });
    }
};
Where recurse is an invocation call to the same lambda. Everything else works as expected; it just times out whenever execution reaches this invocation request:
const recurse = async (position: number, event: S3CreateEvent, context: Context) => {
    let newEvent = Object.assign(event, { position });
    let request = {
        FunctionName: context.invokedFunctionArn,
        InvocationType: 'Event',
        Payload: JSON.stringify(newEvent),
    };
    let resp = await lambda.invoke(request).promise();
    console.log('Invocation complete', resp);
    return resp;
};
This is the stack trace logged to CloudWatch:
{
    "errorMessage": "connect ETIMEDOUT 63.32.72.196:443",
    "errorType": "NetworkingError",
    "stackTrace": [
        "Object._errnoException (util.js:1022:11)",
        "_exceptionWithHostPort (util.js:1044:20)",
        "TCPConnectWrap.afterConnect [as oncomplete] (net.js:1198:14)"
    ]
}
It is not a good idea to create a self-invoking lambda function. In case of an error (which could also be a bad handler call on the AWS side), a lambda function might re-run several times, which is very hard to monitor and debug.
I would suggest using Step Functions instead. I believe this tutorial can help: Iterating a Loop Using Lambda.
Off the top of my head, if you prefer not to deal with Step Functions, you could create a Lambda trigger for an SQS queue and pass a message to the queue whenever you want to run the lambda function another time, as in the sketch below.
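As a rough sketch of the SQS approach (the queue URL and message shape are assumptions, not from the question): instead of invoking itself, the function sends a continuation message to a queue that is configured as an event source for the same Lambda:
const AWS = require('aws-sdk');
const sqs = new AWS.SQS();

// Sketch only: QUEUE_URL is a hypothetical environment variable pointing at
// an SQS queue wired up as an event source for this same Lambda function.
const enqueueNextChunk = async (position, bucket, key) => {
    await sqs.sendMessage({
        QueueUrl: process.env.QUEUE_URL,
        MessageBody: JSON.stringify({ position, bucket, key }),
    }).promise();
};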

Connect AWS mobile backend to DynamoDB

I am trying to use the AWS mobile backend (using a lambda function) to insert into DynamoDB (also configured via the mobile backend), but with no success so far.
The relevant code:
'use strict';
console.log("Loading function");
const AWS = require('aws-sdk');
const docClient = new AWS.DynamoDB.DocumentClient({region: process.env.MOBILE_HUB_PROJECT_REGION});

exports.handler = function(event, context, callback) {
    var responseCode = 200;
    var requestBody, pathParams, queryStringParams, headerParams, stage,
        stageVariables, cognitoIdentityId, httpMethod, sourceIp, userAgent,
        requestId, resourcePath;
    console.log("request: " + JSON.stringify(event));

    // Request Body
    requestBody = event.body;
    if (requestBody !== undefined && requestBody !== null) {
        // Set 'test-status' field in the request to test sending a specific response status code (e.g., 503)
        responseCode = JSON.parse(requestBody)['test-status'];
    }

    // Path Parameters
    pathParams = event.path;

    // Query String Parameters
    queryStringParams = event.queryStringParameters;

    // Header Parameters
    headerParams = event.headers;

    if (event.requestContext !== null && event.requestContext !== undefined) {
        var requestContext = event.requestContext;

        // API Gateway Stage
        stage = requestContext.stage;

        // Unique Request ID
        requestId = requestContext.requestId;

        // Resource Path
        resourcePath = requestContext.resourcePath;

        var identity = requestContext.identity;

        // Amazon Cognito User Identity
        cognitoIdentityId = identity.cognitoIdentityId;

        // Source IP
        sourceIp = identity.sourceIp;

        // User-Agent
        userAgent = identity.userAgent;
    }

    // API Gateway Stage Variables
    stageVariables = event.stageVariables;

    // HTTP Method (e.g., POST, GET, HEAD)
    httpMethod = event.httpMethod;

    // TODO: Put your application logic here...
    let params = {
        Item: {
            "prop1": 0,
            "prop2": "text"
        },
        TableName: "testTable"
    };
    docClient.put(params, function(err, data) {
        if (err)
            responseCode = 500;
        else {
            responseCode = 200;
            context.succeed(data);
        }
    });

    // For demonstration purposes, we'll just echo these values back to the client
    var responseBody = {
        requestBody: requestBody,
        pathParams: pathParams,
        queryStringParams: queryStringParams,
        headerParams: headerParams,
        stage: stage,
        stageVariables: stageVariables,
        cognitoIdentityId: cognitoIdentityId,
        httpMethod: httpMethod,
        sourceIp: sourceIp,
        userAgent: userAgent,
        requestId: requestId,
        resourcePath: resourcePath
    };
    var response = {
        statusCode: responseCode,
        headers: {
            "x-custom-header": "custom header value"
        },
        body: JSON.stringify(responseBody)
    };
    console.log("response: " + JSON.stringify(response))
    context.succeed(response);
};
This doesn't put the item into the table for some reason.
I gave the necessary permissions using the roles part; is there anything I am missing?
**responseCode is only for testing purposes.
Edit:
I tried AWS node.js lambda request dynamodb but no response (no err, no return data), and that doesn't work either.
Edit 2:
Added the full handler code (it is the default generated code when creating your first AWS Lambda).
I have refactored some bits of your code to be much simpler and to use async/await (make sure to select Node 8.10 as the runtime for your function) instead of callbacks. I also got rid of the context and callback parameters, as they were used for older versions of NodeJS. Once you're on Node 8+, async/await should be the default option.
Also, it is possible to chain .promise() onto docClient.put, so you can easily await on it, making your code way simpler. I have left only the DynamoDB part (which is what is relevant to your question):
'use strict';
console.log("Loading function");
const AWS = require('aws-sdk');
const docClient = new AWS.DynamoDB.DocumentClient({region: process.env.MOBILE_HUB_PROJECT_REGION});

exports.handler = async (event) => {
    let params = {
        Item: {
            "prop0": 1,
            "prop2": "text"
        },
        TableName: "testTable"
    };
    try {
        await docClient.put(params).promise();
    } catch (e) {
        console.log(e);
        return {
            message: e.message
        };
    }
    return { message: 'Data inserted successfully' };
};
Things to keep in mind if it still does not work:
Make sure your Lambda function has the right permissions to insert items into DynamoDB (AmazonDynamoDBFullAccess will do it)
You ALWAYS have to provide the partition key when inserting items into DynamoDB. In your example, the JSON only has two properties: prop1 and prop2. If neither of them is the partition key, your code will certainly fail (see the sketch after this list).
Make sure your table also exists
If your code fails, just check the CloudWatch logs, as any exception is now captured and printed out on the console.
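For example, assuming the table's partition key is a string attribute named id (a hypothetical name; check your table's key schema), the Item must include it:
// Sketch only: "id" is a placeholder for whatever partition key
// your table actually defines in its key schema.
let params = {
    Item: {
        "id": "item-001", // partition key - required on every put
        "prop1": 0,
        "prop2": "text"
    },
    TableName: "testTable"
};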
The reason why no data is written to the table is that the call to DynamoDB put is asynchronous and returns by calling your callback. During that time, the rest of the code continues to execute, and your function eventually finishes before the call to DynamoDB has a chance to complete.
You can use the async / await keywords to make your code synchronous:
async function writeToDynamoDB(params) {
    return new Promise((resolve, reject) => {
        docClient.put(params, function(err, data) {
            if (err)
                reject(err);
            else
                resolve(data);
        });
    });
}
let params = ...
var data = await writeToDynamoDB(params)
You can find sample code I wrote (in Typescript) at https://github.com/sebsto/maxi80-alexa/blob/master/lambda/src/DDBController.ts

Cloud Function (triggered by HTTP) that publishes a message to PubSub

I am trying to create an HTTP API in a Cloud Function that eventually publishes a message to PubSub. I understand there is a PubSub REST API, but it forces me to set up authentication on the client side, which I would like to skip and move to the server side.
The code below is deployed as a Google Cloud Function with this command: gcloud functions deploy helloGET --runtime nodejs8 --trigger-http
But when tested in a browser, it errors out with Error: could not handle the request.
Any suggestion is appreciated, thanks!
"use strict";
// [START functions_pubsub_setup]
const { PubSub } = require("#google-cloud/pubsub");
// Instantiates a client
const pubsub = new PubSub();
// [END functions_pubsub_setup]
const Buffer = require("safe-buffer").Buffer;
exports.helloGET = (req, res) => {
const topic = pubsub.topic("projects/myproject/topics/openit");
const message = {
data: {
message: "req.body.message"
}
};
// Publishes a message
res.send(
topic
.publish(message)
.then(() => res.status(200).send("Message published."))
.catch(err => {
err = `Catch block ... ${err}`;
console.error(err);
res.status(500).send(err);
return Promise.reject(err);
})
);
};
The code below will work, but it takes around 30 seconds or more for the subscriber to receive the event - way too slow for my use case :S
"use strict";
const { PubSub } = require("#google-cloud/pubsub");
const pubsub = new PubSub();
const Buffer = require("safe-buffer").Buffer;
exports.helloGET = async (req, res) => {
var toPublish = `hello ${Date.now()}`;
publishMessage("_REPLACE_WITH_TOPIC_NAME_", toPublish);
res.send(`Published ${toPublish}`);
};
async function publishMessage(topicName, data) {
console.log("=> publishMessage, data = ", data);
const dataBuffer = Buffer.from(data);
const topic = pubsub.topic(topicName);
const publisher = topic.publisher();
publisher.publish(dataBuffer, { a: "XYZ" }, function() {
console.log("Published eventually ...");
});
}