How to use one API Gateway with multiple Lambdas - amazon-web-services

I have two Lambda functions.
Now I want to use one API for both of them.
My code looks like this:
const api = new apigateway.RestApi(this, 'ServerlessRestApi', {
  restApiName: `AWSCDKTest-${systemEnv}`,
  cloudWatchRole: false
});
api.root.addMethod('GET', new apigateway.LambdaIntegration(helloLambda));
api.root.addMethod('GET', new apigateway.LambdaIntegration(happyLambda));
However, it fails because GET is defined twice on the same resource.
So I made two APIs:
const api = new apigateway.RestApi(this, 'ServerlessRestApi_hello', {
  restApiName: `AWSCDK-Viral-${systemEnv}`,
  cloudWatchRole: false
});
api.root.addMethod('GET', new apigateway.LambdaIntegration(helloLambda));

const api2 = new apigateway.RestApi(this, 'ServerlessRestApi_happy', { cloudWatchRole: false });
api2.root.addMethod('GET', new apigateway.LambdaIntegration(happyLambda));
It works, but it creates two APIs.
What is the best practice for using one API with two Lambdas?

API root:
GET "https://example_api_endpoint/" invokes helloLambda; inside that function, call the AWS SDK Lambda invoke() method to trigger the execution of happyLambda within the same invocation. Otherwise, you cannot attach two Lambda functions to the same API resource and method. Further reading: AWS Lambda fan-out pattern.
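A minimal sketch of that approach, assuming helloLambda has the lambda:InvokeFunction permission on happyLambda and that happyLambda's name is passed in via an environment variable (the variable name HAPPY_LAMBDA_NAME is an assumption, not from the original post):

// helloLambda handler sketch (AWS SDK v2 style)
const AWS = require('aws-sdk');
const lambda = new AWS.Lambda();

exports.handler = async (event) => {
  // Invoke happyLambda synchronously and wait for its result.
  const result = await lambda.invoke({
    FunctionName: process.env.HAPPY_LAMBDA_NAME, // assumed environment variable
    InvocationType: 'RequestResponse',
    Payload: JSON.stringify(event),
  }).promise();

  return {
    statusCode: 200,
    headers: {},
    body: `helloLambda called happyLambda: ${result.Payload}`,
  };
};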

A single API path + HTTP verb combination cannot be linked to more than one Lambda. Here are some alternatives:
a) use different API paths, each calling a different Lambda
b) have one Lambda triggered by API Gateway, which triggers the other Lambda you need from within its code
c) have the Lambda integration for the endpoint send a message to SNS, and have multiple Lambdas subscribe to the SNS topic via SQS (a sketch follows below)
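A rough CDK sketch of option (c), assuming publisherLambda (the function behind the API Gateway endpoint), helloLambda and happyLambda already exist as lambda.Function objects; all construct and variable names here are assumptions for illustration:

import { Construct } from 'constructs';
import * as sns from 'aws-cdk-lib/aws-sns';
import * as sqs from 'aws-cdk-lib/aws-sqs';
import * as subs from 'aws-cdk-lib/aws-sns-subscriptions';
import * as lambda from 'aws-cdk-lib/aws-lambda';
import { SqsEventSource } from 'aws-cdk-lib/aws-lambda-event-sources';

// Assumed to exist already in the surrounding stack/construct.
declare const scope: Construct;
declare const publisherLambda: lambda.Function;
declare const helloLambda: lambda.Function;
declare const happyLambda: lambda.Function;

const topic = new sns.Topic(scope, 'FanoutTopic');
topic.grantPublish(publisherLambda); // the API-integrated Lambda publishes here

// One queue per consumer, each subscribed to the topic.
const helloQueue = new sqs.Queue(scope, 'HelloQueue');
const happyQueue = new sqs.Queue(scope, 'HappyQueue');
topic.addSubscription(new subs.SqsSubscription(helloQueue));
topic.addSubscription(new subs.SqsSubscription(happyQueue));

// Each Lambda consumes messages from its own queue.
helloLambda.addEventSource(new SqsEventSource(helloQueue));
happyLambda.addEventSource(new SqsEventSource(happyQueue));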

Try this:
this.api = new RestApi(this, 'ServerlessRestApi', {
  restApiName: "myapi",
  cloudWatchRole: false
});
this.api.root.addResource(resource).addMethod('GET', new LambdaIntegration(func));
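Applied to the question's two functions, that could look like the following (the resource names 'hello' and 'happy' are just examples):

this.api.root.addResource('hello').addMethod('GET', new LambdaIntegration(helloLambda));
this.api.root.addResource('happy').addMethod('GET', new LambdaIntegration(happyLambda));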

Try this:
// ./helloService.helloLambda.js file sample
export const handler = async (event, context) => {
  return {
    statusCode: 200,
    headers: {},
    body: 'helloLambda',
  };
};

// ./helloService.happyLambda.js file sample
export const handler = async (event, context) => {
  return {
    statusCode: 200,
    headers: {},
    body: 'happyLambda',
  };
};

// ./helloService.ts file sample (CDK construct)
import { Construct } from 'constructs';
import { NodejsFunction } from 'aws-cdk-lib/aws-lambda-nodejs';
import * as apigateway from 'aws-cdk-lib/aws-apigateway';

export class HelloService extends Construct {
  constructor(scope: Construct, id: string) {
    super(scope, id);

    const helloLambdaFunction = new NodejsFunction(this, 'helloLambda');
    const happyLambdaFunction = new NodejsFunction(this, 'happyLambda');

    const api = new apigateway.RestApi(this, 'hello-api', {
      description: 'This service is Happy.',
    });

    // path: https://{createdId}.execute-api.{region}.amazonaws.com/prod/helloLambda
    const helloLambdaPath = api.root.addResource('helloLambda');
    helloLambdaPath.addMethod('GET', new apigateway.LambdaIntegration(helloLambdaFunction));

    // path: https://{createdId}.execute-api.{region}.amazonaws.com/prod/happyLambda
    const happyLambdaPath = api.root.addResource('happyLambda');
    happyLambdaPath.addMethod('GET', new apigateway.LambdaIntegration(happyLambdaFunction));
  }
}

Related

Cloud Functions / Cloud Tasks UNAUTHENTICATED error

I am trying to get a Cloud Function to create a Cloud Task that will invoke a Cloud Function. Easy.
The flow and use case are very close to the official tutorial here.
I also looked at this article by Doug Stevenson and in particular its security section.
No luck, I am consistently getting a 16 (UNAUTHENTICATED) error in Cloud Task.
If I can trust what I see in the console it seems that Cloud Task is not attaching the OIDC token to the request:
Yet, in my code I do have the oidcToken object:
const { v2beta3, protos } = require("@google-cloud/tasks");
import {
  PROJECT_ID,
  EMAIL_QUEUE,
  LOCATION,
  EMAIL_SERVICE_ACCOUNT,
  EMAIL_HANDLER,
} from "./../config/cloudFunctions";

export const createHttpTaskWithToken = async function (
  payload: {
    to_email: string;
    templateId: string;
    uid: string;
    dynamicData?: Record<string, any>;
  },
  {
    project = PROJECT_ID,
    queue = EMAIL_QUEUE,
    location = LOCATION,
    url = EMAIL_HANDLER,
    email = EMAIL_SERVICE_ACCOUNT,
  } = {}
) {
  const client = new v2beta3.CloudTasksClient();
  const parent = client.queuePath(project, location, queue);

  // Convert message to buffer.
  const convertedPayload = JSON.stringify(payload);
  const body = Buffer.from(convertedPayload).toString("base64");

  const task = {
    httpRequest: {
      httpMethod: protos.google.cloud.tasks.v2.HttpMethod.POST,
      url,
      oidcToken: {
        serviceAccountEmail: email,
        audience: new URL(url).origin,
      },
      headers: {
        "Content-Type": "application/json",
      },
      body,
    },
  };

  try {
    // Send create task request.
    const request = { parent: parent, task: task };
    const [response] = await client.createTask(request);
    console.log(`Created task ${response.name}`);
    return response.name;
  } catch (error) {
    if (error instanceof Error) console.error(Error(error.message));
    return;
  }
};
When logging the task object from the code above in Cloud Logging I can see that the service account is the one that I created for the purpose of this and that the Cloud Tasks are successfully created.
IAM:
And the function that the Cloud Task needs to invoke:
Everything seems to be there, in theory.
Any advice as to what I would be missing?
Thanks,
Your audience is incorrect. It must end with the function name; here you only have the region and the project: https://<region>-<projectID>.cloudfunctions.net/. Use the full Cloud Functions URL, including the function name, as the audience.
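Concretely, assuming the url passed in (EMAIL_HANDLER) already holds the full Cloud Functions URL, the oidcToken block in the task above would become something like:

oidcToken: {
  serviceAccountEmail: email,
  // The audience must be the full function URL ending with the function name,
  // e.g. https://<region>-<projectID>.cloudfunctions.net/<functionName>,
  // not just new URL(url).origin.
  audience: url,
},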

Issues while trying to invoke lambda, from another lambda (ending up Malformed proxy response on APIG)

I am trying to invoke Lambda B from another Lambda A. The call to Lambda A is triggered via an API Gateway endpoint. Using curl, the call is made as below:
curl "$#" -L --cookie ~/.midway/cookie --cookie-jar ~/.midway/cookie -X GET -H "Content-Type: application/json" -s https://us-west-2.beta.api.ihmsignage.jihmcdo.com/api/getSignInstances
The above invokes Lambda A, which handles the request and calls the main handler. Logic for the main handler:
const main = (event: any, context: any, lambdaCallback: Function) => {
  console.log(JSON.stringify(event, null, 2));
  console.log(JSON.stringify(process.env, null, 2));
  if (event.path.startsWith('/getUserInfo')) {
    const alias = event.headers['X-FORWARDED-USER'];
    const userData = JSON.stringify({ alias });
    console.info('UserData: ', userData);
    return sendResponse(200, userData, lambdaCallback); // This works perfectly fine with API Gateway returning a proper response
  } else if (event.path.startsWith('/api')) {
    console.info('Invoke lambda initiate');
    invokeLambda(event, context, lambdaCallback); // This somehow invokes lambda B twice
  } else {
    return sendResponse(404, '{"message": "Resource not found"}', lambdaCallback);
  }
};
There is also a wrapper to make sure a proper response is sent back to API Gateway:
export const handler = (event: any, context: any, lambdaCallback: Function) => {
  const wrappedCallback = (error: any, success: any) => {
    success.headers['Access-Control-Allow-Origin'] = getAllowedOrigin(event);
    success.headers['Access-Control-Allow-Credentials'] = true;
    success.headers['Access-Control-Allow-Methods'] = 'GET,PUT,DELETE,HEAD,POST,OPTIONS';
    success.headers['Access-Control-Allow-Headers'] =
      'Content-Type,X-Amz-Date,Authorization,X-Api-Key,X-Amz-Security-Token,Access-Control-Allow-Origin,Access-Control-Allow-Methods,X-PINGOVER';
    success.headers['Vary'] = 'Accept-Encoding, Origin';
    console.info('Logging success --', success);
    return lambdaCallback(error, success);
  };
  // Append headers
  return main(event, context, wrappedCallback);
};
And finally, this is how Lambda B is invoked from within Lambda A:
const invokeLambda = async (event: any, context: any, lambdaCallback: Function) => {
  context.callbackWaitsForEmptyEventLoop = false;
  if (!process.env.INVOKE_ARN) {
    console.error('Missing environment variable INVOKE_ARN');
    return sendResponse(500, '{"message":"internal server error"}', lambdaCallback);
  }
  const params = {
    FunctionName: process.env.INVOKE_ARN,
    InvocationType: 'RequestResponse',
    Payload: JSON.stringify(event),
  };
  event.headers = event.headers || [];
  const username = event.headers['X-FORWARDED-USER'];
  const token = event.headers['X-CLIENT-VERIFY'];
  if (!username || !token) {
    console.log('No username or token was found');
    return sendResponse(401, '{"message":"You shall not pass"}', lambdaCallback);
  }
  try {
    const data = await lambda.invoke(params).promise();
    console.info('Got Request router lambda data: ', data);
    const invocationResponse = data?.Payload;
    console.info('Got invocationResponse: ', invocationResponse);
    return lambdaCallback(null, JSON.parse(invocationResponse as string));
  } catch (err) {
    console.error('Error while running starlet: ', err);
    return sendResponse(500, '{"message":"internal server error"}', lambdaCallback);
  }
};
Lambda B:
const main = async (event: any = {}) => {
  // Log details
  console.log('Request router lambda invoked');
  console.log(JSON.stringify(event, null, 2));
  return {
    statusCode: 200,
    body: JSON.stringify({ message: 'Hello from RequestRouter Lambda!' }),
    headers: {
      'Content-Type': 'application/json',
    },
    isBase64Encoded: false,
  };
};
export const handler = main;
All of the above works fine (no error logs in CloudWatch for either Lambda); however, it seems that Lambda A's handler is invoked but never actually invokes Lambda B's handler, and the response ultimately returned to API Gateway does not have the proper headers.
Any pointers are highly appreciated!! Thank you :)
AWS recommends that you don't orchestrate your lambda functions in the code (one function calling another function).
For that use case, you can use AWS Step Functions.
You can create a state machine, define API Gateway as the trigger, and pass the result from one Lambda function to the next Lambda function.
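A rough CDK sketch of that setup, assuming two existing Lambda functions are passed in; every construct and property name here is an assumption for illustration, and it assumes a recent aws-cdk-lib v2 (for DefinitionBody and StepFunctionsRestApi):

import { Construct } from 'constructs';
import * as sfn from 'aws-cdk-lib/aws-stepfunctions';
import * as tasks from 'aws-cdk-lib/aws-stepfunctions-tasks';
import * as apigateway from 'aws-cdk-lib/aws-apigateway';
import * as lambda from 'aws-cdk-lib/aws-lambda';

export interface RouterApiProps {
  lambdaA: lambda.IFunction;
  lambdaB: lambda.IFunction;
}

export class RouterApi extends Construct {
  constructor(scope: Construct, id: string, props: RouterApiProps) {
    super(scope, id);

    // Invoke Lambda A, then feed its output into Lambda B.
    const invokeA = new tasks.LambdaInvoke(this, 'InvokeLambdaA', {
      lambdaFunction: props.lambdaA,
      outputPath: '$.Payload',
    });
    const invokeB = new tasks.LambdaInvoke(this, 'InvokeLambdaB', {
      lambdaFunction: props.lambdaB,
      outputPath: '$.Payload',
    });

    const stateMachine = new sfn.StateMachine(this, 'RouterStateMachine', {
      definitionBody: sfn.DefinitionBody.fromChainable(invokeA.next(invokeB)),
      // EXPRESS is required for the synchronous API Gateway integration below.
      stateMachineType: sfn.StateMachineType.EXPRESS,
    });

    // REST API that starts a synchronous execution of the state machine per request.
    new apigateway.StepFunctionsRestApi(this, 'StepFunctionsApi', { stateMachine });
  }
}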

Cache async API result outside of AWS Lambda execution handler

I was looking at using the SSM Parameter Store SDK to grab secrets for a Lambda function. I'm also using Epsagon to wrap the async handler. Epsagon needs to be initialized with some secrets and is then used to wrap the handler function:
import * as epsagon from 'epsagon'

epsagon.init({
  token: EPSAGON_ACCOUNT_TOKEN,
})

export const lambdaHandler = epsagon.lambdaWrapper(async (event) => {
  // do stuff
})
I started using aws-parameter-cache to grab config values from SSM Parameter Store, but since they are resolved with an API call, it takes an await to get the values fully resolved.
import { ssmParameter } from 'aws-parameter-cache'
const param = ssmParameter({ name: 'foo' })
const value = await param.value; // <-- can only be done inside an async function (nodejs12)
Since we don't yet have top level await in nodejs12, is there a way to resolve the variables outside of the handler function? Is it possible to wait for the API call for await param.value to finish so that I can initialize epsagon with a value stored in SSM Param Store?
import * as epsagon from 'epsagon'
import { ssmParameter } from 'aws-parameter-cache'

const ssmParam = ssmParameter({ name: 'epsagonToken' })
const epsagonToken = await ssmParam.value // fails since outside of async func

epsagon.init({
  token: epsagonToken,
})

export const lambdaHandler = epsagon.lambdaWrapper(async (event) => {
  const epsagonToken = await ssmParam.value // works here but too late
})
Would this "just work" in nodejs 14.3.0 with top-level await? Custom runtime?
Or maybe some form of never-rejecting top-level async function, like in the top answer to this: how-can-i-use-async-await-at-the-top-level?
Need the handler to be the callback to the top-level async function--from what I've read this is essentially how top-level async works in 14.3. Looking for way to store all secrets in SSM Param store and reduce cf template ENV variable mappings.
Basically, there is no easy way to do top-level await in this case, but there are some simple workarounds. For example, here is another wrapper that you can use to initialize Epsagon:
import * as epsagon from 'epsagon'
import { ssmParameter } from 'aws-parameter-cache'

const ssmParam = ssmParameter({ name: 'epsagonToken' })

const withEpsagon = (wrapped) => {
  let epsagonInited = false
  const epsagonizedFunction = epsagon.lambdaWrapper(wrapped)
  return async (event, context, callback) => {
    if (!epsagonInited) {
      const epsagonToken = await ssmParam.value
      epsagon.init({
        token: epsagonToken,
      })
      epsagonInited = true
    }
    return epsagonizedFunction(event, context, callback)
  }
}

export const lambdaHandler = withEpsagon(async (event) => {
  // your code here
})
This code resolves the SSM parameter on its first execution (right after a cold start, which is time you would have to spend on the cold start anyway) and remembers that Epsagon has already been initialized, so you don't waste time on every subsequent invocation of the Lambda.

Step Function Triggered In a loop

I am starting a Step Function from a Lambda, and the Lambda function is tied to an API Gateway. For some reason, when I try to test the Lambda function, I see hundreds of executions failing and running in a loop, even though I triggered the Step Function only once. I must be missing something here. Can you please advise?
const AWS = require("aws-sdk");
const uuidv4 = require("uuid/v4");

/*----------------------------------------------------------------------- */
/* Implementation                                                          */
/*----------------------------------------------------------------------- */
exports.handler = async event => {
  var _dt = await ExecuteStepFunction();
  return _dt;
}

function ExecuteStepFunction() {
  const stepFunctions = new AWS.StepFunctions();
  return new Promise((res, rej) => {
    var params = {
      stateMachineArn: 'arn:aws:states:us-east-1:xxxxxxxxxxxxx:stateMachine:xxTestSateMachine',
      input: JSON.stringify(''),
      name: uuidv4()
    };
    stepFunctions.startExecution(params, function (err, data) {
      if (err) {
        rej(err);
      } else {
        res(data);
      }
    });
  });
}
I also tried the approach from this link (https://docs.aws.amazon.com/step-functions/latest/dg/tutorial-api-gateway.html), where the API Gateway directly triggers the Step Function, but I received the following error. After trying to fix this, I moved to the above option of starting the execution from the Lambda.
{
  "__type": "com.amazon.coral.service#UnrecognizedClientException",
  "message": "The security token included in the request is invalid"
}

Connect AWS mobile backend to DynamoDB

I am trying to use the AWS mobile backend (using a Lambda function) to insert into DynamoDB (also configured in the mobile backend), but with no success so far.
The relevant code:
'use strict';
console.log("Loading function");

const AWS = require('aws-sdk');
const docClient = new AWS.DynamoDB.DocumentClient({ region: process.env.MOBILE_HUB_PROJECT_REGION });

exports.handler = function(event, context, callback) {
  var responseCode = 200;
  var requestBody, pathParams, queryStringParams, headerParams, stage,
      stageVariables, cognitoIdentityId, httpMethod, sourceIp, userAgent,
      requestId, resourcePath;

  console.log("request: " + JSON.stringify(event));

  // Request Body
  requestBody = event.body;
  if (requestBody !== undefined && requestBody !== null) {
    // Set 'test-status' field in the request to test sending a specific response status code (e.g., 503)
    responseCode = JSON.parse(requestBody)['test-status'];
  }

  // Path Parameters
  pathParams = event.path;
  // Query String Parameters
  queryStringParams = event.queryStringParameters;
  // Header Parameters
  headerParams = event.headers;

  if (event.requestContext !== null && event.requestContext !== undefined) {
    var requestContext = event.requestContext;
    // API Gateway Stage
    stage = requestContext.stage;
    // Unique Request ID
    requestId = requestContext.requestId;
    // Resource Path
    resourcePath = requestContext.resourcePath;
    var identity = requestContext.identity;
    // Amazon Cognito User Identity
    cognitoIdentityId = identity.cognitoIdentityId;
    // Source IP
    sourceIp = identity.sourceIp;
    // User-Agent
    userAgent = identity.userAgent;
  }

  // API Gateway Stage Variables
  stageVariables = event.stageVariables;
  // HTTP Method (e.g., POST, GET, HEAD)
  httpMethod = event.httpMethod;

  // TODO: Put your application logic here...
  let params = {
    Item: {
      "prop1": 0,
      "prop2": "text"
    },
    TableName: "testTable"
  };
  docClient.put(params, function(data, err) {
    if (err)
      responseCode = 500;
    else {
      responseCode = 200;
      context.succeed(data);
    }
  });

  // For demonstration purposes, we'll just echo these values back to the client
  var responseBody = {
    requestBody: requestBody,
    pathParams: pathParams,
    queryStringParams: queryStringParams,
    headerParams: headerParams,
    stage: stage,
    stageVariables: stageVariables,
    cognitoIdentityId: cognitoIdentityId,
    httpMethod: httpMethod,
    sourceIp: sourceIp,
    userAgent: userAgent,
    requestId: requestId,
    resourcePath: resourcePath
  };
  var response = {
    statusCode: responseCode,
    headers: {
      "x-custom-header": "custom header value"
    },
    body: JSON.stringify(responseBody)
  };
  console.log("response: " + JSON.stringify(response));
  context.succeed(response);
};
This doesn't put the item into the table for some reason.
I gave the necessary permissions using the roles part; is there anything I am missing?
**responseCode is only for testing purposes.
Edit:
I tried "AWS node.js lambda request dynamodb but no response (no err, no return data)", but that doesn't work either.
Edit 2:
Added the full handler code (it is the default generated code when creating a first AWS Lambda).
I have refactored some bits of your code to look much simpler and use async/await (make sure to select Node 8.10 as the runtime for your function) instead of callbacks. I also got rid of the context and callback parameters, as they were used for older versions of Node.js; once you're on Node 8+, async/await should be the default option.
Also, it is possible to chain a .promise() onto docClient.put, so you can easily await on it, making your code way simpler. I have left only the DynamoDB part (which is what is relevant to your question):
'use strict';
console.log("Loading function");

const AWS = require('aws-sdk');
const docClient = new AWS.DynamoDB.DocumentClient({ region: process.env.MOBILE_HUB_PROJECT_REGION });

exports.handler = async (event) => {
  let params = {
    Item: {
      "prop0": 1,
      "prop2": "text"
    },
    TableName: "testTable"
  };
  try {
    await docClient.put(params).promise();
  } catch (e) {
    console.log(e);
    return {
      message: e.message
    };
  }
  return { message: 'Data inserted successfully' };
};
Things to keep in mind if it still does not work:
Make sure your Lambda function has the right permissions to insert items into DynamoDB (AmazonDynamoDBFullAccess will do it).
You ALWAYS have to provide the partition key when inserting items into DynamoDB. In your example, the JSON only has two properties: prop1 and prop2. If neither of them is the partition key, your code will certainly fail (see the sketch after this list).
Make sure your table also exists.
If your code fails, just check the CloudWatch logs, as any exception is now captured and printed out on the console.
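As a sketch of the partition-key point above: if testTable's partition key were, say, id (an assumed name; use whatever key the table was actually created with), the Item would need to include it:

let params = {
  Item: {
    "id": "some-unique-id", // partition key (assumed to be named "id" here)
    "prop1": 0,
    "prop2": "text"
  },
  TableName: "testTable"
};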
The reason why no data is written to the table is that the call to DynamoDB put is asynchronous and returns by calling your callback. But during that time, the rest of the code continues to execute, and your function eventually finishes before the call to DynamoDB has a chance to complete.
You can use the async / await keywords to make your code synchronous:
async function writeToDynamoDB(params) {
  return new Promise((resolve, reject) => {
    docClient.put(params, function(err, data) {
      if (err)
        reject(500);
      else
        resolve(data);
    });
  });
}

let params = ...
var data = await writeToDynamoDB(params);
You can find sample code I wrote (in Typescript) at https://github.com/sebsto/maxi80-alexa/blob/master/lambda/src/DDBController.ts