Problems accessing storage from Lambda function in Amplify

What I want to do
I want to create a REST API that returns data from the DynamoDB table created by my GraphQL model.
What I've done
Create GraphQL model
type Public @model {
  id: ID!
  name: String!
}
Create REST API with Lambda Function with access to my PublicTable
$ amplify add api
? Please select from one of the below mentioned services: REST
? Provide a friendly name for your resource to be used as a label for this category in the project: rest
? Provide a path (e.g., /book/{isbn}): /items
? Choose a Lambda source Create a new Lambda function
? Provide an AWS Lambda function name: listPublic
? Choose the runtime that you want to use: NodeJS
? Choose the function template that you want to use: Hello World
Available advanced settings:
- Resource access permissions
- Scheduled recurring invocation
- Lambda layers configuration
? Do you want to configure advanced settings? Yes
? Do you want to access other resources in this project from your Lambda function? Yes
? Select the category storage
? Storage has 8 resources in this project. Select the one you would like your Lambda to access Public:@model(appsync)
? Select the operations you want to permit for Public:@model(appsync) create, read, update, delete
You can access the following resource attributes as environment variables from your Lambda function
API_MYPROJECT_GRAPHQLAPIIDOUTPUT
API_MYPROJECT_PUBLICTABLE_ARN
API_MYPROJECT_PUBLICTABLE_NAME
ENV
REGION
? Do you want to invoke this function on a recurring schedule? No
? Do you want to configure Lambda layers for this function? No
? Do you want to edit the local lambda function now? No
Successfully added resource listPublic locally.
Next steps:
Check out sample function code generated in <project-dir>/amplify/backend/function/listPublic/src
"amplify function build" builds all of your functions currently in the project
"amplify mock function <functionName>" runs your function locally
"amplify push" builds all of your local backend resources and provisions them in the cloud
"amplify publish" builds all of your local backend and front-end resources (if you added hosting category) and provisions them in the cloud
Successfully added the Lambda function locally
? Restrict API access No
? Do you want to add another path? No
Successfully added resource rest locally
Edit my Lambda function
/* Amplify Params - DO NOT EDIT
API_MYPROJECT_GRAPHQLAPIIDOUTPUT
API_MYPROJECT_PUBLICTABLE_ARN
API_MYPROJECT_PUBLICTABLE_NAME
ENV
REGION
Amplify Params - DO NOT EDIT */
const AWS = require("aws-sdk");
const region = process.env.REGION;
AWS.config.update({ region });
const docClient = new AWS.DynamoDB.DocumentClient();

const params = {
  TableName: "PublicTable"
};

async function listItems() {
  try {
    const data = await docClient.scan(params).promise();
    return data;
  } catch (err) {
    return err;
  }
}

exports.handler = async (event) => {
  try {
    const data = await listItems();
    return { body: JSON.stringify(data) };
  } catch (err) {
    return { error: err };
  }
};
Push my updates
$ amplify push
Open my REST API endpoint /items
{
"message": "User: arn:aws:sts::829736458236:assumed-role/myprojectLambdaRolef4f571b-dev/listPublic-dev is not authorized to perform: dynamodb:Scan on resource: arn:aws:dynamodb:us-east-1:8297345848236:table/Public-ssrh52tnjvcdrp5h7evy3zdldsd-dev",
"code": "AccessDeniedException",
"time": "2021-04-21T21:21:32.778Z",
"requestId": "JOA5KO3GVS3QG7RQ2V824NGFVV4KQNSO5AEMVJF66Q9ASUAAJG",
"statusCode": 400,
"retryable": false,
"retryDelay": 28.689093010346657
}
Problems
What did I do wrong?
How do I access my table, and why didn't I get access when I created the function?
Why are API_MYPROJECT_PUBLICTABLE_NAME and the other environment variables needed?

Solution
The problem turned out to be either the Node.js version or the amplify-cli version. After updating amplify-cli and installing Node 14.16.0, everything worked.
I also switched the TableName to the name Amplify actually creates for us, read from the generated environment variable; the real table is named Public-<apiId>-<env> (visible in the error's ARN), which is why the hardcoded "PublicTable" failed. Note that even this code did not work before the update. The code became:
/* Amplify Params - DO NOT EDIT
API_MYPROJECT_GRAPHQLAPIIDOUTPUT
API_MYPROJECT_PUBLICTABLE_ARN
API_MYPROJECT_PUBLICTABLE_NAME
ENV
REGION
Amplify Params - DO NOT EDIT */
const AWS = require("aws-sdk");
const region = process.env.REGION;
const tableName = process.env.API_MYPROJECT_PUBLICTABLE_NAME;
AWS.config.update({ region });
const docClient = new AWS.DynamoDB.DocumentClient();

const params = {
  TableName: tableName
};

async function listItems() {
  try {
    const data = await docClient.scan(params).promise();
    return data;
  } catch (err) {
    return err;
  }
}

exports.handler = async (event) => {
  try {
    const data = await listItems();
    return { body: JSON.stringify(data) };
  } catch (err) {
    return { error: err };
  }
};
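One hedged aside: Amplify's REST category fronts the function with an API Gateway proxy integration, and proxy integrations generally expect a statusCode in the response object. If you ever see 502 "malformed Lambda proxy response" errors, a fuller handler might look like the sketch below, reusing listItems from above; the CORS header is an assumption about browser clients:

exports.handler = async () => {
  try {
    const data = await listItems();
    return {
      statusCode: 200,
      headers: { "Access-Control-Allow-Origin": "*" }, // assumption: a browser client needs CORS
      body: JSON.stringify(data),
    };
  } catch (err) {
    return { statusCode: 500, body: JSON.stringify({ error: err.message }) };
  }
};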

Related

how to update secrets manager secret value from lambda function?

I have an access token retrieved from an API, and its lifetime is 5 hours. I saw that Secrets Manager can rotate a secret, but the minimum rotation time is 1 day. Because of this I thought I could write a Lambda function that runs every 5 hours, takes a new token from the API, and updates the secret with it. This answer https://stackoverflow.com/a/66610124/11902308 mentions that it can be done, but I couldn't find any resource on how to update a secret from a Lambda function.
Note:
I have multiple tasks, and a token becomes invalid when a new one is generated, so the tasks have to share it.
You didn't mention which AWS SDK you are using, but here is an example using the AWS SDK for JavaScript to create or update a secret.
Additionally, it handles the case of deleted secrets (Secrets Manager doesn't delete secrets immediately; it marks them as scheduled for deletion) by restoring and then updating them.
import { AWSError } from 'aws-sdk';
import SecretsManager, { SecretARNType } from 'aws-sdk/clients/secretsmanager';

const secretsManager = new SecretsManager({ apiVersion: '2017-10-17' });

const putSecret = async (name: string, secret: Record<string, unknown>, secretId?: SecretARNType): Promise<SecretARNType> => {
  try {
    const result = secretId
      ? secretsManager.updateSecret({
          SecretId: secretId,
          SecretString: JSON.stringify(secret),
        })
      : secretsManager.createSecret({
          Name: name,
          SecretString: JSON.stringify(secret),
        });
    const { ARN } = await result.promise();
    if (!ARN) throw new Error(`Error saving secret ${name}`);
    return ARN;
  } catch (error) {
    const awsError = error as AWSError;
    // InvalidRequestException indicates that the secret is scheduled for deletion
    if (awsError.code === 'InvalidRequestException') {
      // restore the existing secret
      await secretsManager
        .restoreSecret({
          SecretId: secretId || name,
        })
        .promise();
      // update the restored secret
      const { ARN } = await secretsManager
        .updateSecret({
          SecretId: secretId || name,
          SecretString: JSON.stringify(secret),
        })
        .promise();
      if (!ARN) throw new Error(`Error restoring secret ${name}`);
      return ARN;
    }
    throw error;
  }
};
The code is in TypeScript, but should be easily converted into the language of your choice because the AWS SDK functions have the same name in other languages.
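For illustration, a hypothetical caller from the scheduled Lambda could look like the following; fetchNewTokenFromApi is a placeholder for whatever call retrieves your fresh token, and the secret name is made up:

// runs on a 5-hour schedule (e.g. an EventBridge rule); both names below are placeholders
exports.handler = async () => {
  const token = await fetchNewTokenFromApi(); // hypothetical helper hitting your token API
  await putSecret('my-api-token', { token });
};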

AccessDeniedException while starting pipeline

I'm using AWS Lambda with the AWS CDK to start my pipeline:
const PipelinesParams = {
  name: "GatsbyPipelineLolly",
};
try {
  const pipeline = new AWS.CodePipeline();
  // docClient and params are defined elsewhere in my function
  await docClient.put(params).promise();
  pipeline.startPipelineExecution(
    PipelinesParams,
    function (err: any, data: any) {
      if (err) {
        console.log(err);
      } else {
        console.log(data);
      }
    }
  );
} catch (err) {
  console.log(err);
}
and these are the actions I authorized:
const policy = new PolicyStatement();
policy.addActions('s3:*');
policy.addResources('*');
policy.addActions('codepipeline:*');
I'm still getting an unauthorized error; an image is also attached for review.
Are you sure the policy is attached to the role with which you are deploying the pipeline?
It looks like you've created a policy but haven't attached it to the role you are using (from your error message). Please see:
https://docs.aws.amazon.com/cdk/api/latest/docs/aws-iam-readme.html#using-existing-roles and https://docs.aws.amazon.com/cdk/api/latest/docs/#aws-cdk_aws-iam.Policy.html#roles
const role = iam.Role.fromRoleArn(this, 'Role', 'arn:aws:iam...')
policy.attachToRole(role)
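If the function and pipeline live in the same CDK stack, another hedged option is to grant only the action that is needed directly on the function's execution role; fn and pipeline are placeholder names for your Function and Pipeline constructs:

// narrow grant: just StartPipelineExecution on this one pipeline
fn.addToRolePolicy(new iam.PolicyStatement({
  actions: ['codepipeline:StartPipelineExecution'],
  resources: [pipeline.pipelineArn],
}));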

How to return a YAML file as an output artifact from Lambda to CodePipeline

I am trying to return a YAML CloudFormation template file to CodePipeline so it can be passed to CloudFormation. The file is uploaded to S3, and can be accessed and read by Lambda just fine. The issue I have is turning this file into an output artifact.
I am currently able to output the raw text as an 'Output variable', which I've confirmed from the CodePipeline Execution History in the AWS Console. On the same page there is also an artifact link.
However, this link takes me to an error in S3, leading me to believe the artifact is never actually populated in any way.
This is the Lambda function I have at the moment:
var AWS = require('aws-sdk');
var s3 = new AWS.S3();
var codepipeline = new AWS.CodePipeline();

exports.handler = (event, context) => {
  // CodePipeline passes the job ID in the invoke event
  var jobId = event['CodePipeline.job'].id;
  var params = {
    Bucket: 'myBucket',
    Key: 'myFile.yml'
  };
  s3.getObject(params, function(err, data) {
    if (err) {
      // Handling
    } else {
      var yamlString = data.Body.toString('utf-8');
      var jobParams = {
        jobId: jobId,
        outputVariables: {
          s3Artifact: yamlString
        }
      };
      codepipeline.putJobSuccessResult(jobParams, function(err, data) {
        if (err) {
          context.fail(err);
        } else {
          context.succeed('Template forwarded');
        }
      });
    }
  });
};
What confuses me is that CodePipeline clearly allows me to specify an output artifact for this Lambda function, yet I cannot work out how to do so.
Any help is much appreciated.
Ultimately it seems I was going about this the wrong way. I changed the pipeline to have only an S3 source action and have it be triggered by a custom CloudWatch event for when a new image is pushed to ECR. I then manually disabled the S3 event created by CodePipeline so that only an ECR push triggers a deployment.
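For reference, the CloudWatch Events (EventBridge) pattern for a successful ECR push looks roughly like the following; the repository name is a placeholder:

// hypothetical event pattern for a rule that triggers the pipeline on an image push
const ecrPushPattern = {
  source: ['aws.ecr'],
  'detail-type': ['ECR Image Action'],
  detail: {
    'action-type': ['PUSH'],
    result: ['SUCCESS'],
    'repository-name': ['my-repo'],
  },
};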
See more here.

Cannot find module aws-amplify, lambda function

I'm trying to make an API POST request in my Lambda function in the AWS console using Node.js, but I cannot import API. Here is what I am trying:
console.log('Loading function');
const AWS = require('aws-sdk');
const translate = new AWS.Translate({ apiVersion: '2017-07-01' });
var API = require('aws-amplify');

exports.handler = async (event, context) => {
  try {
    const params = {
      SourceLanguageCode: 'en', /* required */
      TargetLanguageCode: 'es', /* required */
      Text: 'Hello World', /* required */
    };
    const data = await translate.translateText(params).promise();
    createSite(data.TranslatedText);
  } catch (err) {
    console.log(err, err.stack);
  }

  function createSite(site) {
    return API.post("sites", "/sites", {
      body: site
    });
  }
};
I have also tried import...
I think you may be looking at front-end, browser-based JavaScript examples, which aren't always going to work in a back-end AWS Lambda Node.js runtime environment. It appears you are trying to use this library, which states it is "a JavaScript library for frontend and mobile developers"; that probably isn't what you want to use on AWS Lambda. It also appears you did not include that library in your AWS Lambda function's deployment package.
I suggest using the AWS Amplify client in the AWS SDK for Node.js, which is automatically included in your Lambda function's runtime environment. You would create an Amplify client like so:
var amplify = new AWS.Amplify();
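If the goal is simply to POST to the REST endpoint from inside Lambda, one dependency-free alternative is Node's built-in https module. A rough sketch, where the invoke URL and payload shape are assumptions:

const https = require('https');

// hypothetical API Gateway invoke URL; replace with your own
function createSite(site) {
  const body = JSON.stringify({ body: site });
  return new Promise((resolve, reject) => {
    const req = https.request(
      'https://abc123.execute-api.us-east-1.amazonaws.com/dev/sites',
      { method: 'POST', headers: { 'Content-Type': 'application/json' } },
      (res) => {
        let data = '';
        res.on('data', (chunk) => (data += chunk));
        res.on('end', () => resolve(data));
      }
    );
    req.on('error', reject);
    req.end(body);
  });
}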

NodeJS script works locally but not in the handler of lambda

I have a NodeJS Lambda function that reads from a Kinesis stream, does some processing, and writes the result to another Kinesis stream. The writing part is causing an error on Lambda. We use the aws-sdk npm module to write the data. When we call putRecord no callback is fired and Lambda times out.
This writing code works well when run locally as a node app on a laptop.
Local config:
- The credentials of a programmatic user with the Kinesis Full Access policy are stored in the credentials file, and putRecord runs successfully, returning the shard ID, sequence number, etc.
Lambda function:
- The code for putRecord does not return and it hangs. No errors are thrown either. Lambda has a role with the Kinesis Full Access policy.
Code:
var AWS = require('aws-sdk');
var kinesis = new AWS.Kinesis({
  region: 'us-east-1',
});

var randomNumber = Math.floor(Math.random() * 100000);
var data = 'data-' + randomNumber;
var partitionKey = 'pk-' + randomNumber;
var recordParams = {
  Data: data,
  PartitionKey: partitionKey,
  StreamName: streamName
};

kinesis.putRecord(recordParams, function(err, data) {
  console.log(data);
  if (err) {
    console.error(err);
  }
});
Any idea what could be causing the issue? VPC or security group related, maybe?
Thoughts and suggestions appreciated.
Thanks.
If you have uploaded the exact Node.js script above to Lambda, it will definitely not work.
Lambda requires you to export a handler function that it will call.
So, your script should be written like this if you want it to be a Lambda function...
'use strict';
var AWS = require('aws-sdk');
var kinesis = new AWS.Kinesis({
  region: 'us-east-1',
});

exports.handler = function (event, context, callback) {
  var randomNumber = Math.floor(Math.random() * 100000);
  var data = 'data-' + randomNumber;
  var partitionKey = 'pk-' + randomNumber;
  var recordParams = {
    Data: data,
    PartitionKey: partitionKey,
    StreamName: streamName,
  };
  kinesis.putRecord(recordParams, function (err, data) {
    // check err first so the callback fires exactly once
    if (err) {
      callback(err);
    } else {
      callback(null, data);
    }
  });
};
Take note that instead of using console.log or console.error, you should call callback instead.
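Alternatively, on a Node.js 8.10+ runtime you can skip the callback entirely with an async handler and the SDK's .promise() helper; a minimal sketch reusing kinesis and recordParams from above:

exports.handler = async () => {
  // resolves with the PutRecord result (ShardId, SequenceNumber, ...)
  return kinesis.putRecord(recordParams).promise();
};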
When you add a function to a VPC, it only has access to resources inside that VPC. Any attempt to access resources outside the VPC will hang and eventually time out. Since Kinesis doesn't exist inside your VPC, you can't access it.
The fix is either to run the Lambda function outside the VPC or to add a NAT Gateway to your VPC.
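If the function must stay in the VPC, here is a hedged CDK-style sketch of the NAT setup (construct names and the asset path are illustrative; the same can be done in the console):

const ec2 = require('@aws-cdk/aws-ec2');
const lambda = require('@aws-cdk/aws-lambda');

// a VPC with one NAT gateway; private subnets get a default route through it
const vpc = new ec2.Vpc(this, 'FnVpc', { natGateways: 1 });

new lambda.Function(this, 'KinesisWriter', {
  runtime: lambda.Runtime.NODEJS_14_X,
  handler: 'index.handler',
  code: lambda.Code.fromAsset('src'), // assumes the handler code lives in ./src
  vpc, // the function's ENIs land in the VPC's private subnets
});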