I'm working on an AWS Lambda function using Node.js 12.x. I have the accountId that I pulled from event.requestContext. Is there a way to get the name of the account from the accountId inside the Lambda function?
You can list the account aliases using the ListAccountAliases API call. Here is the corresponding example from the documentation:
// Load the AWS SDK for Node.js
var AWS = require('aws-sdk');
// Set the region
AWS.config.update({region: 'REGION'});
// Create the IAM service object
var iam = new AWS.IAM({apiVersion: '2010-05-08'});
iam.listAccountAliases({MaxItems: 10}, function(err, data) {
  if (err) {
    console.log("Error", err);
  } else {
    console.log("Success", data);
  }
});
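If you want to do this inside a Lambda handler, the same call can be awaited via the SDK's .promise() helper; the function's execution role needs the iam:ListAccountAliases permission, and the call returns the alias of the account the function's credentials belong to. A minimal sketch, with the handler shape being an assumption rather than part of the original example:
// Load the AWS SDK for Node.js
const AWS = require('aws-sdk');
const iam = new AWS.IAM({ apiVersion: '2010-05-08' });

exports.handler = async (event) => {
  // AccountAliases is an array; it may be empty if no alias has been configured
  const { AccountAliases } = await iam.listAccountAliases({ MaxItems: 10 }).promise();
  const alias = AccountAliases[0] || 'no alias configured';
  console.log('Account alias:', alias);
  return alias;
};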
I have an access token retrieved from an API, and its lifetime is 5 hours. I saw that Secrets Manager can rotate a secret, but the minimum rotation interval is 1 day. Because of this, I thought I could write a Lambda function that runs every 5 hours, takes a new token from the API, and updates the secret with it. This answer https://stackoverflow.com/a/66610124/11902308 mentions that it can be done, but I couldn't find any resource about how to update a secret from a Lambda function.
Note:
I have multiple tasks, and a token becomes invalid when a new one is generated, so the tasks have to share it.
You didn't mention which AWS SDK you are using, but here is an example for the AWS SDK for JavaScript that creates or updates a secret.
Additionally, it handles the case of deleted secrets (Secrets Manager doesn't delete secrets immediately; it marks them as scheduled for deletion) by restoring and then updating them.
import { AWSError } from 'aws-sdk';
import SecretsManager, { SecretARNType } from 'aws-sdk/clients/secretsmanager';

const secretsManager = new SecretsManager({ apiVersion: '2017-10-17' });

const putSecret = async (name: string, secret: Record<string, unknown>, secretId?: SecretARNType): Promise<SecretARNType> => {
  try {
    // Update the secret if we already know its ARN, otherwise create it
    const result = secretId
      ? secretsManager.updateSecret({
          SecretId: secretId,
          SecretString: JSON.stringify(secret),
        })
      : secretsManager.createSecret({
          Name: name,
          SecretString: JSON.stringify(secret),
        });
    const { ARN } = await result.promise();
    if (!ARN) throw new Error(`Error saving secret ${name}`);
    return ARN;
  } catch (error) {
    const awsError = error as AWSError;
    // InvalidRequestException indicates that the secret is scheduled for deletion
    if (awsError.code === 'InvalidRequestException') {
      // restore the existing secret
      await secretsManager
        .restoreSecret({
          SecretId: secretId || name,
        })
        .promise();
      // update the restored secret
      const { ARN } = await secretsManager
        .updateSecret({
          SecretId: secretId || name,
          SecretString: JSON.stringify(secret),
        })
        .promise();
      if (!ARN) throw new Error(`Error restoring secret ${name}`);
      return ARN;
    }
    throw error;
  }
};
The code is in TypeScript, but it should be easy to convert into the language of your choice because the AWS SDK functions have the same names in other languages.
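To tie this back to the question: the function can be triggered on a schedule (for example an EventBridge rule with a rate(5 hours) expression) and simply write the fresh token into the secret. Here is a minimal JavaScript sketch, where fetchToken and the TOKEN_SECRET_NAME environment variable are hypothetical placeholders for your API call and configuration:
const AWS = require('aws-sdk');

const secretsManager = new AWS.SecretsManager({ apiVersion: '2017-10-17' });

// Hypothetical helper that calls your API and returns a fresh access token
const fetchToken = async () => {
  // ... call the token endpoint here ...
  return 'new-access-token';
};

exports.handler = async () => {
  const token = await fetchToken();

  // putSecretValue writes a new version of an existing secret;
  // the secret itself must already exist
  await secretsManager
    .putSecretValue({
      SecretId: process.env.TOKEN_SECRET_NAME, // assumed: secret name passed via environment variable
      SecretString: JSON.stringify({ token }),
    })
    .promise();
};
Tasks that need the token can then read the latest version with getSecretValue, so they all share the same one.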
What I want to do
I want to create a REST API that returns data from my DynamoDB table, which is created by a GraphQL model.
What I've done
Create GraphQL model
type Public @model {
  id: ID!
  name: String!
}
Create a REST API with a Lambda function that has access to my Public table
$ amplify add api
? Please select from one of the below mentioned services: REST
? Provide a friendly name for your resource to be used as a label for this category in the project: rest
? Provide a path (e.g., /book/{isbn}): /items
? Choose a Lambda source Create a new Lambda function
? Provide an AWS Lambda function name: listPublic
? Choose the runtime that you want to use: NodeJS
? Choose the function template that you want to use: Hello World
Available advanced settings:
- Resource access permissions
- Scheduled recurring invocation
- Lambda layers configuration
? Do you want to configure advanced settings? Yes
? Do you want to access other resources in this project from your Lambda function? Yes
? Select the category storage
? Storage has 8 resources in this project. Select the one you would like your Lambda to access Public:#model(appsync)
? Select the operations you want to permit for Public:#model(appsync) create, read, update, delete
You can access the following resource attributes as environment variables from your Lambda function
API_MYPROJECT_GRAPHQLAPIIDOUTPUT
API_MYPROJECT_PUBLICTABLE_ARN
API_MYPROJECT_PUBLICTABLE_NAME
ENV
REGION
? Do you want to invoke this function on a recurring schedule? No
? Do you want to configure Lambda layers for this function? No
? Do you want to edit the local lambda function now? No
Successfully added resource listPublic locally.
Next steps:
Check out sample function code generated in <project-dir>/amplify/backend/function/listPublic/src
"amplify function build" builds all of your functions currently in the project
"amplify mock function <functionName>" runs your function locally
"amplify push" builds all of your local backend resources and provisions them in the cloud
"amplify publish" builds all of your local backend and front-end resources (if you added hosting category) and provisions them in the cloud
Succesfully added the Lambda function locally
? Restrict API access No
? Do you want to add another path? No
Successfully added resource rest locally
Edit my Lambda function
/* Amplify Params - DO NOT EDIT
API_MYPROJECT_GRAPHQLAPIIDOUTPUT
API_MYPROJECT_PUBLICTABLE_ARN
API_MYPROJECT_PUBLICTABLE_NAME
ENV
REGION
Amplify Params - DO NOT EDIT */
const AWS = require("aws-sdk");
const region = process.env.REGION
AWS.config.update({ region });
const docClient = new AWS.DynamoDB.DocumentClient();
const params = {
  TableName: "PublicTable"
}

async function listItems() {
  try {
    const data = await docClient.scan(params).promise()
    return data
  } catch (err) {
    return err
  }
}

exports.handler = async (event) => {
  try {
    const data = await listItems()
    return { body: JSON.stringify(data) }
  } catch (err) {
    return { error: err }
  }
};
Push my updates
$ amplify push
Open my REST API endpoint /items
{
"message": "User: arn:aws:sts::829736458236:assumed-role/myprojectLambdaRolef4f571b-dev/listPublic-dev is not authorized to perform: dynamodb:Scan on resource: arn:aws:dynamodb:us-east-1:8297345848236:table/Public-ssrh52tnjvcdrp5h7evy3zdldsd-dev",
"code": "AccessDeniedException",
"time": "2021-04-21T21:21:32.778Z",
"requestId": "JOA5KO3GVS3QG7RQ2V824NGFVV4KQNSO5AEMVJF66Q9ASUAAJG",
"statusCode": 400,
"retryable": false,
"retryDelay": 28.689093010346657
}
Problems
What did I do wrong?
How do I access my table, and why didn't I get access to it when I created it?
Why are API_MYPROJECT_PUBLICTABLE_NAME and the other environment variables needed?
Solution
The problem turned out to be either the Node.js version or the amplify-cli version. After updating amplify-cli and installing Node 14.16.0, everything worked.
I also changed the table name to the one Amplify creates for us (taken from the environment variable), even though this code had not worked before. The code now looks like this:
/* Amplify Params - DO NOT EDIT
API_MYPROJECT_GRAPHQLAPIIDOUTPUT
API_MYPROJECT_PUBLICTABLE_ARN
API_MYPROJECT_PUBLICTABLE_NAME
ENV
REGION
Amplify Params - DO NOT EDIT */
const AWS = require("aws-sdk");
const region = process.env.REGION
const tableName = process.env.API_MYPROJECT_PUBLICTABLE_NAME
AWS.config.update({ region });
const docClient = new AWS.DynamoDB.DocumentClient();
const params = {
  TableName: tableName
}

async function listItems() {
  try {
    const data = await docClient.scan(params).promise()
    return data
  } catch (err) {
    return err
  }
}

exports.handler = async (event) => {
  try {
    const data = await listItems()
    return { body: JSON.stringify(data) }
  } catch (err) {
    return { error: err }
  }
};
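For completeness, here is a sketch of how the endpoint could be called from the React Native app using the Amplify API category; the API name rest and the path /items come from the CLI prompts above, while the exact response shape depends on how API Gateway maps the Lambda's return value:
import { API } from 'aws-amplify';

// 'rest' is the friendly API name and '/items' is the path chosen in `amplify add api`
const listItems = async () => {
  const response = await API.get('rest', '/items');
  console.log(response);
  return response;
};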
I am working with sending temperature data. I am done with storing the data in DynamoDB.
Can anyone help me with getting the data from DynamoDB using AWS Lambda and sending it to Amazon S3 for QuickSight?
The following code is a small example of the "get from DynamoDB through Lambda" part of your requirement; for details, check out this doc.
// Load the AWS SDK for Node.js
var AWS = require('aws-sdk');
// Set the region
AWS.config.update({region: 'REGION'});
// Create the DynamoDB service object
var ddb = new AWS.DynamoDB({apiVersion: '2012-08-10'});
var params = {
  TableName: 'TABLE',
  Key: {
    'KEY_NAME': {N: '001'}
  },
  ProjectionExpression: 'ATTRIBUTE_NAME'
};

// Call DynamoDB to read the item from the table
ddb.getItem(params, function(err, data) {
  if (err) {
    console.log("Error", err);
  } else {
    console.log("Success", data.Item);
  }
});
To create a data set using Amazon S3 files, you can check out this doc.
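For the second half (sending the data to S3 so QuickSight can use it), a minimal sketch might look like the following; the bucket name, object key, and the idea of writing the scan result as a single JSON file are assumptions, not something from the original question:
// Load the AWS SDK for Node.js
var AWS = require('aws-sdk');
AWS.config.update({region: 'REGION'});

var docClient = new AWS.DynamoDB.DocumentClient();
var s3 = new AWS.S3({apiVersion: '2006-03-01'});

exports.handler = async (event) => {
  // Read the items from the table (for large tables you would paginate the scan)
  var data = await docClient.scan({TableName: 'TABLE'}).promise();

  // Write the items to S3 as a JSON file that QuickSight can use as a data source
  await s3.putObject({
    Bucket: 'YOUR_BUCKET',            // assumed bucket name
    Key: 'temperature/export.json',   // assumed object key
    Body: JSON.stringify(data.Items),
    ContentType: 'application/json'
  }).promise();
};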
I'm building a React Native app with the Serverless Framework using AWS services.
I created a REST API with a Lambda function (Node.js 8.10 environment) and API Gateway to use Rekognition services such as indexFaces, listCollections, etc. My Lambda is in a VPC with RDS (later I'll use Aurora) to store face IDs and other data.
Everything works fine except the Rekognition services.
When I call any Rekognition service it shows Task timed out after 270.04 seconds, but it works when I call it locally using the serverless-offline plugin.
I attached all the necessary permissions to my Lambda, like AmazonRekognitionFullAccess.
Here is my code
index.js
app.post('/myapi', function (req, res) {
  var params = {
    MaxResults: 3,
  };
  const rekognition = aws_config(); // rekognition configuration
  rekognition.listCollections(params, function(err, data) {
    if (err) {
      res.json(err.stack);
      console.log(err, err.stack);
    } else {
      res.json(data);
      console.log(data);
    }
  });
});

function aws_config() {
  const $options = {
    'region': 'ap-southeast-2',
    'version': '2016-06-27',
    'accessKeyId': config.ENV.aws_key,
    'secretAccessKey': config.ENV.aws_secret,
  };
  return new AWS.Rekognition($options);
}
How can I solve this timeout error? It doesn't show any error in the CloudWatch logs.
I have a Node.js Lambda function that reads from a Kinesis stream, does some processing, and writes the result to another Kinesis stream. The writing part is causing an error on Lambda. We use the aws-sdk npm module to write the data. When we call putRecord no callback is fired and Lambda times out.
This writing code works well when run locally as a Node app on a laptop.
Local config:
- The credentials of a programmatic user with the Kinesis Full Access policy are stored in the credentials file, and putRecord runs successfully, returning the shard ID, sequence number, etc.
Lambda function:
- The call to putRecord does not return and it hangs. No errors are thrown either. The Lambda has a role with the Kinesis Full Access policy.
Code:
var AWS = require('aws-sdk');
var kinesis = new AWS.Kinesis({
  region: 'us-east-1',
});
var randomNumber = Math.floor(Math.random() * 100000);
var data = 'data-' + randomNumber;
var partitionKey = 'pk-' + randomNumber;
var recordParams = {
  Data: data,
  PartitionKey: partitionKey,
  StreamName: streamName
};
kinesis.putRecord(recordParams, function(err, data) {
  console.log(data);
  if (err) {
    console.error(err);
  }
});
Any idea what could be causing the issue? VPC or security group related, maybe?
Thoughts and suggestions appreciated.
Thanks.
If you have uploaded the exact Node.js script above to Lambda, it will definitely not work.
Lambda requires you to export a handler function that it will call.
So, your script should be written like this if you want it to be a Lambda function:
'use strict';
var AWS = require('aws-sdk');
var kinesis = new AWS.Kinesis({
  region: 'us-east-1',
});

exports.handler = function (event, context, callback) {
  var randomNumber = Math.floor(Math.random() * 100000);
  var data = 'data-' + randomNumber;
  var partitionKey = 'pk-' + randomNumber;
  var recordParams = {
    Data: data,
    PartitionKey: partitionKey,
    StreamName: streamName,
  };
  kinesis.putRecord(recordParams, function (err, data) {
    if (err) {
      callback(err);
    } else {
      callback(null, data);
    }
  });
};
Take note that instead of only using console.log or console.error, you should call the callback to signal success or failure.
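As a side note, on Node.js 8.10 and later you can also use an async handler with the SDK's .promise() helper instead of the callback style. A minimal sketch, assuming (as a placeholder not present in the original snippet) that the stream name is passed via an environment variable:
'use strict';
var AWS = require('aws-sdk');
var kinesis = new AWS.Kinesis({ region: 'us-east-1' });

exports.handler = async (event) => {
  var randomNumber = Math.floor(Math.random() * 100000);
  var recordParams = {
    Data: 'data-' + randomNumber,
    PartitionKey: 'pk-' + randomNumber,
    StreamName: process.env.STREAM_NAME, // assumed: stream name provided as an environment variable
  };
  // Returning the resolved promise completes the invocation; a rejection fails it
  return kinesis.putRecord(recordParams).promise();
};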
When you add a function to a VPC, it only has access to resources inside that VPC. Any attempt to access resources outside the VPC will hang and eventually time out. Since Kinesis doesn't exist inside your VPC, you can't access it.
The fix is to either run the Lambda function outside the VPC, or add a NAT Gateway to your VPC.