Is it possible to reference an AWS Lambda from itself? - amazon-web-services

I apologize if this question is unclear in any way - I will do my best to add detail if it is difficult to understand. I have an AWS Lambda function, and from within it I would like to access that same function's tags. I have found the listTags method for AWS Lambda, which appears to be what I am looking for. It can be called as follows:
var params = {
  Resource: "arn:aws:lambda:us-west-2:123456789012:function:my-function"
};
lambda.listTags(params, function(err, data) {
  if (err) console.log(err, err.stack); // an error occurred
  else console.log(data);               // successful response
});
However, in order to use this method, we have to create a new instance of the Lambda client using its constructor:
var lambda = new AWS.Lambda({apiVersion: '2015-03-31'});
I don't think that this is what I want to do. Instead, I want to have access to the tags for this particular lambda whenever the lambda is run. So, if I invoke the lambda, I want that invocation to be able to look and see that the lambda, itself, has a tag with the key "environment" and value "production," for example. I wouldn't think I would want to construct a new instance from within it... of itself.
Surely there has to be a way to do this? I may be missing something obvious. I've tried the code I've provided above using the context object in place of the lambda, but to no avail.

You should consider Lambda Environment Variables.
AWS tags are simply metadata used to organize resources. The AWS documentation describes their usage as:
- Tags for resource organization
- Tags for cost allocation
- Tags for automation
- Tags for access control
Ref: https://docs.aws.amazon.com/general/latest/gr/aws_tagging.html
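For example, a minimal sketch of the environment-variable approach, assuming a variable named ENVIRONMENT has been configured on the function at deploy time:

// Read a deploy-time setting from an environment variable instead of a tag.
// 'ENVIRONMENT' is an assumed name; set it on the function via the console,
// CLI, or your deployment tooling.
exports.handler = function(event, context, callback) {
  var environment = process.env.ENVIRONMENT; // e.g. "production"
  callback(null, 'Running in ' + environment);
};

That said, if the tags themselves really are needed at runtime, the handler's context does expose the function's own ARN, which can be fed to listTags. A hedged sketch (the execution role would need the lambda:ListTags permission):

var AWS = require('aws-sdk');
var lambda = new AWS.Lambda({apiVersion: '2015-03-31'});

exports.handler = function(event, context, callback) {
  // context.invokedFunctionArn is this function's own ARN.
  lambda.listTags({Resource: context.invokedFunctionArn}, function(err, data) {
    if (err) callback(err);
    else callback(null, data.Tags); // e.g. { environment: "production" }
  });
};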

Related

Evaluate AWS CDK Stack output to another Stack in different account

I am creating two Stacks using AWS CDK. I use the first Stack to create an S3 bucket and upload a Lambda zip file to the bucket using the BucketDeployment construct, like this.
//FirstStack
const deployments = new BucketDeployment(this, 'LambdaDeployments', {
  destinationBucket: bucket,
  destinationKeyPrefix: '',
  sources: [
    Source.asset(path)
  ],
  retainOnDelete: true,
  extract: false,
  accessControl: BucketAccessControl.PUBLIC_READ,
});
I use the second Stack just to generate a CloudFormation template for my clients. In the second Stack, I want to create a Lambda function whose parameters are the S3 bucket name and the key of the Lambda zip I uploaded in the first Stack.
//SecondStack
const lambdaS3Bucket = "??"; //TODO
const lambdaS3Key = "??"; //TODO
const bucket = Bucket.fromBucketName(this, "Bucket", lambdaS3Bucket);
const lambda = new Function(this, "LambdaFunction", {
  handler: 'index.handler',
  runtime: Runtime.NODEJS_16_X,
  code: Code.fromBucket(
    bucket,
    lambdaS3Key
  ),
});
How do I fill in these parameters automatically in the second Stack?
In addition to that, lambdaS3Bucket needs to take the AWS::Region parameter into account so that my clients can deploy it in any region (I just need to run the first Stack in the region they require).
How do I do that?
I had a similar use case to this one.
The very simple answer is to hardcode the values. The bucketName is obvious.
You can look up the lambdaS3Key in the synthesized template of the first stack.
A more complex answer is to use pipelines for this. I did this, and in the build step of the pipeline I extracted all the lambdaS3Keys and exported them as environment variables, so in the second stack I could reuse them in the code, like:
code: Code.fromBucket(
  bucket,
  process.env.MY_LAMBDA_KEY
),
I see you are aware of this PR, because you are using the extract flag.
Knowing that, you can probably reuse this property for the Lambda key.
The problem of sharing the names between stacks in different accounts nevertheless remains. My suggestion is to use pipelines and export the constants there in the different steps, but a local build script would also do the job, along the lines of the sketch below.
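As an illustration only, here is a minimal local-script sketch. Everything specific in it is an assumption: the synthesized-template path cdk.out/FirstStack.template.json, and that the BucketDeployment custom resource lists its uploaded objects under a SourceObjectKeys property in the template.

// Hypothetical build script: scan the first stack's synthesized template
// for the object keys uploaded by BucketDeployment, then export one of them
// (e.g. as MY_LAMBDA_KEY) before building the second stack.
const fs = require('fs');

const template = JSON.parse(
  fs.readFileSync('cdk.out/FirstStack.template.json', 'utf8') // assumed path
);

for (const [logicalId, resource] of Object.entries(template.Resources)) {
  const keys = resource.Properties && resource.Properties.SourceObjectKeys;
  if (keys) {
    // With extract: false these should be the zip object keys.
    console.log(logicalId, keys);
  }
}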
Do not forget to update the BucketPolicy and KeyPolicy if you use encryption; otherwise the customer account won't have access to the file.
You could also read about the AWS Service Catalog. It would probably be an easier way to share your CDK products with your customers (the CDK team is planning to support out-of-the-box Lambda sharing next).

How to create metrics/alarms for AWS Logs SubscriptionFilter using CDK?

Context
I have created an AWS Logs SubscriptionFilter using CDK. I am now trying to create a metric/alarm for some of the metrics of this resource.
Problem
All the metrics I am interested in (see ForwardedLogEvents, DeliveryErrors, DeliveryThrottling in the Monitoring AWS Logs with CloudWatch Metrics docs) require these dimensions to be specified:
- LogGroupName
- DestinationType
- FilterName
The first two are easy to specify, since the LogGroupName is also required while creating the construct and DestinationType in my case is just Lambda. However, I see no way to get FilterName using CDK.
Using CloudWatch, I see that the FilterName is like MyStackName-MyLogicalID29669D87-GCMA0Q4KKALH. So I can't directly specify it using a Fn.ref (since I don't know the logical id). Using CloudFormation, I could have directly done Ref: LogicalId.
I also don't see any properties on the SubscriptionFilter object that will return this (unlike most other CDK constructs this one seems pretty bare and returns absolutely no information about the resource).
There are also no metric* methods on SubscriptionFilter object (unlike other standard constructs like Lambda functions, S3 buckets etc.), so I have to manually specify the Metric object. See for example: CDK metric objects docs.
The CDK construct (and the underlying CloudFormation resource, AWS::Logs::SubscriptionFilter) does not let me specify the FilterName, so I can't supply it via a variable either; the name is generated dynamically.
Example code that is very close to what I need:
const metric = new Metric({
  namespace: 'AWS/Logs',
  metricName: 'ForwardedLogEvents',
  dimensions: {
    DestinationType: 'Lambda',
    // I know this value since I specified it while creating the SubscriptionFilter
    LogGroupName: 'MyLogGroupName',
    FilterName: Fn.ref('logical-id-wont-work-since-it-is-dynamic-in-CDK')
  }
})
Question
How can I acquire the FilterName property to construct the Metric object?
Or otherwise, is there another way to go about this?
I was able to work around this by using the Stack#getLogicalId method.
Example code
In Kotlin, as an extension function for any Construct:
fun Construct.getLogicalId() = Stack.of(this).getLogicalId(this.node.defaultChild as CfnElement)
... and then use it with any Construct:
val metric = Metric.Builder.create()
    .namespace("AWS/Logs")
    .metricName("ForwardedLogEvents")
    .dimensions(mapOf(
        "DestinationType" to "Lambda",
        "LogGroupName" to myLogGroup.logGroupName,
        "FilterName" to mySubscriptionFilter.getLogicalId()
    ))
    .statistic("sum")
    .build()
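For anyone working in JavaScript rather than Kotlin, a rough sketch of the same helper (assuming CDK v2 import paths, and that the construct's node.defaultChild is its underlying CfnElement):

const { Stack } = require('aws-cdk-lib');

// Hypothetical helper: resolve a construct's CloudFormation logical ID
// via its default CfnElement child.
function getLogicalId(construct) {
  return Stack.of(construct).getLogicalId(construct.node.defaultChild);
}

// e.g. FilterName: getLogicalId(mySubscriptionFilter)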

How to delete aws-lambdas based on their tag

I have a few Lambdas created by automation using Ansible.
- lambda:
    name: 'NAME'
    state: present
    zip_file: 'index.js.zip'
    tags:
      createdBy: 'ansible'
And a few more Lambdas which were created manually.
I would like to delete all the Lambdas created by Ansible, so I added the tag attribute to all the automated Lambdas.
I know we can delete a Lambda if we have its name, but I would like to get all Lambdas and filter for the ones that have tags['createdBy'] == 'ansible'.
lambda_facts is a way to get all the Lambda configuration, but it doesn't give me tag details.
How do I delete Lambdas by filtering on tags?
I suggest getting all your functions with the help of lambda.listFunctions, running lambda.listTags for each Lambda, and building your own array of all functions that include ansible as a tag.
In the end, iterate over this array and call lambda.deleteFunction for each entry.
(I don't know which language you prefer; all my examples use the JavaScript SDK.)
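A minimal sketch of that approach with the AWS SDK for JavaScript v2; the region and the exact tag key/value are assumptions, and error handling is kept to a minimum:

const AWS = require('aws-sdk');
const lambda = new AWS.Lambda({ region: 'us-west-2' }); // assumed region

// Delete every function tagged createdBy=ansible, page by page.
async function deleteAnsibleLambdas() {
  let marker;
  do {
    const page = await lambda.listFunctions({ Marker: marker }).promise();
    for (const fn of page.Functions) {
      const { Tags } = await lambda.listTags({ Resource: fn.FunctionArn }).promise();
      if (Tags && Tags.createdBy === 'ansible') {
        await lambda.deleteFunction({ FunctionName: fn.FunctionName }).promise();
        console.log('Deleted ' + fn.FunctionName);
      }
    }
    marker = page.NextMarker;
  } while (marker);
}

deleteAnsibleLambdas().catch(console.error);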

DynamoDB to 'vanilla' JSON

I'm starting out using some of the managed services in AWS. One thing that seems like it should be easy is using API Gateway to secure and expose calls to DynamoDB.
I've got this working. However, it seems a little clunky. DynamoDB returns something like this:
{
  "id": {"N": "3"}
  // Lots of other fields
}
When really I (and most other consumers out there) would like something like this:
{
  "id": "3"
  // Lots of other fields
}
The way I see it, I've got two options.
1) Add a response mapping field by field in the AWS API UI. This seems laborious and error-prone:
#set($inputRoot = $input.path('$'))
{
  "id": "$inputRoot.id.N"
  // Lots of other fields
}
2) Write a specific Lambda between API Gateway and DynamoDB that does this mapping, like https://stackoverflow.com/a/42231827/2012130 This adds another thing to maintain in the mix.
Is there a better way? Am I missing something? Seems to be so close to awesome.
const AWS = require('aws-sdk');
// The DocumentClient marshalls and unmarshalls DynamoDB attribute values,
// so reads come back as plain JSON instead of the {"N": "3"} wrapping.
var db = new AWS.DynamoDB.DocumentClient({
  region: 'us-east-1',
  apiVersion: '2012-08-10'
});
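For example, a hypothetical read (the table name 'MyTable' and the key are assumptions) comes back already unmarshalled:

db.get({ TableName: 'MyTable', Key: { id: 3 } }, function(err, data) {
  if (err) console.log(err, err.stack);
  else console.log(data.Item); // e.g. { id: 3, ... } - plain JSON values
});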
You can use an ODM like dynogels:
https://github.com/clarkie/dynogels
We use it heavily without having to deal with DynamoDB's attribute syntax.
This brings a Lambda and a language runtime into the mix, but performing the mapping is much easier to handle as an object grows larger.
Hope this helps.
There's another option, added today. It'll still involve a Lambda step, but:
"The Amazon DynamoDB DataMapper for JavaScript is a high-level client for writing and reading structured data to and from DynamoDB, built on top of the AWS SDK for JavaScript."
https://aws.amazon.com/blogs/developer/introducing-the-amazon-dynamodb-datamapper-for-javascript-developer-preview/

How to use AWS Lambda to check file in S3

Brand new to AWS Lambda so I'm not even sure if this is the right tool to accomplish what I'm trying to do.
Basically what I'm trying to do is check if a file exists or if it was updated recently in S3. If that file isn't there or wasn't updated recently I want an AMI to be cloned to an AWS instance.
Is the above possible?
Is Lambda the right tool for the job?
I'm fairly competent in JavaScript but have never used node.js or Python so writing a Lambda function seems complex to me.
Do you know of any resources that can help with building Lambda functions?
Thanks!
It will be easy if you know JavaScript and npm. Let me show you an easy way with Node.js:
1. Log in to AWS.
2. Open the AWS Console menu (the button at the top-right corner).
3. Choose Lambda, click Functions, and create a new function.
4. Click the Skip button on the blueprint page.
5. Skip the configure-triggers page.
6. You will see the configure-function page; fill in the function name, use the Node.js 4.3 runtime, and choose the code type "Edit code inline".
7. At the bottom of the "Edit code inline" box, choose an IAM role you already have. If you don't have any IAM roles, go to the AWS Console, choose Identity and Access Management (IAM), select Roles, and create a new one.
8. Once you have filled in all the required fields, click Next and create the Lambda function.
NOTE: in your inline code box, write code like this:
exports.handler = function(event, context, callback) {
  var AWS = require('aws-sdk');
  // Note: inside Lambda you can usually rely on the execution role
  // instead of hardcoding credentials like this.
  AWS.config.update({accessKeyId: 'xxxxxxxxxxx', secretAccessKey: 'xxxxxxxxxxxxxxxxxxxx'});
  var s3 = new AWS.S3();
  var params = {Bucket: 'myBucket', Key: 'myFile.html'};
  s3.getObject(params, function(err, data) {
    if (err) {
      console.log(err, err.stack);
      // file does not exist, do something
    }
    else {
      console.log(data);
      // file exists, do something
    }
  });
};
You can get the accessKeyId from the IAM menu -> Users -> Security Credentials -> Create Access Key; that will give you the secretAccessKey too.
Hope this answer helps you.
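To cover the "was it updated recently" part of the question: the getObject response includes a LastModified date, so a staleness check could go in the success branch above. A hedged sketch, with the one-hour threshold as an assumption:

// Inside the success branch: treat the object as stale if it is older
// than one hour (assumed threshold).
var ONE_HOUR_MS = 60 * 60 * 1000;
var ageMs = Date.now() - new Date(data.LastModified).getTime();
if (ageMs > ONE_HOUR_MS) {
  // not updated recently, e.g. kick off the AMI clone here
}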