add CloudwatchEvent dynamically in AWS Lambda code - amazon-web-services

I have a Lambda function triggered by an S3 file Put event. Once the Lambda is triggered, I want to attach a CloudWatch Events (cron) rule to that same Lambda from within the code. Is that possible?

You need to do two things to accomplish this:
Add the Lambda as a target of the CloudWatch Events rule (the cron)
Add a permission to the Lambda allowing the rule to invoke it
I don't have an exact code sample to give you, but snippets like the ones below will have to be included in your function to achieve this:
import boto3

# Attach the Lambda as a target of the existing rule
event_client = boto3.client('events')
event_response = event_client.put_targets(
    Rule=RULENAME,
    Targets=[{
        'Id': 'A_UNIQUE_STRING',
        'Arn': 'ARN_LAMBDA'
    }]
)

# Allow the rule to invoke the Lambda
lambda_client = boto3.client('lambda')
lambda_response = lambda_client.add_permission(
    FunctionName="LAMBDA_NAME",
    StatementId="A_UNIQUE_STRING",
    Action="lambda:InvokeFunction",
    Principal="events.amazonaws.com",
    SourceArn="ARN_RULE"
)
ARN_LAMBDA should look like arn:aws:lambda:<aws-region>:<aws-account-number>:function:<lambda-name>
ARN_RULE should look like arn:aws:events:<aws-region>:<aws-account-number>:rule/<rule-name>
A_UNIQUE_STRING can be anything you generate in code that is meaningful and unique, or just a random string.
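If the cron rule itself doesn't exist yet, you would create it first with put_rule, which returns the rule ARN you can then feed into add_permission. A minimal sketch (the rule name and schedule expression are illustrative; boto3 is imported lazily so the helpers can be defined without AWS credentials):

```python
def build_rule_request(rule_name, schedule_expression):
    """Arguments for events.put_rule; the schedule is e.g. 'rate(1 hour)'
    or a cron expression like 'cron(0 12 * * ? *)'."""
    return {
        'Name': rule_name,
        'ScheduleExpression': schedule_expression,
        'State': 'ENABLED',
    }

def create_cron_rule(rule_name, schedule_expression):
    """Create (or update) the CloudWatch Events rule and return its ARN."""
    import boto3  # lazy import: the client needs AWS credentials/region at call time
    events = boto3.client('events')
    response = events.put_rule(**build_rule_request(rule_name, schedule_expression))
    return response['RuleArn']
```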
You can refer to the boto3 documentation for Lambda and CloudWatch Events for more details:
http://boto3.readthedocs.io/en/latest/reference/services/lambda.html#Lambda.Client.add_permission
http://boto3.readthedocs.io/en/latest/reference/services/events.html#CloudWatchEvents.Client.put_targets

This should be possible for non-streaming triggers, but you need to handle the two different event types in your code.
That said, it may be better to use two separate Lambdas, since you only pay for usage anyway.
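Handling the two event shapes in one function can be as simple as branching on the payload. A sketch, based on the standard S3 notification and scheduled-event formats (the handler bodies are placeholders):

```python
def handler(event, context=None):
    """Dispatch between an S3 Put notification and a scheduled-event invocation."""
    if 'Records' in event and event['Records'][0].get('eventSource') == 'aws:s3':
        return handle_s3_put(event)
    if event.get('source') == 'aws.events':
        return handle_cron(event)
    raise ValueError('Unrecognized event shape')

def handle_s3_put(event):
    # S3 notifications carry bucket/key under Records[*].s3
    record = event['Records'][0]
    return ('s3', record['s3']['bucket']['name'], record['s3']['object']['key'])

def handle_cron(event):
    # Scheduled events carry the triggering rule ARN(s) in 'resources'
    return ('cron', event.get('resources', []))
```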

Related

List all LogGroups using cdk

I am quite new to the CDK, but I'm adding a LogQueryWidget to my CloudWatch Dashboard through the CDK, and I need a way to add all LogGroups ending with a given suffix to the query.
Is there a way to either loop through all existing LogGroups and find the ones with the correct suffix, or to search through LogGroups?
const queryWidget = new LogQueryWidget({
    title: "Error Rate",
    logGroupNames: ['/aws/lambda/someLogGroup'],
    view: LogQueryVisualizationType.TABLE,
    queryLines: [
        'fields @message',
        'filter @message like /(?i)error/'
    ],
})
Is there any way I can add it so that logGroupNames contains all LogGroups that end with a specific suffix?
You cannot do that dynamically (i.e. you can't make the query automatically adjust whenever a new LogGroup is added) without using something like an AWS Lambda that periodically updates your Log Query.
However, because CDK is just code, there is nothing stopping you from making an AWS SDK API call inside it to retrieve all the log groups (see https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/CloudWatchLogs.html#describeLogGroups-property) and then populating logGroupNames accordingly.
That way, when CDK synthesizes, it will make an API call to fetch the LogGroups, and the generated CloudFormation will contain the log groups you need. Note that this list is only updated when you re-synthesize and re-deploy your stack.
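If your CDK app is in Python, the synth-time lookup could look like the following (the suffix filter itself is plain Python; boto3 is imported lazily so the module can be imported without AWS credentials):

```python
def filter_by_suffix(names, suffix):
    """Keep only log group names ending with the given suffix."""
    return [n for n in names if n.endswith(suffix)]

def log_groups_with_suffix(suffix):
    """Page through DescribeLogGroups and return matching names.
    Runs on the machine executing `cdk synth`, with its AWS credentials."""
    import boto3  # lazy import: only needed when the lookup actually runs
    logs = boto3.client('logs')
    names = []
    for page in logs.get_paginator('describe_log_groups').paginate():
        names.extend(g['logGroupName'] for g in page['logGroups'])
    return filter_by_suffix(names, suffix)
```

The result can then be passed straight into the widget's logGroupNames property.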
Finally, note that there is a limit on how many Log Groups you can query with Log Insights (20 according to https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/AnalyzingLogData.html).
If you want the lookup to happen at deploy time instead, you can create a custom resource using the AwsCustomResource and AwsSdkCall classes to make the AWS SDK API call (as mentioned by @Tofig above) as part of the deployment. You can read data from the API call's response and act on it as you want.

how to pass s3 event as input to step function?

I am new to AWS, so please bear with me if I miss something.
I want to create a step function that triggers a Lambda.
My situation: lambda_A copies a file from one S3 bucket to another (bucket_1 to bucket_2), and in the same Lambda I invoke the step function with the boto3 client. In the step function I trigger another Lambda, lambda_B, but lambda_B requires the S3 event of bucket_2.
Now the problem is: how can I pass the S3 event of bucket_2 as input to the step function?
Note: I tried creating another Lambda triggered from bucket_2 that starts the step function with the S3 event, but I cannot justify another Lambda just for this small task.
If anyone has any ideas about this situation, please share.
Thank you in advance.
Normally you would pass the event as input when you start your SFN.
For example, in boto3, when you use start_execution to start your state machine, you pass the S3 event as the input parameter:
import json
import boto3

sfn_client = boto3.client('stepfunctions')
response = sfn_client.start_execution(
    stateMachineArn='string',
    name='string',
    input=json.dumps(event)
)
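One wrinkle: the event lambda_A receives refers to bucket_1, while lambda_B expects an event for bucket_2. You can rewrite the relevant field before starting the execution. A sketch, assuming the standard S3 notification layout and that the object key is unchanged by the copy:

```python
import copy

def rewrite_s3_event(event, new_bucket):
    """Return a copy of an S3 notification event pointing at a different bucket."""
    rewritten = copy.deepcopy(event)
    for record in rewritten['Records']:
        record['s3']['bucket']['name'] = new_bucket
    return rewritten

# In lambda_A, after the copy succeeds (state machine ARN is illustrative):
# import json, boto3
# sfn = boto3.client('stepfunctions')
# sfn.start_execution(
#     stateMachineArn='arn:aws:states:us-east-1:123456789012:stateMachine:my-machine',
#     input=json.dumps(rewrite_s3_event(event, 'bucket_2')),
# )
```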

Can I create temporary users through Amazon Cognito?

Does Amazon Cognito support temporary users? For my use case, I want to give external users access limited to a time period (e.g. 7 days).
Currently, my solution is something like:
Create User in User Group
Schedule cron job to run in x days
Job will disable/remove User from User Group
This all seems to be quite manual and I was hoping Cognito provides something similar automatically.
Unfortunately there is no built-in functionality to automate this workflow, so you would need to devise your own solution.
I would suggest the below approach to handling this:
Create a Lambda function that post-processes a user sign-up. This function would create a CloudWatch Event scheduled for 7 days in the future. Using the SDK you would create the event and assign another Lambda function as its target. When you specify the target in the put_targets call, use the Input parameter to pass in your own JSON; this should contain a metadata item identifying the user.
You would then create a post confirmation Lambda trigger which invokes the Lambda created in the step above. This lets you schedule an event every time a user signs up.
Finally, create the target Lambda for the CloudWatch event. It reads the input passed in from the trigger and can use the AWS SDK to perform whatever Cognito operations you need, such as deleting the user.
The benefit of using these services rather than a cron is that work happens only when it is required. With a one-time cron script you would have to loop through every user in the temporary group and check whether each is ready to be removed (and perhaps some would never be removed).
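The scheduling steps above can be sketched as follows. CloudWatch Events supports a one-shot schedule via a cron expression pinned to a specific date; the rule-name prefix, the Input payload shape, and ARN_OF_CLEANUP_LAMBDA are illustrative, and boto3 is imported lazily:

```python
from datetime import datetime, timedelta

def one_shot_cron(dt):
    """Cron expression that fires once, at the given UTC datetime."""
    return 'cron({m} {h} {d} {mo} ? {y})'.format(
        m=dt.minute, h=dt.hour, d=dt.day, mo=dt.month, y=dt.year)

def schedule_user_cleanup(username, days=7):
    """Create a rule firing once in `days` days, targeting the cleanup Lambda."""
    import json
    import boto3  # lazy import: needs AWS credentials/region at call time
    events = boto3.client('events')
    rule_name = 'cleanup-' + username
    events.put_rule(
        Name=rule_name,
        ScheduleExpression=one_shot_cron(datetime.utcnow() + timedelta(days=days)),
        State='ENABLED',
    )
    events.put_targets(
        Rule=rule_name,
        Targets=[{
            'Id': rule_name,
            'Arn': 'ARN_OF_CLEANUP_LAMBDA',  # placeholder for the target Lambda's ARN
            'Input': json.dumps({'username': username}),  # metadata for the target
        }],
    )
```

The target Lambda receives the Input JSON as its event and can look up the user from it. Note the rule is not deleted after firing; the cleanup Lambda could remove it via delete_rule.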
My solution for this is the following: instead of creating a post confirmation Lambda trigger, you can also create a pre authentication Lambda trigger. This trigger checks the user attribute "valid_until", which contains a unix timestamp, and only lets the user in if that value is in the future. The main benefit of this solution is that you don't need any cron jobs.
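That pre authentication check could look like the following. The attribute name custom:valid_until is an assumption (custom Cognito attributes get the custom: prefix), and Cognito denies the sign-in when the trigger raises:

```python
import time

def pre_auth_handler(event, context=None, now=None):
    """Cognito pre authentication trigger: reject users past their expiry."""
    now = time.time() if now is None else now
    attrs = event['request']['userAttributes']
    valid_until = float(attrs.get('custom:valid_until', 0))
    if now >= valid_until:
        # Raising makes Cognito refuse the sign-in attempt.
        raise Exception('Account expired')
    return event  # triggers must return the event object on success
```

The `now` parameter is only there to make the expiry logic easy to exercise outside of Lambda.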

How to create metrics/alarms for AWS Logs SubscriptionFilter using CDK?

Context
I have created an AWS Logs SubscriptionFilter using CDK. I am now trying to create a metric/alarm for some of the metrics for this resource.
Problem
All the metrics I am interested in (see ForwardedLogEvents, DeliveryErrors, DeliveryThrottling in the Monitoring AWS Logs with CloudWatch Metrics docs) requires these dimensions to be specified:
LogGroupName
DestinationType
FilterName
The first two are easy to specify since the LogGroupName is also required while creating the construct and DestinationType in my case is just Lambda. However, I see no way to get FilterName using CDK.
Using CloudWatch, I see that the FilterName is like MyStackName-MyLogicalID29669D87-GCMA0Q4KKALH. So I can't directly specify it using a Fn.ref (since I don't know the logical id). Using CloudFormation, I could have directly done Ref: LogicalId.
I also don't see any properties on the SubscriptionFilter object that will return this (unlike most other CDK constructs this one seems pretty bare and returns absolutely no information about the resource).
There are also no metric* methods on SubscriptionFilter object (unlike other standard constructs like Lambda functions, S3 buckets etc.), so I have to manually specify the Metric object. See for example: CDK metric objects docs.
The CDK construct (and the underlying CloudFormation resource: AWS::Logs::SubscriptionFilter) does not let me specify the FilterName - so I can't use a variable to specify it also and the name is dynamically generated.
Example code that is very close to what I need:
const metric = new Metric({
    namespace: 'AWS/Logs',
    metricName: 'ForwardedLogEvents',
    dimensions: {
        DestinationType: 'Lambda',
        // I know this value since I specified it while creating the SubscriptionFilter
        LogGroupName: 'MyLogGroupName',
        FilterName: Fn.ref('logical-id-wont-work-since-it-is-dynamic-in-CDK')
    }
})
Question
How can I figure out how to acquire the FilterName property to construct the Metric object?
Or otherwise, is there another way to go about this?
I was able to work around this by using Stack#getLogicalId method.
Example code
In Kotlin, as an extension function on any Construct:
fun Construct.getLogicalId() = Stack.of(this).getLogicalId(this.node.defaultChild as CfnElement)
... and then use it with any Construct:
val metric = Metric.Builder.create()
    .namespace("AWS/Logs")
    .metricName("ForwardedLogEvents")
    .dimensions(mapOf(
        "DestinationType" to "Lambda",
        "LogGroupName" to myLogGroup.logGroupName,
        "FilterName" to mySubscriptionFilter.getLogicalId()
    ))
    .statistic("sum")
    .build()

Cannot add a Lambda target to an existing CloudWatch rule

I want to add a Lambda target to an existing CloudWatch rule. I used the following to refer to the existing rule:
rule = events.Rule.from_event_rule_arn(self, "Rule",event_rule_arn='')
Later I add a target with:
rule.add_target(targets.LambdaFunction(lambdaFn))
When I execute a cdk synth or deploy, I get the following error:
AttributeError: '+' object has no attribute 'add_target'
I know the IRule interface does not have that method, but I cannot find a clear way to achieve what I need.
I also tried using an event source on the Lambda, but got the following error:
Unrecognized event source, must be kinesis, dynamodb stream or sqs.
I do not think this is possible. You need to reference the Lambda function and manage the rule from the stack the rule belongs to.
As MilanG suggests, it is not possible.
My use case requires creating several Lambda functions with the same trigger, and CloudWatch Rules is not a fit because of the hard limit of 5 targets per rule. I use SNS instead, as follows:
sns_topic = aws_sns.Topic.from_topic_arn(scope, id, topic_arn=config)
lambdaFn.add_event_source(aws_lambda_event_sources.SnsEventSource(sns_topic))