Moving specific Logs from AWS Cloudtrail to S3 - amazon-web-services

I want to send a CloudTrail log (specifically, when a Secrets Manager secret rotates) to an S3 bucket. We already have the logs in CloudTrail; is there an easy way to configure CloudTrail to send these logs to S3 as soon as they happen? I was thinking of setting up a Lambda function that runs on a cron schedule to do this for me, but would there be an easier way?

If you want to get specific events from CloudTrail (CT) in real time, as they happen, then you should set up a CloudWatch Events rule for them:
Creating a CloudWatch Events Rule That Triggers on an AWS API Call Using AWS CloudTrail
The rule could be:
{
  "source": [
    "aws.secretsmanager"
  ],
  "detail-type": [
    "AWS API Call via CloudTrail"
  ],
  "detail": {
    "eventSource": [
      "secretsmanager.amazonaws.com"
    ],
    "eventName": [
      "RotateSecret"
    ]
  }
}
With a target of a Firehose delivery stream set to the S3 bucket of your choice.
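A quick way to sanity-check the pattern before deploying the rule is to replicate the match locally. This is a minimal sketch in plain Python (not the real EventBridge matching engine, which supports more operators) applied to a trimmed-down sample event:

```python
# The same pattern as above: every key lists the accepted values.
PATTERN = {
    "source": ["aws.secretsmanager"],
    "detail-type": ["AWS API Call via CloudTrail"],
    "detail": {
        "eventSource": ["secretsmanager.amazonaws.com"],
        "eventName": ["RotateSecret"],
    },
}

def matches(pattern, event):
    """Simplified pattern match: every pattern key must be present in
    the event, and the event's value must be one of the allowed values.
    Nested dicts (like "detail") are matched recursively."""
    for key, allowed in pattern.items():
        if key not in event:
            return False
        if isinstance(allowed, dict):
            if not matches(allowed, event[key]):
                return False
        elif event[key] not in allowed:
            return False
    return True

# A trimmed-down CloudTrail event as delivered through EventBridge.
sample = {
    "source": "aws.secretsmanager",
    "detail-type": "AWS API Call via CloudTrail",
    "detail": {
        "eventSource": "secretsmanager.amazonaws.com",
        "eventName": "RotateSecret",
    },
}

print(matches(PATTERN, sample))                          # True
print(matches(PATTERN, {**sample, "source": "aws.s3"}))  # False
```

Extra fields in the event (a real CloudTrail record carries many more) are ignored, which mirrors EventBridge's behaviour of only checking the keys named in the pattern.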

There is a slightly easier way, although not necessarily cost-effective (it depends on your case), and you won't avoid using Lambda. Instead of setting up a cron job, you can enable CloudWatch Logs export for your trail, and from there set up a Lambda subscription filter. This way you can export to S3 exactly the events you want (you don't have to code the filters into the function) as soon as they arrive. But you have to pay extra for CloudWatch Logs, so it's not a good option if you have a large trail.
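The subscription filter itself uses the CloudWatch Logs filter pattern syntax. Assuming the trail delivers standard CloudTrail JSON records, a filter selecting only the rotation events could look like this (the field path is the standard CloudTrail eventName; verify it against your own log records):

```
{ $.eventName = "RotateSecret" }
```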

Related

AWS EventBridge Pattern not capturing all events from SecretManager

I have the following pattern in event bridge:
{
  "source": [
    "aws.secretsmanager"
  ],
  "detail-type": [
    "AWS API Call via CloudTrail"
  ],
  "detail": {
    "eventSource": [
      "secretsmanager.amazonaws.com"
    ],
    "eventName": [
      "CreateSecret",
      "UpdateSecret",
      "DeleteSecret",
      "PutSecretValue",
      "GetSecretValue",
      "ListSecrets",
      "RotationFailed",
      "RotationSucceeded",
      "DescribeSecret"
    ]
  }
}
It is pointing to a Lambda that prints the event to CloudWatch. It works just fine, but when I try to capture events like:
"ListSecrets",
"RotationFailed",
"RotationSucceeded",
"DescribeSecret"
they never get captured by the event filter I created. Other actions like Update/Create/Delete work just fine.
Is there any step I am missing to get those?
Documentation Reference: https://docs.amazonaws.cn/en_us/secretsmanager/latest/userguide/retrieve-ct-entries.html
Thanks
All events that are delivered via CloudTrail have AWS API Call via CloudTrail as the value for detail-type. Events from API actions that start with the keywords List, Get, or Describe are not processed by EventBridge, with the exception of events from the following STS actions: GetFederationToken and GetSessionToken. Data events (for example, for Amazon S3 object level events, DynamoDB, and AWS Lambda) must have trails configured to receive those events. Learn more.
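The quoted rule can be expressed as a small predicate: API-call events whose name starts with List, Get, or Describe are not delivered by EventBridge, with only the two STS exceptions. A sketch:

```python
# Read-only prefixes that EventBridge drops for "AWS API Call via CloudTrail".
EXCLUDED_PREFIXES = ("List", "Get", "Describe")
# The two STS exceptions called out in the AWS documentation.
STS_EXCEPTIONS = {"GetFederationToken", "GetSessionToken"}

def delivered_by_eventbridge(event_name):
    """Return True if an API-call event with this name reaches EventBridge."""
    if event_name in STS_EXCEPTIONS:
        return True
    return not event_name.startswith(EXCLUDED_PREFIXES)

# The names from the question that never show up:
for name in ["ListSecrets", "GetSecretValue", "DescribeSecret"]:
    print(name, delivered_by_eventbridge(name))   # all False
print(delivered_by_eventbridge("CreateSecret"))   # True
```

This explains ListSecrets, GetSecretValue, and DescribeSecret; Create/Update/Delete don't match any excluded prefix, which is why those kept working.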
Warning from AWS at EventBridge page about Secrets Manager

AWS Event Bus fails to write logs to CloudWatch on a custom log group from AWS Lambda

I have an AWS lambda whose job it is to consume logs from an external source and write these logs to a custom CloudWatch log group. Please note that this lambda is already writing logs to its own log group, that's not my question. What I want is for it to write the externally-derived logs to another CloudWatch group.
Following the AWS documentation, and using CloudFormation, I created an event bus and a rule that targets CloudWatch:
redacted
I have omitted most of the CloudFormation template for clarity, just leaving in the parts that seem relevant.
What I am finding is that the Lambda receives the logs (via Kinesis), processes them and sends them to the event bus in the code snippet below:
redacted
The last line above indicates that the event is sent to the event bus:
redacted
However the Event Bus, having, I believe, received the event, does not send the event on to CloudWatch, even if I manually create the log group: ${AWS::StackName}-form-log-batch-function (I have kept the stack reference as a parameter to preserve anonymity).
I have checked the CloudFormation creation and all resources are present (confirmed by the Lambda not experiencing any exceptions, when it tries to send the event).
Anyone understand what I am missing here?
You can't write to CloudWatch Logs (CWL) using your WebLogsEventBusLoggingRole role. As the AWS docs explain, you have to use CWL resource-based permissions:
When CloudWatch Logs is the target of a rule, EventBridge creates log streams, and CloudWatch Logs stores the text from the triggering events as log entries. To allow EventBridge to create the log stream and log the events, CloudWatch Logs must include a resource-based policy that enables EventBridge to write to CloudWatch Logs.
Sadly, you can't set up such permissions from vanilla CloudFormation (CFN). This is not supported:
AWS::Events::Rule targetting Cloudwatch logs
To do it from CFN, you have to create a custom resource in the form of a Lambda function. The function would set the CWL permissions using the AWS SDK.
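The policy document such a custom-resource function would attach (via the CloudWatch Logs PutResourcePolicy API) can be built like this. The log-group ARN below is a placeholder; note this sketch only constructs the document, the actual SDK call is left as a comment:

```python
import json

def build_events_resource_policy(log_group_arn):
    """Build a CloudWatch Logs resource-based policy document that lets
    EventBridge create log streams and put log events."""
    return json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "TrustEventsToStoreLogEvent",
            "Effect": "Allow",
            "Principal": {"Service": ["events.amazonaws.com",
                                      "delivery.logs.amazonaws.com"]},
            "Action": ["logs:CreateLogStream", "logs:PutLogEvents"],
            "Resource": log_group_arn,
        }],
    })

# In the custom-resource Lambda you would pass this document to
# logs_client.put_resource_policy(policyName=..., policyDocument=doc).
doc = build_events_resource_policy(
    "arn:aws:logs:us-east-1:123456789012:log-group:/aws/events/*:*")
print(doc)
```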
I hope this is helpful for others still looking for answers.
Problem #1: I want CloudFormation to work
You can use the CloudFormation template from https://serverlessland.com/patterns/eventbridge-cloudwatch or Terraform.
Problem #2: Why is EventBridge not able to write to CloudWatch Logs in general?
Just like what was said above, in order for EventBridge to write to CloudWatch Logs there needs to be a resource policy (a policy set on the destination, in this case CloudWatch Logs). However, please note:
If you create a CloudWatch Logs target in the console, a resource policy will be auto-generated for you, but the auto-generated one has a twist:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "TrustEventsToStoreLogEvent",
      "Effect": "Allow",
      "Principal": {
        "Service": [
          "events.amazonaws.com",
          "delivery.logs.amazonaws.com"
        ]
      },
      "Action": [
        "logs:CreateLogStream",
        "logs:PutLogEvents"
      ],
      "Resource": "arn:aws:logs:us-east-1:777777777:log-group:/*:*"
    }
  ]
}
You will notice that the resource takes the form /*:*,
which means the log group name has to start with / if you are going to use the auto-generated policy.
So if your log group is not in the format /event/myloggroup/ then the policy will not help you.
So, for example:

Target Log Group Name: event-bridge-rule2
ARN: arn:aws:logs:us-east-1:281458815962:log-group:event-bridge-rule2:*
Does it work? No: the ARN is missing the starting /

Target Log Group Name: /aws/events/helpme
ARN: arn:aws:logs:us-east-1:281458815962:log-group:/aws/events/helpme:*
Does it work? Works like a charm
My advice is to put in place a policy that makes sense to you and not rely on the automatic one.
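The /*:* behaviour above can be checked mechanically. A minimal sketch that tests whether a log-group ARN is covered by the auto-generated log-group:/*:* resource:

```python
def covered_by_autogenerated_policy(log_group_arn):
    """The auto-generated Resource is arn:...:log-group:/*:*, so the log
    group name (the part after 'log-group:') must start with '/'."""
    name = log_group_arn.split("log-group:", 1)[1]
    return name.startswith("/")

# The two rows from the table above:
print(covered_by_autogenerated_policy(
    "arn:aws:logs:us-east-1:281458815962:log-group:event-bridge-rule2:*"))  # False
print(covered_by_autogenerated_policy(
    "arn:aws:logs:us-east-1:281458815962:log-group:/aws/events/helpme:*"))  # True
```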
Just create a log group with the name /aws/events/<yourgroupname> and it will work fine; you can also grant logs:* in the policy.

How can I trigger a lambda when a log group is created in cloudwatch?

How can I trigger a lambda when a log group is created in CloudWatch? I am thinking the easiest way is to create a CloudWatch rule that sends the CloudTrail event to the Lambda. Is it reasonable to do? If yes, how can I filter out other events so the Lambda is only triggered when a log group is created?
The only event type supported by CloudWatch Events (CWE) for CW Logs (CWL) is:
AWS API Call via CloudTrail
Therefore, you can catch the events of interest once you have enabled a CloudTrail (CT) trail. Once enabled, API events become available in CWE. Then you would have to create a CWE rule which captures the CreateLogGroup API call. The rule would trigger your Lambda function.
An example CWE rule could be:
{
  "source": [
    "aws.logs"
  ],
  "detail-type": [
    "AWS API Call via CloudTrail"
  ],
  "detail": {
    "eventSource": [
      "logs.amazonaws.com"
    ],
    "eventName": [
      "CreateLogGroup"
    ]
  }
}
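Inside the triggered function, the new log group's name is available in the CloudTrail request parameters. A sketch of extracting it (the sample event below is trimmed to the relevant fields; a real event carries many more):

```python
def handler(event, context=None):
    """Pull the newly created log group name out of a CloudTrail
    CreateLogGroup event delivered by the rule above."""
    detail = event["detail"]
    if detail["eventName"] != "CreateLogGroup":
        return None
    return detail["requestParameters"]["logGroupName"]

# Trimmed-down sample of what EventBridge hands to the Lambda.
sample_event = {
    "source": "aws.logs",
    "detail-type": "AWS API Call via CloudTrail",
    "detail": {
        "eventSource": "logs.amazonaws.com",
        "eventName": "CreateLogGroup",
        "requestParameters": {"logGroupName": "/aws/lambda/my-function"},
    },
}
print(handler(sample_event))   # /aws/lambda/my-function
```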

How to trigger AWS Cloudwatch event only via put_events

My end goal is to start an ECS (fargate) task/Lambda based on API call (manually).
When creating the CloudWatch Rule I have to select a service to listen events on. I'm not sure what service I should use for my purpose.
What is the best thing to do? Should I create a CloudWatch alarm that I manually trigger?
Thanks
So you want to trigger a Lambda function/ECS task based on an API call. The service you pick for the CloudWatch event rule will depend on the type of API call you are running.
For example, if it is an S3 put event, then you select S3 as the service and then the specific S3 operation you are running:
{
  "source": [
    "aws.s3"
  ],
  "detail-type": [
    "AWS API Call via CloudTrail"
  ],
  "detail": {
    "eventSource": [
      "s3.amazonaws.com"
    ],
    "eventName": [
      "PutObject"
    ]
  }
}
If this is a non-AWS API call, then you can use CloudWatch Logs to trigger the CloudWatch event rule.
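Since the question asks about triggering only via put_events: another option (an assumption on my part, not spelled out in the answer above) is to publish a custom event with a made-up source and have the rule match on that source. The entry you would hand to events_client.put_events could be assembled like this; the "myapp.manual" source name is a placeholder:

```python
import json

def make_entry(detail_type, detail, source="myapp.manual"):
    """Build one entry for events:PutEvents. 'myapp.manual' is a made-up
    custom source; any value not starting with 'aws.' is allowed."""
    return {
        "Source": source,
        "DetailType": detail_type,
        "Detail": json.dumps(detail),  # Detail must be a JSON string
    }

entry = make_entry("manual-trigger", {"task": "start-fargate"})
# A rule with pattern {"source": ["myapp.manual"]} would match this event
# and can target the ECS task or Lambda directly.
print(entry["Source"])   # myapp.manual
```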

How to create a custom event trigger to invoke a lambda whenever a new bucket is created?

I have a lambda function in Python that I want to invoke whenever a new S3 bucket is created. I want to create a custom event trigger to invoke it. What would be the best way to go about implementing this?
You can create a CloudWatch rule (see below) that triggers when a bucket is created or deleted and launches a Lambda as its target.
In CloudWatch, create a rule and choose:
Service Name: Simple Storage Service (S3)
Event Type: Bucket Level Operations
and select Specific Operations, specifying CreateBucket (and DeleteBucket if you need it).
This will produce "custom" code similar to the below.
{
  "detail-type": [
    "AWS API Call via CloudTrail"
  ],
  "source": [
    "aws.s3"
  ],
  "detail": {
    "eventSource": [
      "s3.amazonaws.com"
    ],
    "eventName": [
      "CreateBucket",
      "DeleteBucket"
    ]
  }
}
I could answer here, but have a look at this: How to Execute Lambda Functions on S3 Event Triggers
Hello. You can monitor new bucket creation from the AWS Config or AWS CloudTrail services and call a Lambda function for such an event.