I have a Lambda function that copies data from Redshift to S3.
I am trying to find the logs in CloudWatch when I manually trigger the Lambda. I click Logs and search under "Log groups" but cannot see them.
I have enabled logs on Redshift and S3, and assume anything the Lambda generates is logged as well.
The end goal is to set up "log groups" per service so that I can subscribe through Kinesis and send the data to Redshift.
If I try to 'create a log group' under Actions, I can create '/aws-s3/test' for example, but I don't know what a log stream is, or how to send all S3 logs for a particular folder to that log group.
Where are the logs?
The logs from the AWS Lambda function will be automatically created in Amazon CloudWatch Logs.
However, you must ensure that the Lambda function has permission to use CloudWatch Logs.
This is normally done by assigning the AWSLambdaBasicExecutionRole managed policy to the IAM Role used by the Lambda function. It contains the permissions:
logs:CreateLogGroup
logs:CreateLogStream
logs:PutLogEvents
These allow the Lambda function to create the log entries.
See: AWS Lambda Execution Role - AWS Lambda
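If the role is missing that policy, attaching it is a one-liner. A minimal sketch, assuming the execution role is named my-lambda-role (a placeholder; use your function's actual role):

aws iam attach-role-policy \
    --role-name my-lambda-role \
    --policy-arn arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole

Once the permissions are in place, a log group named /aws/lambda/<function-name> is created automatically the first time the function writes a log line.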
Related
I'm using AWS and I tried to generate a policy using Access Analyzer. The generated policy never contains the actions I expect and am most interested in. I cannot figure out why. Moreover, the events I can see in the CloudTrail event logs do not include data events, even though I've configured data events.
I have executed the following actions:
DynamoDB CreateTable: aws dynamodb create-table --table-name ....
DynamoDB PutItem: aws dynamodb put-item --table-name xxx --item file://contents.json
S3 list: aws s3 ls s3://mygreatbucket
S3 download: aws s3 cp s3://mygreatbucket/theevengreater/file .
The only relevant event being logged in CloudTrail is the create-table event. The data events are missing. I can't figure out what I'm doing wrong. The CloudTrail config says "Log All Events" in the "Data events" section for both S3 and DynamoDB.
I followed the instructions in https://docs.aws.amazon.com/IAM/latest/UserGuide/access-analyzer-policy-generation.html. I know it says "Data events not available – IAM Access Analyzer does not identify action-level activity for data events, such as Amazon S3 data events, in generated policies." But if so, what is the purpose of adding logging of data events to the CloudTrail configuration?
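As a sanity check, you can read back the selectors a trail is actually using from the CLI; also note that data events are delivered to the trail's S3 bucket and are not shown in the Event history console, which only displays management events. A minimal sketch, assuming a trail named my-trail (a placeholder):

aws cloudtrail get-event-selectors --trail-name my-trail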
I am following this guide and creating a Kinesis Firehose stream.
I have followed the guide and when I get to creating a subscription filter (step 12), I encounter this error when trying to send to S3:
An error occurred (InvalidParameterException) when calling the PutSubscriptionFilter operation: Could not deliver test message to specified Firehose stream. Check if the given Firehose stream is in ACTIVE state.
I can confirm that the stream is active and I can send test data via the console and it arrives in S3 as expected.
This is the command I am running (account ID changed):
aws logs put-subscription-filter --log-group-name "myLogGroup" --filter-name "Destination" --filter-pattern "{$.userIdentity.type = Root}" --destination-arn "arn:aws:firehose:ap-southeast-1:1234567890:deliverystream/my-delivery-stream" --role-arn "arn:aws:iam::1234567890:role/CWLtoKinesisFirehoseRole"
I have checked the trusted entities, and the role has privileges for logs and Firehose. Any ideas?
I also struggled with this for a long time; for me it was these two gotchas:
Step 4 in the guide: make sure to change the bucket name to your bucket.
Step 8 (!!!): make sure to put in your own account ID; it is not highlighted in the guide.
I am sure you already know how to configure a logs subscription filter, so I am not adding those steps to my answer.
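A quick way to double-check both gotchas from the CLI is to read the two inline policies back and eyeball the bucket ARN and account ID in them. A minimal sketch, assuming the role and policy names used in the guide (adjust if yours differ):

aws iam get-role-policy --role-name FirehosetoS3Role --policy-name Permissions-Policy-For-Firehose
aws iam get-role-policy --role-name CWLtoKinesisFirehoseRole --policy-name Permissions-Policy-For-CWL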
Go to Firehose and check the logs: does your Firehose have access to execute the Lambda? If not, add the required role.
Now start a dummy data stream using the Firehose test feature and see whether your data is moving through to the Lambda or S3.
Check CloudTrail and CloudWatch logs to see if there are any errors.
Open your IAM role and check that all the required policies are attached to it; then click Trust relationships and add "logs group", "IAM", and the component name (in my case it is "EC2").
Hope this is helpful in resolving your issue.
https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/SubscriptionFilters.html#FirehoseExample
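The console's "Test with demo data" button has a rough CLI equivalent if you want to script that check. A minimal sketch, assuming a stream named my-delivery-stream and AWS CLI v2 (the Data field is base64; here it decodes to "test message"):

aws firehose put-record \
    --delivery-stream-name my-delivery-stream \
    --record '{"Data":"dGVzdCBtZXNzYWdl"}'

If the record arrives in S3 after the buffering interval, the Firehose half of the pipeline is healthy and the problem is likely on the subscription-filter side.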
In short, I want to enable CloudTrail for several objects in different S3 buckets. I am able to directly mention all the objects when creating the trail from CloudFormation, but I want to add them at a later point in time.
Create an AWS CloudTrail trail in a CloudFormation stack and export the trail's ARN.
Then, when creating objects in an S3 bucket for which I need CloudTrail data events, I want to add them to this existing trail.
Here is the spot in the console where I can manually add it:
CloudTrail AWS Console
So, I am looking to add data events to an existing CloudTrail trail via CloudFormation.
I have looked through the entire documentation several times; I can only see a way to add them while creating the trail:
Create a CloudWatch Events Rule for an Amazon S3 Source (AWS CloudFormation Template) - CodePipeline
Please advise: what is the resource type that supports this?
You can probably get some hints from the CFT I have created: an S3 event (a PutObject operation) logs the event details into a separate bucket, from where a CloudWatch Events rule triggers the execution of a Step Functions state machine.
cloudtrail:
  Type: AWS::CloudTrail::Trail
  Properties:
    EnableLogFileValidation: Yes
    EventSelectors:
      - DataResources:
          - Type: AWS::S3::Object
            Values:
              - arn:aws:s3:::s3-event-step-bucket/
        IncludeManagementEvents: Yes
        ReadWriteType: All
    IncludeGlobalServiceEvents: Yes
    IsLogging: Yes
    IsMultiRegionTrail: Yes
    S3BucketName: s3-event-step-bucket-storage
    TrailName: xyz
When you deploy this CFT, it will update the existing trail with CloudTrail data events as the trigger point.
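If redeploying the stack is not convenient, the same change can also be applied to an existing trail from outside CloudFormation with put-event-selectors; note that this drifts the trail from its template. A minimal sketch with placeholder trail and bucket names:

aws cloudtrail put-event-selectors \
    --trail-name xyz \
    --event-selectors '[{"ReadWriteType":"All","IncludeManagementEvents":true,"DataResources":[{"Type":"AWS::S3::Object","Values":["arn:aws:s3:::my-new-bucket/"]}]}]'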
I'm creating a logs aggregator Lambda to send CloudWatch logs to a private log analysis service. Given the number of resources used by my employer, it was decided to create a subscription Lambda that handles log group subscription to the aggregator.
The solution works fine, but it requires manually searching for a resource's log group via the Amazon console and then invoking the subscription Lambda with it.
My question:
Is there a way to, given a resource ARN, find which log group is mapped to it? Since I'm using CloudFormation to create resources, it is easy to export a resource's ARN.
UPDATE
To present an example:
Let's say I have the following arn:
arn:aws:appsync:<REGION>:<ACCOUNTID>:apis/z3pihpr4gfbzhflthkyjjh6yvu
which is an AppSync GraphQL API.
What I want is a method (using the API or some automated solution) to get the CloudWatch log group of that resource.
You can try the describe-log-groups command. It is available in the CLI, and must also be there in the API.
To get the names of the log groups you can go with:
aws logs describe-log-groups --query 'logGroups[*].logGroupName' --log-group-name-prefix '/aws/appsync/[name-of-the-resource]'
Output will look like this:
[
"/aws/appsync/[name-of-your-resource]"
]
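For the AppSync example above, AppSync names its log groups after the API id, so you can derive the prefix straight from the ARN (an assumption worth verifying in your own account). A hypothetical shell sketch:

# extract the API id (last path segment) from the ARN
arn="arn:aws:appsync:<REGION>:<ACCOUNTID>:apis/z3pihpr4gfbzhflthkyjjh6yvu"
api_id="${arn##*/}"
# AppSync log groups typically follow /aws/appsync/apis/<api-id>
aws logs describe-log-groups \
    --log-group-name-prefix "/aws/appsync/apis/${api_id}" \
    --query 'logGroups[*].logGroupName'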
In AWS Lambda, I can sometimes see errors in the CloudWatch metrics, but when I check the logs for those errors, I don't see any. Doesn't AWS Lambda log the errors automatically?
Make sure your Lambda function's execution role has the permissions logs:CreateLogGroup, logs:CreateLogStream and logs:PutLogEvents. A good way to do this is to attach the built-in policy AWSLambdaBasicExecutionRole to your Lambda function's execution role.
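To check this from the CLI, you can look up which role the function runs as and list the policies attached to it. A minimal sketch, assuming a function named my-function and a role named my-function-role (both placeholders):

# find the execution role attached to the function
aws lambda get-function-configuration --function-name my-function --query Role
# list the managed policies on that role (the role name is the last segment of the ARN)
aws iam list-attached-role-policies --role-name my-function-role

AWSLambdaBasicExecutionRole should appear in the output; if it does not, attach it with aws iam attach-role-policy.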