Amazon Web Services: Unable to generate policy with Access Analyzer

I'm using AWS and I tried to generate a policy using IAM Access Analyzer. The generated policy never contains the items I expect and am most interested in, and I cannot figure out why. Moreover, the events I can see in the CloudTrail event history do not include data events, even though I've configured data event logging.
I have executed the following actions:
DynamoDB CreateTable: aws dynamodb create-table --table-name ....
DynamoDB PutItem: aws dynamodb put-item --table-name xxx --item file://contents.json
S3 list: aws s3 ls s3://mygreatbucket
S3 download: aws s3 cp s3://mygreatbucket/theevengreater/file .
The only relevant event being logged in CloudTrail is the create-table event. The data events are missing, and I can't figure out what I'm doing wrong. The trail's configuration says "Log All Events" in the "Data events" section for both S3 and DynamoDB.
I followed the instructions in https://docs.aws.amazon.com/IAM/latest/UserGuide/access-analyzer-policy-generation.html. I know it says "Data events not available – IAM Access Analyzer does not identify action-level activity for data events, such as Amazon S3 data events, in generated policies." But if so, what is the purpose of adding logging of data events to the CloudTrail configuration?
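Before digging further, it may help to confirm what the trail is actually configured to record. A minimal check, assuming a trail named my-trail (replace with your own trail name), is to print the trail's event selectors and verify that S3 and DynamoDB data events really appear there:
# Show the event selectors configured for the trail, including any data event resources
aws cloudtrail get-event-selectors --trail-name my-trail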

Related

AWS - SNS notification for backup vault events - cross-account

I'm facing a problem with SNS notifications.
I would like to prepare a backup policy for my Organization, where both the backup policy and the SNS topic live in the master account. I'm using Terraform to prepare a CloudFormation stack that creates a backup vault in each selected account, but when I try to set up the notification with the following command, I receive this error:
aws backup put-backup-vault-notifications \
--backup-vault-name test_backup_vault \
--sns-topic-arn arn:aws:sns:eu-central-1:XXXXX:cross-account-backup-topic \
--backup-vault-events RESTORE_JOB_STARTED RESTORE_JOB_COMPLETED RECOVERY_POINT_MODIFIED
An error occurred (InvalidParameterValueException) when calling the PutBackupVaultNotifications
operation: Provided SNS topic belongs to another account
I saw there is an option to grant permissions cross-account: https://docs.aws.amazon.com/aws-backup/latest/devguide/sns-notifications.html
How can I achieve this?
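The linked documentation describes granting the AWS Backup service permission to publish to the topic through the topic's access policy. A rough sketch of doing that from the CLI is below; it is not from the original post, the topic ARN is the placeholder from the question, and set-topic-attributes replaces the entire access policy, so merge this statement with any statements you already have:
aws sns set-topic-attributes \
  --topic-arn arn:aws:sns:eu-central-1:XXXXX:cross-account-backup-topic \
  --attribute-name Policy \
  --attribute-value '{
    "Version": "2008-10-17",
    "Statement": [{
      "Sid": "AllowBackupToPublish",
      "Effect": "Allow",
      "Principal": {"Service": "backup.amazonaws.com"},
      "Action": "SNS:Publish",
      "Resource": "arn:aws:sns:eu-central-1:XXXXX:cross-account-backup-topic"
    }]
  }'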

How to use CloudTrail to find who created an IAM user

How can I use CloudTrail to find out who created an IAM user? How can I get this from the logs?
If the IAM user was created within the last 90 days, you can find who created it using the CloudTrail Event history.
Using the AWS CLI:
aws cloudtrail lookup-events --lookup-attributes AttributeKey=EventName,AttributeValue=CreateUser --region us-east-1
Using the Console: go to Event history in the CloudTrail service and choose the Event name filter with a value of CreateUser. You have to use the us-east-1 region to view these events.
If the IAM user was created outside the 90-day window, you can still find out who created the user if you have a trail enabled in CloudTrail. You can use Amazon Athena or some other method to search the log files that CloudTrail writes to S3.
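If you go the Athena route, a minimal sketch might look like the following, assuming you have already created a CloudTrail table named cloudtrail_logs in Athena (as in the Athena documentation linked below); the database name and query output location are placeholders:
# Query the CloudTrail table for CreateUser events and record who made each call
aws athena start-query-execution \
  --query-string "SELECT useridentity.arn, eventtime, requestparameters FROM cloudtrail_logs WHERE eventname = 'CreateUser'" \
  --query-execution-context Database=default \
  --result-configuration OutputLocation=s3://my-athena-query-results/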
References:
https://docs.aws.amazon.com/awscloudtrail/latest/userguide/view-cloudtrail-events-cli.html (Note the disclaimer for global services post November 22, 2021)
https://docs.aws.amazon.com/awscloudtrail/latest/userguide/view-cloudtrail-events-console.html
https://docs.aws.amazon.com/athena/latest/ug/cloudtrail-logs.html

AWS Put Subscription Filter for Kinesis Firehose using CloudFormation - Check if the given Firehose stream is in ACTIVE state

I am following this guide to create a Kinesis Firehose stream. When I get to creating the subscription filter (step 12), I encounter this error when trying to send to S3:
An error occurred (InvalidParameterException) when calling the PutSubscriptionFilter operation: Could not deliver test message to specified Firehose stream. Check if the given Firehose stream is in ACTIVE state.
I can confirm that the stream is active and I can send test data via the console and it arrives in S3 as expected.
This is the command I am running (changed my account id):
aws logs put-subscription-filter --log-group-name "myLogGroup" --filter-name "Destination" --filter-pattern "{$.userIdentity.type = Root}" --destination-arn "arn:aws:firehose:ap-southeast-1:1234567890:deliverystream/my-delivery-stream" --role-arn "arn:aws:iam::1234567890:role/CWLtoKinesisFirehoseRole"
I have checked the trusted entities, and the role has privileges for logs and Firehose. Any ideas?
I also struggled with this for a long time; for me it came down to these two gotchas:
Step 4 in the guide: make sure to change the bucket name to your own bucket.
Step 8 (important): make sure to put in your own account ID; it is not highlighted in the guide.
I am sure you already know how to configure the logs subscription filter, so I am not adding those steps to my answer.
Go to Firehose and check the logs to see whether your Firehose has access to execute the Lambda; if not, add the required role.
Now start a dummy data stream using the Firehose test feature and see whether your data makes it as far as the Lambda or S3.
Check CloudTrail and CloudWatch Logs and see if you find any errors.
Open your IAM role and check that all the required policies are attached to it, then edit the trust relationship to add the services that need to assume the role ("logs", "IAM", and the relevant component; in my case it was "EC2").
Hope this helps resolve your issue.
https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/SubscriptionFilters.html#FirehoseExample
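A common cause of this particular error is the trust policy on the role passed to put-subscription-filter: the CloudWatch Logs service must be allowed to assume it. As a rough sketch (the role name comes from the question; the expected service principal is logs.amazonaws.com, or the region-scoped logs.ap-southeast-1.amazonaws.com in older examples), you can inspect the trust policy like this:
# Print the role's trust policy and confirm the CloudWatch Logs service principal may assume the role
aws iam get-role \
  --role-name CWLtoKinesisFirehoseRole \
  --query 'Role.AssumeRolePolicyDocument'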

Where are S3/Redshift/Lambda Logs?

I have a Lambda that copies data from Redshift to S3.
I am trying to find the logs in CloudWatch when I manually trigger the Lambda. I click Logs and search under "Log groups" but cannot see them.
I have enabled logging on Redshift and S3, and assume any Lambda that runs generates logs.
The end goal is to set up "log groups" per service so that I can subscribe through Kinesis and send the data to Redshift.
If I try to 'create a log group' under Actions, I can create '/aws-s3/test' for example, but I don't know what a log stream is, or how to send all S3 logs from a particular folder to S3.
Where are the logs?
The logs from the AWS Lambda function will be automatically created in Amazon CloudWatch Logs.
However, you must ensure that the Lambda function has permission to use CloudWatch Logs.
This is normally done by assigning the AWSLambdaBasicExecutionRole managed policy to the IAM Role used by the Lambda function. It contains the permissions:
logs:CreateLogGroup
logs:CreateLogStream
logs:PutLogEvents
These allow the Lambda function to create its log group and log stream and write log entries.
See: AWS Lambda Execution Role - AWS Lambda
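If the role is missing that policy, a minimal sketch of attaching it from the CLI looks like this (the role name here is a placeholder for whatever execution role your Lambda function uses):
# Attach the AWS-managed basic execution policy to the Lambda's execution role
aws iam attach-role-policy \
  --role-name my-lambda-execution-role \
  --policy-arn arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole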

How do you enable S3 Object Logging to CloudTrail using the AWS CLI?

It's possible to enable object-level logging on an S3 bucket to CloudTrail using the following guide, but this is through the console.
https://docs.aws.amazon.com/AmazonS3/latest/user-guide/enable-cloudtrail-events.html
I've been trying to figure out a way to do this via the CLI, since I want to do this for many buckets, but haven't had much luck. I've set up a new trail on my account and would like to map it to S3 buckets for object logging. Is there a CLI command for this?
# This is to grant s3 log bucket access (no link to cloudtrail here)
aws s3api put-bucket-logging
It looks like you'll need to use the CloudTrail put_event_selectors() operation:
DataResources
CloudTrail supports data event logging for Amazon S3 objects and AWS Lambda functions.
(dict): The Amazon S3 buckets or AWS Lambda functions that you specify in your event selectors for your trail to log data events.
Do a search for object-level in the documentation page.
Disclaimer: The comment by puji in the accepted answer works. This is an expansion of that answer with the resources.
Here is the AWS documentation on how to do this through the AWS CLI
https://docs.aws.amazon.com/cli/latest/reference/cloudtrail/put-event-selectors.html
The specific CLI command you are interested in is the following, from the above documentation. The original documentation lists two objects in the same bucket; I have modified it to cover all the objects in two buckets.
aws cloudtrail put-event-selectors --trail-name TrailName --event-selectors '[{"ReadWriteType": "All","IncludeManagementEvents": true,"DataResources": [{"Type":"AWS::S3::Object", "Values": ["arn:aws:s3:::mybucket1/","arn:aws:s3:::mybucket2/"]}]}]'
If you want all the S3 buckets in your AWS account covered, you can use arn:aws:s3::: instead of a list of bucket ARNs, like the following.
aws cloudtrail put-event-selectors --trail-name TrailName2 --event-selectors '[{"ReadWriteType": "All","IncludeManagementEvents": true,"DataResources": [{"Type":"AWS::S3::Object", "Values": ["arn:aws:s3:::"]}]}]'
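If you need to cover many specific buckets, as the question mentions, one option is to build the Values list from a bucket list and make a single put-event-selectors call, since each call replaces the trail's existing event selectors. A rough sketch with placeholder bucket and trail names:
# Build a comma-separated list of bucket ARNs (the trailing slash covers all objects in each bucket)
BUCKETS="mybucket1 mybucket2 mybucket3"
VALUES=$(for b in $BUCKETS; do printf '"arn:aws:s3:::%s/",' "$b"; done | sed 's/,$//')
# Apply one event selector covering every bucket in the list
aws cloudtrail put-event-selectors --trail-name TrailName \
  --event-selectors "[{\"ReadWriteType\": \"All\",\"IncludeManagementEvents\": true,\"DataResources\": [{\"Type\":\"AWS::S3::Object\", \"Values\": [$VALUES]}]}]"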