Who has deleted files in an S3 bucket? - amazon-web-services

What is the best way to find out who deleted files in an AWS S3 bucket?
I am working with an AWS S3 bucket. I have gone through the AWS docs and haven't found a good way to monitor S3 buckets, so I thought I'd check if anyone can help me here.

For monitoring S3 object operations, such as DeleteObject, you have to enable CloudTrail with S3 data events:
How do I enable object-level logging for an S3 bucket with AWS CloudTrail data events?
Examples: Logging Data Events for Amazon S3 Objects
However, the trails don't work retrospectively. Thus, you have to check whether you already have such a trail enabled in the CloudTrail console. If not, you can create one to monitor any future S3 object-level activities for all, or selected, buckets.
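If you want to create or update such a trail from the CLI instead of the console, object-level data events can be switched on with put-event-selectors. A minimal sketch, assuming a trail already exists; the trail name and bucket ARN are placeholders:

aws cloudtrail put-event-selectors --trail-name my-trail \
    --event-selectors '[{"ReadWriteType": "All", "IncludeManagementEvents": true, "DataResources": [{"Type": "AWS::S3::Object", "Values": ["arn:aws:s3:::my-bucket/"]}]}]'

The trailing slash in the bucket ARN value scopes the selector to all objects in that bucket.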
To reduce the impact of accidental deletions you can enable object versioning. And to fully protect important objects against deletion, you can use MFA delete.
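For illustration, both can be enabled from the CLI. A sketch with a placeholder bucket name, account ID, and MFA device; note that MFA delete can only be enabled with the bucket owner's root credentials:

aws s3api put-bucket-versioning --bucket my-bucket --versioning-configuration Status=Enabled
# MFA delete additionally requires the root user's MFA device serial and a current code
aws s3api put-bucket-versioning --bucket my-bucket \
    --versioning-configuration Status=Enabled,MFADelete=Enabled \
    --mfa 'arn:aws:iam::111122223333:mfa/root-account-mfa-device 123456'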

You can check the S3 server access logs or CloudTrail logs to find out who deleted files from your S3 bucket. More information here - https://aws.amazon.com/premiumsupport/knowledge-center/s3-audit-deleted-missing-objects/
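If server access logging was already enabled before the deletion, one way to narrow it down is to scan the logs for REST.DELETE.OBJECT entries. This is only a sketch: it assumes the logs are synced locally first, the log bucket and prefix are placeholders, and the awk field positions follow the documented S3 server access log layout:

aws s3 sync s3://my-log-bucket/access-logs/ ./s3-logs/
# print time, requester, and object key for every delete
grep -h 'REST.DELETE.OBJECT' ./s3-logs/* | awk '{print $3, $4, $6, $9}'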

Related

How can you find the most retrieved object in your S3 bucket?

I want to know how many times an object has been retrieved from an AWS S3 bucket. When we use AWS CloudFront, we can use the CloudFront popular objects report. How can I get a similar report for S3?
Download metrics for individual S3 objects are not readily available, afaik.
You could derive them from one of the following (a sketch using the first option follows the list):
Amazon S3 access logs
CloudTrail logs
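Building on the first option, and assuming access logging is enabled and the logs are synced to a local directory (all names here are placeholders), GET requests can be counted per key:

aws s3 sync s3://my-log-bucket/access-logs/ ./s3-logs/
# field 8 is the operation and field 9 the object key in the access log format
grep -h 'REST.GET.OBJECT' ./s3-logs/* | awk '{print $9}' | sort | uniq -c | sort -rn | head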

How to get the list of users who are accessing the objects in S3 buckets?

Scenario:
My client has 80+ S3 buckets, and 1000+ applications are running in their AWS account. I want to get the list of IAM users/roles that are accessing the objects in all the S3 buckets.
Method 1: Initially I tried to fetch it from the CloudTrail Event History, but no luck.
As the Event History showed (screenshot omitted), CloudTrail was not logging the object-level activity.
Method 2: I created a CloudTrail trail to log the activities. But it captures all the management-level activities happening throughout the account, which makes it hard for me to find the S3 logs alone (as mentioned, there are 80+ buckets and 1000+ applications in the account).
Method 3: S3 server access logs: If I enable this option, it creates a log entry for every action on the objects. (That is: when I attempt to read a log file, it creates yet another log entry, so the number of logs keeps growing.)
If anyone has an effective way to find the list of IAM users/roles who are accessing the S3 bucket objects, please help me.
Thanks in advance.
For each bucket, configure object-level logging.
Once that is complete, you can use the CloudTrail API to filter events and extract the IAM identities making the requests, for example:
aws cloudtrail lookup-events --lookup-attributes AttributeKey=ResourceType,AttributeValue=AWS::S3::Object --query 'Events[*].Username'

Copy files from an S3 bucket to another AWS account

Is it possible to send/sync files from a source AWS S3 bucket to a destination S3 bucket in a different AWS account, in a different location?
I found this: https://aws.amazon.com/premiumsupport/knowledge-center/copy-s3-objects-account/
But if I understand it correctly, this describes how to sync files from the destination account.
Is there a way to do it the other way around, i.e. accessing the destination bucket from the source account (using the source account's IAM user credentials)?
AWS finally came up with a solution for this: S3 Batch Operations.
S3 Batch Operations is an Amazon S3 data management feature that lets you manage billions of objects at scale with just a few clicks in the Amazon S3 Management Console or a single API request. With this feature, you can make changes to object metadata and properties, or perform other storage management tasks, such as copying objects between buckets, replacing object tag sets, modifying access controls, and restoring archived objects from S3 Glacier — instead of taking months to develop custom applications to perform these tasks.
It allows you to replicate data at bucket, prefix or object level, from any region to any region, between any storage class (e.g. S3 <> Glacier) and across AWS accounts! No matter if it's thousands, millions or billions of objects.
This introduction video has an overview of the options: https://aws.amazon.com/s3/s3batchoperations-videos/. My apologies if I almost sound like a salesperson; I'm just very excited about it, as I have a couple of million objects to copy ;-)
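For completeness, a copy job can also be created from the CLI. This is only a sketch: the account ID, buckets, manifest object, ETag, and role ARN are placeholders, and both the manifest CSV and an IAM role that S3 Batch Operations can assume must exist beforehand.

aws s3control create-job \
    --account-id 111122223333 \
    --operation '{"S3PutObjectCopy": {"TargetResource": "arn:aws:s3:::destinationbucket"}}' \
    --manifest '{"Spec": {"Format": "S3BatchOperations_CSV_20180820", "Fields": ["Bucket", "Key"]}, "Location": {"ObjectArn": "arn:aws:s3:::sourcebucket/manifest.csv", "ETag": "EXAMPLE-ETAG"}}' \
    --report '{"Bucket": "arn:aws:s3:::reportbucket", "Format": "Report_CSV_20180820", "Enabled": true, "Prefix": "batch-reports", "ReportScope": "AllTasks"}' \
    --priority 10 \
    --role-arn arn:aws:iam::111122223333:role/batch-operations-role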
That needs the right IAM and bucket policy settings.
A detailed configuration for cross-account access is discussed here.
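As a minimal sketch of the destination side (the account ID, user name, and bucket names are placeholders), the destination bucket needs a policy that grants the source account's identity list and write access; the source IAM user must also carry matching S3 permissions in its own account:

aws s3api put-bucket-policy --bucket destinationbucket --policy '{
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "AllowSourceAccountSync",
        "Effect": "Allow",
        "Principal": {"AWS": "arn:aws:iam::111122223333:user/copy-user"},
        "Action": ["s3:PutObject", "s3:ListBucket"],
        "Resource": ["arn:aws:s3:::destinationbucket", "arn:aws:s3:::destinationbucket/*"]
    }]
}'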
Once you have it configured, you can perform the sync:
aws s3 sync s3://sourcebucket s3://destinationbucket   # sync is recursive by default; there is no --recursive flag
Hope it helps.

Is there a way to log files that are copied from an S3 bucket?

I'm looking for a way to log when data is copied from my S3 bucket. Most importantly, which file(s) were copied and when. If I had my way, I'd also like to know by whom and from where, but I don't want to get ahead of myself.
A couple of options:
Server Access Logging provides detailed records for the requests that are made to an S3 bucket (a CLI sketch to enable it follows below)
AWS CloudTrail captures a subset of API calls for Amazon S3 as events, including calls from the Amazon S3 console and from code calls to the Amazon S3 APIs
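For the first option, a sketch of enabling server access logging from the CLI (bucket names and prefix are placeholders; the target bucket must allow S3 log delivery):

aws s3api put-bucket-logging --bucket my-bucket \
    --bucket-logging-status '{"LoggingEnabled": {"TargetBucket": "my-log-bucket", "TargetPrefix": "access-logs/"}}'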

AWS console: see what action has been done by which user

On the AWS console, is there any history of user actions? I would like to see which of our users last modified a property of an S3 bucket, for example.
For this you can do a few things.
Set up AWS CloudTrail to audit user actions on AWS S3.
Enable logging for the S3 bucket and store the logs either in a bucket in the same account or in a different account (better if you need more security).
Enable versioning on S3 buckets, so past versions remain, which allows you to revert changes.
The best way to collect all user actions in AWS is using CloudTrail. Using CloudTrail you can also create trails that include S3 object-level operation events.
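As an illustration (the event name here is just an example of a bucket property change), once CloudTrail is enabled you can look up who performed a given management action from the CLI:

aws cloudtrail lookup-events \
    --lookup-attributes AttributeKey=EventName,AttributeValue=PutBucketVersioning \
    --max-items 10 --query 'Events[*].[EventTime, Username]' --output table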