How to stop people from downloading my Amazon S3 video files

I have created an educational website where I provide video lectures. I am using an Amazon S3 bucket to store/host the videos, and I embed each video on my website using its link. The problem is that the video player shows a download button at the bottom, so anyone can download the full video, and I want to prevent that.
I have tried changing bucket policies and ACLs, but nothing works (maybe I'm doing it wrong). I also tried AWS pre-signed URLs, but the player still shows a download button. Please help me.

One option is to have your back-end generate an Amazon S3 pre-signed URL, which is a time-limited URL that provides temporary access to a private object. Your app would first authenticate the user and then generate the pre-signed URL, which keeps working for a given period of time; after that, the link no longer works. However, during that window the user could still download the file.
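As a rough illustration, here is a minimal sketch of generating such a URL with boto3. The bucket name, object key, and expiry are placeholder assumptions, not values from the question.

import boto3

# Placeholder names -- substitute your own bucket and object key.
BUCKET = "my-video-bucket"
KEY = "lectures/lesson-01.mp4"

s3 = boto3.client("s3")

# The URL below is valid for one hour; after that the link stops working.
url = s3.generate_presigned_url(
    ClientMethod="get_object",
    Params={"Bucket": BUCKET, "Key": KEY},
    ExpiresIn=3600,
)
print(url)

Your back-end would return this URL to the player only after the user has been authenticated.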
Another option is to use a streaming server (e.g. Wowza or Elemental) that sends content in a form that can't simply be downloaded. However, extra costs are involved in running such a service.
Unfortunately, it is the nature of the Internet that you need to send content to users, and users can either consume or download that content. Some people would say that this is a benefit of the Internet!

First, find the bucket where you keep your files, go to Properties, and edit Default encryption: enable server-side encryption with Amazon S3-managed keys (SSE-S3), then save.
Next, go to Permissions and edit the bucket policy, adding the condition below (StringLike on aws:Referer):
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "PublicRead",
            "Effect": "Allow",
            "Principal": "*",
            "Action": [
                "s3:GetObject"
            ],
            "Resource": "arn:aws:s3:::YOUR-BUCKET-NAME/*",
            "Condition": {
                "StringLike": {
                    "aws:Referer": "https://YOURWEBSITE/*"
                }
            }
        }
    ]
}
Then save.
I tested it on my website and it works. It also works for images, but it will not affect your PDF documents, if you have any.

Related

How to automate visualization of S3 objects by using Lambda?

I have these objects (emails) in my S3 bucket and I want to be able to visualize them, since they're not easy to open and currently you have to do it manually. I was told there is a way to use a Lambda function with S3 to create some kind of HTML view page for the objects.
I hope someone can help me do that.
Based on what you have provided, here's what I understand:
You have emails in S3.
You want to visualize/read them via a website.
There are many ways to do it. The simplest is using S3's static website hosting capability; see the AWS documentation for step-by-step details.
This approach assumes your emails are in .txt format. If you have any other format (e.g. PDF, .eml, etc.) you will need a corresponding parser library and logic to open and read them, and in that case this example may not work; you may want to look at other AWS options such as Lightsail or Amplify, depending on your requirements. A rough sketch of parsing an .eml object is shown below.
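As an illustration only, here is a minimal Python sketch (e.g. inside a Lambda function) that reads a raw email object from S3 and extracts its subject and body using the standard library's email parser. The bucket name and key are hypothetical.

import boto3
from email import policy
from email.parser import BytesParser

# Hypothetical names for illustration only.
BUCKET = "yours3bucket"
KEY = "emails/youremailobject.eml"

s3 = boto3.client("s3")
raw = s3.get_object(Bucket=BUCKET, Key=KEY)["Body"].read()

# Parse the raw MIME message and pull out the readable part.
msg = BytesParser(policy=policy.default).parsebytes(raw)
print("Subject:", msg["subject"])
body = msg.get_body(preferencelist=("plain", "html"))
if body is not None:
    print(body.get_content())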
Based on the AWS doc, here's what you can do at a high level.
Create a basic index.html and upload it to your S3 bucket.
Create a basic error.html and upload it to your S3 bucket.
Under the bucket's Properties tab, edit and enable 'Static website hosting',
e.g. https://s3.console.aws.amazon.com/s3/buckets/yours3bucket?region=us-west-1&tab=properties
This will give you a website URL something like the one below.
http://yours3bucket.s3-website-us-west-1.amazonaws.com
Under the bucket's Permissions tab, uncheck "Block all public access".
[Caution: this will open your S3 bucket to the world. For better access control, consider using IAM.]
Add a bucket policy.
[This example shows a policy that allows access to anyone; you should consider restricting it to certain users using IAM.]
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "PublicRead",
            "Effect": "Allow",
            "Principal": "*",
            "Action": [
                "s3:GetObject",
                "s3:GetObjectVersion"
            ],
            "Resource": "arn:aws:s3:::yours3bucket/*"
        }
    ]
}
After that, simply launch your static website
http://yours3bucket.s3-website-us-west-1.amazonaws.com/index.html
Each of your emails (assuming they are in text format) should be accessible as below.
http://yours3bucket.s3-website-us-west-1.amazonaws.com/youremailobject.txt
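If you would rather script these console steps, a rough boto3 equivalent might look like the sketch below; the bucket name is an assumption and the two HTML files are expected to exist locally.

import boto3

BUCKET = "yours3bucket"  # assumed placeholder

s3 = boto3.client("s3")

# Upload the two basic pages with an HTML content type so browsers render them.
for page in ("index.html", "error.html"):
    s3.upload_file(page, BUCKET, page, ExtraArgs={"ContentType": "text/html"})

# Turn on static website hosting (equivalent to the console's 'Static website hosting' switch).
s3.put_bucket_website(
    Bucket=BUCKET,
    WebsiteConfiguration={
        "IndexDocument": {"Suffix": "index.html"},
        "ErrorDocument": {"Key": "error.html"},
    },
)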

Restrict read-write access from my S3 bucket

I am hosting a website where users can write and read files, which are stored in another S3 bucket. However, I want to restrict access to these files to my website only.
For example, loading a picture.
If the request comes from my website (example.com), I want the read (or write if I upload a picture) request to be allowed by the AWS S3 storing bucket.
If the request comes from the user who directly writes the Object URL in his browser, I want the storing bucket to block it.
Right now, despite everything I have tried, people can access resources via the Object URL.
Here is my Bucket Policy:
{
    "Version": "2012-10-17",
    "Id": "Id",
    "Statement": [
        {
            "Sid": "Sid",
            "Effect": "Allow",
            "Principal": "*",
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:GetObjectAcl"
            ],
            "Resource": "arn:aws:s3:::storage-bucket/*",
            "Condition": {
                "StringLike": {
                    "aws:Referer": "http://example.com/*"
                }
            }
        }
    ]
}
Additional information:
All my "Block public access" settings are unchecked. (I think the problem comes from here: when I check the two boxes about ACLs, my main problem is fixed, but then I get a 403 Forbidden error when uploading files to the bucket, which is another problem.)
My website is statically hosted on another S3 bucket.
If you need more information or details, ask me.
Thank you in advance for your answers.
This message was written by a French speaker; sorry for the mistakes.
"aws:Referer": "http://example.com/*
The referer is an http header passed by the browser and any client could just freely set the value. It provides no real security
However, I want to restrict the access of these files only to my website
The standard way to restrict access to S3 resources for a website is to use pre-signed URLs. Basically, your website's back-end creates an S3 URL to download or upload an S3 object and passes that URL only to an authenticated/allowed client; your resource bucket can then block public access entirely. Allowing uploads without authentication is usually a very bad idea.
Yes, in this case your website is not static anymore and you need some backend logic to do so.
If your website clients are authenticated, you can use Amazon API Gateway and Lambda to create these pre-signed URLs for the clients.
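For illustration, here is a minimal sketch of a Lambda handler (e.g. behind API Gateway) that issues a short-lived pre-signed upload URL. The bucket name, query parameter, and key prefix are assumptions, and real code should verify the caller's identity before issuing the URL.

import json
import boto3

BUCKET = "storage-bucket"  # assumed placeholder

s3 = boto3.client("s3")

def handler(event, context):
    # Assumption: the client passes ?filename=... ; authenticate the caller first in real code.
    filename = event["queryStringParameters"]["filename"]
    key = "uploads/" + filename

    # Short-lived URL the browser can PUT the file to directly.
    upload_url = s3.generate_presigned_url(
        ClientMethod="put_object",
        Params={"Bucket": BUCKET, "Key": key},
        ExpiresIn=300,
    )
    return {"statusCode": 200, "body": json.dumps({"uploadUrl": upload_url})}

A pre-signed GET URL for reads is generated the same way with ClientMethod="get_object".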

Amazon S3 image hosting with Shopify

I have an AWS S3 bucket that I store product images on. I sell on multiple sales channels and use ChannelAdvisor to share all my product data with the different sites. My image URLs are sent via ChannelAdvisor to the sites. Amazon reads my images fine; my website on Shopify does not read the images at all.
I think it's because of how the images are shared. If you put the image URL in your browser, it downloads the image, but I want it to just show the image in the browser. I think this is my problem with Shopify.
Below is my current AWS policy. My question is: how do I change the policy or the shared URLs so that the image loads in the browser instead of downloading?
{
    "Version": "2008-10-17",
    "Statement": [
        {
            "Sid": "AllowPublicRead",
            "Effect": "Allow",
            "Principal": {
                "AWS": "*"
            },
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::mybucket-name/*"
        }
    ]
}
This is not a function of policy, but rather one of metadata. Browsers use the Content-Type response header to determine what kind of file is coming in, and how to handle it. For example, for a .png file, the content type needs to be set to image/png. You set this when uploading the files to S3.
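As a rough sketch, here is how the content type could be set with boto3, both for a new upload and for an object that is already in the bucket (rewriting its metadata via a copy onto itself). The bucket and key names are placeholders.

import boto3

s3 = boto3.client("s3")

# Set the content type at upload time so browsers render the file inline.
s3.upload_file(
    "product-123.png",
    "mybucket-name",
    "images/product-123.png",
    ExtraArgs={"ContentType": "image/png"},
)

# For an existing object, rewrite its metadata by copying it onto itself.
s3.copy_object(
    Bucket="mybucket-name",
    Key="images/product-123.png",
    CopySource={"Bucket": "mybucket-name", "Key": "images/product-123.png"},
    ContentType="image/png",
    MetadataDirective="REPLACE",
)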

Can't download objects from S3 using the URL link

I have a policy like below which I've attached to several users:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Action": [
                "s3:*"
            ],
            "Effect": "Allow",
            "Resource": [
                "arn:aws:s3:::foo-bar",
                "arn:aws:s3:::foo-bar/*"
            ]
        }
    ]
}
The intent of this policy is that these users have full control over the bucket foo-bar. However, I'm noticing that even though the users can download the objects in this bucket using their access key and secret key, they cannot download the objects via a URL, e.g. https://s3.amazonaws.com/foo-bar/test.docx
I am currently logged in as an IAMManager user and also have AmazonS3FullAccess. I can see the list of objects in this bucket, but when I click the URL I can't download them. I can, however, download them by clicking the Download button.
Question
Is there anything I need to change in my policy, or can the objects only be downloaded via the URL when they are publicly available?
By using your IAM policy on your users, you are granting S3 API access to the bucket for those users only. However, when accessing the bucket via the URL, you are not using the S3 APIs and no authentication information is being sent.
In order to download objects from an S3 bucket directly via the URL, the bucket needs to be made public, which you may or may not want to do.
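To make the difference concrete, here is a small sketch contrasting the two access paths; the bucket and key are the placeholders from the question, and the requests library is used only to show an unauthenticated HTTP call.

import boto3
import requests

BUCKET = "foo-bar"
KEY = "test.docx"

# Authenticated S3 API call: allowed by the IAM policy attached to the user.
s3 = boto3.client("s3")
s3.download_file(BUCKET, KEY, "test.docx")

# Plain HTTPS request: no credentials are sent, so S3 returns 403 unless the
# object is public or the URL is a pre-signed one.
resp = requests.get("https://s3.amazonaws.com/" + BUCKET + "/" + KEY)
print(resp.status_code)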

AWS S3 MimeMessage files generated by SES cause 403 when accessed by java SDK

So I've been working with this problem all day, and I can't seem to find the cause of this issue.
I have an action in SES that will forward all emails at a specific subdomain to a specific bucket. These messages can be downloaded fine and contain all necessary information when interacted with in the console, but fail to be retrieved by using getObject() in the Java SDK.
I can confirm that the SDK credentials work correctly, as I can download other files from the same bucket, even with the same key prefix through my code.
That proves my bucket policy is set up correctly. The entry dealing with the getObject permission looks like this:
{
    "Sid": "EmailsAccess",
    "Effect": "Allow",
    "Principal": "*",
    "Action": [
        "s3:DeleteObject",
        "s3:GetObject"
    ],
    "Resource": "arn:aws:s3:::foo-bucket-foo/foo-prefix-foo/*"
}
I'm sure that the root cause of the issue has to do with the owner, since that is defined as "aws-ses+publishing.us-east-1.prod" in each file generated by SES. Why is that causing my code to bring up 403s? Is there any way to change a file's owner, or is there a more elegant solution?
I found that the issue was that the credentials used in the SDK belonged to a different account from the bucket owner, and so did not have permission to view those specific messages.
Following this guide would have solved the issue, but we decided to go another route and are trying to log in using the bucket owner accounts.
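If it helps anyone diagnosing a similar 403, here is a small boto3 sketch that prints an object's owner and grants; the bucket and key mirror the placeholders in the policy above, and this call can itself return 403 if the caller lacks permission to read the ACL.

import boto3

# Placeholder names mirroring the policy fragment above.
BUCKET = "foo-bucket-foo"
KEY = "foo-prefix-foo/some-message-id"

s3 = boto3.client("s3")

# Shows who owns the object and which grants exist -- useful for spotting
# SES-written objects that the calling account cannot read.
acl = s3.get_object_acl(Bucket=BUCKET, Key=KEY)
print("Owner:", acl["Owner"])
for grant in acl["Grants"]:
    print(grant["Grantee"], grant["Permission"])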