How do you stop downloads from AWS S3 with the object URL? - amazon-web-services

I have a website, similar to a video-hosting site, where I need to display uploaded videos and images: the images should always be visible, and the videos only if they have been purchased. Their locations are saved in the database (MongoDB) and rendered on the web page, so they show up in the network tab of the developer console.
This means that if you click a link such as "https://s3.Region.amazonaws.com/bucket-name/key-name/folder/file-name.mp4" it auto-downloads. This only happens on Chrome; Firefox just displays the object with no download option. I have tried changing the bucket policy and adding encryption, but that either makes the images I want to display invisible (because they are no longer publicly accessible) or has no effect and still allows the video to be downloaded. Is there any way to keep the images and videos in the same bucket, have them both be visible under the right circumstances, but block direct access to the bucket and prevent anyone but the bucket owner from downloading them?

You cannot stop the downloads because the ability to show videos and images in a browser also means that the files are accessible via URL (that's how the browser fetches them).
One option is to use an Amazon S3 pre-signed URL, which is a time-limited URL that provides temporary access to a private object. The way it would work is:
Users authenticate to your back-end service
When a user requests access to one of the videos or images, your back-end checks that they are authorized to access the file
If so, your back-end generates an Amazon S3 pre-signed URL and includes it in the web page (eg <img src='...'>)
When the user's browser accesses that URL, Amazon S3 will verify that the URL is correct and the time-limit has not expired. If it's OK, then the file is provided.
Once the time limit expires, the URL will not work
This will not prevent a file being downloaded, but it will limit the time during which it can be done.
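To make the mechanism concrete: a pre-signed URL is just the object URL with a SigV4 signature added to the query string. In practice you would let the AWS SDK generate it (e.g. boto3's `generate_presigned_url` in Python), but the stdlib-only sketch below shows roughly what the SDK computes. The bucket, key, and credential values are placeholders:

```python
import datetime
import hashlib
import hmac
import urllib.parse

def presign_get(bucket, key, region, access_key, secret_key, expires=300):
    """Sketch of a SigV4 pre-signed GET URL for an S3 object."""
    host = f"{bucket}.s3.{region}.amazonaws.com"
    now = datetime.datetime.now(datetime.timezone.utc)
    amz_date = now.strftime("%Y%m%dT%H%M%SZ")
    scope = f"{now:%Y%m%d}/{region}/s3/aws4_request"
    # These query parameters carry everything S3 needs to verify the request.
    query = {
        "X-Amz-Algorithm": "AWS4-HMAC-SHA256",
        "X-Amz-Credential": f"{access_key}/{scope}",
        "X-Amz-Date": amz_date,
        "X-Amz-Expires": str(expires),
        "X-Amz-SignedHeaders": "host",
    }
    canonical_query = "&".join(
        f"{urllib.parse.quote(k, safe='')}={urllib.parse.quote(v, safe='')}"
        for k, v in sorted(query.items())
    )
    canonical_request = "\n".join([
        "GET", "/" + urllib.parse.quote(key), canonical_query,
        f"host:{host}\n", "host", "UNSIGNED-PAYLOAD",
    ])
    string_to_sign = "\n".join([
        "AWS4-HMAC-SHA256", amz_date, scope,
        hashlib.sha256(canonical_request.encode()).hexdigest(),
    ])
    # Derive the signing key: date -> region -> service -> "aws4_request".
    def _hmac(key_bytes, msg):
        return hmac.new(key_bytes, msg.encode(), hashlib.sha256).digest()
    signing_key = _hmac(_hmac(_hmac(_hmac(
        b"AWS4" + secret_key.encode(), f"{now:%Y%m%d}"), region), "s3"), "aws4_request")
    signature = hmac.new(signing_key, string_to_sign.encode(),
                         hashlib.sha256).hexdigest()
    return f"https://{host}/{urllib.parse.quote(key)}?{canonical_query}&X-Amz-Signature={signature}"
```

Because the expiry is signed into the URL, changing `X-Amz-Expires` after the fact invalidates the signature. Don't hand-roll this in production; the SDKs do it correctly for you.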
Alternate methods would involve serving content via streaming instead of via a file, but that is a much more complex topic. (For example, think about how Netflix streams content to users rather than waiting for them to download files.)

Related

Selling access to web apps stored in S3 Bucket

I have several Rise 360 courses that I have exported as web apps and added to my S3 bucket. I want to know the best way to sell access to these web apps from my website, which is built on the WordPress platform. I currently have 10 web apps in one bucket.
I don't want people to be able to take the URL and post it somewhere.
Content in Amazon S3 is private by default. Access is only available if you grant access in some way.
A good way to grant access to private content is to use Amazon S3 pre-signed URLs. These grant temporary access to private objects.
The flow would work something like this:
A user purchases a course
They then access a "My Courses" page
When generating that page, the PHP code would consult a database to determine what courses they have purchased
For each course they are allowed to access, the PHP code will generate a pre-signed URL to the course in Amazon S3. The URL can be configured to provide access for a period of time, such as 30 minutes
The user follows that URL and accesses the course. (Note: this assumes that only a single object is accessed.)
Once the expiry time has passed, the object is no longer accessible. The user would need to return to the "My Courses" page and click a newly-generated link to access the course again
If a user extracts the URL from the page, they will be able to download the object. You say "I don't want people to be able to take the URL and post it somewhere." This is not possible to guarantee because the app is granting them access to the object. However, that access will be time-limited so if they share the URL, it will stop working after a while.
If your app requires access to more than one URL (eg if the first page refers to a second page), then this method will not work. Instead, users will need to access the content via your app, with the app checking their access every time rather than allowing users to access the content directly from S3.
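The authorization step in the flow above boils down to: look up the purchase record, and only then mint a short-lived URL. In PHP you would call the AWS SDK's `createPresignedRequest`; the same logic is sketched here in Python, with `PURCHASES` standing in for the database lookup and `presign` for the SDK call:

```python
# Stand-in for the database of purchases consulted by the "My Courses" page.
PURCHASES = {"alice": {"course-101", "course-202"}}

def course_link(user, course, presign, expires=1800):
    """Return a 30-minute pre-signed URL if the user bought the course, else None."""
    if course not in PURCHASES.get(user, set()):
        return None  # not purchased: render no link at all
    return presign(f"courses/{course}/index.html", expires)
```

The key design point is that the check happens on every page render, so a revoked purchase stops producing fresh links as soon as the old one expires.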

Google storage bucket (google api link) integration with Cloud CDN

Respected seniors!
I have hosted a website on GCE and store all static content in a GCS bucket. The content is then accessed using the Google API link, e.g. https://storage.googleapis.com/[BUCKET_NAME]/[OBJECT_NAME]...
The API link is embedded in a web page on my site at https://www.example.com/post/[random post number] (this is an HTML page that I need to serve fresh every time, so I set a cookie to avoid caching it in the CDN).
I have enabled Cloud CDN on my load-balancer backend (GCE). My understanding was that the static-content link embedded in the web page would automatically be cached in the CDN because the bucket is publicly accessible.
Verifying with cURL, I found that when I access the object via the API link (https://storage.googleapis.com/[BUCKET_NAME]/[OBJECT_NAME]) the content is cached in the CDN, but when I access it via my site link (https://www.example.com/post/[random post number]) it is not cached.
Kindly suggest a method to cache the bucket content in Cloud CDN when its link is embedded in a dynamic-content URL that has to be fetched fresh every time.
Thanks in advance
Based on your description, this looks like it is working as intended. The reason the content is not being cached is that a cookie header is present on the response.
This article explains when content is non-cacheable: https://cloud.google.com/cdn/docs/caching#non-cacheable_content
This is also confirmed by what you mentioned: "(https://storage.googleapis.com/[BUCKET_NAME]/[OBJECT_NAME]), the content is cached in CDN." The bucket responses don't carry any cookies, which is why they are cached.
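As a simplified illustration of the rules on that page (this is a sketch, not the CDN's full decision logic): a response carrying a `Set-Cookie` header, or `Cache-Control` directives such as `private`, `no-store`, or `no-cache`, will not be cached.

```python
def cdn_cacheable(headers):
    """Simplified check mirroring Cloud CDN's non-cacheable-content rules:
    Set-Cookie, or restrictive Cache-Control directives, block caching."""
    h = {k.lower(): v.lower() for k, v in headers.items()}
    if "set-cookie" in h:
        return False
    cache_control = h.get("cache-control", "")
    return not any(d in cache_control for d in ("private", "no-store", "no-cache"))
```

This is why the direct bucket response is cached (no cookies) while the site page, which deliberately sets a cookie to stay fresh, is not.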

How to restrict users from downloading files uploaded to AWS S3

I am developing an LMS in Laravel, uploading all the video files to an AWS S3 bucket, and playing them with the video.js player. The problem is that users can download the video files, which I want to prevent. Can anybody tell me whether this is possible and, if so, how to do it?
Objects in Amazon S3 are private by default.
However, if you wish students to make use of a file (eg a learning course video), you will need to grant access to the file. The best way to do this is by using Amazon S3 pre-signed URLs, which provide time-limited access to a private object.
For example, the flow would be:
A student logs into the LMS
A student requests access to a course
The LMS checks whether they are entitled to view the course (using your own business logic)
If they are permitted to use the course, the LMS generates a pre-signed URL using a few lines of code, and returns the link in a web page (eg via an <a> tag).
The student can access the content
Once the expiry duration has passed, the pre-signed URL no longer works
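The expiry check in the last step is performed by S3 itself, using two query parameters signed into the pre-signed URL. A sketch of the equivalent check:

```python
import datetime
import urllib.parse

def url_expired(presigned_url, now=None):
    """Check a SigV4 pre-signed URL's expiry the way S3 does: the URL is
    valid for X-Amz-Expires seconds after its X-Amz-Date timestamp."""
    query = urllib.parse.parse_qs(urllib.parse.urlsplit(presigned_url).query)
    issued = datetime.datetime.strptime(
        query["X-Amz-Date"][0], "%Y%m%dT%H%M%SZ"
    ).replace(tzinfo=datetime.timezone.utc)
    lifetime = datetime.timedelta(seconds=int(query["X-Amz-Expires"][0]))
    now = now or datetime.datetime.now(datetime.timezone.utc)
    return now > issued + lifetime
```

Because both parameters are covered by the signature, a student cannot extend the lifetime by editing the URL.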
However, during the period where the student has access to the file, they can download it. This is because access has been granted to the object. This is necessary because the web browser needs access to the object.
The only way to avoid this would be to provide courseware on a 'streaming' basis, where there is a continuous connection between the frontend and backend. This is not likely to be how your LMS is designed.

How can I grant access only if resource is accessed through my domain?

I have a bunch of videos and all of them are uploaded on Wistia. On Wistia, I have set up access for my domain, so they will play only when the videos are fetched from my domain.
If someone uses View Source, copies the video URL, and pastes it into a separate browser window, they get an "access denied" message.
I'm thinking about moving my videos to Google Cloud Storage. So, my questions are:
Does Google Cloud provide a similar domain-restriction feature?
How can I set this up? For now, I've created a temporary bucket and uploaded a video and granted it public access. Then I copied the public link of the MP4 file and added to my website, and it obviously plays, but then any paid member can use View Source, copy the MP4 link and upload it to other streaming services for everyone to see.
EDIT
Is there a way to do this programmatically? My website is in PHP, so something along the lines of: keep the bucket as restricted access and then, through PHP, pass some key and retrieve the video file. Not sure if something like this is possible.
Thanks
I do not believe that there is an access control mechanism in Google Cloud Storage equivalent to the one you are using in Wistia.
There are several methods to restrict object access (see https://cloud.google.com/storage/docs/access-control) in GCS, but none of them are based upon where the request came from. The only one that kind of addresses your issue is to use Signed URLs. Basically, a user would go to your site, but instead of giving them the "real" URL of the object they are going to be using, your application retrieves a special URL that is time-limited. You can set the length of time it is valid for.
But if what you are worried about is people copying your video, presumably they could still see the URL someplace and copy the data from there if they did it immediately, so I don't think that really solves your problem.
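Regarding the EDIT: yes, the programmatic pattern is to keep the bucket private and have your backend mint a short-lived credential per request (GCS's Signed URLs are exactly this; the GCS client libraries, including the PHP one, can generate them). The general idea can be sketched with a stdlib HMAC token, where the secret and names are placeholders:

```python
import hashlib
import hmac
import time

SECRET = b"server-side-secret"  # placeholder; keep out of source control

def make_token(object_name, ttl=600, now=None):
    """Mint an expiring token the backend hands to a paid member."""
    expires = int(now or time.time()) + ttl
    msg = f"{object_name}:{expires}".encode()
    sig = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    return f"{expires}.{sig}"

def check_token(object_name, token, now=None):
    """Verify the token before the backend serves the private object."""
    expires, sig = token.split(".")
    if int(expires) < int(now or time.time()):
        return False  # token has expired
    msg = f"{object_name}:{expires}".encode()
    expected = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)
```

The same caveat from above applies: anyone holding a valid token can fetch the object until it expires, so this limits sharing rather than preventing copying outright.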
Sorry I can't be more helpful.

Is there any way to make the URL of an object in my AWS S3 bucket inaccessible?

I have some images on my website that are loaded in via S3. However, I want to make those URLs private/inaccessible without compromising the visibility of the images on my site. Is this possible?
When I make the URL of the object private like so:
It addresses me with the following page upon attempting to view it:
I am fairly new at S3.. Any suggestions?
No. If the S3 URL is embedded in your page, then the client has to be able to read the file from S3, which means it needs to be readable by the browser: either public, or accessed via a time-limited pre-signed URL. S3 is no different from any other web resource in that way.