Authenticate users to view files from AWS S3

Good Day Everybody,
I'm fairly new to AWS, and I have a problem right now. I'm not even sure if this is something that is possible with S3. I tried googling it, but couldn't find a proper answer (probably because the keywords I searched for don't make much sense ;)).
So my problem is this: I have a Node application which uploads user images to S3. I want to know how to properly access these images later in the front-end (some sort of direct link). But at the same time, I should be able to restrict which users can access an image. For example, if user xyz uploads an image, only that user should be able to see it; if another user, say abc, tries to open the direct link, it should say access restricted or something similar.
Or if that is not possible, at least I should be able to put a signed expiration timestamp on the GET URL, so that the image is accessible through that particular URL for only a limited amount of time.
Thanks in advance.

This is the typical use case for S3 Pre-signed URLs.
With S3, you can append query-string parameters to your object's URL that include an access key, an expiration timestamp, and a signature. S3 validates the signature and checks whether the request was made before the expiration timestamp. If so, it serves the object; otherwise, it returns an error.
The AWS SDK for JavaScript (Node.js) includes an example on how to generate pre-signed URLs: http://docs.aws.amazon.com/AWSJavaScriptSDK/guide/node-examples.html#Amazon_S3__Getting_a_pre-signed_URL_for_a_getObject_operation__getSignedUrl_
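For illustration, here is a minimal sketch of the same operation in Python with boto3 (the linked guide shows the Node.js equivalent); the bucket and key names are placeholders:

    import boto3

    s3 = boto3.client("s3")

    # Generate a URL that serves the private object for 10 minutes only.
    # "my-bucket" and the key are hypothetical values.
    url = s3.generate_presigned_url(
        ClientMethod="get_object",
        Params={"Bucket": "my-bucket", "Key": "uploads/user-xyz/avatar.jpg"},
        ExpiresIn=600,  # seconds
    )
    print(url)

Your backend checks that the requester is the user who uploaded the image before handing out the URL; the URL then works for anyone who has it, but only until it expires.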

Related

Any way we can share a specific item publicly from a private S3 bucket?

The question is pretty vague, but here's the entire problem statement. I am using Django REST APIs, and I'm generating invoices for my clients. Using wkhtmltopdf, I'm able to generate a PDF file which gets automatically backed up to S3. Now, we need to retrieve the said invoice once our client clicks on a link.
We're using pre-signed URLs right now, which last for 12 hours, right? Once that link expires, the entire backend fails.
I mean, even if we go for permanent pre-signed links, would there not be a security issue?
I could really use some guidance on this.
Now, we need to retrieve the said invoice once our client clicks on a link.
We're using pre-signed URLs right now [...]
Only generate the pre-signed URL for a given S3 URI when the authenticated client clicks on the link. You can then give it a very short expiry.
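A rough sketch of what that could look like in a Django REST Framework view, using boto3; the model, field, and bucket names are hypothetical:

    import boto3
    from rest_framework.decorators import api_view, permission_classes
    from rest_framework.permissions import IsAuthenticated
    from rest_framework.response import Response

    from .models import Invoice  # hypothetical model with an `s3_key` field


    @api_view(["GET"])
    @permission_classes([IsAuthenticated])
    def invoice_link(request, invoice_id):
        # Only the invoice's owner may request a link.
        invoice = Invoice.objects.get(pk=invoice_id, owner=request.user)

        s3 = boto3.client("s3")
        url = s3.generate_presigned_url(
            ClientMethod="get_object",
            Params={"Bucket": "invoices-bucket", "Key": invoice.s3_key},
            ExpiresIn=60,  # one minute is plenty for a click-through
        )
        return Response({"url": url})

Because the link is minted on demand, a stale link in an old email or page never breaks your backend; the client simply requests a fresh one.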

How to restrict users from downloading files uploaded to AWS S3

I am developing an LMS in Laravel, uploading all the video files to an AWS S3 bucket, and playing them using the video.js player. The problem is that users can download the video files, which I want to stop. Can anybody tell me whether this is possible and, if so, how I can do it?
Objects in Amazon S3 are private by default.
However, if you wish students to make use of a file (eg a learning course video), you will need to grant access to the file. The best way to do this is by using Amazon S3 pre-signed URLs, which provide time-limited access to a private object.
For example, the flow would be:
A student logs into the LMS
A student requests access to a course
The LMS checks whether they are entitled to view the course (using your own business logic)
If they are permitted to use the course, the LMS generates a pre-signed URL using a few lines of code (see the sketch below), and returns the link in a web page (eg via an <a> tag).
The student can access the content
Once the expiry duration has passed, the pre-signed URL no longer works
However, during the period where the student has access to the file, they can also download it. This is unavoidable: the web browser needs access to the object in order to play it.
The only way to avoid this would be to provide courseware on a 'streaming' basis, where there is a continuous connection between the frontend and backend. This is not likely to be how your LMS is designed.
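Steps 3 and 4 above might look something like the following in Python with boto3; the entitlement check and the bucket, key, and helper names stand in for your own business logic:

    import boto3

    s3 = boto3.client("s3")


    def course_video_url(student, course):
        # Step 3: your own business logic decides whether access is allowed.
        if not student.is_enrolled_in(course):  # hypothetical helper
            raise PermissionError("Student is not entitled to view this course")

        # Step 4: a few lines of code generate the time-limited link.
        return s3.generate_presigned_url(
            ClientMethod="get_object",
            Params={"Bucket": "lms-videos", "Key": course.video_key},
            ExpiresIn=3600,  # long enough to watch; afterwards the URL stops working
        )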

How can I grant access only if resource is accessed through my domain?

I have a bunch of videos and all of them are uploaded on Wistia. On Wistia, I have set up access for my domain, so they will play only when the videos are fetched from my domain.
If someone uses View Source and copies the video URL and pastes it in a separate browser window, they get an "access denied" message.
I'm thinking about moving my videos to Google Cloud Storage. So, my questions are:
Does Google Cloud provide a similar domain-restriction feature?
How can I set this up? For now, I've created a temporary bucket, uploaded a video, and granted it public access. Then I copied the public link of the MP4 file and added it to my website, and it obviously plays, but then any paid member can use View Source, copy the MP4 link, and upload it to other streaming services for everyone to see.
EDIT
Is there a way to do this programmatically? My website is in PHP, so something along these lines: keep the bucket as restricted access, and then through PHP pass some key and retrieve the video file. Not sure if something like this is possible.
Thanks
I do not believe that there is an access control mechanism in Google Cloud Storage equivalent to the one you are using in Wistia.
There are several methods to restrict object access in GCS (see https://cloud.google.com/storage/docs/access-control), but none of them are based on where the request came from. The only one that comes close to addressing your issue is Signed URLs. Basically, a user goes to your site, but instead of being given the "real" URL of the object, your application generates a special URL that is time-limited. You can set the length of time it is valid for.
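In Python, generating a v4 signed URL with the google-cloud-storage client looks roughly like this (the bucket and object names are made up):

    import datetime

    from google.cloud import storage

    client = storage.Client()
    blob = client.bucket("my-video-bucket").blob("videos/lesson-01.mp4")

    # The URL only works until the expiration; after that, GCS returns an error.
    url = blob.generate_signed_url(
        version="v4",
        expiration=datetime.timedelta(minutes=15),
        method="GET",
    )
    print(url)

The PHP client library offers the same feature, which matches the flow in your edit: keep the bucket private and have your PHP backend hand out short-lived URLs to authenticated members.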
But if what you are worried about is people copying your video, presumably they could still see the URL someplace and copy the data from there if they did it immediately, so I don't think that really solves your problem.
Sorry I can't be more helpful.

Exposing AWS S3 Signed URLs in Web App?

I have been trying to find an answer to this question for a couple of hours now, but have not managed to come up with a conclusive answer. I am hoping someone here will be able to shed some light on my question. Consider the following example AWS S3 URL:
https://some-bucket.s3-eu-west-2.amazonaws.com/uploads/images/some_image.jpg?X-Amz-Expires=600&X-Amz-Date=20170920T124015Z&X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAI6CJYFYSSWMXXXXX/20170920/eu-west-2/s3/aws4_request&X-Amz-SignedHeaders=host&X-Amz-Signature=0481296b70633de9efb2fce6e20751df2f55fd79b5ff9570c02ff8f587dce825
In my specific example, the above URL is a request to view an image on S3 which I am exposing directly in an HTML img tag, and the user in X-Amz-Credential has both read and write permissions. The URL is also set to expire in 10 minutes.
Is it safe to link to the image directly via this URL, or is there any possibility that, within these 10 minutes, the signature from this URL could be used in a maliciously crafted REST request to delete or modify the image instead of viewing it?
I suspect a different action would have a different signature, making this impossible, but given my very limited understanding of AWS auth, I thought it better to ask just in case.
I know I could create a read-only user (extra complexity) or hide the S3 URL behind a controller action on my own web app (requires 2 total requests to load each image, making it inefficient), but I would rather learn whether my current approach is safe or not before resorting to either of these.
Thank you kindly for your time. :)
If your pre-signed URL had been signed for a PUT or DELETE operation, someone could use the signature and access key ID to overwrite or delete your object. However, the signature is computed over the HTTP method and the specific operation, so a URL signed for getObject cannot be reused to modify or delete the object; each action requires its own signature.
Just make sure that you are signing the URL for a read-only operation (getObject) and you're good.
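You can verify the method binding with a quick sketch in Python (boto3 plus requests; the bucket and key are placeholders): a URL signed for get_object is rejected when replayed with a different HTTP verb.

    import boto3
    import requests

    s3 = boto3.client("s3")
    get_url = s3.generate_presigned_url(
        ClientMethod="get_object",
        Params={"Bucket": "some-bucket", "Key": "uploads/images/some_image.jpg"},
        ExpiresIn=600,
    )

    # The SigV4 signature covers the HTTP method, so reusing the GET URL
    # for a DELETE fails with 403 SignatureDoesNotMatch.
    print(requests.get(get_url).status_code)     # 200
    print(requests.delete(get_url).status_code)  # 403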

Google Drive Change File Ownership Using RESTapi in Python

I manage a domain of users and would like to be able to transfer all the documents of one user to another user. As far as I understand, the best way to achieve that is to find the file IDs of all files belonging to one user and transfer them to another user. However, I am having trouble constructing the query.
UPDATE:
So the correct query to retrieve the list of files would be:
    response = drive_service.files().list(q="'user@company.com' in owners").execute()
However, it only works for me as an admin. If I try to retrieve the list of files for any other user in my domain it returns an empty list.
Files.list retrieves the authenticated user's files; in this case, it gets all of your own files. That query would only return results if the other user were also an owner of one (or more) of your files.
Even as an admin, you cannot access users' files directly.
To access other users' files, as an admin you need to impersonate the users and then perform actions on their behalf.
This is achieved by using a service account with domain wide delegation of authority.
Here you can find more information on that, as well as a Python example.
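A minimal sketch of that impersonation flow, assuming a service account with domain-wide delegation has already been configured (the key file and user address are placeholders):

    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    SCOPES = ["https://www.googleapis.com/auth/drive"]

    creds = service_account.Credentials.from_service_account_file(
        "service-account.json", scopes=SCOPES
    )
    # Act on behalf of the user whose files you want to list.
    delegated_creds = creds.with_subject("user@company.com")

    drive_service = build("drive", "v3", credentials=delegated_creds)
    response = drive_service.files().list(
        q="'user@company.com' in owners"
    ).execute()
    print(response.get("files", []))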
Hope it helps.
If you want to transfer all the files of one user into another user's Drive, the easiest way would be to use the Data Transfer API provided by Google. This way you don't have to list the files and transfer them one by one. Also, you only need an admin access token and wouldn't need domain-wide delegation either. You can get the official documentation here.
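A hedged sketch of what the call could look like in Python with the Admin SDK client; the access token and user IDs are placeholders, and the Drive application ID is looked up rather than hard-coded:

    from google.oauth2.credentials import Credentials
    from googleapiclient.discovery import build

    # An admin's OAuth access token (placeholder); it must carry the
    # https://www.googleapis.com/auth/admin.datatransfer scope.
    admin_creds = Credentials(token="ya29.admin-access-token")

    service = build("admin", "datatransfer_v1", credentials=admin_creds)

    # Look up the Drive application's ID instead of hard-coding it.
    apps = service.applications().list().execute()
    drive_app = next(a for a in apps["applications"] if "Drive" in a["name"])

    # The user IDs are the directory IDs of the old and new owners (placeholders).
    transfer = {
        "oldOwnerUserId": "111111111111111111111",
        "newOwnerUserId": "222222222222222222222",
        "applicationDataTransfers": [{"applicationId": drive_app["id"]}],
    }
    service.transfers().insert(body=transfer).execute()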