How to search a file in the console of AWS-S3? - amazon-web-services

I have uploaded an image to S3 on Amazon Web Services. I just wanted to search for the image in the admin console of S3. I cannot find any search options there. Is there any other way?

Once in the console (e.g. inside the bucket), just start typing the name of the object you are looking for. The search box filters the listing by prefix, so the list refreshes with the matching objects at the top.
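The console search box is essentially a prefix filter, and the same filtering can be done programmatically. With boto3 (assumed installed and configured) the equivalent call would be roughly `s3.list_objects_v2(Bucket="my-bucket", Prefix="holiday-photo")`; the bucket and prefix names here are hypothetical. The filter itself, sketched in pure Python:

```python
def filter_keys_by_prefix(keys, prefix):
    """Return the object keys that start with the given prefix,
    mirroring what the S3 console search box does."""
    return [k for k in keys if k.startswith(prefix)]

# Hypothetical key names for illustration:
keys = ["holiday-photo-001.jpg", "holiday-photo-002.jpg", "invoice.pdf"]
print(filter_keys_by_prefix(keys, "holiday-photo"))
```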

Related

How do you stop downloads from AWS S3 with the object URL

I have a website similar to a video-hosting site, where I need to display uploaded videos and images: the images should always be visible, and the videos only if they have been purchased. Their locations are saved in the database (MongoDB) and rendered on the web page, so they show up in the Network tab of the developer console.
This means that if you click a link such as "https://s3.Region.amazonaws.com/bucket-name/key-name/folder/file-name.mp4", it auto-downloads. This only happens in Chrome, though; Firefox just displays the object with no download option. I have tried changing the bucket policy and adding encryption, but that either makes the images I want to display invisible (because they are no longer publicly accessible) or has no effect and still allows the video to be downloaded. Is there any way for me to keep the images and videos in the same bucket, have both be visible under the right circumstances, but block direct access to the bucket and prevent anyone but the bucket owner from downloading them?
You cannot stop the downloads because the ability to show videos and images in a browser also means that the files are accessible via URL (that's how the browser fetches them).
One option is to use an Amazon S3 pre-signed URL, which is a time-limited URL that provides temporary access to a private object. The way it would work is:
Users authenticate to your back-end service
When a user requests access to one of the videos or images, your back-end checks that they are authorized to access the file
If so, your back-end generates an Amazon S3 pre-signed URL and includes it in the web page (e.g. <img src='...'>)
When the user's browser accesses that URL, Amazon S3 will verify that the URL is correct and the time-limit has not expired. If it's OK, then the file is provided.
Once the time limit expires, the URL will not work
This will not prevent a file being downloaded, but it will limit the time during which it can be done.
Alternate methods would involve serving content via streaming instead of via a file, but that is a much more complex topic. (For example, think about how Netflix streams content to users rather than waiting for them to download files.)
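The mechanism behind a time-limited URL can be sketched with the standard library alone. This is not the real AWS Signature Version 4 algorithm that S3 pre-signed URLs use (boto3's `generate_presigned_url` handles that); it is a minimal illustration of the idea: the server signs the object key plus an expiry time with a secret, and a request is only honoured if the signature checks out and the clock has not passed the expiry. All names here are hypothetical.

```python
import hashlib
import hmac
import time
from urllib.parse import urlencode

SECRET = b"server-side-secret"  # hypothetical; never exposed to the browser

def make_signed_url(base_url, key, expires_in=300, now=None):
    """Build a time-limited URL by signing the key and expiry with the secret."""
    expires = int(now if now is not None else time.time()) + expires_in
    sig = hmac.new(SECRET, f"{key}:{expires}".encode(), hashlib.sha256).hexdigest()
    return f"{base_url}/{key}?" + urlencode({"Expires": expires, "Signature": sig})

def verify_signed_url(key, expires, signature, now=None):
    """Accept the request only if the signature matches and has not expired."""
    current = now if now is not None else time.time()
    expected = hmac.new(SECRET, f"{key}:{expires}".encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature) and current < expires
```

As the answer notes, this limits *when* a download can happen, not *whether* it can: anyone holding a valid, unexpired URL can still fetch the bytes.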

Show/Render S3 data through CloudFront visually in a folder structure

I have a private bucket with some csv files. I want to provide these files to the end user, but I do not want the end user to be logging into my S3 bucket console to download these files.
I can have a CloudFront distribution that allows access to these files only through that distribution by using Origin Access Identities (OAI). But to fetch the files through CloudFront, the user is expected to know the full path of the file in S3.
In my case the user does not know the full path, or the name of the files. I am trying to find a way to render the csv files on S3 and provide them in some way to the user to download without having them go to the console. Ideally they would see a very basic folder structure that they can navigate and click on to download the files.
Does this require building a full web app? What is the easiest way?
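It does not necessarily require a full web app: one lightweight approach (an assumption, not the only option) is a small page or Lambda function that lists the bucket's keys and renders them as a clickable tree of CloudFront links. The core step is turning flat S3 keys into a nested structure, sketched here in pure Python; the key names are hypothetical.

```python
def build_tree(keys):
    """Turn flat S3 keys like 'reports/2023/jan.csv' into a nested dict
    representing the folder structure (leaves are empty dicts)."""
    tree = {}
    for key in keys:
        node = tree
        for part in key.split("/"):
            node = node.setdefault(part, {})
    return tree

keys = ["reports/2023/jan.csv", "reports/2023/feb.csv", "readme.txt"]
print(build_tree(keys))
```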

Transferring Specific Files/Folders over with Google Admin SDK API

Google API Admin SDK Data Transfer: Can be found here
I have been able to successfully copy over all files from 1 Google account to another, but I am looking to copy over just 1 specific file. Scopes and permissions are correct.
The successful API request body to move all files is:
{
  "oldOwnerUserId": "{ID transferring from}",
  "newOwnerUserId": "{ID transferring to}",
  "applicationDataTransfers": [
    {
      "applicationId": "{Google Drive Application ID}"
    }
  ]
}
The ID for the Google Drive Folder I'm looking to copy over is 1HCrNywrvoUly_MrYndR. How can I adapt this code to only transfer over this Google Drive folder?
Alternatively, is there a way I can create default folders in other users' accounts through the Google API? I need to create a set of blank folders for every new user who creates an account, and I haven't been able to find a way to create folders in other accounts using the Google Drive API. The workaround is to create all the folders in my own Google Drive and then use the Admin SDK API to ship the completed folder off to the new user.
After digging around, I found a solution that was way less complicated than using a Google Service Account. All you need is Admin credentials.
If you create a folder in your own Google Drive through the API, you can load it up with whatever you want. After that, you can change the permissions of the parent folder to make the target user an owner. You then delete your own permission and, voila, that folder now belongs to the other person.
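The ownership-change step above can be sketched against a googleapiclient-style Drive v3 service object. The `permissions.create` method with `transferOwnership=True` is the real Drive v3 API for this; the function name, folder ID, and email below are hypothetical, and the `service` object is assumed to have been built with admin credentials.

```python
def transfer_folder_ownership(service, folder_id, new_owner_email):
    """Make the target user the owner of a folder via the Drive v3
    permissions API. Transferring the 'owner' role requires
    transferOwnership=True on the create call."""
    permission = {
        "type": "user",
        "role": "owner",
        "emailAddress": new_owner_email,
    }
    return (
        service.permissions()
        .create(fileId=folder_id, body=permission, transferOwnership=True)
        .execute()
    )
```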
Answer:
The Admin SDK API does not have methods which allow you to copy a single file or folder from one user account's Drive to another. For this, you must use the Google Drive API.
More Information:
You will need to use a service account for this: to authorise an application as more than one user, the application must run under an account that has domain-wide delegation. Once this has been set up in your Google Cloud Project console, you can use the regular Google Drive API methods to copy individual files or folders from one Drive to another.
Things to be aware of:
You will need to use the create and delete methods of the Drive API to create the files/folders, rather than copying them.
If there are files inside a folder you wish to copy over, you will need to copy these recursively into the newly created folder.
You can get a list of the files in a folder using the list method of Drive: files in the API.
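The recursive walk described above can be sketched as follows, assuming a googleapiclient-style Drive v3 service. Children of a folder are found with a `files.list` query of the form `"'<folder-id>' in parents"` (a real Drive query syntax); the helper name and the wiring are illustrative.

```python
def list_folder_recursive(service, folder_id):
    """Yield (item, parent_id) for every file and folder under folder_id,
    depth-first, by repeatedly calling Drive v3 files.list."""
    resp = service.files().list(
        q=f"'{folder_id}' in parents",
        fields="files(id, name, mimeType)",
    ).execute()
    for item in resp.get("files", []):
        yield item, folder_id
        # Folders have this well-known Drive MIME type; recurse into them.
        if item["mimeType"] == "application/vnd.google-apps.folder":
            yield from list_folder_recursive(service, item["id"])
```

Each yielded item can then be recreated (with `files.create`) under the corresponding new parent on the target Drive.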
References:
Google Cloud - Understanding service accounts
Google Drive API
Method Files: create
Method Files: delete
Method Files: list

AWS - Download Current Site With User Content

Is there a way to download the current site content, namely, the uploaded user images, from a web application on AWS? Everything I have found only gives access to previous code deployments, which do not include the user uploaded files.
I have tried the instructions here but it only seems to give access to the code as it was at the time of deployment.
Thank you for any help.
User-uploaded images are usually stored in Amazon's S3 service, so go to your AWS dashboard and navigate to the S3 section; you should find the files in one of the buckets there.
Are you trying to download your own website? Then you need not just the code and user images, but also the database containing the data. Check the code to see where the images are saved (local EBS, EFS, or S3) and copy them from there accordingly.
If you are trying to download someone else's website, then you will not have access to the database, code, or other users' images; but you can still download the full website as seen by the public using tools like WinHTTrack.
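If the uploads do turn out to live in S3, the whole bucket can be pulled down with the AWS CLI (`aws s3 sync s3://bucket-name ./local-dir`). The same loop, sketched in Python against a boto3-style client (bucket name and callback are hypothetical; `list_objects_v2` really does return at most 1000 keys per call, so the loop follows the continuation token):

```python
def download_bucket(s3, bucket, download):
    """Walk every object in a bucket, following ContinuationToken
    pagination, and hand each key to the `download` callback."""
    token = None
    while True:
        kwargs = {"Bucket": bucket}
        if token:
            kwargs["ContinuationToken"] = token
        resp = s3.list_objects_v2(**kwargs)
        for obj in resp.get("Contents", []):
            download(bucket, obj["Key"])
        if not resp.get("IsTruncated"):
            break
        token = resp["NextContinuationToken"]
```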

Files:insert - Google Drive SDK - Python Example - What is Drive API service instance?

USING: Windows7, Python 2.7, Google App Engine
Google's documentation for inserting (creating) a file on Google Drive using Python and the Drive API. Here is the link; the code is near the bottom of the page:
Write a file to a Google Drive using Python
A function named: insert_file is defined in the Python module.
def insert_file(service, title, description, parent_id, mime_type, filename):
The insert_file function takes 6 arguments passed into it. The first arg is service.
In the comment section of the example code, it is indicated that the service arg takes the Drive API service instance as the input.
Args:
service: Drive API service instance.
title: Title of the file to insert, including the extension.
description: Description of the file to insert.
parent_id: Parent folder's ID.
mime_type: MIME type of the file to insert.
filename: Filename of the file to insert.
What is the Drive API service instance? I have no idea what that is or what the valid settings are. Is it the authorization scope that is expressed as a URL? I do know what the title and description are: the title is the new name of the file being written, and the description is a detail, presumably put into the file's metadata. I'm not sure how to get the parent_id or the parent folder either. How is that info obtained? Do I get it manually from Google Drive? I know what the MIME type setting is.
If someone could give an explanation of what the Drive API service instance is, and give an example, that would be great. I did a search for Drive API service instance, and couldn't find an explanation. I searched the internet. I searched Google Developers. I found nothing.
Quickstart provides more boilerplate and a full working walk-through.
import httplib2
from apiclient.discovery import build
http = httplib2.Http()
http = credentials.authorize(http)  # credentials come from the OAuth flow
service = build('drive', 'v2', http=http)
The service is the API service that you want to instantiate. There are lots of services. An app can communicate with Google Maps, or Google tasks, or email, or Drive.
Google API's for Python
So, the service is the API service. Build instantiates the API service. This is from the video, minute 12:46.
YouTube example for Google Drive API Service
I found something about Parent Folders in the documentation.
Google Drive API
The Google Drive API has a files:insert method. The files:insert request takes various parameters, including what is called the Request body, which has its own parameters. One of the parameters of the Request body is parents[]. It is optional: for insert, if parents[] is empty, the file gets created in the user's root directory. So if you want the file to be written to a particular folder, you need to give the parents[] parameter a value. I'm assuming that is what the parent_id arg in the insert_file function is for, but I'm not sure; I'd need to look at the actual function, and that's not given.
After doing searches on Parent ID it looks like that is the folder ID. When you go to your Google Drive, and click on a folder, the URL in the browsers address field changes. Just click on the folder and the URL will look something like this:
https://drive.google.com/?tab=wo&authuser=0#folders/0B52YKjuEE44yUVZfdDNzNnR3SFE
The parent ID is the long part at the end, after the forward slash.
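Pulling that ID off the URL and feeding it into the request body can be sketched like this. In the Drive v2 API that files:insert belongs to, each entry of parents[] really is an object of the form {'id': ...}; the URL is the one from above, and the file metadata values are hypothetical.

```python
def folder_id_from_url(url):
    """Take the segment after the last '/' of a Drive folder URL."""
    return url.rsplit("/", 1)[-1]

url = "https://drive.google.com/?tab=wo&authuser=0#folders/0B52YKjuEE44yUVZfdDNzNnR3SFE"
parent_id = folder_id_from_url(url)

# Request body for Drive v2 files.insert; parents is a list of {'id': ...}
body = {
    "title": "report.csv",
    "description": "Monthly report",
    "mimeType": "text/csv",
    "parents": [{"id": parent_id}],
}
print(body["parents"])
```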
I guess I need to look at the Google Quickstart files again.
There are at least three examples that I've found:
Quickstart example. Google Drive SDK
Dr Edit. Google Drive SDK examples
Another Quickstart example Google Drive API
The first one is the simplest. Dr Edit has the most files, maybe? The last one looks like it's more current? I don't know. It's kind of confusing which example to use. The Drive SDK and the Drive API examples only deal with authorization of an account so that an outside app can access a user's account.