View S3 files in the browser - amazon-web-services

I want to view files such as Excel or ZIP (or any other format) in the browser without them being downloaded.
I am able to display image and PDF files in the browser, but I am unable to view other formats such as ZIP or XLS.
I am storing my files in S3.
What should I do?

Web browsers are not able to natively display most file types. They can render HTML and can display certain types of images (e.g. JPG, PNG), but only after these files are actually downloaded to your computer.
The same goes for PDFs -- they are downloaded, then a browser plug-in renders the content.
When viewing files (e.g. Excel spreadsheets and PDF files) within services like Gmail and Google Drive, the files are typically converted into images on the server side and those images are sent to your computer. Amazon S3 is purely a storage service and does not offer a conversion service like this.
Zip files are a method of compressing files and of storing multiple files within a single archive. Some web services might offer the ability to list the files within a Zip, but again, Amazon S3 is purely a storage service and does not offer this capability.
To answer your "What should I do?" question, some options are:
Download the files to your computer to view them, or
Use a storage service that offers these capabilities (many of which store the actual files in Amazon S3, but add additional services to convert the files for viewing online)
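For the formats a browser *can* render natively (images, PDF), it also helps to make sure the objects are uploaded with the right Content-Type and an inline Content-Disposition, otherwise the browser may download them instead of displaying them. A minimal boto3 sketch (function, bucket and key names are hypothetical):

```python
import mimetypes


def guess_content_type(filename):
    """Best-effort MIME type from the file extension."""
    return mimetypes.guess_type(filename)[0] or "application/octet-stream"


def upload_for_inline_viewing(path, bucket, key):
    """Upload to S3 with headers hinting the browser to render, not download.

    Assumes boto3 and AWS credentials are configured; this only helps for
    types the browser can render natively (images, PDF, plain text).
    """
    import boto3
    boto3.client("s3").upload_file(
        path, bucket, key,
        ExtraArgs={
            "ContentType": guess_content_type(path),   # e.g. "application/pdf"
            "ContentDisposition": "inline",            # display instead of download
        },
    )
```

This won't make the browser render XLS or ZIP, but it stops PDFs and images from being force-downloaded.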

I might be a bit too late, but did you try Filestash? (I made it.)
That's what it looks like when you open an XLS document on S3.
I aim to support all the common formats, and the list of already-supported ones is rather big.

Related

Export CSV from Microsoft Teams to Google Cloud Storage

I am receiving CSV files from different users (from the same organisation) over Microsoft Teams. I have to download each file and import them into a bucket on Google Cloud Storage.
What would be the most efficient way to store those files directly into Google Cloud Storage every time I receive a file from a given user over Teams? Files must be imported using Microsoft Teams.
I was thinking of triggering from Pub/Sub using Cloud Run, but I am a bit confused about how to connect this with Teams.
I imagine you should be able to do this fine using Power Automate, but it might depend on how you're receiving the files (for instance, are users sending them to you directly one-to-one, or uploading them into a Files tab in a specific Team/Channel?).
Here's an example template for moving files from OneDrive for Business to Google Drive that sounds like it should help: https://flow.microsoft.com/en-us/galleries/public/templates/02057296acac46e9923e8a842ab9911d/sync-onedrive-for-business-files-to-google-drive-files/

AWS S3 hidden flag lost after uploading files

I am having a problem with uploading hidden files to S3 and getting them back.
Basically, I upload hidden files to S3 and download them again, and the files become visible (the hidden checkbox is unchecked).
By the way, this is on Windows.
Is there anything I am missing, or is this the way it's supposed to work?
Thanks
The hidden flag is a Windows file system thing. S3 isn't a Windows filesystem, so when you upload a file to S3, it can't retain that Windows-specific flag. I think you would have to write custom upload code that sets extra, custom metadata on the S3 objects if they were marked as hidden in Windows, and custom download code that checks for that metadata and marks the files as hidden when they are downloaded onto a Windows machine.
There might be some specific S3 applications for Windows that would manage this. What tool are you using right now to upload to S3?
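A minimal sketch of that custom-metadata approach with boto3 (the function names and the "win-hidden" metadata key are my own invention; actually reading or setting the Windows attribute is left to platform-specific code such as ctypes):

```python
def hidden_flag_from_metadata(metadata):
    """Recover the flag from the object's user metadata on download."""
    return metadata.get("win-hidden") == "true"


def upload_with_hidden_flag(path, bucket, key, is_hidden):
    """Upload, recording the Windows hidden state as custom S3 metadata.

    Assumes boto3 and AWS credentials are configured. The caller supplies
    is_hidden, e.g. after checking the file's attributes on Windows.
    """
    import boto3
    boto3.client("s3").upload_file(
        path, bucket, key,
        ExtraArgs={"Metadata": {"win-hidden": "true" if is_hidden else "false"}},
    )


def download_with_hidden_flag(bucket, key, path):
    """Download and report whether the file should be re-marked hidden."""
    import boto3
    s3 = boto3.client("s3")
    meta = s3.head_object(Bucket=bucket, Key=key)["Metadata"]
    s3.download_file(bucket, key, path)
    return hidden_flag_from_metadata(meta)
```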

Storing and "Streaming" from Cloud Storage

I am trying to grasp how to store video files. I know I can store .mp4 files on Google Cloud Storage. However, I have had a hard time getting my application to stream these video files.
I have found video URLs like:
http://clips.vorwaerts-gmbh.de/big_buck_bunny.mp4
versus the URL of the file on Cloud Storage, which presumably refers to the mp4 I uploaded (right?):
https://firebasestorage.googleapis.com/v0/b/packfeed-e027b.appspot.com/o/Stories%2F0%2FM41WiOceiQTs3ELETIT5evcfsJm1_1520646187885.mp4?alt=media&token=201a831b-c239-4563-8178-cec3c4567212
Is there a difference between these two URLs -- one pointing directly to the mp4, the other being a "download link"?
Are there any options to store files in the Google Cloud Platform like this?
Your first link points to the file stored on the clips.vorwaerts-gmbh.de server. The second link points to the file stored on a Google Cloud Storage server.
You can upload your files to Google Cloud Storage, then share a file publicly by checking the "Share publicly" box on the file. The "Public link" that appears will be the link available to the public, similar to the second link you posted.
https://cloud.google.com/storage/docs/access-control/making-data-public#objects
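The same thing can be done programmatically. A sketch with the google-cloud-storage client library (the bucket and object names are made up; make_public() grants allUsers read access to the object):

```python
from urllib.parse import quote


def public_url(bucket_name, object_name):
    """The public link GCS serves once the object has been shared publicly."""
    return "https://storage.googleapis.com/%s/%s" % (bucket_name, quote(object_name))


def make_object_public(bucket_name, object_name):
    """Share a GCS object publicly -- the API equivalent of the console checkbox.

    Assumes the google-cloud-storage library and application default
    credentials are set up.
    """
    from google.cloud import storage
    blob = storage.Client().bucket(bucket_name).blob(object_name)
    blob.make_public()  # grants allUsers the reader role on this object
    return public_url(bucket_name, object_name)
```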

Use AWS Elastic Transcoder and S3 to stream HLSv4 without making everything public?

I am trying to stream a video with HLSv4. I am using AWS Elastic Transcoder and S3 to convert the original file (eg. *.avi or *.mp4) to HLSv4.
Transcoding is successful, with several *.ts and *.aac (with accompanying *.m3u8 playlist files for each media file) and a master *.m3u8 playlist file linking to the media-file specific playlist files. I feel fairly comfortable everything is in order here.
Now the trouble: This is a membership site and I would like to avoid making every video file public. The way to do this typically with S3 is to generate temporary keys server-side which you can append to the URL. Trouble is, that changes the URLs to the media files and their playlists, so the existing *.m3u8 playlists (which provide references to the other playlists and media) do not contain these keys.
One option which occurred to me would be to generate these playlists on the fly, as they are just text files. The obvious trouble is the overhead; it seems hacky, and these posts were discouraging: https://forums.aws.amazon.com/message.jspa?messageID=529189, https://forums.aws.amazon.com/message.jspa?messageID=508365
After spending some time on this, I feel like I'm going around in circles and there doesn't seem to be a super clear explanation anywhere for how to do this.
So as of September 2015, what is the best way to use AWS Elastic Transcoder and S3 to stream HLSv4 without making your content public? Any help is greatly appreciated!
EDIT: Reposting my comment below with formatting...
Thank you for your reply, it's very helpful
The plan that's forming in my head is to keep the converted ts and aac files on S3, but generate the 6-8 m3u8 files plus the master playlist and serve them directly from the app server. So the user hits the "Play" page and jwplayer gets the master playlist from the app server (e.g. "/play/12/"). Server side, this loads the m3u8 files from S3 into memory and search-and-replaces the media-specific m3u8 links so that they point to S3 with a freshly generated URL token.
So: user --> jwplayer --> local master m3u8 (verify auth server side) --> local media m3u8s (verify auth server side) --> S3 media files (accessed with signed URLs and temporary tokens).
Do you see any issues with this approach, such as "you can't reference external media from a playlist" or something similarly catch-22-ish?
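That search-and-replace step might look something like this (a sketch with boto3; the function names, and the assumption that URIs in the playlist are keys relative to the playlist's own prefix, are mine):

```python
def rewrite_playlist(m3u8_text, sign):
    """Replace every non-comment line (a segment or sub-playlist URI) with
    whatever sign() returns for it; '#EXT...' tag lines pass through as-is."""
    out = []
    for line in m3u8_text.splitlines():
        if line and not line.startswith("#"):
            line = sign(line)
        out.append(line)
    return "\n".join(out)


def signed_playlist(bucket, playlist_key, expires=300):
    """Fetch an .m3u8 from S3 and re-point each URI at a presigned URL.

    Assumes boto3 and AWS credentials are configured.
    """
    import boto3
    s3 = boto3.client("s3")
    body = s3.get_object(Bucket=bucket, Key=playlist_key)["Body"].read().decode()
    prefix = playlist_key.rsplit("/", 1)[0] + "/" if "/" in playlist_key else ""
    return rewrite_playlist(body, lambda uri: s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": bucket, "Key": prefix + uri},
        ExpiresIn=expires))
```

The app server would return the output of signed_playlist() from its "/play/12/" style endpoints after verifying the user's session.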
Dynamically generated playlists are one way to go. I actually implemented something like this as an Nginx module and it works very fast, though it's written in C and compiled, not PHP.
The person in your first link is more likely to have issues because of his/her 1-second chunk duration. This adds a lot of requests and overhead; the value recommended by Apple is 10 seconds.
There are solutions like HLS encrypted with AES-128 (supported by Elastic Transcoder), which also adds overhead if you do it on the fly, and HLS with DRM such as PHLS/Primetime, which will most likely get you into a lot of trouble on the client side.
There seems to be a way to do it with Amazon CloudFront. Please note that I haven't tried it personally and you need to check if it works on Android/iOS.
The idea is to use Signed Cookies instead of Signed URLs. They were apparently introduced in March 2015. The linked blog entry even uses HLS as an example.
Instead of dynamic URLs, you send a Set-Cookie header after you authenticate the user. The cookie (hopefully) gets passed along with every request (playlist and segments), and CloudFront decides whether to allow access to your S3 bucket or not.
You can find the documentation here:
http://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/PrivateContent.html
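To illustrate the mechanics (a sketch, not a drop-in implementation): with a custom policy, CloudFront expects three cookies -- the policy, its RSA-SHA1 signature, and your key pair ID -- each encoded in CloudFront's URL-safe base64 variant. The sign callback is left abstract here; one option is the rsa package's rsa.sign(data, key, 'SHA-1'). The key pair ID and resource below are placeholders.

```python
import base64
import json


def cf_b64(data):
    """CloudFront's URL-safe base64: '+' -> '-', '=' -> '_', '/' -> '~'."""
    return (base64.b64encode(data).decode()
            .replace("+", "-").replace("=", "_").replace("/", "~"))


def custom_policy(resource, expires_epoch):
    """Minimal custom policy: allow `resource` (wildcards permitted)
    until the given Unix time."""
    return json.dumps(
        {"Statement": [{
            "Resource": resource,
            "Condition": {"DateLessThan": {"AWS:EpochTime": expires_epoch}},
        }]},
        separators=(",", ":"))


def signed_cookies(resource, expires_epoch, key_pair_id, sign):
    """The three cookie name/value pairs to emit as Set-Cookie headers after
    authenticating the user; sign() must RSA-SHA1-sign the policy bytes."""
    policy = custom_policy(resource, expires_epoch).encode()
    return {
        "CloudFront-Policy": cf_b64(policy),
        "CloudFront-Signature": cf_b64(sign(policy)),
        "CloudFront-Key-Pair-Id": key_pair_id,
    }
```

The player then requests playlists and segments normally, the browser attaches the cookies, and CloudFront validates them on every request.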

How to upload files to a site without FTP?

My company uses a customer management system that is sort of terrible. We have to upload tons of files to it, but it has no FTP server for us to use and only allows one file upload at a time through its uploader. Is there any way to write a program to automate something like this? Thanks.