Download File from AWS S3 using Presigned URL - amazon-web-services

I am trying to download a large file using the AWS S3 URL that looks like the following,
https://downloadfiles.s3.amazonaws.com/client_for....
Supposedly these files can be downloaded by just clicking the link or from a Linux terminal. I've tried both, but neither works; I get an AccessDenied message. Does anyone have any insight into solving this problem?

Related

AWS S3 - automatically change response header

Our team is building a web app that allows users to download video files. We currently host our files on AWS S3, but since our site doesn't reside on AWS, we can't use <a href="blah"> to prompt a download. If we use that HTML element, users simply get redirected to a video player, which is fine, except that Safari on mobile doesn't let users download the video file from the video player.
We found that manually setting the file's Content-Disposition to attachment on S3 works, but we have not found a way to automate that. We tried adding a content-disposition: attachment key-value pair to our upload payload, but that only gets stored as "User defined" metadata in the form of x-amz-meta-content-disposition, which has no effect: the file still can't be downloaded as an attachment. It seems only "System defined" metadata works.
Has anyone ever encountered this issue before and found a workaround?
see screenshot for what I'm referencing
You can set the Content-Disposition when the file is created.
This is done by uploading the file via a presigned URL.
See https://docs.aws.amazon.com/AmazonS3/latest/API/API_GetObject.html for details on presigned URLs.
Alternatively, you can return a presigned URL that gets the file from S3 and overrides the Content-Disposition header on that GET request.

How to upload a file to Amazon S3 using an already made presigned-url through the command line interface?

Is it possible to upload to an Amazon S3 bucket using an already prepared presigned URL through the command line? If so, how do I do this?
Running aws s3 cp <local_file> <presigned_url> doesn't work, and I can't find any documentation about using presigned URLs from the CLI, so I would imagine this isn't possible, but I'm still hopeful.
Yes. If you already have the presigned URL for put-object, just use curl:
curl "presignedurl" --upload-file yourfile.txt
I just tested it and it works great. For example:
curl "https://test-bucket.s3.amazonaws.com/test.txt?AWSAccessKeyId=ASDFLWKAS0F123&Signature=667S%22F%24%ASDasdfasdf%2Fasdfl&Expires=1634314495" --upload-file test.txt
In case you want to generate a presigned url, you can just check here for examples:
https://docs.aws.amazon.com/AmazonS3/latest/userguide/PresignedUrlUploadObject.html

Why we're getting random CORS error on Amazon Cloudfront?

We're using CloudFront to serve our assets (images, fonts, translation JSON files) stored in S3, with the configuration shown in the images. We noticed that we get CORS errors randomly, and it's a bit frustrating because we were unable to reproduce the issue. What could be wrong in this configuration that might cause it?

Download zip file from S3 using API Gateway directly

Is it possible to download a zip file stored in S3 using API Gateway alone?
Configurations:
Integration type : AWS Service (S3)
endpoint : GET
Content-Type: application/zip (or) application/octet-stream
Currently a corrupted zip file is getting downloaded.
I was able to work around this using an S3 presigned URL, but I don't want to make the bucket public.
The file can get corrupted when some headers are missing from the upload PUT request. Please make sure the file itself is not corrupted by downloading it manually from S3 and checking. If it's only corrupted when fetched through the GET request, make sure you include all the needed headers in the GET request too.

Uploading data to Amazon S3 directly from a URL [duplicate]

Is it possible to upload a file to S3 from a remote server?
The remote server is basically a URL based file server. Example, using http://example.com/1.jpg, it serves the image. It doesn't do anything else and can't run code on this server.
Is it possible to have another server tell S3 to upload a file from http://example.com/1.jpg?
upload from http://example.com/1.jpg
server -------------------------------------------> S3 <-----> example.com
If you can't run code on the server or execute requests then, no, you can't do this. You will have to download the file to a server or computer that you own and upload from there.
You can see the operations you can perform on amazon S3 at http://docs.amazonwebservices.com/AmazonS3/latest/API/APIRest.html
Checking the operations for both the REST and SOAP APIs, you'll see there's no way to give Amazon S3 a remote URL and have it grab the object for you. All of the PUT requests require the object's data to be provided as part of the request, meaning the server or computer that initiates the web request needs to have the data.
I have had a similar problem in the past where I wanted to download my users' Facebook thumbnails and upload them to S3 for use on my site. The way I did it was to download the image from Facebook into memory on my server, then upload it to Amazon S3; the whole round trip took under two seconds. After the upload to S3 was complete, I wrote the bucket/key to a database.
Unfortunately there's no other way to do it.
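That download-then-upload approach can be sketched with just the Python standard library by pushing the bytes to a presigned PUT URL (generated as in the earlier answers), so no AWS SDK is needed on the relay server. The mirrored/ key prefix is a hypothetical naming choice, and the presigned URL is assumed to come from elsewhere:

```python
import posixpath
from urllib.parse import urlsplit
from urllib.request import Request, urlopen


def object_key_for(source_url: str, prefix: str = "mirrored") -> str:
    """Derive an S3 object key from the source URL's filename."""
    filename = posixpath.basename(urlsplit(source_url).path)
    return f"{prefix}/{filename}"


def mirror_to_s3(source_url: str, presigned_put_url: str) -> None:
    # Download the remote file into memory on this server...
    data = urlopen(source_url).read()
    # ...then PUT it to the presigned URL (no AWS SDK required).
    req = Request(presigned_put_url, data=data, method="PUT")
    with urlopen(req) as resp:
        resp.read()


print(object_key_for("http://example.com/1.jpg"))  # mirrored/1.jpg
```

For large files you would stream the download body into the PUT request instead of buffering it all in memory, but the flow is the same.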
I think the suggestion provided is quite good: you can SCP the file to an EC2 instance that then uploads it to the S3 bucket. Using the .pem file gives passwordless authentication, and a PHP script can validate the file extensions and pass the file as an argument to the SCP command.
The only problem with this solution is that you must have an instance in AWS. You can't use this solution if your website is hosted with another provider and you are trying to upload files straight to an S3 bucket.
Technically it's possible using AWS Signature Version 4. Assuming your remote server is the customer in the image below, you could prepare a form on the main server and send the form fields to the remote server for it to curl. Detailed example here.
You can use the scp command from a terminal.
1) Using the terminal, go to the directory containing the file you want to transfer to the server.
2) Type this:
scp -i yourAmazonKeypairPath.pem fileNameThatYouWantToTransfer.php ec2-user@ec2-00-000-000-15.us-west-2.compute.amazonaws.com:
N.B. Add "ec2-user@" before the ec2-… hostname you got from the EC2 website! This is such a picky error!
3) Your file will be uploaded and the progress will be shown. When it reaches 100%, you are done!