S3 - Video, uploaded with getSignedUrl link, does not play and is downloaded in wrong format

I am using the AWS SDK on the server side with Node.js and am having an issue with uploading files as FormData from the client side.
On the server side I have a simple route that creates an upload link, to which the video is later uploaded directly from the client.
I am using the S3 getSignedUrl method with putObject to generate that link, which creates a PUT request for the client, but it causes a very strange issue with FormData.
A video uploaded as FormData does not behave correctly: instead of playing, the S3 URL downloads the video, and the downloaded file is also broken.
Here is how I configure that method on the server side:
this.s3.getSignedUrl(
  'putObject',
  {
    Bucket: '<BUCKET_NAME>',
    ContentType: contentType, // 'video/mp4' as a rule
    Key: key,
  },
  (err, url) => {
    if (err) {
      reject(err)
    } else {
      resolve(url)
    }
  },
)
An axios PUT request with the raw file (a Blob) actually works, but not with FormData:
axios.put(url, file, {
  headers: {
    'Content-Type': file.type,
  },
  onUploadProgress: ({ total, loaded }) => {
    setProgress((loaded / total) * 100)
  },
})
This is the working version, but when I wrap the file in a FormData, it is uploaded to S3, yet the video downloads instead of playing.
I do not have much experience with AWS, so if somebody knows how to handle this issue, I would be thankful.
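For illustration, a minimal sketch of why the two behave differently (the variable names here are hypothetical, assuming file is a browser File object): a presigned putObject URL signs a raw PUT body, so S3 stores whatever bytes arrive. Wrapping the file in FormData makes S3 store the multipart envelope itself, which breaks playback; sending the file directly, or pulling it back out of the FormData, keeps the object intact.

const formData = new FormData()
formData.append('video', file) // file: a File picked via <input type="file">

// Wrong: PUT-ing formData stores boundary lines and part headers inside the object.
// Right: recover the raw File and send it as the body, as in the working code above.
const rawFile = formData.get('video')
await axios.put(url, rawFile, {
  headers: { 'Content-Type': rawFile.type }, // must match the signed ContentType
})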

Related

Image downloaded from AWS S3 is not opening correctly in any viewer

I created a simple API using API Gateway to upload images to my S3 bucket.
When I try to upload a JPEG file using curl, it gets uploaded to S3:
curl --request PUT -H "Content-Type: image/jpeg" --data-binary "@s.jpeg" https://173xxxxxxxf.execute-api.eu-west-2.amazonaws.com/Test/memxxxx/s.jpeg
But I am noticing multiple issues here:
The file size is almost doubled (85 KB original size, 153 KB in S3).
If I download the file from S3, I am not able to view the image in any default image viewer.
If I upload the same image to S3 using drag & drop and then download it, it works without any issues.
What's the best way to handle this properly? (I assume the issue is with the header type.)
Edit:
I also tried to set the content as a base64-encoded string and am still facing the same issue:
const https = require("https");
const fs = require("fs");

const data = base64_encode("filename.jpg");

const options = {
  hostname: "173ixxxxxxf.execute-api.eu-west-2.amazonaws.com",
  port: 443,
  path: "/Test/xxxx/filename.jpeg",
  method: "PUT",
  headers: {
    "Content-Type": "image/jpeg",
    "Content-Length": data.length,
  },
};

const req = https.request(options, (res) => {
  console.log(`statusCode: ${res.statusCode}`);
  res.on("data", (d) => {
    process.stdout.write(d);
  });
});

req.on("error", (error) => {
  console.error(error);
});

req.write(data);
req.end();

// read binary data and convert it to a base64-encoded string
function base64_encode(file) {
  return fs.readFileSync(file).toString("base64");
}
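For comparison, this is roughly what curl --data-binary "@s.jpeg" does, sketched in Node: the raw bytes are the request body, with no base64 step (hostname and path reuse the placeholders above). Note that API Gateway only passes binary bodies through unmodified when the API's binary media types are configured to include image/jpeg; otherwise it treats the payload as text, which would fit the size growth and unreadable downloads described above.

const https = require("https");
const fs = require("fs");

// Raw bytes: the Buffer itself is the request body, no base64 step.
const body = fs.readFileSync("filename.jpg");

const req = https.request(
  {
    hostname: "173ixxxxxxf.execute-api.eu-west-2.amazonaws.com", // placeholder from the question
    port: 443,
    path: "/Test/xxxx/filename.jpeg", // placeholder from the question
    method: "PUT",
    headers: {
      "Content-Type": "image/jpeg",
      "Content-Length": body.length, // byte length of the raw file
    },
  },
  (res) => console.log(`statusCode: ${res.statusCode}`)
);

req.on("error", console.error);
req.write(body);
req.end();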

Error while uploading image on AWS S3 - with Axios & React-Native

I'm trying to upload an image from my mobile app (React Native) to AWS S3 with a presigned URL. I'm using axios to send the request.
The problem is that even though my image is uploaded to AWS, if I download it and try to open it, it says it's corrupted. I tried to open it with Photoshop and it works :/
Creating the formData:
const createFormData = (photo) => {
  const data = new FormData();
  data.append('image', {
    name: photo.fileName, // a name
    type: photo.type, // image/jpg
    uri: photo.uri, // the uri starting with file://....
  });
  return data;
};
My PUT request:
const formData = createFormData(responseImage)
axios({
  method: "put",
  url: awsURL.data.url_thumbnail,
  data: formData,
  headers: { "Content-Type": "multipart/form-data" },
})
This isn't how it works:
headers: { "Content-Type": "multipart/form-data" }
The multipart/form-data content type also carries a boundary parameter, and the body is a series of fields separated by that boundary delimiter. A presigned S3 PUT stores the request body verbatim as the object, so the multipart envelope ends up inside the stored file, which is why it appears corrupted.
Hope it helps!
PS: There are articles on parsing multipart/form-data bodies, which I can't find right now, that explain the boundary format and how to extract the raw data before uploading so that it isn't corrupted.
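To make that concrete, a hedged sketch (the boundary value and variable names are illustrative, not from the question) of what S3 ends up storing, and the usual fix of PUTting the raw bytes instead:

// Illustrative only: when a FormData is PUT to a presigned URL, S3 stores
// the whole multipart envelope as the object, roughly:
//
//   --AbC123                       (hypothetical boundary line)
//   Content-Disposition: form-data; name="image"; filename="photo.jpg"
//   Content-Type: image/jpeg
//
//   <raw JPEG bytes>
//   --AbC123--
//
// The usual fix is to send the file's raw bytes as the body:
axios({
  method: "put",
  url: awsURL.data.url_thumbnail,
  data: photoBlob, // hypothetical variable: the raw image bytes, not a FormData
  headers: { "Content-Type": "image/jpeg" },
})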

Uppy - How do you upload to s3 via multipart? Using companion?

https://uppy.io/docs/aws-s3-multipart/
The Uppy multipart plugin sounds like exactly what I need, but I can't see how to do the backend part of things. The impression I get is that I need to set up a Companion to route the upload to S3, but I can't find any details on setting up Companion for this.
I can see lots of references to using Companion to fetch external content, but none on multipart S3 uploading.
I also don't see anywhere inside Uppy to provide AWS credentials, which makes me think of Companion even more.
But there are 4 steps to complete a multipart upload, and I can't see how providing one Companion URL will help Uppy.
Thanks in advance to anyone who can help or point me in the right direction.
Providing Uppy a companion URL makes it so that Uppy will fire off a series of requests to the-passed-url.com/s3/multipart. You then need to configure your server to handle these requests; your server is where your AWS credentials are handled.
In short, when you click the upload button in Uppy, this is what happens:
Uppy sends a POST request to /s3/multipart to create/initiate the multipart upload.
Using the data returned from the previous request, Uppy sends a GET request to /s3/multipart/{uploadId} to generate AWS S3 pre-signed URLs to use for uploading the parts.
Uppy then uploads the parts using the pre-signed URLs from the previous request.
Finally, Uppy sends a POST request to /s3/multipart/{uploadId}/complete to complete the multipart upload.
I was able to accomplish this using Laravel/Vue. I don't know what your environment is, but I've posted my solution, which should help, especially if your server is using PHP:
Configuring Uppy to Use Multipart Uploads with Laravel/Vue
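If your server is Node rather than PHP, a minimal Express sketch of those four steps with the AWS SDK for JavaScript (v2) could look like the following. The bucket, region, and request/response shapes are placeholder assumptions; treat it as an outline of the flow, not the exact wire format Uppy's plugin expects.

const express = require('express')
const AWS = require('aws-sdk')

const s3 = new AWS.S3({ region: 'us-east-1' }) // placeholder region
const Bucket = 'YOUR_BUCKET'                   // placeholder bucket
const app = express()
app.use(express.json())

// 1: create/initiate the multipart upload
app.post('/s3/multipart', async (req, res) => {
  const { key, type } = req.body
  const data = await s3
    .createMultipartUpload({ Bucket, Key: key, ContentType: type })
    .promise()
  res.json({ uploadId: data.UploadId, key: data.Key })
})

// 2: presign an upload URL for a part (key and partNumber as query params)
app.get('/s3/multipart/:uploadId', (req, res) => {
  const url = s3.getSignedUrl('uploadPart', {
    Bucket,
    Key: req.query.key,
    UploadId: req.params.uploadId,
    PartNumber: Number(req.query.partNumber),
  })
  res.json({ url })
})

// 3: the parts themselves are PUT by the client straight to those presigned URLs.

// 4: complete the upload with the ETag/PartNumber pairs collected in step 3
app.post('/s3/multipart/:uploadId/complete', async (req, res) => {
  const data = await s3
    .completeMultipartUpload({
      Bucket,
      Key: req.query.key,
      UploadId: req.params.uploadId,
      MultipartUpload: { Parts: req.body.parts }, // [{ ETag, PartNumber }, ...]
    })
    .promise()
  res.json({ location: data.Location })
})

app.listen(3020)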
I am sharing code snippets for AWS S3 Multipart [github]
If you add Companion to the mix, your users will be able to select files from remote sources such as Instagram, Google Drive, and Dropbox, bypassing the client (so a 5 GB video isn't eating into your users' data plans) before the files are uploaded to the final destination. Files are removed from Companion after an upload is complete, or after a reasonable timeout. Access tokens also don't stick around for long, for security reasons.
Set up the Companion server:
1: Set up the S3 configuration.
Uppy automatically generates the upload URL and puts the file in the uploads directory.
s3: {
  getKey: (req, filename) => {
    return `uploads/${filename}`;
  },
  key: 'AWS KEY',
  secret: 'AWS SECRET',
  bucket: 'AWS BUCKET NAME',
},
2: Support uploads from remote resources.
Uppy handles everything for us; we just need to provide a key and secret for each remote resource, such as Instagram, Google Drive, etc.
Example: Google Drive upload
Generate a Google key and secret from Google and add them to the code.
Add a redirect URL for authentication.
3: Run the Node server locally:
const fs = require('fs')
const path = require('path')
const rimraf = require('rimraf')
const companion = require('@uppy/companion')
const app = require('express')()

const DATA_DIR = path.join(__dirname, 'tmp')

app.use(require('cors')({
  origin: true,
  credentials: true,
}))
app.use(require('cookie-parser')())
app.use(require('body-parser').json())
app.use(require('express-session')({
  secret: 'hello planet',
}))

const options = {
  providerOptions: {
    drive: {
      key: 'YOUR GOOGLE DRIVE KEY',
      secret: 'YOUR GOOGLE DRIVE SECRET',
    },
    s3: {
      getKey: (req, filename) => {
        return `uploads/${filename}`;
      },
      key: 'AWS KEY',
      secret: 'AWS SECRET',
      bucket: 'AWS BUCKET NAME',
    },
  },
  server: { host: 'localhost:3020' },
  filePath: DATA_DIR,
  secret: 'blah blah',
  debug: true,
}

try {
  fs.accessSync(DATA_DIR)
} catch (err) {
  fs.mkdirSync(DATA_DIR)
}

process.on('exit', () => {
  rimraf.sync(DATA_DIR)
})

app.use(companion.app(options))

// handle server errors
const server = app.listen(3020, () => {
  console.log('listening on port 3020')
})

companion.socket(server, options)
Set up the client:
1: Client HTML code.
This code will allow uploads from Google Drive, the webcam, local files, etc. You can customize it to support more remote sources.
Set the companion URL to the URL of the Node server running above (http://localhost:3020).
<!doctype html>
<html>
<head>
  <meta charset="utf-8">
  <title>Uppy</title>
  <link href="https://releases.transloadit.com/uppy/v1.29.1/uppy.min.css" rel="stylesheet">
</head>
<body>
  <div id="drag-drop-area"></div>
  <script src="https://releases.transloadit.com/uppy/v1.29.1/uppy.min.js"></script>
  <script>
    Uppy.Core({
      debug: false,
      autoProceed: false,
      restrictions: {
        maxNumberOfFiles: 5,
      },
    })
      .use(Uppy.AwsS3Multipart, {
        limit: 4,
        companionUrl: 'http://localhost:3020',
      })
      .use(Uppy.Dashboard, {
        inline: true,
        showProgressDetails: true,
        showLinkToFileUploadResult: false,
        proudlyDisplayPoweredByUppy: false,
        target: '#drag-drop-area',
      })
      .use(Uppy.GoogleDrive, { target: Uppy.Dashboard, companionUrl: 'http://localhost:3020' })
      .use(Uppy.Url, { target: Uppy.Dashboard, companionUrl: 'http://localhost:3020' })
      .use(Uppy.Webcam, { target: Uppy.Dashboard, companionUrl: 'http://localhost:3020' });
  </script>
</body>
</html>

Cannot upload file from React Native: "Network request failed"?

When trying to upload a selected image from my React Native project, I get a nondescript error message:
Network request failed
This seems to be a common issue, but most people are just forgetting their file types, or are on Android and have an issue with Flipper. Nothing that has worked for anyone else I've found with the same symptoms has worked for me.
Code:
const localUri = result.uri;
const filename = localUri.split("/").pop();
const type = mime.lookup(localUri) || "image";

const formData = new FormData();
formData.append("file", { uri: localUri, name: filename, type });

try {
  const file = await fetch(`${SERVER_URL}/api/upload`, {
    method: "POST",
    body: formData,
  }).then((res) => {
    console.log(res);
    return res.status === 200 ? res.text() : res.json();
  });
} catch (e) {
  console.log(e);
}
Considerations:
Using a physical iOS device (iPhone).
Using Expo 40.0.0 with the corresponding RN SDK, not ejected.
Using expo-image-picker to get the image.
Using ngrok to get requests through to my localhost server from my phone.
All other requests to my server from React Native work fine; it's only when I try to upload a file.
The image renders fine from the supplied URI, so it's getting the right source.
Form data source from above:
{ "name": "CAPS-FILE-NAME.jpg", "type": "image/jpeg", "uri": "file:///var/mobile/Containers/Data/Application/CAPS-PATHING/Library/Caches/ExponentExperienceData/project-src-pathing/ImagePicker/CAPS-FILE-NAME.jpg" }
Things tried:
Using the Content-Type header "multipart/form-data"
Using /private instead of file://
Using Postman to hit my server through ngrok, which works
Changing my Expo/RN version to 38.0.0
Getting base64 -> blob -> formData, same result
Many other things I've forgotten now. If it's in Google's results, I've tried it.
For anyone else who gets stuck with this: I switched to using XMLHttpRequest instead of fetch, and it miraculously works now. Not sure why fetch is broken in RN, but at least there's a solution.
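For reference, a minimal sketch of the same upload rewritten with XMLHttpRequest, reusing the formData and SERVER_URL from the question:

const xhr = new XMLHttpRequest();
xhr.open("POST", `${SERVER_URL}/api/upload`);
// Don't set Content-Type by hand; XHR fills in the multipart boundary itself.
xhr.onload = () => console.log(xhr.status, xhr.responseText);
xhr.onerror = () => console.log("upload failed");
xhr.send(formData); // the same FormData built above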

Upload binary data from react-native to AWS S3 using PUT presigned URL

I want to upload a file to AWS S3 using a pre-signed url.
I tried to send the file using form-data, but I got the following response with status 403:
The request signature we calculated does not match the signature you provided. Check your key and signing method.
After further investigation, I found that AWS S3 does not support form-data and suggests using binary instead.
How should I do that in React Native?
It turns out you can send the file in multiple ways, including as base64 and as a Buffer.
Using react-native-fs and buffer:
Uploading as base64 worked, but I couldn't view the image, so I uploaded using a Buffer:
import fs from 'react-native-fs'
import { Buffer } from 'buffer'

export const uploadToAws = async (signedRequest, file) => {
  // Read the local file as base64, then turn it back into raw bytes.
  const base64 = await fs.readFile(file.uri, 'base64')
  const buffer = Buffer.from(base64, 'base64')
  return fetch(signedRequest, {
    method: 'PUT',
    headers: {
      'Content-Type': 'image/jpeg; charset=utf-8',
      'x-amz-acl': 'public-read',
    },
    body: buffer,
  })
}
Note that on the server, you need to make sure you are setting the matching parameters: { ContentType: "image/jpeg; charset=utf-8", 'x-amz-acl': 'public-read' }, as it seems fetch adds the charset to the Content-Type.
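For completeness, a sketch of what the matching server-side presign might look like with the AWS SDK for JavaScript (v2); the bucket, key, and region are placeholders. The point is that ContentType and ACL in the signed params must match the Content-Type and x-amz-acl headers the client sends, or S3 returns the signature-mismatch 403 from the question:

const AWS = require('aws-sdk')

const s3 = new AWS.S3({ region: 'us-east-1' }) // placeholder region
const signedRequest = s3.getSignedUrl('putObject', {
  Bucket: 'YOUR_BUCKET',    // placeholder
  Key: 'uploads/photo.jpg', // placeholder
  ContentType: 'image/jpeg; charset=utf-8', // must match the client's Content-Type
  ACL: 'public-read',       // must match the client's x-amz-acl header
  Expires: 60, // URL validity in seconds
})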