Amazon S3 multipart upload through Flutter

We're working on a Flutter app where we want to upload single large files (100-150 MB) directly to S3. The upload is too slow for our use case even though we're sending contentType: 'multipart/form-data'. We've looked through the AWS docs and found that we should be using multipart upload.
Our question is: doesn't contentType: 'multipart/form-data' already ensure that the file is uploaded in chunks? If it doesn't, how can we split the file into parts on the frontend in Flutter and upload all of them to S3? (We're using Dio as our HTTP client.)
Please help us solve this problem.

Try the AWS Amplify SDK:
https://docs.amplify.aws/start/q/integration/flutter/
You should be able to use the standard Amplify Storage library:
https://docs.amplify.aws/lib/storage/getting-started/q/platform/flutter/
Multipart uploads should be handled by the Storage library.
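For reference: contentType: 'multipart/form-data' only describes the HTTP body encoding of a single request; it has nothing to do with S3's multipart upload API, which splits one object across several PUT requests. Under the hood, a library like Amplify Storage drives the three-step S3 API: CreateMultipartUpload, then one UploadPart per chunk, then CompleteMultipartUpload. A minimal server-side sketch in Python with boto3 (bucket and key names are placeholders; S3 requires each part except the last to be at least 5 MB):

```python
PART_SIZE = 5 * 1024 * 1024  # S3 minimum part size (the last part may be smaller)

def split_into_parts(data: bytes, part_size: int = PART_SIZE):
    """Split a byte buffer into sequential chunks for UploadPart calls."""
    return [data[i:i + part_size] for i in range(0, len(data), part_size)]

def multipart_upload(bucket: str, key: str, path: str) -> None:
    """Sketch of the raw S3 multipart flow (requires boto3 and AWS credentials)."""
    import boto3  # third-party dependency, not stdlib

    s3 = boto3.client("s3")
    upload = s3.create_multipart_upload(Bucket=bucket, Key=key)
    with open(path, "rb") as f:
        data = f.read()

    parts = []
    for number, chunk in enumerate(split_into_parts(data), start=1):
        resp = s3.upload_part(
            Bucket=bucket, Key=key,
            UploadId=upload["UploadId"],
            PartNumber=number, Body=chunk,
        )
        parts.append({"ETag": resp["ETag"], "PartNumber": number})

    s3.complete_multipart_upload(
        Bucket=bucket, Key=key,
        UploadId=upload["UploadId"],
        MultipartUpload={"Parts": parts},
    )
```

In a Flutter app you normally wouldn't hold AWS credentials on the device; one common pattern is a small backend that calls CreateMultipartUpload and hands the client presigned URLs for each part, which Dio can then PUT in parallel.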

You might also simply use one of the existing packages to upload files to an S3 bucket:
Simple file uploading -> amazon_s3_cognito: https://pub.dev/packages/amazon_s3_cognito
Some extra features -> minio: https://pub.dev/packages/minio

Related

Can someone guide me on the use cases for serving media files via Uploadcare vs Amazon S3 vs other solutions?

I am building a Django application (with the REST framework) and, while researching, I came across different ways to serve media files.
What is the recommended approach for a project/app that lets users upload pictures so that other users can download them?
I came across CDNs such as Uploadcare and Cloudinary, and then of course Amazon S3 as well. Is there a difference between using these services, like Uploadcare vs S3, in terms of speed or scaling?
Thanks in advance.

Using Amazon S3 SDK in the browser for multipart upload

It seems like the way to go is the Upload class in the package @aws-sdk/lib-storage. It's a class that used to be called ManagedUpload and that supports multipart uploads. However, when trying to import @aws-sdk/lib-storage, I get the error below, so apparently the package only works on Node, not in the browser.
For the browser, what is the best alternative for someone who wants to implement multipart uploads to S3?

Django: use templates from a separate AWS S3 bucket

I have a Django server running in an elastic beanstalk environment. I would like to have it render HTML templates pulled from a separate AWS S3 Bucket.
I am using the Django-storages library, which lets me use static and media files from the bucket, but I can't figure out how to get it to render templates.
The reasoning for doing it like this is that once my site is running, I would like to be able to add these HTML templates without having to redeploy the entire site.
Thank you
To the best of my knowledge, django-storages is responsible for managing static assets and media files; it doesn't mount the S3 bucket onto the file system. What you might be looking for is something like S3Fuse, which will mount the bucket on the file system and allow you to update the templates and have them sync. This might not be the best solution, though: even if you got the sync to work, Django might not pick up those changes and could keep serving the templates from memory.
I believe what you're actually looking for is a continuous delivery pipeline; that way you won't have to worry about hosting the templates separately.
Good question, though.

Setting up a CDN with Wagtail CMS

I am looking into possibly setting up a CDN to use with my Wagtail sites. I am thinking that this will be a more efficient way to manage media uploads during stage/production pushes, since right now the media folder has to be manually copied from server to server on deploy. If all of the images were being accessed from a CDN then this wouldn't be an issue.
This would be my first time using a CDN so I'm looking for advice. There is lots of info on using a CDN with WordPress, but not a lot of documentation on setting one up with Wagtail/Django. I have the following questions about it:
Does anyone have any suggestions on the best way to implement the CDN with Wagtail?
How does it handle the uploads that the user submits through the CMS? Most of the images will be uploaded as part of the static files, but how does it work when the user uploads a photo as part of a post?
Which CDN companies have you had the best/worst experiences with? The sites I am planning to use this for are professional/business, but not e-commerce.
Also, if there is a more efficient way to handle the transfer of media uploads from one environment to another than using a CDN, I'd love to hear your suggestions for that too. As of right now I've had to copy the media folder over after doing the deploy, and I will have to do this every time I make a change to the site.
Thanks in advance for your assistance.
The following resources can be helpful for the setup you need in Wagtail (later today I can provide some more details):
Frontend cache invalidator for pages (so not only for static and media files)
Link: http://docs.wagtail.io/en/latest/reference/contrib/frontendcache.html#frontendcache-aws-cloudfront
Storing media files in Amazon Web Services S3 buckets
This should be a better solution than copying media files from server to server. In this case Amazon Web Services CloudFront (CDN) would be a perfect choice.
Link: https://django-storages.readthedocs.io/en/latest/backends/amazon-S3.html#amazon-s3
More info CloudFront: https://aws.amazon.com/cloudfront/
Static file cache invalidation with Django Whitenoise
Can be relevant to clear the cache for a new deployment (the static files will have a unique filename so CDN will have a new file cache from its origin after the deployment)
Link: http://whitenoise.evans.io/en/stable/django.html
CloudFront from AWS would be my personal choice for a CDN. Besides the awesome resources/services AWS has to offer, CloudFront is simple to set up and one of the best CDNs out there.
Finally, a CDN for serving static and media files has nothing to do with Wagtail specifically. There are some nice apps available for Django itself (see the list above), but you are free to choose another CDN solution (like Cloudflare).
So setting up an AWS S3 bucket for each environment (tst/acc/stg/prd), using it for uploading your media files (so the files aren't on the server anymore), and setting up a CloudFront distribution for these buckets would be a proper solution to your problem.
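To make the django-storages + CloudFront combination concrete, here is a minimal settings sketch. The bucket name and CloudFront domain are placeholders; the setting names follow the django-storages S3 backend linked above:

```python
# settings.py -- hypothetical values, adjust per environment (tst/acc/stg/prd)

# Route Django's default file storage (media uploads) to S3 via django-storages
DEFAULT_FILE_STORAGE = "storages.backends.s3boto3.S3Boto3Storage"

AWS_STORAGE_BUCKET_NAME = "my-wagtail-media-prd"   # placeholder bucket name
AWS_S3_CUSTOM_DOMAIN = "d1234abcd.cloudfront.net"  # your CloudFront distribution

# Serve public media URLs without signed query strings, so CDN caching works
AWS_QUERYSTRING_AUTH = False
```

With AWS_S3_CUSTOM_DOMAIN set, django-storages generates media URLs that point at CloudFront instead of the bucket directly, so uploads made through the Wagtail CMS are served from the CDN without any extra copying between environments.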
Best regards,
Rob Moorman

How to set metadata while compressing files using gzip?

I'm hosting a static website on Amazon S3. It's basically a theme made up of static files (HTML, CSS, JS & image files).
Everything works fine. I've also activated CloudFront, but the site is still sluggish and very slow to load. So I wanted to gzip the content and started to follow this doc: Serving Compressed Files from Amazon S3.
But the part where I have to go through 2000+ files and set the Content-Encoding metadata to gzip on each one is too tedious and isn't practical for me.
Is there a way I can add this metadata while I compress the files using gzip.exe on my local system?
Or any other workaround?
Please guide me. I'm stuck. :(
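One workaround for the question above (a sketch, assuming Python with boto3; the bucket name and file list are placeholders): compress each file yourself and set Content-Encoding: gzip in the same PutObject call, so no per-file console editing is ever needed.

```python
import gzip
import mimetypes

def gzipped_put_params(key: str, raw: bytes) -> dict:
    """Build PutObject kwargs: gzip-compressed body plus matching metadata."""
    content_type = mimetypes.guess_type(key)[0] or "application/octet-stream"
    return {
        "Key": key,
        "Body": gzip.compress(raw),
        "ContentEncoding": "gzip",  # set at upload time, no console edits needed
        "ContentType": content_type,
    }

def upload_gzipped(bucket: str, key: str, raw: bytes) -> None:
    """Upload one gzipped object (requires boto3 and AWS credentials)."""
    import boto3  # third-party dependency, not stdlib

    s3 = boto3.client("s3")
    s3.put_object(Bucket=bucket, **gzipped_put_params(key, raw))
```

Looping this over the 2000+ files replaces the manual metadata step entirely. The AWS CLI can do the same per file with `aws s3 cp --content-encoding gzip` if you pre-compress locally (e.g. with gzip.exe), but then you must pass the flag yourself on every copy.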