How to upload images to AWS Amplify

I'm trying to upload images to AWS Amplify. First of all, I'm not even sure that's how it's done, since I'm fairly new to AWS. I have a model with an array of strings that holds image link addresses, which I use for display in the app. Now I'm trying to do the opposite: upload images from the React Native app to AWS Amplify, knowing that a link needs to be created after uploading. How would I go about doing that?

Amplify Storage actually utilizes S3, so you'd be creating an S3 storage bucket and uploading your images to that. The Amplify libraries provide an easy way to do all of this.
An overview of the upload process: https://docs.aws.amazon.com/AmazonS3/latest/API/browser-based-uploads-aws-amplify.html
Amplify JS documentation for the put method: https://docs.amplify.aws/lib/storage/upload/q/platform/js/
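As a rough sketch of what that can look like from React Native (assuming you have already added the Auth and Storage categories with the Amplify CLI and called Amplify.configure; the helper name, file name, and content type below are placeholders):
```javascript
import { Storage } from 'aws-amplify';

// Hypothetical helper: upload a local image (e.g. a URI from an image picker)
// and return a link you can store in your model's array of image strings.
async function uploadImage(localUri, fileName) {
  // Read the local file into a Blob (React Native's fetch can read local URIs)
  const response = await fetch(localUri);
  const blob = await response.blob();

  // Upload to the S3 bucket behind Amplify Storage
  const { key } = await Storage.put(fileName, blob, {
    contentType: 'image/jpeg', // adjust to the actual image type
  });

  // Get a URL for the uploaded object (pre-signed by default)
  const url = await Storage.get(key);
  return url;
}
```
Storage.get returns a pre-signed URL by default, which you could then store in your model's array of image links.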

Related

Can users upload files into an S3 bucket without front-end experience or access to the AWS account?

I am looking to create an AWS solution where a Lambda function will transform some Excel data from an S3 bucket. When thinking about the architecture, I need a way to get non-technical users, who don't have access to the AWS account, to upload data into an S3 bucket. One possible solution is using an S3 API and creating a UI to allow the users to upload the data. However, I do not have much experience with front-end programming skills such as JS and HTML. Are there any other possible solutions we could use?
I've thought about creating a simple UI and using an S3 API to ingest data into the bucket, but I do not have front-end programming experience.
Some ideas:
Serve a web page from somewhere (S3?) and have them upload via the browser (see the sketch after this answer), or
Give them a simple program like Cyberduck and they can drag & drop files, or
Set up the AWS Command-Line Interface (CLI) on their computer and have them double-click a script file to sync a local disk folder to the S3 bucket, or
Use Dropbox and Invoke Lambda Function on New File from Dropbox - Pipedream
The main thing to think about is how you want to secure the upload. For example, do they need to authenticate first, or do you want anyone in the world to be able to upload to the bucket (not a good idea!). Should each user have their own IAM User credentials, or should they authenticate to an application that manages its own logins?
Also, have a think about what you want to happen after they upload the file. If they need to download something after the file has been processed, then you'll need a way to 'give back' the new file.
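For the browser-upload idea above, one common pattern is to have a small backend hand out a pre-signed URL only after the user has authenticated, so the bucket never needs to be public. A minimal sketch with the AWS SDK for JavaScript v3 (bucket name, region, key prefix, and expiry are placeholders):
```javascript
import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';
import { getSignedUrl } from '@aws-sdk/s3-request-presigner';

const s3 = new S3Client({ region: 'us-east-1' });

// Backend: create a URL that allows uploading one specific object for 15 minutes.
async function getUploadUrl(fileName) {
  const command = new PutObjectCommand({
    Bucket: 'my-upload-bucket',
    Key: `incoming/${fileName}`,
  });
  return getSignedUrl(s3, command, { expiresIn: 900 });
}

// Browser: PUT the file straight to S3 using that URL.
async function uploadFile(file, uploadUrl) {
  return fetch(uploadUrl, { method: 'PUT', body: file });
}
```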

How do I create object approval for an S3 object from the AWS Management Console or AWS CLI

I am using the boto3 module with Django to upload images and videos to AWS S3, and I am also using the CloudFront CDN.
Users create their accounts and upload images and videos to AWS S3, but I want to add a check and implement admin approval for videos and images.
Currently, the images and videos uploaded to AWS S3 via the Django app are public by default.
Is it possible to implement admin approval for images and videos via the AWS Management Console or the AWS CLI?
Please help.
Use a specific prefix (like "unapproved") when users upload files.
Create an application (an admin panel) on web/mobile where you can list the files prefixed with "unapproved".
Then check and approve (one button): after approval, copy the original file to S3 renamed with an "approved" prefix (or simply without a prefix) and delete the old one.
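A minimal sketch of that approve step with the AWS SDK for JavaScript v3 (bucket name, prefixes, and region are placeholders; the same copy-then-delete pattern works from a boto3-based Django admin):
```javascript
import {
  S3Client,
  CopyObjectCommand,
  DeleteObjectCommand,
} from '@aws-sdk/client-s3';

const s3 = new S3Client({ region: 'us-east-1' });

async function approveObject(bucket, unapprovedKey) {
  // e.g. "unapproved/video1.mp4" -> "approved/video1.mp4"
  const approvedKey = unapprovedKey.replace(/^unapproved\//, 'approved/');

  // Copy the object to its approved key...
  await s3.send(new CopyObjectCommand({
    Bucket: bucket,
    CopySource: `${bucket}/${unapprovedKey}`,
    Key: approvedKey,
  }));

  // ...then delete the original so only approved content remains.
  await s3.send(new DeleteObjectCommand({
    Bucket: bucket,
    Key: unapprovedKey,
  }));
}
```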

Streaming data to the web from Google Cloud Storage

I am trying to publish the data from my Google Cloud Storage to a website. I have created a script which dynamically uploads the data from my device to Google Cloud. The issue I am facing is how to publish the data from Google Cloud to a website; I just need to display the data which I have on Google Cloud. If there's a good way, then please suggest it. Thank you.
As suggested in the comments, you can make your bucket public and just have your application fetch the object via an HTTP request or even just post the link in your website depending on what you are trying to do. If you don’t want to make your bucket public, or you just want a more personalized approach, you can just create your own application that retrieves data from the bucket by using the GCS client library. Here you can find some code samples on how you could download objects from a GCS bucket from within your own application.
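For the client-library route, a minimal sketch with the Node.js client (@google-cloud/storage); the bucket and object names are placeholders and credentials are assumed to come from the environment (e.g. GOOGLE_APPLICATION_CREDENTIALS):
```javascript
const { Storage } = require('@google-cloud/storage');

const storage = new Storage();

async function downloadObject() {
  // download() resolves to an array whose first element is a Buffer
  const [contents] = await storage
    .bucket('my-bucket')
    .file('data.json')
    .download();
  return contents.toString('utf8');
}
```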
I have found a way: when you upload some data (let's say some images), you make the storage bucket public. After making it public, you will get a link to each object you upload. So, when you are uploading an image to GCP storage, save the objects with sequential names (e.g. 1.jpg, 2.jpg, 3.jpg, ...). You will then get a link to each object in the format https://storage.googleapis.com/bucket_name/1.jpg.
So when you are working on the front end, you just need to set up a loop and all the data will be streamed to the web.
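A minimal sketch of that front-end loop, assuming the bucket is public and the objects are named 1.jpg, 2.jpg, ... (the bucket name and image count are placeholders):
```javascript
const BUCKET_NAME = 'your_bucket_name';
const IMAGE_COUNT = 10;

// Build the public URL for each object and render it as an <img> tag.
for (let i = 1; i <= IMAGE_COUNT; i++) {
  const img = document.createElement('img');
  img.src = `https://storage.googleapis.com/${BUCKET_NAME}/${i}.jpg`;
  document.body.appendChild(img);
}
```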

How to upload files from React Native to S3 and store the filename via a REST API

I have a dilemma about how best to architect my React Native app using Amazon AWS S3 as image/file storage and a Django backend with a REST API.
My React Native app has to be able to collect information from the user together with a couple of images and signatures. I am saving all the information as props in Redux, and I can successfully transfer it to the database using the REST API in Django, which I use as the backend system.
I can also send images to an Amazon AWS S3 bucket, but that is a separate operation.
My dilemma is whether it is good practice to send the images to S3 first and then send the filenames in the REST API call together with the other info collected from the user in the app.
That way, the files are in place on S3 and I can use them when creating a PDF file, which will be done by the Django backend system.
You should be using Amazon S3 as the storage backend for Django via S3Boto3Storage (from django-storages). That makes it a single operation and also gives Django access to S3.
The other option is to mount S3 as a file system on the machine running Django and set the MEDIA path to the mounted location. This, however, adds the step of mounting S3 on every startup of the machine.
Check out this link for the first option.

Best choice for uploading files into an S3 bucket

I have to upload video files into an S3 bucket from my React web application. I am currently developing a simple React application, and from this application I am trying to upload video files into an S3 bucket, so I have considered two approaches for implementing the upload:
1) Amazon EC2 instance: From the front end, I hit an API on a server running on an Amazon EC2 instance, and that instance uploads the files into the S3 bucket.
2) Amazon API Gateway + Lambda: I send the local files directly to an S3 bucket through API Gateway and a Lambda function by calling the HTTPS URL with the data.
But I am not happy with either of these methods because both are quite costly. I have to upload files of more than 200 MB into an S3 bucket, and I don't know how to optimize the upload process. The video upload is essential to my application, so I have to be very careful with this part and make it both performant and cost-effective.
If someone knows a solution, please share it with me; it would be very helpful for continuing my work.
Thanks in advance.
You can directly upload files from your React app to S3 using the AWS JavaScript SDK and Cognito identity pools, and for the optimization part you can use AWS's multipart upload capability to upload a file in multiple parts. I'm providing links to read about it further:
AWS javascript upload image example
cognito identity pools
multipart upload to S3
Also consider taking a look at the managed upload utility in the AWS JavaScript SDK:
aws managed upload javascript
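Putting those pieces together, a minimal sketch with the AWS SDK for JavaScript v2 (the identity pool ID, region, and bucket name are placeholders; s3.upload is the managed uploader, which switches to multipart upload automatically for large files):
```javascript
import AWS from 'aws-sdk';

// Credentials come from a Cognito identity pool, so no keys live in the app.
AWS.config.update({
  region: 'us-east-1',
  credentials: new AWS.CognitoIdentityCredentials({
    IdentityPoolId: 'us-east-1:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx',
  }),
});

const s3 = new AWS.S3();

function uploadVideo(file) {
  // upload() is the managed uploader: it handles multipart upload for large
  // files and reports progress along the way.
  return s3
    .upload({
      Bucket: 'my-video-bucket',
      Key: `uploads/${file.name}`,
      Body: file,
      ContentType: file.type,
    })
    .on('httpUploadProgress', (evt) => {
      console.log(`Uploaded ${evt.loaded} of ${evt.total} bytes`);
    })
    .promise();
}
```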
In order to bypass EC2, you can use a pre-signed (pre-authenticated) POST request to upload your content directly from the browser to the S3 bucket.
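A minimal sketch of that approach, split into a small backend that generates the pre-signed POST policy and the browser code that posts the file (bucket name, region, expiry, and size limit are placeholders):
```javascript
// 1) On a small backend (Node.js), generate the pre-signed POST policy:
import { S3Client } from '@aws-sdk/client-s3';
import { createPresignedPost } from '@aws-sdk/s3-presigned-post';

const s3 = new S3Client({ region: 'us-east-1' });

async function getUploadPolicy(key) {
  return createPresignedPost(s3, {
    Bucket: 'my-video-bucket',
    Key: key,
    Expires: 3600, // policy valid for one hour
    Conditions: [['content-length-range', 0, 500 * 1024 * 1024]], // up to 500 MB
  });
}

// 2) In the browser, POST the file using the returned url and fields:
async function uploadWithPolicy(file, { url, fields }) {
  const form = new FormData();
  Object.entries(fields).forEach(([name, value]) => form.append(name, value));
  form.append('file', file); // the file field must come last
  return fetch(url, { method: 'POST', body: form });
}
```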