Not able to upload large files to Amazon S3 - Django

I am trying to upload large files like audio and video to Amazon S3, and Django is throwing a worker timeout. If I use different credentials, it works fine; just replacing the credentials with those of the other account does the trick. When I revert to the original account's credentials, it fails again, although it works for small files under 3 MB. I suspect I may have missed a setting in the Amazon S3 dashboard for this particular account. Could anyone help me out?

Related

Can users upload files into an S3 bucket without frontend development or access to the AWS account?

I am looking to create an AWS solution where a Lambda function transforms some Excel data from an S3 bucket. When thinking about the architecture, I need a way for non-technical users, who don't have access to the AWS account, to upload data into an S3 bucket. One possible solution is using the S3 API and creating a UI that allows users to upload the data; however, I do not have much experience with front-end skills such as JS and HTML. Are there any other possible solutions?
Some ideas:
Serve a web page from somewhere (S3?) and have them upload via the browser, or
Give them a simple program like CyberDuck and they can drag & drop files, or
Set up the AWS Command-Line Interface (CLI) on their computer and have them double-click a script file to sync a local disk folder to the S3 bucket, or
Use Dropbox and Invoke Lambda Function on New File from Dropbox - Pipedream
The main thing to think about is how you want to secure the upload. For example, do they need to authenticate first, or do you want anyone in the world to be able to upload to the bucket (not a good idea!). Should each user have their own IAM User credentials, or should they authenticate to an application that manages its own logins?
Also, have a think about what you want to happen after they upload the file. If they need to download something after the file has been processed, then you'll need a way to 'give back' the new file.

When is a file available to download from Amazon S3?

I can't find information about this aspect of Amazon S3; I hope you can help me. When is a file available for users to download after a POST upload? I mean a small JSON file that doesn't require much processing. Is it available to download immediately after uploading? Or does Amazon S3 work in sessions so that it always takes a few hours?
According to the doc,
Amazon S3 provides strong read-after-write consistency for PUTs and DELETEs of objects in your Amazon S3 bucket in all AWS Regions.
This means that your objects are available to download immediately after they are uploaded.
An object that is uploaded to an Amazon S3 bucket is available right away. There is no time period that you have to wait. That means if you are writing a client app that uses these objects, you can access them as soon as they are uploaded.
In case anyone is wondering how to programmatically interact with objects in an Amazon S3 bucket, here is an example of uploading and reading objects in an Amazon S3 bucket from a client web app:
Creating an example AWS photo analyzer application using the AWS SDK for Java

Uploading log file from client app to Amazon S3 is safe?

My application runs on the client PC. It produces log files including error reports and user actions.
To collect and analyze the log files, I am trying to upload them to Amazon S3 from the client PC.
But is it safe? My app has no authentication, so users can upload an unlimited number of files. I am concerned that a malicious user could upload fake error reports or huge files. I'd like the S3 bucket not to exceed the free quota. Is there any best practice for this task?
Just make sure that the files you are uploading to Amazon S3 are kept as Private and the Amazon S3 bucket is kept as private. These are the default settings and are enforced by Amazon S3 block public access unless somebody has specifically changed the settings.
With this configuration, the files are only accessible to people with AWS credentials that have been granted permission to access the S3 bucket.
In addition to John's answer, you can use AWS KMS (https://aws.amazon.com/kms/?nc1=h_ls) to encrypt your data at rest.
With regard to the file size, I would say you should limit the size of the uploaded file in your application.
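Limiting the size in the application can be as simple as checking the file before it is handed to the S3 client. A minimal Python sketch (the 10 MB cap is an illustrative assumption, not a recommendation):

```python
import os

MAX_UPLOAD_BYTES = 10 * 1024 * 1024  # illustrative 10 MB cap


def check_upload_size(path, limit=MAX_UPLOAD_BYTES):
    """Return the file size if it is within the limit, else raise.

    Called before handing the file to the S3 client, so oversized
    (possibly malicious) log files never leave the client machine.
    """
    size = os.path.getsize(path)
    if size > limit:
        raise ValueError(f"{path} is {size} bytes, over the {limit}-byte limit")
    return size
```

Note that a client-side check alone can be bypassed by a tampered client; if uploads go through a presigned POST, the same cap can also be enforced server-side with a `content-length-range` condition in the signed policy.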

Best choice of uploading files into S3 bucket

I have to upload video files into an S3 bucket from my React web application. I am currently developing a simple React application, and from it I am trying to upload video files into an S3 bucket, so I have considered two approaches for implementing the upload.
1) Amazon EC2 instance: From the front end I hit the API, and the server running on the Amazon EC2 instance uploads the files into the S3 bucket.
2) Amazon API Gateway + Lambda: I send the local files directly into the S3 bucket through API Gateway and a Lambda function by calling the HTTPS URL with the data.
But I am not happy with either of these methods because both are costly. I have to upload files larger than 200 MB into an S3 bucket, and I don't know how to optimize the upload process. The video-upload part is essential to my application, so I need to handle it carefully and make it both performant and cost-effective.
If someone knows a solution, please share it with me; it would be very helpful for continuing my work.
Thanks in advance.
You can directly upload files from your React app to S3 using the AWS JavaScript SDK and Cognito identity pools. For the optimization part, you can use AWS's multipart upload capability to upload the file in multiple parts. I'm providing links to read about it further:
AWS javascript upload image example
cognito identity pools
multipart upload to S3
Also take a look at the AWS managed upload utility made for the JavaScript SDK:
aws managed upload javascript
In order to bypass the EC2 instance, you can use a pre-authenticated POST request to upload your content directly from the browser to the S3 bucket.

File upload API on EC2 with ELB and S3

I am developing an app server with Node.js and AWS.
I am setting up the server environment with an ELB and EC2 instances.
I am using the ELB as a load balancer with several app-server EC2 instances attached to it.
One additional EC2 instance is used for MongoDB.
My question is about requests that include file uploads.
I think uploaded files should not live on the app server (EC2 instance), so I will try to save uploaded files in S3 and allow the app servers (EC2 instances) to access them.
The rough solution is that when an app server accepts a file from a client, it moves the file to S3 and deletes the local copy.
But that causes some performance loss, and it doesn't feel like a clean approach.
Is this the best way, or is there another way to solve it?
I think uploading files to S3 is the best approach.
But each file is uploaded along with other data (for example, a profile upload: name: String, age: Number, profileImage: File).
I need to process that other data on the app server, so the client should not upload to S3 directly.
Is there a better idea?
Please save me.
P.S: Please let me know if you cannot understand my expression because I am not native. If so, I will add some explanation for it with my best!
You can directly upload to S3 using temporary credentials that allow the end user to write to your bucket.
There is a good article, with detailed code, on doing exactly what you are trying to do with Node.js here.
Answers that refer to external links are frowned upon on SO, so in a nutshell:
include the aws sdk in your application
provide it with appropriate credentials
use those credentials to generate a signed URL with a short lifespan
provide the end user with the signed URL they can then use to upload, preferably asynchronously with progress feedback