I have a website www.theraplounge.co that allows users to upload videos. The problem is our limit on file sizes is too small. How can I increase the file size users are able to upload through my forms.py FileField? By the way, I'm currently using Amazon S3.
You should be able to configure it via the DATA_UPLOAD_MAX_MEMORY_SIZE and/or FILE_UPLOAD_MAX_MEMORY_SIZE settings (remember that you should give the value in bytes).
Check it out:
https://docs.djangoproject.com/en/4.0/ref/settings/#std-setting-DATA_UPLOAD_MAX_MEMORY_SIZE
https://docs.djangoproject.com/en/4.0/ref/settings/#std-setting-FILE_UPLOAD_MAX_MEMORY_SIZE
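In settings.py that would look something like this (500 MB is just an example value):
# settings.py -- both values are in bytes
DATA_UPLOAD_MAX_MEMORY_SIZE = 524288000  # 500 MB: max size of a request body (file data excluded)
FILE_UPLOAD_MAX_MEMORY_SIZE = 524288000  # 500 MB: uploads larger than this are streamed to a temp file on disk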
The problem here is Heroku's 30-second request timeout: I found that large files upload fine on the development server but not on the production server. The solution is to upload the files from the front end using JavaScript, since the upload then goes directly from the browser to S3 instead of through the server. I am currently using the JavaScript AWS SDK to upload video files from the front end. I got the code working for small files under 30 MB, but I still face problems uploading large files. Posting the working JavaScript AWS SDK code below:
<div>
  <input type="file" id="fileUpload">
</div>
<div>
  <button onclick="s3upload()">Submit</button>
</div>
<script type="text/javascript">
function s3upload() {
    AWS.config.region = 'us-east-1'; // Region
    AWS.config.credentials = new AWS.CognitoIdentityCredentials({
        IdentityPoolId: 'MY-COGNITO-CREDENTIALS',
    });
    var files = document.getElementById('fileUpload').files;
    if (files) {
        var file = files[0];
        var fileName = file.name;
        // public URL the file will have after the upload
        var fileUrl = 'https://theraplounge.s3.amazonaws.com/' + fileName;
        var s3 = new AWS.S3({apiVersion: '2006-03-01'});
        var params = {
            Bucket: 'MY-BUCKET-NAME',
            Key: fileName,
            Body: file,
        };
        // multipart settings: 5 MB parts, one part in flight at a time
        var options = {partSize: 5 * 1024 * 1024, queueSize: 1};
        s3.upload(params, options, function(err, data) {
            if (err) {
                alert('error');
            } else {
                alert('uploaded successfully');
            }
        });
        console.log(s3);
    }
}
</script>
This is the right approach, as the upload doesn't touch the server at all, but I can't get it to upload large files. Once I do find working JavaScript code for uploading large files, I'll update this answer.
Edit:
My problem with uploading large files was my CORS policy on Amazon AWS. When launched to production I also confirmed that the JavaScript AWS SDK upload runs entirely in the browser and bypasses Heroku's 30-second timeout. So I built on the process and created a two-step form upload: the front end uploads the video to S3, then passes the video's S3 URL to another view, which resets the 30-second timeout and saves the rest of the form data. Test it out on my site www.theraplounge.co/. I manually set the max file size to 500 MB, but it could be more.
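For reference, here is a minimal sketch of that two-step flow (the /videos/create/ endpoint and the field names are placeholders, not the actual code from my site). It reuses params and options from the snippet above: step one uploads to S3, step two posts only the resulting S3 URL and the remaining form fields, so the request to Heroku finishes well inside the 30-second limit.
// Step 1: upload straight to S3 (as above). Step 2: on success, POST only the
// S3 URL (data.Location) plus the remaining form fields to a normal Django view.
s3.upload(params, options, function(err, data) {
    if (err) return alert('error');
    var form = new FormData();
    form.append('video_url', data.Location);                       // S3 URL of the uploaded file
    form.append('title', document.getElementById('title').value);  // placeholder extra field
    fetch('/videos/create/', {method: 'POST', body: form})         // placeholder endpoint
        .then(function(res) { return res.json(); })
        .then(function() { alert('uploaded successfully'); });
});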
I am working with an AWS S3 bucket, trying to upload an image from a React Native project managed by Expo, with Express on the backend. I have created an s3 file on the backend that handles getting the presigned URL; this works and returns the URL to the front end inside the thunk function below (from Redux Toolkit). I used axios to send the request to my server, and that works. I have tried both axios and fetch for the final PUT to the presigned URL, but when it reaches the S3 bucket there is nothing in the file, just an empty file of 200 bytes every time. When I use the same presigned URL from Postman and upload an image in the binary section, the image uploads to the bucket with no problems. When I send binary or base64 to the bucket from the RN app, it just uploads those values as text. I attempted react-native-image-picker but was having problems with that too. Any ideas would be helpful, thanks. I have included a snippet from my Redux slice; if you need more info, let me know.
redux slice projects.js
// create a project
// fancy function here ......
export const createProject = createAsyncThunk(
  "projects/createProject",
  async (postData) => {
    // sending image to s3 bucket and getting a url to store in d
    const response = await axios.get("/s3")
    // post image directly to s3 bucket
    const s3Url = await fetch(response.data.data, {
      method: "PUT",
      body: postData.image
    });
    console.log(s3Url)
    console.log(response.data.data)
    // make another request to my server to store extra data
    try {
      const response = await axios.post('/works', postData)
      return response.data.data;
    } catch (err) {
      console.log("Create projects failed: ", err)
    }
  }
)
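For reference, since sending a string body uploads the string itself as text, here is a sketch of the PUT with the body read into a Blob first (assuming that fetching the local uri returns the file contents; the Content-Type value is a placeholder and should match whatever the URL was signed for):
// Sketch: read the local file uri into a Blob, then PUT the Blob itself,
// not the uri string, to the presigned URL.
async function putImageToS3(presignedUrl, localUri) {
  const localFile = await fetch(localUri);
  const blob = await localFile.blob();
  return fetch(presignedUrl, {
    method: "PUT",
    headers: { "Content-Type": "image/jpeg" }, // placeholder type
    body: blob
  });
}
Inside the thunk above, this would be called as putImageToS3(response.data.data, postData.image).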
I am currently creating a basic app with React Native (frontend) and Flask/MongoDB (backend). I am planning on using AWS S3 as cheap cloud storage for all the images and videos that are going to be uploaded and viewed. My current idea (and this could be totally off) is that when a user uploads content, it will go through my Flask API and then to S3 storage. When a user wants to view content, I am not sure what the plan of attack is here. Should I use my Flask API as a proxy, or is there a way to simply send a link to the content directly on S3 (which would avoid the extra traffic through my API)?
I am quite new to using AWS and if there is already a post discussing this topic, please let me know, and I'd be more than happy to take down this duplicate. I just can't seem to find anything.
Should I use my Flask API as a proxy, or is there a way to simply send a link to the content directly on S3 (which would avoid the extra traffic through my API)?
If the content is public, you just provide a URL which points directly to the file on the S3 bucket.
If the content is private, you generate a presigned URL on your backend for the file to which you want to give access. This URL should be valid for a short amount of time (for example, 15 or 30 minutes). You can regenerate it if it expires.
Moreover, you can generate a presigned URL which can be used for uploads directly from the front-end to the S3 bucket. This might be an option if you don't want the upload traffic to go through the backend or you want faster uploads.
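For the upload case, a minimal front-end sketch, assuming the backend has already returned a presigned PUT URL (the function name and error handling here are illustrative):
// Hedged sketch: the browser PUTs the file straight to S3 using a presigned
// URL from the backend, so the file never passes through the Flask API.
async function uploadDirect(presignedUrl, file) {
  const res = await fetch(presignedUrl, {
    method: 'PUT',
    headers: { 'Content-Type': file.type }, // if the URL was signed with a content type, this must match
    body: file,
  });
  if (!res.ok) throw new Error('Upload failed: ' + res.status);
}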
There is an API for this, boto3; try to use it.
It is not so difficult. I have done something similar and will post the code here.
I did it the way @Ervin said:
1. Frontend asks backend to generate credentials.
2. Backend sends the credentials to the frontend.
3. Frontend uploads the file to S3.
4. Frontend tells the backend it is done.
5. Backend validates that everything is OK.
6. Backend creates a link to download; you have a lot of security options.
Example of item 6: generating a presigned URL to download content.
bucket = app.config.get('BOTO3_BUCKET', None)
client = boto_flask.clients.get('s3')
params = {}
params['Bucket'] = bucket
params['Key'] = attachment_model.s3_filename
params['ResponseContentDisposition'] = 'attachment; filename={0}'.format(attachment_model.filename)
if attachment_model.mimetype is not None:
    params['ResponseContentType'] = attachment_model.mimetype
url = client.generate_presigned_url('get_object', ExpiresIn=3600, Params=params)
Example of item 2: the backend creates presigned credentials to POST your file to S3 and sends s3_credentials to the frontend.
acl_permission = 'private' if private_attachment else 'public-read'
condition = [{'acl': acl_permission},
             ["starts-with", "$key", '{0}/'.format(folder_name)],
             {'Content-Type': mimetype}]
bucket = app.config.get('BOTO3_BUCKET', None)
fields = {"acl": acl_permission, 'Bucket': bucket, 'Content-Type': mimetype}
client = boto_flask.clients.get('s3')
s3_credentials = client.generate_presigned_post(bucket, s3_filename, Fields=fields, Conditions=condition, ExpiresIn=3600)
Example of item 5: how the backend can check whether the file on S3 is OK.
bucket = app.config.get('BOTO3_BUCKET', None)
client = boto_flask.clients.get('s3')
response = client.head_object(Bucket=bucket, Key=s3_filename)
if response is None:
    return None, None
md5 = response.get('ETag').replace('"', '')
size = response.get('ContentLength')
Here is an example of how the frontend asks for credentials, uploads the file to S3, and informs the backend when it is done.
I have stripped out a lot of project-specific code.
//frontend asking backend to create credentials, frontend will send some file metadata
AttachmentService.createPostUrl(payload).then((responseCredentials) => {
  let form = new FormData();
  Object.keys(responseCredentials.s3.fields).forEach(key => {
    form.append(key, responseCredentials.s3.fields[key]);
  });
  form.append("file", file);
  let payload = {
    data: form,
    url: responseCredentials.s3.url
  };
  //Frontend will send file to S3
  axios.post(payload.url, payload.data).then((res) => {
    return Promise.resolve(true);
  }).then((result) => {
    //when it is done, frontend informs backend
    AttachmentService.uploadSuccess(...).then((refreshCase) => {
      //Success
    });
  });
});
We've built a custom front end for users to post threads and upload images on top of Sitecore 9. Recently, we moved the user-generated content to its own database, so the media files are no longer in the proper Sitecore 'media gallery'. Also, file storage is on a networked share. Uploading images works just fine; it's immediately afterward where the problem lies.
Upon image upload, our REST API returns the media URL from MediaManager.GetMediaUrl(itemId). This also works: it returns a valid URL that is formatted correctly and should resolve. Unfortunately, for some time afterward, the URL does the Sitecore dance and 302s to our 404 page.
I can see the image through the Content Editor, and our custom content handler injects the image folder node into both the master and web databases.
Why would the link manager be able to find the URL while the image is not available? I uploaded something yesterday afternoon, and when I checked this morning, the image is now available on the site. Any information or suggestions are appreciated.
I have tried saving the image multiple times, hoping that this might trigger whatever mystical Sitecore event causes images to show. Since there isn't a publish due to the separate database, I can't try that. I've removed all versions, thinking it couldn't default to a particular language. Nothing. Only time seems to make the images visible. The code below works just fine; I'm just putting it here to show some work.
public MediaItem UploadSimpleMedia(MediaGallerySimpleUploadRequest request)
{
    try
    {
        var destinationFolder = request.ParentItemId != null
            ? _content.GetItem<Item>(request.ParentItemId.Value)
            : _content.GetItem<Item>(_publicLibraryPath + "/embeded");
        var name = !string.IsNullOrEmpty(request.Name)
            ? ItemUtil.ProposeValidItemName(request.Name)
            : ItemUtil.ProposeValidItemName(request.Files[0].FileName);
        var creator = new MediaCreator();
        var tags = request.Tags != null ? request.Tags.Split(',') : new string[0];
        var tagIds = _tagService.GetTagIds(tags);
        var tagIdString = string.Join(",", tagIds);
        var options = new MediaCreatorOptions()
        {
            AlternateText = request.Name,
            FileBased = true,
            IncludeExtensionInItemName = false,
            Versioned = false,
            Destination = $"{destinationFolder.Paths.Path}/{name}",
            Database = Factory.GetDatabase("content")
        };
        using (new SecurityDisabler())
        using (new DatabaseCacheDisabler())
        {
            MediaItem item = creator.CreateFromStream(request.Files[0].InputStream, request.Files[0].FileName, options);
            item.BeginEdit();
            item.InnerItem["Title"] = request.Name;
            item.InnerItem["Description"] = request.Description;
            item.InnerItem["__Semantics"] = tagIdString;
            // Sitecore won't calculate MIME Type or File Size automatically unless you upload directly
            // to the Sitecore Media Library. We're uploading in-place for private groups for permission inheritance
            // and now because of indexing, they are getting uploaded to /Community/Gallery/Files if not private.
            item.InnerItem["Size"] = request.Files[0].ContentLength.ToString();
            item.InnerItem["Mime Type"] = request.Files[0].ContentType;
            item.EndEdit();
            // Just pausing here to reflect on why Sitecore is so sexily complicated
            // that it takes time to display an image on the CD, but has no problem giving
            // up the media url
            CacheManager.GetHtmlCache(Sitecore.Context.Site);
            _searchService.RefreshIndex(item.ID.Guid);
            var fromDatabase = _content.GetItem<Item>(item.ID.Guid);
            return fromDatabase;
        }
    }
    catch (Exception ex)
    {
        _logger.Error(this.GetType().AssemblyQualifiedName, ex);
        _logger.Trace(ex.StackTrace);
    }
    return null;
}
I don't get any error messages, either in our custom logs or the default Sitecore logs. Images just aren't resolving. Is it caching? I've even nuked all caches by calling CacheManager.ClearAllCaches().
I am developing an application that needs to send files to Google Cloud Storage.
The webapp will have an HTML page where the user chooses files to upload.
The users do not have Google Accounts.
The number of files to send is 5 or fewer.
I do not want to send files to GAE and have GAE send them to GCS; I would like my users to upload directly to GCS.
I wrote this code for the upload:
function sentStorage() {
    var file = document.getElementById("myFile").files[0];
    var url = 'https://www.googleapis.com/upload/storage/v1/b/XXX/o?uploadType=resumable&name=' + file.name;
    var xhr = new XMLHttpRequest();
    var token = 'ya29.XXXXXXXXXXXXXXX';
    xhr.open('POST', url);
    xhr.setRequestHeader('Content-Type', file.type);
    // resumable
    //url = 'https://www.googleapis.com/upload/storage/v1/b/XXXXXX/o?uploadType=resumable&name=' + file.name;
    //xhr.setRequestHeader('Content-Type', 'application/json; charset=UTF-8');
    //xhr.setRequestHeader('Content-Length', file.size);
    xhr.setRequestHeader('x-goog-project-id', 'XXXXXXXXXX');
    xhr.setRequestHeader('Authorization', 'Bearer ' + token);
    xhr.send(file);
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4) {
            var response = JSON.parse(xhr.responseText);
            if (xhr.status === 200) {
                alert('code 200');
            } else {
                var message = 'Error: ' + response.error.message;
                console.log(message);
                alert(message);
            }
        }
    };
}
I got service account information (from the Google Console) and generated a Bearer token for it. I used a Python file that reads the JSON account information and generates the token.
My requirement is that users do not need to confirm any Google Account information to send files; that obligation belongs to my application (users do not have Google Accounts). The HTML page must send the files directly to GCS without going through GAE or GCE, so I need to use an HTML form or JavaScript. I prefer JavaScript.
Only users of this application can upload (the application has database-backed authentication), so anonymous users cannot do it.
My questions are:
Will this token expire? I used a service account to generate it.
Is there a better JavaScript API to do this?
Is this security solution good, or should I use a different approach?
Sending either a refresh or an access token to an untrusted end user is very dangerous. The bearer of an access token has complete authority to act as the associated account (within the scope used to generate it) until the access token expires a few minutes later. You don't want to do that.
There are a few good alternatives. The easiest way is to create exactly the upload request you want, then sign the URL for that request using the private key of a service account. That signed URL, which will be valid for a few minutes, could then be used to upload a single object. You'll need to sign the URL on the server side before giving it to the customer. Here's the documentation on signed URLs: https://cloud.google.com/storage/docs/access-control/signed-urls
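For illustration, a server-side sketch using the official @google-cloud/storage Node client (the bucket, object, and key-file names are placeholders):
// Hedged sketch: sign a short-lived V4 upload URL on the server with the
// service account's private key; only the URL is handed to the browser.
const { Storage } = require('@google-cloud/storage');
const storage = new Storage({ keyFilename: 'service-account.json' }); // placeholder path

async function getUploadUrl(bucketName, objectName, contentType) {
  const [url] = await storage.bucket(bucketName).file(objectName).getSignedUrl({
    version: 'v4',
    action: 'write',                      // allows a single PUT of this object
    expires: Date.now() + 15 * 60 * 1000, // valid for 15 minutes
    contentType,                          // the client's PUT must send the same Content-Type
  });
  return url;
}
The browser then PUTs the file to that URL with the matching Content-Type header; no Google Account or bearer token is ever exposed to the user.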
Hi, I am trying to upload an image to Amazon S3 using react-native-aws-signature. Here is the sample code I am attaching:
var AWSSignature = require('react-native-aws-signature');
var awsSignature = new AWSSignature();
var source1 = {uri: response.uri, isStatic: true}; // this is the uri which I got from the image picker
console.log("source:" + JSON.stringify(source1));
var credentials = {
    SecretKey: 'security-key',
    AccessKeyId: 'AccesskeyId',
    Bucket: 'Bucket_name'
};
var options = {
    path: '/?Param2=value2&Param1=value1',
    method: 'POST',
    service: 'service',
    headers: {
        'X-Amz-Date': '20150209T123600Z',
        'host': 'xxxxx.aws.amazon.com'
    },
    region: 'us-east-1',
    body: response.uri,
    credentials
};
awsSignature.setParams(options);
var signature = awsSignature.getSignature();
var authorization = awsSignature.getAuthorizationHeader();
Here I am declaring source1, and response.uri (which comes from the image picker) is passed in the body. Can anyone suggest whether there is anything wrong in my code? If so, please tell me how to resolve it. Any help is much appreciated.
awsSignature.getAuthorizationHeader() will return the authorization header when given the correct parameters, and that's all it does. It's just one step in the whole process of making a signed call to the AWS API.
When sending a POST request to S3, here is a link to the official documentation that you should read: S3 Documentation.
It seems you need to send in the image as a form parameter.
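A hedged sketch of what that looks like from React Native (the exact policy and signature fields depend on how you sign the request; the {uri, type, name} object is how React Native's FormData attaches a file):
// Hypothetical sketch: S3 POST object uploads expect multipart/form-data,
// with the policy fields first and the "file" field last.
var form = new FormData();
form.append('key', 'uploads/photo.jpg'); // placeholder object key
// ...append the policy, credential, and signature fields from your signing step...
form.append('file', { uri: response.uri, type: 'image/jpeg', name: 'photo.jpg' });

fetch('https://Bucket_name.s3.amazonaws.com/', { method: 'POST', body: form })
    .then(function(res) { console.log(res.status); }); // S3 returns 204 on success by default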
You can also leverage the new AWS Amplify library on the official AWS repo here: https://github.com/aws/aws-amplify
This has a storage module for signing requests to S3: https://github.com/aws/aws-amplify/blob/master/media/storage_guide.md
For React Native you'll need to install that:
npm install aws-amplify-react-native
If you're using Cognito User Pool credentials you'll need to link the native bridge as outlined here: https://github.com/aws/aws-amplify/blob/master/media/quick_start.md#react-native-development
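After Amplify.configure(...) has been called with your Cognito and S3 settings, the upload call itself is short; a minimal sketch (the key and content type are placeholder values):
// Hedged sketch: Amplify signs and sends the S3 request for you once configured.
import { Storage } from 'aws-amplify';

async function uploadPhoto(blob) {
  const result = await Storage.put('photos/example.jpg', blob, {
    contentType: 'image/jpeg',
  });
  console.log('stored under key:', result.key); // fetch it later with Storage.get
}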