How to upload files in the background in Django? Or how to increase file upload speed? - django

I need to increase file upload speed in Django. Is there any way to do this? I was thinking about uploading files in the background: when a user sends a POST request, I just redirect them to some page and start uploading the files. Is there any way to do this? Or do you know any other ways to increase upload speed? Thank you in advance.

Low upload speed can be the result of several issues:
It is a normal situation and your client simply can't upload at a higher speed.
Your server instance uses an old HDD and can't write quickly.
Your server is busy with another pool of requests and serves your clients as fast as it can, but it is overloaded.
Your instance has no free space left on the hard drive.
Your server relays the file as a stream somewhere else.
Your upload handler code is not optimized.
You don't use a proxy server that deals well with slow clients and hands the file to Django in a moment once it has fully arrived on the proxy's side.
You are trying to upload a very big file.
etc.
Maybe you can share more details on how you handle the uploads and on your environment.
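One caveat on the "upload in the background" idea: the POST body still has to arrive in full before your view runs, so what you can actually defer is the processing you do after the file arrives. A minimal sketch of that, assuming the slow part is post-upload work; process_upload and the "upload-done" URL name are hypothetical placeholders:

```python
import threading

from django.core.files.storage import default_storage
from django.shortcuts import redirect


def process_upload(path):
    # Hypothetical placeholder for slow work (virus scan, transcoding, ...).
    pass


def upload_view(request):
    if request.method == "POST" and request.FILES.get("file"):
        upload = request.FILES["file"]
        # Saving to default storage is fast once the body has been received.
        path = default_storage.save(upload.name, upload)
        # Fire-and-forget thread; a task queue (Celery, RQ) is more robust.
        threading.Thread(target=process_upload, args=(path,), daemon=True).start()
        return redirect("upload-done")
```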

Related

Uploading large files to server

The project I'm working on logs data on distributed devices that needs to be joined in a single database on a remote server.
The logs cannot be streamed as they are recorded (the network may not be available, etc.), so they must be sent occasionally as bulky 0.5-1 GB text-based CSV files.
As far as I understand, this means having a web service receive the data in the form of POST requests is out of the question because of the file sizes.
So far I've come up with this approach: Use some file transfer protocol (ftp or similar) to upload files from device to server. Devices would have to figure out a unique filename to do this with. Have the server periodically check for new files, process them by committing them to the database and deleting them afterwards.
It seems like a very naive way to go about it, but simple to implement.
However, I want to avoid any pitfalls before I implement any specifics. Is this approach scalable (more devices, larger files)? Implementation will be done either on a private/company-owned server or on a cloud service (Azure, for instance) - will it work on different platforms?
You could actually do this through web/HTTP as well, after setting a higher limit for POST requests in the web server (post_max_size and upload_max_filesize for PHP). This will allow devices to interact regardless of platform. It shouldn't be too hard to make a POST request to a server from any device; a simple cURL request could get this job done.
FTP is also possible. Or SCP, to make it safer.
Either way, I think this does need some application on the server to be able to fetch and manage these files using a database. Perhaps a small web application? ;)
As for the unique name, you could use a combination of the device's unique ID/name along with the current Unix time. You could even hash this (md5/sha1) afterwards if you like.
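As a small illustration of that naming scheme (assuming each device knows its own ID; the function name is just for this sketch):

```python
import hashlib
import time


def unique_log_name(device_id: str) -> str:
    # Device ID plus current Unix time is unique per device per second;
    # hashing it (md5 here) gives a fixed-length, filesystem-safe name.
    raw = f"{device_id}-{int(time.time())}"
    return hashlib.md5(raw.encode()).hexdigest() + ".csv"
```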

Best back end file upload parallelism technique

In my web application (Django), I am sending multiple files to the backend. I need to implement a parallel file upload technique that makes full use of the available CPU cores. I am ready to implement it in any language or with any tool.
Could someone please suggest the best tools or techniques for this, so that saving files on the backend server (to hard disk or a database) completes in much less time than a normal file upload.
I don't think this is about making the upload parallel on the back end; you might instead do something on the front end, where you split your files into chunks and send the chunks to the back end via AJAX. HTML5 allows you to split a file on the front end and send it chunk by chunk to the server, so you can try one of the JS plugins that help achieve this.
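For the receiving side, a rough sketch of a Django view that appends chunks as they arrive; the field names (file, name) and the sequential-chunk assumption are illustrative, not any plugin's standard API:

```python
import os

from django.http import JsonResponse
from django.views.decorators.csrf import csrf_exempt

UPLOAD_DIR = "/tmp/uploads"  # assumption: pick a proper location in production


@csrf_exempt  # the JS uploader would otherwise need to send the CSRF token
def receive_chunk(request):
    chunk = request.FILES["file"]
    # basename() guards against path traversal in the client-supplied name.
    name = os.path.basename(request.POST["name"])
    os.makedirs(UPLOAD_DIR, exist_ok=True)
    # Append mode: this assumes the client sends chunks strictly in order.
    with open(os.path.join(UPLOAD_DIR, name), "ab") as destination:
        for part in chunk.chunks():
            destination.write(part)
    return JsonResponse({"status": "ok"})
```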

How to speed up Django FileField uploads?

I have a FileField that uses django-storages' S3BotoBackend to upload audio files to Amazon S3. The audio files can be up to 10 MB in size, and a user can upload multiple files in the same form. The upload time can be very long, and it blocks. In order to speed up processing, I thought about writing a custom storage backend that inherits from S3BotoBackend and submits jobs to a beanstalk queue before uploading to S3.
Are there any easier alternatives to speed up the user experience?
If you want to speed things up, you'll want your web server to take on more of the upload handling. You can check out the upload module for Nginx, though you can accomplish much the same with any web server.
For this approach, you'll configure a view that's going to receive a request once a file has been successfully uploaded by the user, which would then be the opportune moment to queue the file to be uploaded to S3.
This will allow you to asynchronously receive multiple uploads from a user and asynchronously send the files to S3, which should cover just about everything you could do to improve the file upload experience.
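As a hedged sketch of that hand-off: the web server finishes receiving the file, then calls a Django view with the path on local disk, and the view only enqueues the S3 transfer. The file_path parameter name and the queue_s3_upload helper are assumptions; the actual names depend on how your upload module and queue are configured.

```python
from django.http import HttpResponse


def queue_s3_upload(path):
    # Hypothetical: submit a job to beanstalk/Celery/etc. instead of
    # uploading to S3 synchronously inside the request.
    pass


def upload_complete(request):
    # Called by the web server once the file is fully on local disk.
    local_path = request.POST["file_path"]
    queue_s3_upload(local_path)  # returns immediately; a worker pushes to S3
    return HttpResponse("queued")
```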

File uploading with ColdFusion, too big of a file timing out?

A client has the admin ability to upload a PDF to their respective directory and have it listed on their website. All of this works dandy until a PDF reaches a certain file size that makes the server time out. This causes an error, and the file upload will not succeed.
As mentioned in the title, we are using ColdFusion with a <cffile> upload command. Are there any Java/jQuery/Flash modules or applications that could resolve this issue?
Edit: For clarification, this is the web server timing out and not ColdFusion.
You can change the setting in the ColdFusion Administrator > Settings > Request Size Limits.
On the action page, you can use CFSETTING to extend the timeout, allowing the page to run longer than it otherwise is allowed:
<cfsetting requesttimeout="{seconds}">
Obviously, replace {seconds} with the number of seconds you want to allow.
For clarification, this is only if it is CF timing out, and not the web server or client (browser).
Also, most web servers have a file size limit set for uploads. Make sure this is set to a reasonable size.
You might want to consider using the cffileupload tag. It's a Flash-based uploader that might give you a better result.
Alternatively, you might be able to find some way using a flash upload tool to break up the upload into chunks and put it back together again somehow to meet the hard limits of your web server.
If you figure it out please be sure to share it here, this is an interesting problem!
You will need to pay attention to the IIS configuration > Request Limits > maxAllowedContentLength
setting, as well as the request timeout setting in the ColdFusion Administrator.
If a file is uploaded that exceeds 30 MB, IIS will throw a 404 error by default.
I suggest you increase that setting (I changed mine to 300 MB) to the maximum you might expect, then change the timeout setting in ColdFusion to suit the file size and the bandwidth available on your web hosting site, in concert with the bandwidth available to your clients (worst case).
You ought to test the upload with an appropriately sized file to make sure it all works, but make sure that the site you test from has bandwidth equivalent to your clients'. E.g. test from a site that uses ADSL.
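For reference, the IIS limit lives under requestFiltering in web.config; a sketch raising it to 300 MB (the value is in bytes, and the default of 30000000 is the ~30 MB ceiling mentioned above):

```xml
<configuration>
  <system.webServer>
    <security>
      <requestFiltering>
        <!-- 300 MB in bytes; the default 30000000 (~30 MB) triggers a 404 -->
        <requestLimits maxAllowedContentLength="314572800" />
      </requestFiltering>
    </security>
  </system.webServer>
</configuration>
```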

Handling large uploads on Django, exceeding the max size on nginx

We have a Django app on nginx where users upload media files. The media files are huge, such as 30-minute TV and radio programs resulting in 100-300 MB files, and our shared hosting limits uploads to 30 MB.
How can we embed a smart uploader that will send chunks of 20-30 MB instead of trying to upload the large file in one request? We would like not to destroy our heavily customized forms, so if there is an easy way to insert such a tool as a bulletproof widget, you're awesome.
Links, snippets, examples - highly appreciated, and any ideas are welcome. Thanks in advance.
You should consider alternative hosting (perhaps a virtual private server), as for any serious uploads you will quickly run into the limits of your shared hosting.
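For what it's worth, on a server you control, the nginx side of the limit is the client_max_body_size directive. A sketch, where the location path and backend address are assumptions for illustration:

```nginx
server {
    listen 80;
    server_name example.com;

    location /upload/ {
        # Raise the 1 MB default so 100-300 MB media files are accepted.
        client_max_body_size 300m;
        proxy_pass http://127.0.0.1:8000;
    }
}
```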