File uploading with ColdFusion, too big of a file timing out? - coldfusion

A client has the admin ability to upload a PDF to their respective directory and have it listed on their website. All of this works dandy until a PDF reaches a certain file size that makes the server time out. This causes an error and the upload does not succeed.
As mentioned in the title, we are using ColdFusion with a <cffile> upload tag. Are there any Java/jQuery/Flash modules or applications that could resolve this issue?
Edit: For clarification, this is the web server timing out and not ColdFusion.

You can change this setting in the ColdFusion Administrator > Settings > Request Size Limits.

On the action page, you can use CFSETTING to extend the request timeout, allowing the page to run longer than it normally would be allowed to:
<cfsetting requesttimeout="{seconds}">
Obviously, replace {seconds} with the number of seconds you want to allow.
For clarification, this only helps if it is ColdFusion timing out, and not the web server or the client (browser).
Also, most web servers have a file size limit set for uploads. Make sure it is set to a reasonable size.

You might want to consider using the cffileupload tag. It's a Flash-based uploader that might give you a better result.
Alternatively, you might be able to find a Flash upload tool that breaks the upload into chunks and reassembles them on the server, to stay under your web server's hard limits.
If you figure it out please be sure to share it here, this is an interesting problem!

You will need to pay attention to the IIS configuration > requestLimits > maxAllowedContentLength
setting, as well as the request timeout setting in the ColdFusion Administrator.
If an uploaded file exceeds the default limit of 30,000,000 bytes (roughly 30 MB), IIS will return a 404 error by default.
I suggest you increase the setting (I changed mine to 300 MB) to the maximum you might expect, then change the timeout setting in ColdFusion to suit the file size and the bandwidth available on your web hosting site, in concert with the bandwidth available to your clients (worst case).
You ought to test the upload with an appropriately sized file to make sure it all works, but make sure the site you test from has bandwidth equivalent to your clients'. E.g. test from a site that uses ADSL.
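For reference, on IIS 7 and later this limit normally lives in the site's web.config under requestFiltering; a minimal sketch (the value is in bytes, so 314572800 is 300 MB; adjust to your own maximum):

<system.webServer>
  <security>
    <requestFiltering>
      <!-- maxAllowedContentLength is specified in bytes; the default is 30000000 -->
      <requestLimits maxAllowedContentLength="314572800" />
    </requestFiltering>
  </security>
</system.webServer>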

Related

Django REST framework Query Throughput on AWS

I've been trying to run a simple Django server on Amazon Lightsail.
The server is supposed to be a backend that serves game state to Twitch viewers via an extension. I'm currently trying to have each viewer poll my backend every second.
Trying a very bare-bones read to start, I'm essentially storing the game state in memory using Django's cache (there's only one tracked 'player' for now) and serving it back to users via the Django REST framework.
Given that my requests and responses are quite small, I'd expect to easily serve at least a few hundred users, but I seem to top out on sustainable CPU usage at around 50 users. Is this expected for this setup? Are there optimizations I can make or obvious bottlenecks?
Here's the backend that gets polled every second (after the git pull I update the settings to DEBUG=False and update the database settings):
https://github.com/boardengineer/extension
Normal backend reads come in at about 800 bytes.
I tried returning empty responses to test whether the response size should be optimized, but it only made a small difference.
I thought of removing header content to further reduce response size, but I don't think I found a way to do so correctly.
I also tried removing some of my middleware, hoping to make the server leaner, but I couldn't find any middleware that was safe to remove.
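For context, a minimal sketch of the kind of cache-backed DRF endpoint described above; the view and cache key names are hypothetical and not taken from the linked repository:

# views.py - hypothetical polling endpoint backed by Django's cache
from django.core.cache import cache
from rest_framework.decorators import api_view
from rest_framework.response import Response

GAME_STATE_KEY = "game_state"  # assumed cache key for the single tracked player

@api_view(["GET"])
def game_state(request):
    # Read the in-memory state; fall back to an empty dict if nothing is cached yet
    return Response(cache.get(GAME_STATE_KEY, {}))

@api_view(["POST"])
def update_game_state(request):
    # Overwrite the cached state; a short timeout keeps stale data from lingering
    cache.set(GAME_STATE_KEY, request.data, timeout=60)
    return Response({"ok": True})

With one-request-per-second polling, most of the cost is likely per-request overhead (TLS, WSGI workers, DRF rendering) rather than payload size, which would be consistent with the small difference seen when returning empty responses.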

How to upload Django files in the background? Or how to increase file upload speed?

I need to increase file upload speed in Django. Are there any ways to do this? I'm thinking about uploading files in the background: when the user sends a POST request, I just redirect them to some page and start uploading the files. Is there any way to do this? Or do you know any other ways to increase upload speed? Thank you in advance.
Low upload speed could be the result of several issues:
It is a normal situation and your client simply can't upload any faster.
Your server instance is using an old HDD and can't write quickly.
Your server is busy with another pool of requests and serves your clients as fast as it can, but it is overloaded.
Your instance doesn't have free space on the hard drive.
Your server redirects the file as a stream somewhere else.
You've written an unoptimized upload handler.
You don't use a proxy server that handles slow clients well and hands the file to Django in a moment once it has fully arrived.
You are trying to upload a very big file.
etc.
Maybe you can share more details on how you handle the uploads and on your environment?
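On the "upload in the background" part of the question: the bytes still have to reach the server before anything can happen, but the slow work after the upload (scanning, resizing, copying to remote storage) can be handed to a background worker so the user is redirected immediately. A rough sketch using Celery; the task, view, and URL names are hypothetical:

# tasks.py - hypothetical post-upload processing pushed to a Celery worker
from celery import shared_task

@shared_task
def process_upload(path):
    # placeholder for the slow work: scanning, resizing, pushing to S3, etc.
    ...

# views.py - save the file quickly, then redirect while the worker does the rest
from django.core.files.storage import default_storage
from django.shortcuts import redirect

def upload_view(request):
    if request.method == "POST" and "file" in request.FILES:
        uploaded = request.FILES["file"]
        # saving to local storage is fast; the heavy processing is deferred
        path = default_storage.save("uploads/" + uploaded.name, uploaded)
        process_upload.delay(path)
        return redirect("/thanks/")
    return redirect("/")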

Why does AWS S3 getObject execute slowly even with small files?

I am relatively new to Amazon Web Services. There is a problem that came up while I was coding my new web app. I am currently storing profile pictures in an S3 bucket.
I don’t want these profile pictures to be seen by the public, only authorized members. So I have a php file like this:
This php file executes getObject and sends out a header to show the picture, but only if the user is allowed to see it. I query the database and also check the session to make sure that the currently logged-in user has access to the picture. All is working fine, but it takes around 500 milliseconds for the get request to execute, even on small files (40 KB). On bigger files it takes even longer, and the same happens if I embed the php file in an img tag multiple times with different query string values.
I need to mention that I’m testing this in a localhost environment with apache webserver.
Could it be that the problem is that getObject is optimized to be run from an EC2 instance, and that if I tested this on EC2 the response time would be much better?
My S3 bucket is based in London, and I'm testing from Hungary with a good internet connection, so I'm not sure if this response time is what I should expect here.
I read that other people had similar issues, but from my understanding the time it takes to transfer files from S3 to an EC2 instance should be minimal, as they are both in the cloud and the latency between these and all the other AWS services should be minimal (at least if they are in the same region).
Please don’t tell me in comments that I should just make my bucket public and embed the direct link to the file as it is not a viable option for obvious reasons. I also don’t want to generate pre-signed urls for various reasons.
I also tested this without querying the database and essentially the only logic in my code is to get the object and show it to the user. Even with this I get 400+ milliseconds response time.
I also tried using doesObjectExist() and I still need to wait around 300-400 milliseconds for that to give me a response.
Multiple get requests to the same php file as an image source
UPDATE
I tested it on my EC2 instance and I got a much better response time. I tested it with multiple files and all is fine. It seems that if you use getObject from localhost, the time it takes to connect to S3 and fetch the data multiplies.
Thank you for the answers!
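The question uses the AWS SDK for PHP, but the latency effect is easy to reproduce with any SDK. A rough sketch in Python with boto3 that times a single small GetObject; the bucket and key names are placeholders:

# hypothetical latency check: time one small GetObject call to isolate
# the network round trip from any application logic
import time
import boto3

s3 = boto3.client("s3", region_name="eu-west-2")  # London region, as in the question

start = time.perf_counter()
obj = s3.get_object(Bucket="my-bucket", Key="profile-pics/user-1.jpg")  # placeholder names
body = obj["Body"].read()
print(f"{len(body)} bytes in {(time.perf_counter() - start) * 1000:.0f} ms")

Running the same snippet from the local machine and from an EC2 instance in the same region shows that the difference is essentially the extra round trip from the local network to S3, which matches the update above.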

How do websites set a maximum upload limit? Are there ways to bypass it?

For example: YouTube, before verifying your account, limits the maximum upload length to 15 minutes. How does it measure this? Is there a way to change the maximum upload limit from your browser, or make the site read the file as a 15-minute video even though it's longer, or is it all done on the server side, so there's no way to change it except from the server itself? This question also applies to other types of data, like text and images.
It's enforced server-side, for example in PHP. The server takes the file, inspects it there, and rejects it based on what the file actually is. So no, it can't be bypassed.
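To illustrate why the client can't change the limit, here is a rough sketch of a server-side check, written in Django/Python for consistency with the other examples; the limit value and field name are made up:

# hypothetical server-side enforcement: the limit lives on the server,
# so nothing the browser sends can change it
from django.http import HttpResponse, HttpResponseBadRequest

MAX_UPLOAD_BYTES = 100 * 1024 * 1024  # placeholder limit

def upload(request):
    f = request.FILES.get("video")
    if f is None:
        return HttpResponseBadRequest("no file")
    if f.size > MAX_UPLOAD_BYTES:
        # the size is measured from the bytes the server actually received,
        # not from anything the client claims about the file
        return HttpResponseBadRequest("file too large")
    # a duration limit would be enforced the same way, by probing the stored file
    return HttpResponse("accepted")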

finding out why a webapp is slow when hosted

I have a Django web app that uses a Postgres DB. It allows users to log in and make posts which get saved to the DB, and later the user can list how many posts they made on a particular day, list the posts belonging to a particular category, and so on. While this worked without any delay on my machine, it takes a lot of time to load each page when hosted on a free host.
How do you find out why this is happening? Which part of the app should I look at first? Is there any point in using a profiler, since the app used to run with no delays on my local machine?
I would like to find out how to approach this problem in general. I was able to access other apps hosted on the same free host without much delay, so this may be a problem specific to my app.
I would like some advice regarding this, if anyone can help.
Thank you.
P.S.: I intentionally left out the host's name because, since it is a free service, there is no point in complaining, and other apps on the same host work well.
The key here is the "free host" bit: on a free host you could be sharing a box with hundreds of other sites (which can equate to a very small amount of RAM or CPU). Pay a little money ($30 / £22 a year) and get yourself a better host.
You will find the performance and reliability so much better.
Failing that, I would first find out what the latency between you and the server is; on a local machine there is little or no network latency, so your pages will appear to load a lot faster.
Next I would look at the actual download speeds you are getting. It could be that your site is limited to 20-30 KB/s, which means even a small page will take over a second to load.
Are you hosting many images? If so, are you serving them through Django or is the web server doing this? If it is Django, then make the web server take this load.
Finally, check the processing speed of the pages. Analyse the queries being run and find out what is taking the time. Make sure that Postgres is correctly configured and has enough resources. You can analyse query speed using the Django Debug Toolbar.
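If it helps, a minimal Django Debug Toolbar setup sketch, assuming a standard settings.py and urls.py (adjust to the project's actual layout):

# settings.py - assumed additions for django-debug-toolbar
INSTALLED_APPS += ["debug_toolbar"]
MIDDLEWARE = ["debug_toolbar.middleware.DebugToolbarMiddleware"] + MIDDLEWARE
INTERNAL_IPS = ["127.0.0.1"]  # the toolbar only renders for these client IPs

# urls.py - assumed additions
from django.urls import include, path
urlpatterns += [path("__debug__/", include("debug_toolbar.urls"))]

The SQL panel lists each query with its execution time, which is usually enough to spot a missing index or an N+1 query pattern.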