Setting Request Time Limit in Drupal - web-services

Does anyone know how to increase the request time limit in Drupal? I'm trying to download large files via the web services module, but my token keeps expiring because the request takes so long. I know there is a setting in Drupal for this, but I just can't find it.
UPDATE
So I found out how to increase the request time (/admin/build/services/settings), but that didn't work: I'm still getting "The request timed out" on files about 10 MB in size. Does anyone have any ideas? Also, I'm using ASIHTTPRequest and drupal-ios-sdk to download the files to an iPad.

It turns out the default timeOutSeconds property on the ASIHTTPRequest was too small (10 seconds). When I raised it (e.g. request.timeOutSeconds = 120;), my large files downloaded fine.
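For anyone hitting the same wall: the fix was a client-side read timeout, not a Drupal server setting. The same knob exists in any HTTP client; as a rough illustration in Python's requests library (placeholder URL, arbitrary 120-second value):

    # Client-side timeout illustration in Python's requests library;
    # the analogue of ASIHTTPRequest's timeOutSeconds.
    import requests

    response = requests.get(
        "https://example.com/files/large.bin",  # placeholder URL
        timeout=120,  # seconds the client waits before giving up
        stream=True,  # stream the body instead of buffering it all in memory
    )
    with open("large.bin", "wb") as fh:
        for chunk in response.iter_content(chunk_size=64 * 1024):
            fh.write(chunk)

The server can permit long-running requests, but the download still aborts with a timeout error if the client gives up first.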

Related

s3, need to download from browser, 10GB limit

I'm in a bit of a pinch. I'm stuck using a computer that doesn't have the necessary CLI capabilities to download files from S3 (IT restrictions on what I can put on the computer). Hence, I can only download a file via the browser.
Problem: when I try to download a file that is >10 GB, I get an error. I'm guessing there's a limit on the size of file I can download (I have plenty of drive space for this, so it isn't a space issue).
How can I resolve this? Is there a setting on the browser that I need to change? Or something in S3 that I need to change? Thanks!

Static Files and Upload, process, and download in Django

I've made a desktop app in Python that processes .xls files with openpyxl, tkinter, and other tools. Now I'm looking to run this app on www.pythonanywhere.com. I was expecting to make an app that uploads the file to the server, processes it, and then returns the changed file to the user. But after long months struggling with Django, I've run into the problem of static media. As I understand it, Django doesn't serve static/media files in production mode. Does this mean I can't upload-process-download as I was planning? Or can I run the processing algorithm in the view function on the uploaded file and return it changed, regardless of not serving static files? Am I missing something? Does Flask have the same issue? What's the optimal solution? Apologies for the many questions.
PS: I took the time to read many similar questions, and uploading and downloading seem to be possible, but then I'm missing something about handling static files and what that means.
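The upload-process-download flow doesn't have to touch static or media file serving at all: a view can read the uploaded file from the request, transform it in memory, and stream the result back in the response. A minimal sketch, assuming openpyxl and an .xlsx workbook; the "document" field name and the doubling transform are placeholders:

    # views.py -- minimal upload-process-download view; no static/media serving.
    from io import BytesIO

    from django.http import HttpResponse
    from openpyxl import load_workbook

    def process_workbook(request):
        if request.method == "POST" and request.FILES.get("document"):
            # The upload is read straight from the request; it never
            # touches STATIC_ROOT or MEDIA_ROOT.
            workbook = load_workbook(request.FILES["document"])
            sheet = workbook.active
            for row in sheet.iter_rows():
                for cell in row:
                    if isinstance(cell.value, (int, float)):
                        cell.value = cell.value * 2  # placeholder transform
            # Serialize the changed workbook to memory and stream it back.
            buffer = BytesIO()
            workbook.save(buffer)
            response = HttpResponse(
                buffer.getvalue(),
                content_type="application/vnd.openxmlformats-officedocument.spreadsheetml.sheet",
            )
            response["Content-Disposition"] = 'attachment; filename="processed.xlsx"'
            return response
        return HttpResponse("POST an .xlsx file in the 'document' field.", status=400)

Static file handling (collectstatic plus a web server or WhiteNoise) only concerns assets such as CSS and JavaScript; Flask has no such issue either, since a Flask view can return a generated file the same way with send_file.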

download speed in google cloud storage

I am using MediaIoBaseDownload to implement downloads from GCS.
But I found that there is always a gap of about 5 seconds between each response.
If I download two files at the same time, the gap between responses grows to around 10 seconds.
Upload speed is fine; this only occurs while downloading.
Is there a documented limit on the download API? I could not find one.
After adding some logging I found that most of the time is spent in response.read() in httplib2.
Should I conclude this is a limit imposed by the GCS server, or is there a bucket setting (e.g. DRA) that affects download speed?
I am using Python 2.7.8.
Thanks!
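One knob worth checking here is the chunksize passed to MediaIoBaseDownload: each next_chunk() call is a separate HTTP round trip, so a chunk size that is small relative to the file produces exactly this pattern of fixed gaps between responses. A minimal sketch, assuming google-api-python-client, with placeholder bucket/object names and credentials setup omitted:

    # Sketch: larger chunks mean fewer next_chunk() round trips per file.
    # Assumes google-api-python-client; auth/credentials setup is omitted.
    import io

    from googleapiclient import discovery
    from googleapiclient.http import MediaIoBaseDownload

    service = discovery.build("storage", "v1")
    request = service.objects().get_media(bucket="my-bucket", object="my-object")

    buffer = io.BytesIO()
    downloader = MediaIoBaseDownload(buffer, request, chunksize=8 * 1024 * 1024)

    done = False
    while not done:
        status, done = downloader.next_chunk()
        print("Downloaded %d%%" % int(status.progress() * 100))

Whether this accounts for the full 5-second gap depends on bandwidth and object size, but it is a client-side setting rather than a GCS-side limit, so it is worth ruling out first.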

Django-skel slow due to httplib requests to S3

G'day,
I am playing around with django-skel on a recent project and have used most of its defaults: Heroku for hosting and S3 for file storage. I'm mostly serving a static-y site, except for using sorl for thumbnail generation; however, the response times are pathetic.
You can visit the site: http://bit.ly/XlzkXp
My template looks like: https://gist.github.com/cd15e320be6f4454a7fb
I'm serving the template using a shortcut from the URL conf, no database usage at all: https://gist.github.com/f9d1a9a191959dcff1b5
However, it's consistently taking 15+ seconds for the response. New Relic shows this is because of requests going to S3 while processing the view. This does not make any sense to me.
New Relic data: http://i.imgur.com/vs9ZTLP.png?1
Why is something using httplib to request things from S3? I can see how collectstatic might be doing it, but not the processing of the view itself.
What am I not understanding about Django-skel and this setup?
I have the same issue. My guess is that django-compressor and django-storages are both in use, which results in the former saving the cache it needs to render templates to the S3 bucket, and then reading it back (over the network, hence httplib) while rendering each template.
My second guess was that following the django-compressor instructions for remote storage, implementing an "S3 storage backend which caches files locally, too", would resolve this issue (see the sketch below).
Though it makes sense to me that saving the cache to both locations, local and S3, and reading from the local filesystem first should speed things up, it somehow does not work this way: the response time is still around 8+ seconds.
By disabling django-compressor with COMPRESS_ENABLED = False I managed to get a 1-1.3 second average response time.
Any ideas?
(I will update this answer in case of any progress)
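For reference, the "caches files locally, too" backend mentioned above is roughly the recipe from the django-compressor remote-storage docs; the module path in the settings line is a placeholder, and it assumes django-storages' s3boto backend:

    # storage.py -- S3 storage backend that saves the files locally, too,
    # per the django-compressor remote-storage instructions.
    from django.core.files.storage import get_storage_class
    from storages.backends.s3boto import S3BotoStorage

    class CachedS3BotoStorage(S3BotoStorage):
        """S3 storage backend that saves the files locally, too."""

        def __init__(self, *args, **kwargs):
            super(CachedS3BotoStorage, self).__init__(*args, **kwargs)
            self.local_storage = get_storage_class(
                "compressor.storage.CompressorFileStorage")()

        def save(self, name, content):
            name = super(CachedS3BotoStorage, self).save(name, content)
            self.local_storage._save(name, content)
            return name

    # settings.py -- point compressor (and collectstatic) at the cached backend.
    COMPRESS_STORAGE = "mysite.storage.CachedS3BotoStorage"  # placeholder path
    STATICFILES_STORAGE = COMPRESS_STORAGE

If compressed files still go over the network on every render, it may be worth verifying that COMPRESS_STORAGE actually points at this class and that the local cache directory is writable on the dyno.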

File upload too large

I use cffile action="Upload", and it works just fine for smaller files.
But yesterday I tried uploading a 42 MB PDF and it just sat there all night long.
Q: Is there something I should know about uploading larger files? I define "larger" as: one that sat there all night long, never timed out, and never uploaded. For instance, is there a CFLib or RIAForge routine written to handle larger files?
Right now I'm on ColdFusion 8 at hosting.com.
In addition to the file upload limits in CF, most web servers (Apache, IIS, etc) have an upload limit. Check that as well.
You can change this setting in ColdFusion Administrator > Settings > Request Size Limits.
Check the maximum post size set in ColdFusion Administrator. If it is set lower than 48 MB, you would have issues. I have uploaded much larger files (200+ MB) with a standard cffile with no issues.