File upload too large - ColdFusion

I use cffile action="Upload", and it works just fine for smaller files.
But yesterday I tried uploading a 42 MB PDF, and it just sat there all night long.
Q: Is there something I should know about uploading larger files? I define "larger" as: the one that sat there all night long and never timed out and never uploaded. For instance, is there a CFLib or RIAForge routine written to handle larger files?
Right now I'm on ColdFusion 8 at hosting.com.

In addition to the file upload limits in CF, most web servers (Apache, IIS, etc.) have an upload limit. Check that as well.
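If you can see the web server's configuration, a quick way to check the Apache side is to look for the LimitRequestBody directive, which caps the request body size (0 means unlimited); the paths below are just the usual defaults and may differ on your host:
grep -Ri "LimitRequestBody" /etc/apache2/ /etc/httpd/ 2>/dev/null
If you control the config you can raise it (e.g. LimitRequestBody 104857600 for 100 MB, or 0 for unlimited). On newer IIS versions the equivalent cap is maxAllowedContentLength under request filtering in web.config (about 28.6 MB by default, low enough to block a 42 MB upload). On a shared host like hosting.com you would likely need to ask support to check or raise these.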

You can change the setting in ColdFusion Administrator > Settings > Request Size Limits.

Check the maximum post size set in ColdFusion Administrator. If it is set lower than 48 MB, you will have issues. I have uploaded much larger files (200+ MB) with standard cffile with no issues.

Related

Why does Google Cloud Storage freeze when I try to upload a large folder (2.5GB of images)?

After getting frustrated with Azure, I decided to try GCP. I wanted to try training a deep learning image classification model using GCP. To start off, I went to Cloud Storage through the Google Cloud Console UI and made a bucket to upload my images and... it freezes when I try to upload my image folder. Not a good first experience.
I've tried half a dozen times. I always get the little "upload started" popup, but then it freezes. If I close it and reopen, the bucket will have created the folder and uploaded 3 or so images, but even if I let it run all night, it never makes progress beyond that. It just stops uploading and becomes nonresponsive. I have about 38,000 images; in total, the folder I'm trying to upload is 2.5 GB.
There is no error message, and no sign that anything went wrong when I reload, aside from the fact that only 3 or so images were uploaded. Nothing else on my computer freezes. If I have other tabs open in the browser, they do not freeze. The tab with the Google Console in it just silently freezes. I tried Data Transfer to move the files to Google directly from Azure Blob Storage, but Azure wouldn't give Google permission to access the storage.
Can someone recommend a possible work-around to this? Should I be uploading my files some other way?
Whether a large number of files can be uploaded via the Console in-browser appears to be a bit sporadic. In my particular case, attempting to upload a large folder (not large files in a folder, but a large NUMBER of files in a folder) at once makes the browser freeze, apparently from memory problems, in both GCP's Cloud Storage and AWS's S3 browser portals. I have tried this on Windows 10 and on Linux Mint, and in both Google Chrome and Firefox, to no avail.
Some GCP and AWS users report freezing or crashes when the number of files in the folder is large, but not necessarily when the size of the folder itself is large. This is exactly my experience: I can upload a large file, or a folder with a few hundred files, through Cloud Console, but not a folder with many thousands of files.
The only stable solution I've found is to give up on uploading a massive number of files via the browser portals and upload via the CLI instead (see the sketch after the edit below). It isn't the solution I'd hoped for, but it is the only one I've found.
EDIT:
Since this correct (and, so far, only working) solution was downvoted for some reason:
See the comments on Kainax's answer to this question for multiple AWS users experiencing this problem.
See the question and answers to this question for the same issue I had in GCP.
Since this is an issue and not a feature, there is no official documentation on it.
Neither I nor anyone else has yet been able to solve this through the Cloud Console, and it has been a recurring issue for some users for years. The only clear workaround is to avoid using the Console UI altogether when uploading a folder with a large number of files.
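To make the CLI workaround concrete, here is a minimal sketch using gsutil, with a hypothetical bucket name and local folder; the -m flag runs the copy with parallel threads/processes, which is what makes tens of thousands of small files tolerable:
gsutil -m cp -r ./images gs://my-bucket/images        # parallel, recursive upload
gsutil ls gs://my-bucket/images/** | wc -l            # verify the object count afterwards
The same split applies on the AWS side: aws s3 cp --recursive (or aws s3 sync) handles file counts that the S3 web console struggles with.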
I tried to recreate an image folder of 2 GB+ and upload it to a bucket using the Console. The steps below worked on my end.
Create image files totaling 2 GB+
Cloud Storage Browser > select the bucket
Upload Folder > select the folder > Upload
Toggle "Show deleted data" on/off on the Bucket Details page
Validate in Cloud Shell via gsutil commands:
gsutil ls
gsutil du -h -a gs://$BUCKETNAME
gsutil ls gs://$BUCKETNAME/* | wc -l
At first, after starting the upload, a "not responding" popup appeared. I opened a new tab and checked my bucket in the Console; I could see only 3 files in the folder. I gave it about 15 minutes and refreshed: still 3 files in the folder. I toggled "Show deleted data" (middle right of the Bucket Details page) on and off, and some of the files started showing. I gave it a little more time, about 30-45 minutes, and the upload completed.
==> standard input <==
3.02 KiB gs://testupload/testimage/download (992).jpeg#1651637939860508
3.02 KiB gs://testupload/testimage/download (993).jpeg#1651637579558398
3.02 KiB gs://testupload/testimage/download (994).jpeg#1651637761564156
3.02 KiB gs://testupload/testimage/download (995).jpeg#1651637080592200
3.02 KiB gs://testupload/testimage/download (996).jpeg#1651637475578587
3.02 KiB gs://testupload/testimage/download (997).jpeg#1651637634800927
3.02 KiB gs://testupload/testimage/download (998).jpeg#1651637897219133
3.02 KiB gs://testupload/testimage/download (999).jpeg#1651638427607588
3.02 KiB gs://testupload/testimage/download.jpeg#1651636931991159
2.26 GiB gs://testupload/testimage/
XXXXXXXXX#cloudshell:~ (XXXXXXX)$ gsutil ls gs://testupload/* | wc -l
4564
[Screenshots: the source image folder and the uploaded files in the Console]
For some reason, uploading the folder together with its files was freezing. So I created the folder first and then uploaded all the files into it. That worked!

S3: need to download from the browser, 10 GB limit

I'm in a bit of a pinch. I'm stuck using a computer that doesn't have the necessary CLI capabilities to download files from S3 (IT restrictions on what I can put on the computer). Hence, I can only download a file via the browser.
Problem: when I try to download a file that is >10 GB, I get an error. I'm guessing there's a limit on the size of the file I can download (I have plenty of drive space, so it isn't a space issue).
How can I resolve this? Is there a setting on the browser that I need to change? Or something in S3 that I need to change? Thanks!

Speeding up a transfer of a file to Google Cloud VM

I am uploading a file to my Google Cloud Platform VM using scp on Linux. However, after initially uploading at a speed of 900 kb/s, it quickly falls to 20 kb/s. My internet upload speed should be around 20 Mbps. I wanted to upload an SQLite database clocking in at 20 GB, but this is unfeasible at this point.
It just took me 54 minutes to upload a 94 MB file. It can't be that slow, can it?
I had the same issue multiple times with GCP. The solution I use is to compress all my files, upload the archive to Dropbox, and then wget the file from there onto the VM. The speeds should go back to normal.
This answer should help you as well, though I don't know whether your particular issue is related to GCP, scp, or both:
https://askubuntu.com/questions/760509/how-to-make-scp-go-faster
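A minimal sketch of the compress-then-transfer idea, with placeholder file and host names (VM_EXTERNAL_IP, username, and the archive name are assumptions):
tar czf db.tar.gz mydb.sqlite                      # compress the SQLite file first
scp db.tar.gz username@VM_EXTERNAL_IP:~/           # copy the smaller archive
rsync -avP db.tar.gz username@VM_EXTERNAL_IP:~/    # alternative: resumes if the link stalls (rsync needed on both ends)
If you have the Cloud SDK installed locally, gcloud compute scp does the same transfer and handles SSH keys for you; a lighter SSH cipher (for example scp -c aes128-ctr) is the usual suggestion when the link is CPU-bound rather than bandwidth-bound.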

Sitecore Database and App_Data Size

We have 5 relatively small sites running on top of Sitecore. The oldest has been hosted in the environment for 3 years. Currently both the master and web databases are roughly 8 GB apiece, surprising both in size and in being nearly identical (I would expect the web database to be much smaller). Also surprising is that App_Data is over 50 GB in size (MediaCache is 15 GB and MediaFiles is 37 GB). Any ideas or suggestions on ways to reduce files on disk, even temporarily?
Media Files - media items stored on disk (keep this folder)
Media Cache - where Sitecore caches image versions (e.g. resized images)
You can delete all the contents of the Media Cache folder. Sitecore will gradually recreate the image cache for the images that are being used on the sites.
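As a rough sketch only (the site path is hypothetical, and on a Windows/IIS host you would do the equivalent delete in Explorer or PowerShell):
rm -rf /path/to/site/Website/App_Data/MediaCache/*    # safe to clear; Sitecore rebuilds this cache on demand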
If you use item versioning, you can use the Version Manager and archive old versions. However, as your master and web databases are almost the same size, I don't think that will help you. The web database only holds 1 version of each item.
The last thing would be to crawl through the media library and find items that don't have any referrers in the LinkDatabase and delete them. Make sure you back everything up first.
http://trac.sitecore.net/VersionManager
If you are storing media assets in the database (I assume you are, based on the sizes), I believe you can delete the App_Data folder and Sitecore will re-add files there as needed. If you're storing media assets on the file system, they're stored in App_Data, which would explain the large size. Is it possible you're storing some assets in the DB and others on the file system? In any case, you should analyze what types of files are in App_Data and compare it to an out-of-the-box Sitecore instance to see what is site content vs. generated cache files.
You can try to shrink the databases as well:
DBCC UPDATEUSAGE (web);        -- correct page and row count inaccuracies first
DBCC SHRINKDATABASE (web, 0);  -- then shrink the web database, leaving 0% free space

Setting Request Time Limit in Drupal

Does anyone know how to increase the request time limit in Drupal? I'm trying to download large files via the web services module, but my token keeps expiring because the request takes so long. I know there is a setting in Drupal to do this, but I just can't find it.
UPDATE
So I found out how to increase the request time (/admin/build/services/settings), but that didn't work. I'm still getting "The request timed out" on files about 10 MB large. Does anyone have any ideas? Also, I'm using ASIHTTPRequest and drupal-ios-sdk, and downloading the files to an iPad.
It turns out the default timeOutSeconds property on ASIHTTPRequest was too small (10 seconds). When I upped it, my large files downloaded OK.