Uploading files with classic ASP - Can't upload larger than 120 KB? - amazon-web-services

This is driving me crazy. We recently migrated over to Amazon's EC2 service, and now file uploads (using Motobit Huge ASP Upload) in our application fail for any file larger than 120 KB. I know the default limit is 200 KB, but it's failing for anything larger than 120 KB with error number -2147024893 (0x80070003) and no description or details at all. I cannot find anything in the Event Viewer or IIS logs to point me in any direction.
I did go into IIS, under the ASP properties for the site in question, and changed the Maximum Requesting Entity Body Limit from 200000 to 400000, and it made no difference at all. Out of hopeless desperation I even tried an iisreset, and then rebooted the server after making the change, just to ensure it applied.
I have tried a multitude of different files and file sizes. The largest I have been able to upload is a 120 KB file. It's not a code issue: this same code works on my local box (IIS 7) without issue, and was simply copied over from our old server as-is.
EDIT: I have also tried the following with no results:
Set the Request Filtering setting "Maximum allowed content length" to 30000000 (30 MB).
Manually edited web.config and added requestLimits maxAllowedContentLength="15728640" and httpRuntime maxRequestLength="2147483647"

I hope this can help someone in the future, because I was banging my head against my desk trying to resolve this.
This component has a maximum memory setting of 128 KB. Any file larger than this is not stored in memory but written to the temp folder. The default temp folder is C:\Windows\Temp.
The culprit was an application pool setting, under Advanced Settings: 'Load User Profile' was set to True. This made the component try to use a temp path under C:\Users, which was failing. There are three solutions:
Set Load User Profile = False
Assign a temp folder in code (Form.TempPath = "")
Create the folder in C:\Users (use response.write form.CheckRequirements)
For our case, #1 was the perfect solution. #2 would have required editing over 100 files, and #3 would only be needed if you are required to have 'Load User Profile' set to True.
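For anyone who does need solution #2, it can be sketched roughly like this. The ProgID and include pattern vary by Motobit component version, so "ScriptUtils.ASPForm" and the temp path are assumptions; check the component's documentation for your install:

```asp
<%
' Hedged sketch of solution #2: point the upload component at an
' explicit temp folder before processing the request.
' "ScriptUtils.ASPForm" is an assumed ProgID; the path is illustrative.
Dim Form
Set Form = Server.CreateObject("ScriptUtils.ASPForm")
' Use a folder the app-pool identity can write to, so the component
' never depends on a per-user profile under C:\Users:
Form.TempPath = "D:\Uploads\Temp"
%>
```

The key point is that the folder must exist and be writable by the application pool identity; otherwise you are back to the same 0x80070003 (path not found) failure.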

Related

opencart: I can edit order but cannot delete it. (with Error log)

I use opencart version 2.1.0.1
Every time I click admin > sales > order, a popup appears saying "error undefined". After closing that popup, I can still edit orders, but I cannot delete them (no response).
In my log, there is:
PHP Notice: Undefined variable: order_id in
/var/www/html/opencart2101/system/storage/modification/admin/view/template/sale/order_list.tpl on line 821
The line 821 is:
url: 'index.php?route=extension/openbay/addorderinfo&token=<?php echo $token; ?>&order_id=<?php echo $order_id; ?>&status_id=' + status_id,
However, I haven't installed any openbay-related module. Also, line 821 is inside an HTML comment (<!-- -->), so it should have no effect.
Help!
Although this is now an older version of opencart, I still see this being reported a lot.
The problem occurs because the store front adds the http URL, rather than the https URL, to the order. So first you need to fix that. If you don't want to read all of my explanation, you can just follow the bold points :)
Either way, BACK UP EVERYTHING. Well, not quite everything: back up the file you are going to edit, and back up your whole database.
open:
catalog/controller/checkout/confirm.php at around line 100
Find:
$order_data['store_url'] = HTTP_SERVER;
Change to:
$order_data['store_url'] = HTTPS_SERVER;
Now you will want to fix your database, because, for reasons I cannot fathom, the domain name is stored on each order along with the store's ID. When editing orders, it is the admin order page's direct use of that stored URL that throws up the undefined notice: basically, the browser blocks the request because it's trying to make an insecure request from a secure page.
Crack open phpmyadmin or whatever database tool you have on hand.
Locate the table; the default is oc_orders.
Browse the table and look for the column that contains your store URL (I can't remember the name offhand; I think it's just store_url, but it will be obvious). If you run multiple stores, you will need to run the query for each one.
I am sure somebody can come up with a clever way to convert just the http into https with a single SQL query on that one column, but this works for me.
Run this SQL (adjust as appropriate):
UPDATE `oc_orders` SET `store_url` = 'https://example.com' WHERE store_id = 0;
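For the "clever single query" mentioned above, a sketch that rewrites only the scheme and leaves the rest of each URL intact might look like this (table and column names are the OpenCart defaults from this answer; verify against your schema first):

```sql
-- Hedged sketch: flip http:// to https:// on every affected order,
-- without hard-coding the domain, so it also works for multi-store.
UPDATE `oc_orders`
SET `store_url` = REPLACE(`store_url`, 'http://', 'https://')
WHERE `store_url` LIKE 'http://%';
```

As always, back up the table before running a bulk UPDATE.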

How to undo move files from filezilla

I have a problem with FileZilla. I accidentally moved a folder to the wrong directory, and now my website shows errors.
How can I fix it? Please help me.
Many thanks in advance for your answer.
You cannot undo the move, but you should understand why it happened and how to prevent it from happening again in the future.
It happened because FileZilla allows drag-and-drop moves of both folders (directories) and files, which is very dangerous on production servers.
There has actually been a feature request open for 11 years now; please add your vote to the list to get this done: https://trac.filezilla-project.org/ticket/2191
In the meantime, please consider using other software that lets the user set this behavior as an option:
WinSCP: http://winscp.net/eng/docs/screenshots
WS-FTP Pro: https://trac.filezilla-project.org/attachment/ticket/2191/ws_ftp-professional-options.gif
EDIT: The FileZilla team responded (sort of) to the feature request, and you can now block drag and drop in the XML config file. It's better than nothing.
You cannot undo FTP moves. The only way to rectify the problem is to manually move the folder back to its original location.
I suggest you be more careful next time.
If you don't know where the folder belongs, download the X-Cart script package and check where the directory belongs.
Sorry for the late answer, but this approach is still current. FileZilla logs the moved files:
Status: Renaming '/var/www/html/brb/abc.js' to '/var/www/html/brb/node_modules/abc.js'
Status: /var/www/html/brb/abc.js -> /var/www/html/brb/node_modules/abc.js
Status: Renaming '/var/www/html/brb/xyz.html' to '/var/www/html/brb/node_modules/xyz.html'
Status: /var/www/html/brb/xyz.html -> /var/www/html/brb/node_modules/xyz.html
I wrote a script in JS to build the reverse commands:
let x = [
  '/var/www/html/brb/abc.js -> /var/www/html/brb/node_modules/abc.js',
  '/var/www/html/brb/xyz.html -> /var/www/html/brb/node_modules/xyz.html'
];
let cmd = [];
x.forEach(p => {
  // Split "old -> new" and trim the stray spaces around the arrow
  let path = p.split('->').map(s => s.trim());
  cmd.push(`mv ${path[1]} ${path[0]}`);
});
console.log(cmd);
Output:
['mv /var/www/html/brb/node_modules/abc.js /var/www/html/brb/abc.js',
 'mv /var/www/html/brb/node_modules/xyz.html /var/www/html/brb/xyz.html']
Use any editor (VS Code etc.) to remove the string quotes, then execute the commands in the server terminal:
mv /var/www/html/brb/node_modules/abc.js /var/www/html/brb/abc.js
mv /var/www/html/brb/node_modules/xyz.html /var/www/html/brb/xyz.html
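If there are many log lines, a small shell pipeline can generate the reverse mv commands straight from a saved log file; log.txt is a sample file created here for illustration:

```shell
# Hedged sketch: reconstruct reverse "mv" commands from saved
# FileZilla log lines of the form "Status: OLD -> NEW".
cat > log.txt <<'EOF'
Status: /var/www/html/brb/abc.js -> /var/www/html/brb/node_modules/abc.js
Status: /var/www/html/brb/xyz.html -> /var/www/html/brb/node_modules/xyz.html
EOF
# Split each line on " -> ", strip the "Status: " prefix, and emit
# a command that moves NEW back to OLD:
awk -F' -> ' '/ -> /{sub(/^Status: /, "", $1); print "mv", $2, $1}' log.txt
```

Note this simple split assumes no path contains spaces or a literal " -> "; quote the paths if yours do.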
A simple answer:
Copy the folder to the desired location, then delete it from the location you moved it to by mistake.
Now, what if you overwrote a file?
I edited a file locally, then downloaded the server copy over it, and all my local changes were gone. There seems to be no general solution, but there is one possibility; I may simply have been lucky that it applied in my case.
If you were working on that local file, it is most probably still open in your browser. Do not refresh it. Copy the content back piece by piece and rebuild the file. You can also open the developer tools of both the old and new pages and compare them line by line.
I had the same issue and resolved it manually.
The log panel was helpful here; it is the large panel below the connection form at the top.
From that panel I was able to figure out all the files that were moved, with their current and previous locations.
I copied all those log lines, pasted them into Notepad, then manually selected all the files and moved them back to their original directory in one go.
Screenshot: The log panel showing last actions

Django FileField, how to avoid long file copy delays?

I have the following class:
class VideoFile(models.Model):
    media_file = models.FileField(upload_to=update_filename, null=True)
And when I try to upload large files to it (from 100 MB up to 2 GB) using the following request, it can take quite a long time after the upload finishes, during the VideoFile.save() call.
def upload(request):
    video_file = VideoFile.objects.create(uploader=request.user.profile)
    video_file.media_file = uploaded_file
    video_file.save()
On my MacBook Pro (Core i7, 8 GB RAM), a 300 MB uploaded file can take around 20 seconds to run video_file.save().
I suspect this delay relates to a disk copy operation from /tmp to the file's permanent location. I've confirmed this by running watch ls -l on the target directory: as soon as video_file.save() runs, I can see the file appear and grow throughout the delay.
Is there a way to eliminate this file transfer delay? Either by uploading the file directly to the target filename or just by moving the original file instead of copying? This is not the only upload operation across the site however so any solution needs to be localized to this model.
Thanks for any advice!
UPDATE:
Just further evidence supporting a copy instead of a move: I can watch lsof during the upload and see a file within /private/var/folders/... being written by Python, which maps exactly to upload progress. After the upload is complete, another lsof entry appears for the ultimate file location, which grows over time. Once that's complete, both entries disappear.
Ok, after a bit of digging I've come up with a solution. It turns out Django's default storage already attempts to move the file instead of copying it, which it first tests with:
hasattr(content, 'temporary_file_path')
This attribute exists on the class TemporaryUploadedFile, which is the object handed to the upload view; however, the field itself is created as the class specified by FileField.attr_class.
So instead I decided to subclass FieldFile and FileField and slot in the temporary_file_path attribute:
class VideoFieldFile(FieldFile):
    _temporary_file_path = None

    def temporary_file_path(self):
        return self._temporary_file_path

class VideoFileField(FileField):
    attr_class = VideoFieldFile
Finally in the view, before saving the model I manually assigned the temp path:
video_file.media_file._temporary_file_path = uploaded_file.temporary_file_path()
This now means my 1.1 GB test file becomes available in about 2-3 seconds, rather than the 50 seconds or so I was seeing before. It also comes with the added benefit that if the files are on different filesystems, it appears to fall back to the copy operation.
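The move-vs-copy distinction behind this speedup can be sketched outside Django entirely (this is a generic illustration, not Django code): a rename on the same filesystem is near-instant regardless of file size, while a copy is proportional to the size. shutil.move() renames when it can and silently falls back to copy-and-delete across filesystems, which matches the fallback behavior described above.

```python
# Hedged sketch: moving an already-written temp file into place,
# the way Django's storage does when temporary_file_path is available.
import os
import shutil
import tempfile

def place_upload(tmp_path, final_path):
    """Move a finished temp file to its permanent location.

    shutil.move() uses os.rename() when source and destination are on
    the same filesystem (O(1)), and falls back to copy+delete otherwise.
    """
    os.makedirs(os.path.dirname(final_path), exist_ok=True)
    shutil.move(tmp_path, final_path)
    return final_path

# Demo with a small stand-in for an uploaded file:
fd, tmp = tempfile.mkstemp()
os.write(fd, b"video bytes")
os.close(fd)
dest = os.path.join(tempfile.gettempdir(), "media", "video.mp4")
place_upload(tmp, dest)
print(os.path.exists(dest))  # True
```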
As a side note however, my site is not utilizing MemoryFileUploadHandler which some sites may use to handle smaller file uploads, so I'm not sure how nice my solution might work with that, but I'm sure it would be simple enough to detect the uploaded file's class and act accordingly.
I would caution that there are quite a few reasons why uploading to /tmp and then copying over is best practice, and that uploading large files directly to their target is a potentially dangerous operation.
But, what you're asking is absolutely possible. Django defines upload handlers:
You can write custom handlers that customize how Django handles files. You could, for example, use custom handlers to enforce user-level quotas, compress data on the fly, render progress bars, and even send data to another storage location directly without storing it locally.
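Short of a custom upload handler, one related lever worth knowing about is where Django writes its temp files in the first place. A settings.py fragment like the following (paths are assumptions, adjust to your deployment) keeps the upload temp directory on the same filesystem as the media root, so a temporary_file_path-aware save can rename the file into place instead of copying it:

```python
# Hedged sketch of a settings.py fragment; paths are illustrative.
# FILE_UPLOAD_TEMP_DIR and MEDIA_ROOT are standard Django settings.
MEDIA_ROOT = "/srv/media"
FILE_UPLOAD_TEMP_DIR = "/srv/media/tmp"  # same filesystem as MEDIA_ROOT
```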

How do I CFINCLUDE without creating a cached template in /WEB-INF/cfclasses/?

Is there any solution or alternative ColdFusion tag to include a static text file without creating a template cache under /WEB-INF/cfclasses?
The problem is that the number of dynamic pages grows over time, and each page needs to include a single static file, e.g.:
<cfinclude template="mapping/static_1.txt"> for page 1
<cfinclude template="mapping/static_2.txt"> for page 2
<cfinclude template="mapping/static_3.txt"> for page 3
...etc.
Since the number of pages has grown to 2000, the server goes down because the system generates 2000 cached templates, which exceeds the server limit.
I can ask hosting support to raise the limit, but that is not a long-term solution for dynamic pages that keep growing.
Obviously, no processing is required, as the file to include is static text (.txt) containing only static HTML tags (no script involved).
Is there any alternative to <cfinclude> that will just output the file content without compilation and cache creation? Or is there any way to prevent the server from caching .txt files?
Sorry if this question is simple; I'm new to CF.
Your pointers would be really appreciated.
Cheers
Chanon
My hosting support does not recommend disabling caching altogether.
Anyway, I came up with a simple solution using <cffile> instead of <cfinclude>.
With <cffile>, the server does not compile the file and create a cached class; it just grabs the whole chunk of the file and puts it in a variable.
Why use CFINCLUDE if those are static HTML files? Use FileRead() for example (or the longer version with FileOpen/FileReadLine/FileIsEOF), or even CFFILE with action="read":
<cfset variables.content = FileRead("mapping/static_1.txt")>
<cfoutput>
#variables.content#
</cfoutput>
There's no point using CFINCLUDE if there's no CFML/CFScript to process.
You don't need to cache any compiled class files at all: there's a setting in CFAdmin to switch this caching off (on the Caching page: "Save Class Files"). Those cached files are only really a benefit at server start-up time: it saves files being re-compiled when they're first accessed. This overhead is negligible, really. It used to be considerable back in the days of CFMX 6 & 7, but not so much since then.
There is - as far as I know - no way to pick & choose which files have their compiled classes saved. It's all or nothing.
What one could do, I suppose, is switch the setting on, compile all the app's "main" files so their classes are saved, then switch the setting off. One would need to repeat this process whenever new files are added to the application, though. Still, that's not such a hardship.
But I see no benefit in having these files saved at all, these days.

Coldfusion CFC Mapping to external directories with CFCs that reference other folders

I've done some poking around and some trial and error, but I'm not coming up with a solution to this problem.
I have a folder structure like this (example):
Application.cfc
Objects\
    Object.cfc
Utilities\
    Util.cfc
API\
    Resources\
        index.cfm
    Application.cfc
I have one site that points to the API folder (http://api.site.com) and another that points to the overall root (http://site.com)
From API\Resources\index.cfm, I'm trying to createObject() on Objects\Object.cfc. I set up a mapping, either in CF Admin or in API\Application.cfc, with this.mappings["/SiteRoot"] = "C:\wwwroot". Inside index.cfm I do createObject("component","SiteRoot.Objects.Object"). This correctly accesses Object.cfc.
The issue I'm having is that it fails because Object.cfc instantiates Utilities\Util.cfc with just createObject("component","Utilities.Util"). The error is that Utilities.Util cannot be found.
There are other files in the very bottom root that can obviously call Object.cfc with no problems since it just goes into the Utilities folder naturally.
Any suggestions? Or do I really need to break the API folder out of this root entirely?
Thanks!
UPDATE
It's not letting me answer my own question just yet, but I wanted to post here before others chimed in.
Despite reiniting the application and restarting the application server once or twice, it wasn't working. Then suddenly it just went and worked as I would have expected: Object.cfc could find Util.cfc correctly based on its relative path.
I gave upvotes to those who responded, as they were perfectly viable alternatives and solutions, and I would have gone with one of them had this not just started working. Demons, I tell you. Demons.
Thanks!
I think I would change your second createObject() call (the utilities one) to createObject("component", "SiteRoot.Utilities.Util"), making sure that one mapping "governs" the starting point for all the objects, no matter where they are instantiated.
If you really cannot change your code, then just create a ColdFusion mapping called Utilities pointed at the Utilities folder.
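The per-application version of that mapping might look like this in API\Application.cfc. This is a sketch: the physical path is taken from the question's /SiteRoot example, and the application name is made up; adjust both to your setup:

```cfml
// Hedged sketch for API\Application.cfc. Alongside the /SiteRoot
// mapping from the question, a /Utilities mapping lets Object.cfc's
// createObject("component", "Utilities.Util") resolve from either site.
component {
    this.name = "api";  // assumed application name
    this.mappings["/SiteRoot"]  = "C:\wwwroot";
    this.mappings["/Utilities"] = "C:\wwwroot\Utilities";
}
```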