In CF7, can anyone tell me if there's a way around the file being automatically uploaded to the /tmp/ folder before being moved to a destination I provide? I'm trying to use cffile on a shared server. I don't have access to the tmp directory, and am hoping to find a workaround.
<cffile action="upload" destination="#expandpath('./')#/myFiles" fileField="myFile">
I'm working in a Linux environment and, like I said, have no access outside my webroot. The oddest part is that I never seem to make it this far: from my form post I get a 500 page or a directory-not-found error, even if post.cfm does NOT have any code, just text. So it seems the problem is that the form's enctype is making the server try to "place" the file before I ever get to the cffile tag.
I'm at a total loss and hopeful someone can help.
My understanding is that where the file is first uploaded is actually a function of the web server, not CF itself. CF copies/moves/etc. the file from the temp directory once it is on the server. You will have to get the server admin to allow your process access to that directory.
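If you just want to confirm where ColdFusion is staging uploads, here is a minimal sketch. GetTempDirectory() is a standard CFML function; the path it prints on your host is whatever your install uses, not necessarily /tmp/:

<cfoutput>
    <!--- GetTempDirectory() returns the directory CF uses for temporary files --->
    Temp directory in use: #GetTempDirectory()#
</cfoutput>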
I'm in a bit of a pinch. I'm stuck using a computer that doesn't have the necessary CLI tools to download files from S3 (IT restricts what I can put on the machine). Hence, I can only download a file via the browser.
Problem: when I try to download a file that is >10 GB, I get an error. I'm guessing there's a limit on the size of file I can download (I have plenty of drive space, so it isn't a space issue).
How can I resolve this? Is there a setting on the browser that I need to change? Or something in S3 that I need to change? Thanks!
I have over 30 Leaflet maps hosted on my Google Cloud Platform bucket (for example) and it has always been an easy process to upload my folder (which includes an html file with sub-folders including .js and .css files) and share the map publicly.
I tried uploading another map today, but within the folder there are no files showing and I get the following message "There are no live objects in this folder. If you have object versioning enabled, this folder may contain archived versions of objects, which aren't visible in the console. You can list archived object versions using gsutil or the APIs."
Does anyone know what is going on here?
We have also seen this problem, and it seems that the issue is limited to buckets that have spaces in the name.
It's also not reproducible through the gcloud web console, but if you use gsutil to upload a file to a bucket with a space in the name then it won't be visible on the web UI.
I can see from your screenshot that your bucket also has spaces (%20 in the url).
If you need a workaround asap, you could rename your bucket...
But google should fix this soon, I hope.
There is currently an open issue on GCS/Console integration.
If file names contain any symbols that need URL encoding, they are not visible in the console, but they are still accessible via gsutil or the API (which is the currently recommended workaround).
The issue has been resolved as of 8 May 2018, 10:00 UTC.
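As a sketch of that gsutil workaround (the bucket and object names here are invented):

# list the objects the console hides, then pull one down locally
gsutil ls "gs://my-bucket/my maps/"
gsutil cp "gs://my-bucket/my maps/index.html" .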
This can happen if the file doesn't have an extension: the UI treats it as a folder and lets you navigate into it, showing a blank folder instead of the file contents.
We had the same symptom (files show up via the API but are invisible on the web and via the CLI).
The issue turned out to be that we were saving files to "./uploads", which Google interprets as "create a directory literally called '.' and then a subdirectory called uploads."
The fix was to upload to "uploads/" instead of "./uploads". We also just ran a mass copy operation via the API for everything under "./uploads". All visible now!
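For reference, a hypothetical gsutil equivalent of that mass copy (we actually did it via the API, and the bucket name my-bucket is invented, so treat this as a sketch):

# copy everything under the literal "./uploads" prefix to a clean "uploads/" prefix
gsutil -m cp -r "gs://my-bucket/./uploads/*" "gs://my-bucket/uploads/"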
I also had spaces in my url and it was not working properly yesterday. Checked this morning and everything is working as expected. I still have the spaces in my URL btw.
I am getting the following error:
The cause of this exception was:
java.io.FileNotFoundException:
//server/c$/folder1/folder2/folder3/folder4/folder5/login.cfm
(Access is denied).
When doing this:
<cffile action="copy"
destination="#copyto#\#apfold#\#applic#\#files#"
source="#path#\#apfold#\#applic#\#files#">
If I try to write to C:\folder1\folder2\folder3\folder4\folder5\login.cfm, it works fine. The problem with doing it this way is that this script is for developers to manually sync files to their application folder. We have multiple servers for each instance, picked at random by BigIP, so writing to the C:\ drive would only copy the file to the server the developer happens to be accessing. If the developer were to close the browser and go right back in to make sure their changes worked, and got sent to a different server, they wouldn't see their change.
Since it works with writing to C:\, I know the permissions are correct. I've also copied the path out of the error message and put it in the address bar on the server and it got to the folder/file fine. What else could be stopping it from being able to access that server?
It seems that you want to access a file via UNC notation on a network share (even if it incidentally refers to a directory on the local C:\ drive). To be able to do this, you have to change the user the ColdFusion 9 Application Server service runs as. By default, this service runs as the Local System account, which you need to change to an actual user. Have a look at the following link to find out how to do this: http://mlowell.hubpages.com/hub/Coldfusion-Programming-Accessing-a-shared-network-drive
Note that you might have to add a user with the same name as the one used for the CF 9 service to all of the file servers.
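For reference, one way to change the service account is from an elevated command prompt (the service name and credentials below are assumptions; check your install, and note that sc requires the space after each =):

rem service name may differ on your machine - check services.msc
sc config "ColdFusion 9 Application Server" obj= "DOMAIN\cfservice" password= "myPassword"

You can do the same thing through services.msc on the service's Log On tab.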
If you don't want to enable FTP on your servers, another option would be to use RoboCopy to keep the servers in sync. I have had very good luck with this tool. You will need access to the cfexecute ColdFusion tag, and you will need to create share(s) on your servers.
RoboCopy is an executable that comes with Windows. You can read some documentation here and here. It has some very powerful features and can be set to "mirror" the contents of directories from one server to another. In that mode it keeps the folders identical (new files added, removed files deleted, updated files copied, etc.). This is how I have used it.
Basically, you will create a share on your destination servers and give access to a specific user (can be local or domain). On your source server you will run some ColdFusion code that:
Logically maps a drive to the destination server
Runs the RoboCopy utility to copy files to the destination server
Then disconnects the mapped drive
The ColdFusion service on your source server will need access to C:\WINDOWS\system32\net.exe and C:\WINDOWS\system32\robocopy.exe. If you are using ColdFusion sandbox security you will need to add entries for these executables (on the source server only). Here are some basic code examples.
First, map to the destination server:
<cfexecute name="C:\WINDOWS\system32\net.exe"
arguments="use {share_name} {password} /user:{username}"
variable="shareLog"
timeout="30">
</cfexecute>
The {share_name} here would be something like \\server\c$. {username} and {password} should be obvious. You can specify the username as \\server\username. NOTE: I would suggest using a share that you create rather than the administrative share c$, but that is what you had in your example.
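Filled in, that first call might look like this (the server name, share, and credentials are all made up here):

<cfexecute name="C:\WINDOWS\system32\net.exe"
    arguments="use \\server2\appsync myP@ssword /user:server2\cfdeploy"
    variable="shareLog"
    timeout="30">
</cfexecute>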
Next, copy the files from the source server to the destination server:
<cfexecute name="C:\WINDOWS\system32\robocopy.exe"
arguments="{source_folder} {destination_folder} [files_to_copy] [options]"
variable="robocopyLog"
timeout="60">
</cfexecute>
The {source_folder} here would be something like C:\folder1\folder2\folder3\folder4\folder5\ and the {destination_folder} would be \\server\c$\folder1\folder2\folder3\folder4\folder5\. You must begin this argument with the {share_name} from the step above, followed by the desired directory path. The [files_to_copy] is a list of files or a wildcard (*.*), and the [options] are RoboCopy's options. See the links I have included for the full list of options; it is extensive. To mirror a folder structure, see the /E and /PURGE options. I also typically include the /NDL and /NP options to limit the output generated, /XA:SH to exclude system and hidden files, and /XO to skip copying older files. You can exclude other files/directories specifically or by using wildcards.
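Put together, a hypothetical filled-in call using the options above (paths borrowed from the question; the share name is invented):

<cfexecute name="C:\WINDOWS\system32\robocopy.exe"
    arguments="C:\folder1\folder2\folder3\folder4\folder5 \\server2\appsync\folder5 *.* /E /PURGE /NDL /NP /XA:SH /XO"
    variable="robocopyLog"
    timeout="60">
</cfexecute>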
Then, disconnect the mapped drive:
<cfexecute name="C:\WINDOWS\system32\net.exe"
arguments="use {share_name} /d"
variable="shareLog"
timeout="30">
</cfexecute>
Works like a charm. If you go this route and have not used RoboCopy before, I would highly recommend playing around with the options/functionality at the command line first. Then, once you get it working to your liking, just paste those options into the code above.
I ran into a similar issue, and it had me scratching my head as well. We are using Active Directory along with a UNC path to \\SERVER\SHARE\webroot. The application was working fine with the exception of using CFFILE to create a directory. We were running our CF service as a domain account, and permissions were granted on the webroot folder (residing on the UNC server). This same domain account was also being used to connect to the UNC path within IIS. I even went so far as to grant Full Control on the webroot folder but still had no luck.
Ultimately, what I found was causing the problem was that the Inetpub folder (the parent folder of our webroot) had sharing turned on, but that sharing did not include Read/Write access for our CF service domain account.
So while we had sharing enabled on Inetpub and stronger user permissions set on the Inetpub/webroot folder, the share permissions (or lack thereof) took precedence over the more granular user security permissions on webroot.
Hope this helps someone else.
http://www.keciadesign.dk
I am trying to set up table rates in Magento 1.6.2.0. The problem occurs when I try to upload the file with table rates (CSV-file). Then the error "Unable to list current working directory" appears and I can't go any further.
The tmp, media, and var folders all have 777 permissions.
I have read everything there was to find on the Internet on this problem - many seem to have had this problem but I have yet to see a solution.
Note:
Probably not very relevant, but I am on Unoeuro hosting, on a shared server.
With some extensions (e.g. Wyomind Simple Google Shopping), the error shows up when var/tmp is missing from the Magento directory structure.
The most common cause of this problem is wrong permissions on the media directory. It should be writable by the web server. More information can be found here.
Look in your php.ini for the upload_tmp_dir option (or use echo ini_get('upload_tmp_dir'); in your code). It seems PHP can't list files in the directory where Apache puts uploaded files. I'm afraid you can't change the permissions of that folder on shared hosting.
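To see what that check prints, a minimal snippet (the sys_get_temp_dir() fallback mirrors PHP's own behavior when upload_tmp_dir is unset):

<?php
// Where PHP stages uploads before move_uploaded_file(); empty/false means the system default is used
echo ini_get('upload_tmp_dir') ?: sys_get_temp_dir();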
This error can also be reported if you have run out of disk space.
I have an "Images" folder on my remote server with 20 images. In my local copy of the Images folder I updated some images and deleted a few, and ended up with 15 images left.
Now I want to re-upload the whole folder on the server, so the new Images folder completely replaces the one that's currently on the server, and so I end up with Images folder on the server that has 35 images.
I use FileZilla, and when I simply drag the new Images folder onto the remote directory to replace the existing one, and hit 'Overwrite', it overwrites the images that were changed, but keeps the total of 20 images on the server.
What I want FileZilla to do is to replace the WHOLE folder with the new one that has updated images AND the new image quantity (15).
Is there a simple way to do that in FileZilla?
If anybody can point me in the right direction, I'd appreciate it.
In FileZilla, View -> Directory comparison is now a really good solution: you can visually see the files that exist only on the server and delete them quickly. I found I liked it better than something that would delete them automatically since, for example, there are extra subfolders on the server side that I actually need to keep.
Delete the remote files and re-upload? If you're expecting advanced fileset operations, FTP is the wrong protocol.
Ipswitch FTP does exactly that - it compares files and lets you replace missing files, replace whole directories, etc. Yes, you pay for it, but it's not very expensive.
I wish FileZilla would do it - as some of you say, it's a simple procedure. And why show the differences if you then can't do anything about them? That's why FileZilla annoys me a little.