Is there a way to check, all at once, every cloud-synced folder on a computer without enumerating every possibility?
What I need is: I take a path and it tells me whether it's a cloud-synced folder.
Example: I use Google Drive, OneDrive, Dropbox, and others. I would like to know if a folder belongs to one of them without having to enumerate the particulars of each service, like:
OneDrive can be checked in the registry and has "Drive" in its path;
Google Drive has a .ini file in its root folder and has "Drive" in its path;
Dropbox has a file in its root folder (I think);
I can't find the magic solution that checks every service that exists.
Is there a secret tag or hidden bit of info on a folder saying "I'm related to a cloud service"?
I can't find anything about it :(
I've checked GetDriveTypeW, but OneDrive and Google Drive are detected as fixed drives (which is normal), and FILE_REMOTE_PROTOCOL_INFO, but no sign of an answer there either.
Please help!
Thanks.
I think you're out of luck this time. I don't believe there is anything fundamental in the Windows operating system or filesystem that tells you if an application installed on the system is mirroring or otherwise syncing files and folders. There is no magic bullet, to the best of my knowledge.
You'll have to tackle these on an application-by-application basis, using awareness of each type of service and which folders it is syncing, as you are currently doing.
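For what it's worth, here's a minimal sketch of that application-by-application approach in Python, covering the three services mentioned. The %OneDrive% environment variable and the Dropbox info.json file are the commonly documented mechanisms; the Google Drive check just encodes the heuristic from the question (the marker file name is an assumption), so treat all of this as a starting point rather than a definitive detector.

import json
import os

def cloud_service_for(raw_path):
    """Best-effort guess at which sync service owns raw_path; None if unknown."""
    path = os.path.abspath(raw_path)
    lowered = path.lower()

    # OneDrive: recent clients expose the sync root as the %OneDrive%
    # environment variable (it is also recorded under
    # HKCU\Software\Microsoft\OneDrive in the registry).
    onedrive = os.environ.get("OneDrive", "").lower()
    if onedrive and lowered.startswith(onedrive):
        return "OneDrive"

    # Dropbox: the client records its folder location(s) in info.json,
    # which Dropbox documents for exactly this kind of lookup.
    for base in ("APPDATA", "LOCALAPPDATA"):
        info = os.path.join(os.environ.get(base, ""), "Dropbox", "info.json")
        if os.path.isfile(info):
            with open(info) as f:
                for account in json.load(f).values():
                    root = account.get("path", "").lower()
                    if root and lowered.startswith(root):
                        return "Dropbox"

    # Google Drive: no reliable marker that I know of, so fall back to the
    # heuristic from the question (an .ini marker file in a root folder
    # whose path contains "Drive") -- this file name is an assumption.
    probe = path
    while True:
        marker = os.path.join(probe, "desktop.ini")
        if "drive" in probe.lower() and os.path.isfile(marker):
            return "Google Drive (heuristic)"
        parent = os.path.dirname(probe)
        if parent == probe:
            return None
        probe = parent

The deeper point stands, though: each vendor exposes this differently (or not at all), which is why no single API call covers them all.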
Some time ago I set up Firebase Functions & Hosting (I've never used the Hosting but am extensively using Functions) to work with the Android app I'm building. When I set this up, I didn't know very much about the environment and didn't realize "where" I was creating the directories. I ended up creating the directory under /home/myusername/firebase. While I still don't know much more (lol) about the environment, I came to realize that other users in the project are unable to see these directories, because they are in my home/user directory (right?). I now have another user in the project who needs to be able to access these directories.
I know that I can probably "easily" just move these directories with the mv command, but is that the correct/proper way? I assume it is not.
How do I go about safely moving these directories to a higher-level directory, or putting them in a place where other users can access them as well?
Thank you for any help and guidance.
It sounds like you are overthinking things. Just move files around with mv like you would normally. Cloud Shell doesn't really operate much differently than your desktop with respect to moving files and directories around.
If you want to share your source code, you should use some form of source control, such as GitHub or Google Cloud Source Repositories. Cloud Shell is not a good place to share data.
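For example (the repository URL here is a placeholder), pushing the directory to a shared repo and having your collaborator clone it from their own Cloud Shell looks like this:

cd ~/firebase
git init
git add .
git commit -m "Firebase Functions source"
git remote add origin https://github.com/your-org/your-project.git   # placeholder URL
git push -u origin master   # or main, depending on your default branch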
I'll give a bit of background as to the setup we have and why. Currently a friend and I want to collaborate on an Unreal Engine project. To do this I've set up an Amazon Lightsail instance running Windows Server. I've then installed Perforce onto this server and added two users. Both of us are able to connect to this server from our local machines (great, I thought!). Our goal was to attach two 'virtual' disks of 32 GB to this server via Lightsail's storage option. I've formatted these disks and they are detected as disks D and E on the server. Our goal was to have two depots, one on disk E and one on disk D, the reason being that the C disk is only 20 GB (12 GB free after Windows).
I have tried multiple things (not much hair left after this) to map the created depots to each disk, but have had little success and need your wisdom!
I've followed both the process indicated in this support guide (https://community.perforce.com/s/article/2559) via CMD as well as changing the depot storage location in P4Admin on the Server via RDP to the virtual disks D and E respectively.
An example change is from //UE_WIP/... to D:/UE_WIP/... (I have created folders UE_WIP and UE_LIVE on each disk).
When I open up P4V on my local machine I'm able to connect happily (as per screenshot) and set the workspace to my local machine (it detects both depots). This is where we're getting stuck. I then create a new Unreal Engine file, save it to the following local directory, E:/DELETE/Perforce/Test/, and open up source control (see image 04). This is great: it detects the workspace and everything connects to the server.
When I click submit to source control, I get 'Failed Checking Source Control'. When I try adding via P4V, manually marking the new content folder for add, I get 'file(s) not in client view'.
All we want is the ability to send an Unreal Engine project up to either the WIP depot or the LIVE depot. To resolve this, do we need:
Two different workspaces (one set up for LIVE and one for WIP)?
To add some local folders to our directory, e.g. E:/DELETE/Perforce/UE_WIP & E:/DELETE/Perforce/UE_LIVE?
Do we need to tweak something on the Perforce Server?
Do we need to tweak something in Unreal Engine?
Any and all help would be massively appreciated.
Best,
Ben
https://imgur.com/a/aaMPTvI - Image gallery of issues
Your screenshots don't show how (or if?) you set up your local workspace (i.e. the thing that tells Perforce where the files are on your local workstation).
See: https://www.perforce.com/perforce/r13.1/manuals/p4v/Defining_a_client_view.html
The Perforce server acts as a layer of abstraction between the backend storage (i.e. the depots you've set up) and the client machines where you actually do your work. The location of the depot files doesn't matter at all to the client (any more than, say, the backend filesystem of a web server matters to your web browser); all that matters is how you set up the workspace, which is a simple matter of "here's where my local files are" (the Root) and "here's how my local paths map to depot paths" (the View).
You get the "file not in view" error if you try to add a local file to the depot and it's not in the View you've defined. The fix there is generally to simply fix the Root and/or View to accurately describe where your local files are. One View can easily map to multiple depots (as long as they're on a single server).
(edit)
Specifically, in your case, all of the files you're trying to add are under the path:
E:\DELETE\Perforce\Test\Saved\...
Since you've set up your workspace as:
Client: bsmith
Root:   E:\DELETE\Perforce\bsmith
View:
    //WIP/...  //bsmith/WIP/...
    //LIVE/... //bsmith/LIVE/...
then your bsmith workspace consists of these two local paths:
E:\DELETE\Perforce\bsmith\WIP\...
E:\DELETE\Perforce\bsmith\LIVE\...
The files you're trying to add aren't even under your Root, much less under either of the View mappings. That's what the "not in client view" error messages mean.
If you want to add the files where they are, modify your Root and View so that your workspace is defined as being where your files are. If you want the files in one of the local directories above, which you've already defined as being where your workspace lives, you'll have to move them there. If you put your files in bsmith\WIP, then when you add them they'll go to the WIP depot; if you put them in bsmith\LIVE, they'll go to the LIVE depot, per your View.
Either way, once they're in your workspace, you can add them to the depot. Simple as that!
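To make that concrete, here's one possible spec (a sketch only; adjust names and paths to your setup) that keeps your existing Test folder where it is and routes it to the WIP depot:

Client: bsmith
Root:   E:\DELETE\Perforce
View:
    //WIP/Test/...  //bsmith/Test/...
    //LIVE/...      //bsmith/LIVE/...

With that View, anything saved under E:\DELETE\Perforce\Test\... goes to //WIP/Test/... when added and submitted, while E:\DELETE\Perforce\LIVE\... still maps to the LIVE depot.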
Please, could someone point me in the right direction? Here's the problem: I created my own installer for my synth plugins: VST, VST3 and Component.
On Windows I just set it to ask for Administrator rights and it can then save the plugins anywhere. On OS X the story is different: I couldn't find an EASY way to put files into folders when I don't have the right permissions. Not only that, but I can't open the folder for reading after the file is saved, so that's another problem.
To sum up:
A) How do I elevate file privileges on OS X so that I can save plugin files (.vst, .vst3, .component) into any folder?
B) How can I read data files (such as skin files) from a folder that requires elevated rights?
C) I'm using JUCE; you can check my work at www.Wusik.com
Thank you for any advice.
Cheers, WilliamK
I believe Authorization Services is what you would need.
Or you might consider creating an installer package using Packages.
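If you go the package route, the elevation largely comes for free: Installer.app prompts for admin rights when the payload targets a system location. As a sketch (the identifier and paths are placeholders), Apple's pkgbuild can produce such a package from the command line:

pkgbuild --component "build/MySynth.vst" \
         --install-location "/Library/Audio/Plug-Ins/VST" \
         --identifier "com.example.mysynth" \
         MySynth.pkg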
I am getting the following error:
The cause of this exception was:
java.io.FileNotFoundException:
//server/c$/folder1/folder2/folder3/folder4/folder5/login.cfm
(Access is denied).
When doing this:
<cffile action="copy"
    destination="#copyto#\#apfold#\#applic#\#files#"
    source="#path#\#apfold#\#applic#\#files#">
If I try to write to C:\folder1\folder2\folder3\folder4\folder5\login.cfm, it works fine. The problem with doing it this way is that this is a script for developers to be able to manually sync files to their application folder. We have multiple servers for each instance, and one is picked at random by BigIP. So just writing to the C:\ drive would only copy the file to the server the developer is currently accessing. If the developer were to close the browser and go right back in to make sure their changes worked, and they happened to get sent to a different server, they wouldn't see their change.
Since it works when writing to C:\, I know the permissions are correct. I've also copied the path out of the error message and put it in the address bar on the server, and it got to the folder/file fine. What else could be stopping it from being able to access that server?
It seems that you want to access a file via UNC notation on a network folder (even if it incidentally refers to a directory on the local C:\ drive). To be able to do this, you have to change the user that the ColdFusion 9 Application Server service runs as. By default, this service runs as the "Local System" account, which you need to change to an actual user. Have a look at the following link to find out how to do this: http://mlowell.hubpages.com/hub/Coldfusion-Programming-Accessing-a-shared-network-drive
Note that you might have to add a user with the same name as the one used for the CF 9 service to all of the file servers.
If you don't want to enable FTP on your servers, another option would be to use RoboCopy to keep the servers in sync. I have had very good luck using this tool. You will need access to the cfexecute ColdFusion tag, and you will need to create share(s) on your servers.
RoboCopy is an executable that comes with Windows; its documentation is extensive. It has some very powerful features and can be set to "mirror" the contents of directories from one server to the other. In this mode it keeps the folders identical (new files added, removed files deleted, updated files copied, etc.). This is how I have used it.
Basically, you will create a share on your destination servers and give access to a specific user (can be local or domain). On your source server you will run some ColdFusion code that:
Logically maps a drive to the destination server
Runs the RoboCopy utility to copy files to the destination server
Then disconnects the mapped drive
The ColdFusion service on your source server will need access to C:\WINDOWS\system32\net.exe and C:\WINDOWS\system32\robocopy.exe. If you are using ColdFusion sandbox security you will need to add entries for these executables (on the source server only). Here are some basic code examples.
First, map to the destination server:
<cfexecute name="C:\WINDOWS\system32\net.exe"
    arguments="use {share_name} {password} /user:{username}"
    variable="shareLog"
    timeout="30">
</cfexecute>
The {share_name} here would be something like \\server\c$. {username} and {password} should be obvious. You can specify the username as \\server\username. NOTE: I would suggest using a share that you create rather than the administrative share c$, but that is what you had in your example.
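So a filled-in arguments attribute (the server name and credentials here are made up) might look like:

use \\server\c$ MyP@ssw0rd /user:server\cfcopy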
Next, copy the files from the source server to the destination server:
<cfexecute name="C:\WINDOWS\system32\robocopy.exe"
    arguments="{source_folder} {destination_folder} [files_to_copy] [options]"
    variable="robocopyLog"
    timeout="60">
</cfexecute>
The {source_folder} here would be something like C:\folder1\folder2\folder3\folder4\folder5\ and the {destination_folder} would be \\server\c$\folder1\folder2\folder3\folder4\folder5\. You must begin this argument with the {share_name} from the step above followed by the desired directory path. The [files_to_copy] is a list of files or wildcard (*.*) and the [options] are RoboCopy's options. See the links that I have included for the full list of options. It is extensive. To mirror a folder structure see the /E and /PURGE options. I also typically include the /NDL and /NP options to limit the output generated. And the /XA:SH to exclude system and hidden files. And the /XO to not bother copying older files. You can exclude other files/directories specifically or by using wildcards.
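For example, with the paths from the question, the arguments attribute might end up looking something like this (an illustration only; tune the option list to your needs):

C:\folder1\folder2\folder3\folder4\folder5 \\server\c$\folder1\folder2\folder3\folder4\folder5 *.* /E /PURGE /NDL /NP /XA:SH /XO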
Then, disconnect the mapped drive:
<cfexecute name="C:\WINDOWS\system32\net.exe"
    arguments="use {share_name} /d"
    variable="shareLog"
    timeout="30">
</cfexecute>
Works like a charm. If you go this route and have not used RoboCopy before, I would highly recommend playing around with the options/functionality on the command line first. Then, once you get it working to your liking, just paste those options into the code above.
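One handy switch while you experiment (the paths here are placeholders): /L makes RoboCopy list what it would copy without actually copying anything, so you can verify an option set safely:

robocopy C:\source\folder \\server\share\folder *.* /E /PURGE /NDL /NP /XA:SH /XO /L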
I ran into a similar issue with this, and it had me scratching my head as well. We are using Active Directory along with a UNC path to \\SERVER\SHARE\webroot. The application was working fine with the exception of using CFFILE to create a directory. We were running our CF service as a domain account, and permissions were granted on the webroot folder (residing on the UNC server). This same domain account was also being used to connect to the UNC path within IIS. I even went so far as to grant Full Control on the webroot folder but still had no luck.
Ultimately, what I found was causing the problem was that the Inetpub folder (the parent folder of our webroot) had sharing turned on, but that sharing did not include read/write access for our CF service domain account.
So while we had sharing on Inetpub and more powerful user permissions turned on for the Inetpub/webroot folder, the sharing permissions (or lack thereof) took precedence over the more granular webroot user security permissions.
Hope this helps someone else.
http://www.keciadesign.dk
I am trying to set up table rates in Magento 1.6.2.0. The problem occurs when I try to upload the CSV file with the table rates. Then the error "Unable to list current working directory" appears and I can't get any further.
The tmp, media and var folders have permissions 777.
I have read everything there is to find on the Internet about this problem - many seem to have had it, but I have yet to see a solution.
Note:
Probably not very relevant, but I am on Unoeuro hosting, on a shared server.
With some extensions (e.g. Wyomind Simple Google Shopping), the error shows up when var/tmp is missing from the Magento directory structure.
The most common cause of this problem is wrong permissions on the media directory: it must be writable by the web server.
Look in your php.ini and find the upload_tmp_dir option (or use echo ini_get('upload_tmp_dir') in your code). It seems like PHP can't list files in the directory where Apache uploads files. I'm afraid you can't change the permissions of that folder on shared hosting.
This error can also be reported if you have run out of disk space.