How to find the actual folder for an HDFS link - hdfs

I have a link like hdfs://hostname/folder1/sub_folder.db/table_detail/p_year=2015
Can anyone tell me where I can find the actual folder if I log into the host?
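For reference, the path in such a URI can be separated from the scheme and host with plain shell and then inspected with the standard HDFS tools. This is a sketch; the hdfs commands are commented out because they need a configured Hadoop client, and note that the data itself is not stored as a browsable folder on the host:

```shell
# The path after the hostname is a location in the HDFS namespace, not a
# directory on the host's local disk. Strip the scheme and authority with
# plain POSIX parameter expansion:
uri="hdfs://hostname/folder1/sub_folder.db/table_detail/p_year=2015"
hdfs_path="/${uri#hdfs://*/}"
echo "$hdfs_path"

# With a configured Hadoop client you can then inspect it; the file data
# lives as block files under each datanode's dfs.datanode.data.dir:
# hdfs dfs -ls "$hdfs_path"
# hdfs fsck "$hdfs_path" -files -blocks -locations
# hdfs getconf -confKey dfs.datanode.data.dir
```

In other words, logging into the host and looking for /folder1/... will not work; the namespace lives in the namenode's metadata and the bytes live in block files on the datanodes.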

Related

How to read and update the SACL properties of a folder on a remote machine in Active Directory

I am trying to read and update the SACL properties of a folder on a domain machine from the domain controller.
I came across this link, but I don't know how to use IADs::Get to get the folder's object from Active Directory.
I am struggling to find the LDAP query to get the folder; I searched all over the internet but couldn't find a single example of this use case.
Can anyone help me with an example or a reference?
IADs::Get is only for objects in Active Directory itself. You can't use it for files on a file system.
To modify permissions on a file on a remote computer, you treat it much the same as a file on the local system. You can use GetNamedSecurityInfo, where pObjectName would be the path to the file in the format \\server\share\directory\file.txt and ObjectType is SE_FILE_OBJECT.
The credentials being used to run your program will need to already have rights to access that file on the remote system.
Some more reading here: File Security and Access Rights

Detect if a folder is a cloud-synced folder

Is there a way to check, all at once, whether any folder on a computer is a cloud-synced folder, without enumerating every possibility?
What I need is: given a path, tell me whether it's a cloud-synced folder.
Example: I use Google Drive, OneDrive, Dropbox, and others. I would like to know if a folder belongs to one of them without having to enumerate the particulars of each service, like:
OneDrive can be checked in the registry and has "Drive" in its path;
Google Drive has a .ini file in its root folder and has "Drive" in its path;
Dropbox has a file in its root folder (I think);
I can't find a magic solution that checks every service that exists.
Is there a secret tag or hidden piece of information on a folder saying "I'm related to a cloud service"?
I couldn't find anything about it :(
I've checked GetDriveTypeW, but OneDrive and Google Drive are detected as fixed drives (which is normal), and FILE_REMOTE_PROTOCOL_INFO, but no sign of an answer there either.
Please help!
Thanks.
I think you're out of luck this time. I don't believe there is anything fundamental in the Windows operating system or filesystem that tells you if an application installed on the system is mirroring or otherwise syncing files and folders. There is no magic bullet, to the best of my knowledge.
You'll have to tackle these on an application-by-application basis using awareness of each type of service and which folders it is syncing as you are currently doing.
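In the absence of a universal flag, the application-by-application approach can be sketched as a marker-file walk. The marker names below are assumptions for illustration only (Dropbox is known to place a .dropbox file in synced folders; .driveupload is a hypothetical placeholder), so the list has to be maintained per service:

```shell
#!/bin/sh
# Walk from a folder up to the filesystem root, looking for per-service
# marker files. CLOUD_MARKERS is an illustrative, incomplete list:
# .dropbox is a real Dropbox marker; .driveupload is hypothetical.
CLOUD_MARKERS=".dropbox .driveupload"

is_cloud_folder() {
    dir=$1
    while [ -n "$dir" ] && [ "$dir" != "/" ]; do
        for marker in $CLOUD_MARKERS; do
            if [ -e "$dir/$marker" ]; then
                echo "$dir looks cloud-synced (found $marker)"
                return 0
            fi
        done
        dir=$(dirname "$dir")    # step up one level
    done
    return 1
}
```

Calling is_cloud_folder /path/to/check prints the first ancestor containing a marker; it is only as good as the marker list, which is exactly the per-service enumeration the question hoped to avoid.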

hdfs copy folder to network file system ends up in error

One of my clients routinely saves the contents of some HDFS folders to an NFS volume. Sometimes during the copy I receive the following error:
copying /folder/foo
copyToLocal: Input/output error
I thought that some of the files inside the foo folder might be corrupt, but a check with hdfs fsck didn't report anything strange. The foo folder contains a great many files, so it's not feasible to search manually for anything odd. I tried to enable debug mode for the copyToLocal command, but there is no clue about any type of error. How can I debug this issue?
I have a gut feeling that there are some networking issues on the NFS side, but I don't know how to debug that kind of problem either.
The HDFS folders are also very large; I don't know if this could cause additional problems.
We are in a Kerberos environment and we're executing commands after the kinit as the hdfs superuser.
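A few shell-level checks can help split the problem between HDFS and NFS. This is a sketch: the hdfs commands are commented out since they need the client's cluster, and NFS_DIR defaults to a temporary directory so the snippet is safe to dry-run; point it at the real NFS mount to test it:

```shell
# 1. Re-run the copy with client-side debug logging turned on:
# HADOOP_ROOT_LOGGER=DEBUG,console hdfs dfs -copyToLocal /folder/foo /mnt/nfs/foo

# 2. Copy file-by-file to isolate which file (if any) triggers the error:
# for f in $(hdfs dfs -ls -C /folder/foo); do
#     hdfs dfs -copyToLocal "$f" /mnt/nfs/foo/ || echo "FAILED: $f"
# done

# 3. Exercise the NFS mount on its own; an I/O error here points at the
#    network/NFS side rather than HDFS:
NFS_DIR=${NFS_DIR:-$(mktemp -d)}
dd if=/dev/zero of="$NFS_DIR/write_test" bs=1M count=16 conv=fsync
rm -f "$NFS_DIR/write_test"

# 4. Kernel-side NFS complaints (timeouts, "server not responding"):
# dmesg | grep -i nfs
```

The file-by-file loop is slow on a large folder, but it turns an opaque "Input/output error" into a specific file name you can re-test in isolation.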
p.s. Probably SO isn't the right place to ask this question, feel free to redirect me to the right website :)

Getting workspace of a specific folder on a specific machine in Perforce

I'm writing a script to find all folders on a machine that are not connected to any workspace.
How can I get the workspace name (or null) of a specific folder on a specific host?
(I already examined the answers here, but that doesn't answer my question)
The best way to do this is to use 'p4 where'. Running 'p4 -c workspacename where foldername' should tell you how Perforce is interpreting the folder name in question for that particular workspace.
If you want to do the detailed analysis yourself:
The root directory of the workspace on the host is stored in the Root: field of the workspace spec and can be displayed with 'p4 client -o'.
Note that there can be AltRoots, that is, multiple valid root directories for a single workspace.
So as you iterate through the workspaces for a specific host, look at their Root and AltRoot directories, and see if any of those directories are the parent of the folder you're interested in.
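The iteration above can be sketched in shell. The prefix test is plain string matching; the commented p4 loop shows one way to wire it up (the -ztag/-F output formatting is a standard p4 feature, but the loop itself is illustrative and untested against a server, and /path/of/interest is a placeholder):

```shell
# Return success if $2 is the same directory as, or a subdirectory of, $1.
under_root() {
    root=${1%/}     # normalize: drop any trailing slash
    folder=$2
    case "$folder/" in
        "$root"/*) return 0 ;;
        *)         return 1 ;;
    esac
}

# Illustrative wiring against a real server (requires p4):
# for client in $(p4 -ztag -F %client% clients); do
#     host=$(p4 -ztag -F %Host% client -o "$client")
#     [ -z "$host" ] || [ "$host" = "$(hostname)" ] || continue
#     root=$(p4 -ztag -F %Root% client -o "$client")
#     under_root "$root" /path/of/interest && echo "$client"
# done
```

An empty Host: field means the workspace is not pinned to a host, which is why the loop keeps those as candidates; AltRoots, if present, would need the same under_root test per entry.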

Magento "Unable to list current working directory"

http://www.keciadesign.dk
I am trying to set up table rates in Magento 1.6.2.0. The problem occurs when I try to upload the file with table rates (CSV-file). Then the error "Unable to list current working directory" appears and I can't go any further.
The tmp, media and var folders have permissions 777.
I have read everything there was to find on the Internet on this problem - many seem to have had this problem but I have yet to see a solution.
Note:
Probably not very relevant, but I am on Unoeuro hosting on a shared server.
With some extensions (Wyomind Simple Google Shopping), the error shows up when var/tmp is missing from the Magento directory structure.
The most common cause of this problem is wrong permissions on the media directory. It should be writable by the web server. More information can be checked here.
Look in your php.ini for the upload_tmp_dir option (or use echo ini_get('upload_tmp_dir'); in your code). It seems PHP can't list files in the directory where Apache uploads files. I'm afraid you can't change the permissions of this folder on shared hosting.
This error can also be reported if you have run out of disk space.
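Both suspects, the upload temp directory and disk space, can be checked from a shell on the server. A sketch (the php invocation is skipped if the CLI isn't available on the host):

```shell
# Where does PHP stage uploads, and is that directory writable?
# When upload_tmp_dir is unset, PHP falls back to the system temp dir.
if command -v php >/dev/null 2>&1; then
    php -r 'var_dump(ini_get("upload_tmp_dir"), sys_get_temp_dir(), is_writable(sys_get_temp_dir()));'
fi

# Any filesystem at or near 100% use here would explain the error:
df -h .
```

On shared hosting you typically can't fix a bad upload_tmp_dir yourself, but this at least tells you whether to open a ticket about permissions or about quota.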