SCP won't find file even though it exists - amazon-web-services

I'm trying to SCP some files from a remote server. The command I'm using is:
scp -r -i ~/path/to/key ubuntu@address:/home/ubuntu/analysis .
I receive an error:
scp: /home/ubuntu/analysis: No such file or directory.
If I scp another file in my home directory, it does work, e.g.
scp -r -i ~/path/to/key ubuntu@address:/home/ubuntu/.viminfo .
If I create a new file, e.g. with touch new_file.txt, I also cannot download that file.
The permissions and owners for .viminfo and the directory analysis are standard.
Why isn't the SCP working? I have been able to download files from this server before, but something has changed.
Quite confusing - any advice would be appreciated!
Thanks!
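For what it's worth, a quick check along these lines (using the same placeholder key path and address as in the commands above) would confirm whether the directory is even visible over SSH:
ssh -i ~/path/to/key ubuntu@address "ls -ld /home/ubuntu/analysis"
If that listing also fails, the problem is with the remote path or account rather than with scp itself.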

Related

cannot open path of the current working directory: Permission denied when trying to run rsync command twice in gcloud

I am trying to copy the data from the file stores to the google cloud bucket.
This is the command I am using:
gsutil rsync -r /fileserver/demo/dir1 gs://corp-bucket
corp-bucket: Name of my bucket
/fileserver/demo/dir1: Mount point directory (This directory contain the data of the file store)
This command works fine the first time: it copies the data in /fileserver/demo/dir1 to the cloud bucket. But when I then delete the data from the cloud bucket and run the same command again without any changes, I get this error:
cannot open path of the current working directory: Permission denied
NOTE: If I make even a small change to a file in /fileserver/demo/dir1 and run the above command, it works fine again. My question is why it does not work without any changes, and whether there is any way to copy the files without modifying them.
Thanks.
You may be hitting limitation #2 of gsutil rsync: "The gsutil rsync command considers only the live object version in the source and destination buckets". You can exclude /dir1... with the -x pattern and still let rsync do its clean-up work as part of the regular sync process.
Another way to copy those files is to use cp with the -r option to copy them recursively, instead of rsync.
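As a rough sketch of that cp-based alternative (bucket name and mount point taken from the question):
gsutil cp -r /fileserver/demo/dir1 gs://corp-bucket
Note that, unlike rsync, cp -r copies the directory itself, so the objects would land under gs://corp-bucket/dir1/... rather than at the top of the bucket.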

Trying to copy folder into AWS EC2 instance: Getting no directory found?

So I am trying to SSH into an EC2 instance and copy a folder from my desktop into the instance.
Command Typed: scp -i -r prac1.pem SocialTrends ubuntu@[ec2-54-1....amazonaws.com]:socialtrendsApp/app
Error: Warning: Identity file -r not accessible: No such file or directory
I am typing this command from the ~/SocialTrends directory, which is the folder I am trying to copy. This folder already contains the code files and the prac1.pem file.
What am I doing wrong? Please help!
You have placed the command-line option -r between -i and prac1.pem, so scp reads -r as the identity file (hence the warning). Move -r so that it comes after prac1.pem.
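With the option order fixed (the bracketed hostname is the asker's placeholder, left elided here), the command would look roughly like:
scp -i prac1.pem -r SocialTrends ubuntu@[ec2-54-1....amazonaws.com]:socialtrendsApp/app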

Error "No URLs matched" When copying Google cloud bucket data to my local computer?

I am trying to download a folder that is inside my Google Cloud bucket. I read the Google docs for gsutil/commands/cp and executed the line below.
gsutil cp -r appengine.googleapis.com gs://my-bucket
But I am getting the error
CommandException: No URLs matched: appengine.googleapis.com
Edit
By running the command below
gsutil cp -r gs://logsnotimelimit .
I am getting this error:
IOError: [Errno 22] invalid mode ('ab') or filename: u'.\logsnotimelimit\appengine.googleapis.com\nginx.request\2018\03\14\14:00:00_14:59:59_S0.json_.gstmp'
What is the appengine.googleapis.com parameter in your command? Is that a local directory on your filesystem you are trying to copy to the cloud bucket?
The gsutil cp -r appengine.googleapis.com gs://my-bucket command you provided will copy a local directory named appengine.googleapis.com recursively to your cloud bucket named my-bucket. If that's not what you are doing, you need to construct your command differently.
I.e., to download a directory named folder from your cloud bucket named my-bucket into the current location, try running
gsutil cp -r gs://my-bucket/folder .
-- Update: since it appears that you're using a Windows machine (note the "\" directory separators instead of "/" in the error message) and since the filenames contain the ":" character, the cp command will fail when creating those files, with the error message you're seeing.
Just wanted to help people out if they run into this problem on Windows. As administrator:
Open C:\Program Files (x86)\Google\Cloud SDK\google-cloud-sdk\platform\gsutil\gslib\utils
Delete copy_helper.pyc
Change the permissions for copy_helper.py to allow writing
Open copy_helper.py
Go to the function _GetDownloadFile
On line 2312 (at time of writing), change the following line
download_file_name = _GetDownloadTempFileName(dst_url)
to (for example; the objective is to remove the colons):
download_file_name = _GetDownloadTempFileName(dst_url).replace(':', '-')
Go to the function _ValidateAndCompleteDownload
On line 3184 (at time of writing), change the following line
final_file_name = dst_url.object_name
to (for example; the objective is to remove the colons):
final_file_name = dst_url.object_name.replace(':', '-')
Save the file, and rerun the gsutil command
FYI, I was using the command gsutil -m cp -r gs://my-bucket/* . to download all my logs, whose names by default contain ":", which does not bode well for Windows filenames!
Hope this helps someone. I know it's a somewhat hacky solution, but seeing as you never need (or should have) colons in Windows filenames, it's fine to do and forget. Just remember that if you update the Google SDK you'll have to redo this.
I got the same issue and resolved it as below.
Open a Cloud Shell and copy the objects by using the gsutil command:
gsutil -m cp -r gs://[some bucket]/[object] .
On the shell, zip those objects by using the zip command:
zip [some file name].zip -r [some name of your specific folder]
On the shell, copy the zip file into GCS by using the gsutil command:
gsutil cp [some file name].zip gs://[some bucket]
On a Windows Command Prompt, copy the zip file from GCS by using the gsutil command:
gsutil cp gs://[some bucket]/[some file name].zip .
I hope this information helps someone.
This is also gsutil's way of saying file not found. The mention of URL is just confusing in the context of local files.
Be careful: in this command the file path is case sensitive, so check whether the problem is simply a capitalization mismatch.
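If it is unclear which of these applies, one way to check (reusing the my-bucket/folder names from the answer above) is to list the remote path first and copy the object name exactly as gsutil prints it, capitalization included:
gsutil ls gs://my-bucket/
gsutil ls -r gs://my-bucket/folder/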

How to copy file from bucket GCS to my local machine

I need copy files from Google Cloud Storage to my local machine:
I tried this command on the Compute Engine terminal:
$sudo gsutil cp -r gs://mirror-bf /var/www/html/mydir
That (/var/www/html/mydir) is the directory on my local machine.
I get this error:
CommandException: Destination URL must name a directory, bucket, or bucket
subdirectory for the multiple source form of the cp command.
Where is the mistake?
You must first create the directory /var/www/html/mydir.
Then, you must run the gsutil command on your local machine and not in the Google Cloud Shell. The Cloud Shell runs on a remote machine and can't deal directly with your local directories.
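Putting both points together, a minimal sketch to run on the local machine (not in Cloud Shell), using the paths from the question:
sudo mkdir -p /var/www/html/mydir
sudo gsutil cp -r gs://mirror-bf /var/www/html/mydir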
I had a similar problem and went through the painful process of figuring it out too, so I thought I would provide my step-by-step solution (under Windows, hopefully similar for Unix users) and hope it helps others:
The first thing (as many others have pointed out on various stackoverflow threads), you have to run a local Console (in admin mode) for this to work (ie. do not use the cloud shell terminal).
Here are the steps:
Assuming you already have Python installed on your machine, you will then need to install the gsutil python package using pip from your console:
pip install gsutil
You will then be able to run the gsutil config from that same console:
gsutil config
A .boto file needs to be created; it is needed to make sure you have permission to access your drive.
Also note that you are now given a URL, which is needed in order to get the authorization code (prompted in the console).
Open a browser and paste this URL in, then:
Log in to your Google account (i.e. the account linked to your Google Cloud).
Google asks you to confirm that you want to give access to GSUTIL. Click Allow.
You will then be given an authorization code, which you can copy and paste into your console.
Finally you are asked for a project-id:
Get the project ID of interest from your Google Cloud.
In order to find these IDs, click on "My First Project" in the Cloud console.
You will then be shown a list of all your projects and their IDs.
Paste that ID into your console, hit Enter, and there you are! You have now created your .boto file. This should be all you need to be able to play with your Cloud storage.
Console output:
Boto config file "C:\Users\xxxx\.boto" created. If you need to use a proxy to access the Internet please see the instructions in that file.
You will then be able to copy your files and folders from the cloud to your PC using the following gsutil Command:
gsutil -m cp -r gs://myCloudFolderOfInterest/ "D:\MyDestinationFolder"
Files from within "myCloudFolderOfInterest" should then get copied to the destination "MyDestinationFolder" (on your local computer).
gsutil -m cp -r gs://bucketname/ "C:\Users\test"
I had put an "r" before the file path, i.e., r"C:\Users\test", and got the same error. When I removed the "r" it worked for me.
Try prefixing the destination with '.', i.e. ./var:
$sudo gsutil cp -r gs://mirror-bf ./var/www/html/mydir
Or it may be the problem below:
gsutil cp does not support copying special file types such as sockets, device files, named pipes, or any other non-standard files intended to represent an operating system resource. You should not run gsutil cp with sources that include such files (for example, recursively copying the root directory on Linux, which includes /dev). If you do, gsutil cp may fail or hang.
Source: https://cloud.google.com/storage/docs/gsutil/commands/cp
The syntax that worked for me when downloading to a Mac was
gsutil cp -r gs://bucketname dir Dropbox/directoryname

EC2 - chmod: cannot access ‘mypemfile.pem’: No such file or directory

I downloaded the pem file while launching a t2.small instance. When I try to connect via ssh, it says there is no such file in the directory. But I am sure the pem file is in the directory.
$ ls
mypemfile.pem
$ chmod 400 mypemfile.pem
chmod: cannot access ‘mypemfile.pem’: No such file or directory
$ ssh -i "mypemfile.pem" root@x.x.x.xx
Warning: Identity file mypemfile.pem not accessible: No such file or directory.
Permission denied (publickey).
How can I track down this issue? Are there any solutions?
Note: I created instance from AMI image shared by another account.
The problem is almost certainly that the shell cannot find a file named "mypemfile.pem".
Recheck that the file is really there; if it is, try renaming it or making a copy of it, and try again with the newly created file.
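One way to rule out an invisible filename mismatch (a stray extension, a trailing space, or a character pasted in from elsewhere) is to list the directory in detail and, if needed, work from a fresh copy; mykey.pem below is just an example name, and the address is elided as in the question:
ls -la
cp mypemfile.pem mykey.pem
chmod 400 mykey.pem
ssh -i mykey.pem root@x.x.x.xx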
Hope it helps.
I figured it out on Mac. So, this is what I had to do. When I created the private key, my Mac saved it as whatever.pem.txt, so in order to connect to the AWS instance, just add the .txt extension to whatever the AWS instructions tell you to use. For example:
chmod 400 yourfile.pem.txt
ssh -i "yourfile.pem.txt" ubuntu@ecX-XX-XX-XXX-XXX.compute-1.amazonaws.com
This is for Mac users.
Best,