How to copy files from google compute engine to local directory - google-cloud-platform

I am trying to copy files from my instance to my local directory using the following command:
gcloud compute scp <instance-name>:~/<file-name> ~/Documents/
However, it shows the error below:
$USER/Documents/: Is a directory
ERROR: (gcloud.compute.scp) [/usr/bin/scp] exited with return code [1].
Copying from local directory to GCE works fine.
I have checked Stanford's tutorial and Google's documentation as well.
I have another instance where there is no such issue.
I suspect it might be an issue with SSH keys.
What might have gone wrong?

Your command is correct if your source and destination paths are correct
The command as you've posted in your question works for me when copying a file from the Google Compute Engine VM to my local machine.
$ gcloud compute scp vm1:~/.bashrc ~/Documents/
.bashrc 100% 3515 3.4KB/s 00:00
I also tried doing the copy from the other side (i.e. from my local machine to the GCE VM) and it works:
$ gcloud compute scp ~/Documents/.bashrc vm1:~/temp/
.bashrc 100% 3515 3.4KB/s 00:00
$ gcloud compute scp ~/Documents/.bashrc vm1:~/.bashrc-new
.bashrc 100% 3515 3.4KB/s 00:00
gcloud relies on the scp executable present in your PATH. The arguments you provide to the gcloud compute scp command are passed through to the scp binary. Assuming your source and destination paths are correct, it should work.
Recursive copying using scp
Based on your particular error message, though, I've seen that variation appear only when the source path you're trying to copy from is a directory instead of a file. For that particular case, you can pass a --recurse argument (similar to the -r argument supported by regular scp), which will recursively copy all files and directories under the specified directory.
gcloud compute scp --recurse SRC_PATH DEST_PATH
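For example, a hedged sketch with hypothetical names (an instance called my-instance and a remote directory ~/results):
gcloud compute scp --recurse my-instance:~/results ~/Documents/
This copies the whole ~/results directory from the VM into your local ~/Documents/ folder.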

To copy files from the VM to your desktop, you can simply SSH into the VM from the browser console; in the top-right corner there is a settings button where you will find the "Download file" option. Just enter the path of the file.
If it is a folder, zip the folder first and then download it.
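For example, a quick sketch on the VM, assuming a hypothetical folder called myfolder in your home directory (install zip first if it is missing):
zip -r myfolder.zip myfolder
Then enter ~/myfolder.zip as the path in the download dialog.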

Everything was perfect, except that I was trying to run these commands in the terminal connected to GCE instead of in my local terminal.
oyashi@oyashi-torch-instance:~$ gcloud compute scp oyashi-torch-instance:~/spring1617_assignment1.zip ~/Documents/
/home/oyashi/Documents/: Is a directory
ERROR: (gcloud.compute.scp) [/usr/bin/scp] exited with return code [1].
But when I tried this on my local terminal, this happened:
oyashi@oyashi:~/Documents$ gcloud compute scp oyashi-torch-instance:~/spring1617_assignment1.zip ~/Documents/
spring1617_assignment1.zip 100% 42KB 42.0KB/s 00:00
Thank you everyone for your comments and help. I know it's a silly mistake on my end, but I posted this answer so that others might learn from my silliness.

If you need to pass the zone and project name, you may like to do it as follows, as it worked for me (the instance name is the name you chose in the GCP instances list):
gcloud beta compute scp --project "project_name" --zone "zone_name" instance_name:~jupyter/file_name /home/Downloads

I ran into the same problem. The point is that you should run the scp command from a local terminal, rather than the cloud terminal.

To copy a file to your local machine from an Ubuntu VM:
For example, say you have an instance named bhk.

Run a basic nginx server and copy all the files into /var/www/html (nginx's serving directory), then from your local machine simply run wget <vm's IP>/<your file path>. A sketch of the full sequence follows below.
For example, if my VM's IP is 1.2.3.4 and I want to copy /home/me/myFolder/myFile, then I simply copy this file into /var/www/html
and then run wget 1.2.3.4/myFile
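A rough sketch of the whole sequence, assuming a Debian/Ubuntu VM, that nginx serves /var/www/html, and that the firewall allows HTTP traffic to the VM:
# on the VM
sudo apt-get install -y nginx
sudo cp /home/me/myFolder/myFile /var/www/html/
# on the local machine
wget 1.2.3.4/myFile
Keep in mind this makes the file readable over HTTP to anyone who can reach the VM, so remove it from /var/www/html once the copy is done.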

This works for me:
gcloud compute scp --project "my-project" ./my-file.zip user@instance-1:~
--project - Google Cloud project name
my-file.zip - local file to send to the VM
user - VM Linux username
instance-1 - instance name (VM name)
~ - destination path on the instance
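The same flags work in the other direction too; for example, a sketch (file and instance names are illustrative) that pulls a file from the VM into the current local directory:
gcloud compute scp --project "my-project" user@instance-1:~/my-file.zip .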

I use the command below to upload a directory from local to a remote directory:
gcloud compute scp --recurse myweb-app/www/* user@instance-name:/var/www/html/sub-sites/myweb-app/www/

Related

How to copy files from Google Cloud to local machine? (Cloud Shell)

I spent way too much time trying to figure out how to copy files from the allocated Google Cloud Shell VM to my local machine. The cloud shell is great, but so is having a local copy.
Most of the time was spent trying to get some form of gcloud compute scp to work, similar to this syntax:
$ gcloud compute scp some-instance:~/littlefile.zip ~/
However, in addition to the compute command, there is alpha cloud-shell!
Approaches that did not work:
me@mylocalhost:~$ gcloud alpha cloud-shell scp ~/littlefile.zip .
ERROR: (gcloud.alpha.cloud-shell.scp) argument (cloudshell|localhost):SRC: Bad value [/home/me/littlefile.zip]: must start with cloudshell: or localhost:
me@mylocalhost:~$ gcloud alpha cloud-shell scp cloudshell:~/littlefile.zip localhost:~/littlefile.zip
~/littlefile.zip: No such file or directory
ERROR: (gcloud.alpha.cloud-shell.scp) [/usr/bin/scp] exited with return code [1].
Mentioning: How to copy files from google compute engine to local directory, in case others find that question but need Cloud Shell commands.
At the risk of "answering my own question," I actually just solved it! W00T!
Approaches that did work:
me@mylocalhost:~$ gcloud alpha cloud-shell scp cloudshell:littlefile.zip localhost:.
littlefile.zip 100% 16KB 253.1KB/s 00:00
me@mylocalhost:~$ gcloud alpha cloud-shell scp cloudshell:littlefile.zip localhost:littlefile.zip
littlefile.zip 100% 16KB 212.1KB/s 00:00
me@mylocalhost:~$ gcloud alpha cloud-shell scp cloudshell:~/adirectory/still-littlefile.zip localhost:.
still-littlefile.zip 100% 16KB 198.0KB/s 00:00
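If you need a whole directory rather than single files, the cloud-shell scp command also appears to accept a --recurse flag mirroring compute scp; check gcloud alpha cloud-shell scp --help on your SDK version before relying on it:
me@mylocalhost:~$ gcloud alpha cloud-shell scp --recurse cloudshell:~/adirectory localhost:adirectory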

gcloud compute ssh stops

I am using gcloud ssh to connect to GCE.
> gcloud compute --project "first-medium-2****8" ssh --zone "us-east1-b" "instance-2"
I entered the above command into PowerShell, but it replies
>Using username "hogehoge".
>Authenticating with public key "DESKTOP-****hogehoge"
and then stops. Nothing happens after that.
Yesterday I did the same thing and there was no problem.
But today, I can't. I tried gcloud init and reinstalled gcloud, but nothing changed. What should I do to solve this problem?
Additional information:
OS Windows10
Google Cloud SDK 237.0.0
PowerShell 5.1.17134.590
Putty 0.70 (only one installation)
Note 1: I found I could use Cloud Shell without a problem, but Cloud Shell has a timeout, so I prefer gcloud to Cloud Shell.
Note 2: When I use Cloud Shell, it connects as "tomotomo", not "hogehoge", which is the username when I use gcloud.
When I run "gcloud compute ssh VM_NAME --verbosity=debug --log-http"
it replies
DEBUG: SSH Known Hosts File [C:\Users\hogehoge\.ssh\google_compute_known_hosts] could not be opened: Unable to read file [C:\Users\hogehoge\.ssh\google_compute_known_hosts]: [Errno 2] No such file or directory: u'C:\\Users\\hogehoge\\.ssh\\google_compute_known_hosts'
DEBUG: Current SSH keys in project: [u'tomotomo:ssh-rsa AAAAB***
DEBUG: Running command [C:\Users\hogehoge\AppData\Local\Google\Cloud SDK\google-cloud-sdk\bin\sdk\putty.exe -t -i C:\Users\hogehoge\.ssh\google_compute_engine.ppk hogehoge@3*****].
DEBUG: Executing command: [u'C:\\Users\\hogehoge\\AppData\\Local\\Google\\Cloud SDK\\google-cloud-sdk\\bin\\sdk\\putty.exe', u'-t', u'-i', u'C:\\Users\\hogehoge\\.ssh\\google_compute_engine.ppk', u'hogehoge@3*****']
It was very long, so I extracted only what I think is important.
Running
putty -cleanup
solves this problem.
PuTTY saves some information in the registry (IP address, public key, and so on). This command removes those registry entries and the random seed file.
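A minimal sketch from PowerShell, assuming putty.exe is on your PATH (otherwise point at your PuTTY install, or the copy the Cloud SDK bundles under bin\sdk):
putty -cleanup
gcloud compute --project "first-medium-2****8" ssh --zone "us-east1-b" "instance-2"
PuTTY asks for confirmation before wiping its saved data, and the next gcloud connection should recreate the cached host key entries.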
Running "putty -cleanup" as per #redpawn fixed the issue.

view contents of directory in google cloud

Does anyone know how to view the contents of a directory in gcloud?
I ran
gcloud compute ssh --zone=us-west1-b cs231-vm
from PowerShell and connected to my instance.
I am trying to navigate like this:
cd cs231n/datasets
according to the tutorial here:
http://cs231n.github.io/assignments2018/assignment1/
But it says "no such file or directory", so I want to know what is in the current directory. I tried ls and dir but get nothing.
ls or dir definitely works on gcloud; it seems you might have missed a few steps of downloading the folder/data. Please check whether you have completed the first-time setup from http://cs231n.github.io/gce-tutorial/
You can also 'view gcloud command' by clicking the SSH dropdown on the VM instances page. Additionally, you can pass --project='project-name' to your gcloud ssh command.
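A couple of hedged checks you can run in the SSH session to see what is actually on the instance (the cs231n/datasets path comes from the tutorial, so it only exists if the setup/download step completed):
ls ~
find ~ -maxdepth 3 -type d -name datasets
The first lists everything in your home directory; the second searches for a datasets folder anywhere up to three levels below it.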

How to copy a file from a GCS bucket to my local machine

I need to copy files from Google Cloud Storage to my local machine.
I tried this command in the terminal of the Compute Engine instance:
$ sudo gsutil cp -r gs://mirror-bf /var/www/html/mydir
/var/www/html/mydir is the directory on my local machine.
I get this error:
CommandException: Destination URL must name a directory, bucket, or bucket subdirectory for the multiple source form of the cp command.
Where is the mistake?
You must first create the directory /var/www/html/mydir.
Then, you must run the gsutil command on your local machine and not in the Google Cloud Shell. The Cloud Shell runs on a remote machine and can't deal directly with your local directories.
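A minimal sketch of both steps, run on your local machine rather than in Cloud Shell:
sudo mkdir -p /var/www/html/mydir
sudo gsutil cp -r gs://mirror-bf /var/www/html/mydir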
I have had a similar problem and went through the painful process of having to figure it out too, so I thought I would provide my step-by-step solution (under Windows; hopefully it is similar for Unix users) and hope it helps others.
The first thing (as many others have pointed out in various Stack Overflow threads): you have to run a local console (in admin mode) for this to work, i.e. do not use the Cloud Shell terminal.
Here are the steps:
Assuming you already have Python installed on your machine, you will then need to install the gsutil python package using pip from your console:
pip install gsutil
You will then be able to run the gsutil config from that same console:
gsutil config
A .boto file needs to be created. It is needed to make sure you have permissions to access your storage.
Also note that you are now given a URL, which is needed in order to get the authorization code (prompted in the console).
Open a browser and paste this URL in, then:
Log in to your Google account (i.e. the account linked to your Google Cloud).
Google asks you to confirm that you want to give access to gsutil. Click Allow.
You will then be given an authorization code, which you can copy and paste into your console.
Finally, you are asked for a project ID:
Get the project ID of interest from your Google Cloud.
In order to find these IDs, click on "My First Project" in the Cloud Console.
You will then be shown a list of all your projects and their IDs.
Paste that ID into your console, hit Enter, and there you are! You have now created your .boto file. This should be all you need to be able to play with your Cloud Storage.
Console output:
Boto config file "C:\Users\xxxx\.boto" created. If you need to use a proxy to access the Internet please see the instructions in that file.
You will then be able to copy your files and folders from the cloud to your PC using the following gsutil command:
gsutil -m cp -r gs://myCloudFolderOfInterest/ "D:\MyDestinationFolder"
Files from within "myCloudFolderOfInterest" should then get copied to the destination "MyDestinationFolder" (on your local computer).
gsutil -m cp -r gs://bucketname/ "C:\Users\test"
I put a "r" before file path, i.e., r"C:\Users\test" and got the same error. So I removed the "r" and it worked for me.
Try with '.' as in ./var:
$ sudo gsutil cp -r gs://mirror-bf ./var/www/html/mydir
or maybe it is the problem below:
gsutil cp does not support copying special file types such as sockets, device files, named pipes, or any other non-standard files intended to represent an operating system resource. You should not run gsutil cp with sources that include such files (for example, recursively copying the root directory on Linux that includes /dev ). If you do, gsutil cp may fail or hang.
Source: https://cloud.google.com/storage/docs/gsutil/commands/cp
The syntax that worked for me when downloading to a Mac was:
gsutil cp -r gs://bucketname dir Dropbox/directoryname

AWS Ubuntu instance scp fails

I want to copy local files to AWS, and I used the following command, but it doesn't work. Can anyone help? etl is the folder I created and want to put all the files in.
scp -i C:/Users/Bonnie/Downloads/afinn.txt ubuntu@ec2-54-69-164-253.us-west-2.compute.amazonaws.com:/home/ubuntu/etl
Thanks!
Dan, you are not passing the files you intend to copy to the remote server.
scp -i key filestocopy user@server:/folder/destination/
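Applied to the command in the question, with a hypothetical key file mykey.pem (the -i flag takes your private key; the file you want to copy is a separate argument):
scp -i C:/Users/Bonnie/Downloads/mykey.pem C:/Users/Bonnie/Downloads/afinn.txt ubuntu@ec2-54-69-164-253.us-west-2.compute.amazonaws.com:/home/ubuntu/etl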