Copy folders from GCP VM to local machine - google-cloud-platform

I have seen some questions about this here, but none of them work for me; I think it is a syntax error.
On my instance:
ss@si:~/public_html/msite.com$
I have some folders:
folder1 folder2 folder3
I need to copy all of these folders to my local machine, so I try running the command on my local machine:
$ gcloud init
After I answer some questions I am back at the prompt, and then I try to run:
gcloud compute scp --recurse ss@si:~/public_html/msite.com/* magicnt@debian:/var/www/html/local-folder/ --zone southamerica-east1-a --project xxxxxxxxxxxxxxxx
And get error:
ERROR: (gcloud.compute.scp) All sources must be local files when destination is remote.
So my question is:
To copy from the VM to my local machine, do I need to run the command on the VM or on my local machine?

I believe you need to specify the destination without a machine name, just the path to the local folder.
Not a
magicnt@debian:/var/www/html/local-folder/
but
/var/www/html/local-folder/
Then your complete command should be:
gcloud compute scp --recurse ss@si:~/public_html/msite.com/* /var/www/html/local-folder/ --zone southamerica-east1-a --project xxxxxxxxxxxxxxxx
See "gcloud compute scp" help example:
To copy a remote directory, ~/narnia, from example-instance to the
~/wardrobe directory of your local host, run:
$ gcloud compute scp --recurse example-instance:~/narnia ~/wardrobe
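If the wildcard gives you trouble (the remote shell has to expand the *), a variant worth trying is to copy the directory itself and let --recurse pick up its contents. A sketch, reusing the user, instance, zone and project names from the question:
gcloud compute scp --recurse ss@si:~/public_html/msite.com /var/www/html/local-folder/ --zone southamerica-east1-a --project xxxxxxxxxxxxxxxx
With this form, folder1, folder2 and folder3 land under /var/www/html/local-folder/msite.com/ rather than directly in local-folder/.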

Related

scp from inside a GCP VM to another server

I am trying to transfer the file samtools.tar.bz2 from inside a VM instance to another server outside GCP, username@server.uni.no. I run:
gcloud beta compute scp --project "absolute-bison-xxxx" --zone "europe-west4-a" samtools.tar.bz2 username@server.uni.no:/home/local/user
and get the following error:
ERROR: (gcloud.beta.compute.scp) Could not fetch resource: - Invalid value 'server.uni.no'. Values must match the following regular expression: '[a-z](?:[-a-z0-9]{0,61}[a-z0-9])?|[1-9][0-9]{0,19}'
Take a look at the documentation:
(BETA) gcloud beta compute scp securely copies files between a virtual machine instance and your local machine using the scp command.
(...)
When the destination of your transfer is local, all source files must be from the same virtual machine.
When the destination of your transfer is remote instead, all sources must be local.
It won't work with two remote machines, even if they are both in GCP.
A crude workaround is to download the file to your local machine:
gcloud beta compute scp --project "absolute-bison-xxxx" --zone "europe-west4-a" USERNAME@INSTANCE_NAME:~/samtools.tar.bz2 samtools.tar.bz2
and then upload it to the external server:
scp samtools.tar.bz2 username@server.uni.no:/home/local/user
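If the file is large and you would rather not stage it on your workstation's disk, a rough alternative is to stream it through your local machine instead. This is only a sketch, reusing the placeholder names above and assuming you can reach both the instance and server.uni.no from your machine:
gcloud compute ssh --project "absolute-bison-xxxx" --zone "europe-west4-a" USERNAME@INSTANCE_NAME --command='cat ~/samtools.tar.bz2' | ssh username@server.uni.no 'cat > /home/local/user/samtools.tar.bz2'
The bytes still pass through your machine, but they never touch your local disk.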

Persistent disk missing when I SSH into GCP VM instance with Jupyter port forwarding

I have created a VM instance on Google Cloud, and also set up a Notebook instance. In this instance, I have a bunch of notebooks, python modules as well as a lot of data.
I want to run a script on my VM instance by using the terminal. I tried running it in a Jupyter Notebook, but it failed several hours in and crashed the notebook. I decided to try from the command line instead. However, when I used the commands found in the docs to ssh into my instance:
gcloud beta compute ssh --zone "<Zone>" "<Instance Name>" --project "<Project-ID>"
or
gcloud compute ssh --project <Project-ID> --zone <Zone> <Instance Name>
or
gcloud compute ssh --project $PROJECT_ID --zone $ZONE $INSTANCE_NAME -- -L 8080:localhost:8080
I successfully connect to the instance, but my files seem to be missing: I can't find my notebooks or scripts. The only way I can see those files is when I use the GUI and select 'Open Jupyter Lab' from the AI Platform > Notebooks console.
How do I access the VM through the command line so that I can still see my "persistent disk" that is associated with this VM instance?
I found the answer on the fast.ai getting started page: you have to specify the username as jupyter in the ssh command.
Solution 1: Default Zone and Project Configured:
gcloud compute ssh jupyter@<instance name>
or if you want to use port forwarding to have access to your notebook:
gcloud compute ssh jupyter@<instance name> -- -L 8080:localhost:8080
Solution 2: No Default Zone or Project:
Note that I left out the zone and project id from both of these commands. They are not necessary if you set a default zone and project during your initial gcloud init stage. If you did not do this, then the commands become:
gcloud compute ssh --project <project ID> --zone <zone> jupyter@<instance name>
or if you want to use port forwarding to run a notebook:
gcloud compute ssh --zone <zone> jupyter@<instance name> -- -L 8080:localhost:8080
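Once you are connected as jupyter, the notebooks and data you normally see in the Jupyter Lab UI should be under /home/jupyter. A quick check, using the same placeholder names:
gcloud compute ssh --zone <zone> jupyter@<instance name>
ls ~    # should list the notebooks, modules and data from the Jupyter Lab file browser
With the port-forwarding variant (-- -L 8080:localhost:8080) you can also open http://localhost:8080 in a local browser to reach the notebook server while the tunnel is up.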

Cannot scp a file into gcloud: Permission denied (publickey)

I'm trying to copy a file into a compute instance using gcloud like this:
gcloud compute scp ./file.txt.bz2 root@instance-1:/home/mehran/
And I'm facing an error:
No zone specified. Using zone [us-central1-c] for instance: [instance-1].
root@###.###.###.###: Permission denied (publickey).
lost connection
ERROR: (gcloud.compute.scp) [/usr/bin/scp] exited with return code [1].
I've also tested without root:
gcloud compute scp ./file.txt.bz2 instance-1:/home/mehran/
And facing the same error. This is in the case that I can ssh into the same instance without any issue:
gcloud compute ssh instance-1
Apparently, my user does not have write permission on that home folder! Even a chmod 777 did not help. I ended up copying to the /tmp folder and then picking it up from there.
For some reason, the ssh (web-based) and scp were using different users. That's why my scp user didn't have access to that folder. It was trying to write into another user's home folder.
Even though this post seems solved, I had a similar issue: I could SSH into the instance but could not scp. Turns out the folder on my instance did not have read/write permissions so I could not copy (write) into that folder.
Give permissions:
chmod 777 <folder-name>
Copy file/folder to folder in instance:
sudo gcloud compute scp <file-to-copy> <username>@<instance-name>:<folder-to-copy-to-which-requires-permissions> --zone <zone> --project <project-id>
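To see which account gcloud actually connects as, and whether it can write to the target folder, a quick check from your workstation (a sketch, reusing the instance name and paths from the question):
gcloud compute ssh instance-1 --command='whoami; ls -ld /home/mehran /tmp'
If whoami prints something other than mehran, scp is writing as a different user, which matches the explanation above.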

How to use gcloud compute scp?

I want to download this file
to my computer, but I don't understand the command
gcloud compute scp --recurse [INSTANCE_NAME]:[REMOTE_DIR] [LOCAL_DIR]
Is the instance the name of my file?
From https://cloud.google.com/sdk/gcloud/reference/compute/scp
[INSTANCE_NAME] refers to the name of the target instance that you are transferring files to. The following example is taken from the above reference.
To copy a remote directory, ~/narnia, from example-instance
to the ~/wardrobe directory of your local host, run:
$ gcloud compute scp --recurse example-instance:~/narnia ~/wardrobe
Edit:
Edit: After looking a bit closer at your image, it looks like you're trying to copy files from Cloud Storage. If you're looking to copy files from a Cloud Storage bucket, you will need to use the gsutil CLI tool. To copy a file from a bucket you'll need to use the gsutil cp command.
See the following for more information and examples:
https://cloud.google.com/storage/docs/gsutil/commands/cp
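For example, a couple of hedged gsutil invocations (the bucket, object and directory names here are placeholders):
gsutil cp gs://my-bucket/path/to/file.txt .        # copy a single object into the current directory
gsutil -m cp -r gs://my-bucket/some-directory .    # -r copies a whole directory tree, -m parallelizes the transfer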

Copy files from remote server to google cloud bucket

How would I copy files from a remote server to a google bucket? For example,
gcloud compute scp username@server:/path/to/file gs://my-bucket
This method gives the error: All sources must be local files when destination is remote.
Additionally, gsutil only provides support for cp and not scp.
Thanks in advance!
You can also directly execute the gsutil command on your GCE VM (most VM images have the Cloud SDK preinstalled). For example:
gcloud compute ssh user@server --zone my_zone \
--command='gsutil cp path/to/my_file gs://MY_BUCKET'
Note that for this to work, the service account associated with the VM must have an appropriate access scope for GCS. If you run:
gcloud beta compute instances describe my_instance --zone my_zone \
--format="value(serviceAccounts.scopes)"
it will show the list of scopes set for the VM's service account. Make sure you have https://www.googleapis.com/auth/cloud-platform or https://www.googleapis.com/auth/devstorage.full_control or https://www.googleapis.com/auth/devstorage.read_write. If not, you can use the set-scopes beta command to reset them, or go to the console and edit the VM in question.
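For example, a sketch of resetting the scopes with the beta command, reusing the instance and zone names above (the VM has to be stopped before its scopes can be changed):
gcloud beta compute instances stop my_instance --zone my_zone
gcloud beta compute instances set-scopes my_instance --zone my_zone --scopes=storage-rw
gcloud beta compute instances start my_instance --zone my_zone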
gsutil supports streaming data, so you can try to pipe the data in.
Not 100% certain this works with arbitrary binary data (and I'm on a phone so I can't test it)
gcloud compute ssh username@server --command='cat /path/to/file' | gsutil cp - gs://my-bucket
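One caveat, hedged since I have not verified it here: when gsutil reads from stdin, the destination generally has to name an object rather than just a bucket, so the pipe would look something like:
gcloud compute ssh username@server --command='cat /path/to/file' | gsutil cp - gs://my-bucket/my_file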