Copy files from Google Cloud Shell to GCP VM - google-cloud-platform

My development environment is within Google Cloud Shell, whereas the production environment is a separate GCP VM instance. After testing I would like to move all files from my dist folder to the GCP VM instance.
As a test I have tried to copy just one file.
Attempt #1: within Cloud Shell terminal
gcloud alpha cloud-shell scp cloudshell:~/README-cloudshell.txt peter@myvm:~/README-cloudshell.txt
Error #1: All sources must be local files when destination is remote.
Attempt #2: within GCP VM terminal
gcloud compute scp cloudshell:~/README-cloudshell.txt myvm:~
Error #2: All sources must be local files when destination is remote.
I appreciate any help.

from cloud shell
gcloud compute scp ~/README-cloudshell.txt user@instanceName:~
Use the correct SSH key, or set the correct permissions on the home directory.
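Since the original goal is to move the whole dist folder, the same command works recursively. A hedged sketch (peter, myvm and the zone are placeholder values; the script only prints the command so you can review it before running it from Cloud Shell):

```shell
#!/bin/sh
# Placeholders: adjust user, instance name and zone to your project.
USER=peter
VM=myvm
ZONE=us-central1-a

# --recurse copies the whole directory tree; run the printed command
# from Cloud Shell once the values look right.
CMD="gcloud compute scp --recurse ~/dist ${USER}@${VM}:~/dist --zone ${ZONE}"
echo "$CMD"
```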

Related

scp from inside a GCP VM to another server

I am trying to transfer the file samtools.tar.bz2 from inside a VM instance to another server outside GCP (username@server.uni.no). However, I get the following error:
gcloud beta compute scp --project "absolute-bison-xxxx" --zone "europe-west4-a" samtools.tar.bz2 username@server.uni.no:/home/local/user
error
ERROR: (gcloud.beta.compute.scp) Could not fetch resource: - Invalid value 'server.uni.no'. Values must match the following regular expression: '[a-z](?:[-a-z0-9]{0,61}[a-z0-9])?|[1-9][0-9]{0,19}'
Take a look at the documentation:
(BETA) gcloud beta compute scp securely copies files between a virtual machine instance and your local machine using the scp command.
(...)
When the destination of your transfer is local, all source files must be from the same virtual machine.
When the destination of your transfer is remote instead, all sources must be local.
It won't work with two remote machines, even if they are both in GCP.
A crude workaround is to download the file to your local machine:
gcloud beta compute scp --project "absolute-bison-xxxx" --zone "europe-west4-a" USERNAME@INSTANCE_NAME:~/samtools.tar.bz2 samtools.tar.bz2
and then upload it to the external server:
scp samtools.tar.bz2 username@server.uni.no:/home/local/user
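The two hops can also be collapsed into one pipeline by streaming the file through your local machine. A sketch under the same placeholder names as the answer above (the script only prints the pipeline so it can be reviewed before being run for real):

```shell
#!/bin/sh
# Placeholder names copied from the answer above; edit before running.
# The idea: cat the file on the VM, pipe it through the local machine,
# and write it on the external server in one go.
PIPELINE="gcloud compute ssh USERNAME@INSTANCE_NAME --zone europe-west4-a \
--command='cat ~/samtools.tar.bz2' \
| ssh username@server.uni.no 'cat > /home/local/user/samtools.tar.bz2'"
echo "$PIPELINE"
```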

Slurm cluster in Google cloud: Data in mounted directory in controller/login node not available in compute nodes

I have created a Slurm cluster following this tutorial. I have also created a data bucket that stores some data that needs to be accessed in the compute nodes. Since the compute nodes share the home directory of the login node, I mounted the bucket on my login node using gcsfuse. However, if I execute a simple script test.py that prints the contents of the mounted directory, it comes up empty, even though the folder exists and the Python file is there.
Is there something that I have to specify in the yaml configuration file that enables having access to the mounted directory?
I have written down the steps that I have taken in order to mount the directory:
When creating the Slurm cluster using
gcloud deployment-manager deployments create google1 --config slurm-cluster.yaml
it is important that the node that should mount the storage directory has sufficient permissions.
Uncomment/add the following in the slurm-cluster.yaml file if your login node should mount the data. (Do the same with the controller node instead if you prefer.)
login_node_scopes:
- https://www.googleapis.com/auth/devstorage.read_write
Next, log in to the login node and install gcsfuse. Once gcsfuse is installed, you can mount the bucket using the following command:
gcsfuse --implicit-dirs <BUCKET-NAME> target/folder/
Note that the service account attached to your VM must have access rights on the bucket. You can find the name of the service account in the details of your VM in the Cloud Console, or by running the following command on the VM:
gcloud auth list
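Putting the mount steps together, a minimal sketch (bucket and mount-point names are placeholders, and the script only prints the commands instead of executing them on a real VM):

```shell
#!/bin/sh
BUCKET=my-data-bucket   # placeholder: your bucket name
MOUNT_POINT=~/data      # placeholder: where to mount it

# 1. confirm the VM's service account can see the bucket,
# 2. mount with --implicit-dirs so nested folders appear,
# 3. list the mount point to confirm the data is visible.
for CMD in \
  "gsutil ls gs://$BUCKET" \
  "gcsfuse --implicit-dirs $BUCKET $MOUNT_POINT" \
  "ls $MOUNT_POINT"
do
  echo "$CMD"
done
```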
I've just got a similar setup working. I don't have a definite answer to why yours isn't, but a few notes:
gcsfuse is installed by default; there is no need to install it explicitly.
You need to wait for the Slurm install to be fully finished before the bucket is available.
The "devstorage.read_write" appears to be needed.
I have the following under the login_machine_type in the yaml file:
network_storage:
- server_ip: none
remote_mount: mybucket
local_mount: /data
fs_type: gcsfuse
mount_options: file_mode=664,dir_mode=775,allow_other

Connect to particular GCP account

I have been using the GCP console to connect to a cloud instance and want to switch to using SSH through PowerShell, as that seems to maintain a longer-lasting session. Transferring my public key through Cloud Shell into the authorized_keys file seems to be temporary, since once Cloud Shell disconnects, the file doesn't persist. I've tried using OS Login, but that generates a completely different user from what I've been using through Cloud Shell (Cloud Shell creates the user myname, while gcloud creates the user myname_domain_com). Is there a way to continue using the same profile created by Cloud Shell when logging in through gcloud? I am using the same email and account, myname@domain.com, in both the console and gcloud. The alternative is to start all over from gcloud, and that would be a pain.
If you want to SSH to different instances of a google cloud project (from a mac or Linux), do the following:
Step 1. Install SSH keys without password
Use the following command to generate the keys on your mac (the comment after -C should be the username you want to log in as):
ssh-keygen -t rsa -f ~/.ssh/private-key-name -C username
For example, private-key-name can be bpa-ssh-key. This creates two files with the following names in the ~/.ssh directory:
bpa-ssh-key
bpa-ssh-key.pub
Step 2. Update the public key on your GCP project
Go to the Google Cloud Console, choose your project, then
VM Instances -> Metadata -> SSH Keys -> Edit -> Add Item
Cut and paste the contents of bpa-ssh-key.pub (from your mac) here, then save.
Reset the VM instance if it is running.
Step 3. Edit the config file under ~/.ssh on your mac. Add the following lines to ~/.ssh/config if they are not present already:
Host *
PubKeyAuthentication yes
IdentityFile ~/.ssh/bpa-ssh-key
Step 4. SSHing to GCP Instance
ssh username@gcloud-externalip
It should open an SSH session on the gcloud instance without asking for a password (since you created the RSA/SSH keys without a passphrase).
Since metadata is common across all instances under the same project, you can seamlessly SSH into any of the instances by choosing the respective external IP of the gcloud instance.
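As an alternative to the wildcard Host * entry, each instance can get its own alias in ~/.ssh/config so that a plain ssh myvm works; a sketch with placeholder values (the IP and the alias are made up):

```
# Placeholders: the instance's external IP, and the username that
# matches your metadata SSH key entry.
Host myvm
    HostName 203.0.113.10
    User username
    IdentityFile ~/.ssh/bpa-ssh-key
    PubKeyAuthentication yes
```

With this in place, ssh myvm picks up the host, user, and key automatically.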

Cannot Transfer files from my mac to VM instance on GCP

I have managed to set up a VM instance on Google cloud platform using the following instructions:
https://towardsdatascience.com/running-jupyter-notebook-in-google-cloud-platform-in-15-min-61e16da34d52
I am then able to run a Jupyter notebook as per the instructions.
Now I want to be able to use my own data in the notebook... this is where I am really struggling. I downloaded the Cloud SDK onto my mac and ran this from the terminal (as per https://cloud.google.com/compute/docs/instances/transfer-files)
My-MacBook-Air:~ me$ gcloud compute scp /Users/me/Desktop/my_data.csv aml-test:~/amlfolder
where aml-test is the name of my instance and amlfolder a folder I created on the VM instance. I don't get any error messages and it seems to work (the terminal displays the following after I run it >> 100% 66MB 1.0MB/s 01:03 )
However when I connect to my VM instance via the SSH button on the google console and type
cd amlfolder
ls
I cannot see any files! (nor can I see them from the jupyter notebook homepage)
I cannot figure out how to use my own data in a python jupyter notebook on a GCP VM instance. I have been trying/googling for an entire day. As you might have guessed, I'm a complete newbie to GCP (and cd, ls and mkdir are the extent of my linux command knowledge!)
I also tried using Google Cloud Storage - I uploaded the data into a google storage bucket (as per https://cloud.google.com/compute/docs/instances/transfer-files) but don't know how to complete the last step '4. On your instance, download files from the bucket.'
If anyone can figure out what I am doing wrong, or knows an easier way to get my own data into a python jupyter notebook on GCP than the gcloud scp command, please help!
Definitely try writing
pwd
to verify you're in the path you think you are; there's a chance that your scp command and the console SSH command log in as different users.
To copy data from a bucket to the instance, do
gsutil cp gs://bucket-name/your-file .
As you can see in the gcloud compute docs, gcloud compute scp /Users/me/Desktop/my_data.csv aml-test:~/amlfolder will use your local environment username, so the tilde in your command refers to the home directory of a user with the same name as your local one.
But when you SSH from the browser, as you can see from the docs, your Gmail username will be used.
So, you should check the home directory of the user used by gcloud compute scp ... command.
The easiest way to check: SSH to your VM and run
ls /home/ --recursive
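To see which of those home directories actually received the file, find can search them all at once. A runnable demo of the idea against a throwaway directory with made-up usernames (on the real VM you would search /home itself, with sudo if another user's home is not readable):

```shell
#!/bin/sh
# Build a throwaway stand-in for /home with two user directories;
# the file only exists under one of them.
DEMO=$(mktemp -d)
mkdir -p "$DEMO/home/me" "$DEMO/home/me_domain_com"
touch "$DEMO/home/me_domain_com/my_data.csv"

# On the VM the equivalent is: sudo find /home -name my_data.csv
FOUND=$(find "$DEMO/home" -name my_data.csv)
echo "$FOUND"
```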

Copy files from remote server to google cloud bucket

How would I copy files from a remote server to a google bucket? For example,
gcloud compute scp username#server:/path/to/file gs://my-bucket
This method gives the error: All sources must be local files when destination is remote.
Additionally, gsutil only provides support for cp and not scp.
Thanks in advance!
You can also directly execute gsutil command on your GCE VM (Most VM images have Cloud SDK preinstalled). For example:
gcloud compute ssh user@server --zone my_zone \
--command='gsutil cp path/to/my_file gs://MY_BUCKET'
Note that for this to work your service account associated with VM must have appropriate access scope to GCS. If you run
gcloud beta compute instances describe my_instance --zone my_zone \
--format="value(serviceAccounts.scopes)"
It will show the list of scopes set for the VM's service account. Make sure you have https://www.googleapis.com/auth/cloud-platform, https://www.googleapis.com/auth/devstorage.full_control, or https://www.googleapis.com/auth/devstorage.read_write. If not, you can use the set-scopes beta command to reset them, or go to the console and edit the VM in question.
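For reference, the set-scopes route looks roughly like this; the instance and zone names are placeholders, the instance must be stopped first, and the script only prints the commands rather than running them:

```shell
#!/bin/sh
INSTANCE=my_instance   # placeholder
ZONE=my_zone           # placeholder

# Scopes can only be changed while the instance is stopped;
# storage-rw is gcloud's alias for devstorage.read_write.
for CMD in \
  "gcloud compute instances stop $INSTANCE --zone $ZONE" \
  "gcloud beta compute instances set-scopes $INSTANCE --zone $ZONE --scopes=storage-rw" \
  "gcloud compute instances start $INSTANCE --zone $ZONE"
do
  echo "$CMD"
done
```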
gsutil supports streaming data, so you can try to pipe the data in.
Not 100% certain this works with arbitrary binary data (and I'm on a phone so I can't test it)
gcloud compute ssh username@server --command='cat /path/to/file' | gsutil cp - gs://my-bucket