scp from inside a GCP VM to another server - google-cloud-platform

I am trying to transfer the file samtools.tar.bz2 from inside a VM instance to a server outside GCP, username@server.uni.no. I run:
gcloud beta compute scp --project "absolute-bison-xxxx" --zone "europe-west4-a" samtools.tar.bz2 username@server.uni.no:/home/local/user
However, I get the following error:
ERROR: (gcloud.beta.compute.scp) Could not fetch resource: - Invalid value 'server.uni.no'. Values must match the following regular expression: '[a-z](?:[-a-z0-9]{0,61}[a-z0-9])?|[1-9][0-9]{0,19}'
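The error comes from gcloud interpreting server.uni.no as a GCE instance name and validating it against the instance-name pattern quoted in the message. As a quick local check (a sketch using grep, not part of gcloud; the non-capturing group `(?:...)` is rewritten as a plain group for grep -E, and the names are made up), you can see why a dotted hostname fails:

```shell
# The pattern from the error message, anchored for a full match.
# Dots are not in the allowed character set, so an external hostname
# such as server.uni.no can never match.
pattern='^([a-z]([-a-z0-9]{0,61}[a-z0-9])?|[1-9][0-9]{0,19})$'

check() {
  if printf '%s' "$1" | grep -Eq "$pattern"; then
    echo "$1: valid instance name"
  else
    echo "$1: rejected"
  fi
}

check "server.uni.no"   # rejected (contains dots)
check "my-instance-1"   # valid instance name
```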

Take a look at the documentation:
(BETA) gcloud beta compute scp securely copies files between a virtual machine instance and your local machine using the scp command.
(...)
When the destination of your transfer is local, all source files must be from the same virtual machine.
When the destination of your transfer is remote instead, all sources must be local.
It won't work with two remote machines, even if they are both in GCP.
A crude workaround is to download the file to your local machine:
gcloud beta compute scp --project "absolute-bison-xxxx" --zone "europe-west4-a" USERNAME@INSTANCE_NAME:~/samtools.tar.bz2 samtools.tar.bz2
and then upload it to the external server:
scp samtools.tar.bz2 username@server.uni.no:/home/local/user
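If you would rather not keep a local copy, the same two hops can be collapsed into one streaming pipe: a producer whose stdout is the file's bytes, piped into a consumer that writes stdin to the destination. In practice the producer would be `gcloud compute ssh USERNAME@INSTANCE_NAME --command='cat ~/samtools.tar.bz2'` and the consumer `ssh username@server.uni.no 'cat > /home/local/user/samtools.tar.bz2'` (an untested sketch assuming SSH access to both ends). A minimal local demonstration of the shape, using plain files:

```shell
# Local stand-ins: the source file plays the VM-side file, and the
# consumer subshell plays the remote ssh session writing to disk.
printf 'pretend this is samtools.tar.bz2' > /tmp/vm_side_file

cat /tmp/vm_side_file | sh -c 'cat > /tmp/server_side_file'

# Verify the bytes arrived unchanged.
cmp -s /tmp/vm_side_file /tmp/server_side_file && echo "transfer intact"
```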

Related

Cannot create a TPU inside of a GCP VM

So, I created a GCP Compute optimized VM and gave it full access to all cloud apis as well as full HTTP and HTTPS traffic access. I now want to create a TPU from inside this VM i.e. run the following command:
gcloud compute tpus create node-1 --zone us-central1-a --project $PROJECT_NAME --version 2.5.0 --accelerator-type v3-8 --no-async
and it constantly errors with:
ERROR: (gcloud.compute.tpus.create) PERMISSION_DENIED: Permission 'tpu.nodes.create' denied on 'projects/$PROJECT_NAME/locations/us-central1-a/nodes/node-1'
I only ever get this error in the VM; when I run this command on my local machine with my local install of gcloud, everything works fine. It is really weird because other commands like gcloud list and gsutil work fine, but creating TPUs doesn't. I even tried adding a service account into ~/.credentials and setting it in my bashrc:
export GOOGLE_APPLICATION_CREDENTIALS=$HOME/.credentials/service-account.googleapis.com.json
but this doesn't solve the problem. I also tried with execution groups:
gcloud compute tpus execution-groups create --name=node-1 --zone=us-central1-a --tf-version=2.5.0 --accelerator-type=v3-8 --tpu-only --project $PROJECT_NAME
but this also fails.
There are two likely reasons for the Permission denied error:
The service account does not have "Allow full access to all Cloud APIs" enabled.
The account does not have the TPU Admin role.
I tried to create a TPU using your command and got the same error before modifying the service account. Here is the output after the fix, showing the TPU being created:
$ gcloud compute tpus create node-1 --zone us-central1-a --project $PROJECT_NAME --version 2.5.0 --accelerator-type v3-8 --no-async
Create request issued for: [node-1]
Waiting for operation [projects/project-id/locations/us-central1-a/operations/operation-1634780772429-5ced30f39edf6-105ccd39-96d571fa] to complete...done.
Created tpu [node-1].
Try creating the TPU again after following these instructions:
a. Make sure the TPU API is enabled.
b. Go to the VM instances page and stop the VM before editing its service account.
c. Refresh the VM instance page and click Edit.
d. At the bottom of the instance details page, select the Compute Engine service account and "Allow full access to all Cloud APIs", then Save.
(As recommended by @John Hanley)
e. On your instance page, check and note your service account.
f. Go to the IAM page, look for that service account, and click Edit.
g. Click Add Role, select TPU Admin, and Save.
h. Start your VM instance and SSH into it.
i. Run this command:
gcloud compute tpus create node-1 --zone us-central1-a --project $PROJECT_NAME --version 2.5.0 --accelerator-type v3-8 --no-async
I encountered an error at first because a TPU already existed in the zone I chose. Make sure a TPU with the same name has not already been created in that zone.
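After step g you can confirm the binding actually exists. `gcloud projects get-iam-policy PROJECT_ID --format=json` prints the project's IAM policy as JSON; the TPU Admin role's identifier in it is roles/tpu.admin. A sketch against a fabricated sample policy (the service account below is made up):

```shell
# Fabricated sample of what get-iam-policy emits; in practice, pipe the
# real command's output into the grep below instead.
cat > /tmp/policy.json <<'EOF'
{
  "bindings": [
    {
      "role": "roles/tpu.admin",
      "members": ["serviceAccount:123456789-compute@developer.gserviceaccount.com"]
    }
  ]
}
EOF

# Look for the TPU Admin binding.
grep -q '"roles/tpu.admin"' /tmp/policy.json && echo "TPU Admin bound"
```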

Copy files from Google Cloud Shell to GCP VM

My development environment is within Google Cloud Shell, whereas the production environment is a separate GCP VM instance. After testing I would like to move all files from my dist folder to the GCP VM instance.
As a test I have tried to copy just one file.
Attempt #1: within Cloud Shell terminal
gcloud alpha cloud-shell scp cloudshell:~/README-cloudshell.txt peter@myvm:~/README-cloudshell.txt
Error #1: All sources must be local files when destination is remote.
Attempt #2: within GCP VM terminal
gcloud compute scp cloudshell:~/README-cloudshell.txt myvm:~
Error #2: All sources must be local files when destination is remote.
I appreciate any help.
From Cloud Shell:
gcloud compute scp ~/README-cloudshell.txt user@instanceName:~
Use the correct SSH key, or set the correct permissions on the ~ directory.
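On the SSH-key point: ssh refuses to use a private key whose permissions are too open (group- or world-readable). For gcloud the key is typically ~/.ssh/google_compute_engine. A sketch with a throwaway file showing the permissions to aim for (GNU stat assumed):

```shell
# Stand-in for the real key file; chmod 600 = owner read/write only,
# which is what ssh requires before it will use a private key.
mkdir -p /tmp/demo_ssh
touch /tmp/demo_ssh/google_compute_engine
chmod 600 /tmp/demo_ssh/google_compute_engine

stat -c '%a' /tmp/demo_ssh/google_compute_engine   # prints 600
```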

Copy files from one Linux VM instance to another in the same project on Google Cloud Platform

I am new to Google Cloud. I have seen a similar question but I couldn't understand the answer. It would be great if someone could give easy instructions to tackle this problem.
I have two linux VM instances under same project on google cloud. I want to copy files from one VM to other VM.
I tried the copy-files command. It threw the error "deprecated, use scp instead".
I tried "gcloud compute scp user@vm2_instance_name:vm2_instance_file_path".
Other answers say to use a "service account". I read about them, created one, and created a key in .json format as well, but I am not sure what to do after that. I appreciate any suggestions.
If you are working from inside one instance, don't worry about Google Cloud: simply use scp to copy the file from that VM to the other one.
If you don't have customized users on the VMs, you can omit the user part:
scp <my local file path> <vm name>:<destination path>
About service accounts: if your VMs are in Google Cloud, they have the Compute Engine default service account, <projectNumber>-compute@developer.gserviceaccount.com. You can customize this service account if you want. This service account is what identifies the VM when it performs API calls or gcloud commands.
Google's documentation addresses this. Personally, I have always preferred gcloud compute scp, as it provides a simple way of performing transfers while not necessarily taking away any of the complexity and features that other transfer options provide.
In any case, in that documentation you will most likely find the method that is most in line with what you want.
This is the solution that worked for me:
1. gcloud compute instances list
NAME        ZONE           MACHINE_TYPE   PREEMPTIBLE  INTERNAL_IP    EXTERNAL_IP    STATUS
instance-1  us-central1-a  n2-standard-8               10.128.0.60    34.66.177.187  RUNNING
instance-2  us-central1-a  n1-standard-1               10.128.15.192  34.69.216.153  STAGING
2. gcloud compute ssh instance-1 --zone=us-central1-a
3. user@instance-1:~$ ls
myfile
4. user@instance-1:~$ gcloud compute scp myfile user@instance-2:myfile
5. gcloud compute ssh instance-2 --zone=us-central1-a
6. user@instance-2:~$ ls
myfile

Copy folders from a GCP VM to a local machine

I have seen some questions here, but none worked for me; I think it is a syntax error.
In my instance:
ss@si:~/public_html/msite.com$
I have some folders:
folder1 folder2 folder3
I need to copy all the folders to my local machine, so I run the command on my local machine:
$ gcloud init
After answering some questions I get back to the prompt, and try to run:
gcloud compute scp --recurse ss@si:~/public_html/msite.com/* magicnt@debian:/var/www/html/local-folder/ --zone southamerica-east1-a --project xxxxxxxxxxxxxxxx
And get error:
ERROR: (gcloud.compute.scp) All sources must be local files when destination is remote.
So my doubt is: to copy from the VM to my local machine, do I run the command on the VM or on my local machine?
I believe you need to specify the destination without a machine name, just a path to a local folder.
Not a
magicnt@debian:/var/www/html/local-folder/
but
/var/www/html/local-folder/
Then your complete command should be:
gcloud compute scp --recurse ss@si:~/public_html/msite.com/* /var/www/html/local-folder/ --zone southamerica-east1-a --project xxxxxxxxxxxxxxxx
See "gcloud compute scp" help example:
To copy a remote directory, ~/narnia, from example-instance to the
~/wardrobe directory of your local host, run:
$ gcloud compute scp --recurse example-instance:~/narnia ~/wardrobe
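The --recurse flag behaves like scp -r: the source can be a directory tree rather than a single file. A local analog of that help example, with cp -r standing in for `gcloud compute scp --recurse` and made-up paths:

```shell
# Build a small tree (stand-in for example-instance:~/narnia).
rm -rf /tmp/narnia /tmp/wardrobe
mkdir -p /tmp/narnia/inner
echo 'lamp post' > /tmp/narnia/inner/landmark.txt

# Stand-in for: gcloud compute scp --recurse example-instance:~/narnia ~/wardrobe
cp -r /tmp/narnia /tmp/wardrobe

# Verify the whole tree was copied.
diff -r /tmp/narnia /tmp/wardrobe && echo "trees match"
```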

Copy files from remote server to google cloud bucket

How would I copy files from a remote server to a google bucket? For example,
gcloud compute scp username@server:/path/to/file gs://my-bucket
This method gives the error: All sources must be local files when destination is remote.
Additionally, gsutil only provides support for cp and not scp.
Thanks in advance!
You can also execute a gsutil command directly on your GCE VM (most VM images have the Cloud SDK preinstalled). For example:
gcloud compute ssh user@server --zone my_zone \
  --command='gsutil cp path/to/my_file gs://MY_BUCKET'
Note that for this to work, the service account associated with the VM must have the appropriate access scope for GCS. If you run
gcloud beta compute instances describe my_instance --zone my_zone \
  --format="value(serviceAccounts.scopes)"
it will show the list of scopes set for the VM's service account. Make sure you have https://www.googleapis.com/auth/cloud-platform, https://www.googleapis.com/auth/devstorage.full_control, or https://www.googleapis.com/auth/devstorage.read_write. If not, you can use the set-scopes beta command to reset them, or go to the console and edit the VM in question.
gsutil supports streaming data, so you can try to pipe the data in.
I'm not 100% certain this works with arbitrary binary data (I'm on a phone, so I can't test it):
gcloud compute ssh username@server --command='cat /path/to/file' | gsutil cp - gs://my-bucket
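The trailing `-` tells gsutil cp to read the object's bytes from stdin. A local sketch showing that the shell pipe itself is binary-safe, with random bytes and plain cat standing in for gsutil (ssh's behavior is not exercised here):

```shell
# 1 KiB of random bytes stands in for an arbitrary binary file.
head -c 1024 /dev/urandom > /tmp/remote_src

# Producer | consumer reading "-" (stdin), as gsutil cp - does.
cat /tmp/remote_src | sh -c 'cat - > /tmp/bucket_obj'

cmp -s /tmp/remote_src /tmp/bucket_obj && echo "binary-safe through the pipe"
```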