How to use gcloud compute scp? - google-cloud-platform

I want to download this file
to my computer, but I don't understand the command
gcloud compute scp --recurse [INSTANCE_NAME]:[REMOTE_DIR] [LOCAL_DIR]
Is [INSTANCE_NAME] the name of my file?

From https://cloud.google.com/sdk/gcloud/reference/compute/scp
[INSTANCE_NAME] refers to the name of the instance that you are transferring files to or from. The following example is taken from the reference above.
To copy a remote directory, ~/narnia, from example-instance
to the ~/wardrobe directory of your local host, run:
$ gcloud compute scp --recurse example-instance:~/narnia ~/wardrobe
Edit:
After looking a bit closer at your image, it looks like you're trying to copy files from Cloud Storage. If so, you will need to use the gsutil CLI tool; to copy a file from a bucket, use the gsutil cp command.
See the following for more information and examples:
https://cloud.google.com/storage/docs/gsutil/commands/cp
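For example, a minimal sketch (the bucket name my-bucket and object photo.jpg are placeholders for your own):

```shell
# Download a single object from a Cloud Storage bucket into the current directory
gsutil cp gs://my-bucket/photo.jpg .
```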

Related

Copy files from Google Cloud Shell to GCP VM

My development environment is within Google Cloud Shell, whereas the production environment is a separate GCP VM instance. After testing I would like to move all files from my dist folder to the GCP VM instance.
As a test I have tried to copy just one file.
Attempt #1: within Cloud Shell terminal
gcloud alpha cloud-shell scp cloudshell:~/README-cloudshell.txt peter@myvm:~/README-cloudshell.txt
Error #1: All sources must be local files when destination is remote.
Attempt #2: within GCP VM terminal
gcloud compute scp cloudshell:~/README-cloudshell.txt myvm:~
Error #2: All sources must be local files when destination is remote.
I appreciate any help.
from cloud shell
gcloud compute scp ~/README-cloudshell.txt user@instanceName:~
Use the correct SSH key, or fix the permissions on your home directory.
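A fuller sketch run from the Cloud Shell terminal (the user, instance name, and zone are placeholders; use the zone your VM is in):

```shell
# Source is a local file in Cloud Shell; destination is the remote VM's home directory
gcloud compute scp ~/README-cloudshell.txt peter@myvm:~ --zone us-central1-a
```

The key point is that gcloud compute scp is run from the machine holding the local side of the copy, so the remote path is the destination.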

How to copy file from google cloud bucket to google cloud vm instance?

How can I copy google cloud bucket files to a google cloud vm instance?
I tried this command:
gsutil cp gs://xxx.com/photo.tar.gz myvm-vm:/var/www/html/
But it gives me this error:
Copying gs://xxx.com/photo.tar.gz... Skipping attempt to download to
filename ending with slash (myvm-vm:/var/www/html/). This typically
happens when using gsutil to download from a subdirectory created by
the Cloud Console (https://cloud.google.com/console)
Operation completed over 1 objects/11.2 GiB.
The error means that gsutil is treating myvm-vm:/var/www/html/ as a local filename ending with a slash; gsutil cannot copy directly to another machine that way. Instead, SSH into your Compute Engine instance (it needs at least "Storage Read Only" access) and then run the command
sudo gsutil cp gs://<YOUR_BUCKET_NAME>/<YOUR_FILE_NAME> /var/www/html/
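The SSH step and the copy can also be combined into a single command run from your workstation; a sketch, assuming the instance is called myvm-vm in zone us-central1-a (both placeholders):

```shell
# SSH into the instance and run the gsutil copy there in one step
gcloud compute ssh myvm-vm --zone us-central1-a \
  --command='sudo gsutil cp gs://xxx.com/photo.tar.gz /var/www/html/'
```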

Copy folders from VM gcp to local machine

I've seen some questions here, but none worked for me; I think it's a syntax error.
On my instance:
ss@si:~/public_html/msite.com$
I have some folders:
folder1 folder2 folder3
I need to copy all the folders to my local machine, so I run this command on my local machine:
$ gcloud init
After answering some questions I get back to the prompt, and try running:
gcloud compute scp --recurse ss@si:~/public_html/msite.com/* magicnt@debian:/var/www/html/local-folder/ --zone southamerica-east1-a --project xxxxxxxxxxxxxxxx
And get error:
ERROR: (gcloud.compute.scp) All sources must be local files when destination is remote.
So my question is:
To copy from the VM to my local machine, do I run the command on the VM or on my local machine?
I believe you need to specify the destination without a machine name, just a path to a local folder.
Not a
magicnt@debian:/var/www/html/local-folder/
but
/var/www/html/local-folder/
Then your complete command should be:
gcloud compute scp --recurse ss@si:~/public_html/msite.com/* /var/www/html/local-folder/ --zone southamerica-east1-a --project xxxxxxxxxxxxxxxx
See "gcloud compute scp" help example:
To copy a remote directory, ~/narnia, from example-instance to the
~/wardrobe directory of your local host, run:
$ gcloud compute scp --recurse example-instance:~/narnia ~/wardrobe

How to transfer a file from Google Cloud Storage to Compute Engine instance using gcloud terminal?

I would like to verify my website in Google Search Console by uploading a file to my website, example.com, which is hosted on Google Compute Engine (a WordPress site deployed with Google Cloud Launcher). The file has to be reachable as example.com/123googlesearchconsoleexample.html, but I cannot upload it via Cloud Shell from a Google Cloud Storage bucket.
Is there a way to do this?
I tried gcloud compute copy-files command
gcloud compute copy-files gs://example/123googlesearchconsoleexample.html example-prd-vm:~/
, but it seems that it does not work in the browser and this is how I would like to do it (without installing SDK).
The error I get is:
(gcloud.compute.copy-files) All sources must be local files when destination is remote.
I don't know if there's a way of doing it in a single step, but it looks like it should be easy to do in two steps:
$ gsutil cp gs://example/123googlesearchconsoleexample.html .
$ gcloud compute copy-files 123googlesearchconsoleexample.html example-prd-vm:~/
The first command copies from Google Cloud Storage to the VM running Cloud Shell. The second command then copies from that VM to example-prd-vm... it's now copying a local file, so the error you're currently getting won't be applicable.
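If you want to skip the intermediate copy in Cloud Shell, a one-step sketch is to SSH into the target VM and run gsutil there (this assumes the VM's service account can read the bucket):

```shell
# Run gsutil on the target VM itself, so the file never passes through Cloud Shell
gcloud compute ssh example-prd-vm \
  --command='gsutil cp gs://example/123googlesearchconsoleexample.html ~/'
```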
First list all the buckets you have using the following command
gsutil ls
Use the following command to download a file from your Google Cloud Storage bucket to your local location
gsutil cp gs://bucketname/filename local-location
You can use the -r option to download a folder from GCS
gsutil cp -r gs://bucketname/folder-name local-location
You can also use the -m option to download large number of files which performs a parallel (multi-threaded/multi-processing) copy
gsutil -m cp -r gs://bucketname/folder-name local-location
In order to copy one file:
gsutil -m cp "gs://yourbucketname/folder/../filename.txt" .
In order to copy a whole folder and all its contents:
gsutil -m cp -r "gs://yourbucketname/folder" .

Copy files from remote server to google cloud bucket

How would I copy files from a remote server to a google bucket? For example,
gcloud compute scp username#server:/path/to/file gs://my-bucket
This method gives the error: All sources must be local files when destination is remote.
Additionally, gsutil only provides support for cp and not scp.
Thanks in advance!
You can also directly execute gsutil command on your GCE VM (Most VM images have Cloud SDK preinstalled). For example:
gcloud compute ssh user@server --zone my_zone \
--command='gsutil cp path/to/my_file gs://MY_BUCKET'
Note that for this to work your service account associated with VM must have appropriate access scope to GCS. If you run
gcloud beta compute instances describe my_instance --zone my_zone \
--format="value(serviceAccounts.scopes)"
It will show the list of scopes set for the VM's service account. Make sure you have https://www.googleapis.com/auth/cloud-platform, https://www.googleapis.com/auth/devstorage.full_control, or https://www.googleapis.com/auth/devstorage.read_write. If not, you can use the beta set-scopes command to reset them, or go to the console and edit the VM in question.
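As a sketch (the instance name and zone are placeholders; the storage-rw alias maps to devstorage.read_write):

```shell
# The instance must be stopped before its access scopes can be changed
gcloud compute instances stop my_instance --zone my_zone
gcloud beta compute instances set-scopes my_instance --zone my_zone \
  --scopes=storage-rw
gcloud compute instances start my_instance --zone my_zone
```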
gsutil supports streaming data, so you can try to pipe the data in.
Not 100% certain this works with arbitrary binary data (and I'm on a phone so I can't test it)
gcloud compute ssh username@server --command='cat /path/to/file' | gsutil cp - gs://my-bucket/file
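The same streaming idea works in the other direction too; a hedged sketch for pushing an object from a bucket onto the remote server:

```shell
# Stream an object out of the bucket and write it on the remote host over SSH
gsutil cp gs://my-bucket/file - | \
  gcloud compute ssh username@server --command='cat > /path/to/file'
```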