(gcloud.compute.scp) Could not fetch resource - google-cloud-platform

I'm using Google Cloud for the first time and I'm trying to upload a test file to my root folder on my instance. However, I'm getting this error:
ERROR: (gcloud.compute.scp) Could not fetch resource:
- Invalid value '[ua2r-website]'. Values must match the following regular expression: '[a-z](?:[-a-z0-9]{0,61}[a-z0-9])?'
I'm running the command from the directory that contains the file. Here's my command:
gcloud compute scp [testtext.txt] [ua2r-website]:~/
I've double-checked the spelling and the punctuation of the VM instance, and I can't find a difference.

You need to use gcloud auth login
gcloud auth login - authorize gcloud to access the Cloud Platform with Google user credentials
Run the command:
gcloud auth login
You will then get a link from GCP to open in your browser. From that link you will get a code; copy it back into the terminal on the VM, and you will be authorized to perform the operation.
More details here:
https://cloud.google.com/sdk/gcloud/reference/auth/login
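If gcloud cannot open a browser itself (for example on a headless VM), you can force this link-and-code flow explicitly:
gcloud auth login --no-launch-browser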

Remove the ['s and ]'s; in the documentation, square brackets only mark placeholders and are not part of the command:
gcloud compute scp testtext.txt ua2r-website:~/
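If the instance's zone is not set in your gcloud config, you may also need to pass it explicitly; the zone below is only an example:
gcloud compute scp testtext.txt ua2r-website:~/ --zone us-central1-a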

You can also drag and drop files from your local computer's filesystem into the open Unix shell in the browser where your project is.

Related

Issue using gcloud scp

I am trying to transfer a file from a Google Compute Engine VM instance to my local machine using gcloud scp. I am able to confirm that I am logged in to the Google Cloud account using:
gcloud config list
However when I execute the gcloud scp command, I see following error message:
ERROR: (gcloud.compute.scp) You do not currently have an active account selected.
Not sure why I am seeing the error even though I am logged in.
Here is how I am executing the command:
gcloud compute scp <instance_name>:<remote_file_path> .
Thanks in advance!

Getting "get-credentials requires edit permission" error on GCP

I'm trying to set up credentials for Kubernetes on my local machine.
gcloud container clusters get-credentials ***** --zone **** --project elo-project-267109
This command works fine from Cloud Shell, but I get this error when I run it from my terminal:
ERROR: (gcloud.container.clusters.get-credentials) get-credentials requires edit permission on elo-project-267109
I've tried this command from the admin account, from the default service account, and from a new service account with the Editor role assigned, and it still doesn't seem to work for me.
I am using macOS Mojave (10.14.6), and the gcloud SDK version installed on my system is 274.0.1.
I was able to resolve this issue locally, but I was actually trying to build a CI/CD pipeline from GitLab, and the issue persists there; I have tried the gcloud 279.0.0 image version.
I am new to both GitLab and gcloud, and am building a CI/CD pipeline for the first time.
Run gcloud auth list to see which account you are logged into.
You need to log in with the account that has the correct permissions for the action you're trying to perform.
To set the gcloud account: gcloud config set account <ACCOUNT>
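For example, with a placeholder email address:
gcloud auth list
gcloud config set account alice@example.com
The first command prints every credentialed account and marks the active one with an asterisk; the second switches the active account.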
It turned out to be an image version mismatch issue on GitLab.

Unable to retrieve file from google cloud instance

I have a google cloud instance that has a program running which creates a file at the end of the run. Now the file that has been created needs to be retrieved to my local machine.
I tried the following command as per this link. My command is as follows:
gcloud compute scp root@<instance-name>:~/folder1/folder2/file_name.rds ~/Documents/
After entering the password, I get the following error.
Permission denied (publickey).
ERROR: (gcloud.compute.scp) [/usr/bin/scp] exited with return code [1].
I have set Compute Engine to Read Write. Is there anything else I have missed out? Thank you in advance!
First, try running this on your command line:
gcloud auth login
and then
gcloud beta auth application-default login
Log in with the Google account that owns the project.
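Note that on newer SDK versions the second command has been promoted out of beta:
gcloud auth application-default login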

Unable to connect to Google Container Engine

I've updated gcloud to the latest version (159.0.0)
I created a Google Container Engine node, and then followed the instructions in the prompt.
gcloud container clusters get-credentials prod --zone us-west1-b --project myproject
Fetching cluster endpoint and auth data.
kubeconfig entry generated for prod
kubectl proxy
Unable to connect to the server: error executing access token command
"/Users/me/Code/google-cloud-sdk/bin/gcloud ": exit status
Any idea why is it not able to connect?
You can run the following to see whether the config was generated correctly:
kubectl config view
I had a similar issue when trying to run kubectl commands on a new Kubernetes cluster just created on Google Cloud Platform.
The solution for my case was to activate Google Application Default Credentials.
You can find a link below on how to activate it.
Basically, you need to set an environment variable to the path of the .json credentials file exported from GCP:
GOOGLE_APPLICATION_CREDENTIALS -> c:\...\..\..Credentials.json
https://developers.google.com/identity/protocols/application-default-credentials
I found this solution in a Kubernetes GitHub issue: https://github.com/kubernetes/kubernetes/issues/30617
PS: make sure you have also set these environment variables:
HOME to %USERPROFILE%
KUBECONFIG to %USERPROFILE%
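As a concrete sketch in a Windows cmd session (the key-file path is a placeholder for wherever you saved the exported JSON):
set GOOGLE_APPLICATION_CREDENTIALS=C:\path\to\credentials.json
set HOME=%USERPROFILE%
set KUBECONFIG=%USERPROFILE%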
It looks like the default auth plugin for GKE might be buggy on Windows. kubectl is trying to run gcloud to get a token to authenticate to your cluster. If you run kubectl config view you can see the command it tried to run; run it yourself to see if/why it fails.
As Alexandru said, a workaround is to use Google Application Default Credentials. Actually, gcloud container has built in support for doing this, which you can toggle by setting a property:
gcloud config set container/use_application_default_credentials true
or set the environment variable CLOUDSDK_CONTAINER_USE_APPLICATION_DEFAULT_CREDENTIALS to true.
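For example, on Linux/macOS:
export CLOUDSDK_CONTAINER_USE_APPLICATION_DEFAULT_CREDENTIALS=true
or on Windows cmd:
set CLOUDSDK_CONTAINER_USE_APPLICATION_DEFAULT_CREDENTIALS=true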
Using GKE, updating the credentials from the "Kubernetes Engine/Clusters" page worked for me.
Each cluster row provides a "Connect" button that copies the credentials command into the console, and this refreshes the token in use. Then kubectl works again.
Why had my token expired? Well, I suppose GCP tokens are not eternal.
So the button automatically runs the same command:
gcloud container clusters get-credentials your-cluster ...

Copy files between two Google Cloud instances

I have two projects in Google Cloud and I need to copy files from an instance in one project to an instance in another project. I tried using the 'gcloud compute copy-files' command, but I'm getting this error:
gcloud compute copy-files test.tgz --project stack-complete-343 instance-IP:/home/ubuntu --zone us-central1-a
ERROR: (gcloud.compute.copy-files) Could not fetch instance: - Insufficient Permission
I was able to replicate your issue with a brand new VM instance, getting the same error. Here are a few steps that I took to correct the problem:
Make sure you are authenticated and have rights to both projects with the same account!
$ gcloud config list (if you see a service account like ...@developer.gserviceaccount.com, you need to switch to the account that is enabled on both projects. You can check that from the Developers Console > Permissions)
$ gcloud auth login (copy the link to a new window, login, copy the code and paste it back in the prompt)
$ gcloud compute scp test.tgz --project stack-complete-343 instance-IP:/home/ubuntu --zone us-central1-a (I would also use the instance name instead of the IP)
This last command should also generate your SSH keys. You should see something like this; do not worry about entering a passphrase:
WARNING: [/usr/bin/ssh-keygen] will be executed to generate a key.
Generating public/private rsa key pair
Enter passphrase (empty for no passphrase):
Go to the Permissions tab on the remote instance (i.e. the instance you WON'T be running gcloud compute copy-files on). Then:
1. Go to service accounts and create a new one; give it a name, check the box to get a key file for it, and leave JSON selected.
2. Upload that key file from your personal machine to the local instance (i.e. the machine you're SSHing into and running the gcloud compute copy-files command on), using gcloud compute copy-files and your personal account.
3. From the local instance, via SSH, run the command shown in the sketch below:
gcloud auth activate-service-account ACCOUNT --key-file KEY-FILE
replacing ACCOUNT with the email-like address that was generated and KEY-FILE with the path to the key file you uploaded from your personal machine earlier.
Then you should be able to access the instance that set up the account. These steps have to be repeated on every instance you want to copy files between. If these instructions weren't clear, let me know and I'll try to help out.
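A concrete sketch of that last step, with a placeholder service-account email and key-file name:
gcloud auth activate-service-account copy-sa@my-project.iam.gserviceaccount.com --key-file=copy-sa-key.json
gcloud auth list
The second command should now show the service account as the active credential.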
It's not recommended to auth your account on Compute Engine instances because that can expose your credentials to anybody with access to the machine.
Instead, you can let your service accounts use the Compute Engine API. First, stop the instance. Once stopped you can edit Cloud API access scopes from the console. Modify the Compute Engine scope from Disabled to Read Only.
You should be able to just use the copy-files command now. This lets your service account access the Compute Engine API.
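The same scope change can also be made from the CLI; this is a sketch with a placeholder instance name and zone, and the instance must be stopped first (add --service-account=... if the instance uses a non-default service account):
gcloud compute instances stop my-instance --zone us-central1-a
gcloud compute instances set-service-account my-instance --zone us-central1-a --scopes compute-ro
gcloud compute instances start my-instance --zone us-central1-a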
The simplest way to do this is to use the 'scp' command with a .pem key file. Here's an example:
sudo scp -r -i your/path_to/key.pem your_username@ip_address_of_instance:path/to/copy/file .
If both instances are in the same project, this is the simplest way:
gcloud compute copy-files yourFileName --project yourProjectName instance-name:~/folderInInstance --zone europe-west1-b
Obviously you should edit the zone according to your instances.
One approach to getting the permissions is to broaden the Cloud API access scopes; you may set them to "Allow full access to all Cloud APIs".
In the console, click on the instance and use the EDIT button at the top. Scroll to the bottom and change the Cloud API access scopes.