What is the gcloud command-line equivalent of Secrets in Cloud Run?

I'm trying to use secrets in Cloud Run for Anthos via the gcloud command line.
Is there any example of how to use these secrets in the documentation?
I've already looked in https://cloud.google.com/sdk/gcloud/reference/run
but nowhere does the doc talk about secrets.

Secret mounts in Cloud Run for Anthos are regular Kubernetes Secrets (https://kubernetes.io/docs/concepts/configuration/secret/). You can use the kubectl create secret command to create one.
You can browse the list of ConfigMaps and Secrets in the Cloud Console at https://console.cloud.google.com/kubernetes/config, but you can't create or edit them there. Currently, kubectl is the only option.
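For example, a minimal sketch of creating one with kubectl (the secret name and key/value here are placeholders):
# create a Secret named app-secret with a single key/value pair
kubectl create secret generic app-secret --from-literal=api-key=s3cr3t
# confirm it was created
kubectl get secret app-secret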

Related

GCP - Impersonate Service Account from Local Machine

I have used AWS in the past, so I might be comparing its behavior with GCP's. Bear with me and provide the steps to do things the correct way.
What I did:
I created a service account in GCP with the Storage Object Viewer role.
I also created a key pair and downloaded the JSON file to my local machine.
If I have gcloud/gsutil installed on my local machine, how can I assume/impersonate the service account and work on GCP resources?
Where should I keep the downloaded JSON file? I already referred to this: https://cloud.google.com/iam/docs/creating-managing-service-account-keys#creating
I downloaded the file as key.json and kept it in my home directory.
Then I did this:
export GOOGLE_APPLICATION_CREDENTIALS="~/key.json"
and executed this command:
gsutil cp dummy.txt gs://my-bucket/
Ideally, it should NOT work, but I am able to upload files.
You can set the path to your service account key file in an environment variable (when using Google Cloud SDKs):
#linux
export GOOGLE_APPLICATION_CREDENTIALS="/home/user/path/to/serviceAccountFile.json"
#windows
$env:GOOGLE_APPLICATION_CREDENTIALS="C:\Users\username\Downloads\path\to\serviceAccountFile.json"
If you don't have the key file, run the following command to download the key:
gcloud iam service-accounts keys create ./serviceAccount.json --iam-account=svc_name@project.iam.gserviceaccount.com
You can then use activate-service-account to use the given service account, as shown below:
gcloud auth activate-service-account --key-file=serviceAccount.json
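You can then verify which account is active and that it has the expected access; for example (the bucket name is a placeholder):
# the active account is marked with an asterisk
gcloud auth list
# should succeed if the service account has the Storage Object Viewer role on the bucket
gsutil ls gs://my-bucket/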

How to create a Secret using Google Cloud Secret Manager with gcloud command line tool?

I have started a Google Cloud account and I want to access and use Google Secret Manager. How can I create a secret using the Google Secret Manager API with the gcloud command-line interface?
You can see it here: https://cloud.google.com/secret-manager/docs/creating-and-accessing-secrets#secretmanager-create-secret-cli
gcloud secrets create secret-id \
--replication-policy="automatic"
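Creating the secret only creates an empty container; a sketch of adding and reading a secret version (secret-id and the file path are placeholders):
# add a version containing the actual payload
gcloud secrets versions add secret-id --data-file=/path/to/secret.txt
# read back the latest version
gcloud secrets versions access latest --secret=secret-id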

How to enable Google Cloud Secrets Manager using gcloud command line interface?

I have started a Google Cloud account and I want to access and use Google Secret Manager. How can I enable the Google Secret Manager API using the gcloud command-line interface?
Find the service name by using gcloud services list --available:
gcloud services list --available | grep Secret
Enable API using gcloud services enable:
gcloud services enable secretmanager.googleapis.com
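You can confirm the API is now enabled; for example:
# list only enabled services, filtered to Secret Manager
gcloud services list --enabled --filter="name:secretmanager.googleapis.com"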

Unable to connect to Google Container Engine

I've updated gcloud to the latest version (159.0.0).
I created a Google Container Engine node, and then followed the instructions in the prompt:
gcloud container clusters get-credentials prod --zone us-west1-b --project myproject
Fetching cluster endpoint and auth data.
kubeconfig entry generated for prod
kubectl proxy
Unable to connect to the server: error executing access token command
"/Users/me/Code/google-cloud-sdk/bin/gcloud ": exit status
Any idea why it is not able to connect?
You can run the following to see if the config was generated correctly:
kubectl config view
I had a similar issue when trying to run kubectl commands on a new Kubernetes cluster just created on Google Cloud Platform.
The solution in my case was to activate Google Application Default Credentials.
You can find a link below on how to activate them.
Basically, you need to set an environment variable to the path of the .json file with the credentials exported from GCP:
GOOGLE_APPLICATION_CREDENTIALS -> c:\...\..\..Credentials.json
https://developers.google.com/identity/protocols/application-default-credentials
I found this solution in a Kubernetes GitHub issue: https://github.com/kubernetes/kubernetes/issues/30617
PS: make sure you have also set the environment variables (a PowerShell sketch follows):
%HOME% to %USERPROFILE%
%KUBECONFIG% to %USERPROFILE%
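For example (a sketch; pointing KUBECONFIG at the default %USERPROFILE%\.kube\config location is an assumption on my part):
# set HOME to the user profile directory
$env:HOME = $env:USERPROFILE
# point kubectl at the default kubeconfig location
$env:KUBECONFIG = "$env:USERPROFILE\.kube\config"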
It looks like the default auth plugin for GKE might be buggy on Windows. kubectl is trying to run gcloud to get a token to authenticate to your cluster. If you run kubectl config view, you can see the command it tried to run; run it yourself to see if/why it fails.
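For reference, the GCP auth provider usually shells out to something like the following to fetch a token (a sketch; the exact invocation depends on your SDK version):
# prints the active configuration and credentials, including an access token, as JSON
gcloud config config-helper --format=json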
As Alexandru said, a workaround is to use Google Application Default Credentials. Actually, gcloud container has built-in support for doing this, which you can toggle by setting a property:
gcloud config set container/use_application_default_credentials true
or set the environment variable %CLOUDSDK_CONTAINER_USE_APPLICATION_DEFAULT_CREDENTIALS% to true.
Using GKE, updating the credentials from the Kubernetes Engine/Cluster management page worked for me.
The cluster row provides a Connect button that copies the credentials command to the console, and this refreshes the token in use. Then kubectl works again.
Why did my token expire? Well, I suppose GCP tokens are not eternal.
So the button runs the same command automatically:
gcloud container clusters get-credentials your-cluster ...

Copy files between two Google Cloud instances

I have two projects in Google Cloud and I need to copy files from an instance in one project to an instance in another project. I tried using the gcloud compute copy-files command but I'm getting this error:
gcloud compute copy-files test.tgz --project stack-complete-343 instance-IP:/home/ubuntu --zone us-central1-a
ERROR: (gcloud.compute.copy-files) Could not fetch instance: - Insufficient Permission
I was able to replicate your issue with a brand new VM instance, getting the same error. Here are a few steps that I took to correct the problem:
Make sure you are authenticated and have rights to both projects with the same account!
$ gcloud config list (if you see the service account @developer.gserviceaccount.com, you need to switch to the account that is enabled on both projects; you can check that from the Developers Console > Permissions)
$ gcloud auth login (copy the link to a new window, log in, copy the code, and paste it back in the prompt)
$ gcloud compute scp test.tgz --project stack-complete-343 instance-IP:/home/ubuntu --zone us-central1-a (I would also use the instance name instead of the IP)
This last command should also generate your SSH keys. You should see something like this; do not worry about entering a passphrase:
WARNING: [/usr/bin/ssh-keygen] will be executed to generate a key.
Generating public/private rsa key pair
Enter passphrase (empty for no passphrase):
Go to the permissions tab on the remote instance (i.e. the instance you WON'T be running gcloud compute copy-files on). Then go to service accounts and create a new one; give it a name, check the box to get a key file for it, and leave JSON selected.
Upload that key file from your personal machine to the local instance (i.e. the machine you're SSHing into and running the gcloud compute copy-files command on) using gcloud compute copy-files and your personal account.
Then run this from the local instance via SSH:
gcloud auth activate-service-account ACCOUNT --key-file KEY-FILE
replacing ACCOUNT with the email-like address that was generated and KEY-FILE with the path to the key file you uploaded from your personal machine earlier. Then you should be able to access the instance that set up the account.
These steps have to be repeated on every instance you want to copy files between. If these instructions weren't clear, let me know and I'll try to help out.
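Put together, the sequence might look like this sketch (all instance names, account names, paths, and zones are placeholders):
# from your personal machine: push the downloaded key to the local instance
gcloud compute copy-files key.json local-instance:~/ --zone us-central1-a
# via SSH on the local instance: activate the service account
gcloud auth activate-service-account my-sa@other-project.iam.gserviceaccount.com --key-file ~/key.json
# now copy files to the instance in the other project
gcloud compute copy-files test.tgz --project other-project remote-instance:/home/ubuntu --zone us-central1-a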
It's not recommended to authenticate with your personal account on Compute Engine instances, because that can expose your credentials to anybody with access to the machine.
Instead, you can let your service accounts use the Compute Engine API. First, stop the instance. Once stopped, you can edit Cloud API access scopes from the console. Modify the Compute Engine scope from Disabled to Read Only.
You should be able to just use the copy-files command now. This lets your service account access the Compute Engine API.
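The same scope change can also be made from the command line; a sketch (instance name and zone are placeholders):
# scopes can only be changed while the instance is stopped
gcloud compute instances stop my-instance --zone us-central1-a
# compute-ro is the alias for read-only Compute Engine access;
# add --service-account=... if the instance uses a non-default service account
gcloud compute instances set-service-account my-instance --zone us-central1-a --scopes compute-ro
gcloud compute instances start my-instance --zone us-central1-a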
The simplest way to do this is to use the scp command with a .pem file. Here's an example:
sudo scp -r -i your/path_to/key.pem your_username@ip_address_of_instance:path/to/file /local/destination/
If both of them are in the same project, this is the simplest way:
gcloud compute copy-files yourFileName --project yourProjectName instance-name:~/folderInInstance --zone europe-west1-b
Obviously you should edit the zone according to your instances.
One approach to getting permissions is to enable Cloud API access scopes. You may set them to "Allow full access to all Cloud APIs".
In the console, click on the instance and use the EDIT button at the top. Scroll to the bottom and change Cloud API access scopes. See also this answer.