Can I automate `gcloud auth login`? - google-cloud-platform

I am unable to activate my Google Cloud service account non-interactively, even after reading several SO threads.
Creating a service account
gcloud iam service-accounts create my-awesome-acct ...
Creating a role for the service account
gcloud iam roles create AwesomeRole \
--permissions storage.objects.create,storage.objects.delete ....
Generating the keys
gcloud iam service-accounts keys create ~/awesome-key.json ...
Activating the service account
gcloud auth activate-service-account my-awesome-acct --key-file ~/awesome-key.json
My Issue
Even after following the above steps, when I run gsutil ... commands, I still get the error message:
$ gsutil cp my_file.tgz gs://my_bucket
Copying file://my_file.tgz [Content-Type=application/x-tar]...
Your credentials are invalid. Please run
$ gcloud auth login
The only way I could get this to work was to actually run gcloud auth login and approve the authentication in a web browser.
Am I doing something wrong? Or is this intended for every service account?

I'm going to answer my own question here.
My Solution
Instead of using gsutil, I decided to use the Google Cloud Client Libraries.
What I did:
gsutil cp my_file.tgz gs://my_bucket
What I am doing now:
import os
from gcloud import storage

# key file is located in my current directory; set the env var if it
# isn't already set (os.environ.get only reads, it does not set)
os.environ.setdefault('GOOGLE_APPLICATION_CREDENTIALS', 'gcloud-auth.json')
client = storage.Client()
bucket = client.get_bucket("my_bucket")
blob = bucket.blob("my_file.tgz")
blob.upload_from_filename("my_file.tgz")
Hindsight 20/20
After getting the above solution working, it seems that if I had also set the GOOGLE_APPLICATION_CREDENTIALS environment variable, my gsutil command should have worked too. (untested)
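That hedge can be sanity-checked locally. The following is a rough, simplified sketch of the lookup the Google client libraries perform: the GOOGLE_APPLICATION_CREDENTIALS environment variable wins when it points at an existing file; the real libraries then fall back to gcloud's application-default credentials file and the metadata server, both of which this sketch omits.

```python
import os

def resolve_credentials_path():
    """Very simplified sketch of Application Default Credentials lookup:
    return the key file named by GOOGLE_APPLICATION_CREDENTIALS if it
    exists, else None (the real libraries then try the gcloud
    application-default credentials file and the metadata server)."""
    path = os.environ.get("GOOGLE_APPLICATION_CREDENTIALS")
    if path and os.path.isfile(path):
        return path
    return None
```

Because the variable is read at client-construction time, exporting it before launching the script (or setting it via os.environ before creating the client) are equivalent.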

Related

GCP - Impersonate Service Account from Local Machine

I have used AWS in the past, so I may be comparing its behavior with GCP. Bear with me and provide the steps to do things the correct way.
What I did:
I created a service account in GCP with the Storage Object Viewer role.
I also created a key pair and downloaded the JSON file to my local machine.
If I have gcloud/gsutil installed on my local machine, how can I assume/impersonate the service account and work on GCP resources?
Where should I keep the downloaded JSON file? I already referred to this - https://cloud.google.com/iam/docs/creating-managing-service-account-keys#creating
I downloaded the file as key.json and kept it in my home directory.
Then I did this.
export GOOGLE_APPLICATION_CREDENTIALS="$HOME/key.json"  # note: a "~" inside double quotes is not expanded by the shell
Then I executed this command:
gsutil cp dummy.txt gs://my-bucket/
Ideally, it should NOT work (the account only has the viewer role), but I am able to upload files.
You can set the path to your service account key file in an environment variable (when using Google Cloud SDKs):
# Linux
export GOOGLE_APPLICATION_CREDENTIALS="/home/user/path/to/serviceAccountFile.json"
# Windows (PowerShell)
$env:GOOGLE_APPLICATION_CREDENTIALS="C:\Users\username\Downloads\path\to\serviceAccountFile.json"
If you don't have the key file, run the following command to download one:
gcloud iam service-accounts keys create ./serviceAccount.json --iam-account=svc_name@project.iam.gserviceaccount.com
You can then use activate-service-account to switch to the given service account:
gcloud auth activate-service-account --key-file=serviceAccount.json

gcloud \ kubectl authentication problem: forget service account

I'm using gcloud and kubectl to handle my resources (Kubernetes, VMs and so on). Everything worked fine until I read some article, created a new service account, and activated it via gcloud, something like this:
gcloud auth activate-service-account --key-file=path/to/key
The created service account has limited permissions to few resources. When I run commands, like:
kubectl --namespace production get pods
I'm getting back response like:
Error from server (Forbidden): pods is forbidden: User
"SA-USER@PROJECTNAME.iam.gserviceaccount.com"
cannot list resource "pods" in API group "" in the namespace
"production": requires one of ["container.pods.list"] permission(s).
The SA SA-USER@PROJECTNAME.iam.gserviceaccount.com is the service account I created yesterday. For some reason it took over my default credentials, and I'm locked out because this user has almost no permissions.
I tried to make gcloud forget this service account, without success. Things I tried:
Uninstalling & reinstalling gcloud and kubectl
Removing the config directory ("~/.config/gcloud/")
gcloud auth login
All of those attempts failed; I am still getting the same message as above.
How I can make gcloud and kubectl forget this service account?
Thanks!
UPDATE for the new auth plugin:
Some time ago, GKE adopted the new auth plugin architecture for Kubernetes.
With the new plugin, gke-gcloud-auth-plugin, the authentication cache is at:
macOS: ~/.kube/gke_gcloud_auth_plugin_cache
(Please edit this answer to include the locations on other operating systems.)
You can just delete that file.
There is a problem with the expiration of the authentication token used by kubectl: when you choose a new user via gcloud and then make a kubectl request, an authentication token is cached locally.
A workaround is to edit your ~/.kube/config and set a date in the past in the expiry field of the relevant user section.
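As a rough sketch of that workaround (assuming the cached token is stored as an RFC 3339 "expiry:" line in the user section, which is what the gcp auth-provider writes), rewriting the field to a past timestamp forces kubectl to discard the token and re-authenticate via gcloud on the next call:

```python
import re
from datetime import datetime, timedelta, timezone

def expire_kubeconfig_tokens(config_text):
    """Rewrite every 'expiry: <timestamp>' line in a kubeconfig to a
    timestamp one day in the past, so kubectl treats the cached token
    as stale and requests a fresh one on the next command."""
    past = (datetime.now(timezone.utc) - timedelta(days=1)).strftime("%Y-%m-%dT%H:%M:%SZ")
    return re.sub(r"(expiry:\s*)\S+", lambda m: m.group(1) + past, config_text)

# Usage (back up ~/.kube/config first):
# path = os.path.expanduser("~/.kube/config")
# text = open(path).read()
# open(path, "w").write(expire_kubeconfig_tokens(text))
```

Editing the file by hand achieves the same thing; this just avoids typos in the timestamp format.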
You can perform a gcloud auth application-default login.
You can also see the current configuration of your gcloud CLI with gcloud config list, and change a default parameter with gcloud config set param_name param_value. For example (you will use it often if you have several projects):
gcloud config set project MyProjectId
With these, you should be able to solve your issue.

Is there a way to use gsutil while impersonating a service account?

I am in the process of attempting to adjust user permissions in Google Cloud and have created a service account that other users can impersonate to access various projects. The gcloud command has the --impersonate-service-account option to make API calls with the proper authentication, but I was wondering if anyone knows how to make such calls using gsutil.
Here's an example of what a successful call looks like using gcloud:
gcloud --impersonate-service-account=superuser@PROJECT1.iam.gserviceaccount.com iam service-accounts list --project PROJECT2
Yes, here's the option:
$ gsutil -i [SERVICE-ACCOUNT]@[PROJECT] [GSUTIL-COMMAND]
Example:
$ gsutil -i myserviceaccount@iam.gserviceaccount.com ls
There is no such option among the top-level gsutil command-line options (at least not a documented one).
By contrast, gcloud --impersonate-service-account is documented.
Things to try:
if you use the gsutil distributed with the Cloud SDK, it has some ability to use the credentials established by gcloud auth; see Configuring/Using Credentials Via Cloud Sdk Distribution Of Gsutil
if you use the standalone version, check the gsutil config command, which allows specifying service account credentials (see also Updating To The Latest Configuration File):
-e Prompt for service account credentials. This option requires that -a is not set.

Google Cloud SQL import - ERROR: HTTPError 403: The client is not authorized to make this request

I am trying to import a database stored in Cloud Storage using the command:
gcloud sql instances import instance-name gs://connect-to-the-cloud-sql.appspot.com/my-cloud-sql-instance-backup
But, I am getting error:
ERROR: (gcloud.sql.instances.import) HTTPError 403: The client is not authorized to make this request.
I've already logged in using:
gcloud auth login
Make sure the instance-name is correct. I had the same error, and it went away as soon as I corrected the instance name.
I had this problem and my instance name was correct. It turned out I was in the wrong GCP project. Make sure you switch to the correct target project, or use the --project argument:
gcloud sql instances export my-cloud-sql-instance gs://connect-to-the-cloud-sql.appspot.com/my-cloud-sql-instance-backup --project=<your target project>
In my case, it was because the Cloud SQL instance's service account didn't have the correct permissions on the storage bucket I was trying to import from.
From the docs:
Describe the instance you are importing to:
gcloud sql instances describe [INSTANCE_NAME]
Copy the serviceAccountEmailAddress field.
Use gsutil iam to grant the legacyBucketWriter and objectViewer Cloud IAM roles to the service account for the bucket.
Import the database:
gcloud sql import sql [INSTANCE_NAME] gs://[BUCKET_NAME]/[IMPORT_FILE_NAME] \
--database=[DATABASE_NAME]
It might sound too obvious, but your service account really may be missing access rights for importing data. Check that it has the cloudsql.instances.import permission on the IAM & Admin page.
New step by step:
gcloud sql instances describe name-instance | grep serviceAccountEmailAddress
# output: serviceAccount:account@gcp-sa.com
gsutil iam ch serviceAccount:account@gcp-sa.com:roles/storage.legacyBucketWriter gs://bucket-destino
gsutil iam ch serviceAccount:account@gcp-sa.com:roles/storage.objectViewer gs://bucket-destino
# -----------on a Linux VM in GCP--------------------------------------------------------------------------------
gcloud init  # make id-project-bucket-destino (the project of the destination bucket where the dump will be stored) the default on the VM
gcloud config set project id-project-bucket-destino
gcloud sql export sql --project=id-project-instance name-instance gs://bucket-destino/sqldump.sql \
--database=name-database \
--offload
# ----------cron job in linux------------------------------------------------------------------------------------
#!/bin/sh
# make a dated directory in Cloud Storage
datedirect=$(date '+%d-%m-%Y')
echo $datedirect
touch file5
gsutil cp -r ./file5 gs://bucket-destino/$datedirect/
gcloud config set project id-project-bucket-destino
gcloud sql export sql --project=id-project-instance name-instance gs://bucket-destino/sqldump.sql \
--database=name-database \
--offload

When creating Google Cloud service accounts do you have to authorize the key after you create it?

When you make a Google Cloud service account using the gcloud command-line interface, there's a gcloud iam service-accounts keys create command to create a key. Looking in the web console, it appears that command creates and registers the key with the account.
Is that sufficient to activate the service account for use with the generated JSON key file? Or do you also have to call:
gcloud auth activate-service-account <IAM> --key-file=<JSON file from the keys create command>
The Google docs are a little unclear as to whether that last step is necessary. The console shows no changes to the service account, but the command executes successfully if you do make the call.
Creating a key via gcloud iam service-accounts keys create does NOT immediately make it available for use with gcloud commands. You indeed need to activate it via gcloud auth activate-service-account.
Use
gcloud auth list
to view your set of credentials. Moreover, gcloud uses the currently active credentials. You can view your current settings by running
gcloud config list
It is also possible to use different credentials by adding the --account flag to any gcloud command. For example:
gcloud compute zones list --account my_account@gmail.com
where the account was previously obtained via gcloud auth login or gcloud auth activate-service-account and appears in gcloud auth list.
You do not have to use activate-service-account. You can instead use the environment variable CLOUDSDK_AUTH_CREDENTIAL_FILE_OVERRIDE to specify your service account JSON key.
CLOUDSDK_AUTH_CREDENTIAL_FILE_OVERRIDE="./service-account.json" \
gcloud deployment-manager deployments \
create $DEPLOYMENT \
--project $PROJECT \
--template resources.jinja \
--properties deployment:$DEPLOYMENT,project:$PROJECT
I found this option here:
https://serverfault.com/questions/848580/how-to-use-google-application-credentials-with-gcloud-on-a-server
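The per-invocation override can also be wrapped in a small helper so the global gcloud auth state is never touched. This is only a sketch: the ./service-account.json path and the gcloud arguments in the usage comment are placeholders, and it assumes gcloud is on PATH.

```python
import os
import subprocess

def env_with_key_override(key_path, base_env=None):
    """Return a copy of the environment in which gcloud will use the
    given service account key for this invocation only, via the
    CLOUDSDK_AUTH_CREDENTIAL_FILE_OVERRIDE variable."""
    env = dict(os.environ if base_env is None else base_env)
    env["CLOUDSDK_AUTH_CREDENTIAL_FILE_OVERRIDE"] = key_path
    return env

def run_gcloud(args, key_path):
    """Run `gcloud <args>` with the credential override (not executed
    here; requires gcloud on PATH and a valid key file)."""
    return subprocess.run(["gcloud", *args],
                          env=env_with_key_override(key_path),
                          check=True)

# e.g. run_gcloud(["projects", "list"], "./service-account.json")
```

Because the override lives only in the child process environment, concurrent scripts can each use a different service account without racing on gcloud's shared configuration.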