I am in the process of attempting to adjust user permissions in Google Cloud and have created a service account that other users can impersonate to access various projects. The gcloud command has the --impersonate-service-account option to make API calls with the proper authentication, but I was wondering if anyone knows how to make such calls using gsutil.
Here's an example of what a successful call looks like using gcloud:
gcloud --impersonate-service-account=superuser@PROJECT1.iam.gserviceaccount.com iam service-accounts list --project PROJECT2
Yes, here's the option:
$ gsutil -i [SERVICE-ACCOUNT]@[PROJECT] [GSUTIL-COMMAND]
Example:
$ gsutil -i myserviceaccount@iam.gserviceaccount.com ls
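For the question's setup, that would look something like this (the bucket name here is just a placeholder, and -i expects the full service account email):
$ gsutil -i superuser@PROJECT1.iam.gserviceaccount.com ls gs://some-bucket-in-project2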
There is no such option in the top-level gsutil command-line options (at least not a documented one).
By contrast the gcloud --impersonate-service-account is documented.
Things to try:
if you use the gsutil distributed with the Cloud SDK, it can use the credentials established by gcloud auth; see Configuring/Using Credentials Via Cloud Sdk Distribution Of Gsutil
if you use the standalone version, check the gsutil config command, which should allow specifying service account credentials (see also Updating To The Latest Configuration File, and the example after this list):
-e Prompt for service account credentials. This option requires that -a is not set.
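For the standalone case, a minimal sketch of what that looks like: running the command below should prompt you for the service account email and the path to its key file, and write the resulting credentials into your ~/.boto configuration file.
$ gsutil config -e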
Related
I have used AWS in the past, so I may be comparing its behavior with GCP. Bear with me and provide the steps to do things the correct way.
What I did:
I created a service account in GCP with the Storage Object Viewer role.
I also created a key pair and downloaded the JSON key file to my local machine.
If I have gcloud/gsutil installed on my local machine, how can I assume/impersonate the service account and work on GCP resources?
Where should I keep the downloaded JSON file? I already referred to this: https://cloud.google.com/iam/docs/creating-managing-service-account-keys#creating
I downloaded the file as key.json and kept it in my home directory.
Then I did this:
export GOOGLE_APPLICATION_CREDENTIALS="~/key.json"
Then I executed this command:
gsutil cp dummy.txt gs://my-bucket/
Ideally, it should NOT work, but I am able to upload files.
You can set the path to your service account key file in an environment variable (when using the Google Cloud SDKs):
#linux
export GOOGLE_APPLICATION_CREDENTIALS="/home/user/path/to/serviceAccountFile.json"
#windows
$env:GOOGLE_APPLICATION_CREDENTIALS="C:\Users\username\Downloads\path\to\serviceAccountFile.json"
If you don't have the key file, run the following command that'll download the key:
gcloud iam service-accounts keys create ./serviceAccount.json --iam-account=svc_name@project.iam.gserviceaccount.com
You can then use activate-service-account to use the given service account, as shown below:
gcloud auth activate-service-account --key-file=serviceAccount.json
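Once activated, you can sanity-check which credentials are in use and that the role works, for example (the bucket name here is a placeholder):
gcloud auth list
gsutil ls gs://my-bucket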
I want to read from Google Cloud Storage using gsutil with a service account credentials file.
However, I don't really understand how to pass it the credentials file.
I have tried running
"gcloud auth activate-service-account [ACCOUNT] --key-file=KEY_FILE"
and then running "gsutil ls". However, I get the following output:
You are attempting to perform an operation that requires a project id, with none configured. Please re-run gsutil config and make sure to follow the instructions for finding and entering your default project id.
After authenticating with gcloud, should I be able to use gsutil?
Since you have done "gcloud auth activate-service-account [ACCOUNT] --key-file=KEY_FILE", the only thing you need to do is gcloud config set project <project id>
Note: there's no global -p option in gsutil (only some subcommands, such as ls and mb, accept -p).
Add -p <project id> to your gsutil command, or run gcloud config set project <project id> to set the project ID globally.
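For example, either of the following should get past the "requires a project id" error (the project ID here is a placeholder):
gcloud config set project my-project-id
gsutil ls
or, per command, for subcommands that accept it:
gsutil ls -p my-project-id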
You can also refer to this documentation for more gsutil arguments.
I know I can copy files between projects like this...
gcloud auth activate-service-account --key-file project1.json
gsutil cp gs://bucket1/file .
gcloud auth activate-service-account --key-file project2.json
gsutil cp file gs://bucket2/file
But this is not good when I want to script copying a large number of files. The activate-service-account command applies to the current Linux user, so it's not thread safe. Is there any way the above commands can be run as single commands using parameters passed to gsutil?
I am not aware of any publicly supported method for gsutil. gcloud supports configurations, which allow you to use different credentials on the command line via --configuration=CONFIGURATION_NAME, but gsutil does not.
gsutil just uses the default gcloud configuration. When you execute activate-service-account you are changing the default configuration to use the new credentials.
There is the undocumented environment variable CLOUDSDK_ACTIVE_CONFIG_NAME=CONFIGURATION_NAME, but I do not know if gsutil checks this environment variable.
Another undocumented item is that the default configuration is stored in C:\Users\username\AppData\Roaming\gcloud\active_config on Windows and ~/.config/gcloud/active_config on Linux. You might test whether gsutil follows this convention. This file just stores the configuration name for the current default configuration. This means that you will need to set up gcloud config configurations. I wrote an article on how to set up configurations:
Understanding Gcloud Configurations
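If you want to test whether gsutil honors that variable, a sketch along the lines of the question's own example (the configuration names are assumptions, created beforehand with gcloud config configurations create):
gcloud config configurations create project1
gcloud auth activate-service-account --key-file project1.json
gcloud config configurations create project2
gcloud auth activate-service-account --key-file project2.json
CLOUDSDK_ACTIVE_CONFIG_NAME=project1 gsutil cp gs://bucket1/file .
CLOUDSDK_ACTIVE_CONFIG_NAME=project2 gsutil cp file gs://bucket2/file
Whether gsutil actually picks up the variable is exactly what you would be verifying here.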
Is there a way to set a custom service key file for a single invocation of "gcloud" tool without running gcloud auth activate-service-account ?
Similar to the related gsutil tool, where this can easily be done using the -o option:
gsutil -o Credentials:gs_service_key_file=path/to/credentials_file.json arg1 arg2 ...
I've not tried this, but it appears the answer is yes:
https://cloud.google.com/sdk/gcloud/reference/
gcloud --account="..."
You can authenticate multiple service accounts using gcloud auth activate-service-account, and you can enumerate them with gcloud auth list. You would then switch between them per command using the --account flag.
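A rough sketch of that flow (the key file names and the account email are placeholders):
gcloud auth activate-service-account --key-file=sa-project1.json
gcloud auth activate-service-account --key-file=sa-project2.json
gcloud auth list
gcloud --account=sa-project1@project1.iam.gserviceaccount.com iam service-accounts list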
Until your question, I'd always done this rather tediously through reconfiguring gcloud config set account ..., so thanks for helping me learn something new!
I am unable to non-interactively activate my Google Cloud service account, even after reading several SO threads.
Creating a service account
gcloud iam service-accounts create my-awesome-acct ...
Creating a role for the service account
gcloud iam roles create AwesomeRole \
--permissions storage.objects.create,storage.objects.delete ....
Generating the keys
gcloud iam service-accounts keys create ~/awesome-key.json ...
Activating the service account
gcloud auth activate-service-account my-awesome-acct ~/awesome-key.json
My Issue
Even after following the above steps, when I run gsutil ... commands, I still get the error message:
$ gsutil cp my_file.tgz gs://my_bucket
Copying file://my_file.tgz [Content-Type=application/x-tar]...
Your credentials are invalid. Please run
$ gcloud auth login
The only way I could get this to work is to actually run gcloud auth login and allow the authentication in a web browser.
Am I doing something wrong? Or is this intended for every service account?
I'm going to answer my own question here.
My Solution
Instead of using gsutil, I decided to use the Google Cloud Client Libraries.
What I did:
gsutil cp my_file.tgz gs://my_bucket
What I am doing now:
import os
from gcloud import storage

# point the client at the key file located in my current directory
os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = 'gcloud-auth.json'

client = storage.Client()
bucket = client.get_bucket("my_bucket")
blob = bucket.blob("my_file.tgz")
blob.upload_from_filename("my_file.tgz")
Hindsight 20/20
After getting the above solution working, it seems that if I had also set the environment variable GOOGLE_APPLICATION_CREDENTIALS, my gsutil command should have worked too. (untested)
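For reference, that untested gsutil variant would presumably look like this (same key file as in the snippet above):
$ export GOOGLE_APPLICATION_CREDENTIALS="$PWD/gcloud-auth.json"
$ gsutil cp my_file.tgz gs://my_bucket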