IAM permissions to run "gcloud compute images import" - google-cloud-platform

We are attempting to import an image into GCP with the following command
gcloud compute images import
under the context of a service account. When running this command, the message states that it wants to elevate the permissions of the service account to "Service Account Actor". Since this role is deprecated (see https://cloud.google.com/iam/docs/service-accounts#the_service_account_actor_role ), and the recommended alternative of effectively granting "Service Account User" and "Service Account Token Creator" does not work, what would be the correct role or set of roles for executing this command?
We are running the following version for the gcloud cli
Google Cloud SDK 232.0.0
alpha 2019.01.27
beta 2019.01.27
bq 2.0.40
core 2019.01.27
gsutil 4.35
kubectl 2019.01.27
Also, if this is not the correct forum to ask this type of question, please let me know which and I will be glad to move this to the correct location.

If this is a one-time operation, upload the image to a bucket and execute gcloud compute images import from Cloud Shell, which runs with your user permissions (likely owner). Reference the image in the shell as gs://my-bucket/my-image.vmdk
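The one-time flow can be sketched roughly as follows (bucket, file, and image names are placeholders, and the --os value depends on your source image):

```shell
# Upload the source disk file to a bucket you control (names are examples)
gsutil cp ./my-image.vmdk gs://my-bucket/my-image.vmdk

# Run the import from Cloud Shell, where your user credentials apply
gcloud compute images import my-imported-image \
    --source-file=gs://my-bucket/my-image.vmdk \
    --os=ubuntu-1804
```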
The instructions below will be necessary if you are forced to use a service account on a VM or another resource.
You'll need to (a) identify the active service account and (b) grant the roles/compute.admin role.
(a) Identify the service Account
On the system running gcloud compute images import, run this command to identify the active service account:
gcloud auth list
Credentialed Accounts
ACTIVE ACCOUNT
* SERVICE_ACCOUNT@googlexxx.com
(b) Add the roles/compute.admin role
You'll need to add the role roles/compute.admin (once working, switch to a less privileged role to follow the principle of least privilege).
Open a separate Google Cloud Shell or another shell where you are authenticated with an "owner" role.
Grant the roles/compute.admin role:
# replace this with the active service acct above
ACTIVE_SERVICE_ACCOUNT=SERVICE_ACCOUNT@googlexxx.com
gcloud projects add-iam-policy-binding $GOOGLE_CLOUD_PROJECT \
--member="serviceAccount:${ACTIVE_SERVICE_ACCOUNT}" \
--role=roles/compute.admin
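Once granted, you can verify that the binding took effect; a sketch using the same variables as above:

```shell
# List the roles currently bound to the service account at the project level
gcloud projects get-iam-policy $GOOGLE_CLOUD_PROJECT \
    --flatten="bindings[].members" \
    --format="table(bindings.role)" \
    --filter="bindings.members:${ACTIVE_SERVICE_ACCOUNT}"
```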

This is what worked for me (in my case, compute.admin was not enough):
# this project hosts the service account and the instance that the service account calls `gcloud compute images import ...` from.
worker_project=my-playground-for-building-stuff
# this project hosts your images (it can be the same project as ${worker_project} if that's how you roll)
image_project=my-awesome-custom-images
# this bucket will host resources required by, and artifacts created by cloudbuild during image creation (if you have already run `gcloud compute images import ...` as a normal user (not serviceaccount), then the bucket probably already exists in your ${image_project})
cloudbuild_bucket=${image_project}-daisy-bkt-us
# this is your service account in your ${worker_project}
service_account=my-busy-minion-who-loves-to-work@${worker_project}.iam.gserviceaccount.com
for legacy_role in legacyBucketReader legacyBucketWriter; do
  gsutil iam ch serviceAccount:${service_account}:${legacy_role} gs://${cloudbuild_bucket}
done
for role in editor compute.admin iam.serviceAccountTokenCreator iam.serviceAccountUser; do
  gcloud projects add-iam-policy-binding ${image_project} --member serviceAccount:${service_account} --role roles/${role}
done
for api in cloudbuild cloudresourcemanager; do
  gcloud services enable ${api}.googleapis.com --project ${worker_project}
done

Related

Creating a custom service account for Cloud Run using the gcloud CLI

Background
By default, Cloud Run uses the Compute Engine default service account, which grants a broad range of permissions that are not required by the container I'm trying to run, so I'd like to set up a new service account.
If I understand correctly, I'd need to do the following:
Create a role with the desired set of permissions (using gcloud iam roles create)
Create a service account (using gcloud iam service-accounts create)
Bind the role permissions to the service account.
Deploy an image with the service account set up in step 2 (using gcloud run deploy --service-account).
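The four steps above might be sketched like this (the role ID, permission list, service account name, and project ID are all hypothetical):

```shell
# 1. Create a custom role with only the permissions the container needs
gcloud iam roles create myRunRole --project=my-project \
    --permissions=storage.objects.get,storage.objects.list

# 2. Create the service account
gcloud iam service-accounts create my-run-sa --display-name="Cloud Run SA"

# 3. Bind the custom role to the service account at the project level
gcloud projects add-iam-policy-binding my-project \
    --member=serviceAccount:my-run-sa@my-project.iam.gserviceaccount.com \
    --role=projects/my-project/roles/myRunRole

# 4. Deploy with that service account as the runtime identity
gcloud run deploy my-service --image=gcr.io/my-project/helloworld \
    --service-account=my-run-sa@my-project.iam.gserviceaccount.com
```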
The aforementioned documentation doesn't mention how to achieve step 3. I found the gcloud iam service-accounts add-iam-policy-binding command, but I see this is a three-way binding between a user (member), a service account, and a role, whereas what I've described above seems to require only a two-way binding, with the permission grant to the Cloud Run service occurring in the fourth step.
Questions
Do I have the right understanding with regards to the steps required to set up a custom service account for Cloud Run to use?
Assuming I have understood this correctly, what would be the correct way to set up the binding of permissions with the service account?
You can use a custom role in addition to a user-managed service account, but it's not mandatory. You can also create a user-managed service account and bind it to predefined roles.
In any case, if you want to bind a custom role to a service account (or a user account; there is no difference), you have to use the fully qualified path for the role:
# Project level
projects/<projectID>/roles/<custom role name>
# Organization level
organizations/<organizationID>/roles/<custom role name>
And the gcloud command can be this one
gcloud projects add-iam-policy-binding <projectID> \
--member=serviceAccount:<service account email> \
--role=projects/<projectID>/roles/<custom role name>

How can we copy big folder from google cloud storage to compute engine (windows server 2019) hosted on google cloud platform using gsutil command?

I have saved BI tool setup files in a folder on Google Cloud Storage. We have a Windows VM created on GCP where I want to move this folder containing all the setup files (around 60 GB) from Google Cloud Storage using the gsutil command, but it is throwing an error.
I am using the command below:
gsutil cp -r gs://bucket-name/folder-name C:\Users\user-name\
I am getting this error: AccessDeniedException: 403 sa-d-edw-ce-cognosserver@prj-edw-d-edw-7f58.iam.gserviceaccount.com does not have storage.objects.list access to the Google Cloud Storage bucket.
Can someone please help me understand where I am making a mistake?
There are two likely problems:
The CLI is using an identity that does not possess the required permissions.
The Compute Engine instance has restricted the permissions via scopes or has disabled scopes preventing all API access.
Modifying IAM roles requires permissions on your own account as well. Otherwise, you will need to contact an administrator for the organization or project.
The CLI gsutil is using an identity (either a user or service account). That identity does not have an IAM role attached that contains the IAM permission storage.objects.list.
There are a number of IAM roles that have that permission. If you only need to list objects, the role Storage Legacy Bucket Reader (roles/storage.legacyBucketReader) is enough; note that downloading object data with gsutil cp also requires storage.objects.get, which the Storage Object Viewer role (roles/storage.objectViewer) provides. The following link provides details on the available roles:
IAM roles for Cloud Storage
Your Google Compute Engine Windows VM instance has a service account attached to it. The Google Cloud CLI tools can use that service account, or the credentials from gcloud auth login, among other methods.
To complicate this a bit more, each Compute Engine instance has access scopes assigned that limit the service account's permissions. The default scopes allow Cloud Storage object read. You can look up or modify the assigned scopes in the Google Cloud Console. The following command outputs details on the VM, including the serviceAccounts[].scopes key:
gcloud compute instances describe INSTANCE_NAME --project PROJECT_ID --zone ZONE
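For example, to print only the scopes, or to change them (instance name, zone, and service account email are placeholders; the instance must be stopped before changing its service account or scopes):

```shell
# Show just the access scopes assigned to the VM
gcloud compute instances describe my-vm --zone=us-central1-a \
    --format="value(serviceAccounts[].scopes)"

# Adjust the scopes (run while the instance is stopped)
gcloud compute instances set-service-account my-vm --zone=us-central1-a \
    --service-account=SERVICE_ACCOUNT_EMAIL \
    --scopes=storage-ro
```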
Figure out which identity your VM is using
gcloud auth list
Add an IAM role to that identity
Windows command syntax.
For a service account:
gcloud projects add-iam-policy-binding PROJECT_ID ^
--member="serviceAccount:REPLACE_WITH_SERVICE_ACCOUNT_EMAIL_ADDRESS" ^
--role="roles/storage.legacyBucketReader"
For a user account:
gcloud projects add-iam-policy-binding PROJECT_ID ^
--member="user:REPLACE_WITH_USER_EMAIL_ADDRESS" ^
--role="roles/storage.legacyBucketReader"

Trigger Google Cloud SQL export within VM

tl;dr: Cannot trigger an export with gcloud sql export sql ... on a VM; it always results in PERMISSION_DENIED even though I think I have set all the permissions for its service account.
The whole problem actually sounds relatively simple. I want to trigger an export of my Cloud SQL database in my Google Cloud Compute VM at certain times.
What I did so far:
Added the Cloud SQL Admin (just for the sake of testing) permission to the VMs service account in the IAM section.
Created and downloaded the service account key and used gcloud auth activate-service-account --key-file cert.json
Ran the following command:
gcloud sql export sql "${SQL_INSTANCE}" "gs://${BUCKET}/${FILENAME}" -d "${DATABASE}"
(this works without a problem with my own, personal account)
The command resulted in the following error:
ERROR: (gcloud.sql.export.sql) PERMISSION_DENIED: Request had insufficient authentication scopes.
What else I tried
I found this article from Google and used the Compute Service Account instead of creating a Cloud Function Service Account. The result is sadly the same.
You do not have the roles assigned to the service account that you think you have.
You need one of the following roles assigned to the service account:
roles/owner (Not recommended)
roles/viewer (Not recommended)
roles/cloudsql.admin (Not recommended unless required for other SQL operations)
roles/cloudsql.editor (Not recommended unless required for other SQL operations)
roles/cloudsql.viewer (Recommended)
Go to the Google Cloud Console -> Compute Engine.
Click on your VM instance. Scroll down and find the service account assigned to your VM instance. Copy the service account email address.
Run the following command (replace \ with ^ on Windows, and specify your PROJECT ID (not project name) and the service account email address):
gcloud projects get-iam-policy <PROJECT_ID> \
--flatten="bindings[].members" \
--format="table(bindings.role)" \
--filter="bindings.members:<COMPUTE_ENGINE_SERVICE_ACCOUNT>"
Double-check that the roles you require are present in the output.
To list your projects to obtain the PROJECT ID:
gcloud projects list
Note: Do not assign permissions directly on the service account resource. Instead, add an IAM policy binding at the project level granting the required role to the service account member:
gcloud projects add-iam-policy-binding <PROJECT_ID> \
--member serviceAccount:<COMPUTE_ENGINE_SERVICE_ACCOUNT> \
--role roles/cloudsql.viewer
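Note that the quoted error mentions "insufficient authentication scopes", so besides the IAM role it is worth checking the VM's access scopes: gcloud sql calls need the sqlservice.admin or cloud-platform scope. A sketch of the re-check, with placeholder instance, bucket, and database names:

```shell
# Check the VM's access scopes; the Cloud SQL Admin API requires
# https://www.googleapis.com/auth/sqlservice.admin or .../cloud-platform
gcloud compute instances describe my-vm --zone=us-central1-a \
    --format="value(serviceAccounts[].scopes)"

# Retry the export once the role and scopes are in place
gcloud sql export sql my-sql-instance gs://my-bucket/backup.sql.gz \
    --database=my-database
```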

Google Cloud Platform - AI Platform Instance Issue

I am trying to launch a notebook instance in AI platform but getting this error:
You are missing at least one of the following required permissions:
Project
compute.instances.list
But for the current project within the role as defined by project owner this permission has already been given apart from other compute instance permissions.
But still gives the permission error.
Thanks for help in advance
The service account used to create a notebook instance in Google AI Platform is the Compute Engine default service account, which has the primitive roles/editor role.
Permission: Compute Engine default service account
The Compute Engine default service account is created with the Cloud
IAM project editor role, but you can modify the service account's
roles to securely limit which Google APIs the service account can
access.
You can check that roles/editor includes compute.instances.list:
gcloud iam roles describe roles/editor | grep compute.instances.list
For troubleshooting check:
If you have the default compute service account:
gcloud iam service-accounts list | grep compute@developer.gserviceaccount.com
gcloud iam service-accounts describe your-project-number-compute@developer.gserviceaccount.com
Check the roles of the default compute service account:
gcloud projects get-iam-policy your-project --flatten="bindings[].members" --format='table(bindings.role)' --filter="bindings.members:your-project-number-compute@developer.gserviceaccount.com"
Assuming you are the owner of the project, you should be able to create a new notebook instance with the default compute engine service account.

What predefined IAM roles does a service account need to complete the Google Cloud Run Quickstart: Build and Deploy?

I want to compare Google Cloud Run to both Google App Engine and Google Cloud Functions. The Cloud Run Quickstart: Build and Deploy seems like a good starting point.
My Application Default Credentials are too broad to use during development. I'd like to use a service account, but I struggle to configure one that can complete the quickstart without error.
The question:
What is the least privileged set of predefined roles I can assign to a service account that must execute these commands without errors:
gcloud builds submit --tag gcr.io/{PROJECT-ID}/helloworld
gcloud beta run deploy --image gcr.io/{PROJECT-ID}/helloworld
The first command fails with a (seemingly spurious) error when run via a service account with two roles: Cloud Build Service Account and Cloud Run Admin. I haven't run the second command.
Edit: the error is not spurious. The command builds the image and copies it to the project's container registry, then fails to print the build log to the console (insufficient permissions).
Edit: I ran the second command. It fails with Permission 'iam.serviceaccounts.actAs' denied on {service-account}. I could resolve this by assigning the Service Account User role. But that allows the deploy command to act as the project's runtime service account, which has the Editor role by default. Creating a service account with (effectively) both Viewer and Editor roles isn't much better than using my Application Default Credentials.
So I should change the runtime service account permissions. The Cloud Run Service Identity docs have this to say about least privileged access configuration:
This changes the permissions for all services in a project, as well
as Compute Engine and Google Kubernetes Engine instances. Therefore,
the minimum set of permissions must contain the permissions required
for Cloud Run, Compute Engine, and Google Kubernetes Engine in a
project.
Unfortunately, the docs don't say what those permissions are or which set of predefined roles covers them.
What I've done so far:
Use the dev console to create a new GCP project
Use the dev console to create a new service account with the Cloud Run Admin role
Use the dev console to create (and download) a key for the service account
Create (and activate) a gcloud configuration for the project
$ gcloud config list
[core]
account = {service-account-name}#{project-id}.iam.gserviceaccount.com
disable_usage_reporting = True
project = {project-id}
[run]
region = us-central1
Activate the service account using the downloaded key
Use the dev console to enable the Cloud Run API
Use the dev console to enable Container Registry→Settings→Container Analysis API
Create a sample application and Dockerfile as instructed by the quickstart documentation
Run gcloud builds submit --tag gcr.io/[PROJECT-ID]/helloworld
...fails due to missing cloud build permissions
Add the Cloud Build Editor role to service account and resubmit build
...fails due to missing storage permissions. I didn't pay careful attention to what was missing.
Add the Storage Object Admin role to service account and resubmit build
...fails due to missing storage bucket permissions
Replace service account's Storage Object Admin role with the Storage Admin role and resubmit build
...fails with
Error: (gcloud.builds.submit) HTTPError 403:
<?xml version='1.0' encoding='UTF-8'?>
<Error>
<Code>AccessDenied</Code>
<Message>Access denied.</Message>
<Details>
{service-account-name} does not have storage.objects.get access to
{number}.cloudbuild-logs.googleusercontent.com/log-{uuid}.txt.</Details>
</Error>
Examine the set of available roles and the project's automatically created service accounts. Realize that the Cloud Build Service Account role has many more permissions than the Cloud Build Editor role. This surprised me; the legacy Editor role has "Edit access to all resources".
Remove the Cloud Build Editor and Storage Admin roles from service account
Add the Cloud Build Service Account role to service account and resubmit build
...fails with the same HTTP 403 error (missing get access for a log file)
Check Cloud Build→History in the dev console; find successful builds!
Check Container Registry→Images in the dev console; find images!
At this point I think I could finish Google Cloud Run Quickstart: Build and Deploy. But I don't want to proceed with (seemingly spurious) error messages in my build process.
Cloud Run PM here:
We can break this down into the two sets of permissions needed:
# build a container image
gcloud builds submit --tag gcr.io/{PROJECT_ID}/helloworld
You'll need:
Cloud Build Editor and Cloud Build Viewer (as per @wlhee)
# deploy a container image
gcloud beta run deploy --image gcr.io/{PROJECT_ID}/helloworld
You need to do two things:
Grant your service account the Cloud Run Deployer role (if you want to change the IAM policy, say to deploy the service publicly, you'll need Cloud Run Admin).
Follow the Additional Deployment Instructions to grant that service account the ability to act as the runtime service account
#1
gcloud projects add-iam-policy-binding PROJECT_ID \
--member="serviceAccount:{service-account-name}#{project-id}.iam.gserviceaccount.com" \
--role="roles/run.developer"
#2
gcloud iam service-accounts add-iam-policy-binding \
PROJECT_NUMBER-compute@developer.gserviceaccount.com \
--member="serviceAccount:{service-account-name}#{project-id}.iam.gserviceaccount.com" \
--role="roles/iam.serviceAccountUser"
EDIT: As noted, the latter grants your service account the ability to actAs the runtime service account. What role this service account has is dependent on what it needs to access: if the only thing Run/GKE/GCE accesses is GCS, then give it something like Storage Object Viewer instead of Editor. We are also working on per-service identities, so you can create a service account and "override" the default with something that has least-privilege.
According to https://cloud.google.com/cloud-build/docs/securing-builds/set-service-account-permissions
"Cloud Build Service Account" - Cloud Build executes your builds using a service account, a special Google account that executes builds on your behalf.
In order to call
gcloud builds submit --tag gcr.io/path
Edit:
Please grant the "Cloud Build Editor" and "Viewer" roles to the service account that starts the build; this is due to the current Cloud Build authorization model.
Sorry for the inconvenience.