What is the GCP Service Account role to create Cloud Scheduler jobs? - google-cloud-platform

I have a service account that has the usual roles to deploy images to Cloud Run instances and all is working fine. This is done through a GitHub Actions workflow that sets up gcloud using the provided setup-gcloud action.
I wrote a bash script that runs some gcloud commands to create the Cloud Scheduler jobs required for my project.
#!/bin/bash
# Create a Cloud Scheduler job only if it does not already exist.
# Args: $1=job name, $2=schedule, $3=base URI, $4=path, $5=HTTP method
run_command() {
  if gcloud scheduler jobs describe "$1" --location=europe-west1; then
    echo "Job $1 already exists, skipping..."
  else
    echo "Creating scheduler job $1 with schedule $2 and uri $3$4..."
    gcloud scheduler jobs create http "$1" --location=europe-west1 \
      --schedule "$2" \
      --uri "$3$4" \
      --http-method "$5" \
      --time-zone "Europe/Rome"
  fi
}
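For reference, a hypothetical invocation of the helper might look like this (the job name, schedule, URI parts, and method are placeholders, not values from the actual workflow):
# placeholders only: job name, cron schedule, base URI, path, HTTP method
run_command "my-job" "30 * * * *" "https://my-service-abc123-ew.a.run.app" "/tasks/run" "POST"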
When running this in my workflow, the Action returns this error:
ERROR: (gcloud.scheduler.jobs.describe) PERMISSION_DENIED: The principal (user or service account) lacks IAM permission "cloudscheduler.jobs.get" for the resource "projects/***/locations/europe-west1/jobs/***" (or the resource may not exist).
Creating scheduler job *** with schedule 30 * * * * and uri ***...
ERROR: (gcloud.scheduler.jobs.create.http) PERMISSION_DENIED: The principal (user or service account) lacks IAM permission "cloudscheduler.jobs.create" for the resource "projects/***/locations/europe-west1" (or the resource may not exist).
The service account used to auth the gcloud instance inside the Action has the Cloud Scheduler Service Agent role assigned, which does not have the required permissions to do what I need.
The problem is that when trying to grant the Cloud Scheduler Admin role to this service account, it does not appear to be available in the list.
Do I have to create a custom role or am I doing something wrong with my service account usage?
Screenshot to show that the role doesn't appear in the list:

It's quite odd that the role is not appearing in the list, but you can make use of gcloud to attach the admin role to the service account:
gcloud iam service-accounts add-iam-policy-binding \
test-proj1@example.domain.com \
--member='serviceAccount:test-proj1@example.domain.com' \
--role='roles/cloudscheduler.admin'
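Note that to create jobs in a project, roles/cloudscheduler.admin is normally granted on the project itself rather than on the service account resource; a minimal sketch with a placeholder project ID:
gcloud projects add-iam-policy-binding my-project-id \
  --member='serviceAccount:test-proj1@example.domain.com' \
  --role='roles/cloudscheduler.admin'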
EDIT:
It is possible that the role has somehow been disabled; to restore it, you can follow these steps:
Open the Google Cloud Console and navigate to the IAM & Admin page for your project.
In the IAM & Admin page, click on the "Roles" tab.
Search for the "Cloud Scheduler Admin" role using the search bar or by scrolling through the list of roles.
Once you have found the "Cloud Scheduler Admin" role, click on the "Restore" button next to it.
A confirmation window will appear. Click on the "Restore" button to confirm that you want to restore the role.
Once you have confirmed, the role will be restored and you will be able to assign it to users or service accounts as needed.
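Independently of what the console shows, you can also verify from the CLI that the predefined role exists and inspect its permissions, for example:
gcloud iam roles describe roles/cloudscheduler.admin
gcloud iam roles list | grep cloudscheduler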

As per this document, the Cloud Scheduler Service Agent role is enough for a service account to schedule a job using Cloud Scheduler. In addition, if you are using Cloud Run to perform the job, you also need to grant the Cloud Run Service Agent role. If you are using Cloud Functions instead, you need to grant the Cloud Functions Service Agent role.
In your invoke method, try passing the service account email with the --oidc-service-account-email flag:
$ gcloud scheduler jobs create http HELLO \
--schedule="30 * * * *" --uri=$URI \
--oidc-service-account-email=CLIENT_SERVICE_ACCOUNT_EMAIL
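A fuller sketch for a Cloud Run target (job name, URI, and service account email are placeholders; --oidc-token-audience typically matches the receiving service's URL):
$ gcloud scheduler jobs create http my-job \
  --location=europe-west1 \
  --schedule="30 * * * *" \
  --uri="https://my-service-abc123-ew.a.run.app/tasks/run" \
  --http-method=POST \
  --oidc-service-account-email=scheduler-invoker@my-project.iam.gserviceaccount.com \
  --oidc-token-audience="https://my-service-abc123-ew.a.run.app"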
Refer to this document to learn more about Cloud Scheduler with HTTP authentication.
In case the normal scheduler commands don't work, try the gcloud alpha scheduler and gcloud beta scheduler command groups as well.
For more detailed information about Cloud Scheduler, refer to these documents: DOC1 and DOC2.

Related

Creating a custom service account for Cloud Run using the gcloud CLI

Background
By default, Cloud Run uses the Compute Engine default service account, which grants a broad range of permissions that are not required by the container I'm trying to run, so I'd like to set up a new service account.
If I understand correctly, I'd need to do the following:
Create a role with the desired set of permissions (using gcloud iam roles create)
Create a service account (using gcloud iam service-accounts create)
Bind the role permissions to the service account.
Deploy an image with the service account set up in step 2 (using gcloud run deploy --service-account).
The aforementioned documentation doesn't mention how to achieve step 3. I found the gcloud iam service-accounts add-iam-policy-binding command, but I see this is a three-way binding between a user (member), a service account and a role, whereas what I've described above seems to require only a two-way binding, with the permission grant to the Cloud Run service occurring in the fourth step.
Questions
Do I have the right understanding with regards to the steps required to set up a custom service account for Cloud Run to use?
Assuming I have understood this correctly, what would be the correct way to set up the binding of permissions with the service account?
You can use a custom role in addition to a user-managed service account, but it's not mandatory. You can also create a user-managed service account and bind it to predefined roles.
In any case, if you want to bind a custom role to a service account (or a user account, there is no difference), you have to use the fully qualified path of the role:
# Project level
projects/<projectID>/roles/<custom role name>
# Organization level
organizations/<organizationID>/roles/<custom role name>
And the gcloud command can be this one
gcloud projects add-iam-policy-binding <projectID> \
--member=serviceAccount:<service account email> \
--role=projects/<projectID>/roles/<custom role name>
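For completeness, a sketch of the whole flow from the question's four steps, with placeholder names (project, role ID, service account, image) and an example permission set:
# 1. Create a custom role (example permissions only)
gcloud iam roles create myRunRole --project=my-project \
  --permissions=storage.objects.get,storage.objects.list

# 2. Create the user-managed service account
gcloud iam service-accounts create my-run-sa --project=my-project

# 3. Bind the custom role (or a predefined role) to the service account at project level
gcloud projects add-iam-policy-binding my-project \
  --member=serviceAccount:my-run-sa@my-project.iam.gserviceaccount.com \
  --role=projects/my-project/roles/myRunRole

# 4. Deploy the Cloud Run service with that identity
gcloud run deploy my-service --image=gcr.io/my-project/my-image \
  --service-account=my-run-sa@my-project.iam.gserviceaccount.com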

(gcloud.dataflow.flex-template.build) PERMISSION_DENIED: The caller does not have permission

I'm trying to build a flex-template image using a service account:
gcloud dataflow flex-template build "$TEMPLATE_PATH" \
--image-gcr-path "$TEMPLATE_IMAGE" \
--sdk-language "JAVA" \
--flex-template-base-image JAVA11 \
--metadata-file "metadata.json" \
--jar "target/XXX.jar" \
--env FLEX_TEMPLATE_JAVA_MAIN_CLASS="XXX"
The service account has the following roles:
"roles/appengine.appAdmin",
"roles/bigquery.admin",
"roles/cloudfunctions.admin",
"roles/cloudtasks.admin",
"roles/compute.viewer",
"roles/container.admin",
"roles/dataproc.admin",
"roles/iam.securityAdmin",
"roles/iam.serviceAccountAdmin",
"roles/iam.serviceAccountUser",
"roles/iam.roleAdmin",
"roles/resourcemanager.projectIamAdmin",
"roles/pubsub.admin",
"roles/serviceusage.serviceUsageAdmin",
"roles/servicemanagement.admin",
"roles/spanner.admin",
"roles/storage.admin",
"roles/storage.objectAdmin",
"roles/firebase.admin",
"roles/cloudconfig.admin",
"roles/vpcaccess.admin",
"roles/compute.instanceAdmin.v1",
"roles/dataflow.admin",
"roles/dataflow.serviceAgent"
However, even with the dataflow.admin and dataflow.serviceAgent roles, my service account is still unable to perform this task.
The documentation https://cloud.google.com/dataflow/docs/guides/templates/using-flex-templates advises to grant the roles/owner role to the service account, but I'm hesitant to do that as this is meant to be part of a CI/CD pipeline and giving a service account an owner role doesn't really make sense to me unless I'm completely wrong.
Is there any way to circumvent this issue without granting the owner role to the service account?
I just ran into the exact same issue and spent a few hours figuring this out. We use a Terraform-managed service account as well. As you mentioned, there are two main issues: service account access and build logs access.
By default, Cloud Build uses a default service account of the form [project_number]@cloudbuild.gserviceaccount.com, so you need to grant permissions to this service account to write to the GCS bucket backing the GCR container registry. I granted roles/storage.admin to my service account.
Like you mentioned, by default again, Cloud Build saves the logs at gs://[project_number].cloudbuild-logs.googleusercontent.com. This seems to be a hidden bucket in the project; at least I could not see it. In addition, you can't configure google_storage_bucket_iam_member for it; instead, the recommendation as per this doc is to give roles/viewer at the project level to the service account running the gcloud dataflow ... command.
I was able to run the command successfully after the above changes.
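For reference, the two grants could be expressed roughly like this (project ID and emails are placeholders; the first member is the account that needs write access to the registry bucket, the second is the account running the gcloud dataflow command):
# write access for the account pushing the container image
gcloud projects add-iam-policy-binding my-project \
  --member="serviceAccount:ci-builder@my-project.iam.gserviceaccount.com" \
  --role="roles/storage.admin"

# project-level Viewer so the hidden build-logs bucket can be read
gcloud projects add-iam-policy-binding my-project \
  --member="serviceAccount:ci-builder@my-project.iam.gserviceaccount.com" \
  --role="roles/viewer"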

What predefined IAM roles does a service account need to complete the Google Cloud Run Quickstart: Build and Deploy?

I want to compare Google Cloud Run to both Google App Engine and Google Cloud Functions. The Cloud Run Quickstart: Build and Deploy seems like a good starting point.
My Application Default Credentials are too broad to use during development. I'd like to use a service account, but I struggle to configure one that can complete the quickstart without error.
The question:
What is the least privileged set of predefined roles I can assign to a service account that must execute these commands without errors:
gcloud builds submit --tag gcr.io/{PROJECT-ID}/helloworld
gcloud beta run deploy --image gcr.io/{PROJECT-ID}/helloworld
The first command fails with a (seemingly spurious) error when run via a service account with two roles: Cloud Build Service Account and Cloud Run Admin. I haven't run the second command.
Edit: the error is not spurious. The command builds the image and copies it to the project's container registry, then fails to print the build log to the console (insufficient permissions).
Edit: I ran the second command. It fails with Permission 'iam.serviceaccounts.actAs' denied on {service-account}. I could resolve this by assigning the Service Account User role. But that allows the deploy command to act as the project's runtime service account, which has the Editor role by default. Creating a service account with (effectively) both Viewer and Editor roles isn't much better than using my Application Default Credentials.
So I should change the runtime service account permissions. The Cloud Run Service Identity docs have this to say about least privileged access configuration:
This changes the permissions for all services in a project, as well
as Compute Engine and Google Kubernetes Engine instances. Therefore,
the minimum set of permissions must contain the permissions required
for Cloud Run, Compute Engine, and Google Kubernetes Engine in a
project.
Unfortunately, the docs don't say what those permissions are or which set of predefined roles covers them.
What I've done so far:
Use the dev console to create a new GCP project
Use the dev console to create a new service account with the Cloud Run Admin role
Use the dev console to create (and download) a key for the service account
Create (and activate) a gcloud configuration for the project
$ gcloud config list
[core]
account = {service-account-name}@{project-id}.iam.gserviceaccount.com
disable_usage_reporting = True
project = {project-id}
[run]
region = us-central1
Activate the service account using the downloaded key
Use the dev console to enable the Cloud Run API
Use the dev console to enable Container Registry→Settings→Container Analysis API
Create a sample application and Dockerfile as instructed by the quickstart documentation
Run gcloud builds submit --tag gcr.io/[PROJECT-ID]/helloworld
...fails due to missing cloud build permissions
Add the Cloud Build Editor role to service account and resubmit build
...fails due to missing storage permissions. I didn't pay careful attention to what was missing.
Add the Storage Object Admin role to service account and resubmit build
...fails due to missing storage bucket permissions
Replace service account's Storage Object Admin role with the Storage Admin role and resubmit build
...fails with
Error: (gcloud.builds.submit) HTTPError 403:
<?xml version='1.0' encoding='UTF-8'?>
<Error>
<Code>AccessDenied</Code>
<Message>Access denied.</Message>
<Details>
{service-account-name} does not have storage.objects.get access to
{number}.cloudbuild-logs.googleusercontent.com/log-{uuid}.txt.</Details>
</Error>
Examine the set of available roles and the project's automatically created service accounts. Realize that the Cloud Build Service Account role has many more permissions than the Cloud Build Editor role. This surprised me; the legacy Editor role has "Edit access to all resources".
Remove the Cloud Build Editor and Storage Admin roles from service account
Add the Cloud Build Service Account role to service account and resubmit build
...fails with the same HTTP 403 error (missing get access for a log file)
Check Cloud Build→History in the dev console; find successful builds!
Check Container Registry→Images in the dev console; find images!
At this point I think I could finish Google Cloud Run Quickstart: Build and Deploy. But I don't want to proceed with (seemingly spurious) error messages in my build process.
Cloud Run PM here:
We can break this down into the two sets of permissions needed:
# build a container image
gcloud builds submit --tag gcr.io/{PROJECT_ID}/helloworld
You'll need:
Cloud Build Editor and Cloud Build Viewer (as per @wlhee)
# deploy a container image
gcloud beta run deploy --image gcr.io/{PROJECT_ID}/helloworld
You need to do two things:
Grant your service account the Cloud Run Developer role (roles/run.developer); if you want to change the IAM policy, say to deploy the service publicly, you'll need Cloud Run Admin.
Follow the Additional Deployment Instructions to grant that service account the ability to act as the runtime service account when deploying your service.
#1
gcloud projects add-iam-policy-binding PROJECT_ID \
--member="serviceAccount:{service-account-name}@{project-id}.iam.gserviceaccount.com" \
--role="roles/run.developer"
#2
gcloud iam service-accounts add-iam-policy-binding \
PROJECT_NUMBER-compute@developer.gserviceaccount.com \
--member="serviceAccount:{service-account-name}@{project-id}.iam.gserviceaccount.com" \
--role="roles/iam.serviceAccountUser"
EDIT: As noted, the latter grants your service account the ability to actAs the runtime service account. What role this service account has is dependent on what it needs to access: if the only thing Run/GKE/GCE accesses is GCS, then give it something like Storage Object Viewer instead of Editor. We are also working on per-service identities, so you can create a service account and "override" the default with something that has least-privilege.
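For example, with a recent gcloud, creating and using a dedicated low-privilege runtime identity instead of the Compute Engine default might look like this (names are placeholders):
gcloud iam service-accounts create helloworld-runtime --project=my-project
gcloud run deploy helloworld \
  --image=gcr.io/my-project/helloworld \
  --service-account=helloworld-runtime@my-project.iam.gserviceaccount.com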
According to https://cloud.google.com/cloud-build/docs/securing-builds/set-service-account-permissions
"Cloud Build Service Account" - Cloud Build executes your builds using a service account, a special Google account that executes builds on your behalf.
In order to call
gcloud builds submit --tag gcr.io/path
Edit:
Please "Cloud Build Editor" and "Viewer" your service account that starts the build, it's due to the current Cloud Build authorization model.
Sorry for the inconvenience.
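In gcloud terms, those grants might look roughly like this (project ID and service account email are placeholders):
gcloud projects add-iam-policy-binding my-project \
  --member="serviceAccount:builder@my-project.iam.gserviceaccount.com" \
  --role="roles/cloudbuild.builds.editor"

gcloud projects add-iam-policy-binding my-project \
  --member="serviceAccount:builder@my-project.iam.gserviceaccount.com" \
  --role="roles/viewer"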

IAM permissions to run "gcloud compute images import"

We are attempting to import an image into GCP with the following command
gcloud compute images import
under the context of a service account. When running this command, the message states that it wants to elevate the permissions of the service account to a "Service Account Actor". This role is deprecated (see https://cloud.google.com/iam/docs/service-accounts#the_service_account_actor_role), and the recommended alternative of granting the service account the "Service Account User" and "Service Account Token Creator" roles does not work. What would be the correct role or set of roles for executing this command?
We are running the following version for the gcloud cli
Google Cloud SDK 232.0.0
alpha 2019.01.27
beta 2019.01.27
bq 2.0.40
core 2019.01.27
gsutil 4.35
kubectl 2019.01.27
Also, if this is not the correct forum to ask this type of question, please let me know which and I will be glad to move this to the correct location.
If this is a one-time operation, upload the image to a bucket and execute gcloud compute images import from Cloud Shell, which will execute using your user permissions (likely owner). Reference the image in the shell like gs://my-bucket/my-image.vmd
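A rough sketch of that one-time flow from Cloud Shell (bucket, file name, and --os value are placeholders for your own image):
gsutil cp my-image.vmdk gs://my-bucket/
gcloud compute images import my-imported-image \
  --source-file=gs://my-bucket/my-image.vmdk \
  --os=ubuntu-1804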
The instructions below will be necessary if you are forced to use a service account on a VM or another resource.
You'll need to (a) identify the active service account and (b) grant the roles/compute.admin role.
(a) Identify the service Account
On the system running gcloud compute images import, run this command to identify the active service account:
gcloud auth list
Credentialed Accounts
ACTIVE ACCOUNT
* SERVICE_ACCOUNT@googlexxx.com
(b) Add the roles/compute.admin role
You'll need to add the role roles/compute.admin (once it's working, switch to a less privileged role following the principle of least privilege)
Open a separate Google Cloud Shell or another shell where you are authenticated with an "owner" role.
Grant the roles/compute.admin role:
# replace this with the active service account identified above
ACTIVE_SERVICE_ACCOUNT=SERVICE_ACCOUNT@googlexxx.com
gcloud projects add-iam-policy-binding $GOOGLE_CLOUD_PROJECT \
--member="serviceAccount:${ACTIVE_SERVICE_ACCOUNT}" \
--role=roles/compute.admin
This is what worked for me (in my case, compute.admin was not enough):
# this project hosts the service account and the instance that the service account calls `gcloud compute images import ...` from.
worker_project=my-playground-for-building-stuff
# this project hosts your images (it can be the same project as ${worker_project} if that's how you roll)
image_project=my-awesome-custom-images
# this bucket will host resources required by, and artifacts created by cloudbuild during image creation (if you have already run `gcloud compute images import ...` as a normal user (not serviceaccount), then the bucket probably already exists in your ${image_project})
cloudbuild_bucket=${image_project}-daisy-bkt-us
# this is your service account in your ${worker_project}
service_account=my-busy-minion-who-loves-to-work@${worker_project}.iam.gserviceaccount.com
for legacy_role in legacyBucketReader legacyBucketWriter; do
gsutil iam ch serviceAccount:${service_account}:${legacy_role} gs://${cloudbuild_bucket}
done
for role in editor compute.admin iam.serviceAccountTokenCreator iam.serviceAccountUser; do
gcloud projects add-iam-policy-binding ${image_project} --member serviceAccount:${service_account} --role roles/${role}
done
for api in cloudbuild cloudresourcemanager; do
gcloud services enable ${api}.googleapis.com --project ${worker_project}
done

gcloud: The user does not have access to service account "default"

I am attempting to use an activated service account scoped to create and delete gcloud container clusters (k8s clusters), using the following commands:
gcloud config configurations create my-svc-account \
--no-activate \
--project myProject
gcloud auth activate-service-account my-svc-account@my-project.iam.gserviceaccount.com \
--key-file=/path/to/keyfile.json \
--configuration my-svc-account
gcloud container clusters create a-new-cluster \
--configuration my-svc-account \
--project my-project \
--zone "my-zone"
I always receive the error:
...ERROR: (gcloud.container.clusters.create) ResponseError: code=400, message=The user does not have access to service account "default".
How do I grant my-svc-account access to the default service account for GKE?
After talking to Google Support, the issue was that the service account did not have the "Service Account User" role. Adding "Service Account User" resolves this error.
Add the following role to the service account that performs the operation:
Service Account User
Also see:
https://cloud.google.com/kubernetes-engine/docs/how-to/iam#service_account_user
https://cloud.google.com/iam/docs/service-accounts#the_service_account_user_role
https://cloud.google.com/iam/docs/understanding-roles
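Concretely, granting Service Account User on the Compute Engine default service account might look like this (project number, project ID, and caller email are placeholders):
gcloud iam service-accounts add-iam-policy-binding \
  123456789012-compute@developer.gserviceaccount.com \
  --member="serviceAccount:my-svc-account@my-project.iam.gserviceaccount.com" \
  --role="roles/iam.serviceAccountUser"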
For those that ended up here trying to do an Import of Firebase Firestore documents with a command such as:
gcloud beta firestore import --collection-ids='collectionA','collectionB' gs://YOUR_BUCKET
I got around the issue by doing the following:
From the Google Cloud Console Storage Bucket Browser, add the service account completing the operation to the list of members with a role of Storage Admin.
Re-attempt the operation.
For security, I revoked the role after the operation completed, but that's optional.
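The same grant (and the optional revoke afterwards) can also be done from the CLI instead of the console; the service account email is a placeholder:
gsutil iam ch \
  serviceAccount:importer@my-project.iam.gserviceaccount.com:roles/storage.admin \
  gs://YOUR_BUCKET
# revoke after the import completes (optional)
gsutil iam ch -d \
  serviceAccount:importer@my-project.iam.gserviceaccount.com:roles/storage.admin \
  gs://YOUR_BUCKET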
iam.serviceAccounts.actAs is the exact permission you need from Service Account User
I was getting the The user does not have access to service account... error even though I added the Service Account User role as others have suggested. What I was missing was the organization policy that prevented service account impersonation across projects. This is explained in the docs: https://cloud.google.com/iam/docs/impersonating-service-accounts#enabling-cross-project
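If you suspect that policy, one way to check it from the CLI (assuming the boolean constraint iam.disableCrossProjectServiceAccountUsage is the one in play; the project ID is a placeholder):
gcloud resource-manager org-policies describe \
  iam.disableCrossProjectServiceAccountUsage \
  --project=my-project --effective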
Adding the Service Account User role to the service account worked for me.