I've previously only used gen 1 Cloud Functions, but I now plan to move to the 2nd generation and am just trying to deploy/test a first basic function. I'm taking the Google sample for a storage-triggered function and trying to deploy it, but it keeps failing.
This is what it looks like:
> gcloud functions deploy nodejs-finalize-function --gen2 --runtime=nodejs16 --project myproject --region=europe-west3 --source=. --entry-point=handleImage --trigger-event-filters='type=google.cloud.storage.object.v1.finalized' --trigger-event-filters='bucket=se_my_images'
Preparing function...done.
X Deploying function...
✓ [Build] Logs are available at [https://console.cloud.google.com/cloud-build/builds;region=europe-west3/a8355043-adf0-4485-a510-1d54b7e11111?project=123445666123]
✓ [Service]
✓ [Trigger]
- [ArtifactRegistry] Deleting function artifacts in Artifact Registry...
. [Healthcheck]
. [Triggercheck]
Failed.
ERROR: (gcloud.functions.deploy) OperationError: code=7, message=Creating trigger failed for projects/myproject/locations/europe-west3/triggers/nodejs-finalize-function-898863: The Cloud Storage service account for your bucket is unable to publish to Cloud Pub/Sub topics in the specified project.
To use GCS CloudEvent triggers, the GCS service account requires the Pub/Sub Publisher (roles/pubsub.publisher) IAM role in the specified project. (See https://cloud.google.com/eventarc/docs/run/quickstart-storage#before-you-begin)
The error looks easy to understand, but I have now added the Pub/Sub Publisher role to all my service accounts (the ones listed below) and I still keep getting the same error.
> gcloud iam service-accounts list --project myproject
DISPLAY NAME                          EMAIL                                                       DISABLED
firebase-adminsdk                     firebase-adminsdk-u2x33@myproject.iam.gserviceaccount.com  False
Default compute service account       930445666575-compute@developer.gserviceaccount.com         False
backend-dev                           backend-dev@myproject.iam.gserviceaccount.com              False
App Engine default service account    myproject@appspot.gserviceaccount.com                      False
I don't know how to move forward from here so I hope someone can help.
*** EDIT ***
I added the role to the listed service accounts in the GCP console, on the IAM > Permissions > View By Principal page, using the Edit Principal button to assign the additional Pub/Sub Publisher role. Note that I added the role to all of my listed service accounts, since I'm not 100% sure which one GCP uses for cloud deployment.
Since you are already using the gcloud CLI, I suggest you follow step 2, which says:
PROJECT_ID=$(gcloud config get-value project)
PROJECT_NUMBER=$(gcloud projects list --filter="project_id:$PROJECT_ID" --format='value(project_number)')
SERVICE_ACCOUNT=$(gsutil kms serviceaccount -p $PROJECT_NUMBER)
gcloud projects add-iam-policy-binding $PROJECT_ID \
--member serviceAccount:$SERVICE_ACCOUNT \
--role roles/pubsub.publisher
After these four commands, you should have no problems. I don't use the GCP console for IAM purposes, since all my IAM policies are applied by Terraform/Terragrunt.
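To double-check that the binding landed on the right account, you could list the roles bound to the storage service agent (a quick sketch reusing the variables from the commands above):
# List every role granted to the Cloud Storage service agent in the project;
# roles/pubsub.publisher should appear in the output.
gcloud projects get-iam-policy $PROJECT_ID \
--flatten="bindings[].members" \
--format="table(bindings.role)" \
--filter="bindings.members:$SERVICE_ACCOUNT"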
Related
I was trying to deploy an HTTP-triggered Cloud Function with Cloud Build using this configuration:
steps:
- name: 'gcr.io/cloud-builders/gcloud'
args:
- beta
- functions
- deploy
- myfunction
- --source=start_shopify_installation
- --trigger-http
- --region=europe-west1
- --runtime=nodejs14
- --allow-unauthenticated
- --ingress-settings=all
- --security-level=secure-always
- --set-secrets=env_1=secret_1:latest
I got an error saying Cloud Build couldn't set an IAM policy:
WARNING: Setting IAM policy failed, try:
gcloud alpha functions add-iam-policy-binding myfunction \
--region=europe-west1 \
--member=allUsers \
--role=roles/cloudfunctions.invoker
The function gets deployed, and when I check in the GCP console the allUsers member appears to have the Cloud Functions Invoker role, but the Authentication column does not show "Allow unauthenticated". When I go to invoke the function I get a "missing permissions" error.
When I execute the suggested command from my Cloud Shell it works just fine. However, if I add it as an extra step in my deployment configuration, that step fails.
I think my Cloud Build service account must be missing a role needed to make the function accessible without authentication. Currently it has these roles: Cloud Build Service Account, Cloud Functions Developer and Service Account User.
EDIT
I added the Project IAM Admin role to the Cloud Build service account and tried again.
Unfortunately, it didn't change anything.
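In case it's useful, the roles actually bound to the Cloud Build service account can be listed like this (a sketch; PROJECT_ID and PROJECT_NUMBER are placeholders):
# Show every role granted to the Cloud Build service account at project level.
gcloud projects get-iam-policy PROJECT_ID \
--flatten="bindings[].members" \
--format="table(bindings.role)" \
--filter="bindings.members:PROJECT_NUMBER@cloudbuild.gserviceaccount.com"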
I reproduced your error (warning) on my side and fixed it: I can see allUsers with the Cloud Functions Invoker role in the function's PERMISSIONS tab.
In fact, your Cloud Build service account needs the cloudfunctions.functions.setIamPolicy permission, so the solution is to replace the Cloud Functions Developer role with the Cloud Functions Admin role.
Use of the --allow-unauthenticated flag modifies IAM permissions. To ensure that unauthorized developers cannot modify function permissions, the user or service that is deploying the function must have the cloudfunctions.functions.setIamPolicy permission. This permission is included in both the Owner and Cloud Functions Admin roles.
Ref: https://cloud.google.com/functions/docs/securing/managing-access-iam#at_deployment
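A minimal sketch of that role swap from the command line, assuming the default Cloud Build service account (PROJECT_ID and PROJECT_NUMBER are placeholders):
# Grant Cloud Functions Admin, which includes cloudfunctions.functions.setIamPolicy.
gcloud projects add-iam-policy-binding PROJECT_ID \
--member="serviceAccount:PROJECT_NUMBER@cloudbuild.gserviceaccount.com" \
--role="roles/cloudfunctions.admin"
# Optionally remove the narrower role it replaces.
gcloud projects remove-iam-policy-binding PROJECT_ID \
--member="serviceAccount:PROJECT_NUMBER@cloudbuild.gserviceaccount.com" \
--role="roles/cloudfunctions.developer"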
tl;dr: I cannot trigger an export with gcloud sql export sql ... on a VM; it always results in PERMISSION_DENIED, even though I think I have set all the permissions for its service account.
The whole problem actually sounds relatively simple. I want to trigger an export of my Cloud SQL database in my Google Cloud Compute VM at certain times.
What I did so far:
Added the Cloud SQL Admin role (just for the sake of testing) to the VM's service account in the IAM section.
Created and downloaded the service account key and used gcloud auth activate-service-account --key-file cert.json
Ran the following command:
gcloud sql export sql "${SQL_INSTANCE}" "gs://${BUCKET}/${FILENAME}" -d "${DATABASE}"
(this works without a problem with my own, personal account)
The command resulted in the following error:
ERROR: (gcloud.sql.export.sql) PERMISSION_DENIED: Request had insufficient authentication scopes.
What else I tried
I found this article from Google and used the Compute Service Account instead of creating a Cloud Function Service Account. The result is sadly the same.
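For reference, a quick sketch of how to confirm which credentials gcloud is actually using on the VM (nothing project-specific assumed here):
# Show all credentialed accounts and mark the active one.
gcloud auth list
# Show the account the current configuration will use for API calls.
gcloud config get-value account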
You do not have the roles assigned to the service account that you think you have.
You need one of the following roles assigned to the service account:
roles/owner (Not recommended)
roles/viewer (Not recommended)
roles/cloudsql.admin (Not recommended unless required for other SQL operations)
roles/cloudsql.editor (Not recommended unless required for other SQL operations)
roles/cloudsql.viewer (Recommended)
Go to the Google Cloud Console -> Compute Engine.
Click on your VM instance. Scroll down and find the service account assigned to your VM instance. Copy the service account email address.
Run the following command, replacing \ with ^ on Windows and specifying your PROJECT ID (not PROJECT NAME) and the service account email address:
gcloud projects get-iam-policy <PROJECT_ID> \
--flatten="bindings[].members" \
--format="table(bindings.role)" \
--filter="bindings.members:<COMPUTE_ENGINE_SERVICE_ACCOUNT>"
Double-check that the roles you require are present in the output.
To list your projects to obtain the PROJECT ID:
gcloud projects list
Note: Do not assign permissions directly to the service account. Assign permissions at the project level, granting the required role to the service account IAM member:
gcloud projects add-iam-policy-binding <PROJECT_ID> \
--member serviceAccount:<COMPUTE_ENGINE_SERVICE_ACCOUNT> \
--role roles/cloudsql.viewer
I want to compare Google Cloud Run to both Google App Engine and Google Cloud Functions. The Cloud Run Quickstart: Build and Deploy seems like a good starting point.
My Application Default Credentials are too broad to use during development. I'd like to use a service account, but I struggle to configure one that can complete the quickstart without error.
The question:
What is the least privileged set of predefined roles I can assign to a service account that must execute these commands without errors:
gcloud builds submit --tag gcr.io/{PROJECT-ID}/helloworld
gcloud beta run deploy --image gcr.io/{PROJECT-ID}/helloworld
The first command fails with a (seemingly spurious) error when run via a service account with two roles: Cloud Build Service Account and Cloud Run Admin. I haven't run the second command.
Edit: the error is not spurious. The command builds the image and copies it to the project's container registry, then fails to print the build log to the console (insufficient permissions).
Edit: I ran the second command. It fails with Permission 'iam.serviceaccounts.actAs' denied on {service-account}. I could resolve this by assigning the Service Account User role. But that allows the deploy command to act as the project's runtime service account, which has the Editor role by default. Creating a service account with (effectively) both Viewer and Editor roles isn't much better than using my Application Default Credentials.
So I should change the runtime service account permissions. The Cloud Run Service Identity docs have this to say about least privileged access configuration:
This changes the permissions for all services in a project, as well as Compute Engine and Google Kubernetes Engine instances. Therefore, the minimum set of permissions must contain the permissions required for Cloud Run, Compute Engine, and Google Kubernetes Engine in a project.
Unfortunately, the docs don't say what those permissions are or which set of predefined roles covers them.
What I've done so far:
Use the dev console to create a new GCP project
Use the dev console to create a new service account with the Cloud Run Admin role
Use the dev console to create (and download) a key for the service account
Create (and activate) a gcloud configuration for the project
$ gcloud config list
[core]
account = {service-account-name}@{project-id}.iam.gserviceaccount.com
disable_usage_reporting = True
project = {project-id}
[run]
region = us-central1
Activate the service account using the downloaded key
Use the dev console to enable the Cloud Run API
Use the dev console to enable Container Registry→Settings→Container Analysis API
Create a sample application and Dockerfile as instructed by the quickstart documentation
Run gcloud builds submit --tag gcr.io/[PROJECT-ID]/helloworld
...fails due to missing cloud build permissions
Add the Cloud Build Editor role to service account and resubmit build
...fails due to missing storage permissions. I didn't pay careful attention to what was missing.
Add the Storage Object Admin role to service account and resubmit build
...fails due to missing storage bucket permissions
Replace service account's Storage Object Admin role with the Storage Admin role and resubmit build
...fails with
Error: (gcloud.builds.submit) HTTPError 403:
<?xml version='1.0' encoding='UTF-8'?>
<Error>
<Code>AccessDenied</Code>
<Message>Access denied.</Message>
<Details>
{service-account-name} does not have storage.objects.get access to
{number}.cloudbuild-logs.googleusercontent.com/log-{uuid}.txt.</Details>
</Error>
Examine the set of available roles and the project's automatically created service accounts. Realize that the Cloud Build Service Account role has many more permissions than Cloud Build Editor. This surprised me; the legacy Editor role has "Edit access to all resources".
Remove the Cloud Build Editor and Storage Admin roles from service account
Add the Cloud Build Service Account role to service account and resubmit build
...fails with the same HTTP 403 error (missing get access for a log file)
Check Cloud Build→History in the dev console; find successful builds!
Check Container Registry→Images in the dev console; find images!
At this point I think I could finish Google Cloud Run Quickstart: Build and Deploy. But I don't want to proceed with (seemingly spurious) error messages in my build process.
Cloud Run PM here:
We can break this down into the two sets of permissions needed:
# build a container image
gcloud builds submit --tag gcr.io/{PROJECT_ID}/helloworld
You'll need:
Cloud Build Editor and Cloud Build Viewer (as per @wlhee)
# deploy a container image
gcloud beta run deploy --image gcr.io/{PROJECT_ID}/helloworld
You need to do two things:
Grant your service account the Cloud Run Developer role (if you want to change the IAM policy, say to deploy the service publicly, you'll need Cloud Run Admin).
Follow the Additional Deployment Instructions to grant that service account the ability to act as the runtime service account.
#1
gcloud projects add-iam-policy-binding PROJECT_ID \
--member="serviceAccount:{service-account-name}#{project-id}.iam.gserviceaccount.com" \
--role="roles/run.developer"
#2
gcloud iam service-accounts add-iam-policy-binding \
PROJECT_NUMBER-compute@developer.gserviceaccount.com \
--member="serviceAccount:{service-account-name}@{project-id}.iam.gserviceaccount.com" \
--role="roles/iam.serviceAccountUser"
EDIT: As noted, the latter grants your service account the ability to actAs the runtime service account. What role this service account has is dependent on what it needs to access: if the only thing Run/GKE/GCE accesses is GCS, then give it something like Storage Object Viewer instead of Editor. We are also working on per-service identities, so you can create a service account and "override" the default with something that has least-privilege.
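For example, a minimal sketch of deploying with a dedicated, least-privilege runtime identity, assuming the --service-account flag is available in your gcloud version (service and account names are placeholders):
# Deploy using a dedicated runtime service account instead of the default
# compute service account.
gcloud run deploy helloworld \
--image gcr.io/PROJECT_ID/helloworld \
--service-account run-sa@PROJECT_ID.iam.gserviceaccount.com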
According to https://cloud.google.com/cloud-build/docs/securing-builds/set-service-account-permissions
"Cloud Build Service Account" - Cloud Build executes your builds using a service account, a special Google account that executes builds on your behalf.
In order to call
gcloud builds submit --tag gcr.io/path
Edit:
please grant the "Cloud Build Editor" and "Viewer" roles to the service account that starts the build; this is due to the current Cloud Build authorization model.
Sorry for the inconvenience.
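A sketch of those grants, assuming a user-managed service account submits the build (names are placeholders):
# Cloud Build Editor: lets the account start builds.
gcloud projects add-iam-policy-binding PROJECT_ID \
--member="serviceAccount:deployer@PROJECT_ID.iam.gserviceaccount.com" \
--role="roles/cloudbuild.builds.editor"
# Viewer: project-wide read access.
gcloud projects add-iam-policy-binding PROJECT_ID \
--member="serviceAccount:deployer@PROJECT_ID.iam.gserviceaccount.com" \
--role="roles/viewer"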
I am the owner of my newly created organization. I created a project under this organization and linked it to the organization's billing account, where I have $1000 in credits. Through the web UI I am able to spin up clusters, VMs, networks... but when I want to do so through gcloud, I get permission denied. E.g.:
$ gcloud compute networks list
API [compute.googleapis.com] not enabled on project [XXX].
Would you like to enable and retry (this will take a few minutes)?
(y/N)? y
ERROR: (gcloud.compute.networks.create) PERMISSION_DENIED: The caller does not have permission
but I can see in the GCP web UI that the API is clearly enabled (and can be used); it's just gcloud that won't let me work with it. The account under gcloud is exactly the same one I am using in the web console, as validated by gcloud auth list and:
$ gcloud config configurations describe myproject
is_active: true
name: myproject
properties:
compute:
region: europe-west1
zone: europe-west1-b
core:
account: <my-email>
project: <the-project-I-want>
or
$ gcloud services list
ERROR: (gcloud.services.list) User [<myusername>] does not have permission to access project [myproject] (or it may not exist): The caller does not have permission
It works totally fine with a different account (and different organization/projects), but I didn't set up that one in the past. What should I do? Thanks a lot!
UPDATE:
After gcloud init, at least gcloud services list started to work, but the rest did not:
$ gcloud services list
NAME                              TITLE
bigquery-json.googleapis.com      BigQuery API
cloudapis.googleapis.com          Google Cloud APIs
clouddebugger.googleapis.com      Stackdriver Debugger API
cloudtrace.googleapis.com         Stackdriver Trace API
compute.googleapis.com            Compute Engine API
container.googleapis.com          Kubernetes Engine API
containerregistry.googleapis.com  Container Registry API
datastore.googleapis.com          Cloud Datastore API
logging.googleapis.com            Stackdriver Logging API
monitoring.googleapis.com         Stackdriver Monitoring API
oslogin.googleapis.com            Cloud OS Login API
pubsub.googleapis.com             Cloud Pub/Sub API
servicemanagement.googleapis.com  Service Management API
serviceusage.googleapis.com       Service Usage API
sql-component.googleapis.com      Cloud SQL
storage-api.googleapis.com        Google Cloud Storage JSON API
storage-component.googleapis.com  Google Cloud Storage
$ gcloud compute networks create testing-net --subnet-mode=custom '--description=Network to host testing kubernetes cluster'
API [compute.googleapis.com] not enabled on project [{PROJECT_ID}].
Would you like to enable and retry (this will take a few minutes)?
(y/N)? y
ERROR: (gcloud.compute.networks.create) PERMISSION_DENIED: The caller does not have permission
^ the PROJECT_ID above shows my organization's ID, not the actual project under this org.
So the problem was that I used the wrong project ID with gcloud config set project, and gcloud defaulted to the organization for some reason.
I had to find the correct project ID using gcloud projects list and then run gcloud config set project {PROJECT-ID} (the project ID, not the project name!).
Alternatively, gcloud init will walk you through switching gcloud between projects and configure its settings to point at the right one.
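A compact sketch of the gcloud config set project route (PROJECT_ID is a placeholder):
# Find the correct project ID (first column), not the display name.
gcloud projects list
# Point the active configuration at that project.
gcloud config set project PROJECT_ID
# Confirm which project gcloud will now use.
gcloud config get-value project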
I am trying to deploy code from this repo:
https://github.com/anishkny/puppeteer-on-cloud-functions
in Google Cloud Build. My cloudbuild.yaml file contents are:
steps:
- name: 'gcr.io/cloud-builders/gcloud'
args: ['beta', 'functions', 'deploy', 'screenshot', '--trigger-http', '--runtime', 'nodejs8', '--memory', '1024MB']
I have given the following roles to my Cloud Build service account (****@cloudbuild.gserviceaccount.com):
Cloud Build Service Account
Cloud Functions Developer
Yet, in my Cloud Build log I see the following error:
starting build "1f04522c-fe60-4a25-a4a8-d70e496e2821"
FETCHSOURCE
Fetching storage object: gs://628906418368.cloudbuild-source.googleusercontent.com/94762cc396ed1bb46e8c5dbfa3fa42550140c2eb-b3cfa476-cb21-45ba-849c-c28423982a0f.tar.gz#1534532794239047
Copying gs://628906418368.cloudbuild-source.googleusercontent.com/94762cc396ed1bb46e8c5dbfa3fa42550140c2eb-b3cfa476-cb21-45ba-849c-c28423982a0f.tar.gz#1534532794239047...
/ [0 files][ 0.0 B/ 835.0 B]
/ [1 files][ 835.0 B/ 835.0 B]
Operation completed over 1 objects/835.0 B.
tar: Substituting `.' for empty member name
BUILD
Already have image (with digest): gcr.io/cloud-builders/gcloud
ERROR: (gcloud.beta.functions.deploy) ResponseError: status=[403], code=[Forbidden], message=[The caller does not have permission]
ERROR
ERROR: build step 0 "gcr.io/cloud-builders/gcloud" failed: exit status 1
What am I missing?
It would appear that the permissions changed when (perhaps) Cloud Functions went GA. Another customer raised this issue today and I recalled your question.
The Cloud Build robot (${NUM}@cloudbuild.gserviceaccount.com) additionally needs to be a serviceAccountUser of the ${PROJECT-ID}@appspot.gserviceaccount.com account:
NB: While the Cloud Build robot's local part is the project number (${NUM}), the appspot robot's local part is the project ID (${PROJECT}).
Please try:
PROJECT=[[YOUR-PROJECT-ID]]
NUM=$(gcloud projects describe $PROJECT --format='value(projectNumber)')
gcloud iam service-accounts add-iam-policy-binding \
${PROJECT}@appspot.gserviceaccount.com \
--member=serviceAccount:${NUM}@cloudbuild.gserviceaccount.com \
--role=roles/iam.serviceAccountUser \
--project=${PROJECT}
Let me know!
I struggled with this too after reading quite a bit of documentation. A combination of the above answers got me on the right track. Basically, something like the following is needed:
PROJECT=[PROJECT-NAME]
NUM=$(gcloud projects describe $PROJECT --format='value(projectNumber)')
gcloud iam service-accounts add-iam-policy-binding \
${PROJECT}@appspot.gserviceaccount.com \
--member=serviceAccount:${NUM}@cloudbuild.gserviceaccount.com \
--role=roles/iam.serviceAccountUser \
--project=${PROJECT}
gcloud iam service-accounts add-iam-policy-binding \
[INSERT_YOUR_IAM_OWNER_SERVICE_ACCOUNT_NAME]@${PROJECT}.iam.gserviceaccount.com \
--member="serviceAccount:service-${NUM}@gcf-admin-robot.iam.gserviceaccount.com" \
--role="roles/iam.serviceAccountUser"
Also, I added the "Cloud Functions Developer" role to my @cloudbuild.gserviceaccount.com account via the IAM console.
According to Cloud Build documentation, for Cloud Functions you have to grant the "Project Editor" role to your service account.
But Cloud Functions documentation states that, as an alternative to the Project Editor role, you can use "the Cloud Functions Developer role [but you have to] ensure that you have granted the Service Account User role". Regarding service accounts, it indicates that you should have "the CloudFunctions.ServiceAgent role on your project" and "have permissions for trigger sources, such as Pub/Sub or the Cloud Storage bucket triggering your function".
Given those considerations, my understanding is that the documentation omits the full list of roles your service account would need and instead goes straight to telling you to grant the Project Editor role.
You have to update the service account permissions on the Cloud Build settings page.
Here are the instructions: https://cloud.google.com/cloud-build/docs/deploying-builds/deploy-cloud-run#fully-managed
You just have to set the status of the Cloud Run Admin role to ENABLED on that page.
Start your Cloud Build with auth:
steps:
- name: 'gcr.io/cloud-builders/gcloud'
args: ['auth', 'activate-service-account', 'xoxox@xoxo-dev.iam.gserviceaccount.com', '--key-file=account.json', '--project=rabbito-dev']
and then simply deploy your code to Cloud Functions:
- name: 'gcr.io/cloud-builders/gcloud'
args: ['beta', 'functions', 'deploy', 'screenshot', '--trigger-http', '--runtime', 'nodejs8', '--memory', '1024MB']
Please add the 'Cloud Functions Service Agent' role to your service account alongside 'Cloud Functions Developer'.
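A sketch of that grant, assuming the default Cloud Build service account is the one deploying (PROJECT_ID and PROJECT_NUMBER are placeholders):
gcloud projects add-iam-policy-binding PROJECT_ID \
--member="serviceAccount:PROJECT_NUMBER@cloudbuild.gserviceaccount.com" \
--role="roles/cloudfunctions.serviceAgent"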