For a couple of days I have been experiencing a strange thing: my service account does not have permission to read SKUs from the GCP API.
This call:
curl -i -H "Authorization: Bearer <service account access token>" https://cloudbilling.googleapis.com/v1/services/6F81-5844-456A/skus?fields=skus,nextPageToken
returns
{
  "error": {
    "code": 403,
    "message": "The caller does not have permission",
    "status": "PERMISSION_DENIED"
  }
}
If I change the access token to a token from a normal (@gmail.com) account, it works, but with a service account I get a 403. My service accounts have all the permissions; I even gave the Owner role to one of them as a test, and it didn't work.
Any ideas? Has Google changed something regarding access from service accounts to these APIs?
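One thing worth checking (an assumption on my part, not confirmed by the error message): billing-account IAM is separate from project IAM, so project-level roles such as Owner grant nothing on the billing account itself. A sketch of granting the service account a viewer role on the billing account, with made-up IDs:

```shell
# Hypothetical billing account and service account IDs; substitute your own.
# Billing accounts carry their own IAM policy, separate from any project.
gcloud billing accounts add-iam-policy-binding 0X0X0X-0X0X0X-0X0X0X \
  --member="serviceAccount:my-sa@my-project.iam.gserviceaccount.com" \
  --role="roles/billing.viewer"
```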
Related
A workflow fails to start due to permission denied error when trying to impersonate a service account from different project
given:
Projects:
project1
project2
Service Accounts:
sa1@project1 with roles:
Workflows Admin
Cloud Run Admin
Service Account Token Creator
Service Account User
sa2@project2
Workflows:
workflow1 in project1 (creates a Cloud Run instance with serviceAccountName=sa2@project2)
Result:
{
  "body": {
    "error": {
      "code": 403,
      "message": "Permission 'iam.serviceaccounts.actAs' denied on service account sa2@project2 (or it may not exist).",
      "status": "PERMISSION_DENIED"
    }
  },
  "code": 403,
  "headers": {
    "Alt-Svc": "h3=\":443\"; ma=2592000,h3-29=\":443\"; ma=2592000,h3-Q050=\":443\"; ma=2592000,h3-Q046=\":443\"; ma=2592000,h3-Q043=\":443\"; ma=2592000,quic=\":443\"; ma=2592000; v=\"46,43\"",
    "Cache-Control": "private",
    "Content-Length": "244",
    "Content-Type": "application/json; charset=UTF-8",
    "Date": "Wed, 14 Sep 2022 10:53:24 GMT",
    "Server": "ESF",
    "Vary": "Origin",
    "X-Content-Type-Options": "nosniff",
    "X-Frame-Options": "SAMEORIGIN",
    "X-Xss-Protection": "0"
  },
  "message": "HTTP server responded with error code 403",
  "tags": [
    "HttpError"
  ]
}
The error message "Permission 'iam.serviceaccounts.actAs' denied on service account sa2@project2" indicates that a caller needs permission to impersonate a service account in order to attach that service account to a resource. In other words, the caller needs the iam.serviceAccounts.actAs permission on the service account.
There are several predefined roles that allow a principal to impersonate a service account:
Service Account User
Service Account Token Creator
Workload Identity User
Alternatively, you can grant a different predefined role, or a custom role, that includes permissions to impersonate service accounts.
Service Account User (roles/iam.serviceAccountUser): This role includes the iam.serviceAccounts.actAs permission, which allows principals to indirectly access all the resources that the service account can access. For example, if a principal has the Service Account User role on a service account, and the service account has the Cloud SQL Admin role (roles/cloudsql.admin) on the project, then the principal can impersonate the service account to create a Cloud SQL instance.
You can try granting the Service Account User role on sa2 to the service account that is trying to create the Cloud Run instance.
Refer to the link for more information on impersonating service accounts.
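As a concrete sketch (the sa1/sa2 names come from the setup above), the binding can be granted on sa2 itself with gcloud:

```shell
# Grant sa1 (the workflow's identity in project1) permission to act as
# sa2 in project2. The binding lives on sa2's own IAM policy, not the project's.
gcloud iam service-accounts add-iam-policy-binding \
  sa2@project2.iam.gserviceaccount.com \
  --project=project2 \
  --member="serviceAccount:sa1@project1.iam.gserviceaccount.com" \
  --role="roles/iam.serviceAccountUser"
```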
My client is a huge corporation.
Therefore the project-level configuration that allows service accounts to be attached across projects is disabled (the iam.disableCrossProjectServiceAccountUsage organization policy constraint is enforced).
This is the root cause of my problem and I cannot change it.
More information is available here:
https://cloud.google.com/iam/docs/impersonating-service-accounts#attaching-different-project
My workaround:
I needed this as it seemed the simplest way to access a BigQuery dataset in an external project.
Solution:
Export a private key for sa2@project2 and pass it as a secret to the application layer.
Use the key file to impersonate the sa2@project2 service account.
example:
engine = create_engine('bigquery://project2', location="asia-northeast1")
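A sketch of the key-export step with gcloud (the sa2/project names come from the setup above; the secret name is my own invention):

```shell
# Create a key for sa2 and store it where the application in project1
# can read it (Secret Manager here, as one option among several).
gcloud iam service-accounts keys create sa2-key.json \
  --iam-account=sa2@project2.iam.gserviceaccount.com
gcloud secrets create sa2-key --data-file=sa2-key.json --project=project1
```

Note that exported keys are long-lived credentials, so rotate them and restrict access to the secret accordingly.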
I've created Service Account A and granted roles Service Account Admin and Service Account Key Admin. I did this work in the GCP Console.
Service Account A's function is to create other service accounts programmatically, using the GCP Java SDK. It successfully creates new service accounts, but when it goes to create a key for the newly created service account, I get the following response:
{
  "code": 403,
  "errors": [
    {
      "domain": "global",
      "message": "Permission iam.serviceAccountKeys.create is required to perform this operation on service account projects/-/serviceAccounts/<new_service_account_name>@<project_id>.iam.gserviceaccount.com.",
      "reason": "forbidden"
    }
  ],
  "message": "Permission iam.serviceAccountKeys.create is required to perform this operation on service account projects/-/serviceAccounts/<new_service_account_name>@<project_id>.iam.gserviceaccount.com.",
  "status": "PERMISSION_DENIED"
}
I've tried waiting to see if perhaps I tried to create the key too soon after creating the service account, but waiting hours resulted in no change.
Service Account A can successfully create a key for itself, just not for other service accounts it creates.
How do I resolve this?
You have one of three problems:
Service Account A actually does not have the IAM role Service Account Key Admin in the project. Use the CLI command gcloud projects get-iam-policy and double-check.
Your code is using the wrong identity. You believe that you are using the service account but instead, another identity is being loaded by ADC (Application Default Credentials), or you made a mistake in your code.
You assigned the correct role, but on the service account instead of the project. Use the CLI command gcloud iam service-accounts get-iam-policy. If you find the role listed in the output, you assigned the role in the wrong place. Use the CLI command gcloud projects add-iam-policy-binding instead.
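The three checks can be run roughly like this (project and account names are placeholders):

```shell
# 1) Does the *project* policy grant Service Account Key Admin to A?
gcloud projects get-iam-policy MY_PROJECT \
  --flatten="bindings[].members" \
  --filter="bindings.role:roles/iam.serviceAccountKeyAdmin" \
  --format="value(bindings.members)"

# 2) Which identity is gcloud using? (A rough check only: code using
# Application Default Credentials may pick up a different identity.)
gcloud auth list

# 3) Was the role mistakenly granted on the service account resource
# instead of the project?
gcloud iam service-accounts get-iam-policy \
  service-account-a@MY_PROJECT.iam.gserviceaccount.com
```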
Note: there is also a fourth possibility that would prevent you from creating service account keys: organization policy constraints might be enabled:
Restricting service account usage
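If you suspect a constraint, you can inspect the effective policy on the project (the project name below is a placeholder; this assumes the newer gcloud org-policies command group is available):

```shell
# Check whether service account key creation is blocked by org policy.
gcloud org-policies describe iam.disableServiceAccountKeyCreation \
  --project=MY_PROJECT --effective
```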
I'm trying to trigger a Cloud Build; however, when invoked via curl with:
curl -X POST -T request.json -H "Authorization: Bearer ${gcloudBearer}" \
https://cloudbuild.googleapis.com/v1/projects/"$PROJECT_ID"/triggers/"$TRIGGER_ID":run
I get a response of:
{
  "error": {
    "code": 403,
    "message": "The caller does not have permission",
    "status": "PERMISSION_DENIED"
  }
}
However the service account I have authenticated with has the following roles enabled:
Cloud Build Service Account
Do I need any additional permission / role? If so which role would this be?
I had "quotes" around my project id by mistake (within the request.json file)... FML
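For anyone hitting the same thing: a quick way to spot stray quotes in a JSON file is jq's raw output, which prints string values without their JSON quoting, so embedded quotes stand out. (The file below is a made-up illustration, not Cloud Build's actual request schema.)

```shell
# A made-up request.json with one bad value and one clean one.
cat > request.json <<'EOF'
{"projectId": "\"my-project\"", "branchName": "main"}
EOF

# jq -r strips the JSON quoting, so a clean value prints bare...
jq -r '.branchName' request.json   # prints: main

# ...while embedded quotes survive, revealing the mistake.
jq -r '.projectId' request.json    # prints: "my-project"
```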
I am trying to call the IAM api using curl, specifically the organizations.roles.list method.
https://cloud.google.com/iam/reference/rest/v1/organizations.roles/list
From the docs, my request should be constructed like this:
https://iam.googleapis.com/v1/organizations/<org-id>/roles
However, calling it results in this error:
{
  "error": {
    "code": 404,
    "message": "Method ListRoles not found for service iam.googleapis.com",
    "status": "NOT_FOUND"
  }
}
Full request: curl -H "Authorization: Bearer $(gcloud auth print-access-token)" https://iam.googleapis.com/v1/organizations/<org-id>/roles
What am I doing wrong?
According to the docs, the endpoint https://iam.googleapis.com/v1/organizations/<ORG_ID>/roles is used to list the roles defined at the organization level (i.e. custom roles).
To get the list of default roles (pre-defined roles, curated roles, whatever you want to call them...), you must call the API without specifying any resource:
curl -H "Authorization: Bearer $(gcloud auth print-access-token)" https://iam.googleapis.com/v1/roles
So, to get the complete list of roles available on a resource (be it a project or an organization), you have to take the curated roles and aggregate them with the custom roles defined at the resource level and the custom roles defined on the parent resources. For a project, that means: curated roles + the project's custom roles + the parent organization's custom roles.
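The same aggregation can be done with gcloud (the project and organization IDs are placeholders):

```shell
# Curated/predefined roles (no resource specified):
gcloud iam roles list

# Custom roles defined in a project:
gcloud iam roles list --project=MY_PROJECT

# Custom roles defined in the parent organization:
gcloud iam roles list --organization=123456789
```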
As to the error you were receiving, I'm not receiving it now when testing it. I've run some tests and I'm receiving:
403 when I don't have the proper permissions
200 with empty response ({}) when there are no custom roles defined
200 with a list of roles as defined in the docs when there are custom roles defined in the resource
Since the question is from Jul '17, and the custom roles started their beta on Sep '17, I'm assuming you were too quick to test the API, and that's the reason for the 404 you were receiving.
So I am able to make a valid request to the Video Intelligence API with the sample video given in the quickstart: https://cloud.google.com/video-intelligence/docs/getting-started. I have tried many different ways of authenticating to the API as well. The API token I am using was created on the Credentials page in the console. There are no options to tie it to the Video API, so I figured it should automatically work. The API has been enabled on my account.
export TOKEN="foobar"
curl -XPOST -s -k -H "Content-Type: application/json" "https://videointelligence.googleapis.com/v1beta1/videos:annotate?key=$TOKEN" --data '{"inputUri": "gs://custom-bucket/IMG_3591.mov", "features": ["LABEL_DETECTION"]}'
{
  "error": {
    "code": 403,
    "message": "The caller does not have permission",
    "status": "PERMISSION_DENIED"
  }
}
curl -XPOST -s -k -H "Content-Type: application/json" "https://videointelligence.googleapis.com/v1beta1/videos:annotate?key=$TOKEN" --data '{"inputUri": "gs://cloud-ml-sandbox/video/chicago.mp4", "features": ["LABEL_DETECTION"]}'
{
  "name": "us-east1.18013173402060296928"
}
Update:
I set the file as public and it worked. But I need to keep it private, so I gave the service account access to the file and tried to get the token as suggested.
export TOKEN="$(gcloud auth print-access-token)"
curl -XPOST -s -k -H "Content-Type: application/json" "https://videointelligence.googleapis.com/v1beta1/videos:annotate?key=$TOKEN" --data '{"inputUri": "gs://custom-bucket/IMG_3591.mov", "features": ["LABEL_DETECTION"]}'
{
  "error": {
    "code": 400,
    "message": "API key not valid. Please pass a valid API key.",
    "status": "INVALID_ARGUMENT",
    "details": [
      {
        "@type": "type.googleapis.com/google.rpc.Help",
        "links": [
          {
            "description": "Google developers console",
            "url": "https://console.developers.google.com"
          }
        ]
      }
    ]
  }
}
It seems like the token returned by this print-access-token function does not work. I do have an API key, but it does not have access to the bucket and I don't see a way to give an API key access.
Update 2:
So it looks like we were setting our token wrong. We were following this example, https://cloud.google.com/video-intelligence/docs/analyze-labels#videointelligence-label-file-protocol, which is where we got the key=$TOKEN from. But it turns out we needed to set the Authorization: Bearer header instead. We did try this at first, but we were hitting the original issue of not having access to the bucket. So thank you.
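For reference, the call that eventually worked looked roughly like this (same bucket and file as above; the token comes from the service account that was granted read access to the file):

```shell
export TOKEN="$(gcloud auth print-access-token)"

# Pass the OAuth access token in the Authorization header,
# not as an API key in the query string.
curl -s -XPOST \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  "https://videointelligence.googleapis.com/v1beta1/videos:annotate" \
  --data '{"inputUri": "gs://custom-bucket/IMG_3591.mov", "features": ["LABEL_DETECTION"]}'
```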
TL;DR - Video Intelligence service is unable to access the file on your Cloud storage bucket because of lack of permissions. Since the API uses the permissions of the service account token being passed, you will need to grant your service account permissions to read the file in the GCS bucket or the entire GCS bucket itself.
Long version
The access token you pass should correspond to an IAM service account key. The service account will belong to a project (where you need to enable the Video intelligence API access) and the service account should have permissions to access the GCS bucket you're trying to access.
Each such service account has an associated email ID of the form SERVICE_ACCOUNT_NAME@PROJECT_NAME.iam.gserviceaccount.com.
In the Cloud console, you can go to the Cloud Storage bucket/file and grant Reader permissions to the IAM service account's email address. There is no need to make the bucket public.
If you use gsutil, you can run the following equivalent command:
gsutil acl ch -u SERVICE_ACCOUNT_NAME@PROJECT_NAME.iam.gserviceaccount.com:READ gs://custom-bucket/IMG_3591.mov
I confirmed this myself with an IAM service account that I created in my project and used this to invoke the video intelligence API. The file was not made public, but granted Reader permissions only to the service account.
I used gcloud to activate the service account and fetch the access token, although you can do this manually as well using the google OAuth APIs:
gcloud auth activate-service-account --key-file=SERVICE_ACCOUNT_KEY.json
export TOKEN="$(gcloud auth print-access-token)"
The steps for creating the IAM service account using gcloud are in the same page.
I can repro this issue. I believe the problem is that you don't have the proper permissions set up for your video file in your GCS bucket. To test this hypothesis, try sharing it publicly (the checkbox next to the blob in Google Cloud Storage) and then run the request again.