I'm trying to trigger a Cloud Build; however, when it is invoked via curl with:
curl -X POST -T request.json -H "Authorization: Bearer ${gcloudBearer}" \
https://cloudbuild.googleapis.com/v1/projects/"$PROJECT_ID"/triggers/"$TRIGGER_ID":run
I get a response of:
{
  "error": {
    "code": 403,
    "message": "The caller does not have permission",
    "status": "PERMISSION_DENIED"
  }
}
However, the service account I have authenticated with has the following roles enabled:
Cloud Build Service Account
Do I need any additional permission / role? If so which role would this be?
I had "quotes" around my project id by mistake (within the request.json file)... FML
For a couple of days I have been experiencing a strange thing: my service account does not have permission to read SKUs from the GCP API.
This call:
curl -i -H "Authorization: Bearer <service account access token>" https://cloudbilling.googleapis.com/v1/services/6F81-5844-456A/skus?fields=skus,nextPageToken
returns
{
  "error": {
    "code": 403,
    "message": "The caller does not have permission",
    "status": "PERMISSION_DENIED"
  }
}
If I change the access token to a token from a normal (@gmail.com) account, it works, but with a service account I get 403. My service accounts have all the permissions; I even gave the Owner role to one of them as a test, and it didn't work.
Any ideas? Did Google change something regarding access from service accounts to these APIs?
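For reference, the service account access token used in the call above can be minted with gcloud, assuming you have the account's key file:
gcloud auth activate-service-account --key-file=/path/to/key.json
curl -i -H "Authorization: Bearer $(gcloud auth print-access-token)" "https://cloudbilling.googleapis.com/v1/services/6F81-5844-456A/skus?fields=skus,nextPageToken"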
I'm trying to export project assets with Google Cloud Asset Inventory and the gcloud command (version 314.0.0), authenticated with a service account:
# 1. authenticate with service account my-service-account@$PROJECT_ID.iam.gserviceaccount.com
gcloud auth activate-service-account --key-file=/path/to/my/key.json
# 2. export assets to BQ
gcloud asset export \
--project=$PROJECT_ID \
--bigquery-table=projects/$PROJECT_ID/datasets/$DATASET_ID/tables/$TABLE_ID \
--output-bigquery-force \
--content-type=resource
And got the following error:
ERROR: (gcloud.asset.export) User [my-service-account@$PROJECT_ID.iam.gserviceaccount.com] does not have permission to access project [$PROJECT_ID:exportAssets] (or it may not exist): The caller does not have permission
My service account has the following roles on $PROJECT_ID:
roles/cloudasset.viewer
roles/bigquery.jobUser
roles/bigquery.dataEditor
Note that gcloud asset export works when I'm logged in with my own personal account, which has the same roles as my service account.
Adding the --verbosity=debug flag to gcloud does not add additional info:
apitools.base.py.exceptions.HttpForbiddenError: HttpError accessing https://cloudasset.googleapis.com/v1/projects/$PROJECT_ID:exportAssets?alt=json
with the following content:
{
  "error": {
    "code": 403,
    "message": "The caller does not have permission",
    "status": "PERMISSION_DENIED"
  }
}
I don't understand the difference between being logged in with the service account (gcloud auth activate-service-account) and with my own personal account (gcloud auth login); both should work since I have exactly the same permissions.
Any idea would be appreciated.
There is an open investigation into this issue:
permission denied error when exporting asset to GCS or BigQuery
It seems that you have to impersonate the built-in service account service-xxxxxxxx@gcp-sa-cloudasset.iam.gserviceaccount.com and add the Storage Admin role to it.
You will also have to add the roles roles/bigquery.jobUser and roles/bigquery.dataEditor to the service account service-xxxxxxxx@gcp-sa-cloudasset.iam.gserviceaccount.com, where xxxxxxxx is the project number.
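A sketch of those grants with gcloud, assuming $PROJECT_ID is your project and the xxxxxxxx placeholder is filled in:
gcloud projects add-iam-policy-binding $PROJECT_ID \
  --member="serviceAccount:service-xxxxxxxx@gcp-sa-cloudasset.iam.gserviceaccount.com" \
  --role="roles/bigquery.jobUser"
gcloud projects add-iam-policy-binding $PROJECT_ID \
  --member="serviceAccount:service-xxxxxxxx@gcp-sa-cloudasset.iam.gserviceaccount.com" \
  --role="roles/bigquery.dataEditor"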
I am following the link below for "Use Amazon S3 to Store a Single Amazon Elasticsearch Service Index":
https://aws.amazon.com/blogs/database/use-amazon-s3-to-store-a-single-amazon-elasticsearch-service-index/
When I try
curl -XPUT 'http://localhost:9200/_snapshot/snapshot-repository' -d '{
  "type": "s3",
  "settings": {
    "bucket": "es-s3-repository",
    "region": "us-west-2",
    "role_arn": "arn:aws:iam::123456789012:role/es-s3-repository"
  }
}'
with updated bucket, region and role_arn values, but I am getting the error below:
{"Message":"User: anonymous is not authorized to perform: iam:PassRole on resource: arn:aws:iam...}
To resolve this issue, I also followed https://aws.amazon.com/premiumsupport/knowledge-center/anonymous-not-authorized-elasticsearch/, but it is still not working.
You need to sign your requests to AWS Elasticsearch. The blog post that you linked describes using a proxy server to create the signature; did you do that?
As an alternative to using such a proxy server with curl, you can make the requests from a program. The AWS Elasticsearch docs give an example in Python, with a link to a Java client.
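If you would rather stay on the command line, one option is a curl-like tool that does the SigV4 signing for you, e.g. awscurl (pip-installable; the flags below are from its README, so double-check them against your version, and the endpoint is a placeholder for your domain):
pip install awscurl
awscurl --service es --region us-west-2 -X PUT 'https://your-es-domain-endpoint/_snapshot/snapshot-repository' -d '{
  "type": "s3",
  "settings": {
    "bucket": "es-s3-repository",
    "region": "us-west-2",
    "role_arn": "arn:aws:iam::123456789012:role/es-s3-repository"
  }
}'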
I am trying to call the IAM api using curl, specifically the organizations.roles.list method.
https://cloud.google.com/iam/reference/rest/v1/organizations.roles/list
From the docs, my request should be constructed like this:
https://iam.googleapis.com/v1/organizations/<org-id>/roles
However, calling it results in this error:
{ "error": {
"code": 404,
"message": "Method ListRoles not found for service iam.googleapis.com",
"status": "NOT_FOUND" } }
Full request: curl -H "Authorization: Bearer $(gcloud auth print-access-token)" https://iam.googleapis.com/v1/organizations/<org-id>/roles
What am I doing wrong?
According to the docs, the endpoint https://iam.googleapis.com/v1/organizations/<ORG_ID>/roles is used to list the roles defined at the organization level (i.e. custom roles).
To get the list of default roles (pre-defined roles, curated roles, whatever you want to call them...), you must call the API without specifying any resource:
curl -H"Authorization: Bearer $(gcloud auth print-access-token)" https://iam.googleapis.com/v1/roles
So, to get the complete list of roles in a resource (be it a project or an organization), you have to take the curated roles and aggregate them with the custom roles defined at the resource level and the custom roles defined on the parent resources (so, to get the roles in a project, you combine the curated ones + the project's custom roles + the parent org's custom roles).
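In curl terms that aggregation looks roughly like this (ORG_ID and PROJECT_ID are placeholders; note that these endpoints paginate, so follow nextPageToken for the full list):
TOKEN="$(gcloud auth print-access-token)"
# curated (predefined) roles
curl -H "Authorization: Bearer $TOKEN" "https://iam.googleapis.com/v1/roles"
# custom roles defined on the organization
curl -H "Authorization: Bearer $TOKEN" "https://iam.googleapis.com/v1/organizations/ORG_ID/roles"
# custom roles defined on the project
curl -H "Authorization: Bearer $TOKEN" "https://iam.googleapis.com/v1/projects/PROJECT_ID/roles"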
As to the error you were receiving, I'm not receiving it now when testing it. I've run some tests and I'm receiving:
403 when I don't have the proper permissions
200 with empty response ({}) when there are no custom roles defined
200 with a list of roles as defined in the docs when there are custom roles defined in the resource
Since the question is from Jul '17, and the custom roles started their beta on Sep '17, I'm assuming you were too quick to test the API, and that's the reason for the 404 you were receiving.
So I am able to make a valid request to the Video Intelligence API with the sample video given in the quickstart: https://cloud.google.com/video-intelligence/docs/getting-started. I have tried many different ways of authenticating to the API as well. The API token I am using was created from the Credentials page in the console. There are no options to tie it to the Video API, so I figured it should automatically work. The API has been enabled on my account.
export TOKEN="foobar"
curl -XPOST -s -k -H"Content-Type: application/json" "https://videointelligence.googleapis.com/v1beta1/videos:annotate?key=$TOKEN" --data '{"inputUri": "gs://custom-bucket/IMG_3591.mov", "features": ["LABEL_DETECTION"]}'
{
  "error": {
    "code": 403,
    "message": "The caller does not have permission",
    "status": "PERMISSION_DENIED"
  }
}
curl -XPOST -s -k -H"Content-Type: application/json" "https://videointelligence.googleapis.com/v1beta1/videos:annotate?key=$TOKEN" --data '{"inputUri": "gs://cloud-ml-sandbox/video/chicago.mp4", "features": ["LABEL_DETECTION"]}'
{
  "name": "us-east1.18013173402060296928"
}
Update:
I set the file as public and it worked. But I need to access this file as private, so I gave the service account access to the file and tried to get the access token as suggested.
export TOKEN="$(gcloud auth print-access-token)"
curl -XPOST -s -k -H"Content-Type: application/json" "https://videointelligence.googleapis.com/v1beta1/videos:annotate?key=$TOKEN" --data '{"inputUri": "gs://custom-bucket/IMG_3591.mov", "features":["LABEL_DETECTION"]}'
{
  "error": {
    "code": 400,
    "message": "API key not valid. Please pass a valid API key.",
    "status": "INVALID_ARGUMENT",
    "details": [
      {
        "@type": "type.googleapis.com/google.rpc.Help",
        "links": [
          {
            "description": "Google developers console",
            "url": "https://console.developers.google.com"
          }
        ]
      }
    ]
  }
}
It seems like the token returned by this print-access-token function does not work. I do have an API key, but it does not have access to the bucket and I don't see a way to give an API key access.
Update 2:
So it looks like we were setting our token wrong. We were following this example, https://cloud.google.com/video-intelligence/docs/analyze-labels#videointelligence-label-file-protocol, which is where we got the key=$TOKEN from. But it looks like we needed to set the Bearer header. We did try this at first, but we were hitting the first issue of not having access to the bucket. So thank you.
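For completeness, the request shape that worked for us in the end (same body as above, with the token moved from the key query parameter into an Authorization header):
export TOKEN="$(gcloud auth print-access-token)"
curl -XPOST -s -H "Content-Type: application/json" -H "Authorization: Bearer $TOKEN" "https://videointelligence.googleapis.com/v1beta1/videos:annotate" --data '{"inputUri": "gs://custom-bucket/IMG_3591.mov", "features": ["LABEL_DETECTION"]}'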
TL;DR - The Video Intelligence service is unable to access the file in your Cloud Storage bucket because of a lack of permissions. Since the API uses the permissions of the service account token being passed, you will need to grant your service account permission to read the file in the GCS bucket, or read access to the entire GCS bucket.
Long version
The access token you pass should correspond to an IAM service account key. The service account will belong to a project (where you need to enable the Video intelligence API access) and the service account should have permissions to access the GCS bucket you're trying to access.
Each such service account has an associated email id in the form SERVICE_ACCOUNT_NAME@PROJECT_NAME.iam.gserviceaccount.com.
In the Cloud console, you can go to the Cloud Storage bucket/file and grant Reader permissions to the IAM service account's email address. There is no need to make this bucket public.
If you use gsutil, you can run the following equivalent command:
gsutil acl ch -u SERVICE_ACCOUNT_NAME@PROJECT_NAME.iam.gserviceaccount.com:READ gs://custom-bucket/IMG_3591.mov
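If you prefer a bucket-level grant over a per-object ACL, something like this should also work (it grants read access to everything in the bucket):
gsutil iam ch serviceAccount:SERVICE_ACCOUNT_NAME@PROJECT_NAME.iam.gserviceaccount.com:roles/storage.objectViewer gs://custom-bucket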
I confirmed this myself with an IAM service account that I created in my project and used to invoke the Video Intelligence API. The file was not made public, but had Reader permissions granted only to the service account.
I used gcloud to activate the service account and fetch the access token, although you can do this manually as well using the Google OAuth APIs:
gcloud auth activate-service-account --key-file=SERVICE_ACCOUNT_KEY.json
export TOKEN="$(gcloud auth print-access-token)"
The steps for creating the IAM service account using gcloud are on the same page.
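If it helps, those steps boil down to something like this (my-video-sa and the display name are made-up placeholders):
gcloud iam service-accounts create my-video-sa --display-name="video-intelligence-caller"
gcloud iam service-accounts keys create SERVICE_ACCOUNT_KEY.json --iam-account=my-video-sa@PROJECT_NAME.iam.gserviceaccount.com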
I can repro this issue. I believe the problem is that you don't have the proper permissions set up for your video file in your GCS bucket. To test this hypothesis, try sharing it publicly (the checkbox next to the blob in Google Cloud Storage) and then run the request again.
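The command-line equivalent of that checkbox, if you want a quick test (this makes the object world-readable, so undo it afterwards):
gsutil acl ch -u AllUsers:R gs://custom-bucket/IMG_3591.mov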