I am able to make a valid request to the Video Intelligence API with the sample video given in the quickstart (https://cloud.google.com/video-intelligence/docs/getting-started). I have tried many different ways of authenticating to the API as well. The API key I am using was created from the Credentials page in the console. There is no option to tie it to the Video API, so I figured it should work automatically. The API has been enabled on my account.
export TOKEN="foobar"
curl -XPOST -s -k -H"Content-Type: application/json" "https://videointelligence.googleapis.com/v1beta1/videos:annotate?key=$TOKEN" --data '{"inputUri": "gs://custom-bucket/IMG_3591.mov", "features": ["LABEL_DETECTION"]}'
{
  "error": {
    "code": 403,
    "message": "The caller does not have permission",
    "status": "PERMISSION_DENIED"
  }
}
curl -XPOST -s -k -H"Content-Type: application/json" "https://videointelligence.googleapis.com/v1beta1/videos:annotate?key=$TOKEN" --data '{"inputUri": "gs://cloud-ml-sandbox/video/chicago.mp4", "features": ["LABEL_DETECTION"]}'
{
  "name": "us-east1.18013173402060296928"
}
Update:
I set the file as public and it worked. But I need to access this file as private, so I gave the service account access to the file and tried to get an access token as suggested.
export TOKEN="$(gcloud auth print-access-token)"
curl -XPOST -s -k -H"Content-Type: application/json" "https://videointelligence.googleapis.com/v1beta1/videos:annotate?key=$TOKEN" --data '{"inputUri": "gs://custom-bucket/IMG_3591.mov", "features":["LABEL_DETECTION"]}'
{
  "error": {
    "code": 400,
    "message": "API key not valid. Please pass a valid API key.",
    "status": "INVALID_ARGUMENT",
    "details": [
      {
        "@type": "type.googleapis.com/google.rpc.Help",
        "links": [
          {
            "description": "Google developers console",
            "url": "https://console.developers.google.com"
          }
        ]
      }
    ]
  }
}
It seems like the token returned by this print-access-token function does not work. I do have an API key, but it does not have access to the bucket and I don't see a way to give an API key access.
Update 2:
So it looks like we were passing our token the wrong way. We were following this example, https://cloud.google.com/video-intelligence/docs/analyze-labels#videointelligence-label-file-protocol, which is where we got the key=$TOKEN query parameter from. It turns out we needed to set the Authorization: Bearer header instead. We did try this at first, but back then we were hitting the first issue of not having access to the bucket. So thank you.
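In other words, the request that ends up working looks something like this (a sketch with the same placeholder bucket; the Bearer header replaces the key query parameter):
export TOKEN="$(gcloud auth print-access-token)"
curl -XPOST -s -H "Content-Type: application/json" -H "Authorization: Bearer $TOKEN" "https://videointelligence.googleapis.com/v1beta1/videos:annotate" --data '{"inputUri": "gs://custom-bucket/IMG_3591.mov", "features": ["LABEL_DETECTION"]}'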
TL;DR - The Video Intelligence service is unable to access the file in your Cloud Storage bucket because of a lack of permissions. Since the API uses the permissions of the service account whose token is being passed, you will need to grant that service account permission to read the file in the GCS bucket, or the entire GCS bucket itself.
Long version
The access token you pass should correspond to an IAM service account key. The service account belongs to a project (where you need to enable Video Intelligence API access), and the service account should have permission to access the GCS bucket you're trying to read from.
Each such service account has an associated email ID of the form SERVICE_ACCOUNT_NAME@PROJECT_NAME.iam.gserviceaccount.com.
In the Cloud console, you can go to the Cloud Storage bucket/file and grant Reader permissions to that IAM service account email address. There is no need to make the bucket public.
If you use gsutil, you can run the following equivalent command:
gsutil acl ch -u SERVICE_ACCOUNT_NAME@PROJECT_NAME.iam.gserviceaccount.com:READ gs://custom-bucket/IMG_3591.mov
I confirmed this myself with an IAM service account that I created in my project and used to invoke the Video Intelligence API. The file was not made public; Reader permissions were granted only to the service account.
I used gcloud to activate the service account and fetch the access token, although you can also do this manually using the Google OAuth APIs:
gcloud auth activate-service-account --key-file=SERVICE_ACCOUNT_KEY.json
export TOKEN="$(gcloud auth print-access-token)"
The steps for creating the IAM service account using gcloud are on the same page.
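If you haven't created the service account and key yet, a minimal sketch with gcloud looks roughly like this (the account name video-intel-caller and PROJECT_NAME are placeholders):
gcloud iam service-accounts create video-intel-caller --display-name="Video Intelligence caller"
gcloud iam service-accounts keys create SERVICE_ACCOUNT_KEY.json \
  --iam-account=video-intel-caller@PROJECT_NAME.iam.gserviceaccount.com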
I can reproduce this issue. I believe the problem is that you don't have the proper permissions set up for your video file in your GCS bucket. To test this hypothesis, try sharing it publicly (the checkbox next to the object in Cloud Storage) and then run the request again.
Related
For a couple of days I have been experiencing a strange thing: my service account does not have permission to read SKUs from the GCP Cloud Billing API.
This call:
curl -i -H "Authorization: Bearer <service account acccess token>" https://cloudbilling.googleapis.com/v1/services/6F81-5844-456A/skus?fields=skus,nextPageToken
returns
{
  "error": {
    "code": 403,
    "message": "The caller does not have permission",
    "status": "PERMISSION_DENIED"
  }
}
If I change the access token to be a token from a normal (@gmail.com) account, it works, but with a service account I get 403. My service accounts have all the permissions; I even gave the Owner role to one of them as a test, and it didn't work.
Any ideas? Did Google change something regarding access from service accounts to these APIs?
I've created Service Account A and granted roles Service Account Admin and Service Account Key Admin. I did this work in the GCP Console.
Service Account A's function is to create other service accounts programmatically, using the GCP Java SDK. It successfully creates new service accounts, but when it goes to create a key for the newly created service account, I get the following response:
{
  "code": 403,
  "errors": [
    {
      "domain": "global",
      "message": "Permission iam.serviceAccountKeys.create is required to perform this operation on service account projects/-/serviceAccounts/<new_service_account_name>@<project_id>.iam.gserviceaccount.com.",
      "reason": "forbidden"
    }
  ],
  "message": "Permission iam.serviceAccountKeys.create is required to perform this operation on service account projects/-/serviceAccounts/<new_service_account_name>@<project_id>.iam.gserviceaccount.com.",
  "status": "PERMISSION_DENIED"
}
I've tried waiting to see if perhaps I tried to create the key too soon after creating the service account, but waiting hours resulted in no change.
Service Account A can successfully create a key for itself, just not for other service accounts it creates.
How do I resolve?
You have one of three problems:
1) Service Account A actually does not have the IAM role Service Account Key Admin in the project. Use the CLI command gcloud projects get-iam-policy and double-check.
2) Your code is using the wrong identity. You believe that you are using the service account, but instead another identity is being loaded by ADC (Application Default Credentials), or you made a mistake in your code.
3) You assigned the correct role, but on the service account instead of on the project. Use the CLI command gcloud iam service-accounts get-iam-policy. If you find the role listed in the output, you assigned the role in the wrong place. Use the CLI command gcloud projects add-iam-policy-binding instead (see the sketch after the note below).
Note: There is a fourth thing that can prevent you from creating service account keys: organization policy constraints might be enabled. See:
Restricting service account usage
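A rough sketch of the checks for cases 1 and 3, using placeholder project and service account names:
PROJECT_ID="my-project"
SA_EMAIL="service-account-a@${PROJECT_ID}.iam.gserviceaccount.com"
# Case 1: list the roles bound to the service account at the project level.
gcloud projects get-iam-policy "${PROJECT_ID}" \
  --flatten="bindings[].members" \
  --filter="bindings.members:serviceAccount:${SA_EMAIL}" \
  --format="table(bindings.role)"
# Case 3: check whether the role was granted on the service account resource itself...
gcloud iam service-accounts get-iam-policy "${SA_EMAIL}"
# ...and, if so, bind it on the project instead.
gcloud projects add-iam-policy-binding "${PROJECT_ID}" \
  --member="serviceAccount:${SA_EMAIL}" \
  --role="roles/iam.serviceAccountKeyAdmin"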
I'm trying to trigger a Cloud Build; however, when it is invoked via curl with:
curl -X POST -T request.json -H "Authorization: Bearer ${gcloudBearer}" \
https://cloudbuild.googleapis.com/v1/projects/"$PROJECT_ID"/triggers/"$TRIGGER_ID":run
I get a response of:
{
  "error": {
    "code": 403,
    "message": "The caller does not have permission",
    "status": "PERMISSION_DENIED"
  }
}
However, the service account I have authenticated with has the following role granted:
Cloud Build Service Account
Do I need any additional permission / role? If so which role would this be?
I had "quotes" around my project id by mistake (within the request.json file)... FML
I've successfully exported a MySQL database following the MySQL Export Guide.
Now I'm trying to import that MySQL database following the MySQL Import Guide.
I've checked the permissions for the service_account_email I'm using, and I have granted both the SQL Admin and Storage Admin permissions.
I was able to successfully activate my service account using this command locally:
gcloud auth activate-service-account <service_account_email> --key-file=<service_account_json_file>
After I ran the command:
gcloud sql import sql <instance> <gstorage_file> --database=<db_name> --async
I got this information:
{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "required",
        "message": "Login Required",
        "locationType": "header",
        "location": "Authorization"
      }
    ],
    "code": 401,
    "message": "Login Required"
  }
}
Other Things I've Tried
I also tried using the service_account_email of my SQL instance, which came from:
gcloud sql instances describe <instance_name>
But, it seems to have the same error.
Question
Based on the REST API JSON error I'm given, how do I "log in" using the service_account_email so that I don't get the 401 error?
The problem is that the database instance's service account lacks permission on the Cloud Storage bucket. Steps to solve this issue:
1) Go to your Cloud SQL instance and copy the instance's service account (Cloud SQL -> {instance name} -> OVERVIEW -> Service account).
2) After copying the service account, go to the Cloud Storage bucket you want to import from (or dump to) and grant the desired permission to that account (Storage -> {bucket name} -> Permissions -> Add member).
The Cloud SQL instance is running under a Google service account that is not part of your project. You will need to grant this account permissions on the file in Cloud Storage that you want to import. Here is a handy-dandy bash snippet that will do that.
SA_NAME=$(gcloud sql instances describe YOUR_DB_INSTANCE_NAME --project=YOUR_PROJECT_ID --format="value(serviceAccountEmailAddress)")
gsutil acl ch -u ${SA_NAME}:R gs://YOUR_BUCKET_NAME
gsutil acl ch -u ${SA_NAME}:R gs://YOUR_BUCKET_NAME/whateverDirectory/fileToImport.sql
The first line gets the service account email address.
The next line gives this service account read permissions on the bucket.
The last line gives the service account read permissions on the file.
Google also has some of the worst error reporting around. If you get this error message, it might also be that you entered a path incorrectly. In my case it was the path to my bucket directory. Go figure: I don't have permission to access a bucket that doesn't exist. Technically correct, but hardly useful.
After doing some research, and based on the permission error, these are the steps I find most useful for troubleshooting the issue.
To more easily test ACLs and permissions, you can (sketched below):
Create and download a key for the service account in question
Use 'gcloud auth activate-service-account' to obtain credentials for the service account
Use gsutil as usual to see if you can access the object in question
You might need to grant an additional IAM role such as 'roles/storage.admin' to the service account in question; see more information here.
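A minimal sketch of those steps, with placeholder names:
gcloud iam service-accounts keys create key.json \
  --iam-account=SERVICE_ACCOUNT_NAME@PROJECT_ID.iam.gserviceaccount.com
gcloud auth activate-service-account --key-file=key.json
gsutil ls gs://YOUR_BUCKET_NAME                      # can the account list the bucket?
gsutil stat gs://YOUR_BUCKET_NAME/fileToImport.sql   # can it read the object's metadata?
# If access fails, grant a broader role such as roles/storage.admin:
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:SERVICE_ACCOUNT_NAME@PROJECT_ID.iam.gserviceaccount.com" \
  --role="roles/storage.admin"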
According to the Google docs:
Describe the instance you are importing to:
gcloud sql instances describe INSTANCE_NAME
Copy the serviceAccountEmailAddress field.
Use gsutil iam to grant the storage.objectAdmin IAM role to the service account for the bucket.
gsutil iam ch serviceAccount:SERVICE-ACCOUNT:objectAdmin gs://BUCKET-NAME
Then import the database.
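Put together, the sequence looks roughly like this (the instance, bucket, file, and database names are placeholders):
SA_EMAIL=$(gcloud sql instances describe INSTANCE_NAME --format="value(serviceAccountEmailAddress)")
gsutil iam ch "serviceAccount:${SA_EMAIL}:objectAdmin" gs://BUCKET-NAME
gcloud sql import sql INSTANCE_NAME gs://BUCKET-NAME/fileToImport.sql --database=DB_NAME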
I am trying to call the IAM API using curl, specifically the organizations.roles.list method.
https://cloud.google.com/iam/reference/rest/v1/organizations.roles/list
From the docs, my request should be constructed like this:
https://iam.googleapis.com/v1/organizations/<org-id>/roles
However, calling it results in this error:
{ "error": {
"code": 404,
"message": "Method ListRoles not found for service iam.googleapis.com",
"status": "NOT_FOUND" } }
Full request:
curl -H "Authorization: Bearer $(gcloud auth print-access-token)" https://iam.googleapis.com/v1/organizations/<org-id>/roles
What am I doing wrong?
According to the docs, the endpoint https://iam.googleapis.com/v1/organizations/<ORG_ID>/roles is used to list the roles defined at the organization level (i.e. custom roles).
To get the list of default roles (pre-defined roles, curated roles, whatever you want to call them...), you must call the API without specifying any resource:
curl -H"Authorization: Bearer $(gcloud auth print-access-token)" https://iam.googleapis.com/v1/roles
So, to get the complete list of roles available in a resource (be it a project or an organization), you have to take the curated roles and combine them with the custom roles defined at the resource level and with the custom roles defined on the parent resources (so, for a project, that is the curated roles + the project's custom roles + the parent organization's custom roles).
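For example, something along these lines would fetch the different sets (ORG_ID and PROJECT_ID are placeholders):
TOKEN="$(gcloud auth print-access-token)"
# Curated (pre-defined) roles:
curl -s -H "Authorization: Bearer $TOKEN" "https://iam.googleapis.com/v1/roles"
# Custom roles defined at the organization level:
curl -s -H "Authorization: Bearer $TOKEN" "https://iam.googleapis.com/v1/organizations/ORG_ID/roles"
# Custom roles defined at the project level:
curl -s -H "Authorization: Bearer $TOKEN" "https://iam.googleapis.com/v1/projects/PROJECT_ID/roles"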
As for the error you were receiving, I can't reproduce it now when testing. I've run some tests and I'm getting:
403 when I don't have the proper permissions
200 with empty response ({}) when there are no custom roles defined
200 with a list of roles as defined in the docs when there are custom roles defined in the resource
Since the question is from Jul '17, and custom roles started their beta in Sep '17, I'm assuming you were too quick in testing the API, and that's the reason for the 404 you were receiving.