My GCP Dataflow job immediately fails with this error message:
Workflow failed. Causes:
Subscription 'projects/project/subscriptions/subscription' not found.
Please supply an existing subscription.
The subscription does exist; I was able to click on it in the Dataflow UI. I've been able to run this job previously and it worked. It's only when I try redeploying it that it gives the error.
EDIT:
This is how I am reading from the Pub/Sub subscription:
pipeline
    .apply(PubsubIO.readProtos(...)
        .fromSubscription(options.getSubscription()))
Workflow failed. Causes: Subscription ‘<subscription_name>’ not found. Please supply an existing subscription.
From the error message, it seemed like the Pub/Sub subscription was not visible to the Dataflow pipeline. Examining the permissions on the service account showed that it already had the "pubsub.subscriptions.consume" permission. Nothing looked out of order. So we reported this issue to the Google Cloud team, and it turns out other developers are also facing a similar issue; Google Cloud has created a bug for it.
Meanwhile, those who are facing this issue in their Dataflow deployment can fix it by adding the "pubsub.subscriptions.get" permission to the Dataflow service account.
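As a minimal sketch (assuming the Dataflow workers run as the Compute Engine default service account; the subscription name, project ID, and project number are placeholders), one way to pick up pubsub.subscriptions.get is to grant a role that contains it, such as roles/pubsub.viewer, on the subscription:

# Grants a role containing pubsub.subscriptions.get on the subscription itself;
# a project-level binding or a custom role would also work.
gcloud pubsub subscriptions add-iam-policy-binding SUBSCRIPTION_NAME \
    --member="serviceAccount:PROJECT_NUMBER-compute@developer.gserviceaccount.com" \
    --role="roles/pubsub.viewer"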
I have successfully deployed a 2nd generation Cloud Function with a storage trigger per the Google tutorial.
The Cloud Function works when I run a test command in the shell, but if I try for real by uploading a file to my bucket, the Cloud Function is not invoked.
I can see that the event triggers the pubsub topic:
And in Eventarc I can see signs of the problem:
So my layman's analysis of why the Cloud Function invocation fails is that I lack some permission for Eventarc to receive the message from Pub/Sub (?). I have read the Eventarc troubleshooting and Eventarc access control docs and tried to add the Eventarc Admin role to the Eventarc service account (as seen in the image below), but to no result. (I've also added it to every other service account I can find, made the compute service account a project owner, etc., but no luck.) What am I missing?
(Note: I had an earlier question about this with a broader scope, but I opted for a new, more specific question.)
You used the Compute Engine default Service Account.
You need to give the needed permissions to this Service Account.
According to the documentation:
Make sure the runtime service account key you are using for your Application Default Credentials has either the cloudfunctions.serviceAgent role or the storage.buckets.{get, update} and the resourcemanager.projects.get permissions. For more information on setting these permissions, see Granting, changing, and revoking access to resources.
Please check on the IAM page that the default Service Account has the following permissions:
cloudfunctions.serviceAgent
storage.buckets.{get, update}
resourcemanager.projects.get
Also, don't hesitate to check Cloud Logging to see the exact error and the missing permissions.
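As a quick check, here is a hedged sketch of listing the roles currently bound to the Compute Engine default service account (PROJECT_ID and PROJECT_NUMBER are placeholders):

# Lists every role granted to the Compute Engine default service account at the
# project level; compare the result against the permissions listed above.
gcloud projects get-iam-policy PROJECT_ID \
    --flatten="bindings[].members" \
    --filter="bindings.members:PROJECT_NUMBER-compute@developer.gserviceaccount.com" \
    --format="table(bindings.role)"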
I get an error (in red, in the picture below) while creating a Pub/Sub BigQuery subscription. The error shows up in the subscription creation view:
First question: why does this error appear in the first place?
One of my attempts to solve this was to first create a standard subscription, then add the missing permissions by assigning a role with a command like this:
gcloud pubsub subscriptions add-iam-policy-binding EventIngestSubscription-4475d78 --member=serviceAccount:service-388032002134@gcp-sa-pubsub.iam.gserviceaccount.com --role="roles/bigquery.dataEditor", but this produces ERROR: (gcloud.pubsub.subscriptions.add-iam-policy-binding) INVALID_ARGUMENT: Role roles/bigquery.dataEditor is not supported for this resource.
When I tried to set the pubsub.subscriber role instead of bigquery.admin, it worked.
Thanks a lot for any insights and suggestions on how to create a BigQuery subscription. I'm really stuck on this one...
The permission that needs to be set is not on the subscription, it is on the BigQuery table itself. Therefore, you are not going to be able to set the BigQuery permissions on the subscription. Instead, you need to ensure that the service account has roles/bigquery.dataEditor on the table you are using with the subscription. You can do this with the bq command-line tool:
bq add-iam-policy-binding --member="serviceAccount:service-<project number>@gcp-sa-pubsub.iam.gserviceaccount.com" --role=roles/bigquery.dataEditor -t "<dataset>.<table>"
This permission is needed so that Pub/Sub can write to BigQuery on your behalf.
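To confirm the binding took effect, a hedged check (the dataset and table names are placeholders) could look like:

# Prints the IAM policy on the table; the Pub/Sub service agent should now
# appear with roles/bigquery.dataEditor.
bq get-iam-policy -t "<dataset>.<table>"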
Hopefully this is a simple one for someone with a little deeper knowledge than me...
I have a Cloud Function that responds to webhook calls and submits jobs to Cloud Build using the API. This works fine, except that now we have some jobs that need to use KMS keys from a different project.
secrets:
- kmsKeyName: projects/xxx/locations/global/keyRings/xxx/cryptoKeys/xxx
With this included in cloudbuild.yaml, the API call to submit the Cloud Build job returns:
400 invalid build: failed to check access to "projects/xxx/locations/global/keyRings/xxx/cryptoKeys/xxx"
I've tried adding both the Cloud Functions and Cloud Build service accounts from the calling project to everything I could think of in the project that hosts the KMS keys, including Owner.
This article has simple and clear instructions for accessing Container Registry and other services in another project, but nothing about KMS. This error doesn't seem to turn up any meaningful search results, and it doesn't look familiar to me at all.
Thanks for any help.
The Cloud KMS API was not enabled on the project running Cloud Build. It's unfortunate that the error message was so vague. In fact, I diagnosed the issue by running gcloud kms decrypt ... in a Cloud Build job, which helpfully told me that the API needed to be enabled.
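For reference, a minimal sketch of enabling that API on the project that runs the builds (PROJECT_ID is a placeholder):

# Enables the Cloud KMS API for the project that submits and runs the Cloud Build jobs.
gcloud services enable cloudkms.googleapis.com --project=PROJECT_ID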
I was trying to understand the example given in the Google Cloud samples at this link:
IAM Example
This example creates a service account, a VM, and a Pub/Sub topic. The VM runs as the service account, and the service account has subscriber access to the Pub/Sub topic, thereby giving services and applications running on the VM access to the Pub/Sub topic.
However, when I try to deploy this example, I get the error below:
The fingerprint of the deployment is a-v3HjAHciZeSLuE-vSeZw==
Waiting for create [operation-1525502430976-56b6fb6809800-dbd09909-c5d681b2]...failed.
ERROR: (gcloud.deployment-manager.deployments.create) Error in Operation [operation-1525502430976-56b6fb6809800-dbd09909-c5d681b2]: errors:
- code: RESOURCE_ERROR
location: /deployments/test-dp/resources/my-pubsub-topic
message: '{"ResourceType":"pubsub.v1.topic","ResourceErrorCode":"403","ResourceErrorMessage":{"code":403,"message":"User
not authorized to perform this action.","status":"PERMISSION_DENIED","details":[],"statusMessage":"Forbidden","requestPath":"https://pubsub.googleapis.com/v1/projects/fresh-deck-194307/topics/my-pubsub-topic:setIamPolicy","httpMethod":"POST"}}'
It mentions that the user doesn't have permission to perform this action.
I am unable to understand which user it is referring to.
Since my account is the owner of the project, I should be able to deploy a script that sets an IAM policy for subscribing to a Pub/Sub topic.
My understanding above might be wrong. Could somebody help me understand why this example is failing?
Also, if any additional configuration is needed for this example to run, I would expect it to be mentioned in the README file, but there are no instructions.
Make sure that APIs for all resources that you're trying to deploy are enabled.
Use the gcloud auth list command to make sure that the account with enough permissions is the active one.
Use the gcloud config list command to make sure that the default project and other settings are correct.
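Put together, a hedged sketch of those checks for this sample (the exact set of APIs to enable is an assumption based on the resources the example creates):

# Confirm the active account and the default project.
gcloud auth list
gcloud config list

# Enable the APIs the example's resources rely on (Deployment Manager, Compute
# Engine, Pub/Sub, and IAM); adjust the list to what your deployment actually uses.
gcloud services enable deploymentmanager.googleapis.com \
    compute.googleapis.com pubsub.googleapis.com iam.googleapis.com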
I'm currently trying to use Dataflow with Pub/Sub but I'm getting this error:
Workflow failed. Causes: (6e74e8516c0638ca): There was a problem refreshing your credentials. Please check:
1. Dataflow API is enabled for your project.
2. There is a robot service account for your project:
service-[project number]@dataflow-service-producer-prod.iam.gserviceaccount.com should have access to your project. If this account does not appear in the permissions tab for your project, contact Dataflow support.
I tried to look in the API manager to enable Dataflow API but I can't find Dataflow at all. I'm also not seeing the robot service account.
You can see whether the API is enabled by searching for dataflow within the API Manager (it should show whether it's enabled or not):
To find the appropriate robot account, search for dataflow-service-producer-prod.iam.gserviceaccount.com within the IAM page:
Finally, the quick start guide may be of use.
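If you prefer the command line, a hedged equivalent of those two checks (PROJECT_ID is a placeholder):

# Check whether the Dataflow API is already enabled on the project.
gcloud services list --enabled | grep dataflow

# Look for the Dataflow service (robot) account among the project's IAM bindings.
gcloud projects get-iam-policy PROJECT_ID --format=json | grep dataflow-service-producer-prod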
You can enable it from the console or just use the gcloud command.
Enable Dataflow API: gcloud services enable dataflow.googleapis.com
Disable Dataflow API: gcloud services disable dataflow.googleapis.com
Adding the Dataflow Worker role (roles/dataflow.worker) to the project's default Compute Engine service account solved the problem for me.
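For example, a hedged sketch of that grant (PROJECT_ID and PROJECT_NUMBER are placeholders):

# Grants the Dataflow Worker role to the Compute Engine default service account.
gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="serviceAccount:PROJECT_NUMBER-compute@developer.gserviceaccount.com" \
    --role="roles/dataflow.worker"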