I'm currently trying to use Dataflow with Pub/Sub but I'm getting this error:
Workflow failed. Causes: (6e74e8516c0638ca): There was a problem refreshing your credentials. Please check:
1. Dataflow API is enabled for your project.
2. There is a robot service account for your project:
service-[project number]@dataflow-service-producer-prod.iam.gserviceaccount.com should have access to your project. If this account does not appear in the permissions tab for your project, contact Dataflow support.
I tried to look in the API manager to enable Dataflow API but I can't find Dataflow at all. I'm also not seeing the robot service account.
You can see whether the API is enabled by searching for dataflow within the API Manager (it should show whether or not it's enabled):
To find the appropriate robot account, search for dataflow-service-producer-prod.iam.gserviceaccount.com within the IAM page:
Finally, the quick start guide may be of use.
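If you prefer the CLI, here is a quick check, a sketch assuming the gcloud SDK is installed and authenticated against the project in question:

# List enabled services and look for the Dataflow API
gcloud services list --enabled | grep dataflow

An empty result means the API still needs to be enabled.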
You can enable it from the console or just use the gcloud command.
Enable Dataflow API: gcloud services enable dataflow.googleapis.com
Disable Dataflow API: gcloud services disable dataflow.googleapis.com
Adding the Dataflow Worker role to the project's default Compute Engine service account solved the problem for me.
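For reference, that grant can also be made from the CLI; a sketch, where PROJECT_ID and PROJECT_NUMBER are placeholders for your own values:

# Bind the Dataflow Worker role to the default Compute Engine service account
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:PROJECT_NUMBER-compute@developer.gserviceaccount.com" \
  --role="roles/dataflow.worker"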
Currently I am trying to create a custom metric to be tracked in the Google Cloud Monitoring dashboard.
Although the admin of the system has given me Monitoring Editor, Monitoring Writer, and Monitoring Admin roles, this error is still occurring.
[screenshot: permission-based error]
Which permission settings do I need to create a custom / log-based metric?
I have tried creating a metric on the Google Cloud Monitoring dashboard with the following roles: Monitoring Editor, Monitoring Writer, and Monitoring Admin. Even with these roles in my IAM permission settings, the red alert error still shows up when creating a metric.
It's confusing.
Although log-based metrics produce Cloud Monitoring metrics, they are a feature of the Cloud Logging API (and so they require Logging-specific roles/permissions).
Here's the list of permissions: https://cloud.google.com/iam/docs/permissions-reference
Search it (!) for logging.logMetrics.create and you'll get:
Owner (roles/owner)
Editor (roles/editor)
App Engine flexible environment Service Agent (roles/appengineflex.serviceAgent)
Cloud Composer API Service Agent (roles/composer.serviceAgent)
Cloud Dataflow Service Agent (roles/dataflow.serviceAgent)
Logging Admin (roles/logging.admin)
Logs Configuration Writer (roles/logging.configWriter)
Serverless VPC Access Service Agent (roles/vpcaccess.serviceAgent)
Get roles/logging.admin or roles/logging.configWriter bound to your identity and try again.
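A minimal sketch of that binding via the CLI, with PROJECT_ID and USER_EMAIL standing in for your own values:

# Grant Logs Configuration Writer, which carries logging.logMetrics.create
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="user:USER_EMAIL" \
  --role="roles/logging.configWriter"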
I'm new to GCP and we use Terraform for our IaC. I've managed to build most of the initial organisation config in Terraform with no problem, with the exception of Cloud Identity and Organisation Policies. I'm using gcloud-provided login credentials. Whenever I try to build for those two services I get this:
Error creating Group: googleapi: Error 403: Your application has authenticated using end user credentials from the Google Cloud SDK or Google Cloud Shell which are not supported by the cloudidentity.googleapis.com. We recommend configuring the billing/quota_project setting in gcloud or using a service account through the auth/impersonate_service_account setting. For more information about service accounts and how to use them in your application, see https://cloud.google.com/docs/authentication/. If you are getting this error with curl or similar tools, you may need to specify 'X-Goog-User-Project' HTTP header for quota and billing purposes. For more information regarding 'X-Goog-User-Project' header, please check https://cloud.google.com/apis/docs/system-parameters.
So in this case I'm using the Google Cloud SDK, so the error makes sense. However, the two options it presents don't work:
Setting a quota project makes no difference
I can't create a service account at the organisational level (and when I create one within a project it can't configure these organisational level constructs)
So how do I go about Terraforming these services?
Thanks.
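For reference, the two settings the error message points at are configured like this (a sketch; PROJECT_ID and SA_EMAIL are placeholders, and the service account must already exist and be impersonable by your user):

# Option 1: set a quota/billing project for client API calls
gcloud config set billing/quota_project PROJECT_ID

# Option 2: route calls through a service account
gcloud config set auth/impersonate_service_account SA_EMAIL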
I'm trying to create a job in Dataflow to export to a text file the data published to a Pub/Sub topic. When saving the configuration I get a 'Job creation failed' message specifying 'Current user cannot act as service account ...-compute@developer.gserviceaccount.com', as shown in the attached picture.
Following Google's documentation I added the following roles to my user for this project (in addition to the owner role I already have):
Compute Viewer
Dataflow Admin
Dataflow Developer
Storage Object Admin
Service Account User
However, the controller service account mentioned in the message doesn't seem to exist in the list of service accounts of this project (IAM & Admin > Service Accounts). Is there anything I'm missing here?
Other requirements already checked:
I have the Compute Engine API already enabled
As owner I have the iam.serviceAccounts.actAs permission
Your best option is to create a custom service account in IAM and use it to build/run your job (a CLI sketch follows the links below). If you're using Cloud Build to deploy and run your template, you'll need to set your logging location.
More details at the below links:
Using custom service accounts in cloud build
Setting logging location in cloud build YAML
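A sketch of that setup with the CLI (the account name and other values are placeholders; the iam.serviceAccountUser grant is what clears the 'cannot act as service account' error):

# Create a dedicated service account for the Dataflow job
gcloud iam service-accounts create dataflow-runner --display-name="Dataflow runner"

# Let it run Dataflow work
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:dataflow-runner@PROJECT_ID.iam.gserviceaccount.com" \
  --role="roles/dataflow.worker"

# Allow your user to act as the new account
gcloud iam service-accounts add-iam-policy-binding \
  dataflow-runner@PROJECT_ID.iam.gserviceaccount.com \
  --member="user:USER_EMAIL" \
  --role="roles/iam.serviceAccountUser"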
I am trying to use setIamPolicy for the Cloud Build service account [project number]@cloudbuild.gserviceaccount.com. I want to give the App Engine Admin and Cloud Run Admin permissions to the Cloud Build service member so that it can do automated releases on App Engine.
Somehow it throws a 404 when I pass the resource of the Cloud Build service account while getting the IAM policy. To confirm, I tried GET https://iam.googleapis.com/v1/{name=projects/*}/serviceAccounts in the API Explorer, and it also does not return the Google-managed service accounts. It seems it only returns the service accounts that were created by users, not the Google-managed default accounts.
How can I set IAM Policy to grant these permissions to Cloud Build?
The general idea is to enable these permissions for both App Engine and Cloud Run.
Also, a common problem is not knowing that cron permissions are needed for App Engine and Cloud Build. For example, this article mentions "Update cron schedules" as "No" for "App Engine Admin". Whether you need that or not depends on how your builds are done. If you end up needing it too, grant the "Cloud Scheduler Admin" role to [project number]@cloudbuild.gserviceaccount.com. You can apply the same logic to other permissions, and that chart might be useful for knowing what is needed depending on your setup.
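As a sketch of those grants: rather than calling setIamPolicy on the service-account resource itself, put the bindings on the project with the Cloud Build account as the member (PROJECT_ID and PROJECT_NUMBER are placeholders):

# Grant App Engine Admin and Cloud Run Admin to the Cloud Build service account
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:PROJECT_NUMBER@cloudbuild.gserviceaccount.com" \
  --role="roles/appengine.appAdmin"

gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:PROJECT_NUMBER@cloudbuild.gserviceaccount.com" \
  --role="roles/run.admin"

# If your builds also update cron schedules (see above)
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:PROJECT_NUMBER@cloudbuild.gserviceaccount.com" \
  --role="roles/cloudscheduler.admin"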
I need to execute a Dataflow job using a service account. I'm following the very simple and basic WordCount example offered within the platform itself.
What is weird is the error I'm getting:
According to this, GCP requires the service account to have Dataflow Worker permissions in order to execute my job. The weird part is that the error kept showing up even though I had already set the required permissions:
Can someone explain this strange behavior? Thanks so much.
To run a Dataflow job, a project must enable billing and the following Google Cloud Platform APIs:
Google Cloud Dataflow API
Compute Engine API (Google Compute Engine)
Google Cloud Logging API
Google Cloud Storage
Google Cloud Storage JSON API
BigQuery API
Google Cloud Pub/Sub
Google Cloud Datastore API
You should also have enough quota in the project for each of the APIs you are using in the Dataflow job.
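All of the APIs listed above can be enabled in one pass from the CLI; a sketch, assuming the current service identifiers and with PROJECT_ID as a placeholder:

# Enable the APIs a Dataflow job depends on
gcloud services enable \
  dataflow.googleapis.com \
  compute.googleapis.com \
  logging.googleapis.com \
  storage-component.googleapis.com \
  storage-api.googleapis.com \
  bigquery.googleapis.com \
  pubsub.googleapis.com \
  datastore.googleapis.com \
  --project=PROJECT_ID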
I would suggest you create a fresh service account whose name has not been used before, and then grant roles/dataflow.worker to this new account. Remember that Cloud IAM changes typically propagate in under 60 seconds but can take up to 7 minutes, so please leave a couple of minutes between an IAM change and the Dataflow job creation.
Another possible workaround is to delete the Dataflow Worker role binding and add it again, as sketched below. If a service account is deleted and recreated under the same name, an existing binding keeps pointing at the old account's ID, and that ID is not refreshed until the role is explicitly removed and re-granted.
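A sketch of that remove-and-re-add cycle (PROJECT_ID and SA_EMAIL are placeholders; the same add command is what you would run against a fresh account too):

# Drop the stale binding, then grant it again so it points at the current account ID
gcloud projects remove-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:SA_EMAIL" \
  --role="roles/dataflow.worker"

gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:SA_EMAIL" \
  --role="roles/dataflow.worker"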
I encourage you to visit the Dataflow IAM roles documentation, which has the role descriptions and permissions.