Currently I am trying to create a custom metric to be tracked in the Google Cloud Monitoring dashboard.
Although the admin of the system has given me Monitoring Editor, Monitoring Writer, and Monitoring Admin roles, this error is still occurring.
(Screenshot: permission-based error)
Which permission settings do I need to create a custom / log based metric?
I have tried creating a metric on the Google Cloud Monitoring dashboard with the following roles: Monitoring Editor, Monitoring Writer, and Monitoring Admin. Even with these roles in my IAM permission settings, the red alert error still shows up when creating a metric.
It's confusing.
Although log-based metrics produce Cloud Monitoring metrics, they are a feature of the Cloud Logging API (and so they require Logging-specific roles/permissions).
Here's the list of permissions: https://cloud.google.com/iam/docs/permissions-reference
Search it (!) for logging.logMetrics.create and you'll get:
Owner (roles/owner)
Editor (roles/editor)
App Engine flexible environment Service Agent (roles/appengineflex.serviceAgent)
Cloud Composer API Service Agent (roles/composer.serviceAgent)
Cloud Dataflow Service Agent (roles/dataflow.serviceAgent)
Logging Admin (roles/logging.admin)
Logs Configuration Writer (roles/logging.configWriter)
Serverless VPC Access Service Agent (roles/vpcaccess.serviceAgent)
Get roles/logging.admin or roles/logging.configWriter bound to your identity and try again.
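If it helps, a minimal sketch of granting one of those roles with gcloud (PROJECT_ID and the user email are placeholders for your own values):

gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="user:you@example.com" \
  --role="roles/logging.configWriter"

Once the binding has propagated, retry creating the log-based metric.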
Related
I can see the following line in one of the GCE instance startup scripts:
userdel -r userid
Because of this, the user is not able to SSH in through the browser.
My question: which logs can we use, and how do we find who added this startup script to the VM and when?
Yes, you can check the Activity log for this.
You can use the URL below, appending your project ID:
https://console.cloud.google.com/home/activity?project=
If you want to know who added a startup script to the VM, you can check the Admin Activity audit logs and the System Event audit logs.
Admin Activity audit logs contain log entries for API calls or other actions that modify the configuration or metadata of resources.
System Event audit logs contain log entries for Google Cloud actions that modify the configuration of resources.
Google Cloud services write audit logs that record administrative activities and accesses within your Google Cloud resources. Audit logs help you answer "who did what, where, and when?" within your Google Cloud resources with the same level of transparency as in on-premises environments. Cloud Audit Logs provides the following audit logs for each Cloud project, folder, and organization:
Admin Activity audit logs
Data Access audit logs
System Event audit logs
Policy Denied audit logs
The Data Access audit logs can be very useful too, but they are disabled by default. They contain API calls that read the configuration or metadata of resources, as well as user-driven API calls that create, modify, or read user-provided resource data. If you want to enable them, please follow this link.
To view the audit logs:
In the Cloud console, go to the Logging > Logs Explorer page.
Select an existing Cloud project, folder, or organization.
In the Query builder pane, do the following:
In Resource type, select the Google Cloud resource whose audit logs you want to see.
In Log name, select the audit log type that you want to see:
For Admin Activity audit logs, select activity.
For Data Access audit logs, select data_access.
For System Event audit logs, select system_event.
For Policy Denied audit logs, select policy.
If you don't see these options, then there aren't any audit logs of that type available in the Cloud project, folder, or organization.
If you want to know more about Audit logs in GCP, please follow this link.
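To answer the specific question (who set the startup-script metadata and when), here is a hedged example of reading the Admin Activity audit log from Cloud Shell. PROJECT_ID is a placeholder, and the methodName filter assumes the script was added via instance metadata; a project-wide startup script would show up under setCommonInstanceMetadata instead:

gcloud logging read 'logName="projects/PROJECT_ID/logs/cloudaudit.googleapis.com%2Factivity" AND resource.type="gce_instance" AND protoPayload.methodName:"setMetadata"' \
  --project=PROJECT_ID --limit=20 --format=json

In the returned entries, protoPayload.authenticationInfo.principalEmail shows who made the change and timestamp shows when.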
I'm trying to create a job in Dataflow to export to a text file the data published to a pub/sub topic. When saving the configuration I get a 'Job creation failed' message specifying 'Current user cannot act as service account ...-compute@developer.gserviceaccount.com', as shown in the attached picture.
Following Google's documentation I added the following roles to my user for this project (in addition to the owner role I already have):
Compute Viewer
Dataflow Admin
Dataflow Developer
Storage Object Admin
Service Account User
However, the Controller Service Account mentioned in the message doesn't seem to exist in the list of service accounts for this project (IAM & Admin > Service Accounts). Is there anything I'm missing here?
Other requirements already checked:
I have the Compute Engine API already enabled
As owner I have the iam.serviceAccounts.actAs permission
Your best option is to create a custom service account in IAM and use it to build/run your job. If you're using Cloud Build to deploy and run your template, you'll also need to set your logging location.
More details at the links below:
Using custom service accounts in cloud build
Setting logging location in cloud build YAML
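As a rough sketch of creating and wiring up such a custom service account with gcloud (PROJECT_ID, dataflow-runner, and the user email are placeholder names; roles/dataflow.worker is just the minimum the workers need, and a real pipeline may also need Pub/Sub and Storage roles):

gcloud iam service-accounts create dataflow-runner --project=PROJECT_ID
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:dataflow-runner@PROJECT_ID.iam.gserviceaccount.com" \
  --role="roles/dataflow.worker"
# Let your own user "act as" the new service account when launching jobs
gcloud iam service-accounts add-iam-policy-binding dataflow-runner@PROJECT_ID.iam.gserviceaccount.com \
  --member="user:you@example.com" \
  --role="roles/iam.serviceAccountUser"

Then select that service account when creating the Dataflow job instead of the default Compute Engine one.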
I am trying to use setIamPolicy for the Cloud Build service account @cloudbuild.gserviceaccount.com. I want to grant the App Engine Admin and Cloud Run Admin roles to the Cloud Build service member so that it can do automated releases on App Engine.
Somehow it throws a 404 when I pass the resource of the Cloud Build service account while getting the IAM policy. To confirm, I tried GET https://iam.googleapis.com/v1/{name=projects/*}/serviceAccounts in the API Explorer, and it also does not return the Google-managed service accounts. It seems to only return the service accounts that were created explicitly, not the Google-managed default accounts.
How can I set IAM Policy to grant these permissions to Cloud Build?
The general idea is to enable these permissions for both App Engine and Cloud Run.
Also, a common problem is not knowing that cron permissions are needed for App Engine and Cloud Build. For example, this article mentions "Update cron schedules" as "No" for "App Engine Admin". Whether you need that or not depends on how your builds are done. If you end up needing it too, grant the "Cloud Scheduler Admin" role to your @cloudbuild.gserviceaccount.com service account. You can apply the same logic to other permissions, and that chart might be useful for knowing what is needed depending on your setup.
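One approach that sidesteps the serviceAccounts.list problem entirely is to grant the roles at the project level with gcloud. A hedged sketch, where PROJECT_ID and PROJECT_NUMBER are placeholders for your project:

gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:PROJECT_NUMBER@cloudbuild.gserviceaccount.com" \
  --role="roles/appengine.appAdmin"
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:PROJECT_NUMBER@cloudbuild.gserviceaccount.com" \
  --role="roles/run.admin"
# Only if your builds also need to update cron schedules:
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:PROJECT_NUMBER@cloudbuild.gserviceaccount.com" \
  --role="roles/cloudscheduler.admin"

The key point is that the Cloud Build account is the member being granted roles on the project, not a resource whose own IAM policy needs editing.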
I have a small Python app running in Google Cloud Run with Docker. The application is triggered by HTTP requests, executes a query in BigQuery and returns the result. Unfortunately I get the following permission error:
Reason: 403 POST https://bigquery.googleapis.com/bigquery/v2/projects/XXXX/jobs: Access Denied: Project XXXX: User does not have bigquery.jobs.create permission in project XXXX.\n\n(job ID: XXXX-XX-XX-XX-XXXX)\n\n
I understand I need to give Cloud Run access to BigQuery. How do I do it? To which user? How can I find out?
You need to add BigQuery permissions via IAM roles to the service account assigned to Cloud Run.
To allow Cloud Run to create BigQuery jobs (bigquery.jobs.create) you need one of the following roles:
roles/bigquery.user
roles/bigquery.jobUser
The service account for Cloud Run is displayed in the Google Cloud Console in the Cloud Run section for your service. Most likely this is the Compute Engine default service account.
To add a BigQuery role, you can use the Google Cloud Console: go to IAM, find the service account, and add roles to it.
Documentation:
BigQuery predefined Cloud IAM roles
Service accounts on Cloud Run (fully managed)
Granting roles to service accounts
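If you're not sure which service account your service runs as, one hedged way to check with gcloud (SERVICE_NAME and REGION are placeholders; an empty result means the Compute Engine default service account is used):

gcloud run services describe SERVICE_NAME --region=REGION --platform=managed \
  --format='value(spec.template.spec.serviceAccountName)'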
One of the issues could be that the service account your Cloud Run service is using does not have permissions on BigQuery.
You can update the service account's permissions and add the roles/bigquery.user role so it can create a job.
Also, add other relevant roles based on your application's requirements. You can see details about the different BigQuery roles here.
A good rule is to provide only the required permissions to a service account.
I hope this helps.
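To double-check what the service account already has, one option (a sketch; PROJECT_ID and the service account email are placeholders) is:

gcloud projects get-iam-policy PROJECT_ID \
  --flatten="bindings[].members" \
  --filter="bindings.members:serviceAccount:SA_NAME@PROJECT_ID.iam.gserviceaccount.com" \
  --format="table(bindings.role)"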
The application is triggered by HTTP requests, executes a query in BigQuery and returns the result.
From the security standpoint the permissions required are identical to those used by the custom website from this solution. I'm the author. The website is also triggered by http requests, executes a query in BQ and returns the result. And granting the permission to create jobs (via bigquery.jobUser role) is not enough.
You can grant the required permissions to the service account in different ways (e.g. a more sweeping permission or a more restricted one); the details are at Step 6 here.
Generally speaking, the more restricted and the more granular the permissions are the better for security.
I'm adding extra clarifications and also pasting specific instructions related to Google's tools usage.
To add the permission to create and run jobs (the BQ error message says this permission is lacking) execute the command:
gcloud projects add-iam-policy-binding <project-name> --member=serviceAccount:<sa-name>@<project-name>.iam.gserviceaccount.com --role roles/bigquery.jobUser
The command can be executed in Cloud Shell, open it using the "Activate Cloud Shell" icon in BigQuery Web UI or from other Google Console page. Replace the placeholders:
<sa-name> - replace with service account name used by Cloud Run,
<project-name> - replace with the project name.
The command adds the role bigquery.jobUser to the service account. Do not add other permissions/roles to solve the inability to create/run jobs because excessive permissions are bad for security.
Another permission is required to read BQ data. There are two options to add it:
Grant the bigquery.dataViewer role to the service account:
gcloud projects add-iam-policy-binding <project-name> --member=serviceAccount:<sa-name>@<project-name>.iam.gserviceaccount.com --role roles/bigquery.dataViewer
Then proceed to the next step. This is not recommended unless you are using a throw-away project: the drawback of this approach is that it grants permission to view all datasets in the project.
Take a more granular approach (recommended) by allowing the service account to query one dataset only. This is the approach described below.
Execute the commands replacing <ds-name> with the dataset name (used by your query):
bq show --format=prettyjson <ds-name> >/tmp/mydataset.json
vi /tmp/mydataset.json
Using vi, append the following item to the existing access array and replace the placeholders before saving the file:
,
{
"role": "READER",
"userByEmail": "[<sa-name>#<project-name>.iam.gserviceaccount.com](mailto:<sa-name>#<project-name>.iam.gserviceaccount.com)"
}
Execute the command to effect the changes for the dataset:
bq update --source /tmp/mydataset.json <ds-name>
I am the owner of a project and want to give permissions to another user to view the logs of Google Cloud Build, but I cannot figure out which role/permission this user needs.
Roles I've unsuccessfully tried are:
Cloud Build Editor,
Cloud Build Viewer,
Stackdriver Debugger Agent,
Stackdriver Debugger User,
Cloud Trace Admin,
Logging Admin,
Private Logs Viewer,
Logs Viewer,
Monitoring Admin
Google Cloud enterprise support confirmed to me that the only role that currently grants this is the project-level Viewer permission (or Editor/Owner).
One workaround is to set the --gcs-log-dir flag when submitting a build, so that the logs are written to a bucket you control (rather than the default Google-managed gs://[PROJECT_NUMBER].cloudbuild-logs.googleusercontent.com/ bucket).
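A hedged sketch of that workaround (gs://my-build-logs is a hypothetical bucket you own and can share with the other user):

gcloud builds submit --gcs-log-dir=gs://my-build-logs/logs --tag gcr.io/PROJECT_ID/my-image .

The other user then only needs object read access on that bucket (e.g. roles/storage.objectViewer) instead of project-level Viewer.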
They're tracking making a more granular permission in this feature request: https://issuetracker.google.com/issues/134928412, which you can subscribe to updates on.
Adding the project Viewer role will resolve the issue. However, this looks like a bug (or maybe a feature request: to add the necessary permission to Logs Viewer so that logs outside of the Stackdriver Logging page can be viewed in other UIs as well). The documentation says that "roles/viewer (Project Viewer) gives members the same permissions as roles/logging.viewer at the project level", which means that just assigning Logs Viewer should have given the user the necessary permission to view the logs in the Cloud Build UI.
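For completeness, a sketch of granting project Viewer in the meantime (the project ID and user email are placeholders):

gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="user:teammate@example.com" \
  --role="roles/viewer"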