is it possible to create a GCP IAM user which has access only to a specific Google Monitoring dashboard? I figured it should be possible via conditional role bindings, but monitoring dashboards are not a supported resource.
Are there other ways to achieve this?
As mentioned by @Kamelia Y, at this time such a feature is not yet available, and there is a Feature Request in the public tracker: https://issuetracker.google.com/147039227
There is no available workaround for real-time separation of metrics.
However, a project owner can achieve this separation by exporting their metrics manually and sending the appropriate information to the appropriate user. Here are two examples of exporting Stackdriver Monitoring metrics, one using App Engine/Pub/Sub [1] and one using Colab and the Monitoring API [2]. Both use BigQuery to query the data and Cloud Scheduler to automate the export.
[1] https://cloud.google.com/solutions/stackdriver-monitoring-metric-export
[2] https://medium.com/google-cloud/downsampling-and-exporting-stackdriver-monitoring-data-c3483e88a352
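The downsampling part of the approach in [2] can be sketched in plain Python: bucket timestamped samples into fixed windows and keep one aggregate value per window. This is an illustrative sketch, not code from either article; the function name and window size are assumptions.

```python
from collections import defaultdict

def downsample(samples, window_secs=300, agg=max):
    """Downsample (timestamp, value) pairs into fixed windows.

    samples: iterable of (unix_timestamp, float) pairs
    window_secs: bucket width in seconds (300 = 5-minute windows)
    agg: aggregator applied to each bucket (max, min, sum, ...)
    """
    buckets = defaultdict(list)
    for ts, value in samples:
        # Align each sample to the start of its window.
        buckets[ts - ts % window_secs].append(value)
    return {start: agg(values) for start, values in sorted(buckets.items())}

points = [(0, 1.0), (60, 3.0), (301, 2.0), (302, 5.0)]
print(downsample(points))  # {0: 3.0, 300: 5.0}
```

The aggregator is pluggable so the same helper works for peak (max), total (sum), or whatever reduction the export needs.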
Related
I'm currently working on a project where I want to have access to log data and other things, like graphs from GCP, on a web application.
For security reasons (I guess), it's not possible to embed a GCP address in an <iframe> tag. Do you know any way to make it work, or do you have any idea for something that would make the data accessible?
You might refer to Cloud Logging for logs and Cloud Monitoring for metrics and visualization, both part of the Cloud Operations suite. I believe almost all of the visualizations you see on the Google Cloud Console (web UI) come from Cloud Monitoring. Both have APIs and SDKs that can be accessed from popular languages like Python, Java, or JavaScript. Cloud Monitoring itself supports creating visualizations as widgets and combining them on dashboards.
For the embedding-with-iframe part, I haven't seen a direct way to do that, since viewing Cloud Monitoring dashboards requires authentication. According to this answer, you can export the visualization from Grafana instead, but I haven't tried that approach yet.
About the GCP address in a tag: can you elaborate on that in the comments below? I can point out that all resources in GCP have their own resource name that you can filter on in the Metrics Explorer in Cloud Monitoring. Custom tags are supported; you can use any valid name you like.
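As a small illustration of using the Monitoring API from code: a time-series query is driven by a plain filter string combining the metric type and monitored-resource labels. The helper below just builds that string; the function name is illustrative, and the resulting filter would then be passed to a client such as the `google-cloud-monitoring` library's `list_time_series`.

```python
def build_monitoring_filter(metric_type, resource_labels=None):
    """Build a Cloud Monitoring time-series filter string.

    metric_type: e.g. "compute.googleapis.com/instance/cpu/utilization"
    resource_labels: optional dict of monitored-resource labels to match
    """
    parts = ['metric.type = "{}"'.format(metric_type)]
    for key, value in sorted((resource_labels or {}).items()):
        parts.append('resource.labels.{} = "{}"'.format(key, value))
    return " AND ".join(parts)

f = build_monitoring_filter(
    "compute.googleapis.com/instance/cpu/utilization",
    {"zone": "us-central1-a"},
)
print(f)
# metric.type = "compute.googleapis.com/instance/cpu/utilization" AND resource.labels.zone = "us-central1-a"
```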
I'm building a chat application with a feature that uses the Cloud Translation API. For each client I create a new API key so I can identify each client's usage. The problem is the following:
I want to see the consumption of all API keys inside a project, something like the Operations Logging:
But revealing the timestamp and the API key name used, so I can track each client's usage of the service and determine how much I am going to bill them.
Update
Doing some additional research, I came across this article, which gives a walkthrough for gaining visibility into Service Account Keys (similar to, but not, what I needed). In this guide they create a Log Sink to push logs into BigQuery.
The problem now is that the filter used to extract the logs is the following:
logName:"projects/<PROJECT>/logs/cloudaudit.googleapis.com"
protoPayload.authenticationInfo.serviceAccountKeyName:"*"
The second line extracts logs that belong to a Service Account Key Name. But, as stated at the beginning of the question, I'm looking for the API key logs, not the service account key logs.
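As an illustration of what the sink approach enables once the entries land in BigQuery: aggregating usage per key is a few lines of post-processing. Note the `api_key_name` field below is hypothetical; substitute whatever field your exported entries actually carry for the key identity.

```python
from collections import Counter

def usage_per_key(log_rows):
    """Count log entries per API key name.

    log_rows: iterable of dicts, each carrying a (hypothetical)
    "api_key_name" field alongside its "timestamp".
    """
    return Counter(row["api_key_name"] for row in log_rows)

rows = [
    {"timestamp": "2021-01-01T00:00:00Z", "api_key_name": "client-a"},
    {"timestamp": "2021-01-01T00:05:00Z", "api_key_name": "client-b"},
    {"timestamp": "2021-01-01T00:06:00Z", "api_key_name": "client-a"},
]
print(usage_per_key(rows))  # Counter({'client-a': 2, 'client-b': 1})
```

From such counts, multiplying by a per-call rate would give the per-client bill.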
You can use Cloud Audit Logs [1]. Cloud Audit Logs provides the following audit logs for each Cloud project, folder, and organization:
- Admin Activity audit logs
- Data Access audit logs
- System Event audit logs
- Policy Denied audit logs
Google Cloud services write audit log entries to these logs to help you answer the questions of "who did what, where, and when?" within your Google Cloud resources.
For this scenario, Data Access audit logs [2] could be helpful: they contain API calls that read the configuration or metadata of resources, as well as user-driven API calls that create, modify, or read user-provided resource data. Data Access audit logs do not record data-access operations on resources that are publicly shared (available to allUsers or allAuthenticatedUsers) or that can be accessed without logging into Google Cloud.
As mentioned in the previous comment, these logs are disabled by default because they can be quite large; they must be explicitly enabled to be written.
However, the simplest way to view your API metrics is to use the Google Cloud Console's API Dashboard [3]. You can see an overview of all your API usage, or drill down into your usage of a specific API.
Background
I have a Google Cloud project running my N applications. Each application has an exclusive IAM service account (N service accounts in total) with minimal permissions.
Scenario
Let's imagine that one of the service accounts is leaked. An attacker will try to take advantage of these credentials. Because he doesn't know exactly which permissions this account has, he will try to make calls and see if they work for him.
Question
I want to "listen" to audit logs. Once I see a log of the kind "access denied", I will know that something is wrong with this service account.
Is it possible to write all those access-denied incidents to Google Cloud Stackdriver?
How do you recommend implementing it?
Thank you
Is it possible to write all those access-denied incidents to Google Cloud Stackdriver?
Most, but not all, Google Cloud services support this. However, successful accesses will also be logged.
You will need to enable Data Access Audit Logs.
This could generate a massive amount of logging information.
Access logs at the organization and folder levels are only available via the API, not the console.
Review pricing before enabling data access audit logs.
How do you recommend implementing it?
This question is not suitable for Stack Overflow, as it seeks recommendations and opinions. In general, you will export your logs to Google Cloud Pub/Sub to be processed by a service such as Cloud Functions, Cloud Run, etc. There are also commercial services, such as Datadog, designed for this type of service support. Exporting logs to Google BigQuery is another popular option.
Read this article published on Datadog's website on Data Access audit logging and their services. I am not recommending their service, just providing a link to more information. Their article is very good.
Best practices for monitoring GCP audit logs
To understand the fields that you need to process read this document:
AuthorizationInfo
This link will also help:
Understanding audit logs
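As a sketch of what processing those fields can look like: each audit log entry carries a protoPayload.authorizationInfo list whose items include granted, permission, and resource, so detecting a denial means scanning that list for entries where granted is false. The entry below is a trimmed, illustrative example, not a complete audit log record.

```python
def denied_permissions(entry):
    """Return the permissions that were denied in one audit log entry.

    entry: an audit log entry as a dict (e.g. parsed from JSON).
    """
    auth_info = entry.get("protoPayload", {}).get("authorizationInfo", [])
    return [a["permission"] for a in auth_info if not a.get("granted", False)]

entry = {
    "protoPayload": {
        "authenticationInfo": {"principalEmail": "app@example.iam.gserviceaccount.com"},
        "authorizationInfo": [
            {"permission": "storage.objects.get", "granted": False},
            {"permission": "storage.objects.list", "granted": True},
        ],
    }
}
print(denied_permissions(entry))  # ['storage.objects.get']
```

An empty result means every permission check in the entry was granted, so such entries can be dropped before alerting.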
Here is one way to go about it:
Create a new Cloud Pub/Sub topic.
Create a new log routing sink whose destination is the Pub/Sub topic created in the previous step (set a filter like
protoPayload.authenticationInfo.principalEmail="<service-account-name>@<project-name>.iam.gserviceaccount.com" AND protoPayload.authorizationInfo.granted="false" to only get messages about unsuccessful auth actions for your service account).
Create a Cloud Function that is triggered when a new message is published to the Pub/Sub topic; this function can do whatever you desire, such as send an email, page you, or anything else you can come up with in code.
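The last step might look like the following background-function sketch. A Pub/Sub-triggered Cloud Function receives the log entry base64-encoded in the event's data field; the alert action here is a placeholder print, to be swapped for email, paging, or similar.

```python
import base64
import json

def handle_denied_auth(event, context=None):
    """Pub/Sub-triggered Cloud Function: alert on a denied-auth log entry.

    event: dict with a base64-encoded "data" field holding the log entry.
    context: event metadata supplied by Cloud Functions (unused here).
    """
    entry = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
    principal = (
        entry.get("protoPayload", {})
        .get("authenticationInfo", {})
        .get("principalEmail", "<unknown>")
    )
    # Placeholder alert: swap in email/paging/Slack as needed.
    message = "Denied call by {}".format(principal)
    print(message)
    return message

payload = {"protoPayload": {"authenticationInfo": {"principalEmail": "svc@proj.iam.gserviceaccount.com"}}}
event = {"data": base64.b64encode(json.dumps(payload).encode("utf-8"))}
print(handle_denied_auth(event))  # Denied call by svc@proj.iam.gserviceaccount.com
```

Because the sink filter already restricts entries to granted="false" for the chosen service account, the function can alert unconditionally on every message it receives.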
I want to write a utility which fetches billing information for my project, but I am not able to find any specific GCP API to do so. I tried a couple of APIs, like getBillingInfo, but these just give information about the billing account, not the pricing. There is a mechanism to export billing data to a file, but I want to do it programmatically. Is there any API for this?
One possible way that I am aware of is to export Cloud Billing data to BigQuery. The process (document) for exporting Cloud Billing to BigQuery can be found here: https://cloud.google.com/billing/docs/how-to/export-data-bigquery
Once the export is set up, the billing information, including prices and services, is available in near real time in a BigQuery table. Once it's in BigQuery, there are numerous ways to extract the information. A good solution would be to use Data Studio on Google Cloud to send you a periodic report on your billing information.
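With the export table in place, a programmatic query is just SQL sent through a BigQuery client. The helper below only builds the query string; the table name is a placeholder, and the column names (service.description, cost) assume the standard billing export schema.

```python
def billing_by_service_query(table):
    """Build a BigQuery SQL query summing cost per service.

    table: fully qualified billing export table, e.g.
    "my-project.billing_dataset.gcp_billing_export_v1_XXXXXX" (placeholder).
    """
    return (
        "SELECT service.description AS service, SUM(cost) AS total_cost "
        "FROM `{}` "
        "GROUP BY service "
        "ORDER BY total_cost DESC".format(table)
    )

query = billing_by_service_query("my-project.billing_dataset.gcp_billing_export_v1_XXXXXX")
print(query)
```

The resulting string can be passed to, for example, the `google-cloud-bigquery` client's query method from a scheduled job.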
Is it possible to create StackDriver dashboards programmatically from a declarative template (similar to how Google Cloud Deployment allows you to create and manage cloud resources)?
We use a different GCP project for each of our environments (master, staging, prod) and want to have StackDriver dashboards with the same metrics in each project.
I checked the StackDriver Monitoring API docs as well as the Cloud Deployment supported resources and did not find any evidence of support for this use case.
I'm a Product Manager with Stackdriver. There's not currently a publicly available dashboard API. Until that's available, one solution many of our customers use is to create a canonical dashboard template and then copy it each time a new one is needed, modifying the input data.
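That copy-a-canonical-template approach can be scripted: keep one dashboard definition as JSON with a placeholder for the project ID and substitute it per environment. The placeholder convention and field names below are illustrative, not a documented dashboard schema.

```python
import json

def render_dashboard(template_json, project_id):
    """Fill a canonical dashboard template for one environment.

    template_json: dashboard definition as a JSON string, with the
    placeholder {{PROJECT_ID}} wherever the project ID appears.
    project_id: the target environment's GCP project ID.
    """
    return json.loads(template_json.replace("{{PROJECT_ID}}", project_id))

template = json.dumps({
    "displayName": "Service health ({{PROJECT_ID}})",
    "widgets": [{"filter": 'resource.labels.project_id = "{{PROJECT_ID}}"'}],
})
for env in ("master", "staging", "prod"):
    dash = render_dashboard(template, "myapp-" + env)
    print(dash["displayName"])
```

Substituting on the serialized JSON (rather than walking the object tree) keeps the helper agnostic to where in the definition the project ID appears.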