Difference between logs collected using Google Workspace Reports API, admin log type "gcp" vs using Cloud Logging API directly? - google-admin-sdk

As the title says, there seem to be two ways to collect certain GCP audit logs:
Using the Google Workspace Admin SDK Reports API, specifying the "gcp" application name in the call.
Using the Google Cloud Logging API.
What is the difference in the logs collected? I'm still researching, but hopefully someone on the Google team sees this tag and knows exactly.
Do all of these logs get collected via the Workspace Admin SDK?
Admin Activity audit logs
Data Access audit logs
System Event audit logs
Policy Denied audit logs

The Admin SDK Reports API only reports "OS Login" events for the gcp application name:
https://developers.google.com/admin-sdk/reports/v1/appendix/activity/gcp
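To make the difference concrete, here is a rough sketch comparing the two calls; the project ID and limit are placeholders, and the assumption is that the Reports API endpoint returns only the gcp application's OS Login activity events, while the Cloud Logging read returns the full set of Cloud Audit Logs:
# Workspace Admin SDK Reports API, "gcp" application name (OS Login events only):
GET https://admin.googleapis.com/admin/reports/v1/activity/users/all/applications/gcp
# Cloud Logging API via gcloud (Admin Activity, Data Access, System Event, Policy Denied):
gcloud logging read 'logName:"cloudaudit.googleapis.com"' --project=my-project --limit=10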

Related

IAM Permission Denied - Google Cloud Create metric

Currently I am trying to create a custom metric to be tracked in the Google Cloud Monitoring dashboard.
Although the admin of the system has given me the Monitoring Editor, Monitoring Writer, and Monitoring Admin roles, this error is still occurring:
[screenshot of the permission-based error]
Which permissions do I need to create a custom / log-based metric?
I have tried creating a metric in the Google Cloud Monitoring dashboard with the following roles: Monitoring Editor, Monitoring Writer, and Monitoring Admin. Even with these roles in my IAM permission settings, the red alert error still shows up when creating a metric.
It's confusing.
Although log-based metrics produce Cloud Monitoring metrics, they are a feature of the Cloud Logging API (and so they require Logging-specific roles/permissions).
Here's the list of permissions: https://cloud.google.com/iam/docs/permissions-reference
Search it (!) for logging.logMetrics.create and you'll get:
Owner (roles/owner)
Editor (roles/editor)
App Engine flexible environment Service Agent (roles/appengineflex.serviceAgent)
Cloud Composer API Service Agent (roles/composer.serviceAgent)
Cloud Dataflow Service Agent (roles/dataflow.serviceAgent)
Logging Admin (roles/logging.admin)
Logs Configuration Writer (roles/logging.configWriter)
Serverless VPC Access Service Agent (roles/vpcaccess.serviceAgent)
Get roles/logging.admin or roles/logging.configWriter bound to your identity and try again.
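As a sketch (the user email and project ID are placeholders), an administrator could bind one of those roles with gcloud and then retry creating the metric:
# Grant the Logs Configuration Writer role, which includes logging.logMetrics.create.
gcloud projects add-iam-policy-binding my-project \
    --member="user:you@example.com" \
    --role="roles/logging.configWriter"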

start up script virtual machine in google cloud

I can see this in one of the GCE instance startup scripts:
userdel -r userid
Because of this, the user is not able to SSH through the browser.
My question is: how do we find out who added this startup script to the VM, and when?
Can we use some logs for this?
Yes, you can check the Activity log for this.
You can use the URL below, replacing it with your project ID:
https://console.cloud.google.com/home/activity?project=
If you want to know who added a startup script to the VM, you can check the Admin Activity audit logs and the System Event audit logs.
Admin Activity audit logs contain log entries for API calls or other actions that modify the configuration or metadata of resources.
And, System Event audit logs contain log entries for Google Cloud actions that modify the configuration of resources.
Google Cloud services write audit logs that record administrative activities and accesses within your Google Cloud resources. Audit logs help you answer "who did what, where, and when?" within your Google Cloud resources with the same level of transparency as in on-premises environments. Cloud Audit Logs provides the following audit logs for each Cloud project, folder, and organization:
Admin Activity audit logs
Data Access audit logs
System Event audit logs
Policy Denied audit logs
The Data Access audit logs can be very useful too, but they are disabled by default. Data Access audit logs contain API calls that read the configuration or metadata of resources, as well as user-driven API calls that create, modify, or read user-provided resource data. If you want to enable them, please follow this link.
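As a rough sketch of enabling them (the project ID and service are placeholders), Data Access audit logs are turned on by adding an auditConfigs section to the project's IAM policy:
# Download the current IAM policy (my-project is a placeholder).
gcloud projects get-iam-policy my-project --format=json > policy.json
# Add an auditConfigs block to policy.json, e.g. for Compute Engine:
# "auditConfigs": [
#   { "service": "compute.googleapis.com",
#     "auditLogConfigs": [
#       { "logType": "ADMIN_READ" },
#       { "logType": "DATA_READ" },
#       { "logType": "DATA_WRITE" } ] } ]
# Write the updated policy back.
gcloud projects set-iam-policy my-project policy.json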
To view the audit logs:
In the Cloud console, go to the Logging > Logs Explorer page.
Select an existing Cloud project, folder, or organization.
In the Query builder pane, do the following:
In Resource type, select the Google Cloud resource whose audit logs you want to see.
In Log name, select the audit log type that you want to see:
For Admin Activity audit logs, select activity.
For Data Access audit logs, select data_access.
For System Event audit logs, select system_event.
For Policy Denied audit logs, select policy.
If you don't see these options, then there aren't any audit logs of that type available in the Cloud project, folder, or organization.
If you want to know more about Audit logs in GCP, please follow this link.
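For this specific case (who added the startup script, and when), a minimal filter sketch against the Admin Activity audit logs could look like this; the project ID is a placeholder, the assumption is that the script was set in the instance's own metadata, and a substring match is used because the exact method name can vary:
# Paste into the Logs Explorer query pane, or run via: gcloud logging read '<filter>'
resource.type="gce_instance"
logName="projects/my-project/logs/cloudaudit.googleapis.com%2Factivity"
protoPayload.methodName:"setMetadata"
# In the matching entries, protoPayload.authenticationInfo.principalEmail and the timestamp answer "who" and "when".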

Google Cloud get the log usage information of an API Key

I'm building a chat with a Cloud Translation API feature. For each client I create a new API key so I can identify each client's usage. The problem is the following:
I want to see the consumption of all API keys inside a project, something like the Operations Logging:
But revealing the timestamp and the API key name used, so I can track each client's usage of the service and determine how much I am going to bill them.
Update
Doing some additional research, I came across this article, which gives a walkthrough for gaining visibility into Service Account Keys (similar, but not what I needed). In that guide they create a log sink to push logs into BigQuery.
The problem now is that the filter used to extract the logs is the following:
logName:"projects/<PROJECT>/logs/cloudaudit.googleapis.com"
protoPayload.authenticationInfo.serviceAccountKeyName:"*"
The second line extracts logs that belong to a Service Account Key name. But, as stated at the beginning of the question, I'm looking for the API key logs, not the service account key logs.
You can use Cloud Audit Logs [1]. Cloud Audit Logs provides the following audit logs for each Cloud project, folder, and organization:
Admin Activity audit logs
Data Access audit logs
System Event audit logs
Policy Denied audit logs
Google Cloud services write audit log entries to these logs to help you answer the questions of "who did what, where, and when?" within your Google Cloud resources.
For this scenario, the Data Access audit logs [2] could be helpful. They contain API calls that read the configuration or metadata of resources, as well as user-driven API calls that create, modify, or read user-provided resource data. Data Access audit logs do not record data-access operations on resources that are publicly shared (available to All Users or All Authenticated Users) or that can be accessed without logging into Google Cloud.
As mentioned in the previous comment, these logs are disabled by default because they can be quite large; they must be explicitly enabled to be written.
However, the simplest way to view your API metrics is to use the Google Cloud Console's API Dashboard [3]. You can see an overview of all your API usage, or you can drill down to your usage of a specific API.
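As a sketch (the project ID is a placeholder), once Data Access audit logs are enabled for the Cloud Translation API you could filter them like this; note that whether the API key itself is recorded in the entry depends on the service, so this mainly gives you per-call timestamps and caller metadata rather than a guaranteed per-key breakdown:
logName="projects/my-project/logs/cloudaudit.googleapis.com%2Fdata_access"
protoPayload.serviceName="translate.googleapis.com"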

Google Cloud service account monitoring alerts

Background
I have a Google Cloud project running my N applications. Each application has an exclusive IAM service account (N service accounts in total) with minimal permissions.
Scenario
Let's imagine that one of the service accounts was leaked. An attacker will try to take advantage of these credentials. Because he doesn't know exactly which permissions this account has, he will try to make calls and see if they work for him.
Question
I want to "listen" to audit logs. Once I will see the log from kind "access denied", I will know that something is wrong with this service account.
Is this possible to write all those access denied incidents to Google Cloud Stackdriver?
How you recommend implementing it?
Thank you
Is it possible to write all those access-denied incidents to Google Cloud Stackdriver?
Most, but not all, Google Cloud services support this. However, successful access will also be logged.
You will need to enable Data Access Audit Logs.
This could generate a massive amount of logging information.
Access logs for Org and Folder are only available via API and not the console.
Review pricing before enabling data access audit logs.
How do you recommend implementing it?
This question is not well suited for Stack Overflow as it seeks recommendations and opinions. In general, you will export your logs to Google Cloud Pub/Sub to be processed by a service such as Cloud Functions, Cloud Run, etc. There are also commercial services such as Datadog designed for this type of support. Exporting logs to Google BigQuery is another popular option.
Read this article published on Datadog's website on Data Access audit logging and their services. I am not recommending their service, just providing a link to more information. Their article is very good.
Best practices for monitoring GCP audit logs
To understand the fields that you need to process read this document:
AuthorizationInfo
This link will also help:
Understanding audit logs
Here is one way to go about it (a minimal gcloud sketch follows this list):
Create a new Cloud Pub/Sub topic.
Create a new log routing sink whose destination is the Cloud Pub/Sub topic created in the previous step. Set a filter like
protoPayload.authenticationInfo.principalEmail="<service-account-name>@<project-name>.iam.gserviceaccount.com" AND protoPayload.authorizationInfo.granted="false"
to only get messages about unsuccessful auth actions for your service account.
Create a Cloud Function that is triggered when a new message is published to the Pub/Sub topic; this function can do whatever you desire, like send an email, page you, or anything else you can come up with in the code.
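A minimal gcloud sketch of those three steps, assuming placeholder names (my-project, sa-denied-topic, sa-denied-sink, notify_on_denied) and a function handler you write yourself:
# 1. Create the Pub/Sub topic.
gcloud pubsub topics create sa-denied-topic
# 2. Create the log routing sink that sends matching entries to the topic.
gcloud logging sinks create sa-denied-sink \
    pubsub.googleapis.com/projects/my-project/topics/sa-denied-topic \
    --log-filter='protoPayload.authenticationInfo.principalEmail="my-sa@my-project.iam.gserviceaccount.com" AND protoPayload.authorizationInfo.granted="false"'
# (Grant the sink's writer identity the Pub/Sub Publisher role on the topic.)
# 3. Deploy a Cloud Function triggered by the topic; notify_on_denied is your own handler.
gcloud functions deploy notify_on_denied \
    --runtime=python310 \
    --trigger-topic=sa-denied-topic \
    --entry-point=notify_on_denied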

Google Cloud Build - View logs permissions

I am the owner of a project and want to give permissions to another user to view the logs of Google Cloud Build, but I cannot figure out which role / permission this user needs.
Roles I've unsuccessfully tried are:
Cloud Build Editor,
Cloud Build Viewer,
Stackdriver Debugger Agent,
Stackdriver Debugger User,
Cloud Trace Admin,
Logging Admin,
Private Logs Viewer,
Logs Viewer,
Monitoring Admin
Google Cloud enterprise support confirmed to me that the only role that currently grants this is the project-level Viewer role (or Editor/Owner).
One workaround is to set the --gcs-log-dir flag when submitting a build to a bucket you control (rather than the default Google-managed gs://[PROJECT_NUMBER].cloudbuild-logs.googleusercontent.com/ bucket).
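For example (the bucket name is a placeholder, and the other user needs read access on that bucket):
gcloud builds submit --gcs-log-dir=gs://my-build-logs/logs .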
They're tracking the addition of a more granular permission in this feature request: https://issuetracker.google.com/issues/134928412, which you can subscribe to for updates.
If you add the project Viewer role, then this will resolve the issue. But this seems like an issue (or maybe a feature request: to add the necessary permission to Logs Viewer so that logs outside of the Stackdriver Logging page can be viewed in other UIs as well). Looking at the documentation, it says "roles/viewer (Project Viewer) gives members the same permissions as roles/logging.viewer at the project level," which means that just assigning Logs Viewer should have given the user the necessary permission to view the logs in the Cloud Build UI.
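A sketch of granting the project-level Viewer role, with placeholder project ID and user:
gcloud projects add-iam-policy-binding my-project \
    --member="user:teammate@example.com" \
    --role="roles/viewer"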