I'm trying to figure out how to detect if someone deactivates the audit logs on GCP.
Right now I'm monitoring audit logs from different GCP services, and these logs are being sent to a SIEM. But if someone deactivates the audit logs option, I can't detect it in my SIEM because the logs simply stop arriving.
I tried to detect it from the SIEM side by alerting when the pipelines were empty, but it didn't work. Is there a way to alert on GCP if these logs have been disabled? I searched the internet but didn't find anything clear.
I'll put my answer here so it's helpful to other people.
Data Access audit logs can be enabled for each service independently. You can do that from the console (IAM & Admin > Audit Logs > select a service and enable the log types).
One way to alert if logs are disabled is to use the API: query the project's IAM policy and see which audit logs are enabled. Example below using gcloud:
➜ ~ gcloud projects get-iam-policy $PROJECT
auditConfigs:
- auditLogConfigs:
- logType: ADMIN_READ
- logType: DATA_READ
- logType: DATA_WRITE
service: storage.googleapis.com
So this can be done either with gcloud wrapped in a script or using the GCP API.
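For example, a minimal sketch of such a check in bash (the project ID, service name, and alert action are placeholders):

#!/usr/bin/env bash
# Alert if audit logs are no longer configured for a given service.
PROJECT="my-project"              # placeholder: your project ID
SERVICE="storage.googleapis.com"  # placeholder: the service to verify

# auditConfigs only lists services with audit logging enabled,
# so a missing entry means the logs were disabled.
if ! gcloud projects get-iam-policy "$PROJECT" \
    --format="yaml(auditConfigs)" | grep -q "$SERVICE"; then
  echo "ALERT: audit logs appear to be disabled for $SERVICE in $PROJECT" >&2
fi

Run it on a schedule (cron, Cloud Scheduler) and route the alert to your SIEM.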
Currently I am trying to create a custom metric to be tracked in the Google Cloud Monitoring dashboard.
Although the admin of the system has given me the Monitoring Editor, Monitoring Writer, and Monitoring Admin roles, this error still occurs:
Image: permission-based error
Which permission settings do I need to create a custom / log-based metric?
I have tried creating a metric in the Google Cloud Monitoring dashboard with the following roles: Monitoring Editor, Monitoring Writer, and Monitoring Admin. Even with these roles in my IAM permission settings, the red alert error still shows up when creating a metric.
It's confusing.
Although Log-based metrics produce Cloud Monitoring metrics, they are functionality of the Cloud Logging API (and so they require Logging-specific roles/permissions).
Here's the list of permissions: https://cloud.google.com/iam/docs/permissions-reference
Search it (!) for logging.logMetrics.create and you'll get:
Owner (roles/owner)
Editor (roles/editor)
App Engine flexible environment Service Agent (roles/appengineflex.serviceAgent)
Cloud Composer API Service Agent (roles/composer.serviceAgent)
Cloud Dataflow Service Agent (roles/dataflow.serviceAgent)
Logging Admin (roles/logging.admin)
Logs Configuration Writer (roles/logging.configWriter)
Serverless VPC Access Service Agent (roles/vpcaccess.serviceAgent)
Get roles/logging.admin or roles/logging.configWriter bound to your identity and try again.
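For example, a minimal sketch of the binding (gcloud; the project ID and member email are placeholders):

gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="user:you@example.com" \
  --role="roles/logging.configWriter"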
I can see the following in one of the GCE instance startup scripts:
userdel -r userid
Because of this, the user is not able to SSH through the browser.
My question is: how do we find out who added this startup script to the VM, and when?
Can we use some logs?
Yes, you can check the Activity log for this.
You can use the URL below, filling in your project ID:
https://console.cloud.google.com/home/activity?project=
If you want to know who added a startup script to the VM, you can check the Admin Activity audit logs and the System Event audit logs.
Admin Activity audit logs contain log entries for API calls or other actions that modify the configuration or metadata of resources.
And, System Event audit logs contain log entries for Google Cloud actions that modify the configuration of resources.
Google Cloud services write audit logs that record administrative activities and accesses within your Google Cloud resources. Audit logs help you answer "who did what, where, and when?" within your Google Cloud resources with the same level of transparency as in on-premises environments. Cloud Audit Logs provides the following audit logs for each Cloud project, folder, and organization:
Admin Activity audit logs
Data Access audit logs
System Event audit logs
Policy Denied audit logs
The Data Access audit logs can be very useful too, but they are disabled by default. Data Access audit logs contain API calls that read the configuration or metadata of resources, as well as user-driven API calls that create, modify, or read user-provided resource data. If you want to enable them, please follow this link.
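As a sketch, enabling them amounts to adding an auditConfigs section to the project's IAM policy (gcloud; PROJECT_ID is a placeholder and the service/log types below are just examples):

gcloud projects get-iam-policy PROJECT_ID > policy.yaml
# Edit policy.yaml to add, for example:
#   auditConfigs:
#   - auditLogConfigs:
#     - logType: DATA_READ
#     - logType: DATA_WRITE
#     service: allServices
gcloud projects set-iam-policy PROJECT_ID policy.yaml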
To view the audit logs:
In the Cloud console, go to the Logging > Logs Explorer page.
Select an existing Cloud project, folder, or organization.
In the Query builder pane, do the following:
In Resource type, select the Google Cloud resource whose audit logs you want to see.
In Log name, select the audit log type that you want to see:
For Admin Activity audit logs, select activity.
For Data Access audit logs, select data_access.
For System Event audit logs, select system_event.
For Policy Denied audit logs, select policy.
If you don't see these options, then there aren't any audit logs of that type available in the Cloud project, folder, or organization.
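To narrow this down to the startup-script question, a sketch of a Logs Explorer query (PROJECT_ID is a placeholder, and the method name assumes the script was added via instance metadata):

resource.type="gce_instance"
logName="projects/PROJECT_ID/logs/cloudaudit.googleapis.com%2Factivity"
protoPayload.methodName:"setMetadata"

Matching entries carry the caller in protoPayload.authenticationInfo.principalEmail and a timestamp, which answers "who" and "when".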
If you want to know more about Audit logs in GCP, please follow this link.
I have a Windows Server 2016 instance in GCP.
I would like to view the Event Viewer logs (or a custom log) in Google Cloud Logging.
I have already installed the Ops Agent, configured the default config in C:\Program Files\Google\Cloud Operations\Ops Agent\config\config.yaml, and restarted the service.
The VM also has the Stackdriver Logging API: Write Only and
Stackdriver Monitoring API: Write Only access scopes.
I tried restarting IIS; the log appears in Event Viewer but not in Cloud Logging.
What do I need to do to make the logs show up in Cloud Logging?
Thank you.
I was able to solve my own issue.
My default Logs Router sink _Default was only allowing logs from containers.
By creating a new Log Router sink that includes resource.type="gce_instance" resources, I'm able to see the logs coming from the Windows instance now.
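For reference, a minimal sketch of such a sink routed to the _Default log bucket (gcloud; the sink name and project ID are placeholders, and the filter is assumed from the description above):

gcloud logging sinks create gce-instance-logs \
  logging.googleapis.com/projects/PROJECT_ID/locations/global/buckets/_Default \
  --log-filter='resource.type="gce_instance"'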
I created a health check in my GCP project for HTTP traffic. I wanted to see the logs in Cloud Logging, but I can't see any logs related to the created health check, even though I have logging enabled on the health check. Can anyone help me?
You can go to Cloud Logging and filter on gcp_health_check resources.
You can view your health check logs under Logs explorer in the GCP console.
Check out this documentation for more details.
To see all logs, you can use the following query:
logName="projects/YOUR_PROJECT_ID/logs/compute.googleapis.com%2Fhealthchecks"
I would like to define the steps in making a pubsub log export sink from one project to another. As a secondary goal, I would like the sink to bridge organizations. So far I have followed these steps as outlined in the gcloud help pages and the auth documentation.
First I created a sink:
# from project A
gcloud logging sinks create \
<sink_name> \
pubsub.googleapis.com/projects/<project_B>/topics/<topic_name> \
--log-filter <filter>
The CLI returns successfully and gives a little advice on setting up permissions for a service account it created:
Created [https://logging.googleapis.com/v2/projects/<project_A>/sinks/<sink_name>].
Please remember to grant `serviceAccount:<new_user>@logging-<project_A_account_number>.iam.gserviceaccount.com` Pub/Sub Publisher role to the topic.
More information about sinks can be found at https://cloud.google.com/logging/docs/export/configure_export
Following this advice, I gave the new service account the appropriate permissions for that topic.
gcloud projects add-iam-policy-binding <project_B> \
--member serviceAccount:<new_user>@logging-<project_A_account_number>.iam.gserviceaccount.com \
--role roles/pubsub.publisher
This command returns without issue.
In spite of everything seeming OK, no logs flow through the sink.
Here are some clues: The Exports tab on the Logs Viewer reports a permissions error in the sink. The Project Activity tab reports a permissions issue.
Image: Logs Viewer, Exports
Image: Project, Activity
Is there a solution to make this work? Is it possible to generalize this to send logs to a sink in this project from other gcloud organizations?
I have been able to reproduce the scenario you wanted. Let me set the basics for the scenario description:
Project A (where my logs are stored): project-a
Project B (where my Pub/Sub Topic and Subscription are): project-b
Topic: projects/project-b/topics/myTopic
Subscription: mySub
Sink: test-sink
Then, this is the process I followed:
In project A: create a filter for logs in the Logging > Logs tab in the Google Cloud Console.
Create an export with the elements in the image below (remember to prefix the name of your topic in the other project with pubsub.googleapis.com/):
Move to the Exports tab and copy the Writer Identity, which should have the format test-sink@XXXXXXXX.iam.gserviceaccount.com
In project B: go to the IAM & admin > IAM tab in the Console and add a new member: the service account obtained in step 3, with the Pub/Sub Editor role.
Create a Pub/Sub subscription with the command gcloud beta pubsub subscriptions create --topic myTopic mySub
Do some operation that results in logs read by the filter you specified in Project A.
Consume the logs written to the topic using the subscription, with the command gcloud beta pubsub subscriptions pull mySub.
There you will find the logs that are written from Project A to your topic in Project B. I have reproduced this same scenario writing logs from a simple App Engine application (and therefore with the appropriate log filter searching for App Engine logs), and when I make requests to the App Engine app, some logs are created and then written to myTopic, which I can read using mySub.
Regarding your second question, I cannot confirm whether this same procedure works in a cross-organization scenario, but I see no issues with it.
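As a side note, a compact gcloud sketch of the permission step (names follow the example above; Pub/Sub Publisher is the minimum role the sink's identity needs on the topic):

# Look up the sink's writer identity in project A
gcloud logging sinks describe test-sink --project=project-a \
  --format='value(writerIdentity)'
# Grant it publish rights on the topic in project B; WRITER_IDENTITY is a
# placeholder for the serviceAccount:... value printed above
gcloud pubsub topics add-iam-policy-binding myTopic --project=project-b \
  --member='WRITER_IDENTITY' \
  --role='roles/pubsub.publisher'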
Matt, your gcloud commands to create a sink and grant the Publisher role to the service account used by the sink look correct. The error you saw may be transient, and it should have resolved after some time.
A possible reason for the error could be that there was a time delay between the two commands, and the sink tried to export logs before you granted the IAM role. Can you confirm that the error resolved itself eventually?
To answer your last question: yes, you can export logs from one project to a destination in another project in a different organization.