Background
I have a Google Cloud project running N applications. Each application has its own dedicated IAM service account (N service accounts in total) with minimal permissions.
Scenario
Let's imagine that one of the service accounts is leaked. An attacker will try to take advantage of these credentials. Because he doesn't know exactly which permissions this account has, he will try to make calls and see which of them work.
Question
I want to "listen" to audit logs. Once I will see the log from kind "access denied", I will know that something is wrong with this service account.
Is this possible to write all those access denied incidents to Google Cloud Stackdriver?
How you recommend implementing it?
Thank you
Is it possible to write all those access denied incidents to Google Cloud Stackdriver?
Most, but not all, Google Cloud services support this. Note, however, that successful accesses will also be logged.
You will need to enable Data Access audit logs (a gcloud sketch of enabling them follows below).
This could generate a massive amount of logging information.
Access logs for organizations and folders are only available via the API, not the console.
Review pricing before enabling data access audit logs.
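A rough sketch of enabling Data Access audit logs for all services at the project level, assuming you edit the project's IAM policy with gcloud; the project ID is a placeholder.
# Download the current IAM policy (my-project is a placeholder).
gcloud projects get-iam-policy my-project --format=yaml > policy.yaml

# Add (or extend) an auditConfigs section in policy.yaml, for example:
#   auditConfigs:
#   - service: allServices
#     auditLogConfigs:
#     - logType: ADMIN_READ
#     - logType: DATA_READ
#     - logType: DATA_WRITE

# Apply the updated policy to turn the Data Access audit logs on.
gcloud projects set-iam-policy my-project policy.yaml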
How would you recommend implementing it?
This part of the question is not really suitable for Stack Overflow, as it seeks recommendations and opinions. In general, you will export your logs to Google Cloud Pub/Sub to be processed by a service such as Cloud Functions or Cloud Run. There are also commercial services, such as Datadog, designed for this type of log processing. Exporting logs to Google BigQuery is another popular option (a gcloud sketch of that follows the links below).
Read this article published on Datadog's website on Data Access audit logging and their services. I am not recommending their service, just providing a link to more information; their article is very good.
Best practices for monitoring GCP audit logs
To understand the fields that you need to process read this document:
AuthorizationInfo
This link will also help:
Understanding audit logs
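For the BigQuery export option mentioned above, a log sink can be created with a single gcloud command. This is only a sketch: the project, dataset, and sink names are placeholders, and the dataset must already exist.
gcloud logging sinks create audit-to-bq \
  bigquery.googleapis.com/projects/my-project/datasets/audit_logs \
  --log-filter='logName:"cloudaudit.googleapis.com"'
# gcloud prints the sink's writer identity; grant that service account the
# BigQuery Data Editor role on the dataset so the sink can write to it.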
Here is one way to go about it:
Create a new Cloud Pub/Sub topic.
Create a new log routing sink whose destination is the Cloud Pub/Sub topic created in the previous step. Set a filter along the lines of
protoPayload.authenticationInfo.principalEmail="<service-account-name>@<project-name>.iam.gserviceaccount.com" AND protoPayload.authorizationInfo.granted="false" so that you only get messages about unsuccessful authorization attempts by your service account.
Create a Cloud Function that is triggered whenever a new message is published to the Pub/Sub topic; this function can do whatever you desire, like send you an email, page you, or anything else you can come up with in code (see the sketch below).
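A minimal sketch of that last step, assuming a first-generation, Pub/Sub-triggered Cloud Function written in Python; the function name and the alerting action are placeholders.
# main.py
import base64
import json

def handle_denied_access(event, context):
    # The log sink delivers each LogEntry as base64-encoded JSON in the
    # Pub/Sub message data field.
    entry = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
    proto = entry.get("protoPayload", {})
    principal = proto.get("authenticationInfo", {}).get("principalEmail", "unknown")
    method = proto.get("methodName", "unknown")
    # Replace this print with an email, a pager call, or even a call that
    # disables the compromised service account.
    print(f"Denied access: {principal} tried {method}")
It could be deployed with something like gcloud functions deploy handle_denied_access --runtime python311 --trigger-topic <your-topic>.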
Related
I wanted to know if there is a way to track alerts or audit anything that happens with an AWS account, like who changed what and why. I did find this https://docs.aws.amazon.com/opensearch-service/latest/developerguide/audit-logs.html, where they use a command line for enabling audit logs on an existing domain: aws opensearch update-domain-config --domain-name my-domain --log-publishing-options "AUDIT_LOGS={CloudWatchLogsLogGroupArn=arn:aws:logs:us-east-1:123456789012:log-group:my-log-group,Enabled=true}", but this is in regard to Amazon OpenSearch Service, which I believe is only free for 12 months if you haven't used it already. There is also AWS Audit Manager. I am aware there are services that can do this but require a fee, and I wanted to know if there are any free options.
From the AWS documentation:
With AWS CloudTrail, you can monitor your AWS deployments in the cloud by getting a history of AWS API calls for your account, including API calls made by using the AWS Management Console, the AWS SDKs, the command line tools, and higher-level AWS services. You can also identify which users and accounts called AWS APIs for services that support CloudTrail, the source IP address from which the calls were made, and when the calls occurred. You can integrate CloudTrail into applications using the API, automate trail creation for your organization, check the status of your trails, and control how administrators turn CloudTrail logging on and off.
AWS Config provides a detailed view of the resources associated with your AWS account, including how they are configured, how they are related to one another, and how the configurations and their relationships have changed over time.
Basically, AWS CloudTrail keeps a log of API calls (requests to AWS to do/change stuff), while AWS Config tracks how individual configurations have changed over time (for a limited range of resources, such as a Security Group rule being changed).
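For a free starting point, CloudTrail's event history (the last 90 days of management events) needs no setup at all, and a single trail delivering management events to S3 can be created from the CLI. This is only a sketch: the trail and bucket names are placeholders, and the bucket needs the usual CloudTrail bucket policy.
aws cloudtrail create-trail --name my-audit-trail --s3-bucket-name my-cloudtrail-bucket
aws cloudtrail start-logging --name my-audit-trail
# Recent management events can also be queried without creating any trail:
aws cloudtrail lookup-events --max-results 5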
As the title says, there seem to be two ways to collect certain GCP audit logs.
Using the Google Workspace Admin SDK, specifying the "gcp" application name in the call.
Using the Google Cloud Logging API
What is the difference in the logs collected? I'm still researching, but hopefully someone on the Google team sees this tag and knows exactly.
Do all of these logs get collected via the Workspace Admin SDK?
Admin Activity audit logs
Data Access audit logs
System Event audit logs
Policy Denied audit logs
The Admin Reports API only reports "OS login" events.
https://developers.google.com/admin-sdk/reports/v1/appendix/activity/gcp
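For comparison, here is a rough sketch of pulling GCP audit log entries through the Cloud Logging API with the google-cloud-logging Python client; the project ID and the filter are placeholders.
from google.cloud import logging

client = logging.Client(project="my-project")
# All four audit log types are written under cloudaudit.googleapis.com log names.
audit_filter = 'logName:"cloudaudit.googleapis.com"'
for entry in client.list_entries(filter_=audit_filter, max_results=10):
    print(entry.timestamp, entry.log_name)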
I'm building a chat application with a feature that uses the Cloud Translation API. For each client I create a new API key to be able to identify each client's usage. The problem is the following:
I want to see the consumption of all API keys inside a project, something like the Operations Logging view:
But showing the timestamp and the API key name used, so I can track each client's usage of the service and determine how much I am going to bill them.
Update
Doing some additional research, I came across this article, which gives a walkthrough of gaining visibility into service account keys (similar, but not what I needed). In that guide they create a log sink to push logs into BigQuery.
The problem now is that the filter used to extract the logs is the following:
logName:"projects/<PROJECT>/logs/cloudaudit.googleapis.com"
protoPayload.authenticationInfo.serviceAccountKeyName:"*"
The second line extracts logs that carry a service account key name. But, as stated at the beginning of the question, I'm looking for the API key logs, not the service account key.
You can use Cloud Audit Logs [1]. Cloud Audit Logs provides the following audit logs for each Cloud project, folder, and organization:
- Admin Activity audit logs
- Data Access audit logs
- System Event audit logs
- Policy Denied audit logs
Google Cloud services write audit log entries to these logs to help you answer the questions of "who did what, where, and when?" within your Google Cloud resources.
For this scenario, Data Access audit logs [2] could be helpful: they contain API calls that read the configuration or metadata of resources, as well as user-driven API calls that create, modify, or read user-provided resource data. Data Access audit logs do not record the data-access operations on resources that are publicly shared (available to All Users or All Authenticated Users) or that can be accessed without logging in to Google Cloud.
As mentioned in the previous comment, these logs are disabled by default because they can be quite large; they must be explicitly enabled to be written.
However, the simplest way to view your API metrics is to use the Google Cloud Console's API Dashboard [3]. You can see an overview of all your API usage, or you can drill down to your usage of a specific API.
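If you do enable Data Access audit logs, a Logs Explorer filter along these lines should narrow the entries down to Cloud Translation calls; the project ID is a placeholder, and whether the entries expose the particular API key used depends on what the service records.
logName="projects/<PROJECT>/logs/cloudaudit.googleapis.com%2Fdata_access"
protoPayload.serviceName="translate.googleapis.com"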
My library is a CLI utility, and people get it by running pip install [libname]. I would like to automatically record exceptions that occur when people use it and store these logs in the cloud. I have found services that should do just that: AWS CloudWatch, GCP Stackdriver.
However, while looking at their API it appears that I would have to ship my private key in order for the library to authenticate to my account. This doesn't sound right and I am warned by the cloud providers not to do this.
Example from GCP fails, requires credentials:
from google.cloud import logging
client = logging.Client()
logger = client.logger('log_name')
logger.log_text('A simple entry') # API call
Since a Python library ships its source, I understand that any kind of credentials I ship would bear the risk of people sending fake logs, but this is OK with me, as I would just limit the spending on my account for the (unexpected) case that somebody does just that. Of course, the credentials that ship with the library should be restricted to logging only.
Any example of how to enable logging to a cloud service from user machines?
For Azure Application Insights' "Instrumentation Key" there is a very good article about that subject here: https://devblogs.microsoft.com/premier-developer/alternative-way-to-protect-your-application-insights-instrumentation-key-in-javascript/
While I'm not familiar with the offerings of AWS or GCP, I would assume similar points are valid.
Generally speaking: while the instrumentation key is a method of authentication, it is not considered a very secret key in most scenarios. The worst damage somebody can do is send unwanted logs. They cannot read any data or overwrite anything with that key. And you already stated above that you are not really worried in your case about the issue of unwanted logs.
So, as long as you are using an App Insights instance only for one specific application / purpose, I would say you are fine. You can still further aggregate that data in the background with data from different sources.
To add a concrete example to this: this little tool from Microsoft (the specific use case does not matter here) also collects telemetry and sends it to Azure Application Insights, unless the user opts out. I won't point to the exact code line, but their instrumentation key is checked in to that public GitHub repo for anybody to find.
Alternatively, the most secure way would be to send data from the browser to your custom API on your server then forward to Application Insights resource with the correct instrumentation key (see diagram below).
(Source: the link above)
App Insights SDK for python is here btw: https://github.com/microsoft/ApplicationInsights-Python
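As a rough sketch using the older applicationinsights package from that repo, exception reporting from a CLI could look roughly like this; the instrumentation key and the run_cli entry point are placeholders.
from applicationinsights import TelemetryClient

tc = TelemetryClient("00000000-0000-0000-0000-000000000000")  # placeholder key
try:
    run_cli()  # hypothetical entry point of the CLI utility
except Exception:
    tc.track_exception()  # records the currently active exception
    raise
finally:
    tc.flush()  # telemetry is batched, so flush before the process exits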
Writing logs to Stackdriver requires credentials. Anonymous connections to Stackdriver are NOT supported.
Under no circumstances give non-privileged users logging read permissions. Stackdriver records sensitive information in Stackdriver Logs.
Google Cloud IAM provides the role roles/logging.logWriter. This role gives users just enough permissions to write logs. This role does not grant read permissions.
The role roles/logging.logWriter is fairly safe. A user can write logs, but cannot read, overwrite or delete logs. I say fairly safe as there is private information stored in the service account. I would create a separate project only for Stackdriver logging with no other services.
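A sketch of that setup with gcloud; the project and account names are placeholders.
# Dedicated, logging-only project and service account.
gcloud iam service-accounts create cli-logger --project my-logging-project

# Grant only the log writer role.
gcloud projects add-iam-policy-binding my-logging-project \
  --member="serviceAccount:cli-logger@my-logging-project.iam.gserviceaccount.com" \
  --role="roles/logging.logWriter"

# The key file you would ship with the library (accepting the risks above).
gcloud iam service-accounts keys create cli-logger.json \
  --iam-account=cli-logger@my-logging-project.iam.gserviceaccount.com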
The second issue with providing external users access is cost. Stackdriver logs are $0.50 per GiB. You would not want someone uploading a ton of logfile entries. Make sure that you monitor external usage. Create an alert to monitor costs.
Creating and managing service accounts
Chargeable Stackdriver Products
Alert on Stackdriver usage
Stackdriver Permissions and Roles
I am trying to subscribe to Gmail for mail notifications using Google's Pub/Sub, and I've gone through the documentation and tutorials provided by Google. At one point, the docs state that I need to grant the correct rights on my Pub/Sub topic:
You need to grant publish privileges to serviceAccount:gmail-api-push@system.gserviceaccount.com. You can do this using the Cloud Pub/Sub Developer Console permissions interface following the resource-level access control instructions.
My understanding is that the publish privilege is equivalent to pubsub.topics.publish, or just "Pub/Sub Publisher". But what about serviceAccount:gmail-api-push@system.gserviceaccount.com? I can't find that service account anywhere, and the console is not letting me create it due to character restrictions. Is there another way to get this account?
You do not need to create the Gmail service account; it already exists. You are correct that "granting publish privileges" is the same as granting the "Pub/Sub Publisher" role.
Under the topic permissions interface, I would pass in the service account gmail-api-push@system.gserviceaccount.com (note: take out the leading "serviceAccount:" part) in the 'new members' field, and select "Pub/Sub Publisher" as the role.
I am guessing you couldn't find the service account because you were looking up serviceAccount:gmail-api-push@system.gserviceaccount.com.
Permissions are set by Pub/Sub Topic in the Pub/Sub Topics section of the GCP Console.
Go to the Google Cloud Console.
From the hamburger menu, go to Big Data -> Pub/Sub -> Topics.
Shortcut to this location: https://console.cloud.google.com/projectselector2/cloudpubsub/topicList
Click the checkbox for the Topic.
At the right side of the console, select "SHOW INFO PANEL".
Click Add members
Enter the service account and select the desired role. Usually Pub/Sub Publisher.
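The command-line equivalent, assuming the topic is named gmail-notifications:
gcloud pubsub topics add-iam-policy-binding gmail-notifications \
  --member="serviceAccount:gmail-api-push@system.gserviceaccount.com" \
  --role="roles/pubsub.publisher"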