how to securely publish logs to the cloud - amazon-web-services

My library is a CLI utility, and people get it by running pip install [libname]. I would like to automatically record exceptions that occur when people use it and store these logs in the cloud. I have found services that should do just that: AWS CloudWatch, GCP Stackdriver.
However, looking at their APIs, it appears that I would have to ship my private key in order for the library to authenticate to my account. This doesn't sound right, and the cloud providers themselves warn against doing it.
This example from GCP fails because it requires credentials:
from google.cloud import logging
client = logging.Client()
logger = client.logger('log_name')
logger.log_text('A simple entry') # API call
Since the Python library ships as source, I understand that any kind of authentication I bundle would carry the risk of people sending fake logs, but this is OK with me: I would just cap the spending on my account for the (unexpected) case that somebody does just that. Of course, the credentials that ship with the library should be restricted to logging only.
Any example of how to enable logging to a cloud service from user machines?

For Azure Application Insights' "Instrumentation Key" there is a very good article about that subject here: https://devblogs.microsoft.com/premier-developer/alternative-way-to-protect-your-application-insights-instrumentation-key-in-javascript/
While I'm not familiar with the offerings of AWS or GCP, I would assume similar points are valid.
Generally speaking: while the instrumentation key is a method of authentication, it is not considered a very secret key in most scenarios. The worst damage somebody can do with it is send unwanted logs. They cannot read any data or overwrite anything with that key. And you already stated above that unwanted logs are not really a concern in your case.
So, as long as you are using an App Insights instance only for one specific application / purpose, I would say you are fine. You can still further aggregate that data in the background with data from different sources.
To add a concrete example: this little tool from Microsoft (the specific use case does not matter here) also collects telemetry and sends it to Azure Application Insights, unless the user opts out. I won't point to the exact code line, but their instrumentation key is checked in to that public GitHub repo for anybody to find.
Alternatively, the most secure way would be to send data from the browser to your custom API on your server then forward to Application Insights resource with the correct instrumentation key (see diagram below).
(Source: the link above)
App Insights SDK for python is here btw: https://github.com/microsoft/ApplicationInsights-Python
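To make that relay idea concrete, here is a rough Python sketch (this is not the official SDK; the envelope shape is a simplified approximation of the Application Insights ingestion format): your server keeps the instrumentation key and stamps it onto client-submitted events before forwarding, so the key never reaches the client.

```python
import json
import urllib.request

# Public Application Insights ingestion endpoint
INGEST_URL = "https://dc.services.visualstudio.com/v2/track"

def build_envelope(event: dict, ikey: str) -> dict:
    """Wrap a client-submitted event; the key is injected server-side only."""
    return {
        "name": "Microsoft.ApplicationInsights.Event",
        "time": event.get("time"),
        "iKey": ikey,
        "data": {
            "baseType": "EventData",
            "baseData": {
                "name": event.get("name", "unknown"),
                "properties": event.get("properties", {}),
            },
        },
    }

def forward(event: dict, ikey: str) -> None:
    """POST the envelope to the ingestion endpoint (no retries in this sketch)."""
    body = json.dumps(build_envelope(event, ikey)).encode()
    req = urllib.request.Request(
        INGEST_URL, data=body, headers={"Content-Type": "application/json"}
    )
    urllib.request.urlopen(req)
```

Clients POST their raw event to your endpoint and never see the key; you would still want rate limiting and input validation on that endpoint.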

To write logs to Stackdriver requires credentials. Anonymous connections to Stackdriver are NOT supported.
Under no circumstances give non-privileged users logging read permissions. Stackdriver records sensitive information in Stackdriver Logs.
Google Cloud IAM provides the role roles/logging.logWriter. This role gives users just enough permissions to write logs. This role does not grant read permissions.
The role roles/logging.logWriter is fairly safe. A user can write logs, but cannot read, overwrite, or delete them. I say fairly safe because the service account key itself still contains private information. I would create a separate project used only for Stackdriver logging, with no other services.
The second issue with giving external users access is cost. Stackdriver logs are $0.50 per GiB. You would not want someone uploading a ton of log entries, so make sure that you monitor external usage and create an alert to monitor costs.
Creating and managing service accounts
Chargeable Stackdriver Products
Alert on Stackdriver usage
Stackdriver Permissions and Roles
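For the original CLI use case, the advice above boils down to: ship a logWriter-only key in a throwaway project, cap the budget, and make logging strictly best-effort. A hedged sketch of the client side (the sender is injected; in the real library it would wrap something like `logger.log_struct(entry)` from google.cloud.logging, authenticated with the restricted key):

```python
import sys
import traceback

def make_entry(exc: BaseException) -> dict:
    """Turn an exception into a structured log entry."""
    return {
        "severity": "ERROR",
        "message": "".join(
            traceback.format_exception(type(exc), exc, exc.__traceback__)
        ).strip(),
        "python": sys.version.split()[0],
    }

def report(exc: BaseException, sender) -> None:
    """Best-effort reporting: a telemetry failure must never crash the CLI."""
    try:
        sender(make_entry(exc))
    except Exception:
        pass  # swallow network/auth errors silently
```

The key design point is the bare `except` around the send: the user ran your tool to do a job, not to upload telemetry, so any logging failure should be invisible to them.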

Related

Google Cloud get the log usage information of a API Key

I'm building a chat app with a Cloud Translation API feature. For each client I create a new API key to be able to identify each client's usage. The problem is the following:
I want to see the consumption of all the API keys inside a project, something like the Operations Logging view:
But showing the timestamp and the name of the API key used, so that I can track each client's usage of the service and determine how much I am going to bill them.
Update
Doing some additional research, I came across an article that gives a walkthrough for gaining visibility into Service Account Keys (similar, but not what I needed). In that guide they create a Log Sink to push logs into BigQuery.
The problem now is that the filter used to extract the logs is the following:
logName:"projects/<PROJECT>/logs/cloudaudit.googleapis.com"
protoPayload.authenticationInfo.serviceAccountKeyName:"*"
The second line extracts logs that belong to a Service Account Key Name. But as stated at the beginning of the question, I'm looking for the API key logs, not the service account key.
You can use Cloud Audit Logs [1]. Cloud Audit Logs provides the following audit logs for each Cloud project, folder, and organization:
- Admin Activity audit logs
- Data Access audit logs
- System Event audit logs
- Policy Denied audit logs
Google Cloud services write audit log entries to these logs to help you answer the questions of "who did what, where, and when?" within your Google Cloud resources.
For this scenario, Data Access audit logs [2] could be helpful: they contain API calls that read the configuration or metadata of resources, as well as user-driven API calls that create, modify, or read user-provided resource data. Data Access audit logs do not record data-access operations on resources that are publicly shared (available to All Users or All Authenticated Users) or that can be accessed without logging into Google Cloud.
As mentioned in the previous comment, these logs are disabled by default because they can be quite large; they must be explicitly enabled to be written.
However, the simplest way to view your API metrics is to use the Google Cloud Console's API Dashboard [3]. You can see an overview of all your API usage, or drill down to your usage of a specific API.
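For reference, a Logs Explorer filter scoped to Data Access audit entries looks like this (replace <PROJECT> with your project id; the %2F is the URL-encoded slash used in audit log names):

```
logName="projects/<PROJECT>/logs/cloudaudit.googleapis.com%2Fdata_access"
```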

Google Cloud service account monitoring alerts

Background
I have a Google Cloud project running my N applications. Each application has an exclusive IAM service account (total N service account) with minimal permissions.
Scenario
Let's imagine that one of the service accounts leaks. An attacker will try to take advantage of these credentials. Because he doesn't know exactly which permissions this account has, he will try to make calls and see which ones work for him.
Question
I want to "listen" to the audit logs. Once I see an "access denied" log entry, I will know that something is wrong with this service account.
Is it possible to write all those access denied incidents to Google Cloud Stackdriver?
How do you recommend implementing it?
Thank you
Is it possible to write all those access denied incidents to Google Cloud Stackdriver?
Most but not all Google Cloud services support this. However, successful accesses will be logged as well.
You will need to enable Data Access Audit Logs.
This could generate a massive amount of logging information.
Access logs for Org and Folder are only available via API and not the console.
Review pricing before enabling data access audit logs.
How do you recommend implementing it?
This part of the question is not really suitable for Stack Overflow as it seeks recommendations and opinions. In general, you will export your logs to Google Cloud Pub/Sub to be processed by a service such as Cloud Functions, Cloud Run, etc. There are also commercial services such as DataDog designed for this type of support. Exporting logs to Google BigQuery is another popular option.
Read this article published on DataDog's website about Data Access audit logging and their services. I am not recommending their service, just providing a link to more information. Their article is very good.
Best practices for monitoring GCP audit logs
To understand the fields that you need to process read this document:
AuthorizationInfo
This link will also help:
Understanding audit logs
Here is one way to go about it:
Create a new cloud pubsub topic
Create a new log routing sink with the destination service set to the Pub/Sub topic created in the previous step (set a filter like
protoPayload.authenticationInfo.principalEmail="<service-account-name>@<project-name>.iam.gserviceaccount.com" AND protoPayload.authorizationInfo.granted="false" to only get messages about unsuccessful auth actions for your service account)
Create a cloud function that is triggered when a new message is published to the Pub/Sub topic; this function can do whatever you desire, like send a message to an email address, page you, or anything else you can come up with in the code.
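A minimal sketch of that last step, assuming a background (Pub/Sub-triggered) Cloud Function in Python: the audit log entry arrives base64-encoded in the message data, and the alerting action here is just a stand-in print/return.

```python
import base64
import json

def handle_denied_access(event: dict, context=None) -> str:
    """Pub/Sub-triggered entry point: decode the audit log entry and alert."""
    entry = json.loads(base64.b64decode(event["data"]))
    proto = entry.get("protoPayload", {})
    email = proto.get("authenticationInfo", {}).get("principalEmail", "unknown")
    method = proto.get("methodName", "unknown")
    alert = f"Access denied: {email} attempted {method}"
    print(alert)  # replace with an email/pager integration of your choice
    return alert
```

Because the sink filter above already narrows delivery to denied calls from your service account, the function itself can stay this simple.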

Using Google Cloud KMS on behalf of user

I have a CLI tool that interacts with Google KMS. In order for it to work, I fetch the user credentials as a JSON file which is stored on disk. Now a new requirement came along. I need to make a web app out of this CLI tool. The web app will be protected via Google Cloud IAP. Question is, how do I run the CLI tool on behalf of the authenticated user?
You don't. Better to use a service account and assign it the required role. That service account could even have domain-wide delegation of rights (able to impersonate just about any user), if that were needed.
Running CLI tools from a web application should probably be avoided anyway. It might be better to convert the CLI tool into a Cloud Function and call it via HTTP trigger from within the web application, so that access to the service account is limited as far as possible.
This might also be something to reconsider, security-wise:
I fetch the user credentials as a JSON file which is stored on disk.
Even if it might have been required before, with a service account it wouldn't be.
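If the web app sits behind IAP as described, the signed-in user's identity is delivered in request headers, so the Cloud Function can at least attribute each KMS operation to a user while running under its own service account. A rough sketch of reading it (IAP documents the X-Goog-Authenticated-User-Email header, but in production you should verify the accompanying JWT assertion rather than trust the header alone):

```python
def authenticated_user(headers: dict):
    """Return the IAP-asserted user email, or None if absent."""
    value = headers.get("X-Goog-Authenticated-User-Email", "")
    # IAP prefixes the identity, e.g. "accounts.google.com:alice@example.com"
    if ":" in value:
        return value.split(":", 1)[1]
    return None
```

The returned email is useful for auditing who triggered an operation; the KMS call itself would still be made with the service account's own credentials.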

Automating third party access to AWS Resources

I'm currently creating an open source web interface for a very CPU-intensive task that makes use of other open source projects. Because it is very simple and I want to keep it open source, I don't want to bother with a revenue scheme to support it. My current plan is to host the site in an S3 bucket and have some simple Lambda functions manage delegating execution to the client's AWS account.
My question is: is it possible to grant access to somebody's AWS account, similar to how it works with OAuth 2.0? In an ideal world, I'd like them to see a big "authorize" button redirecting them to AWS, listing the permissions, with confirm and deny buttons. Trust issues aside, this is the only resource I could find, and it looks quite cumbersome for somebody to authorize my app, which in essence will only perform computations in their AWS EC2 account.
Actually, cross-account access using IAM roles is absolutely the best way to do this. The docs have all the info you need, and it can be very simple for your user. The setup instructions for Spotinst, a third-party AWS service provider, demonstrate how simple it can be. They have it in four steps:
Connect Spotinst to your Cloud Provider:
Click on the "Open template in CloudFormation" button and follow the instructions. Make sure not to refresh or leave this page until you save your credentials.
Paste the Role ARN that was created.
Click on the "Connect account" button.
If you try it out, I think you'll find it to be even easier than adding OAuth to your service.
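Once the customer has created the role via your CloudFormation template, your side of the flow reduces to one STS call. A sketch (the role ARN and external id are placeholders; the STS client is passed in, so with boto3 it would be `boto3.client("sts")`):

```python
def customer_credentials(sts_client, role_arn: str, external_id: str) -> dict:
    """Assume the customer's cross-account role; returns temporary credentials."""
    resp = sts_client.assume_role(
        RoleArn=role_arn,
        RoleSessionName="webapp-compute",
        ExternalId=external_id,  # guards against the confused-deputy problem
    )
    c = resp["Credentials"]
    return {
        "aws_access_key_id": c["AccessKeyId"],
        "aws_secret_access_key": c["SecretAccessKey"],
        "aws_session_token": c["SessionToken"],
    }
```

The returned dict can be splatted into e.g. `boto3.client("ec2", **creds)` to act in the customer's account until the temporary session expires.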

AWS assume iam roles vs gcp's json files with private keys

One thing I dislike about Google Cloud Platform (GCP) is its less baked-in security model around roles/service accounts.
Running locally on my laptop, I need to use the service account's key specified in a JSON file. In AWS, I can just assume a role I have been granted access to assume (without needing to carry around a private key). Is there an analogue to this with GCP?
I am going to try and answer this. I have the AWS Security Specialty (8 AWS certifications) and I know AWS very well. I have been investing a lot of time this year mastering Google Cloud with a focus on authorization and security. I am also an MVP Security for Alibaba Cloud.
AWS has a focus on security and security features that I both admire and appreciate. However, unless you really spend the time to understand all the little details, it is easy to implement poor/broken security in AWS. I can also say the same about Google security. Google has excellent security built into Google Cloud Platform. Google just does it differently and also requires a lot of time to understand all the little features / details.
In AWS, you cannot just assume a role: you need an AWS access key first, or to be authenticated via a service role; then you can call STS to assume a role. Both AWS and Google make this easy with AWS access keys / Google service accounts. Whereas AWS uses roles, Google uses roles and scopes. The end result is good on either platform.
Google authentication is based upon OAuth 2.0. AWS authentication is based upon Access Key / Secret Key. Both have their strengths and weaknesses. Both can be either easy to implement (if you understand them well) or a pain to get correct.
The major cloud providers (AWS, Azure, Alibaba, Google, IBM) are moving very fast with a constant stream of new features and services. Each one has strengths and weaknesses. Today, there is no platform that offers all the features of the others. AWS is ahead today in both features and market share. Google has a vast number of services, outnumbering AWS's, and I don't know why this is overlooked. The other platforms are catching up quickly, and today you can implement enterprise-class solutions and security on any of the cloud platforms.
Today, we would not choose only Microsoft or only open source for our application and server infrastructure. In 2019, we will not be choosing only AWS or only Google, etc. for our cloud infrastructure. We will mix and match the best services from each platform for our needs.
As described in the Getting Started with Authentication [1] page, for service accounts a key file is needed in order to authenticate.
From [2]: You can authenticate to a Google Cloud Platform (GCP) API using service accounts or user accounts, and for APIs that don't require authentication, you can use API keys.
Service and user accounts need the key file to authenticate. Taking this information into account, there is no way to authenticate locally without using a key file.
Links:
[1] https://cloud.google.com/docs/authentication/getting-started
[2] https://cloud.google.com/docs/authentication/