I have made a dashboard in AWS CloudWatch. My goal is to display the dashboard on a screen in our office, so I shared it using the option "Share your dashboard and require a username and password".
This lets me log in and view the dashboard as a non-AWS user, but I am not completely satisfied with the behaviour. My challenges are:
When logging in there is no option to "keep me signed in". So every day or so I need to sign in again. This is not desirable for an office dashboard.
Dashboard settings like light vs. dark mode and refresh frequency are not stored anywhere, so after each login (as the non-AWS user) I have to adjust them manually.
After a lot of struggle with CloudWatch dashboards, I decided to switch to Grafana instead.
Grafana solved all my issues with CloudWatch, and we gained more advanced dashboard features as well. Currently the Grafana free tier also supports all our needs, so there is no extra cost.
When reviewing the documentation here (https://cloud.google.com/monitoring/alerts/incidents-events#incident) and using the product itself, it appears that open incidents and their associated details are only displayed in the Google Cloud Console.
The CLI and API appear to only support management of alerting policies, but I cannot find a way to retrieve a list of open incidents. For example, I'd like to send an alert if more than 5 incidents are open for 12 hours on a specific alerting policy. It would seem that the data exists (as provided from the Google Cloud Console), but the API is not public.
I think it is not possible.
Incidents do not appear to be part of Google's public API for Cloud Monitoring.
There are a couple of ways to verify this:
The APIs Explorer documents the Cloud Monitoring API, and there are no Incident resource types or methods.
Watching the network traffic (e.g. with the Chrome Developer Console) while browsing the Incidents pages in the Cloud Console: they don't (appear to) call any public API endpoints/methods.
There's an existing feature request (FR) on Google's public issue tracker for this.
I encourage you to "Star" the issue Manage Incidents via API (click the star icon to the left of the issue title) to both "+1" the FR and subscribe to updates on it.
I run a small research group at a large university that manages hundreds of GCP accounts. The university acts as the Billing Administrator, and my research group was assigned a GCP "project" for all of our work. However, for privacy reasons, they cannot give me access to the Billing API because this would allow me to see the billing details for other labs.
Because we have trainees in our lab who WILL make mistakes, I would like to set up an automated system that monitors our current GCP bill and (1) sends notifications or (2) terminates all VMs when that bill reaches certain predefined limits. For example, if our monthly budget is $10k, then I would like to receive a notification at $5k, another notification at $10k, and to terminate all VMs at $15k.
My problem is that in order to implement a system like this, I need access to the Billing API. I have already contacted my system administrator and they have said that this is impossible. Instead, they proposed that I write a script that lists all VMs and uses the Cost Calculator to estimate my monthly GCP bill.
However, this seems a little circuitous. When I am using the Google Cloud Console, I can see the total and forecasted costs for my project, so it seems that I should be able to access this information programmatically. However, I cannot find any information on how to do this, since all solutions require me to activate the Billing API. Any ideas?
There is no API to fetch the data you see in the Google Cloud Console. You will need to export the billing data and then process each row of data to generate reports.
There are two options that I can think of:
Option 1) Ask the admin to set up billing data export to BigQuery and grant you permission to query the billing tables. You can then query BigQuery to generate your own cost reports.
Set up Cloud Billing data export to BigQuery
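With the export in place, month-to-date spend for one project is a single query. A sketch against the standard export schema (the dataset name, table suffix, and project ID are placeholders you'd replace with whatever the admin configured):

```sql
-- Month-to-date spend per invoice month for one project.
-- gcp_billing_export_v1_XXXXXX is the default export table name pattern.
SELECT
  invoice.month,
  SUM(cost)
    + SUM(IFNULL((SELECT SUM(c.amount) FROM UNNEST(credits) c), 0)) AS total_cost
FROM `my-dataset.gcp_billing_export_v1_XXXXXX`
WHERE project.id = 'my-project-id'
GROUP BY invoice.month
ORDER BY invoice.month
```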
Option 2) Create a separate billing account for your project and have the admin grant you permission on it. A GCP organization can have multiple Billing Accounts tied to the same Payments Account. This option supports creating budget alerts.
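Once you can read the month-to-date spend (e.g. from the BigQuery export), the notify/terminate logic itself is simple. A minimal sketch with the thresholds from the question; the data source and what "notify" and "terminate" actually do are left to you:

```python
def budget_actions(month_to_date_cost,
                   notify_at=(5_000, 10_000), terminate_at=15_000):
    """Map current spend to actions: notifications at $5k and $10k,
    VM termination at $15k, as described in the question."""
    actions = [f"notify:{t}" for t in notify_at if month_to_date_cost >= t]
    if month_to_date_cost >= terminate_at:
        actions.append("terminate-vms")
    return actions
```

Run on a schedule (e.g. a cron job or Cloud Scheduler), this gives you the safety net without touching the Billing API directly.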
My library is a CLI utility, and people get it by running pip install [libname]. I would like to automatically record exceptions that occur when people use it and store these logs in the cloud. I have found services that should do just that: AWS CloudWatch, GCP Stackdriver.
However, looking at their APIs, it appears that I would have to ship my private key in order for the library to authenticate to my account. This doesn't sound right, and the cloud providers warn against doing this.
For example, this GCP snippet fails because it requires credentials:
from google.cloud import logging
client = logging.Client()
logger = client.logger('log_name')
logger.log_text('A simple entry') # API call
While a Python library exposes its source, I understand that any kind of authentication I ship would bear the risk of people sending fake logs, but this is OK with me, as I would just limit the spending on my account for the (unexpected) case that somebody does just that. Of course, the credentials that ship with the library should be restricted to logging only.
Any example of how to enable logging to a cloud service from user machines?
For Azure Application Insights' "Instrumentation Key" there is a very good article about that subject here: https://devblogs.microsoft.com/premier-developer/alternative-way-to-protect-your-application-insights-instrumentation-key-in-javascript/
While I'm not familiar with the offerings of AWS or GCP, I would assume similar points are valid.
Generally speaking: while the instrumentation key is a method of authentication, it is not considered a very secret key in most scenarios. The worst damage somebody can do is to send unwanted logs; they cannot read any data or overwrite anything with that key. And you already stated above that you are not really worried about unwanted logs in your case.
So, as long as you are using an App Insights instance only for one specific application / purpose, I would say you are fine. You can still further aggregate that data in the background with data from different sources.
To add a concrete example: this little tool from Microsoft (the specific use case does not matter here) collects telemetry as well and sends it to Azure Application Insights, if the user does not opt out. I won't point to the exact code line, but their instrumentation key is checked in to that public GitHub repo for anybody to find.
Alternatively, the most secure way would be to send data from the browser to your custom API on your server, then forward it to the Application Insights resource with the correct instrumentation key (see the diagram in the linked article).
(Source: the link above)
By the way, the App Insights SDK for Python is here: https://github.com/microsoft/ApplicationInsights-Python
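The "custom API on your server" approach can be sketched as below. The endpoint URL is a placeholder for a service you would run yourself; the real instrumentation key stays on that server and never ships with the library:

```python
import json
import urllib.request

# Hypothetical proxy endpoint you operate; it holds the real key
# and forwards reports to the logging backend of your choice.
PROXY_URL = "https://logs.example.com/report"

def build_report(exc, version="1.0.0"):
    """Serialize an exception into the payload the proxy expects.
    The payload shape is ours to define; nothing secret ships here."""
    return {
        "library_version": version,
        "error_type": type(exc).__name__,
        "message": str(exc),
    }

def send_report(exc):
    data = json.dumps(build_report(exc)).encode("utf-8")
    req = urllib.request.Request(
        PROXY_URL, data=data, headers={"Content-Type": "application/json"})
    try:
        urllib.request.urlopen(req, timeout=2)
    except Exception:
        pass  # never let telemetry break the CLI itself
```

The proxy can then rate-limit or drop abusive traffic before it ever reaches your paid account.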
Writing logs to Stackdriver requires credentials; anonymous connections to Stackdriver are NOT supported.
Under no circumstances give non-privileged users logging read permissions. Stackdriver records sensitive information in Stackdriver Logs.
Google Cloud IAM provides the role roles/logging.logWriter. This role gives users just enough permissions to write logs. This role does not grant read permissions.
The role roles/logging.logWriter is fairly safe. A user can write logs, but cannot read, overwrite or delete logs. I say fairly safe as there is private information stored in the service account. I would create a separate project only for Stackdriver logging with no other services.
The second issue with providing external users access is cost. Stackdriver logs are $0.50 per GiB. You would not want someone uploading a ton of logfile entries. Make sure that you monitor external usage. Create an alert to monitor costs.
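To get a feel for the cost exposure, using the $0.50/GiB ingestion price quoted above (the rate is illustrative; check current pricing):

```python
def projected_monthly_cost(gib_per_day, rate_per_gib=0.50, days=30):
    """Rough monthly Stackdriver ingestion cost for a steady log volume."""
    return gib_per_day * days * rate_per_gib
```

For example, a well-behaved CLI at 1 GiB/day across all users is about $15/month, while an abuser pushing 100 GiB/day would run about $1,500/month, which is why the usage alert matters.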
Creating and managing service accounts
Chargeable Stackdriver Products
Alert on Stackdriver usage
Stackdriver Permissions and Roles
Is there a way to hide the billing widget from some users in the GCP console?
I know if I login as myself, I can customize my dashboard, but I need to hide the billing widget from a couple of users (long story) so I'm looking for guidance.
Another option would be to create some sort of "default" dashboard for everyone, which would not include the billing widget, that would work too if someone knows how to do that.
Thanks...Rich
Unfortunately, there is no way to remove that Billing card for all users. If you remove it on your dashboard, it is only hidden on yours. The default dashboard is the same for all users.
On top of that, the same permissions that allow users to access the Management Console also provides permissions to see the contents of that card. At the moment, the only way to hide the contents of that card is to remove access to the console as a whole.
This being said, there is currently a feature request in progress to have special permissions created for the billing card, or to include that billing info in the Cloud Billing account (which can be restricted on a per-user basis).
I really like the AWS CloudWatch dashboard feature, but it would be far more useful if I could increase its visibility to others in our company who might not know AWS. Is there any way of displaying it outside of the AWS website?
kraught's answer is actually the correct one and should not be downvoted. CloudWatch Dashboards now literally have a share option. You can share your dashboards (a) publicly, (b) via username/password, or (c) via SSO. Either way you set this up, you eventually get a shareable link that's accessible outside of the AWS console, and you can easily embed it in your website.
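For the embedding step, the shareable link can go straight into an iframe on any internal page (the URL here is a placeholder for the link CloudWatch generates when you enable sharing):

```html
<!-- Paste the shareable link CloudWatch gives you as the src. -->
<iframe src="https://cloudwatch.amazonaws.com/dashboard.html?dashboard=..."
        width="100%" height="800" frameborder="0"></iframe>
```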