API for getting daily costs related to GCP - google-cloud-platform

Is there any API (preferably Python) that can be used to get a usage-cost report for resources in GCP? The Billing APIs don't seem to return the costs of the resources being used.

You can export your Cloud Billing data to BigQuery:
https://cloud.google.com/billing/docs/how-to/export-data-bigquery
https://cloud.google.com/billing/docs/how-to/export-data-bigquery-setup
You select the dataset where the cost data goes. Once it's in BigQuery, the data is fairly easy to query, which you can do with the Python BigQuery client API. Exporting also preserves a history in case you ever change billing provider. Of course, it will incur a storage cost that varies with your usage.
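As a sketch of that flow, a daily cost-per-service query over the export table might look like the following. The project, dataset, and table names are placeholders (the real export table name ends in your billing account ID); the column names follow the documented export schema:

```python
# Sketch: daily cost per service from a Cloud Billing export table in
# BigQuery. The table reference below is a placeholder; substitute the
# export table you configured.
DAILY_COST_QUERY = """
SELECT
  service.description AS service,
  DATE(usage_start_time) AS usage_day,
  SUM(cost) AS total_cost
FROM `my-project.billing.gcp_billing_export_v1_XXXXXX`
GROUP BY service, usage_day
ORDER BY usage_day DESC, total_cost DESC
"""

def daily_costs(client):
    """Run the query and return its rows; `client` is a
    google.cloud.bigquery.Client (pip install google-cloud-bigquery)."""
    return list(client.query(DAILY_COST_QUERY).result())

# Usage (requires credentials and an existing billing export):
#   from google.cloud import bigquery
#   for row in daily_costs(bigquery.Client()):
#       print(row.service, row.usage_day, row.total_cost)
```

Grouping by `sku.description` or `project.id` instead gives finer-grained breakdowns.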

Related

Is there some way in which we can view the usage statistics of existing QuickSight reports and dashboards?

Context: As a new intern at a firm, one of my responsibilities is to maintain a clean and ordered QuickSight Analysis and Datasets list.
There are a lot of existing analysis reports and dashboards on the firm's Amazon QuickSight account, dating back several years. There is a concern about deleting the old reports and their supporting datasets, which take up a lot of SPICE storage, because someone might still be using or accessing them. Is there a way to see the stats of each report (how many people accessed it, how many times it was used over the last month, etc.) that could help decide which analysis reports/datasets can be deleted? Please help.
This AWS blog post -- Using administrative dashboards for a centralized view of Amazon QuickSight objects -- discusses how BI administrators can use a QuickSight dashboard, Lambda functions, and other AWS services to create a centralized view of groups, users, and object access permissions, and to audit abnormal access.
It is mainly security focused, but it shows the general idea: the relevant information about access to QuickSight objects lives in AWS CloudTrail events.
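As a rough sketch of that approach, you could pull recent QuickSight dashboard-access events from CloudTrail and tally them per user. The `GetDashboard` event name and the `Username` field are assumptions based on what the blog post describes; verify them against your own CloudTrail event history:

```python
# Sketch: tally QuickSight dashboard views per user from CloudTrail.
# The "GetDashboard" event name and "Username" field are assumptions;
# inspect your own CloudTrail events to confirm.
import datetime
from collections import Counter

def count_views(events):
    """Tally events per user from CloudTrail lookup_events results."""
    views = Counter()
    for event in events:
        views[event.get("Username", "unknown")] += 1
    return views

def fetch_quicksight_events(days=30):
    """Pull recent GetDashboard events (requires boto3 and credentials)."""
    import boto3  # pip install boto3
    paginator = boto3.client("cloudtrail").get_paginator("lookup_events")
    start = datetime.datetime.utcnow() - datetime.timedelta(days=days)
    events = []
    for page in paginator.paginate(
        LookupAttributes=[{"AttributeKey": "EventName",
                           "AttributeValue": "GetDashboard"}],
        StartTime=start,
    ):
        events.extend(page["Events"])
    return events
```

Dashboards with no events over the window are candidates for deletion; note that CloudTrail lookup only covers the last 90 days unless you've configured a trail.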

Google Cloud Datastore Billing

gcloud datastore export --namespaces="(default)" gs://${BUCKET}
Will google charge us for datastore read operations when we do datastore exports? We'd like to run nightly backups, but we don't want to get charged an arm and a leg.
Yes. The charge may not be large unless your database contains a very large number of entities.
Refer to the table for pricing details. https://cloud.google.com/datastore/pricing
Source:
Export and import operations are charged for entity reads and writes at the rates shown in the table above. If you cancel an export or import, you will be charged for operations performed up until the time that the cancel request has propagated through Cloud Datastore.
https://cloud.google.com/datastore/pricing
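A quick back-of-the-envelope estimate, since exports are billed as one entity read per exported entity. The per-100,000-reads rate below is illustrative only; check the pricing page for your region's current figure:

```python
# Rough cost estimate for a Datastore export: one entity read is billed
# per exported entity. The rate below is an illustrative example; always
# verify against https://cloud.google.com/datastore/pricing.
READ_PRICE_PER_100K = 0.06  # USD per 100,000 entity reads (example rate)

def export_read_cost(entity_count, price_per_100k=READ_PRICE_PER_100K):
    """Estimated read charge for exporting `entity_count` entities."""
    return entity_count / 100_000 * price_per_100k

# A nightly export of 10 million entities at the example rate:
print(round(export_read_cost(10_000_000), 2))  # 6.0 (USD per night)
```

So for most moderately sized databases a nightly export is cheap, but at hundreds of millions of entities it adds up.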

Viewing GCP Cloud Datastore metrics, traces, query plan

What tools are there to investigate query performance in Cloud Datastore? I'm interested in seeing something similar to a SQL query plan, or any metrics beyond the total round-trip request time.
Unfortunately, these insights don't exist. Cloud Datastore only supports queries for which an index already exists, so every query is effectively an index scan and there is no query plan to inspect.
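Since the round-trip time is the only signal available, about all you can do is measure it client-side. A minimal sketch (the `client.query` usage in the comment is hypothetical, for the google-cloud-datastore client):

```python
# Minimal client-side timing helper: Datastore exposes no query plan,
# so measuring the round trip is about all you can instrument.
import time

def timed_call(fn, *args, **kwargs):
    """Return (result, elapsed_seconds) for a single call."""
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    return result, time.perf_counter() - start

# Hypothetical usage with the google-cloud-datastore client:
#   entities, elapsed = timed_call(
#       lambda: list(client.query(kind="User").fetch(limit=100)))
#   print(f"fetched {len(entities)} entities in {elapsed * 1000:.1f} ms")
```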

How to know how many node hours I have used on Google Cloud ML for prediction?

I tried to find out how to get the node-hour usage of my Google Cloud ML prediction API, but couldn't find a way. Is there a way to know the usage, other than looking at the bills?
Here is the API Documentation I referred to.
The documentation page you referenced is part of the Cloud Machine Learning Engine API documentation:
An API to enable creating and using machine learning models.
That API is for using the product itself; it doesn't expose billing information for the product.
For billing info you want to look at Cloud Billing and its API:
With this API, you can do any of the following.
For billing accounts:
...
Get billing information for a project.
However, from just a quick glance at the docs (I haven't used it yet), the API itself doesn't appear to directly provide the particular info you're looking for. Possible ways to get that info are:
using Billing Reports:
The Cloud Billing Reports page lets you view your Google Cloud Platform (GCP) usage costs at a glance and discover and analyze trends. The Reports page displays a chart that plots usage costs for all projects linked to a billing account. To help you view the cost trends that are important to you, you can select a data range, specify a time range, configure the chart filters, and group by project, product, or SKU.
Billing reports can help you answer questions like these:
How is my current month's GCP spending trending?
Export Billing Data to a File:
To access a detailed breakdown of your charges, you can export your daily usage and cost estimates automatically to a CSV or JSON file stored in a Google Cloud Storage bucket you specify. You can then access the data via the Cloud Storage API, CLI tool, or Google Cloud Platform Console.
Export Billing Data to BigQuery:
Billing export to BigQuery enables you to export your daily usage and cost estimates automatically throughout the day to a BigQuery dataset you specify. You can then access your billing data from BigQuery. You can also use this export method to export data to a JSON file.
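To make the file-export option concrete, here is a sketch that sums charges from a CSV billing export whose line item mentions a given product. The "Line Item" and "Cost" column names, and the sample line-item strings, are assumptions based on the file export format; inspect your own export to confirm:

```python
# Sketch: sum charges from a CSV billing export for line items matching
# a product. Column names ("Line Item", "Cost") and the sample line-item
# identifiers are assumptions; check them against your own export file.
import csv
import io

def total_cost_for(csv_text, line_item_substring):
    """Sum the Cost column over rows whose Line Item matches."""
    total = 0.0
    for row in csv.DictReader(io.StringIO(csv_text)):
        if line_item_substring in row["Line Item"]:
            total += float(row["Cost"])
    return total

sample = """Line Item,Cost
com.google.cloud/services/cloud-ml/online_prediction_node_hours,1.25
com.google.cloud/services/compute-engine/vm_core_hours,3.00
"""
print(total_cost_for(sample, "cloud-ml"))  # 1.25
```

In a real setup you would download the file from the Cloud Storage bucket first (e.g. with the google-cloud-storage client) and pass its text here.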

GCP quota and billing at user level

I am trying to see if there is a way to bill and restrict users based on their usage of resources across projects on our GCP instance. I know billing and quotas are officially at the project level, but we will have shared projects used by people in separate cost codes. I was thinking of building an API per cost code and having people access the resources only through there, but I'm not sure this is the best idea.
You can use a grouping system where you use labels to track your resources. Using labels such as cost center, service and environment will allow you to track your GCP resource usage and spending.
Then you can export your billing data to BigQuery where it can be filtered and segmented by labels.
In terms of notifications and restricting usage, you can set up budget alerts by following this documentation. Combined with Cloud Pub/Sub budget notifications and Cloud Functions, you can gain more control over spending, for example by capping costs and disabling billing automatically.
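To make the label idea concrete: in the BigQuery export, labels arrive as an array of key/value structs on each cost line. A sketch of grouping cost by a hypothetical `cost-center` label, with plain dicts standing in for the exported rows:

```python
# Sketch: aggregate exported billing rows by a "cost-center" label.
# Plain dicts emulate the BigQuery export rows, where `labels` is an
# array of {key, value} structs attached to each cost line; the
# "cost-center" key is a hypothetical labeling convention.
from collections import defaultdict

def cost_by_label(rows, label_key="cost-center"):
    """Sum cost per label value; rows without the label go to 'unlabeled'."""
    totals = defaultdict(float)
    for row in rows:
        value = next((l["value"] for l in row.get("labels", [])
                      if l["key"] == label_key), "unlabeled")
        totals[value] += row["cost"]
    return dict(totals)

rows = [
    {"cost": 2.0, "labels": [{"key": "cost-center", "value": "team-a"}]},
    {"cost": 1.5, "labels": [{"key": "cost-center", "value": "team-b"}]},
    {"cost": 0.5, "labels": []},
]
print(cost_by_label(rows))  # {'team-a': 2.0, 'team-b': 1.5, 'unlabeled': 0.5}
```

The same grouping can be done directly in SQL with `UNNEST(labels)` once the data is in BigQuery; the Python version is just to illustrate the shape of the data.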