Viewing GCP Cloud Datastore metrics, traces, query plan - google-cloud-platform

What tools are there to investigate query performance in Cloud Datastore? I'm interested in seeing something similar to a SQL query plan, or any metrics beyond the total round-trip request time.

Unfortunately, these insights don't exist. Cloud Datastore only supports queries for which an index exists, so there is no query planner to inspect: performance is determined largely by the size of the result set rather than by how the query is executed.

Related

Differentiating between queries executed on on-demand pricing and flat-rate pricing in BigQuery

In GCP BigQuery there are two types of pricing: 1) on-demand and 2) flat-rate.
I have two projects: one is configured with on-demand pricing, the other with flat-rate pricing. When I execute a query on each project, I need to be able to tell which pricing model it ran under. In the GUI, we can see the difference:
Flat Rate Pricing
On demand pricing
But when I call the BigQuery API through the client libraries to get the Job object by jobId, I am unable to find the difference between those queries. I at least expected some reservation-related info in the flat-rate query, but no luck. I need this info in our monitoring tool to identify whether a query was executed on on-demand or flat-rate pricing.
One approach I found was to get the info through INFORMATION_SCHEMA, but I am more interested in the API level, through the BigQuery Java libraries.
In the jobs.get API you receive a Job object. It contains a JobStatistics entry, and that entry contains the reservation detail.
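As a sketch of that approach in Python (the question asks about Java, but the same REST payload applies): the `reservation_id` field name in the statistics payload and the `my-job-id` value are assumptions to verify against your own jobs.get responses.

```python
def reservation_of(statistics: dict):
    """Return the reservation id from a jobs.get statistics payload,
    or None when the job ran under on-demand pricing.
    Field name taken from the REST JobStatistics reference; treat it
    as an assumption and verify against your own API responses."""
    return statistics.get("reservation_id")


if __name__ == "__main__":
    from google.cloud import bigquery  # pip install google-cloud-bigquery

    client = bigquery.Client()
    job = client.get_job("my-job-id")  # hypothetical job id
    # _properties holds the raw REST payload; relying on it is an
    # assumption about the client library's internals.
    stats = job._properties.get("statistics", {})
    rid = reservation_of(stats)
    print("flat-rate, reservation:", rid) if rid else print("on-demand")
```

A job with a reservation id ran under flat-rate pricing; an empty or missing field suggests on-demand.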

AWS CloudWatch Logs Insights vs AWS Athena

I have implemented some queries and dashboards using CloudWatch Logs Insights, with sample queries that use parse, fields, aggregate functions, etc.
I am curious: if we need complex queries such as joins, window functions, etc., can the Logs Insights query language express those scenarios the way SQL does? I didn't find any good examples in the docs here: https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/CWL_QuerySyntax.html
I have also read that CloudWatch Logs is about six times as expensive as AWS Athena.
Are there any suggestions for complex monitoring? What is the best way to do monitoring in AWS?

Can I query Logs Explorer directly from a cloud function?

For a monitoring project I'm using the Logs Router to send log data to a BigQuery table, so I can then query the table from Cloud Functions. Would it be possible to query Logs Explorer directly from Cloud Functions (i.e. without having to replicate my logs to BigQuery)?
Thanks
Yes, of course you can; there are even client libraries for that. Keep in mind, however, that by default your logs are kept for only 30 days. That may or may not be enough, depending on your use case.
You can create a custom log bucket with a different retention period, or sink the logs into BigQuery.
The main advantage of BigQuery is the ability to join the log data with other data in BigQuery and perform powerful analytics. But again, it depends on your use case.
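A minimal sketch of reading logs directly with the google-cloud-logging client, as runnable from a Cloud Function; the severity threshold and time window are example values.

```python
from datetime import datetime, timedelta, timezone


def build_filter(severity: str, minutes: int) -> str:
    """Build a Logs Explorer filter string for recent entries at or
    above the given severity (example values, adjust as needed)."""
    cutoff = datetime.now(timezone.utc) - timedelta(minutes=minutes)
    return f'severity>={severity} AND timestamp>="{cutoff.isoformat()}"'


if __name__ == "__main__":
    from google.cloud import logging  # pip install google-cloud-logging

    client = logging.Client()
    # list_entries queries the Logging API directly, no BigQuery needed.
    for entry in client.list_entries(filter_=build_filter("ERROR", 60)):
        print(entry.timestamp, entry.payload)
```

This reads whatever is within the log bucket's retention period, which is the 30-day caveat mentioned above.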

API for getting daily costs related to GCP

Is there any API (preferably Python) that can be used to get a usage cost report for GCP resources? The Billing APIs don't seem to return the costs of the resources being used.
You can export your Cloud Billing data to BigQuery:
https://cloud.google.com/billing/docs/how-to/export-data-bigquery
https://cloud.google.com/billing/docs/how-to/export-data-bigquery-setup
You select the dataset where the cost data goes. Once it's in BigQuery it's fairly easy to query, which you can do with the Python BigQuery client library. This also ensures you keep a history in case you change billing provider; of course, it will incur storage costs that vary with your usage.
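A sketch of querying the export with the Python client, assuming the standard billing-export schema (`service.description`, `cost`, `usage_start_time`); the project, dataset, and table names are placeholders, since GCP generates the table name for you.

```python
# Hypothetical names: replace with your project, dataset, and the
# generated billing export table (gcp_billing_export_v1_...).
SQL = """
SELECT service.description AS service,
       SUM(cost) AS total_cost
FROM `my-project.billing.gcp_billing_export_v1_XXXXXX`
WHERE DATE(usage_start_time) = CURRENT_DATE() - 1
GROUP BY service
ORDER BY total_cost DESC
"""

if __name__ == "__main__":
    from google.cloud import bigquery  # pip install google-cloud-bigquery

    client = bigquery.Client()
    # Yesterday's cost per service.
    for row in client.query(SQL).result():
        print(row.service, round(row.total_cost, 2))
```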

How to know how many node hours I have used on Google Cloud ML for prediction?

I tried to find out how to get the node-hour usage of my Google Cloud ML Prediction API, but couldn't find anything. Is there a way to know the usage, other than looking at the bills?
Here is the API documentation I referred to.
The documentation page you referenced is part of the Cloud Machine Learning Engine API documentation:
An API to enable creating and using machine learning models.
That API is for using the product itself; it doesn't contain billing information about the product.
For billing info you want to look at Cloud Billing and its API:
With this API, you can do any of the following.
For billing accounts:
...
Get billing information for a project.
However, from just a quick glance at the docs (I haven't used it yet), the API itself doesn't appear to directly provide the particular info you're looking for. Possible ways to get that info are:
using Billing Reports:
The Cloud Billing Reports page lets you view your Google Cloud
Platform (GCP) usage costs at a glance and discover and analyze
trends. The Reports page displays a chart that plots usage costs for
all projects linked to a billing account. To help you view the cost
trends that are important to you, you can select a data range, specify
a time range, configure the chart filters, and group by project,
product, or SKU.
Billing reports can help you answer questions like these:
How is my current month's GCP spending trending?
Export Billing Data to a File:
To access a detailed breakdown of your charges, you can export your
daily usage and cost estimates automatically to a CSV or JSON file
stored in a Google Cloud Storage bucket you specify. You can then
access the data via the Cloud Storage API, CLI tool, or Google Cloud
Platform Console.
Export Billing Data to BigQuery:
Billing export to BigQuery enables you to export your daily usage and
cost estimates automatically throughout the day to a BigQuery dataset
you specify. You can then access your billing data from BigQuery. You
can also use this export method to export data to a JSON file.
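To illustrate the file-export option: once the daily CSV lands in your Cloud Storage bucket, you can read and filter it from Python. The bucket/object names and the `Description`/`Cost` column names are assumptions; check them against your own export before relying on this.

```python
import csv
import io


def usage_cost(csv_text: str, keyword: str = "Prediction") -> float:
    """Sum the cost column for billing rows whose description mentions
    the given keyword (e.g. the ML prediction SKU). Column names are
    assumptions about the file-export format."""
    total = 0.0
    for row in csv.DictReader(io.StringIO(csv_text)):
        if keyword.lower() in row.get("Description", "").lower():
            total += float(row.get("Cost", 0) or 0)
    return total


if __name__ == "__main__":
    from google.cloud import storage  # pip install google-cloud-storage

    # Hypothetical bucket and object names.
    blob = (
        storage.Client()
        .bucket("my-billing-bucket")
        .blob("billing-2019-01-01.csv")
    )
    print(usage_cost(blob.download_as_text()))
```

The BigQuery export option works the same way in spirit, just with a SQL filter on the SKU description instead of a CSV scan.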