Differentiating between queries executed under on-demand pricing and flat-rate pricing in BigQuery - google-cloud-platform

In GCP BigQuery there are two types of pricing: 1) on-demand and 2) flat-rate.
I have two projects: one is configured with on-demand pricing, the other with flat-rate pricing. When I execute a query on each project, I need to be able to tell whether it ran under on-demand or flat-rate pricing. In the GUI we can see the difference:
[screenshot: Flat Rate Pricing]
[screenshot: On demand pricing]
But through the BigQuery libraries I am calling the BigQuery API to get the Job object by jobId, and there I am unable to find any difference between those queries; at the least I expected some reservation-related info for the flat-rate query, but no luck. I need this info in our monitoring tool to distinguish the queries executed under on-demand pricing from those executed under flat-rate pricing.
One approach I found was to get this info through INFORMATION_SCHEMA, but I am more interested in getting it at the API level, through the BigQuery Java libraries.

In the jobs.get API you get a Job object. In it you have a JobStatistics entry, and in that entry you have the reservation detail.
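A minimal Java sketch of that lookup. It assumes a recent version of the google-cloud-bigquery client that surfaces the REST statistics.reservationUsage field as JobStatistics.getReservationUsage(); verify that method against your library version. The project and job ids are placeholders.

import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.Job;
import com.google.cloud.bigquery.JobId;
import com.google.cloud.bigquery.JobStatistics;

public class ReservationCheck {
  public static void main(String[] args) {
    BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

    // Placeholders: substitute your own project id and the job id to inspect.
    Job job = bigquery.getJob(JobId.of("my-project", "my-job-id"));
    JobStatistics stats = job.getStatistics();

    // Assumption: getReservationUsage() mirrors statistics.reservationUsage in
    // the REST jobs.get response; confirm it exists in your client version.
    // A null/empty list suggests the job ran on-demand; a populated list names
    // the flat-rate reservation and the slot-milliseconds it consumed.
    if (stats.getReservationUsage() == null || stats.getReservationUsage().isEmpty()) {
      System.out.println("No reservation recorded -> likely on-demand pricing");
    } else {
      stats.getReservationUsage().forEach(r ->
          System.out.println("Reservation: " + r.getName() + ", slotMs: " + r.getSlotMs()));
    }
  }
}

If your library version does not expose it, the reservation_id column of INFORMATION_SCHEMA.JOBS_BY_PROJECT (the approach mentioned in the question) records the same information.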

Related

API for monitoring AWS Lambda and other instances' pricing in real time?

I am looking for a programmatic way to monitor my Lambda serverless environment's cost in real time, or retrospectively over the last x hours. I have looked at the Budgets API, but it always revolves around a predefined budget, which is not my use case. The other way I thought might work is to count Lambda executions and calculate cost according to the Lambda instance type. Any insight or direction on how to go about this programmatically would be highly appreciated.
From Using the AWS Cost Explorer API - AWS Billing and Cost Management:
The Cost Explorer API allows you to programmatically query your cost and usage data. You can query for aggregated data such as total monthly costs or total daily usage. You can also query for granular data, such as the number of daily write operations for DynamoDB database tables in your production environment.
Cost Explorer refreshes your cost data at least once every 24 hours, so it isn't "real-time".
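As a rough sketch of that API from Java (matching the main question's language), here is a hedged example using the AWS SDK v2 Cost Explorer client to pull daily Lambda costs. The date range is a placeholder, and the filter assumes the SERVICE dimension spells the service as "AWS Lambda".

import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.costexplorer.CostExplorerClient;
import software.amazon.awssdk.services.costexplorer.model.*;

public class LambdaDailyCost {
  public static void main(String[] args) {
    // Cost Explorer is served out of us-east-1.
    CostExplorerClient ce = CostExplorerClient.builder().region(Region.US_EAST_1).build();

    GetCostAndUsageRequest request = GetCostAndUsageRequest.builder()
        // Placeholder date range.
        .timePeriod(DateInterval.builder().start("2023-01-01").end("2023-01-08").build())
        .granularity(Granularity.DAILY)
        .metrics("UnblendedCost")
        // Restrict the report to Lambda charges only.
        .filter(Expression.builder()
            .dimensions(DimensionValues.builder()
                .key(Dimension.SERVICE)
                .values("AWS Lambda")
                .build())
            .build())
        .build();

    ce.getCostAndUsage(request).resultsByTime().forEach(day ->
        System.out.println(day.timePeriod().start() + " -> "
            + day.total().get("UnblendedCost").amount() + " USD"));
  }
}

Because of the 24-hour refresh noted above, treat the figures as retrospective rather than real-time.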

GCP BigQuery resource level costs aggregation

I am working on a solution to get resource-level costs from BigQuery.
We have recently moved to the BigQuery implementation for collecting billing information, since the CSV/JSON export has been deprecated in GCP.
Though BigQuery provides only SKU-level billing information, our application needs to collect resource-level billing information as well.
Usage reports can be exported as CSVs to Cloud Storage, and they contain a measurement ID, a resource ID, and the usage units per resource. But in BigQuery we don't have a measurement ID to match against the usage reports and get resource-level billing information.
[screenshot: billing result from BigQuery]
I need information on how we can collect resource-level costs along with the BigQuery line items.
The following query can give you the user email and the total bytes processed. Since we know the total bytes BigQuery has processed over the last N days, we can estimate the cost of BigQuery analysis accordingly:
SELECT
  user_email,
  SUM(total_bytes_processed)
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE creation_time BETWEEN TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 10 DAY)
    AND CURRENT_TIMESTAMP()
  AND job_type = "QUERY"
  AND end_time BETWEEN TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 10 DAY)
    AND CURRENT_TIMESTAMP()
GROUP BY user_email
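To turn those bytes into an estimate programmatically, here is a sketch in Java that runs the same query through the BigQuery client and applies an assumed on-demand rate of 5 USD per TiB scanned; check the current price list for your region before relying on that number.

import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.QueryJobConfiguration;
import com.google.cloud.bigquery.TableResult;

public class EstimateQueryCost {
  public static void main(String[] args) throws Exception {
    BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

    String sql = "SELECT user_email, SUM(total_bytes_processed) AS total_bytes "
        + "FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT "
        + "WHERE creation_time BETWEEN TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 10 DAY) "
        + "AND CURRENT_TIMESTAMP() AND job_type = 'QUERY' "
        + "GROUP BY user_email";

    TableResult result = bigquery.query(QueryJobConfiguration.newBuilder(sql).build());

    // Assumed on-demand rate: 5 USD per TiB scanned.
    double usdPerTib = 5.0;
    double bytesPerTib = Math.pow(2, 40);

    result.iterateAll().forEach(row -> {
      double bytes = row.get("total_bytes").getDoubleValue();
      System.out.printf("%s scanned %.2f GiB, ~%.4f USD%n",
          row.get("user_email").getStringValue(),
          bytes / Math.pow(2, 30),
          bytes / bytesPerTib * usdPerTib);
    });
  }
}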
For now, the only way to get resource-level costs in GCP is to use labels, which GCP itself recommends:
Use labels to gain visibility into GCP resource usage and spending
https://cloud.google.com/blog/topics/cost-management/use-labels-to-gain-visibility-into-gcp-resource-usage-and-spending

API for getting daily costs related to GCP

Is there any API (preferably Python) that can be used to get a usage cost report for resources in GCP? The Billing APIs don't seem to return the costs of the resources being used.
You can export your Cloud Billing data to BigQuery:
https://cloud.google.com/billing/docs/how-to/export-data-bigquery
https://cloud.google.com/billing/docs/how-to/export-data-bigquery-setup
You select the dataset where the cost metadata goes. Once it is in BigQuery it's fairly easy to query, which you can do with the Python BigQuery client API. It also ensures you keep a history in case you change billing provider; of course, it will incur a storage cost that varies with your usage.
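The same query works from any BigQuery client; here is a minimal sketch in Java (the main question's language), assuming a hypothetical billing export table `my-project.billing_ds.gcp_billing_export_v1_XXXXXX` (the dataset and table-name suffix depend on your export setup).

import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.QueryJobConfiguration;

public class DailyGcpCost {
  public static void main(String[] args) throws Exception {
    BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

    // Placeholder table: billing export tables are named
    // gcp_billing_export_v1_<billing account id> inside the dataset you chose.
    String sql = "SELECT DATE(usage_start_time) AS day, service.description AS service, "
        + "SUM(cost) AS cost "
        + "FROM `my-project.billing_ds.gcp_billing_export_v1_XXXXXX` "
        + "WHERE usage_start_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY) "
        + "GROUP BY day, service ORDER BY day, cost DESC";

    bigquery.query(QueryJobConfiguration.newBuilder(sql).build())
        .iterateAll()
        .forEach(row -> System.out.println(row.get("day").getStringValue() + " "
            + row.get("service").getStringValue() + ": "
            + row.get("cost").getDoubleValue()));
  }
}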

If Google BigQuery returns an error, am I paying for using the resources?

I'm using a CROSS JOIN across two tables in Google BigQuery, and the following error appears:
Resources exceeded during query execution: Your project or organization exceeded the maximum disk and memory limit available for shuffle operations. Consider provisioning more slots, reducing query concurrency, or using more efficient logic in this job.
Is this kind of query being charged by Google?
Based on the documentation section "On-demand pricing", I would say that you should not be charged for queries that return an error:
You aren't charged for queries that return an error or for queries that retrieve results from the cache.
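One way to verify this per job, sketched in Java against the BigQuery client (the job id is a placeholder): per the quoted docs, cache hits and failed jobs should both report zero billed bytes.

import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.Job;
import com.google.cloud.bigquery.JobId;
import com.google.cloud.bigquery.JobStatistics.QueryStatistics;

public class BilledBytesCheck {
  public static void main(String[] args) {
    BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

    // Placeholder job id: inspect the job that errored (or hit the cache).
    Job job = bigquery.getJob(JobId.of("my-job-id"));
    QueryStatistics stats = job.getStatistics();

    // Errored queries and cache hits should both show 0 billed bytes.
    System.out.println("Cache hit: " + stats.getCacheHit());
    System.out.println("Bytes billed: " + stats.getTotalBytesBilled());
    if (job.getStatus().getError() != null) {
      System.out.println("Job failed: " + job.getStatus().getError().getMessage());
    }
  }
}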

Is there any API to get the total cost for a particular resourceID in AWS?

I am trying to get the total cost consumed by a particular resource ID in AWS Cost Explorer. I am able to get the monthly or daily usage cost using the usage type, instance name & product code, but I couldn't get it using the resource ID.
I referred to this link for calculating total cost based on usage type and instance type, but I can't find documentation for any API keyed by resource ID.
You can set up detailed billing in AWS, which will dump all your usage details to CSV files in an S3 bucket. You can read more about it on the billing page. Afterwards you can check the following view:
Billing & Cost Management -> Cost Explorer -> Monthly Spend by Service View
You can also create tags for each of your clients and then start receiving cost allocation reports that break down the per-instance cost for each of the tags you set up; a sketch of querying by tag follows.
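A hedged Java sketch of that tag-based breakdown via the AWS SDK v2 Cost Explorer client. It assumes a hypothetical cost allocation tag key "Client" that has already been activated in the Billing console; the date range is a placeholder.

import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.costexplorer.CostExplorerClient;
import software.amazon.awssdk.services.costexplorer.model.*;

public class CostByClientTag {
  public static void main(String[] args) {
    CostExplorerClient ce = CostExplorerClient.builder().region(Region.US_EAST_1).build();

    GetCostAndUsageRequest request = GetCostAndUsageRequest.builder()
        // Placeholder date range.
        .timePeriod(DateInterval.builder().start("2023-01-01").end("2023-02-01").build())
        .granularity(Granularity.MONTHLY)
        .metrics("UnblendedCost")
        // "Client" is a hypothetical cost allocation tag key; it must be
        // activated in the Billing console before Cost Explorer can group by it.
        .groupBy(GroupDefinition.builder()
            .type(GroupDefinitionType.TAG)
            .key("Client")
            .build())
        .build();

    ce.getCostAndUsage(request).resultsByTime().forEach(month ->
        month.groups().forEach(g -> System.out.println(
            g.keys() + " -> " + g.metrics().get("UnblendedCost").amount() + " USD")));
  }
}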