I am working on a solution to get resource costs from BigQuery.
We recently moved to a BigQuery implementation for collecting billing information, since CSV/JSON export has been deprecated by GCP.
However, BigQuery provides only SKU-level billing information, and our application needs to collect resource-level billing information as well.
Usage reports can be exported as a CSV to Cloud Storage and contain a measurement ID, resource ID, and usage units per resource. But in BigQuery there is no measurement ID to match against the usage reports to get resource-level billing information.
The billing result from BigQuery is as follows.
We need information on how we can collect resource-level costs along with the BigQuery line items.
The following query gives you user email and total bytes processed. Since we know the total bytes BigQuery has processed over the last N days, we can estimate the cost of BigQuery analysis accordingly:
SELECT
  user_email,
  SUM(total_bytes_processed)
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE creation_time BETWEEN TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 10 DAY)
    AND CURRENT_TIMESTAMP()
  AND job_type = "QUERY"
  AND end_time BETWEEN TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 10 DAY)
    AND CURRENT_TIMESTAMP()
GROUP BY user_email
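As a rough sketch, the per-user cost can then be estimated from the bytes processed. The $6.25-per-TiB on-demand rate below is an assumption; verify the current rate for your region and edition before relying on the numbers.

```python
# Rough on-demand cost estimate from bytes processed.
# ASSUMPTION: $6.25 per TiB on-demand analysis pricing -- verify the
# current rate for your region before relying on these numbers.
PRICE_PER_TIB_USD = 6.25
TIB = 1024 ** 4

def estimated_query_cost(total_bytes_processed: int) -> float:
    """Estimate on-demand analysis cost in USD for the bytes a query scanned."""
    return (total_bytes_processed / TIB) * PRICE_PER_TIB_USD

# Example: a user whose queries processed 2 TiB in the last 10 days.
print(round(estimated_query_cost(2 * TIB), 2))  # 12.5
```

Feed each `user_email` total from the query above into this helper to get a per-user estimate.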
For now, the only way to get resource-level costs in GCP is to use labels, which is what GCP recommends as well:
Use labels to gain visibility into GCP resource usage and spending
https://cloud.google.com/blog/topics/cost-management/use-labels-to-gain-visibility-into-gcp-resource-usage-and-spending
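For example, once the billing export is enabled, a query along these lines groups exported cost by a label key. The table name and the label key (`environment`) are placeholders for your own export table and labels, assuming the standard billing export schema where `labels` is an array of key/value records:

```python
# Sketch: group exported billing cost by a resource label.
# ASSUMPTIONS: standard billing export schema (cost column, labels as an
# array of key/value records); table name and label key are placeholders.
LABEL_KEY = "environment"  # hypothetical label applied to your resources

query = f"""
SELECT
  l.value AS label_value,
  SUM(cost) AS total_cost
FROM `my_project.billing_dataset.gcp_billing_export_v1_XXXXXX`,
  UNNEST(labels) AS l
WHERE l.key = '{LABEL_KEY}'
GROUP BY label_value
ORDER BY total_cost DESC
"""
# Run it with the BigQuery client (requires credentials), e.g.:
#   from google.cloud import bigquery
#   rows = bigquery.Client().query(query).result()
print(query)
```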
Related
Coming to storage metrics, only the options 'object count', 'total bytes', and 'total byte seconds' are provided. What if I need a few more metrics, such as objects whose retention period is about to expire, or the maximum size of an object in the bucket? How can I get such metrics using a monitoring dashboard?
I tried to replicate this in my GCP environment, but unfortunately there are no metrics for retention period or maximum object size in a bucket, so I recommend you create a feature request for additional Cloud Storage dashboard metrics. Regarding your other inquiry, you can check Metrics Explorer in the documentation: you can choose specific metrics to chart for a given metric category using the console configuration, or fetch the data with the query editor (MQL or PromQL), and then save the chart to a custom dashboard.
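As an illustration, an MQL query in Metrics Explorer's query editor along these lines charts a bucket's total bytes over time. This is a sketch only; the metric and resource-type names are the documented ones for Cloud Storage, but verify them (and MQL availability, as Google is steering users toward PromQL) in your console:

```
fetch gcs_bucket
| metric 'storage.googleapis.com/storage/total_bytes'
| group_by 1d, [value_total_bytes_mean: mean(value.total_bytes)]
| every 1d
```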
In GCP BigQuery, there are two types of pricing: 1) on-demand and 2) flat-rate.
I have two projects: one is configured with on-demand pricing, the other with flat-rate pricing. When I execute a query on the two projects, I need to be able to differentiate a query executed under on-demand pricing from one executed under flat-rate pricing. In the GUI, we can see the difference.
Flat Rate Pricing
On demand pricing
But when I call the BigQuery API through the BigQuery libraries to get the job object by job ID, I am unable to find any difference between those queries. I at least expected some reservation-related info in the flat-rate query, but no luck. I need this info in our monitoring tool to distinguish queries executed under on-demand pricing from those under flat-rate pricing.
One approach I found was getting this info through INFORMATION_SCHEMA, but I am more interested in the API level, through the BigQuery Java libraries.
In the Get Job API, you get a Job object, which contains a JobStatistics entry; that entry holds the reservation detail.
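As an illustration (in Python rather than Java, but against the same underlying REST Job resource), a job can be classified by checking for a reservation in its statistics. The field name `reservation_id` is taken from the Jobs REST resource; verify it against your client library's job representation:

```python
# Classify a BigQuery job as flat-rate (reservation-backed) or on-demand
# by inspecting the Job resource returned by jobs.get.
# ASSUMPTION: the statistics field is named 'reservation_id' as in the
# REST API; your client library may expose it under a different name.

def pricing_model(job_resource: dict) -> str:
    stats = job_resource.get("statistics", {})
    return "flat-rate" if stats.get("reservation_id") else "on-demand"

# Sample job resources, trimmed to the relevant fields:
flat_rate_job = {"statistics": {"reservation_id": "my-project:US.my-reservation"}}
on_demand_job = {"statistics": {"totalBytesProcessed": "1048576"}}

print(pricing_model(flat_rate_job))  # flat-rate
print(pricing_model(on_demand_job))  # on-demand
```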
Can the Google BigQuery Data Transfer Service allow me to transfer specific app data automatically?
For example, I have 10 apps in my Google Play Console, and I only want to transfer 3 of them to BigQuery. Is it possible to make this work, or is there another approach?
Also, I just read the pricing doc: the monthly charge is $25 per unique package name in the Installs_country table.
I don't quite understand how to calculate my cost with that example.
Thank you.
For your requirement, you can download the reports of a specific app to Cloud Storage by selecting that app in the Google Play Console, and then send them to BigQuery using the BigQuery Data Transfer Service. As for the Google Play cost calculation: it is $25 per month per unique package name stored in the Installs_country table in BigQuery.
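To make the arithmetic concrete, here is a minimal sketch of that pricing rule, assuming it really is a flat $25 per unique package name per month:

```python
# Google Play transfer cost sketch: $25/month per unique package name
# that appears in the Installs_country table (per the pricing note above).
MONTHLY_RATE_PER_PACKAGE_USD = 25

def monthly_transfer_cost(package_names) -> int:
    # Duplicates don't count twice -- only unique package names are billed.
    return MONTHLY_RATE_PER_PACKAGE_USD * len(set(package_names))

# Transferring 3 of your 10 apps (hypothetical package names):
apps = ["com.example.app1", "com.example.app2", "com.example.app3"]
print(monthly_transfer_cost(apps))  # 75
```

So transferring only 3 apps would cost about $75 per month under that rule.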
For selecting the specific app, follow the steps given below :
Go to the Play Console.
Click on Download Reports and select the type of report you want.
Under "Select an application," type and select the app for which you want to get the data.
Select the year and month for which you want to download the report.
If you are storing data in a Cloud Storage bucket, that will incur cost; the pricing for data transfer from one storage bucket to another can be checked in this link. Since you are also storing and querying data in BigQuery, that will be chargeable as well; for BigQuery pricing details, check this documentation. You can use the Billing Calculator to estimate your costs.
Is there any API (preferably Python) that can be used to get a usage-cost report for resources in GCP? The Billing APIs don't seem to return the costs of the resources being used.
You can export your Cloud Billing data to BigQuery:
https://cloud.google.com/billing/docs/how-to/export-data-bigquery
https://cloud.google.com/billing/docs/how-to/export-data-bigquery-setup
You select the dataset where the cost metadata goes. Once it is in BigQuery, it is fairly easy to query, which you can do with the Python BigQuery client API. It also ensures you keep a history in case you change billing provider. Of course, it will incur storage costs that vary with your usage.
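A sketch of that flow with the Python client follows. The table name is a placeholder for your own export table, and the aggregation helper works on plain dicts shaped like the export rows, so the client call (which needs credentials) is shown but commented out:

```python
# Sketch: sum exported billing cost per service from the BigQuery
# billing export. Table name below is a placeholder for your own export.

def cost_per_service(rows):
    """Aggregate rows shaped like the billing export (service.description, cost)."""
    totals = {}
    for row in rows:
        service = row["service"]["description"]
        totals[service] = totals.get(service, 0.0) + row["cost"]
    return totals

# With real data you would fetch rows like this (requires credentials):
#   from google.cloud import bigquery
#   client = bigquery.Client()
#   rows = client.query(
#       "SELECT service, cost "
#       "FROM `my_project.billing_dataset.gcp_billing_export_v1_XXXXXX`"
#   ).result()

sample = [
    {"service": {"description": "BigQuery"}, "cost": 1.25},
    {"service": {"description": "Cloud Storage"}, "cost": 0.40},
    {"service": {"description": "BigQuery"}, "cost": 0.75},
]
print(cost_per_service(sample))  # {'BigQuery': 2.0, 'Cloud Storage': 0.4}
```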
I am trying to get the total cost consumed by a particular resource ID in AWS Cost Explorer. I am able to get the monthly or daily usage cost using usage type, instance name, and product code, but I couldn't get it by resource ID.
I referred to this link for calculating total cost based on usage type and instance type, but I can't find any documentation for an API that works on resource ID.
You can set up detailed billing in AWS, which will dump all your usage details to CSV files in an S3 bucket. You can read more about it on the billing page. Afterwards, you can check the following view:
Billing & Cost Management -> Cost Explorer -> Monthly spend by service view
You can also create tags for each of your clients and then start receiving cost allocation reports that break down the per-instance cost for each of the tags you set up.
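Once the detailed billing / Cost and Usage Report files land in S3, resource-level cost can be summed offline. A sketch, assuming the standard CUR column names `lineItem/ResourceId` and `lineItem/UnblendedCost` (check your report's actual headers):

```python
import csv
import io

# Sum unblended cost per resource ID from a Cost and Usage Report CSV.
# ASSUMPTION: column names 'lineItem/ResourceId' and 'lineItem/UnblendedCost'
# as in the standard CUR; adjust to match your report's headers.

def cost_by_resource(csv_text: str) -> dict:
    totals = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        rid = row["lineItem/ResourceId"]
        totals[rid] = totals.get(rid, 0.0) + float(row["lineItem/UnblendedCost"])
    return totals

# Inline sample standing in for a CUR file downloaded from S3:
sample = """lineItem/ResourceId,lineItem/UnblendedCost
i-0abc123,0.25
i-0abc123,0.50
vol-0def456,0.125
"""
print(cost_by_resource(sample))  # {'i-0abc123': 0.75, 'vol-0def456': 0.125}
```

In practice you would download the report objects with `boto3` first and feed each file's text through the same function.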