I have been working on enabling billing export to BigQuery. I know how to set it up based on the documentation, but it looks like I can only set it up for one project at a time. There are close to 70 projects in GCP, and I need to load the billing data for all of them into BigQuery and create a billing report to see the most expensive projects and their services. My questions are:
How can I configure the billing export for all projects at once?
How can I get access to historical billing data?
Kindly answer my questions. Appreciate your help.
Thanks!
How can I configure the billing export for all projects at once?
Billing export to BigQuery is configured per billing account, not per project. Enable billing export once for each billing account; every project linked to that account is then included. If you have different billing accounts per project, you must enable export for each billing account.
How can I get access to historical billing data?
Billing data in BigQuery is only available after you enable export to BigQuery and the export has started. Data from before that point is not backfilled. For that reason, it is recommended to enable billing export at the time you create a billing account.
Another recommendation is to create a new project to hold the billing data.
For more details on the individual steps:
Set up Cloud Billing data export to BigQuery
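Once the export is flowing, a single query over the export table answers the "expensive projects and their services" question. Below is a minimal sketch using the BigQuery Python client; the table name is a placeholder for your own export table, and the exact column names follow the standard billing export schema.

```python
# Minimal sketch: rank projects and services by month-to-date cost from the
# standard billing export table. The table name below is a placeholder.
from google.cloud import bigquery

BILLING_TABLE = "billing-admin-project.billing_export.gcp_billing_export_v1_XXXXXX"

query = f"""
    SELECT
      IFNULL(project.id, '(none)') AS project_id,
      service.description AS service_name,
      ROUND(SUM(cost), 2) AS total_cost
    FROM `{BILLING_TABLE}`
    WHERE invoice.month = FORMAT_DATE('%Y%m', CURRENT_DATE())
    GROUP BY project_id, service_name
    ORDER BY total_cost DESC
    LIMIT 20
"""

client = bigquery.Client()
for row in client.query(query).result():
    print(f"{row.project_id:<30} {row.service_name:<30} ${row.total_cost:,.2f}")
```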
I run a small research group at a large university that manages hundreds of GCP accounts. The university acts as the Billing Administrator, and my research group was assigned a GCP "project" for all of our work. However, for privacy reasons, they cannot give me access to the Billing API because this would allow me to see the billing details for other labs.
Because we have trainees in our lab who WILL make mistakes, I would like to set up an automated system that monitors our current GCP bill and (1) sends notifications or (2) terminates all VMs when that bill reaches certain predefined limits. For example, if our monthly budget is $10k, then I would like to receive a notification at $5k, another notification at $10k, and I would like to terminate all VMs at $15k.
My problem is that in order to implement a system like this, I need access to the Billing API. I have already contacted my system administrator and they have said that this is impossible. Instead, they proposed that I write a script that lists all VMs and uses the Cost Calculator to estimate my monthly GCP bill.
However, this seems a little circuitous. When I am using the Google Cloud Console, I can see the total and forecasted costs for my project, so it seems that I should be able to access this information programmatically. However, I cannot find any information on how to do this, since all solutions require me to activate the Billing API. Any ideas?
There is no API to fetch the data you see in the Google Cloud Console. You will need to export the billing data and then process each row of data to generate reports.
There are two options that I can think of:
Option 1) Ask the admin to set up billing data export to BigQuery and grant you permission to query the billing tables. You can then query BigQuery to generate your own cost reports (see the sketch after this list).
Set up Cloud Billing data export to BigQuery
Option 2) Create a separate billing account for your project and grant you permission on it. A GCP organization can have multiple billing accounts tied to the same payments account. This option supports creating budget alerts.
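If the admin goes with Option 1, the threshold logic from the question can be implemented as a small script run from cron or Cloud Scheduler. This is only a sketch: the export table name and the dollar thresholds are placeholders, and the notification and VM-shutdown actions are left as stubs.

```python
# Hedged sketch: compare month-to-date spend from the billing export against
# predefined limits. Table name, thresholds and actions are all placeholders.
from google.cloud import bigquery

BILLING_TABLE = "billing-admin-project.billing_export.gcp_billing_export_v1_XXXXXX"
NOTIFY_AT = [5_000, 10_000]   # USD thresholds that should trigger a notification
SHUTDOWN_AT = 15_000          # USD threshold that should trigger VM shutdown

def month_to_date_spend(client: bigquery.Client) -> float:
    query = f"""
        SELECT SUM(cost) AS total
        FROM `{BILLING_TABLE}`
        WHERE invoice.month = FORMAT_DATE('%Y%m', CURRENT_DATE())
    """
    row = next(iter(client.query(query).result()))
    return float(row.total or 0.0)

def main():
    spend = month_to_date_spend(bigquery.Client())
    for limit in NOTIFY_AT:
        if spend >= limit:
            print(f"NOTIFY: spend ${spend:,.2f} crossed ${limit:,}")  # e.g. send an email or chat message
    if spend >= SHUTDOWN_AT:
        print(f"SHUTDOWN: spend ${spend:,.2f} crossed ${SHUTDOWN_AT:,}")
        # e.g. call the Compute Engine API here to stop instances

if __name__ == "__main__":
    main()
```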
I want to write a utility that fetches billing information for my project, but I am not able to find any specific GCP API to do this. I tried a couple of APIs like getBillingInfo, but these APIs only give information about the billing account, not the pricing. There is a mechanism to export billing data to a file, but I want to do it programmatically. Is there any API for that?
One possible way that I am aware of is to export Cloud Billing data to BigQuery. The documentation for setting this up can be found here: https://cloud.google.com/billing/docs/how-to/export-data-bigquery
Once the export is set up, the billing information, including prices and services, is available in near real time in the BigQuery table. Once it's in BigQuery, there are numerous ways of extracting the information. A good solution would be to use Data Studio on Google Cloud to send you a periodic report on your billing information.
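If you would rather build the periodic report yourself instead of using Data Studio, a small script can pull a per-service breakdown for your project and write it to a CSV. This is an illustrative sketch only; the project id and table name are placeholders.

```python
# Illustrative sketch: pull the last 7 days of cost per service for one project
# from the billing export table and write it to a CSV for a periodic report.
import csv
from google.cloud import bigquery

BILLING_TABLE = "billing-admin-project.billing_export.gcp_billing_export_v1_XXXXXX"
PROJECT_ID = "my-project"  # placeholder

query = f"""
    SELECT DATE(usage_start_time) AS day,
           service.description AS service_name,
           ROUND(SUM(cost), 2) AS cost
    FROM `{BILLING_TABLE}`
    WHERE project.id = @project_id
      AND usage_start_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
    GROUP BY day, service_name
    ORDER BY day, cost DESC
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[bigquery.ScalarQueryParameter("project_id", "STRING", PROJECT_ID)]
)

rows = bigquery.Client().query(query, job_config=job_config).result()
with open("billing_report.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["day", "service", "cost_usd"])
    for r in rows:
        writer.writerow([r.day, r.service_name, r.cost])
```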
I'm currently under an AWS Organizations subscription. I want to export the costs of my linked account as a CSV into an S3 bucket.
I have rights on my account: I can use Cost Explorer and create Budgets, but I'm unable to create a Cost & Usage Report, as shown in this screenshot:
The official doc doesn't cover this case. My question is: can the organization account enable the 'Cost and Usage Reports' billing feature for a single linked account? If not, is there a way to automate this CSV export into an S3 bucket?
The CUR (Cost and Usage Report) can only be generated in the billing (payer) account if you're under an organization. All you have to do is go to your billing account and enable the CUR from there. It's a simple process, and the reports will be delivered to an S3 bucket in the billing account. This is also best practice from a security and isolation perspective.
Note that the link you sent is for the DBR (Detailed Billing Report), which is already considered legacy.
https://docs.aws.amazon.com/cur/latest/userguide/what-is-cur.html
If you use the consolidated billing feature in AWS Organizations, the Amazon S3 bucket that you designate to receive the billing reports must be owned by the master account in your organization. You can't receive billing reports in a bucket that is owned by a member account. If you use consolidated billing, you can also have your costs broken down by member account.
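For what it's worth, the report definition can also be created programmatically from the payer account rather than through the console. The sketch below uses boto3's cur client, which is only served from us-east-1; the bucket, prefix and report name are placeholders, and the bucket must already carry the policy that allows AWS billing report delivery.

```python
# Hedged sketch: define a Cost and Usage Report from the payer/billing account.
# Bucket, prefix and report name are placeholders; the S3 bucket must already
# have the policy allowing AWS billing report delivery.
import boto3

cur = boto3.client("cur", region_name="us-east-1")  # the CUR API is only available in us-east-1

cur.put_report_definition(
    ReportDefinition={
        "ReportName": "org-cost-and-usage",
        "TimeUnit": "DAILY",
        "Format": "textORcsv",
        "Compression": "GZIP",
        "AdditionalSchemaElements": ["RESOURCES"],
        "S3Bucket": "my-billing-reports-bucket",
        "S3Prefix": "cur",
        "S3Region": "us-east-1",
        "RefreshClosedReports": True,
        "ReportVersioning": "CREATE_NEW_REPORT",
    }
)
```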
I'm implementing a Cloud Dataflow job on GCP that needs to deal with 2 GCP projects.
Both the input and output are BigQuery partitioned tables.
The issue I'm going through now is that I must read data from a project A and write it into a project B.
I haven't seen anything related to cross-project service accounts, and I can't give Dataflow two different credential keys either, which is a bit annoying.
I don't know if anyone else has gone through this kind of architecture, or how you dealt with it.
I think you can accomplish this with the following steps:
Create a dedicated service account in the project running the Dataflow job.
Grant the service account the Dataflow Worker and BigQuery Job User roles. The service account might need additional roles based on the full resource needs of the Dataflow job.
In Project A, grant the service account the BigQuery Data Viewer role to either the entire project or to specific datasets.
In Project B, grant the service account the BigQuery Data Editor role to either the entire project or to specific datasets.
When you start the Dataflow job, override the service account pipeline option, supplying the new service account (see the sketch below).
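To make the last step concrete, here is a rough sketch of what the pipeline and its options could look like with the Beam Python SDK. Every project, dataset, bucket and service-account name below is a placeholder, and the IAM grants from the steps above must already be in place.

```python
# Hypothetical sketch: a Dataflow job that reads a BigQuery table in project A
# and writes to a table in project B, running as a dedicated service account.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    runner="DataflowRunner",
    project="project-running-dataflow",          # project that owns the Dataflow job
    region="us-central1",
    temp_location="gs://my-dataflow-bucket/temp",
    # Override the worker service account with the dedicated one created above
    service_account_email="dataflow-cross@project-running-dataflow.iam.gserviceaccount.com",
)

with beam.Pipeline(options=options) as p:
    (
        p
        # Read from the source table in project A (BigQuery Data Viewer needed there)
        | "ReadFromProjectA" >> beam.io.ReadFromBigQuery(
            table="project-a:source_dataset.source_table")
        # Write to the destination table in project B (BigQuery Data Editor needed there)
        | "WriteToProjectB" >> beam.io.WriteToBigQuery(
            table="project-b:dest_dataset.dest_table",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER)
    )
```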
It is very simple: you need to grant the required permissions/access to your service account in both projects.
So you only need one service account that has the required access/permissions in both projects.
Hope it helps.