GCP cloud billing export into BQ through terraform - google-cloud-platform

Can we automate GCP billing export into BQ through Terraform?
I tried the Terraform code below, but it's not working, so I'm not sure whether exporting GCP billing data into BigQuery is possible through Terraform at all.
resource "google_logging_billing_account_sink" "billing-sink" {
  name                   = "billing-sink"
  description            = "Billing export"
  billing_account        = "**********"
  unique_writer_identity = true
  destination            = "bigquery.googleapis.com/projects/${var.project_name}/datasets/${google_bigquery_dataset.billing_export.dataset_id}"
}

resource "google_project_iam_member" "log_writer" {
  project = var.project_name
  role    = "roles/bigquery.dataEditor"
  member  = google_logging_billing_account_sink.billing-sink.writer_identity
}

Unfortunately, there is no such option. This has already been raised as a feature request on GitHub and is tracked as an enhancement; there is currently no ETA. The only related options I can see in Terraform are google_logging_billing_account_sink and the pattern for automating log exports to BigQuery with Terraform.
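Note that a billing account log sink only exports billing-related Cloud Logging entries, not the Cloud Billing cost export itself, which still has to be enabled in the console. What Terraform can do today is pre-create the destination dataset. A minimal sketch, where the dataset ID, location, and project variable are assumptions rather than anything from the original post:

```hcl
# Pre-create the BigQuery dataset that the (manually enabled) Cloud Billing
# export will write into. Dataset ID and location are illustrative.
resource "google_bigquery_dataset" "billing_export" {
  dataset_id = "billing_export"
  project    = var.project_name
  location   = "US"
}
```

Once the dataset exists, enabling the export in the Cloud Billing console is a one-time manual step.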

Related

running into error 403 during terraform apply

Running terraform apply, I get the error below.
{"error":{"code":403,"message":"The billing account for the owning project is disabled in state absent","errors":[{"message":"The billing account for the owning project is disabled in state absent","domain":"global","reason":"accountDisabled","locationType":"header","location":"Authorization"}]}}: timestamp=2022-12-31T00:04:43.690-0500
I added the billing account to the project, and I am able to run gcloud commands from the shell without any errors using the same service account.
terraform_gcp % gcloud auth activate-service-account --key-file=sakey.json
Activated service account credentials for: [gcp-terraform@saproject.iam.gserviceaccount.com]
terraform_gcp % gsutil ls
gs://mygcptfstatebucket/
terraform_gcp % gcloud compute instances list
Listed 0 items.
My main.tf is:
terraform {
  required_providers {
    google = {
      source  = "hashicorp/google"
      version = "4.47.0"
    }
  }
}

provider "google" {
  project = "my-gcp-project"
  region  = "us-east1"
  zone    = "us-east1-b"
}
Any insights into this error?
Adding the block below fixed it, so it seems the issue was with my configuration:
data "google_project" "project_name" {
  project_id = "projectid"
}
and referencing project_id in the resource block.
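The fix described above can be sketched as follows; the storage bucket resource is a hypothetical example added purely to show the reference, not something from the original question:

```hcl
data "google_project" "project_name" {
  project_id = "projectid"
}

# Example resource referencing the data source's project_id, so the resource
# is tied to the looked-up project rather than the provider default.
resource "google_storage_bucket" "state" {
  name     = "mygcptfstatebucket"
  project  = data.google_project.project_name.project_id
  location = "US"
}
```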

Creating a terraform with Eventarc with its destination to cloud function

https://registry.terraform.io/providers/hashicorp/google/latest/docs/resources/eventarc_trigger
I was looking to create a Terraform file to deploy an Eventarc trigger with its destination set to a Cloud Function. The Terraform docs say this is not available to configure. Does this mean I can only deploy the Eventarc trigger with a generic Cloud Function and would need to configure the rest through the GUI? Or is there another solution that lets me deploy it fully through Terraform?
Or could I instead create it by deploying a Cloud Function through Terraform? If so, how would I write the event_trigger block for the Eventarc trigger? Below is my guess at what it would be:
resource "google_cloudfunctions_function" "cloudfunc-name" {
  name                  = "cloudfunc-name"
  description           = "cloud func desc"
  runtime               = "python39"
  project               = "googleproject"
  region                = "us-central1"
  available_memory_mb   = 256
  max_instances         = 10
  timeout               = 300
  entry_point           = "helloworld_entry"
  source_archive_bucket = "filepath_to_bucket"
  source_archive_object = google_storage_bucket_object.function_zip_bucket_object.name

  event_trigger {
    event_type      = "google.cloud.audit.log.v1.written"
    event           = "google.cloud.bigquery.v2.TableService.InsertTable"
    receive_event   = "us-central1"
    service_account = "someserviceaccount@gserviceaccount.com"

    failure_policy {
      retry = false
    }
  }
}
Eventarc can only trigger Cloud Functions 2nd generation. Why? Because Cloud Functions 2nd gen is backed by Cloud Run.
So, you have to use the 2nd gen Terraform resource to deploy your function and then reference the function in the Eventarc trigger. I haven't tested this recently. In any case, if it doesn't work with the Cloud Functions config on the Eventarc trigger, you can replace it with the Cloud Run config (provide the name of the function, which is also the name of the Cloud Run service).
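A sketch of the 2nd gen approach, where the trigger is declared directly on the function via google_cloudfunctions2_function. All names, the bucket/object, and the service account are placeholders carried over from the question, not verified values:

```hcl
resource "google_cloudfunctions2_function" "cloudfunc-name" {
  name     = "cloudfunc-name"
  location = "us-central1"
  project  = "googleproject"

  build_config {
    runtime     = "python39"
    entry_point = "helloworld_entry"
    source {
      storage_source {
        bucket = "filepath_to_bucket"
        object = google_storage_bucket_object.function_zip_bucket_object.name
      }
    }
  }

  service_config {
    max_instance_count = 10
    available_memory   = "256M"
    timeout_seconds    = 300
  }

  # Eventarc trigger declared inline: fires on BigQuery audit logs for
  # table creation, roughly matching the intent of the guessed block above.
  event_trigger {
    trigger_region        = "us-central1"
    event_type            = "google.cloud.audit.log.v1.written"
    retry_policy          = "RETRY_POLICY_DO_NOT_RETRY"
    service_account_email = "someserviceaccount@gserviceaccount.com"

    event_filters {
      attribute = "serviceName"
      value     = "bigquery.googleapis.com"
    }
    event_filters {
      attribute = "methodName"
      value     = "google.cloud.bigquery.v2.TableService.InsertTable"
    }
  }
}
```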

How can I create a project in gcp via terraform?

I am planning to use Terraform to deploy to GCP, and I have read the instructions on how to set it up:
provider "google" {
  project = "{{YOUR GCP PROJECT}}"
  region  = "us-central1"
  zone    = "us-central1-c"
}
It requires a project name in the provider configuration, but I am planning to create the project via Terraform, like the code below:
resource "google_project" "my_project" {
  name       = "My Project"
  project_id = "your-project-id"
  org_id     = "1234567"
}
How can I use Terraform without a pre-created project?
Take a look at this tutorial (from the Community):
Creating Google Cloud projects with Terraform
This tutorial assumes that you already have a Google Cloud account set up for your organization and that you are allowed to make organization-level changes in the account
The first step, for example, is to set up your environment variables with your Organization ID and your billing account ID, which will allow you to create projects using Terraform:
export TF_VAR_org_id=YOUR_ORG_ID
export TF_VAR_billing_account=YOUR_BILLING_ACCOUNT_ID
export TF_ADMIN=${USER}-terraform-admin
export TF_CREDS=~/.config/gcloud/${USER}-terraform-admin.json
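With those variables exported, the project can be created without any pre-existing project configured on the provider. A sketch, assuming TF_VAR_org_id and TF_VAR_billing_account feed the two variables below:

```hcl
variable "org_id" {}
variable "billing_account" {}

# Creates the project and attaches the billing account. The provider block
# can omit `project` here, since the project is the thing being created.
resource "google_project" "my_project" {
  name            = "My Project"
  project_id      = "your-project-id"
  org_id          = var.org_id
  billing_account = var.billing_account
}
```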

How to run terraform script in aws sub organization?

I have a Terraform script that builds infrastructure in my main AWS account. My AWS account has sub-organization (member) accounts, and I need to run my Terraform script to build infrastructure in one of those sub-accounts. How can I do it?
The best practice is to create a "TerraformRole" in your sub-account that can be assumed by the "TerraformRole" in your master AWS account.
You then configure the AWS provider to assume this role.
provider "aws" {
  version = "~> 2.33.0"
  region  = var.region

  assume_role {
    role_arn = "arn:aws:iam::${var.account_id}:role/${var.terraform_role_name}"
  }
}
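For completeness, the role on the sub-account side needs a trust policy allowing the master account's role to assume it. A hypothetical sketch; the role names and the master-account variable are illustrative assumptions:

```hcl
# Role in the sub-account that trusts the master account's TerraformRole,
# so that the provider's assume_role block above can work.
resource "aws_iam_role" "terraform" {
  name = "TerraformRole"

  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Effect = "Allow"
      Action = "sts:AssumeRole"
      Principal = {
        AWS = "arn:aws:iam::${var.master_account_id}:role/TerraformRole"
      }
    }]
  })
}
```

The role would also need permission policies attached for whatever resources Terraform manages in the sub-account.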

Cannot deploy public api on Cloud Run using Terraform

Terraform now supports Cloud Run, as documented here, and I'm trying the example code below.
resource "google_cloud_run_service" "default" {
  name     = "tftest-cloudrun"
  location = "us-central1"
  provider = "google-beta"

  metadata {
    namespace = "my-project-name"
  }

  spec {
    containers {
      image = "gcr.io/cloudrun/hello"
    }
  }
}
Although it deploys the sample hello service with no error, when I access the auto-generated URL it returns a 403 (Forbidden) response.
Is it possible to create a public Cloud Run API using Terraform?
(When I create the same service using the GUI, GCP provides an "Allow unauthenticated invocations" option under the "Authentication" section, but there seems to be no equivalent option in the Terraform docs...)
Just add the following code to your Terraform script, which will make it publicly accessible:
data "google_iam_policy" "noauth" {
  binding {
    role = "roles/run.invoker"
    members = [
      "allUsers",
    ]
  }
}

resource "google_cloud_run_service_iam_policy" "noauth" {
  location    = google_cloud_run_service.default.location
  project     = google_cloud_run_service.default.project
  service     = google_cloud_run_service.default.name
  policy_data = data.google_iam_policy.noauth.policy_data
}
You can also find this here
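An alternative sketch, granting the binding additively with google_cloud_run_service_iam_member instead of replacing the whole policy (the resource references assume the google_cloud_run_service.default from the question):

```hcl
# Grants roles/run.invoker to allUsers without overwriting any other
# bindings on the service, unlike the authoritative *_iam_policy resource.
resource "google_cloud_run_service_iam_member" "public_invoker" {
  location = google_cloud_run_service.default.location
  project  = google_cloud_run_service.default.project
  service  = google_cloud_run_service.default.name
  role     = "roles/run.invoker"
  member   = "allUsers"
}
```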
Here the deployment is based only on the Knative serving spec. Cloud Run (fully managed) implements these specs but has its own internal behavior, like the role check linked to IAM (not possible with Knative on a K8s cluster, where this is replaced by private/public services). The namespace on Cloud Run (fully managed) is the project ID, a workaround to identify the project, not a real K8s namespace.
So, the latest news I have from Google (I'm a Cloud Run alpha tester) is that they are working on integrating Cloud Run into Deployment Manager and Terraform. I don't have a deadline, sorry.