GCP inter-project IAM with terraform - google-cloud-platform

I'm new to GCP and Terraform, and I need some explanation of the topic in the title.
My problem:
I have 2 (or more) GCP projects under the same organization.
I want a Cloud Run service in project A to write to a bucket in project B.
I have two terraform projects, one for each GCP project.
My question is: how can I make things work?
Thanks in advance.
I created the bucket in project B.
I created the cloud run in project A.
I created a service account in project A for the cloudrun.
In project B I created the binding, but something is not clear to me...

Add this to project B's Terraform:
resource "google_storage_bucket_iam_member" "grant_access_to_sa_from_project_a_to_this_bucket" {
provider = google
bucket = "<my_project_b_bucket_name"
role = "roles/storage.objectViewer"
member = "serviceAccount:my_service_account#project_a.iam.gserviceaccount.com"
}
Specify the role according to what you need. The list of GCS roles is here.
The docs for GCS bucket IAM policies are here.
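To connect the two Terraform projects, project B's binding needs the email of the service account created in project A. A minimal sketch of the project A side (resource names, region, and image below are placeholders, not from the question):

# Project A's Terraform: dedicated SA for the Cloud Run service.
resource "google_service_account" "cloud_run_sa" {
  project      = "<project_a_id>"
  account_id   = "my-service-account"
  display_name = "Service account for the Cloud Run service"
}

resource "google_cloud_run_service" "app" {
  name     = "my-app"
  location = "us-central1"
  project  = "<project_a_id>"

  template {
    spec {
      # Run as the dedicated SA instead of the default compute SA.
      service_account_name = google_service_account.cloud_run_sa.email

      containers {
        image = "gcr.io/<project_a_id>/my-app"
      }
    }
  }
}

Since the two Terraform projects have separate states, pass the resulting email (my-service-account@<project_a_id>.iam.gserviceaccount.com) into project B's configuration, e.g. as a variable or via terraform_remote_state. Also note that roles/storage.objectViewer is read-only; for a Cloud Run service that writes to the bucket, roles/storage.objectCreator or roles/storage.objectAdmin is the role to grant.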

Related

How can I add IAM role for other project service account in Google Cloud in different project using Terraform

To be specific, I have two projects, A and B. I want to grant an IAM role in project B to a service account from project A. I'm executing the Terraform script from a Bitbucket pipeline.
Below is the resource block I tried to implement.
resource "google_project_iam_member" "role1" {
project = var.project
role = "roles/dialogflow.admin"
member = "user:cui-server-service-account#cproject.iam.gserviceaccount.com"
}
project is the variable for project B and cproject is the variable for project A; I pass the project names in during the Bitbucket pipeline execution.
member = "serviceAccount:<<service account email>>"
The prefix should be serviceAccount rather than the user prefix in your example - see how the member/members argument is described.
Also check that the user (or service account) that runs terraform apply has the IAM roles needed to grant what you would like.
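Applied to the block above, a corrected sketch (assuming cproject holds project A's ID, which is how the question describes it):

resource "google_project_iam_member" "role1" {
  project = var.project # project B
  role    = "roles/dialogflow.admin"
  # serviceAccount: prefix, not user:
  member  = "serviceAccount:cui-server-service-account@${var.cproject}.iam.gserviceaccount.com"
}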

AWS cross account access for code commit in build AWS code job source

I have two AWS accounts A, B.
All my code commit repositories are present in account A.
Now I want to create an AWS CodeBuild job in account B for repositories that live in account A.
I am trying to figure out how to get account A's repositories to show up in the source Repository field when creating the CodeBuild job in account B.
I have followed the tutorial below only up to the second section.
https://docs.aws.amazon.com/codecommit/latest/userguide/cross-account.html
Any help will be appreciated.
You can configure access to CodeCommit repositories for IAM users and groups in another AWS account. This is often referred to as cross-account access.
Mainly, you need to do the following:
Create a policy and role in the repository account with the needed permissions.
Create a policy attached to your CodeBuild role allowing access to the created role as the Resource, e.g.:
"Resource": "arn:aws:iam::REPO_ACCOUNT:role/MyCrossAccountRepositoryContributorRole"
This will enable the CodeBuild to access the needed CodeCommit repository.
This page explains this very well: Configure cross-account access to an AWS CodeCommit repository using roles.
Also, check this blog post, which explains what you want in a little more detail:
AWS CodePipeline with a Cross-Account CodeCommit Repository.
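In Terraform terms, a minimal sketch of that CodeBuild-side policy (the resource and role names are placeholders; REPO_ACCOUNT is the account that owns the repositories):

resource "aws_iam_role_policy" "codebuild_cross_account" {
  name = "codebuild-cross-account-codecommit"
  role = aws_iam_role.codebuild.id # your existing CodeBuild service role

  # Allow CodeBuild to assume the contributor role in the repo account.
  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Effect   = "Allow"
      Action   = "sts:AssumeRole"
      Resource = "arn:aws:iam::REPO_ACCOUNT:role/MyCrossAccountRepositoryContributorRole"
    }]
  })
}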

Create an iam role under specific aws account using terraform

I'm really new to Terraform and have been stuck on this for a while.
I'm using an external module which creates an aws_iam_role and the corresponding policies. In my Terraform code I just use the following block to instantiate the module, but how can I make sure the roles are created under a specific AWS account? I have multiple AWS accounts right now, but I want the external module's resources in only one of them. The account ID for the target AWS account is known. Thanks!
module "<external_module>" {
source = "git::..."
...
}
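One common way to pin a module's resources to a specific account - a sketch under the assumption that you can assume a role in the target account; the alias, region, and role name are placeholders - is an aliased provider passed in via the providers map:

# Provider pinned to the target account.
provider "aws" {
  alias  = "target"
  region = "us-east-1"

  assume_role {
    role_arn = "arn:aws:iam::<target_account_id>:role/terraform"
  }
}

module "<external_module>" {
  source = "git::..."

  # Make the module create its resources with the pinned provider.
  providers = {
    aws = aws.target
  }
}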

Error 403: Storage objects forbidden in GCP

I'm trying to create a brand-new sandbox project in GCP for easy deployment and upgrades. Using Terraform, I am creating a GKE cluster. The issue is that the Terraform scripts were written for the service accounts of a project named, let's say, NP-H. Now I am trying to create clusters using the same scripts in a project named, let's say, NP-S.
I ran terraform init and got an
error 403: XXX.serviceaccount does not have storage.object.create access to google cloud storage objects., forbidden.
storage: object doesn't exist.
Now, is the problem with the Terraform script or with the service account permissions?
If it is the Terraform script, what changes do I need to make?
PS: I was able to create buckets and upload files to Cloud Storage…
Two ways you can store credentials:
provider "google" {
credentials = file("service-account-file.json")
project = "PROJECT_ID"
region = "us-central1"
zone = "us-central1-c"
}
or
export GOOGLE_APPLICATION_CREDENTIALS="/home/user/Downloads/service-account-file.json"
Make sure the service account is from project NP-S (Menu > IAM & admin > Service accounts) and has the proper permissions (Menu > IAM & admin > IAM > ...).
cat service-account-file.json
and make sure the email is from the correct project ID. You can do a quick test with the Owner/Editor role to isolate the issue if need be, as those roles have the most permissions.
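If it turns out to be a permissions problem, a sketch of granting the Terraform service account object access in NP-S (the role is an assumption based on the storage.object.create error, and an account that already has IAM admin rights would have to apply it):

# Grant the SA that Terraform runs as object access in project NP-S.
resource "google_project_iam_member" "terraform_sa_storage" {
  project = "<np-s-project-id>"
  role    = "roles/storage.objectAdmin"
  member  = "serviceAccount:<terraform-sa>@<np-s-project-id>.iam.gserviceaccount.com"
}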
If you're using service account impersonation, do this:
terraform {
  backend "gcs" {
    bucket                      = "<your-bucket>"
    prefix                      = "<prefix>/terraform.tfstate"
    impersonate_service_account = "<your-service-account>@<your-project-id>.iam.gserviceaccount.com"
  }
}
Source: Updating remote state files with a service account

Sharing custom images within an organization in GCP

I am trying to share a custom image in GCP between projects in the organization.
1) Project A
2) Project B
All my custom Images are in project A.
I would like to share images of project A to Project B
As per the documentation I ran the following command to share images to project B
gcloud projects add-iam-policy-binding projecta --member serviceAccount:xxxxxx@cloudservices.gserviceaccount.com --role roles/compute.imageUser
I am using Terraform to provision the instances. In terraform, I am specifying to take the image from project A.
boot_disk {
  initialize_params {
    image = "projects/project_A/global/images/custom_image"
  }
}
I am getting the below error
Error: Error creating instance: googleapi: Error 403: Required 'compute.images.useReadOnly' permission for 'projects/project_A/global/images/custom_image', forbidden
Can someone please help me out....
I guess the documentation is for Deployment Manager, not for Terraform. The command you ran granted the role to the service account xxxxxx@cloudservices.gserviceaccount.com, but Terraform does not use that account by default.
You need to make sure Terraform has enough permission. You may supply xxxxxx@cloudservices.gserviceaccount.com's credentials to Terraform, or create a new service account for Terraform and grant roles/compute.imageUser to it.
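For the second option, a sketch of the grant in project A (the Terraform service account's email is a placeholder):

# Allow the SA that Terraform runs as to use images from project A.
resource "google_project_iam_member" "terraform_image_user" {
  project = "project_A"
  role    = "roles/compute.imageUser"
  member  = "serviceAccount:<terraform-sa>@<project_b_id>.iam.gserviceaccount.com"
}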
You've just done the first step, which is granting a service account the proper permission to use your images across your organisation; the roles/compute.imageUser role is required for that.
Your Terraform config also looks OK; just make sure the self_link to your image is correct (refer to this documentation to check that the image value in your Terraform config is OK).
Also make sure you're providing the proper service account credentials to Terraform, as stated in @Ken Hung's answer.