AWS account creation using terraform - amazon-web-services

I am trying to create a new AWS account within our AWS org, but I keep getting no changes after terraform plan:
"No changes. Your infrastructure matches the configuration.
Terraform has compared your real infrastructure against your configuration and found no differences, so no changes are needed."
Am I missing something? This is the code:
resource "aws_organizations_account" "new_aws_member_account" {
name = "XXX"
email = "XXX#XXX"
iam_user_access_to_billing = "ALLOW"
}
I already deployed a new IAM policy (within the AWS org account) without any problem, but I just can't create a new account with this code. I probably missed something, but I don't know what.
Our AWS org was created manually via the AWS console, not via terraform, but that shouldn't be a problem, should it?
Can you help, please?
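For reference, aws_organizations_account has to be planned and applied with credentials for the organization's management account, and Terraform only reads *.tf (and *.tf.json) files in the current working directory, so a resource that never shows up in the plan is usually sitting in a file or directory Terraform is not reading. A minimal sketch of a configuration that should plan one account creation (the credentials, names, email, and region below are placeholders):
provider "aws" {
  # These credentials must belong to the organization's management account.
  region = "eu-west-1"
}

resource "aws_organizations_account" "new_aws_member_account" {
  name                       = "member-account"
  email                      = "member@example.com"
  iam_user_access_to_billing = "ALLOW"
}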

Related

Create an iam role under specific aws account using terraform

I'm really new to terraform and have been stuck on this for a while.
I'm using an external module which creates an aws_iam_role and the corresponding policies. In my terraform code, I just use the following block to instantiate the module, but how can I make sure the roles are created under a specific AWS account? I have multiple AWS accounts right now, but I only want the external module's resources in one of them. The account ID of the target AWS account is known.
module "<external_module>" {
source = "git::..."
...
}
Thanks!
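One way to pin the module's resources to a specific account, sketched here under the assumption that the module uses the default aws provider internally and that a role you can assume exists in the target account (the role name and account ID below are placeholders), is to pass an aliased provider into the module:
# Provider aliased to the target account.
provider "aws" {
  alias  = "target"
  region = "us-east-1"

  assume_role {
    role_arn = "arn:aws:iam::123456789012:role/TerraformDeployRole"
  }
}

module "external_module" {
  source = "git::..."

  # Hand the aliased provider to the module so its resources land in that account.
  providers = {
    aws = aws.target
  }
}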

Error 403: Storage objects forbidden in GCP

I'm trying to create a whole new sandbox project in GCP for easy deployment and project upgrades. Using Terraform, I am creating a GKE cluster. The issue is that the terraform scripts were written for the service accounts of a project named, let's say, NP-H. Now I am trying to create clusters using the same scripts in a project named, let's say, NP-S.
I ran terraform init and got the following error:
error 403: XXX.serviceaccount does not have storage.object.create access to google cloud storage objects., forbidden.
storage: object doesn't exist.
Now, is the problem with the Terraform script or with the service account permissions?
If it is the Terraform script, what changes do I need to make?
PS: I was able to create buckets and upload objects to Cloud Storage…
There are two ways you can provide credentials:
provider "google" {
credentials = file("service-account-file.json")
project = "PROJECT_ID"
region = "us-central1"
zone = "us-central1-c"
}
or
export GOOGLE_APPLICATION_CREDENTIALS="/home/user/Downloads/service-account-file.json"
Make sure the service account is from project NP-S (Menu > IAM & admin > Service accounts) and has the proper permissions (Menu > IAM & admin > IAM > ...). Run
cat service-account-file.json
and make sure the email is from the correct project. You can do a quick test with the owner/editor role to isolate the issue if need be, as those roles have most permissions.
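If the 403 comes from the GCS backend during init and you also manage the grants with Terraform, a minimal sketch of giving the service account object access on the state bucket (the bucket name and service account email below are placeholders):
resource "google_storage_bucket_iam_member" "tf_state_writer" {
  bucket = "np-s-terraform-state"                                  # placeholder state bucket name
  role   = "roles/storage.objectAdmin"                             # create/read/overwrite state objects
  member = "serviceAccount:terraform@np-s.iam.gserviceaccount.com" # placeholder SA email
}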
If you're using service account impersonation, do this:
terraform {
  backend "gcs" {
    bucket                      = "<your-bucket>"
    prefix                      = "<prefix>/terraform.tfstate"
    impersonate_service_account = "<your-service-account>@<your-project-id>.iam.gserviceaccount.com"
  }
}
Source: Updating remote state files with a service account
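The backend block above only covers state storage; if the rest of the run should also act as that service account, the google provider accepts the same impersonation setting. A sketch with placeholder names:
provider "google" {
  project                     = "<your-project-id>"
  region                      = "us-central1"
  impersonate_service_account = "<your-service-account>@<your-project-id>.iam.gserviceaccount.com"
}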

Ensure Google service accounts

In Terraform I enable services like so:
resource "google_project_service" "apigateway" {
service = "apigateway.googleapis.com"
}
Afterwards I make sure to reference the service account of apigateway (service-123@gcp-sa-apigateway.iam.gserviceaccount.com) only after the resource has been created.
Now it does sometimes happen that, when using the email of that SA, I get an error that the service account is not present:
Error 400: Service account service-123@gcp-sa-apigateway.iam.gserviceaccount.com does not exist.
I double-checked in the API Explorer that the API is enabled!
This happens for apigateway in the same way as for others (e.g. cloudfunctions).
So I am wondering: how do I ensure that the service account is created?
Naively I assumed creating google_project_service should do the trick, but that does not seem to be true in every case. Documentation around Google service accounts seems pretty sparse :(
As John Hanley remarks, you can create this dependency in Terraform with depends_on.
As you can see in the following example, the service account is created first, and the key is only created once that first resource is done.
resource "google_service_account" "service_account" {
account_id = "terraform-test"
display_name = "Service Account"
}
resource "google_service_account_key" "mykey" {
service_account_id = google_service_account.service_account.id
public_key_type = "TYPE_X509_PEM_FILE"
depends_on = [google_service_account.service_account]
}
Also, if the service account has already been created on the GCP platform, only the key statement is executed.
It is important to note that the account you are using for this configuration needs the IAM permission required to create a service account.
I found out about google_project_service_identity.
Since I saw this problem with cloudfunctions, you could create a google_project_service_identity for cloudfunctions and hope for a more detailed error message.
Sadly this is not available for all services, e.g. apigateway.
For apigateway specifically, Google Support confirmed that the undocumented behavior is that the SA gets created lazily when the first resource is created.
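For services that do expose an identity (cloudfunctions does), a minimal sketch of google_project_service_identity, which lives in the google-beta provider, would look like this (the project ID is a placeholder):
resource "google_project_service_identity" "cloudfunctions" {
  provider = google-beta
  project  = "my-project-id"
  service  = "cloudfunctions.googleapis.com"
}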

How to properly create gcp service-account with roles in terraform

Here is the terraform code I have used to create a service account and bind a role to it:
resource "google_service_account" "sa-name" {
account_id = "sa-name"
display_name = "SA"
}
resource "google_project_iam_binding" "firestore_owner_binding" {
role = "roles/datastore.owner"
members = [
"serviceAccount:sa-name#${var.project}.iam.gserviceaccount.com",
]
depends_on = [google_service_account.sa-name]
}
The above code worked great... except it removed datastore.owner from every other service account in the project that this role was previously assigned to. We have a single project that many teams use, and there are service accounts managed by different teams. My terraform code would only list our team's service accounts, so we could end up breaking other teams' service accounts.
Is there another way to do this in terraform?
This can of course be done via the GCP UI or the gcloud CLI without any issue or without affecting other SAs.
From the terraform docs, google_project_iam_binding is authoritative: it sets the IAM policy for the project and replaces any existing policy already attached. That means it completely replaces the members of the given role.
To just add a role to a new service account, without touching the other members of that role, you should use the resource google_project_iam_member:
resource "google_service_account" "sa-name" {
account_id = "sa-name"
display_name = "SA"
}
resource "google_project_iam_member" "firestore_owner_binding" {
project = <your_gcp_project_id_here>
role = "roles/datastore.owner"
member = "serviceAccount:${google_service_account.sa-name.email}"
}
An extra change from your sample: using the service account resource's generated email attribute removes the need for the explicit depends_on. You don't need depends_on if you do it like this, and you avoid errors caused by a misconfigured dependency.
Terraform can infer the dependency from the use of an attribute of another resource. Check the docs here to understand this behavior better.
It's a usual problem with Terraform: either you manage everything with it, or nothing. If you are somewhere in between, unexpected things can happen!!
If you want to use terraform, you have to import the existing bindings into the tfstate. Here is the doc for the binding, and, of course, you have to add all the accounts to the Terraform file. If not, the bindings will be removed, but this time you will see the deletion in the terraform plan.
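As a sketch, importing the existing members of that role into the binding resource (the project ID is a placeholder) looks like:
terraform import google_project_iam_binding.firestore_owner_binding "my-project-id roles/datastore.owner"
After the import, terraform plan shows what would change, so the members list in the configuration can be reconciled with what already exists before anything is applied.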

google storage transfer service account does not exist in new project

I am trying to create resources using Terraform in a new GCP project. As part of that I want to grant roles/storage.legacyBucketWriter on a specific bucket to the Google-managed service account which runs Storage Transfer Service jobs (the pattern is project-[project-number]@storage-transfer-service.iam.gserviceaccount.com). I am using the following config:
resource "google_storage_bucket_iam_binding" "publisher_bucket_binding" {
bucket = "${google_storage_bucket.bucket.name}"
members = ["serviceAccount:project-${var.project_number}#storage-transfer-service.iam.gserviceaccount.com"]
role = "roles/storage.legacyBucketWriter"
}
To clarify, I want to do this so that when I create one-off transfer jobs using the JSON API, they don't fail the prerequisite checks.
When I run terraform apply, I get the following:
Error applying IAM policy for Storage Bucket "bucket":
Error setting IAM policy for Storage Bucket "bucket": googleapi:
Error 400: Invalid argument, invalid
I think this is because the service account in question does not exist yet, as I cannot do this via the console either.
Is there any other service that I need to enable for the service account to be created?
It seems I am able to create/find the service account once I call this for my project to get the email address:
https://cloud.google.com/storage/transfer/reference/rest/v1/googleServiceAccounts/get
Not sure if this is the best way, but it works..
Soroosh's reply is accurate: after querying the API as per this doc, https://cloud.google.com/storage-transfer/docs/reference/rest/v1/googleServiceAccounts/, the service account is created and terraform will run. But that means making an extra API call outside terraform, and ain't nobody got time for that.
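These days the google provider exposes that same googleServiceAccounts.get call as a data source, so a sketch like the following (reusing the question's bucket and role) may avoid the manual step entirely:
data "google_storage_transfer_project_service_account" "default" {
  project = var.project
}

resource "google_storage_bucket_iam_member" "publisher_bucket_binding" {
  bucket = google_storage_bucket.bucket.name
  role   = "roles/storage.legacyBucketWriter"
  member = "serviceAccount:${data.google_storage_transfer_project_service_account.default.email}"
}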