Terraform Cloud / Enterprise - How to use AWS Assume Roles

I would like to use AWS assume roles with Terraform Cloud / Enterprise.
In Terraform Open Source, you would typically do the initial authentication through a profile in your local .aws/credentials file on the CLI, and then perform the assume role from there:
provider "aws" {
assume_role {
role_arn = "arn:aws:iam::ACCOUNT_ID:role/ROLE_NAME"
session_name = "SESSION_NAME"
external_id = "EXTERNAL_ID"
}
}
The issue is that with Terraform Enterprise or Cloud you cannot reference a profile, because the immutable run infrastructure does not have that credentials file on disk.
Terraform Cloud/Enterprise instead needs an Access Key ID and Secret Access Key set as workspace variables, so its run infrastructure can perform the Terraform run via its pipeline and authenticate to whatever AWS account you would like to provision within.
So the question is:
How can I perform an AWS assume role using the Access Key ID and Secret Access Key of the AWS account that holds the "Action": "sts:AssumeRole" policy?
I would have thought the configuration below would work, however Terraform is still doing the initial authentication via the AWS credential profile rather than the keys for the account that has the sts:AssumeRole policy.
Can Terraform look at the access_key and secret_key to determine which AWS account to use when assuming the role, rather than using the AWS credential profile?
provider "aws" {
region = var.aws_region
access_key = var.access_key_id
secret_key = var.secret_access_key
assume_role {
role_arn = "arn:aws:iam::566264069176:role/RemoteAdmin"
#role_arn = "arn:aws:iam::<awsaccount>:role/<rolename>" # Do a replace in "file_update_automation.ps1"
session_name = "RemoteAdminRole"
}
}
In order for Terraform Cloud/Enterprise to obtain new assume-role session tokens, it would need to use the access_key and secret_key, not an AWS credentials profile, to determine which AWS account holds the sts:AssumeRole permission linking to the member AWS account to be provisioned.
Thank you

This can be achieved if you have a Business plan enabled and implement self-hosted Terraform agents in your infrastructure. See video.

I used the exact same provider configuration, minus explicitly adding the access keys.
The access keys were added to the Terraform Cloud workspace as environment variables.
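For illustration, a minimal sketch of what that looks like (the role ARN and session name are placeholders): the provider block keeps only the assume_role configuration, and the AWS provider picks up AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY from the workspace's environment variables.
provider "aws" {
  region = var.aws_region

  # No access_key/secret_key here; the provider reads
  # AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY from the
  # environment variables set on the Terraform Cloud workspace.
  assume_role {
    role_arn     = "arn:aws:iam::ACCOUNT_ID:role/ROLE_NAME" # placeholder
    session_name = "terraform-cloud"                        # placeholder
  }
}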

This is definitely possible with Terraform Enterprise (TFE) if your TFE infrastructure is also hosted in AWS and the instance profile is trusted by the role you are trying to assume.
For Terraform Cloud (TFC) it is a different story: today there is no way to create a trust relationship between TFC and an IAM role, but we can leverage the AWS SDK's ability to pick up credentials from environment variables. You have two options:
Create AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables on the workspace and set them (remember to mark the secret access key as sensitive). The provider will pick these up from the environment and work the same way it does on your local machine.
If all workspaces need to use the same access and secret keys, you can set the environment variables on a variable set, which will apply to all workspaces; see the sketch below.
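If you happen to manage Terraform Cloud itself with the hashicorp/tfe provider, the variable-set option can be expressed roughly like this. This is only a sketch; the organization, set name, and input variables are made up.
# Sketch: a shared variable set holding the AWS keys, applied to all workspaces.
resource "tfe_variable_set" "aws_creds" {
  name         = "aws-credentials"
  description  = "Shared AWS keys for all workspaces"
  organization = "my-org" # hypothetical organization name
  global       = true     # attach to every workspace in the organization
}

resource "tfe_variable" "access_key" {
  key             = "AWS_ACCESS_KEY_ID"
  value           = var.aws_access_key_id
  category        = "env"
  variable_set_id = tfe_variable_set.aws_creds.id
}

resource "tfe_variable" "secret_key" {
  key             = "AWS_SECRET_ACCESS_KEY"
  value           = var.aws_secret_access_key
  category        = "env"
  sensitive       = true
  variable_set_id = tfe_variable_set.aws_creds.id
}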

Related

Terraform and AWS secrets

I must be missing something in how AWS secrets can be accessed through Terraform. Here is the scenario I am struggling with:
I create an IAM user named "infra_user", create an access key ID and secret access key for the user, and download the values as plain text.
"infra_user" will be used to authenticate via Terraform to provision resources, let's say an S3 bucket and an EC2 instance.
To protect the ID and secret key of "infra_user", I store them in AWS secrets manager.
In order to authenticate with "infra_user" in my terraform script, I will need to retrieve the secrets via the following block:
data "aws_secretsmanager_secret" "arn" {
arn = "arn:aws:secretsmanager:us-east-1:123456789012:secret:example-123456"
}
But, to even use the data block in my script and retrieve the secrets wouldn't I need to authenticate to AWS in some other way in my provider block before I declare any resources?
If I create another user, say "tf_user", to just retrieve the secrets where would I store the access key for "tf_user"? How do I avoid this circular authentication loop?
The Terraform AWS provider documentation has a section on Authentication and Configuration and lists an order of precedence for how the provider discovers which credentials to use. You can choose the method that makes the most sense for your use case.
For example, one (insecure) method would be to set the credentials directly in the provider:
provider "aws" {
region = "us-west-2"
access_key = "my-access-key"
secret_key = "my-secret-key"
}
Or, you could set the environment variables in your shell:
export AWS_ACCESS_KEY_ID="my-access-key"
export AWS_SECRET_ACCESS_KEY="my-secret-key"
export AWS_DEFAULT_REGION="us-west-2"
Now your provider block simplifies to:
provider "aws" {}
When you run Terraform commands, they will automatically use the credentials in your environment.
Or as yet another alternative, you can store the credentials in a profile configuration file and choose which profile to use by setting the AWS_PROFILE environment variable.
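For example, a minimal sketch of a provider that reads from a named profile (the profile name "dev" here is made up); equivalently, you could omit profile and set AWS_PROFILE=dev in your shell instead:
provider "aws" {
  region  = "us-west-2"
  profile = "dev" # hypothetical profile defined in ~/.aws/credentials
}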
Authentication is somewhat more complex and configurable than it seems at first glance, so I'd encourage you to read the documentation.

How to share AWS infrastructure among multiple accounts using Terraform and AWS SSO?

I have the following multi-account setup with AWS SSO:
An account called "infrastructure-owner". Under this account, there is a role called "SomeAccessLevel" that I can click to sign in to the web console.
Another account called "infrastructure-consumer". Under this account there is the same role called "SomeAccessLevel" that I can click to sign in to the web console. There may be other roles.
Account "infrastructure-owner" owns resources (for example S3 buckets, DynamoDB tables, or VPNs), typically with read/write access. This account is somewhat protected and rarely used. Account "infrastructure-consumer" merely has read access to resources in "infrastructure-owner". This account is used often by multiple people/services. For example, production data pipelines run in "infrastructure-consumer" and have read-only rights to S3 buckets in "infrastructure-owner". However, from time to time, new data may be added manually to these S3 buckets by signing in to "infrastructure-owner".
I would like to provision this infrastructure with Terraform. I am unable to provide permissions for "infrastructure-consumer" to access resources from "infrastructure-owner". I've read dozens of blog posts on AWS multi-account / SSO / Terraform but I still cannot do it. At this point, I cannot even do it manually in the web console.
Please realize that "SomeAccessLevel" is a role created by AWS that I cannot modify (typically called AWSReservedSSO_YOURNAMEHERE_RANDOMSTRING). Also, I cannot give permissions to particular users, since these users may not be owned by "infrastructure-consumer". Also, users access this account via SSO using a role.
The following Terraform code is an example DynamoDB table created in the "infrastructure-owner" that I would like to read in the "infrastructure-consumer" account (any role):
# Terraform config
terraform {
  required_version = ">= 1.0.0"

  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 3.44"
    }
  }

  backend "remote" {
    hostname     = "app.terraform.io"
    organization = "YOUR_ORGANIZATION_HERE"

    workspaces {
      name = "YOUR_TF_WORKSPACE_NAME_HERE" # linked to "infrastructure-owner"
    }
  }
}

# Local provider
provider "aws" {
  profile = "YOUR_AWS_PROFILE_NAME_HERE" # linked to "infrastructure-owner"
  region  = "eu-central-1"
}

# Example resource that I would like to access from other accounts like "infrastructure-consumer"
resource "aws_dynamodb_table" "my-database" {
  # Basic
  name         = "my-database"
  billing_mode = "PAY_PER_REQUEST"
  hash_key     = "uuid"

  # Key
  attribute {
    name = "uuid"
    type = "S"
  }
}

# YOUR CODE TO ALLOW "infrastructure-consumer" TO READ THE TABLE.
It could also happen that there is a better architecture for this use case. I am trying to follow general practices for AWS multi-account for production environments, and Terraform for provisioning them.
Thank you!
I assume you mean AWS accounts and not IAM accounts (users).
I remember that roles assumed via AWS SSO have something called permission sets, which are no more than policies listing the API actions allowed or denied while assuming the role. I don't know exactly how AWS SSO influences role trust in AWS, but you could have a role in infrastructure-owner's account that trusts anything in infrastructure-consumer's account, i.e. trusting "arn:aws:iam::${var.infrastructure-consumer's account}:root".
To achieve that with Terraform you would run it in your management account (SSO Administrator's Account) and make that trust happen.
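To make that concrete, here is a rough sketch of what such a role could look like in the infrastructure-owner account. The role name, the consumer account ID variable, and the attached read-only policy are assumptions for illustration, not something from the question.
# Sketch: a role in "infrastructure-owner" that any principal in
# "infrastructure-consumer" can assume. Names and variables are hypothetical.
resource "aws_iam_role" "consumer_read" {
  name = "infrastructure-consumer-read"

  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Effect    = "Allow"
      Action    = "sts:AssumeRole"
      Principal = { AWS = "arn:aws:iam::${var.consumer_account_id}:root" }
    }]
  })
}

# Allow the role to read the example DynamoDB table.
resource "aws_iam_role_policy" "consumer_read_dynamodb" {
  name = "read-my-database"
  role = aws_iam_role.consumer_read.id

  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Effect   = "Allow"
      Action   = ["dynamodb:GetItem", "dynamodb:Query", "dynamodb:Scan"]
      Resource = aws_dynamodb_table.my-database.arn
    }]
  })
}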

Terraform to get AWS data from Account A and use it in Account B

I'm using two sets of Terraform scripts: one creates application infrastructure in AWS in Account A, and the other creates a CodePipeline in Account B. The TF scripts that create the CodePipeline in Account B need some configuration parameters (ALB, ECS, etc.) from Account A, which is already set up. I'm familiar with getting data when everything is hosted in the same AWS account. Is it possible to retrieve data from one account in the other using Terraform? Is there any documentation for this scenario?
Yes, it is possible. Since the question is rather generic I can only provide generic information.
In general, for cross-account access, cross-account IAM roles are used. This is a good practice, not a requirement. AWS doc info about the roles:
Providing Access to an IAM User in Another AWS Account That You Own
Tutorial: Delegate Access Across AWS Accounts Using IAM Roles
Based on these, in Account A you would have to set up an assumable role with a trust relationship allowing Account B to assume the role. In Account B, the IAM user that is used for Terraform would need IAM permissions to assume that role.
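As a rough sketch of the Account B side (the user name and role ARN below are placeholders, not something from the question), the permission attached to the Terraform user could look like this:
# Sketch: grant the Terraform IAM user in Account B permission to assume
# the cross-account role in Account A. Names are hypothetical.
resource "aws_iam_user_policy" "allow_assume_account_a" {
  name = "allow-assume-account-a-role"
  user = "terraform" # the IAM user Terraform runs as in Account B

  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Effect   = "Allow"
      Action   = "sts:AssumeRole"
      Resource = "arn:aws:iam::ACCOUNT_A_ID:role/ROLE_NAME" # placeholder
    }]
  })
}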
With the role set up, in Terraform you would use an aws provider that assumes the role, e.g. from the docs:
provider "aws" {
alias = "assumed_role_provider"
assume_role {
role_arn = "arn:aws:iam::ACCOUNT_ID:role/ROLE_NAME"
session_name = "SESSION_NAME"
external_id = "EXTERNAL_ID"
}
}
Then to use the provider you would use its alias for resources or data sources, e.g. from docs:
resource "aws_instance" "foo" {
provider = aws.assumed_role_provider
# ...
}
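Since the question is about reading existing configuration (ALB, ECS, etc.) from Account A, a data source works the same way. For example (the load balancer name is a placeholder):
# Sketch: read an existing ALB in Account A through the assumed-role provider.
data "aws_lb" "account_a_alb" {
  provider = aws.assumed_role_provider
  name     = "my-app-alb" # placeholder
}

# The ARN can then be referenced from Account B resources, e.g.:
output "account_a_alb_arn" {
  value = data.aws_lb.account_a_alb.arn
}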

How can I create the first IAM role to assume with Terraform?

I'm trying to set up AWS with Terraform and am confused about aws_iam_role.
So to create a first role (let's say poweruser), aws_iam_role requires assume_role_policy, which is a role ARN to assume. (But there is no poweruser role with a poweruser policy yet.)
It feels like I'm in a chicken-and-egg situation: to create a role, I need another role to assume, but I can't create one since I don't have a role to assume yet.
How can I do this with Terraform? Am I misunderstanding something? Or do I need to do some initial role/user setup manually first?
If this is your first time running your Terraform script, I suggest adding a user in the console with programmatic access, attaching an AWS managed policy (like AdministratorAccess), and making sure to save the access key ID and secret access key.
Then, in your home directory (cd ~), create a folder named .aws.
Inside the .aws directory, create a file named credentials:
[default]
aws_access_key_id = paste
aws_secret_access_key = paste
In your terraform file:
provider "aws" {
profile = "default" #pertains to the default profile included in credentials file
region = "us-east-2"
version = "~> 2.26.0"
}
You must create the first IAM user manually to get an access key and secret key for Terraform.
More precisely: say you've just created an AWS account and no IAM users exist yet, which means there is no access key or secret key at all. However, Terraform needs an access key and secret key in order to run.
# AWS Provider
provider "aws" {
  region     = "ap-northeast-1"
  access_key = "AIKAWP3TMGZNG34RUWRS"
  secret_key = "RNA6yaOwlOw4AEuFaBH2qJzSh/aE1zhFL5cbvgbb"
}
So you only need to create the first IAM user manually to get an access key and secret key for Terraform. After creating that first IAM user and giving its access key and secret key to Terraform, you can create other IAM users with Terraform without any manual steps.
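For example, once the bootstrap user exists, further users can be managed in code. A minimal sketch (the user name and the attached managed policy are just examples):
# Sketch: IAM users created by Terraform after the manual bootstrap user exists.
resource "aws_iam_user" "deployer" {
  name = "deployer" # example name
}

resource "aws_iam_user_policy_attachment" "deployer_poweruser" {
  user       = aws_iam_user.deployer.name
  policy_arn = "arn:aws:iam::aws:policy/PowerUserAccess" # AWS managed policy
}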

Terraform using IAM role assume

I have been using access/secret keys with Terraform to create/manage our infrastructure in AWS. However, I am trying to switch to using an IAM role instead. I should be able to use a role in my account, assume a role in another account, and run plan, apply, etc. to build infrastructure in the other account. Any ideas? Please suggest.
So far, I am testing with https://www.terraform.io/docs/providers/aws/, but for some reason, it is not working for me or the instructions are not clear to me.
Get the full ARN for the role you want to assume. In your provider config use the 'assume_role' block with the ARN: https://www.terraform.io/docs/providers/aws/index.html#assume_role
provider "aws"
region = "<whatever region>"
assume_role {
role_arn = "arn:aws:iam::ACCOUNT_ID:role/ROLE_NAME"
}
}
We use a non-Terraform script to set up our credentials using an IAM role and assume role (something like https://github.com/Integralist/Shell-Scripts/blob/master/aws-cli-assumerole.sh). For use with Okta, we use https://github.com/redventures/oktad
We get the temporary credentials and token, save them in ~/.aws/credentials under the respective dev/prod etc. profile, and then point the respective Terraform provider configuration at it like this:
provider "aws" {
region = "${var.region}"
shared_credentials_file = "${pathexpand("~/.aws/credentials")}"
profile = "${var.dev_profile}"
}