Terraform variable referencing locals not working - amazon-web-services

I need to pass the database host name (which is dynamically generated) as an environment variable into my task definition. I thought I could set locals and have the variable map refer to a local, but it doesn't seem to work; I receive this error: error="failed to check table existence: dial tcp: lookup local.grafana-db-address on 10.0.0.2:53: no such host". I am able to execute terraform plan without issues, and the code works when I hard-code the database host name, but that is not optimal.
My Variables and Locals
// MySQL Database Grafana username (stored as an ENV var in Terraform Cloud)
variable "username_grafana" {
  description = "The username for the DB grafana user"
  type        = string
  sensitive   = true
}

// MySQL Database Grafana password (stored as an ENV var in Terraform Cloud)
variable "password_grafana" {
  description = "The password for the DB grafana user"
  type        = string
  sensitive   = true
}

variable "db-port" {
  description = "Port for the SQL DB"
  type        = string
  default     = "3306"
}

locals {
  gra-db-user = var.username_grafana
}

locals {
  gra-db-password = var.password_grafana
}

locals {
  db-address = aws_db_instance.grafana-db.address
}

locals {
  grafana-db-address = "${local.db-address}.${var.db-port}"
}
variable "app_environments_vars" {
type = list(map(string))
description = "Database environment variables needed by Grafana"
default = [
{
"name" = "GF_DATABASE_TYPE",
"value" = "mysql"
},
{
"name" = "GF_DATABASE_HOST",
"value" = "local.grafana-db-address"
},
{
"name" = "GF_DATABASE_USER",
"value" = "local.gra-db-user"
},
{
"name" = "GF_DATABASE_PASSWORD",
"value" = "local.gra-db-password"
}
]
}
Task Definition Variable reference
"environment": ${jsonencode(var.app_environments_vars)},
Thank you to everyone who has helped me with this project. I am new to all of this and could not have done it without help from this community.

You can't use dynamic references in the default value of app_environments_vars, so your "value" = "local.grafana-db-address" will never get resolved by Terraform. It will just be the literal string "local.grafana-db-address".
You have to modify your code so that all these dynamic references in app_environments_vars get populated in locals.
UPDATE
Your app_environments_vars should be a local value for it to be resolved:
locals {
  app_environments_vars = [
    {
      "name"  = "GF_DATABASE_TYPE",
      "value" = "mysql"
    },
    {
      "name"  = "GF_DATABASE_HOST",
      "value" = local.grafana-db-address
    },
    {
      "name"  = "GF_DATABASE_USER",
      "value" = local.gra-db-user
    },
    {
      "name"  = "GF_DATABASE_PASSWORD",
      "value" = local.gra-db-password
    }
  ]
}
Then you pass that local to your template for the task definition.
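For example, here is a minimal sketch of wiring that local into an ECS task definition, assuming a templatefile()-based container definition (the file name task-definition.json.tpl and the surrounding resource arguments are illustrative, not from the original question):

resource "aws_ecs_task_definition" "grafana" {
  family = "grafana"
  # Pass the resolved local, not the variable with literal-string defaults.
  container_definitions = templatefile("${path.module}/task-definition.json.tpl", {
    app_environments_vars = local.app_environments_vars
  })
  # ... other task definition arguments ...
}

Inside the template, the reference then becomes "environment": ${jsonencode(app_environments_vars)}, since templatefile() exposes the passed keys directly.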

Related

Terraform variable interpolation and evaluation

I'm working with modules in Terraform, using a YAML approach to manage variables. I have a very simple module that should create a parameter in AWS Parameter Store based on my RDS and IAM User module outputs. So, I wrote this module:
resource "aws_ssm_parameter" "ssm_parameter" {
name = var.parameter_name
type = var.parameter_type
value = var.parameter_value
overwrite = var.overwrite
tags = var.tags
}
The variables I'm using are stored in a YAML file like this:
ssms:
  /arquitetura/catalogo/gitlab/token:
    type: SecureString
    value: ManualInclude
  /arquitetura/catalogo/s3/access/key:
    type: String
    value: module.iam_user.access_key
  /arquitetura/catalogo/s3/secret/access/key:
    type: SecureString
    value: module.iam_user.secret_access_key
  /arquitetura/catalogo/rds/user:
    type: String
    value: module.rds_instance.database_username
  /arquitetura/catalogo/rds/password:
    type: SecureString
    value: module.rds_instance.database_password
As you can see, the "value" fields hold the module outputs I would like to send to Parameter Store. I'm loading this variable file using the file and yamldecode functions:
locals {
  ssmfile        = "./env/${terraform.workspace}/ssm.yaml"
  ssmfilecontent = fileexists(local.ssmfile) ? file(local.ssmfile) : "ssmFileNotFound: true"
  ssmsettings    = yamldecode(local.ssmfilecontent)
}
So I have local.ssmsettings, and I can write a module call like this:
module "ssm_parameter" {
source = "../aws-ssm-parameter-tf"
for_each = local.ssmsettings.ssms
parameter_name = each.key
parameter_type = each.value.type
parameter_value = each.value.value
tags = local.tags
}
Doing this, my parameter is stored as:
{
    "Parameter": {
        "Name": "/arquitetura/catalogo/rds/user",
        "Type": "String",
        "Value": "module.rds_instance.database_username",
        "Version": 1,
        "LastModifiedDate": "2022-12-15T19:02:01.825000-03:00",
        "ARN": "arn:aws:ssm:sa-east-1:111111111111:parameter/arquitetura/catalogo/rds/user",
        "DataType": "text"
    }
}
The Value is receiving the string "module.rds_instance.database_username" instead of the module output.
I know the file function doesn't interpolate variables, and I know Terraform doesn't have an eval function.
Has anybody had the same situation and can tell me how you solved the problem, or have any clue that I can follow?
I already tried to work with Terraform templates, without success.
Thanks in advance.
Terraform has no way to understand that the value strings in your YAML files are intended to be references to values elsewhere in your module, and even if it did it wouldn't be possible to resolve them from there because this YAML file is not actually a part of the Terraform module, and is instead just a data file that Terraform has loaded into memory.
However, you can get a similar effect by placing all of the values your YAML file might refer to into a map of strings inside your module:
locals {
  ssm_indirect_values = tomap({
    manual_include        = "ManualInclude"
    aws_access_key_id     = module.iam_user.access_key
    aws_secret_access_key = module.iam_user.secret_access_key
    database_username     = module.rds_instance.database_username
    database_password     = module.rds_instance.database_password
  })
}
Then change your YAML data so that the value strings match with the keys in this map:
ssms:
  /arquitetura/catalogo/gitlab/token:
    type: SecureString
    value: manual_include
  /arquitetura/catalogo/s3/access/key:
    type: String
    value: aws_access_key_id
  /arquitetura/catalogo/s3/secret/access/key:
    type: SecureString
    value: aws_secret_access_key
  /arquitetura/catalogo/rds/user:
    type: String
    value: database_username
  /arquitetura/catalogo/rds/password:
    type: SecureString
    value: database_password
You can then substitute the real values instead of the placeholders before you use the data structure in for_each:
locals {
  ssm_file         = "${path.module}/env/${terraform.workspace}/ssm.yaml"
  ssm_file_content = file(local.ssm_file)
  ssm_settings     = yamldecode(local.ssm_file_content)

  ssms = tomap({
    for k, obj in local.ssm_settings.ssms :
    k => {
      type  = obj.type
      value = local.ssm_indirect_values[obj.value]
    }
  })
}
module "ssm_parameter" {
source = "../aws-ssm-parameter-tf"
for_each = local.ssms
parameter_name = each.key
parameter_type = each.value.type
parameter_value = each.value.value
tags = local.tags
}
The for expression in the definition of local.ssms uses the source value string as a lookup key into local.ssm_indirect_values, thereby inserting the real value.
The module "ssm_parameter" block now refers to the derived local.ssms instead of the original local.ssm_settings.ssms, so each.value.value will be the final resolved value rather than the lookup key, and so your parameter should be stored as you intended.
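For illustration, with the YAML above, the derived local.ssms would be shaped roughly like this (the resolved RDS value shown is a placeholder):

{
  "/arquitetura/catalogo/gitlab/token" = {
    type  = "SecureString"
    value = "ManualInclude"
  }
  "/arquitetura/catalogo/rds/user" = {
    type  = "String"
    value = "actual-database-username"
  }
  # ... and so on for the remaining keys ...
}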

Terraform: How to add multiple option_settings to option, in option group?

I'm trying to create an option group that requires an option with option settings that add multiple values.
See the following, scrubbed for sensitivity:
option {
  option_name = "VALID_OPTION_NAME"
  option_settings = [
    {
      name  = "foobar1"
      value = "foobar1"
    },
    {
      name  = "foobar2"
      value = "foobar2"
    },
    {
      name  = "foobar3"
      value = "foobar3"
    },
    {
      name  = "foobar4"
      value = "foobar4"
    },
    {
      name  = "foobar5"
      value = "foobar5"
    }
  ]
}
terraform validate gives the following error:
  on rds.tf line 112, in resource "aws_db_option_group" "rds-option-group":
 112:   option_settings = [

An argument named "option_settings" is not expected here. Did you mean to
define a block of type "option_settings"?
I've tried numerous variations of this syntax to no avail. AWS in the GUI has the option by default to include multiple option settings, so there should be a way to do it in Terraform as well.
The Option Group docs for Terraform unfortunately don't include an example where one option has multiple settings.
Among other things, I also checked out this thread which didn't help me, I believe because I'm not using that module.
Any recommendations?
Answer, for any future viewers:
option {
  option_name = "xx"

  option_settings {
    name  = "xx"
    value = "xx"
  }
  option_settings {
    name  = "xx"
    value = "xx"
  }
  option_settings {
    name  = "xx"
    value = "xx"
  }
  option_settings {
    name  = "xx"
    value = "xx"
  }
}
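For context, here is a minimal sketch of the full surrounding resource; the group name, engine name/version, and the specific option and setting names are illustrative, not taken from the original question:

resource "aws_db_option_group" "rds-option-group" {
  name                 = "example-option-group"
  engine_name          = "sqlserver-ee"
  major_engine_version = "14.00"

  option {
    option_name = "SQLSERVER_AUDIT"

    # Repeated option_settings blocks, one per setting.
    option_settings {
      name  = "S3_BUCKET_ARN"
      value = "arn:aws:s3:::example-audit-bucket"
    }
    option_settings {
      name  = "RETENTION_TIME"
      value = "48"
    }
  }
}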

Terraform - Copy AWS SSM Parameters

Longtime lurker, first time poster.
Looking for some guidance from you all. I'm trying to replicate the AWS CLI workflow of getting the parameters under a path (ssm get-parameters-by-path), looping through those parameters to read each one, and then looping through again to put them into new parameters (ssm put-parameter).
I understand there's a for expression in Terraform, but for the life of me I can't put together how I would achieve this.
So, thanks to the wonderful breakdown below, I've gotten closer! But I have this one issue. Code below:
provider "aws" {
region = "us-east-1"
}
data "aws_ssm_parameters_by_path" "parameters" {
path = "/${var.old_env}"
recursive = true
}
output "old_params_by_path" {
value = data.aws_ssm_parameters_by_path.parameters
sensitive = true
}
locals {
names = toset(data.aws_ssm_parameters_by_path.parameters.names)
}
data "aws_ssm_parameter" "old_param_name" {
for_each = local.names
name = each.key
}
output "old_params_names" {
value = data.aws_ssm_parameter.old_param_name
sensitive = true
}
resource "aws_ssm_parameter" "new_params" {
for_each = local.names
name = replace(data.aws_ssm_parameter.old_param_name[each.key].name, var.old_env, var.new_env)
type = data.aws_ssm_parameter.old_param_name[each.key].type
value = data.aws_ssm_parameter.old_param_name[each.key].value
}
I have another file, like the helpful poster mentioned, and created the initial dataset. But what's interesting is that when you create the second set, it overwrites the first set! The idea is that I would be able to tell Terraform: I have this current set of SSM parameters, and I want you to copy that info (values, type) and create a brand new set of parameters (and not destroy anything that's already there).
Any and all help would be appreciated!
I understand, it's not easy at the beginning. I will try to elaborate step by step on how I achieved that.
Anyway, it's nice to include any code that you tried before, even if it doesn't work.
So, firstly I create some example parameters:
# create_parameters.tf
resource "aws_ssm_parameter" "p" {
  count = 3
  name  = "/test/${count.index}/p${count.index}"
  type  = "String"
  value = "test-${count.index}"
}
Then I try to view them:
# example.tf
data "aws_ssm_parameters_by_path" "parameters" {
  path      = "/test/"
  recursive = true
}

output "params_by_path" {
  value     = data.aws_ssm_parameters_by_path.parameters
  sensitive = true
}
As an output I received:
terraform output params_by_path
{
  "arns" = tolist([
    "arn:aws:ssm:eu-central-1:999999999999:parameter/test/0/p0",
    "arn:aws:ssm:eu-central-1:999999999999:parameter/test/1/p1",
    "arn:aws:ssm:eu-central-1:999999999999:parameter/test/2/p2",
  ])
  "id" = "/test/"
  "names" = tolist([
    "/test/0/p0",
    "/test/1/p1",
    "/test/2/p2",
  ])
  "path" = "/test/"
  "recursive" = true
  "types" = tolist([
    "String",
    "String",
    "String",
  ])
  "values" = tolist([
    "test-0",
    "test-1",
    "test-2",
  ])
  "with_decryption" = true
}
aws_ssm_parameters_by_path is unusable without additional processing, so we need to use another data source to get an object suitable for copying the provided parameters. In the documentation I found aws_ssm_parameter. However, to use it, I need the full name of the parameter.
I retrieved the list of parameter names in the previous stage, so all that's needed now is to iterate through them:
# example.tf
locals {
  names = toset(data.aws_ssm_parameters_by_path.parameters.names)
}

data "aws_ssm_parameter" "param" {
  for_each = local.names
  name     = each.key
}

output "params" {
  value     = data.aws_ssm_parameter.param
  sensitive = true
}
And as a result, I get:
terraform output params
{
  "/test/0/p0" = {
    "arn" = "arn:aws:ssm:eu-central-1:999999999999:parameter/test/0/p0"
    "id" = "/test/0/p0"
    "name" = "/test/0/p0"
    "type" = "String"
    "value" = "test-0"
    "version" = 1
    "with_decryption" = true
  }
  "/test/1/p1" = {
    "arn" = "arn:aws:ssm:eu-central-1:999999999999:parameter/test/1/p1"
    "id" = "/test/1/p1"
    "name" = "/test/1/p1"
    "type" = "String"
    "value" = "test-1"
    "version" = 1
    "with_decryption" = true
  }
  "/test/2/p2" = {
    "arn" = "arn:aws:ssm:eu-central-1:999999999999:parameter/test/2/p2"
    "id" = "/test/2/p2"
    "name" = "/test/2/p2"
    "type" = "String"
    "value" = "test-2"
    "version" = 1
    "with_decryption" = true
  }
}
Each parameter object has been retrieved, so now it is possible to create new parameters - which can be done like this:
# example.tf
resource "aws_ssm_parameter" "new_param" {
  for_each = local.names
  name     = "/new_path${data.aws_ssm_parameter.param[each.key].name}"
  type     = data.aws_ssm_parameter.param[each.key].type
  value    = data.aws_ssm_parameter.param[each.key].value
}
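With the three example parameters created earlier, this produces copies under the new prefix:

/new_path/test/0/p0
/new_path/test/1/p1
/new_path/test/2/p2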

Terraform Variable looping to generate properties

I have to admit, this is the first time I have to ask something that I don't even know how to ask for or explain, so here is my code.
It's worth explaining that, for specific reasons, I CANNOT change the output resource; that is, the metadata sent to the resource has to stay as-is, otherwise it will cause a recreate, and I don't want that.
Currently I have Terraform code that uses static/fixed variables like this:
user1_name = "Ed"
user1_Age  = "10"
user2_name = "Mat"
user2_Age  = "20"
and then those hard-typed variables get used in several places, but most importantly they are passed as metadata to instances, like so:
resource "google_compute_instance_template" "mytemplate" {
...
metadata = {
othervalues = var.other
user1_name = var.user1_name
user1_Age = var.user1_Age
user2_name = var.user2_name
user2_Age = var.user2_Age
}
...
}
I am not an expert on Terraform, thus asking, but I know for a fact this is 100% ugly and wrong, and I need to use lists or arrays or whatever, so I am changing my declarations to this:
users = [
  { "name" : "yo", "age" : "10", "last" : "other" },
  { "name" : "El", "age" : "20", "last" : "other" }
]
But then, how do I generate the same result for that resource? The resulting resource still has to have the same metadata as shown.
Assuming, of course, that the order of the users will be used as the "index" of the value: the first one gets user1_name, and so on...
I assume I need to use a for_each loop in there, but I can't figure out how to handle a loop inside the properties of a resource.
Not sure if I'm making myself clear on this, probably not, but I didn't find a better way to explain it.
From your example it seems like your intent is for these to all ultimately appear as a single map with keys built from two parts.
Your example doesn't make clear what the relationship is between user1 and Ed, though: your first example shows that "user1's" name is Ed, but in your example of the data structure you want to create there is only one "name" and it isn't clear to me whether that name would replace "user1" or "Ed" from your first example.
Instead, I'm going to take a slightly different variable structure which still maintains both the key like "user1" and the name attribute, like this:
variable "users" {
type = map(object({
name = string
age = number
})
}
locals {
  # First we'll transform the map of objects into a
  # flat set of key/attribute/value objects, because
  # that's easier to work with when we generate the
  # flattened map below.
  users_flat = flatten([
    for key, user in var.users : [
      for attr, value in user : {
        key   = key
        attr  = attr
        value = value
      }
    ]
  ])
}
resource "google_compute_instance_template" "mytemplate" {
metadata = merge(
{
othervalues = var.other
},
{
for vo in local.users_flat : "${vo.key}_${vo.attr}" => vo.value
}
)
}
local.users_flat here is an intermediate data structure that flattens the two-level hierarchy of keys and object attributes from the input. It would be shaped something like this:
[
  { key = "user1", attr = "name", value = "Ed" },
  { key = "user1", attr = "age", value = 10 },
  { key = "user2", attr = "name", value = "Mat" },
  { key = "user2", attr = "age", value = 20 },
]
The merge call in the metadata argument then merges a directly-configured mapping of "other values" with a generated mapping derived from local.users_flat, shaped like this:
{
  "user1_name" = "Ed"
  "user1_age"  = 10
  "user2_name" = "Mat"
  "user2_age"  = 20
}
From the perspective of the caller of the module, the users variable should be defined with the following value in order to get the above results:
users = {
  user1 = {
    name = "Ed"
    age  = 10
  }
  user2 = {
    name = "Mat"
    age  = 20
  }
}
metadata is not a block, but a regular attribute of type map. So you can do:
# It would be better to use a map, not a list, for users:
variable "users" {
  default = {
    user1 = { "name" : "yo", "age" : "10", "last" : "other" },
    user2 = { "name" : "El", "age" : "20", "last" : "other" }
  }
}
resource "google_compute_instance_template" "mytemplate" {
for_each = var.users
metadata = each.value
#...
}
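If the shared values from the question still need to ride along, merge() works here as well; a sketch reusing the question's var.other:

resource "google_compute_instance_template" "mytemplate" {
  for_each = var.users

  # Combine the shared values with each user's own map.
  metadata = merge(
    { othervalues = var.other },
    each.value,
  )
  # ...
}

Note that this variant creates one instance template per user, unlike the single-template merge() approach in the first answer.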

Terraform - Specifying multiple possible values for Variables

CloudFormation provides AllowedValues for Parameters, which restricts the parameter's value to one of the listed options. How can I achieve this with Terraform variables? The list variable type does not provide this functionality. So, in case I want my variable to take only one of two possible values, how can I achieve this with Terraform? The CloudFormation script that I want to replicate is:
"ParameterName": {
"Description": "desc",
"Type": "String",
"Default": true,
"AllowedValues": [
"true",
"false"
]
}
I don't know of an official way, but there's an interesting technique described in a Terraform issue:
variable "values_list" {
description = "acceptable values"
type = "list"
default = ["true", "false"]
}
variable "somevar" {
description = "must be true or false"
}
resource "null_resource" "is_variable_value_valid" {
count = "${contains(var.values_list, var.somevar) == true ? 0 : 1}"
"ERROR: The somevar value can only be: true or false" = true
}
Update:
Terraform now offers custom validation rules in Terraform 0.13:
variable "somevar" {
type = string
description = "must be true or false"
validation {
condition = can(regex("^(true|false)$", var.somevar))
error_message = "Must be true or false."
}
}
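When a disallowed value is supplied, terraform plan fails with the configured message, roughly like this (exact formatting varies by Terraform version):

$ terraform plan -var="somevar=maybe"

Error: Invalid value for variable

Must be true or false.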
Custom validation rules are definitely the way to go. If you want to keep things simple and check the provided value against a list of valid ones, you can use the following in your variables.tf config:
variable "environment" {
type = string
description = "Deployment environment"
validation {
condition = contains(["dev", "prod"], var.environment)
error_message = "Valid value is one of the following: dev, prod."
}
}
A variation on the above answer, using an array/list:
variable "appservice_sku" {
type = string
description = "AppService Plan SKU code"
default = "P1v3"
validation {
error_message = "Please use a valid AppService SKU."
condition = can(regex(join("", concat(["^("], [join("|", [
"B1", "B2", "B3", "D1", "F1",
"FREE", "I1", "I1v2", "I2", "I2v2",
"I3", "I3v2", "P1V2", "P1V3", "P2V2",
"P2V3", "P3V2", "P3V3", "PC2",
"PC3", "PC4", "S1", "S2", "S3",
"SHARED", "WS1", "WS2", "WS3"
])], [")$"])), var.appservice_sku))
}
}
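The same check can also be written with contains(), as shown in the previous answer, instead of assembling a regex; a sketch with the same SKU list:

variable "appservice_sku" {
  type        = string
  description = "AppService Plan SKU code"
  default     = "P1v3"

  validation {
    # contains() avoids regex assembly for plain string whitelists.
    condition = contains([
      "B1", "B2", "B3", "D1", "F1",
      "FREE", "I1", "I1v2", "I2", "I2v2",
      "I3", "I3v2", "P1V2", "P1V3", "P2V2",
      "P2V3", "P3V2", "P3V3", "PC2",
      "PC3", "PC4", "S1", "S2", "S3",
      "SHARED", "WS1", "WS2", "WS3"
    ], var.appservice_sku)
    error_message = "Please use a valid AppService SKU."
  }
}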