module "s3_bucket" {
  source = "./module/s3_bucket"
}

module "s3_bucket_2" {
  source = "./module/s3_bucket_2"
}
These are the two modules I am calling in my main.tf file. I want to add a condition so that I can choose which module to call at any point in time, and only that module gets executed. Is there any way to do that?
I didn't quite understand your question, but I think the following will help answer it.
You can create a variable in variables.tf, called for example create, and then pass it to the module.
# Set a variable to know if the resources inside the module should be created
module "s3_bucket" {
  source = "./module/s3_bucket"
  create = var.create
}

# For every resource inside the module, use count to decide whether it is created
resource "resource_type" "resource_name" {
  count = var.create ? 1 : 0
  # ... other resource attributes
}
Please read the complete problem before downvoting.
Hi, I am new to Terraform.
I have 3 modules:
/module1/eip/main.tf
/module1/eip/output.tf
/module2/eip/main.tf
/module2/eip/output.tf
/module3/eip/main.tf
/module3/eip/output.tf
All three modules create an EIP and expose it in their outputs.
From main.tf at the root level I call them like this:
module "module-one-eip" {
  source   = "./modules/module1/eip/"
  instance = module.ec2-for-one-module.ec2-one-id
}

module "module-two-eip" {
  source   = "./modules/module2/eip/"
  instance = module.ec2-for-two-module.ec2-two-id
}

module "module-three-eip" {
  source   = "./modules/module3/eip/"
  instance = module.ec2-for-three-module.ec2-three-id
}
Now I want to remove the repetitive files and use one generic module for all three, with all the code and all the outputs in one place. The main problem is how to handle the different instance values being passed in and keep each one synced with the right code section in the same file.
/module/generic-eip/main.tf
/module/generic-eip/outputs.tf
and from main.tf I want to call it like this.
module "module-generic-eip" {
  source   = "./modules/generic-eip/"
  instance = (how to manage different instance names for same file)
}
I know Terraform has for_each and count, but the problem is that the internal configurations are different; I mean, how can I make the naming dynamic as well?
inside ./modules/eip/main.tf
resource "aws_eip" "(how to manage different names here)" {
  instance = var.instance
  vpc      = true
}
Assuming you keep your instance modules separate and only want to consolidate the EIP module, it would be as follows:
locals {
  instance_ids = [
    module.ec2-for-one-module.ec2-one-id,
    module.ec2-for-two-module.ec2-two-id,
    module.ec2-for-three-module.ec2-three-id,
  ]
}

module "module-generic-eip" {
  source   = "./modules/generic-eip/"
  count    = length(local.instance_ids)
  instance = local.instance_ids[count.index]
}
The code inside ./modules/eip/main.tf does not change: for each iteration of count, var.instance is a single element of local.instance_ids. (Note that count on a module block requires Terraform 0.13 or later.)
You then access the individual EIPs using the indices 0, 1, 2:
module.module-generic-eip[0]
module.module-generic-eip[1]
module.module-generic-eip[2]
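As a side note, on Terraform 0.13+ you could use for_each with a map instead of count, so each EIP gets a stable key rather than a positional index. A sketch, with illustrative key names:

```hcl
locals {
  instance_ids = {
    one   = module.ec2-for-one-module.ec2-one-id
    two   = module.ec2-for-two-module.ec2-two-id
    three = module.ec2-for-three-module.ec2-three-id
  }
}

module "module-generic-eip" {
  source   = "./modules/generic-eip/"
  for_each = local.instance_ids
  instance = each.value
}
```

You would then reference them as module.module-generic-eip["one"] and so on; removing one entry no longer shifts the addresses of the others, which count-based indexing would.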
I have a parent directory from which I am calling a local module, but the terraform.tfvars file present in the parent directory is not considered when calling the local module. It is taking values from the variables file present in the local module.
My code is available on GitHub. It is working fine except that it is not considering the terraform.tfvars file. Can anyone let me know what the issue in this code is?
https://github.com/sammouse/terraform-code.git
It is taking values from the variables file present in the local module.
That's how it works. There is no inheritance of variables in Terraform. You have to explicitly pass all the variables from the parent module to the child module. Otherwise, you have to copy all the variables in the parent module's tfvars file into a tfvars file in the child module.
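A minimal sketch of the explicit passing, with illustrative variable names: terraform.tfvars at the root populates a root variable, and the module block forwards it down to a variable the child module declares itself.

```hcl
# root variables.tf
variable "environment" {
  type = string
}

# root terraform.tfvars would contain:
#   environment = "dev"

# root main.tf -- pass the value down explicitly
module "child" {
  source      = "./modules/child"
  environment = var.environment
}

# modules/child/variables.tf -- the child declares its own variable
variable "environment" {
  type = string
}
```

tfvars files are only read for the root module; child modules receive values exclusively through arguments on the module block.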
I am trying to assign an output value from a module to a variable so that I can conveniently use it as a local variable. Is there another way?
variable "vpc_id" {
  default = "${module.vpc.vpc_id}"
}
The error I am getting is:
Error: Unsupported argument
on main.tf line 22, in variable "subnetid_private":
22: default = "${module.vpc.subnet_private}"
Variables may not be used here..
I spent a good amount of time googling this but could not find any example. Am I missing something here? This is a pretty standard convenience feature in most languages.
You can use locals instead:
https://www.terraform.io/docs/configuration/locals.html
locals {
  vpc_id = module.vpc.vpc_id
}
and later reference it as local.vpc_id
I had the same thought when I first started using Terraform.
The problem is naming. A Terraform variable block is more like a final or constant constructor or method parameter in other languages.
From the Local Value documentation, it says this:
A local value assigns a name to an expression, allowing it to be used multiple times within a module without repeating it.
Comparing modules to functions in a traditional programming language: if input variables are analogous to function arguments and outputs values are analogous to function return values, then local values are comparable to a function's local temporary symbols.
When you write:
variable "vpc_id" {
}
Terraform says "ah, you'd like callers of this module to be able to pass in a string called vpc_id". Definitely not what you are looking for.
For the kind of case you have, a Local Value is what Terraform provides:
locals {
  vpc_id = module.vpc.vpc_id
}
A note on "using variables in variables": even in Terraform 0.12, a variable's default cannot reference another variable; defaults must be constant values. The double dollar sign below is an escape sequence, so var2's default is the literal string ${variable.var1}, not an interpolation:
variable "var1" {
  type    = string
  default = "this is var 1"
}

variable "var2" {
  type    = string
  default = "$${variable.var1}"
}
Applying this succeeds, but only because nothing is actually interpolated:
$ terraform apply
Apply complete! Resources: 0 added, 0 changed, 0 destroyed.
I am trying to write Python 2.7 code that will
Dynamically load a list / array of modules from a config file at startup
Call functions from those modules. Those functions are also designated in a config file (maybe the same config file, maybe a different one).
The idea is my code will have no idea until startup which modules to load. And the portion that calls the functions will have no idea which functions to call and which modules those functions belong to until runtime.
I'm not succeeding. A simple example of my situation is this:
The following is abc.py, a module that should be dynamically loaded (in my actual application I would have several such modules designated in a list / array in a config file):
def abc_fcn():
    print("Hello World!")

def another_fcn():
    print("BlahBlah")
The following is the .py code which should load abc.py (my actual code would need to import the entire list / array of modules from the config file). Both this .py file and abc.py are in the same folder / directory. Please note comments next to each statement.
module_to_import = "abc" #<- Will normally come from config file
fcn_to_call = "abc.abc_fcn" #<- Will normally come from config file
__import__(module_to_import) #<- No error
print(help(module_to_import)) #<- Works as expected
eval(fcn_to_call)() #<- NameError: name 'abc' is not defined
When I change the second line to the following...
fcn_to_call = "abc_fcn"
...the NameError changes to "name 'abc_fcn' is not defined".
What am I doing wrong? Thanks in advance for the help!
__import__ only returns the module specified; it does not add it to the global namespace. So to accomplish what you want, save the result in a variable and then dynamically retrieve the function you want. That could look like:
fcn_to_call = 'abc_fcn'
mod = __import__(module_to_import)
func = getattr(mod, fcn_to_call)
func()
On a side note, abc is the name of the Abstract Base Classes module built into Python, although I know you were probably just using it as an example.
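An alternative sketch using importlib.import_module (available in Python 2.7 as well), which returns the module object directly and handles dotted paths more cleanly than __import__. The module and function names below are stand-ins for values read from your config file:

```python
import importlib

# Stand-ins for values that would come from the config file
module_name = "math"
function_name = "sqrt"

# import_module returns the module object itself
mod = importlib.import_module(module_name)

# getattr looks the function up by its string name
func = getattr(mod, function_name)

result = func(16.0)
print(result)  # -> 4.0
```

This avoids eval entirely, which also sidesteps the NameError you hit: the looked-up objects live in ordinary variables rather than needing names in the global namespace.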
You should assign the return value of __import__ to a variable abc so that you can actually use it as a module.
abc = __import__(module_to_import)
I have a Terraform structure like:
prod
nonprod
applications
+-- continuous/common/iam/iam.tf <-- create the role
+-- dataflow/firehose/firehose.tf <-- want to refer to the role created above
I don't know how to do this. In the IAM module's .tf file I have:
resource "aws_iam_role" "policy_daily_process_role" {
  # ...
}

output "svc_daily_process_role_arn" {
  value = "${aws_iam_role.policy_daily_process_role.arn}"
}
I am not sure how (or if) I can then refer to the svc_daily_process_role_arn from the firehose's .tf.
My understanding is that you already use modules to manage your Terraform code.
In that case, there should be at least two modules:
continuous/common
dataflow/firehose
In the continuous/common module, you have this defined in output.tf:
output "svc_daily_process_role_arn" {
  value = "${aws_iam_role.policy_daily_process_role.arn}"
}
So you create the resources with the common module first:
module "common" {
  source = "./continuous/common"
  # ...
}
Now you can reference the output from module common like this:
module "firehose" {
  source        = "./dataflow/firehose"
  some_variable = "${module.common.svc_daily_process_role_arn}"
  # ...
}
Please go through the document below for a better understanding:
https://www.terraform.io/docs/modules/usage.html#outputs
Using Terraform Modules.
https://www.terraform.io/docs/modules/usage.html
From the top level, make a call to the two subdirectories.
In module 1 (your IAM role), add an output like you have, and make sure module 1 itself exposes it.
In module 2, reference it via ${module.<module1_name>.<output_name>}.
If you're not using modules (or even if you are), you can use remote state: save your state in S3 or Consul and then refer to it from anywhere in your code.
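A sketch of the remote-state approach, assuming the IAM stack stores its state in an S3 bucket; the bucket, key, and region values are illustrative:

```hcl
# In the firehose configuration: read the IAM stack's state
data "terraform_remote_state" "common" {
  backend = "s3"
  config = {
    bucket = "my-terraform-state"                  # illustrative
    key    = "continuous/common/terraform.tfstate" # illustrative
    region = "us-east-1"
  }
}

# Any output of the IAM stack is then available under .outputs
# e.g. data.terraform_remote_state.common.outputs.svc_daily_process_role_arn
```

On Terraform 0.12+ the outputs are accessed via .outputs as above; on 0.11 and earlier they hang directly off the data source (data.terraform_remote_state.common.svc_daily_process_role_arn).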