I'm learning more about Terraform and AWS. I've seen some code on Stack Overflow: "Outputs not displaying when using modules".
Main.tf
module "identity-provider" {
source = "./modules/identity-provider"
}
module "saml-role1" {
source = "./modules/saml-roles/"
}
Module file
resource "aws_iam_role" "role1" {
name = "saml-role1"
description = "Blah Blah"
path = "/"
assume_role_policy = "${data.aws_iam_policy_document.assume_role.json}"
permissions_boundary = ""
max_session_duration = 43200
resource "aws_iam_role_policy_attachment" "Read-Only" {
role = "${aws_iam_role.role1.name}"
policy_arn = "arn:aws:iam::aws:policy/ReadOnlyAccess"
}
Output.tf
output "Role1-ARN" {
value = "${module.saml-role1.arn}"
}
My question is:
What is the significance of the output.tf file, and how will it affect our code if such a file doesn't exist?
Why outputs are used (my understanding):
Terraform output values allow you to export structured data about your resources.
Outputs are also necessary to share data from a child module to your root module, and to get information about the resources you have deployed.
Output values are similar to return values in programming languages.
What is the significance of the output.tf file? According to the docs:
Output values have several uses:
A child module can use outputs to expose a subset of its resource attributes to a parent module.
A root module can use outputs to print certain values in the CLI output after running terraform apply.
When using remote state, root module outputs can be accessed by other configurations via a terraform_remote_state data source.
Output declarations can appear anywhere in Terraform configuration files. However, putting them into a separate file called outputs.tf makes it easier for users to understand your configuration and what outputs to expect from it.
When you run terraform apply you can see those values in your terminal; they will also be present in your project's state. Use the terraform output command to query all of them.
How will it affect our code if such a file doesn't exist?
It will simply not print resource info on the command line, and a parent module won't be able to reference values from the child module.
Putting outputs in a file called outputs.tf is just a convention. You don't have to do it; you may as well put your outputs in main.tf. But using outputs.tf can be convenient if your Terraform configuration is large: it's easy to find and inspect your outputs.
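In the question's code, module.saml-role1.arn will only resolve if the child module itself declares an output named arn. A minimal sketch, assuming the aws_iam_role.role1 resource shown in the module file:

# modules/saml-roles/outputs.tf (inside the child module)
output "arn" {
  description = "ARN of the SAML role created by this module"
  value       = aws_iam_role.role1.arn
}

# Root module outputs.tf - now the reference works
output "Role1-ARN" {
  value = "${module.saml-role1.arn}"
}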
Related
I have the following configuration to create aws_cloudfront_origin_access_identity
resource "aws_cloudfront_origin_access_identity" "example" {
comment = "Some comment"
}
How do I find a data source for the OAI from a different configuration?
For example, I have a CloudFront distribution and I need to set cloudfront_access_identity_path:
resource "aws_cloudfront_distribution" "s3_distribution" {
origin {
domain_name = "abcd"
origin_id = "foobar"
s3_origin_config {
origin_access_identity = "how do i get cloudfront_access_identity_path here?"
}
}
I cannot use aws_cloudfront_origin_access_identity.example.cloudfront_access_identity_path because it's in a different configuration.
I can access the data if I know the ID; however, the ID can change in the future:
data "aws_cloudfront_origin_access_identity" "example" {
id = "EDFDVDB123BHDS7"
}
What are my options to dynamically query aws_cloudfront_origin_access_identity data source?
If they are managed by different state files, you can't directly reference resources between them. The exception is if the configuration with the OAI uses a remote state; then you can use terraform_remote_state to reference that state's outputs in your code.
Alternatively, you can develop your own external data source which would fetch the OAI from your account and return it to your configuration.
But the most common way is simply to pass the OAI as an input variable to your tf files. If you want to do it automatically, you have to develop some script or wrapper around your tf code that will pass the OAI variable automatically.
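A rough sketch of the terraform_remote_state approach (the backend type, bucket, key, and output name below are assumptions, not taken from the question):

# In the configuration that owns the OAI, expose it as an output:
output "oai_path" {
  value = aws_cloudfront_origin_access_identity.example.cloudfront_access_identity_path
}

# In the configuration that owns the distribution, read that state:
data "terraform_remote_state" "cdn" {
  backend = "s3"
  config = {
    bucket = "my-terraform-state"      # assumed bucket name
    key    = "cdn/terraform.tfstate"   # assumed state key
    region = "us-east-1"               # assumed region
  }
}

# ...and reference the output inside s3_origin_config:
# origin_access_identity = data.terraform_remote_state.cdn.outputs.oai_path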
It looks like there is a data source for aws_cloudfront_origin_access_identities now, which lets you search for identities by matching their comments.
So theoretically (actually, just tested) you could use the result from this data lookup in the aws_cloudfront_origin_access_identity data lookup, as long as the OAI you are looking for has a unique comment.
data "aws_cloudfront_origin_access_identities" "identities" {
comments = ["some unique comment"]
}
data "aws_cloudfront_origin_access_identity" "oai" {
id = one(data.aws_cloudfront_origin_access_identities.identities.ids)
}
It was introduced in AWS Provider version 4.12.0
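Tying that back to the distribution from the question, the resolved identity path could then be plugged in like this (a sketch; the remaining required distribution arguments are omitted, just as in the question):

resource "aws_cloudfront_distribution" "s3_distribution" {
  origin {
    domain_name = "abcd"
    origin_id   = "foobar"
    s3_origin_config {
      origin_access_identity = data.aws_cloudfront_origin_access_identity.oai.cloudfront_access_identity_path
    }
  }
  # ... other required distribution arguments
}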
I'm creating a Lambda function using an existing module. Currently I refer to a static ARN inside the Step Function definition JSON file. I need to reference it dynamically, i.e. whatever ARN is created at apply time. Here's the code:
module "student_lambda"{
source = git#github...// Some git repo
//Other info like vpc, runtime, memory, etc
}
How can I reference this student_lambda ARN in my JSON file for the Step Function?
Here's the JSON file snippet:
"Student lambda response": {
"Type":"Task",
"Resource":"arn:aws:states:::lambda:invoke",
"Parameters":{
"Payload"......// Generic code
"FunctionName":"arn:aws:lambda:us-east-2:..."// Here I want to use something like Student_Lmabda.arn
}}
Note: the module is declared in the main.tf file. The Step Function JSON file is in another folder.
I am assuming the file structure is something like this.
=====================
main.tf
variables.tf
folder with json/
-json file
modules
=====================
To achieve this, we can create an output for the Lambda function inside the module's output.tf file.
output "lambda_arn" {
value = aws_lambda.<name of the lambda fn>.arn
}
Once this is done, we can reference the value using:
"Student lambda response": {
"Type":"Task",
"Resource":"arn:aws:states:::lambda:invoke",
"Parameters":{
"Payload"......
"FunctionName":"${module.student_lambda.lambda_arn}"
}}
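Note that a plain JSON file is not interpolated by itself; one common approach (a sketch - the template path, template variable name, state machine name, and IAM role are all assumptions) is to turn the file into a template and render it with templatefile(), passing the ARN in as a template variable:

# In the JSON template (e.g. step-functions/definition.json.tpl) use a template variable:
#   "FunctionName": "${lambda_arn}"

resource "aws_sfn_state_machine" "student" {
  name     = "student-state-machine"          # assumed name
  role_arn = aws_iam_role.step_function.arn   # assumed IAM role for the state machine

  definition = templatefile("${path.module}/step-functions/definition.json.tpl", {
    lambda_arn = module.student_lambda.lambda_arn   # the output added above
  })
}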
I have a Glue job resource defined in module A; now I want to reference it and use the job name in module B. How can I achieve this?
I tried something like this in module B:
variable "example_glue_name" {
type = string
}
data "aws_glue_job" "example_glue_name" {
example_glue_name = var.example_glue_name
}
then:
module "B"{
source = .....
...
example_glue_name = module.A.example_glue_name
}
But I got the error: Unsupported argument: An argument named "example_glue_job_name" is not expected here. I read the docs here: https://registry.terraform.io/providers/hashicorp/aws/latest/docs/data-sources/glue_script
It looks like Terraform doesn't support data "aws_glue_job"; it only has something like data "aws_glue_script", but it's not clear how to reference the Glue job name.
Does anyone know the correct way to do this? Thanks.
There is no data source called aws_glue_job. The only one that exists is aws_glue_script. You would have to create an external data source, which would require a custom script, in Bash for instance, as shown in the docs.
The script would use the AWS CLI's get-job command to query AWS Glue for the job of interest and return its details to your Terraform configuration for further processing.
But if module A creates the job, then it should output its name. Once it's output, you can reference it as an input to module B.
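A sketch of that wiring (the resource, output, and variable names here are assumptions):

# modules/A/outputs.tf
output "glue_job_name" {
  value = aws_glue_job.example.name   # assumes module A's Glue job resource is labelled "example"
}

# modules/B/variables.tf
variable "glue_job_name" {
  type = string
}

# Root module
module "B" {
  source        = "./modules/B"        # assumed path
  glue_job_name = module.A.glue_job_name
}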
I am trying to do the following:
module "git_file" {
  source = "git::https://githubXX.com/abc.js"
}
data "archive_file" "init" {
  type     = "zip"
  git_file = "${module.git_file.source}"
}
I am not able to make the above work, no matter if I use https:// or ssh://.
How do you source a JS file as a module in Terraform?
A module block is for loading Terraform modules and their associated resources into your configuration under a particular module path. It cannot be used the way you intend.
To call a module means to include the contents of that module into the
configuration with specific values for its input variables. Modules
are called from within other modules using module blocks:
module "servers" {
source = "./app-cluster"
servers = 5
}
Source: Calling a Child Module - Modules - Configuration Language - Terraform Docs
It's somewhat like import, require, or include in other languages. It cannot be used to download a file for use in a Terraform module.
You could use the http data source to do what you describe:
data "http" "git_file" {
url = "https://githubXX.com/abc.js"
}
data "archive_file" “init” {
type = "zip"
git_file = data.http.git_file.body
}
This is also unlikely to work as you expect. You would definitely need a raw source link to GitHub for it.
You should consider an alternative solution: either keep abc.js in the same repository, or use a null_resource with a local-exec provisioner to download it with a script.
resource "null_resource" "" {
provisioner "local-exec" {
command = "git clone https://github.com/..."
}
}
Then you'll have the files locally for your use, the same way you would if you had run git clone in your own shell. I don't recommend this; it is brittle and will likely interact strangely with other tools.
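If you do go that route anyway, the cloned file could then feed the archive, for example (a sketch; the clone path and output path are assumptions, and the null_resource label matches the block above):

data "archive_file" "init" {
  type        = "zip"
  source_file = "${path.module}/abc/abc.js"   # assumed location of the file after the clone
  output_path = "${path.module}/init.zip"     # assumed output path
  depends_on  = [null_resource.git_clone]     # ensure the clone runs first
}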
This is setup.tf
data "google_compute_network" "selected" {
name = "${var.network}"
}
It's very basic. I just want to create a network in Google Cloud.
I run this with:
terraform apply -var 'network=net1'
But I still got an error like:
Error: resource 'data.google_compute_network.selected' config: unknown variable referenced: 'network'; define it with a 'variable' block
When I don't use variables it works as expected.
I guess you should have the variable defined so Terraform doesn't complain about it:
variable "network" {
description = "your description goes here"
type = "string/map/list/boolean"
default = "default value here"
}
You can put this in your main file, or maybe in a separate file called input.tf, but it just has to be present in the same directory.
terraform apply -var 'your-var=your-value' will override the default value from the variable block.
Terraform Doc: https://www.terraform.io/docs/configuration/variables.html
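For this specific case, a minimal pair of files might look like this (the description text is an assumption; the syntax matches the pre-0.12 style used in the question):

# variables.tf
variable "network" {
  description = "Name of the Google Compute network to look up"
  type        = "string"
}

# setup.tf
data "google_compute_network" "selected" {
  name = "${var.network}"
}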