Share AWS resources between Terraform and CDK

My team has two completely separate environments: a Terraform one, which allows us to create and manage some AWS resources such as databases, and a CDK one, which contains the API resources and their logic.
We would like to use the database resources created with Terraform in the CDK app.
I was looking for a simple way to import the outputs or tfstate from Terraform into the CDK app, but I've found nothing.
I'd like to know how you'd achieve something like that.

So, I finally solved this by using the tfstate file in CDK: our remote backend is AWS, so the tfstate is stored on S3. When we run the CDK app, we fetch this file from S3 and inject its outputs into an application service.
This way we always get the up-to-date outputs of the resources managed with Terraform.
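A minimal sketch of that fetch-and-parse step (the bucket, key, and output name are hypothetical placeholders; match them to your own S3 backend configuration):

```shell
# Real fetch (placeholders — match your S3 backend config):
#   aws s3 cp s3://my-tf-state-bucket/prod/terraform.tfstate terraform.tfstate

# For illustration, a minimal tfstate with a single output:
cat > terraform.tfstate <<'EOF'
{"version": 4, "outputs": {"db_endpoint": {"value": "mydb.xxxx.eu-west-1.rds.amazonaws.com", "type": "string"}}}
EOF

# Extract an output value (stdlib only, no jq dependency):
python3 -c "import json; print(json.load(open('terraform.tfstate'))['outputs']['db_endpoint']['value'])"
```

The `outputs` map in a version-4 tfstate holds each output under its name, with the actual value nested in `value`.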

Related

Terraform AWS: Is it possible to import all AWS resources in one go?

I recently started working on an old AWS infrastructure and have been trying to learn the current state of each service through "terraform import", one resource at a time, but I would like to import all services with Terraform in one go, to save time and to be sure I have covered all AWS services.
So, is it possible to import all AWS resources in one go?
Thanks a lot in advance for your help.
I had to do this recently myself and I would highly recommend terraformer.
A CLI tool that generates tf/json and tfstate files based on existing infrastructure (reverse Terraform).
It supports quite a lot of Terraform providers. I just tested it for AWS and it works like a charm. ;)
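For reference, a typical invocation looks something like this (the resource types and region are placeholders; see the terraformer README for the full list of supported AWS resources):

```shell
# Generate .tf files and a tfstate for existing RDS and VPC
# resources in one region:
terraformer import aws --resources=rds,vpc --regions=eu-west-1
```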
So, to your question: is it possible to import all AWS resources in one go?
Not directly with TF. But you could use a third-party, open-source tool called Former2, which can generate TF code from existing resources:
Generate CloudFormation / Terraform / Troposphere templates from your existing AWS resources

Extract Entire AWS Setup into storable Files or Deployment Package(s)

Is there some way to 'dehydrate' or extract an entire AWS setup? I have a small application that uses several AWS components, and I'd like to put the project on hiatus so I don't get charged every month.
I built the app directly through the various services' consoles, such as VPN, RDS, etc. Is there some way I can extract my setup into files, so I can keep these files in version control and 'rehydrate' them back into AWS when I want to set my app up again?
I tried extracting pieces from Lambda and EventBridge, but it seems like I can't just 'replay' these files using the CLI to re-create my application.
Specifically, I am looking to extract all code, settings, connections, etc. for:
Lambda: code, env variables, layers, scheduling through EventBridge
IAM: users, roles, permissions
VPC: subnets, route tables, internet gateways, Elastic IPs, NAT gateways
EventBridge: cron settings, connections to Lambda functions
RDS: MySQL instances. I would like to get all the DDL; data in the tables is not required.
Thanks in advance!
You could use Former2. It will scan your account and let you generate CloudFormation, Terraform, or Troposphere templates. It primarily runs as a browser plugin, but there is also a CLI for it.
What you describe is called Infrastructure as Code. The idea is to define your infrastructure as code and then deploy your infrastructure using that "code".
There are a lot of options in this space. To name a few:
Terraform
CloudFormation
CDK
Pulumi
All of these should allow you to import already existing resources. Terraform, at least, has an import command to bring an existing resource into your IaC project.
This way you could create a project that mirrors what you currently have in AWS.
Excluded are things that, strictly speaking, are not AWS resources, like:
Code of your Lambdas
MySQL DDL
Depending on the Lambda deployment "strategy", the code either lives on S3 or was uploaded directly to the Lambda service. In the first case, you just need to find the S3 bucket and download the code from there. In the second case, you might need to copy and paste it by hand.
When it comes to your MySQL DDL, you need a tool to export that. There are plenty of tools out there that can do this.
Once you have done that, you should be able to destroy all the AWS resources and deploy them again later from your new IaC.
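Those two non-IaC exports can be sketched like this (the function name, DB endpoint, user, and database are placeholders):

```shell
# Download a function's deployment package via the presigned URL
# that get-function returns:
url=$(aws lambda get-function --function-name my-function \
      --query 'Code.Location' --output text)
curl -o my-function.zip "$url"

# Export only the DDL (no table data) from an RDS MySQL instance:
mysqldump --no-data -h mydb.xxxx.rds.amazonaws.com -u admin -p mydb > schema.sql
```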

Using an old tfstate for terraform destroy purposes

I created AWS Elastic Beanstalk resources using Terraform, with S3 as the backend for storing the tfstate. I'm reusing the same Terraform infra code to deploy the same resources with different properties, like a different instance type, security groups, etc.
My question: is there a way I can still destroy the previous Beanstalk infra created by the same Terraform code? Maybe by referring to the tfstate files in S3 and then running terraform destroy? Thanks in advance for your answers.
If your codebase has the Terraform S3 backend configured and it points at the state containing the resources you would like to destroy, you can run terraform destroy and review the removal plan.
You can also simply run terraform apply, and Terraform will converge the previously existing infrastructure to the newly desired one, without an intermediate destroy run.
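A sketch of that destroy run, assuming the S3 backend values are passed at init time (bucket, key, and region are placeholders):

```shell
# Point Terraform at the old state in S3, then destroy:
terraform init \
  -backend-config="bucket=my-tf-state-bucket" \
  -backend-config="key=old-env/terraform.tfstate" \
  -backend-config="region=eu-west-1"
terraform plan -destroy   # review what would be removed
terraform destroy
```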

Setting up CodePipeline with Terraform

I am new to Terraform and building a CI setup. When I want to create a CodePipeline that is going to be connected to a GitHub repo, do I run specific commands inside my Terraform codebase that will reach out to AWS and create the CodePipeline config/instance for me? Or would I set this CodePipeline up manually inside AWS console and hook it up to Terraform after the fact?
do I run specific commands inside my Terraform codebase that will reach out to AWS and create the CodePipeline config/instance for me?
Yes, you use the aws_codepipeline resource, which will create a new pipeline in AWS.
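An abbreviated sketch of such a resource (the IAM role, artifact bucket, CodeStar connection, and CodeBuild project are assumed to be defined elsewhere in your configuration; all names are placeholders):

```hcl
resource "aws_codepipeline" "ci" {
  name     = "my-pipeline"
  role_arn = aws_iam_role.pipeline.arn

  artifact_store {
    location = aws_s3_bucket.artifacts.bucket
    type     = "S3"
  }

  stage {
    name = "Source"
    action {
      name             = "GitHub"
      category         = "Source"
      owner            = "AWS"
      provider         = "CodeStarSourceConnection"
      version          = "1"
      output_artifacts = ["source_output"]
      configuration = {
        ConnectionArn    = aws_codestarconnections_connection.github.arn
        FullRepositoryId = "my-org/my-repo"
        BranchName       = "main"
      }
    }
  }

  stage {
    name = "Build"
    action {
      name            = "Build"
      category        = "Build"
      owner           = "AWS"
      provider        = "CodeBuild"
      version         = "1"
      input_artifacts = ["source_output"]
      configuration = {
        ProjectName = aws_codebuild_project.build.name
      }
    }
  }
}
```

A pipeline needs at least two stages; the CodeStar connection here stands in for the GitHub hookup mentioned in the question.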
Or would I set this CodePipeline up manually inside AWS console and hook it up to Terraform after the fact?
You can also import existing resources into Terraform.
I see you submitted this eight months ago, so I am pretty sure you have your answer, but for those who come across this question while searching, here are my thoughts on it.
As most of you will have found while researching, Terraform is infrastructure as code (IaC). As IaC, it needs to be executed somewhere: either locally or inside a pipeline. A pipeline typically consists of containers that emulate a local environment and run commands to deploy your code. There is more to it than that, but the premise of understanding how Terraform runs remains the same.
So, to the magic question: Terraform is code, and if you intend to use a pipeline (Jenkins, AWS, GitLab, and more), then you need a code repository to put all that code into; in this case, a repository where you can store your Terraform code so a pipeline can consume it when deploying. There are other reasons to use a code repository, but your question is about Terraform and its usage with a pipeline.
Now for the classic chicken-or-egg argument: when to create your pipeline, and how. To your original question: you could do both. You could store all your Terraform code in a repository (which I recommend), clone it down, and run Terraform locally to create your pipeline. This is ideal for saving time and leveraging automation. Newcomers: you will also have to research Terraform state files, which you need to back up in some form or shape once the pipeline has been deployed for you.
If you are not so comfortable with Terraform, the GUI in AWS is also fine, and you can easily configure it to hook your pipeline into GitHub to run jobs.
In both scenarios, you must set up Terraform and AWS locally on your machine or within the pipeline to deploy your code. This article is pretty good and will give you a basic understanding of setting up Terraform.
Don't forget to configure AWS on your local machine. Newcomers using a pipeline can leverage some of the pipeline links to get started. Remember one thing: within AWS CodePipeline you have to use IAM roles, not access keys. That will make more sense once you have gone through the first link. Please also go to YouTube and search "Terraform for beginners in AWS"; various videos can provide a lot more substance to help you get started.
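The local setup mentioned above boils down to something like this (inside CodePipeline the first step is replaced by an IAM role):

```shell
aws configure       # one-time local credential setup
terraform init      # download providers and configure the backend
terraform plan      # preview the changes
terraform apply     # deploy
```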

How to map AWS resources to Terraform code

When I write Terraform code for AWS resources I have not used before, I am sometimes unsure how to define my components. I then often just apply some component and modify it using the AWS GUI. Then I use terraform plan to see my changes and adapt my code so that it corresponds to the existing AWS infrastructure.
I wonder whether there is a more direct way to use the AWS GUI to define Terraform code. Is there any way to just map an existing AWS infrastructure to Terraform code?
What you want is terraform import.
Terraform is able to import existing infrastructure. This allows you to take resources you've created in AWS via the GUI and bring them under Terraform management.
The current implementation of terraform import can only import resources into the state file; it does not generate configuration. It does, however, give you a starting point from which to write the configuration for your resources.
This is how to import an instance with ID i-abcd1234 to the address aws_instance.bar:
terraform import aws_instance.bar i-abcd1234
https://www.terraform.io/docs/import/
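Putting the pieces together, the import workflow looks roughly like this (the resource address and instance ID are taken from the example above; the stub arguments are placeholders to be corrected after the first plan):

```shell
# Write a minimal stub for the resource to be imported:
cat > main.tf <<'EOF'
resource "aws_instance" "bar" {
  ami           = "ami-00000000"   # placeholder, correct after the first plan
  instance_type = "t3.micro"       # placeholder, correct after the first plan
}
EOF

terraform init
terraform import aws_instance.bar i-abcd1234
terraform plan   # shows the drift between the stub and the real instance
```

Iterating on `terraform plan` until it reports no changes is exactly the manual adapt-the-code loop described in the question.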