CodePipeline: how to deploy multiple CFN stacks

tl;dr: I can deploy a single CFN stack as part of my pipeline, but how do I deploy multiple dynamic stacks?
An even better tl;dr: how would you do this? forEach BuildStage.artifact, invoke CloudFormation.build
I am using CodePipeline in a pretty conventional way. The goal is to source control my CloudFormation templates, push them through the pipeline when a template changes, and then automatically deploy the stack.
Source Stage (CodeCommit commit my CFN yaml templates)
Build Stage (CodeBuild finds the new files from the commit, and pushes them to S3)
Deploy Stage (CloudFormation deploys my templates as CFN stacks)
Almost everything is working great. I commit my template changes to CodeCommit, and the build stage runs my CodeBuild gatekeeper, which gathers only the files that have changed and uploads them to S3. So far so good.
The challenge is that sometimes I have one template change, and sometimes I have multiple (n). I can detect changed files and get them up to S3 no problem in my build stage. If I commit a change for one template, everything works fine: I can create an exported variable with my template's S3 location, pass that to my deploy stage, and have the CloudFormation deploy action use that file as the template source. But how would I handle this if I have 2 templates?
I can't just create endless exported variables in my build stage.
And even if I could, AFAIK there is no way to iterate over each entry in the deploy stage.
My thought is I would need to do one of the following:
Inside my current buildspec (after I upload the files to S3), use the AWS CLI to invoke a CFN stack build. I can do this in a loop, so it iterates over each file to be uploaded. OR
After my build stage, use a Lambda to do the same as #1: loop through each file, then use the CLI or SDK to invoke a CFN stack build.
Both of these options seem to defeat the purpose of the deploy stage altogether, which seems clunky to me.
Are there other options I am missing? What would you do?

Just want to answer my own question, in case anyone else is trying to figure out how to do this.
I ended up going with option 1: just doing a CLI CFN deployment directly from within CodeBuild. I was really trying to shoehorn in the idea of using a CodePipeline deploy stage, but this works just fine.
If anyone else ends up coming along with a better solution, I am all ears.
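For anyone after the mechanics, here is a minimal sketch of that CodeBuild step. The changed-templates.txt file and the stack-naming convention are assumptions standing in for whatever your gatekeeper script produces:

```bash
#!/usr/bin/env bash
set -euo pipefail

# Assumed: the gatekeeper step has already written the changed template
# paths (one per line) to changed-templates.txt.
while IFS= read -r template; do
  # Derive a stack name from the file name, e.g. network.yaml -> network.
  stack_name="$(basename "${template%.*}")"

  echo "Deploying ${template} as stack ${stack_name}"
  # 'deploy' creates the stack if it doesn't exist and updates it otherwise;
  # --no-fail-on-empty-changeset keeps the build green when nothing changed.
  aws cloudformation deploy \
    --template-file "${template}" \
    --stack-name "${stack_name}" \
    --capabilities CAPABILITY_NAMED_IAM \
    --no-fail-on-empty-changeset
done < changed-templates.txt
```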

Related

How to have a simple manual ECS deployment in CodePipeline / CodeDeploy?

Basically, I would like to have a simple manual deploy step that's not directly linked to a build. As a use case: when using containers, I wouldn't want to perform a separate build per environment (e.g., once my build puts an image tag in ECR, I would like to deploy that tag to any number of environments).
Now, I know in CodePipeline I can have a number of actions and I can precede them with manual approval.
The problem with that is that if I don't want to perform the last manually approved deploy, subsequent executions pile up: the pipeline execution never completes, and whatever comes next just has to wait. I can set a timeout, for sure, but there are moments when 20 builds come in fast and I don't know which of them I may want to deploy to which environment (they generally all go to some QA/staging environment, but some need to be manually deployed to a particular dev-related environment or even to production).
Manually updating task definitions all around in ECS is tedious.
I have a solution where I can manually patch a task definition using the AWS CLI and yq, but is there a way to have a simple pipeline with one step that takes a manual input (i.e., an image tag) and either uses an ECS deploy action (the only place where you can provide a clean, straight JSON patch for the task definition) or uses my yq script to deploy?
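For reference, the patch-and-deploy flow described above can be sketched roughly like this. Cluster, service, family, and repository names are placeholders, and it uses jq instead of yq since the CLI emits JSON:

```bash
#!/usr/bin/env bash
set -euo pipefail

IMAGE_TAG="$1"            # the manual input, e.g. "1.4.2"
CLUSTER="my-cluster"      # placeholder
SERVICE="my-service"      # placeholder
FAMILY="my-task-family"   # placeholder
REPO_URI="123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app"  # placeholder

# Fetch the current task definition, strip the read-only fields that
# register-task-definition refuses to accept, and swap in the new image.
aws ecs describe-task-definition --task-definition "${FAMILY}" \
  --query taskDefinition --output json \
| jq --arg img "${REPO_URI}:${IMAGE_TAG}" '
    .containerDefinitions[0].image = $img
    | del(.taskDefinitionArn, .revision, .status, .requiresAttributes,
          .compatibilities, .registeredAt, .registeredBy)' \
> new-task-def.json

NEW_ARN="$(aws ecs register-task-definition \
  --cli-input-json file://new-task-def.json \
  --query taskDefinition.taskDefinitionArn --output text)"

# Point the service at the new revision; ECS does a rolling deployment.
aws ecs update-service --cluster "${CLUSTER}" --service "${SERVICE}" \
  --task-definition "${NEW_ARN}"
```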

AWS CloudFormation nested stack templates: how to handle local development and versioning?

I just started a simple AWS serverless project to test things out, so I'm developing locally and hosting the project on GitLab.
I want to try nested stacks, just to split the current template file into smaller pieces, but the TemplateURL property value must be a URL to a template file located in an S3 bucket, so I can't simply move my stack resources to another local YAML file and include it in the parent one.
Manually uploading nested stack template files to an S3 bucket and then running sam sync from my console seems too intricate, IMHO, and setting up a pipeline to take care of the whole process also looks like too much work for a simple personal learning project.
The fastest solution seems to be to replace a deployment pipeline with a script that can be run locally.
I know AWS cloud services are meant for enterprise-grade projects, but I'm wondering if there is a simpler, built-in/official way to handle all of this.
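For what it's worth, the CLI has a built-in step that covers most of this: aws cloudformation package uploads locally referenced nested templates to S3 and rewrites TemplateURL for you. A minimal local script (bucket and stack names are placeholders) might look like:

```bash
#!/usr/bin/env bash
set -euo pipefail

BUCKET="my-template-bucket"   # placeholder; must already exist
STACK="my-serverless-stack"   # placeholder

# 'package' walks the parent template, uploads local files referenced by
# TemplateURL (and CodeUri, etc.) to S3, and writes a rewritten template.
aws cloudformation package \
  --template-file template.yaml \
  --s3-bucket "${BUCKET}" \
  --output-template-file packaged.yaml

# Deploy the rewritten template as a normal stack.
aws cloudformation deploy \
  --template-file packaged.yaml \
  --stack-name "${STACK}" \
  --capabilities CAPABILITY_IAM
```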

How to deploy only changed Lambda functions from GitHub with AWS CodePipeline and CloudFormation?

I'm using CodePipeline to deploy my CloudFormation templates that contain Lambda functions as AWS::Serverless::Function resources.
The CodePipeline is triggered by a commit in my main branch on GitHub.
The Source Stage in the CodePipeline retrieves the source files from GitHub. Zero or more Lambda functions could change in a commit. There are several Lambda Functions in this repository.
I intend to run the CloudFormation templates through taskcat and the Lambda Python code through unit tests during a test stage, and then deploy the CloudFormation templates and Lambda functions to production. The problem is, I can't figure out how to differentiate between changed and unchanged Lambda functions, or how to automate the deployment of only the changed ones.
I would like to only test and deploy new or update changed Lambda functions along with my CloudFormation templates - what is the best practice for this (ideally without Terraform or hacks)?
Regarding testing: best practice is actually to simply test all Lambda code in the repo on push, before deploying. You might skip some work, for example with GitHub Actions, by testing only the files that have changed, but it definitely takes some scripting and it hardly ever adds much value. Each testing tool has its own way of dealing with this (sometimes you can simply pass the files you want to test as an argument, and then it's easy; other tools take more of an all-or-nothing approach, and it gets quite complicated real fast).
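If you do want to go down that road anyway, the scripting looks roughly like this. It's a sketch that assumes one directory per function under src/, tests in a tests/ subfolder, pytest as the runner, and a build environment with git history available:

```bash
#!/usr/bin/env bash
set -euo pipefail

# Reduce the files changed in the last commit to the set of
# function directories under src/ that they belong to.
changed_dirs="$(git diff --name-only HEAD~1 HEAD -- 'src/' \
  | cut -d/ -f1-2 | sort -u)"

for dir in ${changed_dirs}; do
  echo "Testing ${dir}"
  python -m pytest "${dir}/tests"
done
```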
Also, personally I'm not a big fan of taskcat, since it doesn't really add a lot of value and it's not a very intuitive tool (it's also relatively outdated, IMO). Is there a reason you need these types of tests?
Regarding deployment: There are a few considerations when trying to only update lambdas that have changed.
Firstly, CloudFormation already does this automatically: as long as the CloudFormation resource for the Lambda doesn't change, the Lambda will not be updated.
However, SAM has a small problem there: it re-packages the Lambda code on every pipeline run and updates the CodeUri property of the function. And thus the Lambda gets updated, even though the code might stay the same.
To work around this, you have several options:
Simply accept that SAM updates your function even though the code might not have changed.
Build with SAM's --cached and --cache-dir options in your pipeline (these are sam build flags). Make sure the folder you set as the cache dir is preserved between runs; see the sketch after this list.
Use a different file packaging tool than SAM: either a custom script or another tool that only pushes your code to S3 when the files have changed.
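A sketch of what the cached-build option above looks like as pipeline commands; the cache directory name is arbitrary, it just has to survive between runs (e.g. via CodeBuild's cache settings):

```bash
#!/usr/bin/env bash
set -euo pipefail

# Reuse previous build output for functions whose source hasn't changed.
sam build --cached --cache-dir .aws-sam/cache

# Deploy; --resolve-s3 lets SAM manage the artifact bucket, and unchanged
# artifacts mean CloudFormation leaves those functions alone.
sam deploy \
  --stack-name my-sam-stack \
  --resolve-s3 \
  --capabilities CAPABILITY_IAM \
  --no-confirm-changeset \
  --no-fail-on-empty-changeset
```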
If you're into programming, I'd suggest you take a look at CDK. It's a major upgrade over CloudFormation/SAM, and it handles code bundling better (it only updates a function when its files have changed). The testing options are also much wider with CDK.

How to deploy AWS Lambda using CodePipeline without rebuilding it if there are no function code changes?

We have a pipeline that zips the function code for our Lambdas, uploads it to S3, and then builds every Lambda we have again with the new version of the zipped code.
Now, the problem is that every single Lambda is rebuilt on every pipeline run, even when its code hasn't changed (e.g., only 1 of 10 Lambdas has a code change).
What would be the best approach, or what check should we add to our pipeline, to build only the Lambdas whose code changed? I'm open to any suggestions, even creating new pipelines and breaking these Lambdas apart.
I think the best way is to add a stage before the zipping step that checks which files changed in the last merge.
Simply take those file names and work out which Lambdas were affected.
Then pass along the list of Lambdas that need to be redeployed.
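A sketch of that detection stage, assuming one folder per Lambda under lambdas/, git history available in the build, and functions that are updated in place (bucket and function names are placeholders):

```bash
#!/usr/bin/env bash
set -euo pipefail

BUCKET="my-artifact-bucket"   # placeholder

# Which Lambda folders were touched by the last merge?
affected="$(git diff --name-only HEAD~1 HEAD -- 'lambdas/' \
  | cut -d/ -f2 | sort -u)"

for name in ${affected}; do
  echo "Rebuilding Lambda: ${name}"
  (cd "lambdas/${name}" && zip -r "../../${name}.zip" .)
  aws s3 cp "${name}.zip" "s3://${BUCKET}/${name}.zip"
  # Point the function at the freshly uploaded artifact.
  aws lambda update-function-code \
    --function-name "${name}" \
    --s3-bucket "${BUCKET}" \
    --s3-key "${name}.zip"
done
```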
What we do at our company is have a single pipeline per Lambda/repo. We have a few mono repos that deploy all their Lambdas at once, but still through a single pipeline. If you're concerned about the cost of pipelines sticking around, you could always delete them and have another job recreate them when you need to deploy a new change.
We've got everything defined in CloudFormation templates, so it's all simple scripts running here and there to create pipelines.
Curious: what is the reason to have one pipeline deploy all 10 Lambdas?

How can I create a pipeline as code in AWS CodePipeline?

I am using AWS CodePipeline as my CI/CD tool. I have a pipeline template YAML file in my Git repo, and I wonder how I can link that file to AWS CodePipeline. What I want is for CodePipeline to create/update the pipeline based on my pipeline YAML file in GitHub.
I have searched and tried the AWS console. All I can do is manually create a pipeline via the console and upload the template file. That works, but it is not pipeline as code. If I want to change the stages in the pipeline, I have to manually update the pipeline in the AWS console or via a CloudFormation command.
To give an example: if I need to add a new stage to my pipeline, I'd like to update the YAML file in the GitHub repo and commit it, and have AWS CodePipeline read that YAML file to update itself. I don't want to manually update the stage via the AWS console.
Is there a way to sync the pipeline to my pipeline YAML file under source control?
I have seen a lot of people wondering about this setup where everything is managed via code, and I personally use it too with CodePipeline. Many people have replied already, but let me lay it out in detail so it can help anyone who wants to do this.
There are two ways to achieve this; let me try to explain both options here:
Option 1
Create two separate pipelines:
"Pipeline-1" (responsible for config changes, like adding extra stages to the main pipeline "Pipeline-2"), with two stages, source and deploy (CloudFormation):
source_Config (gitrepo_config) --> deploy_Config_Cfn
"Pipeline-2" (the actual deployment pipeline, with stages like source, build, and deploy, created from resource.yaml):
source_Resource (gitrepo_resource) --> build_Resource --> Deploy_Resource
With this setup, upload the template you use to create the main pipeline, "resource.yaml", to the repo "gitrepo_config".
Upload all the application code to the repo "gitrepo_resource", matching the deployment provider you are using for "Deploy_Resource".
Once this is in place, whenever you want extra stages in the pipeline, you make the change to "resource.yaml" in the Git repo and "Pipeline-1" does the rest.
Option 2 (a little more complex, but let me see if I can explain)
I was using option 1 until I came up with this option.
This second way is essentially 100% code, because even with the option above I had to create "Pipeline-1" manually or via CFN the first time, and go back to the console for later updates.
To overcome this, we can include both pipelines in the same CloudFormation template, "resource.yaml". We only have to execute that CloudFormation stack once, and after that everything else is automatic.
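The one-time bootstrap then comes down to a single CLI call (stack and file names as in the setup above):

```bash
# One-time bootstrap: creates both pipelines. Afterwards, "Pipeline-1"
# keeps the stack in sync with resource.yaml in the repo.
aws cloudformation deploy \
  --template-file resource.yaml \
  --stack-name pipeline-stack \
  --capabilities CAPABILITY_NAMED_IAM
```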
I hope this will be helpful to everyone.
Note: keep in mind that, with both options, if an execution of the resource pipeline ("Pipeline-2") is in progress during a config change, it might be marked as failed. To get around this, you can add a trigger that starts "Pipeline-2" on the success of "Pipeline-1", in addition to the source-code trigger.