I am using AWS CodePipeline as my CI/CD tool. I have a pipeline template yml file in my Git repository and I wonder how I can link that file to AWS CodePipeline. What I want is for CodePipeline to create/update the pipeline based on my pipeline yml file in GitHub.
I have searched and experimented in the AWS console. All I can do is manually create a pipeline via the console and upload the template file. It works, but it is not pipeline as code. If I want to change the stages in the pipeline, I have to manually update the pipeline in the AWS console or via a CloudFormation command.
Let me give an example: if I need to add a new stage to my pipeline, what I'd like to do is update the yml file in the GitHub repo and commit it, and then have AWS CodePipeline read this yml file and update itself. I don't want to manually update the stage via the AWS console.
Is there a way for me to keep CodePipeline in sync with my pipeline yml file under source control?
I have seen a lot of people wondering about this kind of setup, where everything is managed via code, and I personally use it with CodePipeline too. I can see many people have replied, but let me put it here in detail so that it can help anyone who wants to do this.
There are two ways to achieve this; let me try to explain both options here:
Option 1:
Create two separate pipelines:
"Pipeline-1" (responsible for config changes, such as adding extra stages to the main pipeline "Pipeline-2"), with two stages, source and deploy (CloudFormation):
source_Config (gitrepo_config) --> deploy_Config_Cfn
"Pipeline-2" (the actual deployment pipeline, with stages like source, build and deploy, which will be created from "resource.yaml"):
source_Resource (gitrepo_resource) --> build_Resource --> Deploy_Resource
With the above configuration, upload the template you use to create the main pipeline, "resource.yaml", to the repo "gitrepo_config".
Upload all the code to the repo "gitrepo_resource", based on the deployment provider you are using for "Deploy_Resource".
Once this setup is done, whenever you want to add extra stages to the pipeline you can change "resource.yaml" in the Git repo and "Pipeline-1" will do the rest.
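Here is a rough CloudFormation sketch of what the deploy_Config_Cfn stage of "Pipeline-1" could look like. It is only a fragment of a Stages list, not a complete template, and the artifact name "ConfigSourceOutput", the stack name "resource-pipeline" and the role "CloudFormationDeployRole" are placeholders of mine:

    - Name: deploy_Config_Cfn
      Actions:
        - Name: UpdateResourcePipelineStack
          ActionTypeId:
            Category: Deploy
            Owner: AWS
            Provider: CloudFormation
            Version: "1"
          InputArtifacts:
            - Name: ConfigSourceOutput          # output of the source_Config action
          Configuration:
            ActionMode: CREATE_UPDATE           # creates the stack the first time, updates it afterwards
            StackName: resource-pipeline        # the stack that defines "Pipeline-2"
            TemplatePath: ConfigSourceOutput::resource.yaml
            Capabilities: CAPABILITY_NAMED_IAM
            RoleArn: !GetAtt CloudFormationDeployRole.Arn   # assumed deploy role defined elsewhere in the template
          RunOrder: 1

With this in place, the only manual step left is creating "Pipeline-1" itself, which is exactly what Option 2 below removes.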
Option 2 (a little more complex, but let me see if I can explain it):
I was using Option 1 until I came up with this option.
This second way is essentially 100% code, because even with the option above I have to create "Pipeline-1" either manually or via CloudFormation the first time, and for later updates I also need to go to the console.
To overcome this, we can include both pipelines in the same CloudFormation template "resource.yaml"; we only have to execute that CloudFormation stack once, and afterwards everything else is automatic.
I hope this will be helpful to everyone.
Note: keep in mind that with both options, if a config change happens while an execution of the resource pipeline "Pipeline-2" is in progress, that execution might be marked as failed. To overcome this, you can set up an additional trigger that starts "Pipeline-2" on the success state of "Pipeline-1", in addition to the source code trigger.
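As a sketch of that additional trigger (the pipeline names "config-pipeline" and "resource-pipeline" and the events role are placeholders, not names from the setup above), an EventBridge rule in the same template could start "Pipeline-2" whenever "Pipeline-1" succeeds:

    TriggerResourcePipelineOnConfigSuccess:
      Type: AWS::Events::Rule
      Properties:
        EventPattern:
          source: [ aws.codepipeline ]
          "detail-type": [ "CodePipeline Pipeline Execution State Change" ]
          detail:
            state: [ SUCCEEDED ]
            pipeline: [ config-pipeline ]        # the name of "Pipeline-1"
        Targets:
          - Id: StartResourcePipeline
            Arn: !Sub arn:aws:codepipeline:${AWS::Region}:${AWS::AccountId}:resource-pipeline
            RoleArn: !GetAtt EventsInvokePipelineRole.Arn   # role allowed to call codepipeline:StartPipelineExecution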
tldr; I can deploy a single CFN stack as part of my pipeline, but how do I deploy multiple dynamic stacks?
An even better tldr; How would you do this? forEach BuildStage.artifact invoke CloudFormation.build
I am using CodePipeline in a pretty conventional way. The goal is to source control my CloudFormation templates, push them through the pipeline when a template changes, and then automatically deploy the stack.
Source Stage (CodeCommit commit my CFN yaml templates)
Build Stage (CodeBuild finds the new files from the commit, and pushes them to S3)
Deploy Stage (CloudFormation deploys my templates as CFN stacks)
Almost everything is working great. I commit my template changes to CodeCommit, the build stage runs my codeBuild gatekeeper, which gathers only the files that have changed, and uploads them to S3. So far so good.
The challenge is that sometimes I have one template change, and sometimes I have multiple (n). I can detect the changed files and get them up to S3 in my build stage, no problem. If I commit a change for one template, everything works fine: I can create an exported variable with my template's S3 location, pass that to my deploy stage, and have the CloudFormation deploy action use that file as the template source. But how would I handle this if I have 2 templates?
I can't just create endless exported variables in my build stage.
And if I can, AFAIK there is no way to iterate over each entry for the deploy stage.
My thought is I would need to do one of the following:
Inside of my current buildspec (after I upload the files to S3), use the AWS CLI to invoke a CFN stack build. I can add this as part of a loop, so it iterates on each file to be uploaded. OR
After my build stage, use a Lambda to perform the same as #1. Loop through each file, and then use the CLI or SDK to invoke a CFN stack build.
Both of these options seem to defeat the purpose of the deploy stage altogether, which seems clunky to me.
Are there other options I am missing? What would you do?
Just want to answer my own question, in case anyone else is trying to figure out how to do this.
I ended up going with option 1: just doing a CLI CFN deployment directly from within CodeBuild. I was really trying to shoehorn in the idea of using a CodePipeline deploy stage, but this works just fine.
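For anyone curious, here is a rough buildspec sketch of that idea. It assumes the changed templates have already been collected into a local changed/ directory and that a stack name can be derived from the file name; both are my assumptions, not necessarily how the original setup works:

    version: 0.2
    phases:
      build:
        commands:
          - |
            for template in changed/*.yaml; do
              stack_name=$(basename "$template" .yaml)
              # deploy (create or update) one stack per changed template
              aws cloudformation deploy \
                --template-file "$template" \
                --stack-name "$stack_name" \
                --capabilities CAPABILITY_NAMED_IAM \
                --no-fail-on-empty-changeset
            done

Note that the CodeBuild service role then needs permission to call CloudFormation, plus permissions for whatever resources the templates create.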
If anyone else ends up coming along with a better solution, I am all ears.
I have a repo for all my docker stuff.
I would like to store copilot configs here as well, instead of adding a new copilot/ directory to the repo of every micro service.
As far as I know this should be possible.
So now I have one single copilot dir in a separate repo which looks like this:
copilot
  .workspace
  ...
  - some-service
    - manifest.yml
  - other-service
    - manifest.yml
etc. This works, I can add more services and I can deploy them.
However I tried to create a pipeline and that failed. According to the docs the pipeline should be able to handle multiple services, but I don't understand how.
I can run
copilot pipeline init
then I pushed the resulting files to my repo.
Then I tried:
copilot pipeline update
But this returns an error:
ACTION REQUIRED! Go to https://console.aws.amazon.com/codesuite/settings/connections to update the status of connection xy-user-service from PENDING to AVAILABLE.
✘ Failed to create a new pipeline: pipeline-myApp-user-service.
✘ create pipeline: check if changeset is empty: create change set copilot-51ef519a-711b-4126-bfbd-3d618ef824a5 for stack pipeline-myApp-user-service: ValidationError: Template format error: Unrecognized resource types: [AWS::CodeStarConnections::Connection]
status code: 400, request id: 8a87f62a-ae14-4fe3-9a3e-8b965d2af794: describe change set copilot-51ef519a-711b-4126-bfbd-3d618ef824a5 for stack pipeline-myApp-user-service: ValidationError: Stack [pipeline-myApp-user-service] does not exist
status code: 400, request id: 44927d7e-2514-466a-94ff-51e932042737
The xy-user-service connection didn't exist; the list of connections was empty. I tried to create it, linking my Bitbucket account to AWS, but the error is still there...
What am I doing wrong?
Am I supposed to run copilot app init in the root dir of each and every micro service (they are in separate repos) and then should I create a separate pipeline for each?
Is it not possible to just store copilot configs in one place?
Thanks in advance!
It sounds like you have one repo with all of your configs, and separate repos for each service's source code.
If this is the case, the pipeline is not working because it needs the url to the source repo, not the repo with the config files; this is how it can detect changes to the source code that trigger the pipeline.
You CAN have multiple services in one Copilot app, but if you want a single pipeline for the entire app, the microservices' source code needs to be in one repository.
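For illustration, a Copilot pipeline manifest (copilot/pipeline.yml) looks roughly like the sketch below; the repository URL, branch and stage names are placeholders. The key point is that the source section points at the single repository whose pushes should trigger the pipeline:

    name: pipeline-myApp
    version: 1
    source:
      provider: Bitbucket                 # or GitHub / CodeCommit
      properties:
        branch: main
        repository: https://bitbucket.org/my-org/my-app   # the repo holding the services' source code
    stages:
      - name: test
      - name: prod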
See:
https://aws.github.io/copilot-cli/docs/concepts/applications/
https://aws.github.io/copilot-cli/docs/concepts/services/
https://aws.github.io/copilot-cli/docs/concepts/pipelines/
(Cross-posted from https://github.com/aws/copilot-cli/issues/2109; we primarily use GitHub Issues for Q&As.)
Thanks so much!
I am trying to work my way toward CI/CD for the API part of the application.
I have 3 stages:
1: Source (GitHub version 2)
2: Build (currently has no commands)
3: Deploy (provider is CodeDeploy (application))
Here is a screenshot of the events in CodeDeploy: [screenshot of CodeDeploy deployment events]
While creating the deployment group, I chose the option of downloading the CodeDeploy agent (I thought it was necessary).
While setting up the CodePipeline, I chose the settings shown in the screenshot; that felt appropriate.
This CodePipeline has put an object into the S3 bucket for this pipeline.
CodeDeploy is acting on that source artifact.
Note:
We have nothing on this EC2 instance; it's just the place where we host our API.
Currently, the EC2 instance is empty.
What would be the proper way to implement this? How can I overcome the issues I am facing?
Without an appspec.yml your deployment will fail. From the docs:
An AppSpec file must be a YAML-formatted file named appspec.yml and it must be placed in the root of the directory structure of an application's source code. Otherwise, deployments fail.
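A minimal appspec.yml for an EC2/on-premises deployment looks roughly like this; the destination path and the hook script are made-up placeholders for your API:

    version: 0.0
    os: linux
    files:
      - source: /
        destination: /var/www/my-api          # where the revision is copied on the instance
    hooks:
      ApplicationStart:
        - location: scripts/start_server.sh   # script shipped inside your source bundle
          timeout: 300
          runas: root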
I am trying to make a CodePipeline that will build my branch when I make a pull request to the master branch in AWS. I have many developers working in my organisation, and all of them work on their own branch. I am not very familiar with creating Lambda functions. Hoping for a solution.
You can dynamically create pipelines every time a new pull request is created. Look at the CodeCommit triggers (in the old CodePipeline UI); you need a Lambda function for this.
Basically it works like this: copy the existing pipeline and update the source branch.
It is not the best, but AFAIK it is the only way to do what you want.
I was there and would not recommend it, for the following reasons:
I hit the limit of 20 in my region: "Maximum number of pipelines with change detection set to periodically checking for source changes" - but you definitely want this feature (https://docs.aws.amazon.com/codepipeline/latest/userguide/limits.html).
The branch-deleted trigger does not work correctly, so you cannot delete the created pipeline when the branch has been merged into master.
I would recommend using GitHub.com if you need a workflow like the one you described. Sorry about that.
I recently implemented an approach that uses CodeBuild's GitHub webhook support to run initial unit tests and a build, and then publish the source repository and built artefacts as a zipped archive to S3.
You can then use the S3 archive as a source in CodePipeline, where you can move your PR artefacts and code through integration testing, staging deployments, etc.
This is quite a powerful pattern, although one trap is that if a lot of pull requests are created at the same time, CodePipeline executions can be superseded, since only one execution can proceed through a given stage at a time (this is actually a really important property, especially if your integration tests run against shared resources and you don't want multiple instances of your application running data setup/teardown tasks simultaneously). To overcome this, I publish an S3 notification to an SQS FIFO queue when CodeBuild publishes the S3 artifact, and then poll the queue, copying each artifact to a different S3 location that triggers CodePipeline, but only if there are currently no executions waiting to execute after the first CodePipeline source stage.
We can very well have dynamic branching support with the following approach.
One of the limitations of AWS CodePipeline is that we have to specify a branch name while creating the pipeline. We can, however, overcome this issue using the architecture shown below.
flow diagram
Create a Lambda function that takes the GitHub webhook data as input and, using boto3, integrates it with the AWS pipeline (pulls the pipeline definition and updates it); put an API Gateway in front of it so the Lambda can be called as a REST endpoint, and finally create a webhook on the GitHub repository pointing at that endpoint.
External links:
https://aws.amazon.com/quickstart/architecture/git-to-s3-using-webhooks/
https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/codepipeline.html
Related thread: Dynamically change branches on AWS CodePipeline
I have a specific case which I'm not sure is possible with AWS CodePipeline, and I didn't find any information about it in the documentation or even by googling...
So I would like to know whether I can set two sources in a pipeline (it could be in the same stage or in different stages).
Here is my use case:
I would like my pipeline to start when a file (a specific object) is modified in my S3 bucket.
When this file changes and the pipeline is triggered, I would like to clone a CodeCommit repository and then run the build and other stages...
On the other hand, when there is a commit on the master branch of my CodeCommit repository, I would also like the pipeline to start and build my sources.
So the pipeline should be triggered whether the change comes from S3 or from CodeCommit.
I don't want to version the S3 file in my CodeCommit repository, because it has to stay encrypted and is used by teams other than the dev team working with the Git repository.
And any time my pipeline starts, whether from the S3 bucket change or from the CodeCommit push, I should source the commit from the repository for build purposes...
I don't know if my objectives are clear; if so, is it possible to use two source actions in a pipeline as described above, and how can I achieve this?
Thank you in advance.
Cheers,
Eugène NG
Yes, it is possible to have two sources for an AWS CodePipeline, or many for that matter. The source actions all have to be in your first stage.
Then, in your build action's properties, you need to tell it that you are expecting two sources.
Then tell the build project which one is your primary source; this is the artifact whose buildspec your CodeBuild project will execute.
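If you define the pipeline in CloudFormation, the relevant parts look roughly like the sketch below (the repository, bucket, project and artifact names are placeholders); the PrimarySource setting on the build action is what declares the primary source:

    Stages:
      - Name: Source
        Actions:
          - Name: CodeCommitSource
            ActionTypeId: { Category: Source, Owner: AWS, Provider: CodeCommit, Version: "1" }
            Configuration:
              RepositoryName: my-repo             # assumed repository
              BranchName: master
            OutputArtifacts:
              - Name: SourceOutput1
          - Name: S3Source
            ActionTypeId: { Category: Source, Owner: AWS, Provider: S3, Version: "1" }
            Configuration:
              S3Bucket: my-config-bucket          # assumed bucket (versioning enabled)
              S3ObjectKey: config/encrypted-config.zip
            OutputArtifacts:
              - Name: SourceOutput2
      - Name: Build
        Actions:
          - Name: Build
            ActionTypeId: { Category: Build, Owner: AWS, Provider: CodeBuild, Version: "1" }
            InputArtifacts:
              - Name: SourceOutput1
              - Name: SourceOutput2
            Configuration:
              ProjectName: my-build-project       # assumed CodeBuild project
              PrimarySource: SourceOutput1        # the artifact whose buildspec is used
            OutputArtifacts:
              - Name: BuildOutput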
From your buildspec or from any scripts you call, you can then access the source directories by referencing:
$CODEBUILD_SRC_DIR_SourceOutput1
$CODEBUILD_SRC_DIR_SourceOutput2
Just replace SourceOutputX above with whatever you named your output artifacts from the source stage.
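As a small usage example (the artifact name and file path are only illustrative), a buildspec could pull a file out of the secondary source directory like this:

    version: 0.2
    phases:
      build:
        commands:
          - echo "Primary source is checked out in $CODEBUILD_SRC_DIR"
          - cp "$CODEBUILD_SRC_DIR_SourceOutput2/config.enc" .   # a file that came from the second source action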
I found the following link with more information:
https://docs.aws.amazon.com/codebuild/latest/userguide/sample-multi-in-out.html
Yes, CodePipeline allows multiple source actions in a single pipeline. A change in either source will trigger a pipeline execution. The thing to know is that every pipeline execution will pull the latest source for both actions (not just the one with a change that triggered the pipeline execution).