I want to set up continuous delivery for provisioning AWS resources using Bitbucket and AWS. My use case is to create a Kinesis Data Firehose delivery stream with Elasticsearch as the destination, using AWS CloudFormation templates (keeping in mind the separate dev, UAT, and prod stages). Whenever I update my Bitbucket repo, a build should run and the stack should be updated in AWS. Any help will be highly appreciated.
I have searched a lot on the internet but could not find any relevant examples that clearly describe this use case.
In short: a CloudFormation template committed to Bitbucket should provision the AWS resources in the cloud.
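Concretely, a minimal sketch of the kind of template I want to deploy per stage (all ARNs and names below are placeholders):

```yaml
# firehose.yml -- minimal sketch of the delivery stream, parameterized per stage.
AWSTemplateFormatVersion: '2010-09-09'
Parameters:
  Stage:
    Type: String
    AllowedValues: [dev, uat, prod]
Resources:
  DeliveryStream:
    Type: AWS::KinesisFirehose::DeliveryStream
    Properties:
      DeliveryStreamName: !Sub my-stream-${Stage}
      DeliveryStreamType: DirectPut
      ElasticsearchDestinationConfiguration:
        DomainARN: !Sub arn:aws:es:us-east-1:111111111111:domain/my-domain-${Stage}
        IndexName: events
        IndexRotationPeriod: OneDay
        RoleARN: arn:aws:iam::111111111111:role/firehose-delivery-role
        S3BackupMode: FailedDocumentsOnly
        S3Configuration:
          BucketARN: arn:aws:s3:::my-firehose-backup-bucket
          RoleARN: arn:aws:iam::111111111111:role/firehose-delivery-role
```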
You can use AWS CodePipeline for this purpose. The only issue is that CodePipeline does not work directly with Bitbucket; it only works with AWS CodeCommit or GitHub as the triggering repo.
There is a workaround for that, though: set up syncing from Bitbucket to GitHub, and then set the GitHub repository as the source repository for the pipeline you create in AWS CodePipeline. You can find many guides for syncing, so I won't explain it here (one option is sketched below). The pipeline itself can be defined as a CloudFormation template.
I have already explained setting up AWS CodePipeline in another answer here, which you can follow for this purpose. Hope this helps!
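For the syncing piece, one lightweight option (a sketch, not the only way) is to have Bitbucket Pipelines push every commit on to the GitHub mirror that CodePipeline watches. The GitHub URL and the GH_TOKEN secured variable below are placeholders you'd configure yourself:

```yaml
# bitbucket-pipelines.yml -- mirror each commit to GitHub. GH_TOKEN is a
# secured repository variable holding a GitHub personal access token.
pipelines:
  default:
    - step:
        name: Mirror to GitHub
        script:
          # $BITBUCKET_BRANCH is provided by Bitbucket Pipelines.
          - git push "https://${GH_TOKEN}@github.com/my-org/my-repo.git" "HEAD:${BITBUCKET_BRANCH}"
```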
Related
Since CodePipeline does not natively support git tag-based triggers, what's the best way to control which commit gets deployed using CodePipeline/CodeBuild when we do not want to deploy the head of the branch?
I noticed an article, "Customizing triggers for AWS CodePipeline with AWS Lambda and Amazon CloudWatch Events". Hopefully, it can help you.
However, the solution in the article may be a little complex.
If your source code is stored in a GitHub or Bitbucket repository, you can instead try creating a custom webhook with whatever filters you like; please refer to this link for details about creating webhooks.
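For example, a CloudFormation-defined webhook can filter on `$.ref` so that the pipeline's source action only starts for a pushed tag rather than the branch head. A sketch, with pipeline, action, and secret names as placeholders; note that MatchEquals is an exact match against the selected value, not a pattern:

```yaml
# A CodePipeline webhook that starts the pipeline only when a specific
# release tag is pushed. TargetPipeline/TargetAction must name an existing
# pipeline and its source action.
TagWebhook:
  Type: AWS::CodePipeline::Webhook
  Properties:
    Name: release-tag-webhook
    Authentication: GITHUB_HMAC
    AuthenticationConfiguration:
      SecretToken: '{{resolve:secretsmanager:github-webhook-secret}}'
    Filters:
      - JsonPath: $.ref
        MatchEquals: refs/tags/release    # exact match against the pushed ref
    RegisterWithThirdParty: true
    TargetPipeline: my-pipeline
    TargetAction: Source
    TargetPipelineVersion: 1
```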
Source code in my organization is managed in a GitHub repository. For now, our CI process uses AWS CodePipeline as follows:
Webhooks detect code changes in a specific git branch
The updated branch is then used as the input for AWS CodeBuild
The finished build is deployed onto one of our staging environments using Elastic Beanstalk
Tests are run on the Elastic Beanstalk environment.
We want to add detection of new pull requests in our git repository. Whenever a new PR is created in our repo, we'd like to automatically trigger a build and deployment to an EB environment through CodePipeline, as above.
Our roadblocks:
Looking at the available settings for GitHub Webhooks in CodePipeline, we cannot find a way to specify that the pipeline's trigger should be a new PR.
In any case, the GitHub source for a CodePipeline must be a specific branch. We'd like PRs to be detected in any branch.
What would be the best approach here? I've seen some methods being discussed, but most of them appear to be on the cumbersome/high-maintenance side. If there's anything new in the AWS toolchain that makes this easy, it'd be cool to know.
Thanks!
The best approach to solving this problem seems to be creating a CodePipeline for each PR using a parameterized CloudFormation stack.
Essentially the steps are:
Define your CodePipeline using CloudFormation and include a parameter that identifies the environment (Prod, QA, PR_xyz, etc.).
Set up CodeBuild to trigger on any changes to your GitHub repository. When a new PR is created, have CodeBuild construct a new CodePipeline based on your CloudFormation template. Supply the name of the PR as the environment name when creating the CloudFormation stack.
Detailed steps are described here: https://moduscreate.com/blog/track-git-branches-aws-codepipeline/
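A sketch of step 2, assuming a CodeBuild project with a GitHub webhook trigger. The template and stack names are placeholders; CODEBUILD_WEBHOOK_TRIGGER is the variable CodeBuild populates with values like pr/123:

```yaml
# buildspec.yml -- on each webhook event, create or update one pipeline
# stack per pull request from the parameterized template described above.
version: 0.2
phases:
  build:
    commands:
      # Turn "pr/123" into a stack-name-safe identifier such as "pr-123".
      - ENV_NAME=$(echo "$CODEBUILD_WEBHOOK_TRIGGER" | tr '/' '-')
      - aws cloudformation deploy
          --stack-name "pipeline-$ENV_NAME"
          --template-file pipeline.yml
          --parameter-overrides EnvironmentName="$ENV_NAME"
          --capabilities CAPABILITY_IAM
```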
I want to implement CI/CD in AWS CodeCommit.
I know it's possible to manually kick-start the process once the code reaches CodeCommit. But I am using an Azure DevOps repo as my source code repo and want to automate the process.
The deployment is done using AWS SAM. I am looking for a setup where, when I push code to the Azure repo, it reaches AWS CodeCommit and the CI/CD runs without any further manual intervention.
Is there any way to do that?
Azure Repos and CodeCommit are both compliant with the git standard, which allows you to specify multiple remotes (or multiple push URLs on a single remote, via `git remote set-url --add --push origin <url>`). This is useful if you maintain a mirror or, as in your use case, need the same code in different environments.
You can read about setting up multiple remotes here (the guide is provided by GitHub, but even though you're not using GitHub, the process and commands are the same).
Once you have your multiple remotes set up, you can configure your CI/CD pipeline to kick off and deploy your SAM template based on your push: when you push your code changes, they will be sent to both your Azure repo and your CodeCommit repo, and the CI/CD pipeline monitoring your CodeCommit repo will see the change and kick off its execution.
It's worth pointing out that you'll need to properly set up and configure the CI/CD pipeline itself. AWS provides a number of services to support this, including AWS CodePipeline, AWS CodeBuild, and AWS CodeDeploy.
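If you'd rather not rely on every developer configuring a second remote locally, the same push-to-both effect can be automated server-side. The sketch below uses an Azure Pipelines step to mirror each push to CodeCommit; the repo URL, region, and secret variable names are placeholders, and the credentials are CodeCommit HTTPS Git credentials (which may need URL-encoding):

```yaml
# azure-pipelines.yml -- mirror every push on main to the CodeCommit repo
# that the AWS CI/CD pipeline watches.
trigger:
  branches:
    include:
      - main

steps:
  - checkout: self
    fetchDepth: 0        # full history, so the mirror push is not shallow

  - script: |
      git push \
        "https://${CC_USER}:${CC_PASS}@git-codecommit.us-east-1.amazonaws.com/v1/repos/my-repo" \
        HEAD:main
    displayName: Mirror to AWS CodeCommit
    env:
      # Secret pipeline variables must be mapped into the environment explicitly.
      CC_USER: $(codecommitUser)
      CC_PASS: $(codecommitPassword)
```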
Is there any way to automate CloudFormation template deployment? It would be great if I could just push changes to the code and have something watch for those changes and deploy the updated template.
Yes, with AWS CodePipeline!
http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/continuous-delivery-codepipeline.html
With AWS CloudFormation and AWS CodePipeline, you can use continuous delivery to automatically build and test changes to your AWS CloudFormation templates [...]
AWS CodePipeline has built-in integration with AWS CloudFormation, so you can specify AWS CloudFormation-specific actions, such as creating, updating, or deleting a stack, within a pipeline.
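As a sketch, the deploy stage of such a pipeline (inside an AWS::CodePipeline::Pipeline definition) can invoke CloudFormation directly; the stack, artifact, and role names below are placeholders:

```yaml
# Deploy stage of a pipeline: create or update a stack from the template
# that arrived in the source artifact.
- Name: Deploy
  Actions:
    - Name: UpdateStack
      ActionTypeId:
        Category: Deploy
        Owner: AWS
        Provider: CloudFormation
        Version: '1'
      Configuration:
        ActionMode: CREATE_UPDATE
        StackName: my-app-stack
        TemplatePath: SourceOutput::template.yml
        Capabilities: CAPABILITY_IAM
        RoleArn: arn:aws:iam::111111111111:role/cfn-deploy-role
      InputArtifacts:
        - Name: SourceOutput
      RunOrder: 1
```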
CodePipeline is a great way to do what you're looking for.
At Giftbit we do this by having a GitHub repo that contains our CloudFormation template.
When we want to make a change, we make the changes on a branch in the repo and create a pull request into the staging branch. CodePipeline monitors the staging branch, then runs a CodeBuild job to validate the templates and package any sub-stacks, and finally creates a change set and executes it.
Below are some examples to help anyone interested get a quick start:
Continuous Integration CloudFormation Template Example
Serverless Application Model (SAM) that gets deployed
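The validate-and-package CodeBuild step described above might look roughly like this (a sketch; bucket and file names are placeholders):

```yaml
# buildspec.yml -- validate the template, then package sub-stack/local
# references into S3 so the pipeline can create a change set from the output.
version: 0.2
phases:
  build:
    commands:
      - aws cloudformation validate-template --template-body file://template.yml
      - aws cloudformation package
          --template-file template.yml
          --s3-bucket my-packaging-bucket
          --output-template-file packaged.yml
artifacts:
  files:
    - packaged.yml
```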
I have a site in an S3 bucket, configured for web access, for which I run an aws s3 sync command every time I push to a specific git repository (I'm using GitLab at the moment).
So if I push to the stable branch, a GitLab runner performs the npm start build command for building the site, and then aws s3 sync to synchronize it to a specific bucket.
I want to migrate to CodeCommit and use pure AWS tools to do the same.
So far I have been able to successfully set up the repository and create a CodeBuild project for building the artifact, and the artifact is being stored (not deployed) in an S3 bucket. The difference is that I can't get it to deploy to the root folder of the bucket instead of a subfolder; the process doesn't seem to be designed for that. I need it to be in the root folder because of how the web access is configured.
For the deployment process, I was looking at CodeDeploy, but it doesn't actually let me deploy to an S3 bucket; it only uses the bucket as an intermediary for deployment to an EC2 instance. So far I get the feeling CodeDeploy is only useful for deployments involving EC2.
This tutorial, which has a similar requirement to mine, uses CodePipeline and CodeBuild, but the deployment step is actually an aws s3 sync command (the same as I was doing on GitLab), and the actual deployment step in CodePipeline is disabled.
I was looking into a solution which involves using AWS features made for this specific purpose, but I can't find any.
I'm also aware of LambCI, but it looks like it does what CodePipeline/CodeBuild is already doing: storing artifacts (not deploying to the root folder of the bucket). Plus, I'm looking for an option that doesn't require me to learn or deploy new configuration files (outside of AWS config files).
Is this possible with the current state of AWS features?
Today AWS announced, as a new feature, the ability to target S3 in the deployment stage of CodePipeline. The announcement is here, and the documentation contains a tutorial, available here.
Using your CodeBuild/CodePipeline approach, you should now be able to choose S3 as the deployment provider in the deployment stage rather than performing the sync in your build script. To configure the action, you provide an S3 bucket name, specify whether to extract the contents of the artifact zip, and, if so, optionally provide a path for the extraction. Omitting the path allows you to deploy your content directly to the root of the bucket.
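In a pipeline definition, that deploy action looks roughly like this (bucket and artifact names are placeholders); with Extract set to true and no ObjectKey, the zip's contents land at the bucket root:

```yaml
# S3 deploy action: unzip the build artifact straight into the bucket root.
- Name: DeployToS3
  ActionTypeId:
    Category: Deploy
    Owner: AWS
    Provider: S3
    Version: '1'
  Configuration:
    BucketName: my-site-bucket
    Extract: 'true'
  InputArtifacts:
    - Name: BuildOutput
```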
I was dealing with a similar issue, and as far as I could find out, there is no service that is suitable for deploying a site to S3.
AWS CodeDeploy is indeed for deploying code that runs as a server.
My solution was to use CodePipeline with three stages:
Source, which takes the source code from AWS CodeCommit
Build, with AWS CodeBuild
A custom Lambda function which, after a successful build, takes the artifact from the S3 artifact store, unzips it, and copies the files to my S3 website host.
I used this AWS Lambda function from SeamusJ: https://github.com/SeamusJ/deploy-build-to-s3
Several changes had to be made; I used node-unzip-2 instead of unzip-stream for unzipping the artifact from S3.
I also had to change the ACLs in the website.ts file.
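Wired into the pipeline definition, that third stage is a Lambda invoke action along these lines (a sketch; the stage, function, and parameter names are placeholders):

```yaml
# Invoke stage: pass the build artifact to the Lambda, which unzips it and
# copies the files to the website bucket.
- Name: DeployToWebsite
  Actions:
    - Name: CopyToWebsiteBucket
      ActionTypeId:
        Category: Invoke
        Owner: AWS
        Provider: Lambda
        Version: '1'
      Configuration:
        FunctionName: deploy-build-to-s3
        UserParameters: my-website-bucket
      InputArtifacts:
        - Name: BuildOutput
```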
Uploading from CodeBuild is currently the best solution available.
There are some suggestions on how to orchestrate this deployment via CodePipeline in this answer.
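For reference, the CodeBuild upload can be as simple as a sync in the buildspec (a sketch; the bucket and build directory are placeholders, and the CodeBuild service role needs write access to the bucket):

```yaml
# buildspec.yml -- build the site and sync it to the bucket root.
version: 0.2
phases:
  build:
    commands:
      - npm ci
      - npm run build
      - aws s3 sync ./build "s3://my-site-bucket" --delete
```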