How to integrate Azure Repos with AWS CodeCommit

I want to implement CI/CD in AWS CodeCommit.
I know it's possible to kick off the process manually once the code reaches CodeCommit, but I am using Azure DevOps Repos as my source code repository and want to automate the process.
The deployment is done using AWS SAM. I am looking for a setup where, when I push code to the Azure repo, it reaches AWS CodeCommit and the CI/CD runs without any further manual intervention.
Is there any way to do that?

Azure Repos and CodeCommit are both standard Git repositories, and Git allows you to specify multiple remotes. This is useful if you are maintaining a mirror or, as in your use case, need to do something in different environments.
You can read about setting up multiple remotes here (the guide is provided by GitHub; even though you're not using GitHub, the process and commands are the same).
Once you have your multiple remotes set up, you can configure your CI/CD pipeline to deploy your SAM template on push: when you push your code changes they will be sent to both your Azure repo and your CodeCommit repo, and the CI/CD pipeline that is monitoring your CodeCommit repo will see the change and kick off its execution.
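As a minimal sketch of that setup (the remote name, region, and repository URLs below are placeholders; substitute your own):

    # Option 1: a separate remote you push to explicitly
    git remote add codecommit https://git-codecommit.eu-west-1.amazonaws.com/v1/repos/my-repo
    git push codecommit main

    # Option 2: make a single "git push" update both repos by adding push URLs
    # to origin. Note that the first --add replaces the default push URL, so the
    # Azure Repos URL is re-added explicitly alongside the CodeCommit one.
    git remote set-url --add --push origin https://dev.azure.com/my-org/my-project/_git/my-repo
    git remote set-url --add --push origin https://git-codecommit.eu-west-1.amazonaws.com/v1/repos/my-repo
    git push origin main    # pushes to both Azure Repos and CodeCommit

With option 2, every push reaches CodeCommit as well, so whatever is watching CodeCommit (for example a CodePipeline source stage) can start the SAM deployment automatically.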
It's worth pointing out that you'll still need to set up and configure the CI/CD pipeline itself. AWS provides a number of services to support this, including AWS CodePipeline, AWS CodeBuild, and AWS CodeDeploy.

Related

How to automatically replicate CodeCommit repo to Bitbucket on-prem

I want to push changes to CodeCommit and have a Bitbucket on-prem repo automatically updated as a result.
I've seen a few guides/answers on how to replicate from Bitbucket to CodeCommit, but is there any way to do the opposite?
You could use EventBridge to monitor repository state change events and trigger an appropriate AWS service to send the data to the on-premises server.
Some possibilities might be:
A Lambda function which connects to the Bitbucket REST API
A CodeBuild project which does not actually build anything, but clones the repository and runs a script which pushes it to Bitbucket (see the sketch after this list)
DataSync, if it is possible to copy the data directly at the file-system level (for example, over SMB)
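A rough sketch of the CodeBuild option with the AWS CLI (repository name, account ID, project name, role, and Bitbucket URL are all placeholders): an EventBridge rule matches CodeCommit repository state change events and starts a CodeBuild project whose only job is to mirror the repository.

    # Rule that fires when a branch is created or updated in the CodeCommit repo
    aws events put-rule \
      --name codecommit-to-bitbucket \
      --event-pattern '{
        "source": ["aws.codecommit"],
        "detail-type": ["CodeCommit Repository State Change"],
        "detail": { "event": ["referenceCreated", "referenceUpdated"] }
      }'

    # Target the mirroring CodeBuild project (the role must allow codebuild:StartBuild)
    aws events put-targets \
      --rule codecommit-to-bitbucket \
      --targets 'Id=mirror,Arn=arn:aws:codebuild:eu-west-1:111111111111:project/mirror-to-bitbucket,RoleArn=arn:aws:iam::111111111111:role/events-start-codebuild'

    # The build itself only needs to run something like:
    #   git clone --mirror https://git-codecommit.eu-west-1.amazonaws.com/v1/repos/my-repo repo
    #   cd repo && git push --mirror https://bitbucket.example.corp/scm/proj/my-repo.git

You would normally also add a "resources" entry with the repository ARN to the event pattern so the rule only fires for that one repo.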

Version Control and Pipeline for AWS CloudFormation

I'm trying to figure out a way to come up with a CI/CD pipeline for CloudFormation. We use the CloudFormation console directly to deploy our infrastructure and app to the cloud.
Does anyone have any examples of how they have created a CI/CD pipeline using Jenkins or other CI tools to do some type of linting, CI, version control, and artifact deployment to Artifactory (or a similar toolset)? I'd like to execute a pipeline once a new version of the CloudFormation templates is uploaded to Artifactory.
You can always use CodePipeline.
see docs:
CodePipeline
CI:
I am using GitHub, so before I can merge a pull request, my code must pass 3 tests.
Those tests are 3 CodeBuild containers that run the tests.
CD:
After my code is merged, it invokes a CodePipeline that mainly uses CodeDeploy and CodeBuild.
About your goal:
I'd like to execute a pipeline once a new version of the cloud formation templates is uploaded to Artifactory.
I don't really think you need a pipeline for this.
Let's assume your artifacts are uploaded to an S3 bucket called artifact-bucket.
You can create a CloudWatch Events rule that will execute a Step Functions state machine when a file is added to artifact-bucket (a rough CLI sketch is below).
see docs:
Trigger StepFunctions from S3
You can easily deploy a stack with Step Functions.
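A rough CLI sketch of that rule (the state machine ARN and role are placeholders; this uses the newer "Object Created" EventBridge events, which require EventBridge notifications to be enabled on the bucket, while the linked docs show the older CloudTrail-based pattern):

    # Fire whenever a new object lands in artifact-bucket
    aws events put-rule \
      --name artifact-uploaded \
      --event-pattern '{
        "source": ["aws.s3"],
        "detail-type": ["Object Created"],
        "detail": { "bucket": { "name": ["artifact-bucket"] } }
      }'

    # Start the state machine that runs the CloudFormation deployment
    aws events put-targets \
      --rule artifact-uploaded \
      --targets 'Id=deploy,Arn=arn:aws:states:eu-west-1:111111111111:stateMachine:deploy-cfn-stack,RoleArn=arn:aws:iam::111111111111:role/events-start-stepfunctions'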

Trigger an AWS CodePipeline on every new pull request in GitHub repo

Source code in my organization is managed in a GitHub repository. For now, our CI process uses AWS CodePipeline as follows:
Webhooks detect code changes in a specific git branch
The updated branch is then used as the input for AWS CodeBuild
The finished build is deployed onto one of our staging environments using Elastic Beanstalk
Tests are run on the Elastic Beanstalk environment.
We want to add detection of new pull requests in our git repository. Whenever a new PR is created in our repo, we'd like to automatically trigger a build to an EB environment, through CodePipeline as above.
Our roadblocks:
Looking at the available settings for GitHub Webhooks in CodePipeline, we cannot find a way to specify that the pipeline's trigger should be a new PR.
In any case, the GitHub source for a CodePipeline must be a specific branch. We'd like PRs to be detected in any branch.
What would be the best approach here? I've seen some methods being discussed, but most of them appear to be on the cumbersome/high-maintenance side. If there's anything new in the AWS toolchain that makes this easy, it'd be cool to know.
Thanks!
The best approach to solving this problem seems to be creating a CodePipeline for each PR using a parameterized CloudFormation stack.
Essentially the steps are:
Define your CodePipeline using CloudFormation and have a parameter that identifies the environment - Prod, QA, PR_xyz etc.
Set up CodeBuild to trigger on any changes to your GitHub repository. When a new PR is created, have CodeBuild construct a new CodePipeline based on your CloudFormation template. Supply the name of the PR as the environment name when creating the CloudFormation stack.
Detailed steps are described here: https://moduscreate.com/blog/track-git-branches-aws-codepipeline/
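For example, the CodeBuild job that reacts to a new PR could create and later tear down the per-PR pipeline stack with nothing more than the following (the template file, parameter name, and capabilities are assumptions about how your template is written):

    # Create a dedicated pipeline stack for PR 123 from the shared template
    aws cloudformation create-stack \
      --stack-name pipeline-pr-123 \
      --template-body file://pipeline.yml \
      --parameters ParameterKey=EnvironmentName,ParameterValue=PR_123 \
      --capabilities CAPABILITY_IAM

    # When the PR is closed, remove its pipeline again
    aws cloudformation delete-stack --stack-name pipeline-pr-123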

CI / CD using Bitbucket repository for resource provisioning in AWS

I want to achieve continuous delivery for provisioning AWS resources using Bitbucket and AWS. My use case is to create a Kinesis Firehose delivery stream with Elasticsearch as the destination. I want to achieve this using AWS CloudFormation templates (keeping in mind the different stages for dev, UAT, and prod). Whenever I update my Bitbucket repo, a build should be triggered and the stack should be updated in AWS. Any help will be highly appreciated.
I have searched a lot over the internet but could not find any relevant examples which clearly describe my use case.
A CloudFormation template committed to Bitbucket should provision AWS resources in the cloud.
You can use AWS CodePipeline for this purpose. The only issue is that CodePipeline does not work directly with Bitbucket; it only works with AWS CodeCommit or GitHub as the triggering repo.
But there is a workaround for that. You can set up syncing from Bitbucket to GitHub (sketched below) and then set the GitHub repository as the source repository for the pipeline you create in AWS CodePipeline. You can find many guides for syncing, so I won't explain it here. The pipeline itself can be defined as a template.
I have already explained setting up AWS CodePipeline in another answer here, which you can follow for this purpose. Hope this helps!
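The sync itself can be as simple as a scheduled or push-triggered job on the Bitbucket side that mirror-pushes to GitHub (both repository URLs are placeholders):

    # One-way mirror from Bitbucket to GitHub
    git clone --mirror https://bitbucket.org/my-team/my-repo.git repo
    cd repo
    git push --mirror https://github.com/my-org/my-repo.git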

CodeDeploy to S3

I have a site in an S3 bucket, configured for web access, for which I run an aws s3 sync command every time I push to a specific Git repository (I'm using GitLab at the moment).
So if I push to the stable branch, a GitLab runner runs the npm start build command to build the site, and then aws s3 sync to synchronize it to a specific bucket.
I want to migrate to CodeCommit and use pure AWS tools to do the same.
So far I was able to successfully set up the repository and create a CodeBuild project for building the artifact, and the artifact is being stored (not deployed) in an S3 bucket. The difference is that I can't get it to deploy to the root folder of the bucket instead of a subfolder; it seems like the process is not made for that. I need it to be in the root folder because of how the web access is configured.
For the deployment process, I was taking a look at CodeDeploy, but it doesn't actually let me deploy to an S3 bucket; it only uses the bucket as an intermediary for deployment to an EC2 instance. So far I get the feeling CodeDeploy is useful only for deployments involving EC2.
This tutorial, with a similar requirement to mine, uses CodePipeline and CodeBuild, but the deployment step is actually an aws s3 sync command (the same as I was doing on GitLab), and the actual deployment step in CodePipeline is disabled.
I was looking into a solution which involves using AWS features made for this specific purpose, but I can't find any.
I'm also aware of LambCI, but to me it looks like it does the same thing CodePipeline / CodeBuild is doing: storing artifacts (not deploying them to the root folder of the bucket). Plus, I'm looking for an option which doesn't require me to learn or deploy new configuration files (outside of AWS config files).
Is this possible with the current state of AWS features?
Today AWS announced, as a new feature, the ability to target S3 in the deployment stage of CodePipeline. The announcement is here, and the documentation contains a tutorial available here.
Using your CodeBuild/CodePipeline approach, you should now be able to choose S3 as the deployment provider in the deployment stage rather than performing the sync in your build script. To configure the action, you provide an S3 bucket name, specify whether to extract the contents of the artifact zip, and, if so, provide an optional path for the extraction. This should allow you to deploy your content directly to the root of a bucket by omitting the path.
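If you manage the pipeline definition from the CLI rather than the console, the change amounts to adding an S3 deploy action; the pipeline, bucket, and artifact names below are placeholders:

    # Pull down the current pipeline definition
    aws codepipeline get-pipeline --name my-site-pipeline --query pipeline --output json > pipeline.json

    # In the deploy stage of pipeline.json, add an action along the lines of:
    #   actionTypeId:   { category: Deploy, owner: AWS, provider: S3, version: "1" }
    #   configuration:  { BucketName: "my-website-bucket", Extract: "true" }   # omit ObjectKey to land in the bucket root
    #   inputArtifacts: [ { name: "BuildArtifact" } ]

    # Push the updated definition back
    aws codepipeline update-pipeline --pipeline file://pipeline.json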
I was dealing with a similar issue and, as far as I was able to find out, there is no service which is suitable for deploying an app to S3.
AWS CodeDeploy is indeed for deploying code that runs on servers.
My solution was to use CodePipeline with three stages:
Source which takes source code from AWS CodeCommit
Build with AWS CodeBuild
A custom Lambda function which, after a successful build, takes the artifact from the S3 artifact store, unzips it, and copies the files to my S3 website bucket.
I used this AWS Lambda function from SeamusJ: https://github.com/SeamusJ/deploy-build-to-s3
Several changes had to be made; I used node-unzip-2 instead of unzip-stream for unzipping the artifact from S3.
I also had to change the ACLs in the website.ts file.
Uploading from CodeBuild is currently the best solution available.
There are some suggestions on how to orchestrate this deployment via CodePipeline in this answer.
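If you stay with the CodeBuild approach, the deploy step is just a command at the end of the build, for example in the post_build phase of the buildspec (the bucket name and output directory are placeholders, and the CodeBuild service role needs write access to the bucket):

    # Sync the built site to the root of the website bucket, removing stale files
    aws s3 sync ./build s3://my-website-bucket --delete

Add --acl public-read only if your bucket relies on object ACLs rather than a bucket policy for public access.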