How to integrate a Git webhook with AWS CodePipeline?

Can anyone give me some input on configuring AWS CodePipeline to trigger a zip download or a Git pull when a developer commits code to a Git branch? I'm new to AWS CodePipeline and AWS Lambda.

For CodePipeline to subscribe to a GitHub repo (using the console):
1. Create a Source Action of "GitHub", and choose your repo. Choose an Output artifact name that you will remember for the next step. Under Advanced, select "Run pipeline when changes in source content are detected".
2. Create a Build Action using the build provider of your choice. Choose the Input artifact name you chose in step #1.
When a change occurs in the GitHub repo, CodePipeline will execute the Source action, which will build a zip of the repo as is, and put it in an S3 bucket as an Output artifact. It will then pass this file's S3 name into the Build action (using the variable name given in the steps above), so that the zip file can be downloaded and built.
If you have a working buildspec.yml in the root of the repo, you can use the AWS CodeBuild provider, and the artifacts will be copied to the output bucket when done.
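As a minimal sketch, a buildspec.yml at the repo root for a Node project might look like this (the commands and the dist output directory are assumptions, not something your repo necessarily uses):

```yaml
version: 0.2

phases:
  install:
    runtime-versions:
      nodejs: 18
  build:
    commands:
      - npm ci          # install dependencies from the lockfile
      - npm run build   # assumed build script that writes to dist/

artifacts:
  base-directory: dist  # copy only the build output to the output bucket
  files:
    - '**/*'
```

CodeBuild then copies everything under dist/ to the pipeline's artifact bucket as the stage's output artifact.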

CodePipeline is designed to manage automation from source code to final deployment, which embraces the concept of continuous integration. Though how you use CodePipeline depends on your use case, CodePipeline downloads the source code for you by detecting source changes. That means what you should be thinking about is what to do after the code is pushed, such as running a build action or a test action.
It's worth following the tutorial to build a pipeline and learn how it works: Tutorial: Create a Simple Pipeline (AWS CodeCommit Repository). It may only take half an hour.
Though it uses CodeCommit, it works much the same as GitHub. For integration with GitHub, there is a blog post you may find useful: Integrating Git with AWS CodePipeline

Related

How do you deploy existing deployment artifacts through codepipeline?

Background: I am using GitHub Actions for CI and AWS CodePipeline for CD. GitHub Actions pushes a set of versioned artifacts to S3 and ECR. I set up my AWS CodePipeline using CDK.
Question: how do I get CodePipeline to pick up those artifacts and deploy them?
opt 1: Just tag your images and everything else with "latest"
answer: no, having a pipeline that always deploys the latest is not the same as a pipeline that deploys version X.
opt 2: Just send the version number (version X) to codepipeline so that codepipeline knows which artifacts to fetch
answer: no, CodePipeline seems to support passing variables between actions (actions generate output variables, other actions can pick them up), but I have found no documentation stating that a pipeline can be triggered with input parameters.
opt 3: tag your commit in github and use a webhook to pass that information along to codepipeline.
answer: no, codepipeline can filter webhooks so that you can trigger the pipeline for certain events, but it does not support parsing the webhook body, picking out stuff you want to use.
opt 4: resolve the version number in cdk synth before that pesky critter tries to update itself.
answer: yeah, that kinda works. I can query an ECR repo, find the actual version number of the release, and regenerate the pipeline so that it points to the resolved version. It's not the same as passing a version number from GitHub to CodePipeline, but at least my pipeline is versioned and all my deployment units (like ECS services, batch jobs, etc.) point to an explicit version after deployment. Unfortunately, this has several drawbacks, like making the deployment pipeline (even) slow(er), and if the pipeline fails I will have to update it by running cdk deploy from my machine.
opt 5: you come in to save the day :-)

CodeBuild extract into folder in S3 bucket

I've been trying to navigate the CodeBuild docs to find an answer without any success.
I have a pipeline that pulls from github on commit and builds and deploys to S3. However, it takes the dist directory and just replaces anything currently in the bucket.
I wish to edit my buildspec to deploy to S3 in a folder named after the commit ID. Meaning CODEBUILD_RESOLVED_SOURCE_VERSION/...
Is there a way to do this in the buildspec.yml?
I want to keep a history of old builds in case I need to revert to an older one.
Figured it out.
In the deployment stage of CodePipeline, add #{SourceVariables.CommitId} to the deployment path option.
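Alternatively, the buildspec-only variant the question describes, using the CODEBUILD_RESOLVED_SOURCE_VERSION environment variable that CodeBuild exposes, would look roughly like this (a sketch; the bucket name and dist directory are assumptions):

```yaml
version: 0.2

phases:
  build:
    commands:
      - npm run build   # assumed build step that writes to dist/
  post_build:
    commands:
      # Sync the build output into a folder named after the commit ID,
      # leaving previous builds in place (bucket name is hypothetical)
      - aws s3 sync dist "s3://my-site-bucket/${CODEBUILD_RESOLVED_SOURCE_VERSION}/"
```

Note that the CodeBuild service role needs write access to the target bucket for this to work.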

Trigger an AWS CodePipeline on every new pull request in GitHub repo

Source code in my organization is managed in a GitHub repository. For now, our CI process uses AWS CodePipeline as follows:
Webhooks detect code changes in a specific git branch
The updated branch is then used as the input for AWS CodeBuild
The finished build is deployed onto one of our staging environments using Elastic Beanstalk
Tests are run on the Elastic Beanstalk environment.
We want to add detection of new pull requests in our git repository. Whenever a new PR is created in our repo, we'd like to automatically trigger a build to an EB environment, through CodePipeline as above.
Our roadblocks:
Looking at the available settings for GitHub Webhooks in CodePipeline, we cannot find a way to specify that the pipeline's trigger should be a new PR.
In any case, the GitHub source for a CodePipeline must be a specific branch. We'd like PRs to be detected in any branch.
What would be the best approach here? I've seen some methods being discussed, but most of them appear to be on the cumbersome/high-maintenance side. If there's anything new in the AWS toolchain that makes this easy, it'd be cool to know.
Thanks!
The best approach to solving this problem seems to be creating a CodePipeline for each PR using a parameterized CloudFormation stack.
Essentially the steps are:
Define your CodePipeline using CloudFormation and have a parameter that identifies the environment - Prod, QA, PR_xyz etc.
Set up CodeBuild to trigger on any changes to your GitHub repository. When a new PR is created, have CodeBuild construct a new CodePipeline based on your CloudFormation template. Supply the name of the PR as the environment name when creating the CloudFormation stack.
Detailed steps are described here: https://moduscreate.com/blog/track-git-branches-aws-codepipeline/
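As an illustration, the parameterized template might start like this (a sketch; the parameter and pipeline names are hypothetical, and the stages are omitted):

```yaml
Parameters:
  EnvironmentName:
    Type: String
    Description: Identifies the target environment (Prod, QA, PR-123, ...)

Resources:
  Pipeline:
    Type: AWS::CodePipeline::Pipeline
    Properties:
      Name: !Sub "my-app-pipeline-${EnvironmentName}"
      # RoleArn, ArtifactStore, and Stages omitted for brevity
```

CodeBuild would then create the per-PR stack with something like aws cloudformation create-stack --stack-name my-app-pipeline-PR-123 --template-body file://pipeline.yml --parameters ParameterKey=EnvironmentName,ParameterValue=PR-123.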

CodeDeploy to S3

I have a site in an S3 bucket, configured for web access, for which I run an aws s3 sync command every time I push to a specific Git repository (I'm using GitLab at the moment).
So if I push to the stable branch, a GitLab runner runs the npm build command to build the site, then aws s3 sync to synchronize the output to a specific bucket.
I want to migrate to CodeCommit and use pure AWS tools to do the same.
So far I was able to successfully set up the repository and create a CodeBuild project for building the artifact, and the artifact is being stored (not deployed) in an S3 bucket. The difference is that I can't get it to deploy to the root folder of the bucket instead of a subfolder; it seems the process is not made for that. I need it to be in the root folder because of how the web access is configured.
For the deployment process, I was taking a look at CodeDeploy, but it doesn't actually let me deploy to an S3 bucket; it only uses the bucket as an intermediary for deployment to an EC2 instance. So far I get the feeling CodeDeploy is useful only for deployments involving EC2.
This tutorial, with a similar requirement to mine, uses CodePipeline and CodeBuild, but the deployment step is actually an aws s3 sync command (the same as I was doing on GitLab), and the actual deployment step on CodePipeline is disabled.
I was looking into a solution which involves using AWS features made for this specific purpose, but I can't find any.
I'm also aware of LambCI, but to me it looks like it does what CodePipeline / CodeBuild is doing: storing artifacts (not deploying to the root folder of the bucket). Plus, I'm looking for an option which doesn't require me to learn or deploy new configuration files (outside AWS config files).
Is this possible with the current state of AWS features?
Today AWS announced, as a new feature, the ability to target S3 in the deployment stage of CodePipeline. The announcement is here, and the documentation contains a tutorial available here.
Using your CodeBuild/CodePipeline approach, you should now be able to choose S3 as the deployment provider in the deployment stage rather than performing the sync in your build script. To configure the phase, you provide an S3 bucket name, specify whether to extract the contents of the artifact zip, and if so provide an optional path for the extraction. This should allow you to deploy your content directly to the root of a bucket by omitting the path.
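In CloudFormation, the new deploy stage looks roughly like this (a sketch; the bucket and artifact names are hypothetical):

```yaml
- Name: Deploy
  Actions:
    - Name: DeployToS3
      ActionTypeId:
        Category: Deploy
        Owner: AWS
        Provider: S3
        Version: "1"
      InputArtifacts:
        - Name: BuildOutput
      Configuration:
        BucketName: my-website-bucket  # hypothetical web-hosting bucket
        Extract: "true"                # unzip the artifact; with no ObjectKey, files land at the bucket root
```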
I was dealing with a similar issue, and as far as I was able to find out, there was no service suitable for deploying an app to S3.
AWS CodeDeploy is indeed for deploying code that runs on servers.
My solution was to use CodePipeline with three stages:
Source which takes source code from AWS CodeCommit
Build with AWS CodeBuild
A custom Lambda function which, after a successful build, takes the artifact from the S3 artifact store, unzips it, and copies the files to my S3 website host.
I used this AWS lambda function from SeamusJ https://github.com/SeamusJ/deploy-build-to-s3
Several changes had to be made; I used node-unzip-2 instead of unzip-stream for unzipping the artifact from S3.
I also had to change the ACLs in the website.ts file.
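For context, the third stage is just a standard Lambda invoke action; in CloudFormation it looks roughly like this (a sketch; the function and artifact names are hypothetical):

```yaml
- Name: Deploy
  Actions:
    - Name: CopyBuildToWebsiteBucket
      ActionTypeId:
        Category: Invoke
        Owner: AWS
        Provider: Lambda
        Version: "1"
      InputArtifacts:
        - Name: BuildOutput
      Configuration:
        FunctionName: deploy-build-to-s3   # the Lambda described above
        UserParameters: my-website-bucket  # hypothetical target bucket passed to the function
```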
Uploading from CodeBuild is currently the best solution available.
There are some suggestions on how to orchestrate this deployment via CodePipeline in this answer.

AWS CodePipeline adding artifacts to S3 in a less useful format than running steps individually

I've set up a CodePipeline with the end goal of having a core service reside on S3 as a private maven repo for other pipelines to rely on. When the core service is updated and pushed to AWS CodeCommit, the pipeline should run, test it, build a jar using a maven docker image, then push the resulting jar to S3 where it can be accessed by other applications as needed.
Unfortunately, while the CodeBuild service works exactly how I want it to, uploading XYZCore.jar to /release on the bucket, the automated pipeline itself does not. Instead, it uploads to a "XYZCorePipeline" folder, which contains the input and output artifacts of the build. The output artifact itself is a zip file whose name is just a random string of characters. I checked the pipeline and it's using the service correctly, but pipeline-specific builds always output there, while standalone builds of the CodeBuild service output how I'd like them, letting me take advantage of things like versioning. What's the best way for me to fix the settings so the two builds match?
Unfortunately CodePipeline does not support this use case.
As a workaround, you could upload the artifact to S3 by invoking the AWS CLI (aws s3 cp ...) from the post_build phase of your buildspec.yml.
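A minimal sketch of that workaround, assuming a Maven build and a hypothetical bucket name:

```yaml
version: 0.2

phases:
  build:
    commands:
      - mvn package   # produces target/XYZCore.jar
  post_build:
    commands:
      # Copy the jar to the release path yourself, bypassing
      # CodePipeline's zipped, randomly named output artifact
      - aws s3 cp target/XYZCore.jar s3://my-maven-repo-bucket/release/XYZCore.jar
```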