In AWS CDK, I would like to define a CodeBuild project that runs every time a pull request is opened or updated in CodeCommit, so that my tests and build are validated before merging to the main branch.
How do I have this CodeBuild project run for the branch that is associated with the pull request?
Below is my code:
import { Repository } from 'aws-cdk-lib/aws-codecommit';
import { BuildSpec, Project, Source } from 'aws-cdk-lib/aws-codebuild';
import * as targets from 'aws-cdk-lib/aws-events-targets';
const repo = Repository.fromRepositoryName(this, 'MyRepo', 'my-repo');
const project = new Project(this, 'MyCodeBuildProject', {
source: Source.codeCommit({ repository: repo }),
buildSpec: BuildSpec.fromObject({
version: '0.2',
phases: {
build: {
commands: [ 'npm run build' ],
},
},
}),
});
const myRule = repo.onPullRequestStateChange('MyRule', {
target: new targets.CodeBuildProject(project),
});
I have tried providing it to the project source this way:
import { ReferenceEvent } from 'aws-cdk-lib/aws-codecommit';
...
source: Source.codeCommit({ repository: repo, branchOrRef: ReferenceEvent.name }),
But I receive this error:
reference not found for primary source and source version $.detail.referenceName
CodeCommit -> Open pull request -> CloudWatch event (EventBridge) -> CodeBuild
AWS CDK v2.5.0
TypeScript
I was able to solve this by extracting the branch from the event and then passing that to the target.
import { EventField, RuleTargetInput } from 'aws-cdk-lib/aws-events';
const myRule = repo.onPullRequestStateChange('MyRule', {
target: new targets.CodeBuildProject(project, {
event: RuleTargetInput.fromObject({
sourceVersion: EventField.fromPath('$.detail.sourceReference'),
}),
}),
});
This works because targets.CodeBuildProject() calls the CodeBuild StartBuild API. The event key in CodeBuildProjectProps specifies the payload sent to StartBuild. By default the entire event is sent, but that is not the format CodeBuild expects. The StartBuild payload allows a branch, commit, or tag to be specified using sourceVersion. We can extract data from the event using EventField: events from CodeCommit have the reference nested under detail.sourceReference, which will be something like 'refs/heads/my-branch'. Using EventField.fromPath(), the $. syntax refers to the event that triggered the rule, followed by a dot-notation JSON path to the data we need.
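If you also need the branch inside the build itself, the StartBuild payload accepts environmentVariablesOverride too. A minimal sketch (PR_SOURCE_REF is a made-up variable name, not part of the solution above):

import { EventField, RuleTargetInput } from 'aws-cdk-lib/aws-events';
import * as targets from 'aws-cdk-lib/aws-events-targets';

const rule = repo.onPullRequestStateChange('MyRule', {
  target: new targets.CodeBuildProject(project, {
    event: RuleTargetInput.fromObject({
      // Build the pull request's source branch.
      sourceVersion: EventField.fromPath('$.detail.sourceReference'),
      // Also expose the ref to the build commands as $PR_SOURCE_REF.
      environmentVariablesOverride: [{
        name: 'PR_SOURCE_REF',
        type: 'PLAINTEXT',
        value: EventField.fromPath('$.detail.sourceReference'),
      }],
    }),
  }),
});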
For simplicity, suppose we have a stack that contains a single lambda function created as a Docker image:
import { Stack, StackProps, Duration } from 'aws-cdk-lib';
import { Construct } from 'constructs';
import * as lambda from 'aws-cdk-lib/aws-lambda';
export class FunStack extends Stack {
constructor(scope: Construct, id: string, props?: StackProps) {
super(scope, id, props);
const exampleFun = new lambda.DockerImageFunction(this, "ExampleFun", {
code: lambda.DockerImageCode.fromImageAsset("lambda/example_fun"),
timeout: Duration.seconds(10)
});
}
}
I'm omitting the contents of lambda/example_fun because it is straightforward: it contains a single .py file with a dummy handler and a Dockerfile that uses, say, public.ecr.aws/lambda/python:3.9 as the base image and sets the handler as CMD.
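For concreteness, a minimal sketch of such a Dockerfile (the handler file and function names are assumptions):

FROM public.ecr.aws/lambda/python:3.9
# Copy the dummy handler into the image (file name is an assumption).
COPY handler.py ${LAMBDA_TASK_ROOT}
# Point Lambda at the handler function (module.function).
CMD ["handler.lambda_handler"]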
Now, especially if there were many such lambdas, or they were large, a CDK pipeline such as the one constructed in the AWS CDK Workshop won't cache any of them. Concretely, suppose we have:
import * as cdk from 'aws-cdk-lib';
import * as codecommit from 'aws-cdk-lib/aws-codecommit';
import { Construct } from 'constructs';
import {CodeBuildStep, CodePipeline, CodePipelineSource} from "aws-cdk-lib/pipelines";
import { FunStack } from "./fun-stack";
import { Stage, StageProps } from "aws-cdk-lib";
export class FunPipelineStage extends Stage {
constructor(scope: Construct, id: string, props?: StageProps) {
super(scope, id, props);
new FunStack(this, 'Fun');
}
}
export class FunPipelineStack extends cdk.Stack {
constructor(scope: Construct, id: string, props?: cdk.StackProps) {
super(scope, id, props);
const repo = new codecommit.Repository(this, 'FunRepo', {
repositoryName: "FunRepo"
});
const pipeline = new CodePipeline(this, 'Pipeline', {
pipelineName: 'FunLambdaPipeline',
synth: new CodeBuildStep('SynthStep', {
input: CodePipelineSource.codeCommit(repo, 'master'),
installCommands: [
'npm install -g aws-cdk'
],
commands: [
'npm ci',
'npm run build',
'npx cdk synth'
]
})
});
const deploy = new FunPipelineStage(this, 'Deploy');
const deployStage = pipeline.addStage(deploy);
}
}
How should the pipeline be modified to allow us to cache the DockerImageFunction the pipeline generates when deploying?
If I'm reading the documentation for Build caching in AWS CodeBuild correctly, and deducing correctly from the CDK docs for BuildSpec, I think I should use codebuild.BuildSpec.fromObject to specify the buildspec file.
With some experimentation, I'm able to run simple install and/or build commands via fromObject and the buildspec file, but I can't quite figure out how to cache. In particular, how can the pipeline refer to the Docker image being built as part of the stack? The goal is that on each build, if the Docker images haven't changed, they are read from the cache instead of being rebuilt.
Perhaps another alternative is to set up an ECR repository and, on each build, check whether the hash of the built container already exists there; if not, build and push. However, I don't know how to do this concretely, as I can't see how to refer to the Docker images being built, if that makes sense.
Provide a partial buildspec with the partialBuildSpec prop and specify the caching method using the cache prop, as shown in the module overview:
import * as codebuild from 'aws-cdk-lib/aws-codebuild';
import * as s3 from 'aws-cdk-lib/aws-s3';
...
synth: new CodeBuildStep('SynthStep', {
input: CodePipelineSource.codeCommit(repo, 'master'),
installCommands: [
'npm install -g aws-cdk'
],
commands: [
'npm ci',
'npm run build',
'npx cdk synth'
],
partialBuildSpec: codebuild.BuildSpec.fromObject({
cache: {
paths: [ "path/to/cache/**/*" ]
}
}),
cache: codebuild.Cache.bucket(new s3.Bucket(this, 'Cache')),
})
Other than that, your premise is faulty: the containers will be built in any case; you can only make the build faster. It's impossible to know whether the container hash changed without building it first. If you want to cache Docker layers to speed builds up, you can try including /var/lib/docker/overlay2/**/* in the cache.
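For illustration, a sketch of the synth step with that path cached; the commented-out line is an alternative I'm adding here (not from the answer above): CodeBuild's built-in local Docker layer cache, which the CDK exposes as Cache.local:

import * as codebuild from 'aws-cdk-lib/aws-codebuild';
import * as s3 from 'aws-cdk-lib/aws-s3';

synth: new CodeBuildStep('SynthStep', {
  // ... input, installCommands, commands as in the question ...
  partialBuildSpec: codebuild.BuildSpec.fromObject({
    cache: {
      // Try persisting Docker's layer storage between builds.
      paths: ['/var/lib/docker/overlay2/**/*'],
    },
  }),
  cache: codebuild.Cache.bucket(new s3.Bucket(this, 'Cache')),
  // Alternative: local Docker layer caching, no S3 bucket needed:
  // cache: codebuild.Cache.local(codebuild.LocalCacheMode.DOCKER_LAYER),
}),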
Possibly relevant:
https://github.com/aws/aws-cdk/issues/19157
https://github.com/aws/aws-cdk/issues/9080
I'm trying to follow tutorials and use AWS's CDK CLI to deploy a stack using AppSync, and some resource creations fail with errors like the following under the stack's events in the CloudFormation console:
Invalid principal in policy: "SERVICE":"appsync" (Service: AmazonIdentityManagement; Status Code: 400; Error Code: MalformedPolicyDocument; Request ID: 8d98f07c-d717-4dfe-af96-14f2d72d993f; Proxy: null)
I suspect that when cleaning things up on my personal developer account I deleted something I shouldn't have, but due to limited AWS experience I don't know what to recreate. I suspect it's an IAM policy, but I don't know the exact settings to use.
I'm trying on a new clean project created using cdk init sample-app --language=typescript. Running cdk deploy immediately after the above command works fine.
I originally tried using cdk-appsync-transformer to create GraphQL endpoints to a DynamoDB table and encountered the error.
I tried re-running cdk bootstrap after deleting the CDKToolkit CloudFormation stack, but it doesn't fix the problem.
To rule out that it's due to something with the 3rd party library, I tried using AWS's own AppSync Construct Library instead and even following the example there I encountered the same error (although on creation of different resource types).
Reproduction steps
Create a new folder.
In the new folder run cdk init sample-app --language=typescript.
Install the AWS AppSync Construct Library: npm i @aws-cdk/aws-appsync-alpha@2.58.1-alpha.0 --save.
As per AWS's docs:
Create lib/schema.graphql with the following:
type demo {
id: String!
version: String!
}
type Query {
getDemos: [ demo! ]
}
input DemoInput {
version: String!
}
type Mutation {
addDemo(input: DemoInput!): demo
}
Update the lib/<projectName>-stack.ts file to be essentially like the following:
import * as appsync from '@aws-cdk/aws-appsync-alpha';
import { Duration, Stack, StackProps } from 'aws-cdk-lib';
import * as dynamodb from 'aws-cdk-lib/aws-dynamodb';
import * as sns from 'aws-cdk-lib/aws-sns';
import * as subs from 'aws-cdk-lib/aws-sns-subscriptions';
import * as sqs from 'aws-cdk-lib/aws-sqs';
import { Construct } from 'constructs';
import * as path from 'path';
export class CdkTest3Stack extends Stack {
constructor(scope: Construct, id: string, props?: StackProps) {
super(scope, id, props);
const queue = new sqs.Queue(this, 'CdkTest3Queue', {
visibilityTimeout: Duration.seconds(300)
});
const topic = new sns.Topic(this, 'CdkTest3Topic');
topic.addSubscription(new subs.SqsSubscription(queue));
const api = new appsync.GraphqlApi(this, 'Api', {
name: 'demo',
schema: appsync.SchemaFile.fromAsset(path.join(__dirname, 'schema.graphql')),
authorizationConfig: {
defaultAuthorization: {
authorizationType: appsync.AuthorizationType.IAM,
},
},
xrayEnabled: true,
});
const demoTable = new dynamodb.Table(this, 'DemoTable', {
partitionKey: {
name: 'id',
type: dynamodb.AttributeType.STRING,
},
});
const demoDS = api.addDynamoDbDataSource('demoDataSource', demoTable);
// Resolver for the Query "getDemos" that scans the DynamoDb table and returns the entire list.
// Resolver Mapping Template Reference:
// https://docs.aws.amazon.com/appsync/latest/devguide/resolver-mapping-template-reference-dynamodb.html
demoDS.createResolver('QueryGetDemosResolver', {
typeName: 'Query',
fieldName: 'getDemos',
requestMappingTemplate: appsync.MappingTemplate.dynamoDbScanTable(),
responseMappingTemplate: appsync.MappingTemplate.dynamoDbResultList(),
});
// Resolver for the Mutation "addDemo" that puts the item into the DynamoDb table.
demoDS.createResolver('MutationAddDemoResolver', {
typeName: 'Mutation',
fieldName: 'addDemo',
requestMappingTemplate: appsync.MappingTemplate.dynamoDbPutItem(
appsync.PrimaryKey.partition('id').auto(),
appsync.Values.projecting('input'),
),
responseMappingTemplate: appsync.MappingTemplate.dynamoDbResultItem(),
});
// To enable DynamoDB read consistency with the `MappingTemplate`:
demoDS.createResolver('QueryGetDemosConsistentResolver', {
typeName: 'Query',
fieldName: 'getDemosConsistent',
requestMappingTemplate: appsync.MappingTemplate.dynamoDbScanTable(true),
responseMappingTemplate: appsync.MappingTemplate.dynamoDbResultList(),
});
}
}
Run cdk deploy.
I am trying to set up a brand new pipeline with the latest version of AWS CDK for TypeScript (1.128).
The creation of the pipeline is pretty straightforward. I have added source and build stages with no issues. The objective here is to have automatic deployment of a static landing page.
So far I have this piece of code:
const landingPageStep = new ShellStep(`${PREFIX}LandingPageCodeBuildStep`, {
input: CodePipelineSource.connection(`${GIT_ORG}/vicinialandingpage`, GIT_MAIN, {
connectionArn: GIT_CONNECTION_ARN, // Created using the AWS console
}),
installCommands: [
'npm ci',
],
commands: [
'npm run build',
],
primaryOutputDirectory: 'out',
})
const pipeline = new CodePipeline(this, `${PREFIX}Pipeline`, {
pipelineName: `${PREFIX}Pipeline`,
synth: new ShellStep(`${PREFIX}Synth`, {
input: CodePipelineSource.connection(`${GIT_ORG}/viciniacdk`, GIT_MAIN, {
connectionArn: GIT_CONNECTION_ARN, // Created using the AWS console
}),
commands: [
'npm ci',
'npm run build',
'npx cdk synth',
],
additionalInputs: {
'landing_page': landingPageStep,
},
}),
});
The step I am not sure how to achieve is deploying to S3 using the output of "landing_page". Previous versions of Pipelines made heavy use of Artifact objects and CodePipelineActions, something similar to this, where sourceOutput is an Artifact object:
const targetBucket = new s3.Bucket(this, 'MyBucket', {});
const pipeline = new codepipeline.Pipeline(this, 'MyPipeline');
const deployAction = new codepipeline_actions.S3DeployAction({
  actionName: 'S3Deploy',
  bucket: targetBucket,
  input: sourceOutput,
});
const deployStage = pipeline.addStage({
stageName: 'Deploy',
actions: [deployAction],
});
Now it is completely different, since you have access to FileSet objects, and apparently the build steps are intended to be nested, with outputs feeding inputs as in the example above. Every output file is saved in a bucket with ugly file names, so it is not intended to be accessed directly either.
I have seen some hacky approaches that replace ShellStep with CodeBuildStep and run something like this as a post-build command in the buildspec.yml file:
aws s3 sync out s3://cicd-codebuild-static-website/
But that runs in the build stage, not in a deployment stage, where it would ideally live.
I have not seen anything insightful in the documentation so any suggestion is welcome. Thanks!
You can extend Step and implement ICodePipelineActionFactory, an interface that receives the codepipeline.IStage and lets you add whatever actions you need.
Once you have the factory step, you pass it in either the pre or post option of the addStage() method.
Something close to the following should work:
import * as codepipeline from '@aws-cdk/aws-codepipeline';
import * as codepipeline_actions from '@aws-cdk/aws-codepipeline-actions';
import * as s3 from '@aws-cdk/aws-s3';
import { CodePipelineActionFactoryResult, FileSet, ICodePipelineActionFactory, ProduceActionOptions, Step } from '@aws-cdk/pipelines';

class S3DeployStep extends Step implements ICodePipelineActionFactory {
  constructor(private readonly bucket: s3.IBucket, private readonly fileSet: FileSet) {
    super('S3DeployStep');
  }

  public produceAction(stage: codepipeline.IStage, options: ProduceActionOptions): CodePipelineActionFactoryResult {
    stage.addAction(new codepipeline_actions.S3DeployAction({
      actionName: options.actionName,
      runOrder: options.runOrder,
      bucket: this.bucket,
      // Convert the FileSet into the CodePipeline Artifact the action expects.
      input: options.artifacts.toCodePipeline(this.fileSet),
    }));
    return { runOrdersConsumed: 1 };
  }
}

// ...
pipeline.addStage(stage, { post: [new S3DeployStep(targetBucket, landingPageStep.primaryOutput!)] });
But a far simpler method would be to use BucketDeployment and do it as part of the stack deployment. It creates a custom resource that copies data to a bucket from your assets or from another bucket. It won't get its own step in the pipeline, and it creates a Lambda function under the hood, but it's much simpler to use.
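For reference, a minimal sketch of that approach inside the deployed stack (the construct IDs and the ./out asset path are assumptions based on the question):

import * as s3 from '@aws-cdk/aws-s3';
import * as s3deploy from '@aws-cdk/aws-s3-deployment';

const siteBucket = new s3.Bucket(this, 'LandingPageBucket');

new s3deploy.BucketDeployment(this, 'DeployLandingPage', {
  // Bundle the built site into an asset and copy it into the bucket on deploy.
  sources: [s3deploy.Source.asset('./out')],
  destinationBucket: siteBucket,
});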
I'm using the official AWS documentation to create a pipeline using CDK: https://docs.aws.amazon.com/cdk/latest/guide/cdk_pipeline.html#cdk_pipeline_define (with a slight variation from the docs: I used a CodeStar connection, as the code comments recommend)
This automatically creates a self-mutating pipeline, with three stages -- Source, Synth, and UpdatePipeline. That's great.
I would like to add a new stage with a CodeBuild action. I'd like the CodeBuild action to be based on the buildspec.yml file in the source directory.
On the console, I can easily do this by clicking "Add new stage", "Add action", and selecting the input artifact from the dropdown menu.
However, on CDK, with this recommended setup there's no easy way to get access to the input artifacts.
I managed to do it by forcing buildPipeline() and doing this:
import * as cdk from "@aws-cdk/core";
import {
  CodePipeline,
  ShellStep,
  CodePipelineSource,
} from "@aws-cdk/pipelines";
import * as codebuild from "@aws-cdk/aws-codebuild";
import * as codepipelineActions from "@aws-cdk/aws-codepipeline-actions";
export class PipelineStack extends cdk.Stack {
public readonly source: cdk.CfnOutput
constructor(scope: cdk.Construct, id: string, props?: cdk.StackProps) {
super(scope, id, props);
const source = CodePipelineSource.connection("someuser/somerepo", "master", {
connectionArn: "arn:aws:codestar-connections:us-east-1:REDACTED:connection/REDACTED"
});
const synthShellStep = new ShellStep("Synth", {
input: source,
commands: [
"cd infrastructure",
"npm run ci",
"npm run build",
"npx cdk synth"
],
"primaryOutputDirectory": "infrastructure/cdk.out"
});
const pipeline = new CodePipeline(this, "Pipeline", {
pipelineName: "FancyPipeline",
synth: synthShellStep
});
// Need to build the pipeline to access the
// source artifact
pipeline.buildPipeline();
const sourceStage = pipeline.pipeline.stage("Source");
if (sourceStage) {
const sourceOutputs = sourceStage.actions[0].actionProperties.outputs;
if (sourceOutputs && sourceOutputs.length > 0) {
const sourceArtifact = sourceOutputs[0];
const codeBuildProject = new codebuild.PipelineProject(this, 'DockerBuildProject', {
environment: {
privileged: true
}
});
const buildAction = new codepipelineActions.CodeBuildAction({
actionName: 'DockerBuild',
project: codeBuildProject,
input: sourceArtifact,
environmentVariables: {
AWS_DEFAULT_REGION: {
value: this.region
},
AWS_ACCOUNT_ID: {
value: this.account
},
IMAGE_REPO_NAME: {
value: "somereponame"
},
IMAGE_TAG: {
value: "latest"
}
}
});
pipeline.pipeline.addStage({
stageName: "DockerBuildStage",
actions: [buildAction],
});
}
}
}
}
But this feels overall pretty awkward, and I can't call addStage() on the CodePipeline construct anymore. Surely there's a better way to do what I'm trying to do?
Any help/advice would be appreciated. Thanks.
The codepipelineActions.CodeBuildAction constructor accepts a parameter titled outputs, the list of output Artifacts for this action (see the TypeScript source; I think the Python version of the docs is easier to follow, though).
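For example, a hedged sketch building on the code above (the artifact name is an assumption):

import * as codepipeline from "@aws-cdk/aws-codepipeline";

// Declare an output artifact so later stages can consume the build result.
const buildOutput = new codepipeline.Artifact("DockerBuildOutput");

const buildAction = new codepipelineActions.CodeBuildAction({
  actionName: "DockerBuild",
  project: codeBuildProject,
  input: sourceArtifact,
  outputs: [buildOutput],
});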
I am trying to deploy a Google Cloud Function using Google Cloud Build (including substitutions).
However, I receive the following error when it builds:
ERROR: (gcloud.functions.deploy) OperationError: code=3, message=Function failed on loading user code. Error message: Provided module can't be loaded.
Detailed stack trace: Error: Incoming webhook URL is required
My cloudbuild.yaml file contains:
steps:
# This step builds the container image.
- name: 'gcr.io/cloud-builders/gcloud'
id: Deploy
args: ['functions', 'deploy', 'subscribeSlack', '--trigger-topic', 'cloud-builds', '--runtime', 'nodejs10', '--set-env-vars', '"SLACK_WEBHOOK_URL=${_SLACK_WEBHOOK_URL}"', '--region', '${_REGION}']
env:
- 'SLACK_WEBHOOK_URL=${_SLACK_WEBHOOK_URL}'
And, I have added the substitutions in the trigger via Google Cloud Console:
[Screenshot of the build trigger with the substitutions filled in.]
The function itself is based on the example from Google's documentation:
const { IncomingWebhook } = require('@slack/webhook');
const url = process.env.SLACK_WEBHOOK_URL;
const webhook = new IncomingWebhook(url);
// subscribeSlack is the main function called by Cloud Functions.
module.exports.subscribeSlack = (pubSubEvent, context) => {
const build = eventToBuild(pubSubEvent.data);
// Skip if the current status is not in the status list.
// Add additional statuses to list if you'd like:
// QUEUED, WORKING, SUCCESS, FAILURE,
// INTERNAL_ERROR, TIMEOUT, CANCELLED
const status = ['SUCCESS', 'FAILURE', 'INTERNAL_ERROR', 'TIMEOUT'];
if (status.indexOf(build.status) === -1) {
return;
}
// Send message to Slack.
const message = createSlackMessage(build);
webhook.send(message);
};
// eventToBuild transforms pubsub event message to a build object.
const eventToBuild = (data) => {
return JSON.parse(Buffer.from(data, 'base64').toString());
}
// createSlackMessage creates a message from a build object.
const createSlackMessage = (build) => {
console.log(JSON.stringify(build));
const message = {
text: `Build \`${build.status}\``,
mrkdwn: true,
attachments: [
{
title: 'Build logs',
title_link: build.logUrl,
fields: [{
title: 'Status',
value: build.status
}]
}
]
};
if(build.source.repoSource.repoName !== undefined)
message.attachments[0].fields.push({ title: "Repo", value: build.source.repoSource.repoName });
if(build.finishTime !== undefined)
message.attachments[0].fields.push({ title: "Finished", value: (new Date(build.finishTime)).toLocaleString('en-GB', {timeZone: "Australia/Brisbane"}) });
return message;
}
The function deploys fine from the gcloud CLI; it only fails when deployed via Cloud Build.
I checked the cloudbuild.yaml and found that wrapping SLACK_WEBHOOK_URL=${_SLACK_WEBHOOK_URL} in both double and single quotes causes this issue; removing the double quotes solves it. The literal double quotes end up as part of the --set-env-vars value, so SLACK_WEBHOOK_URL is never set to a valid URL and the IncomingWebhook constructor throws 'Incoming webhook URL is required'.
Here is the modified cloudbuild.yaml:
steps:
# This step builds the container image.
- name: 'gcr.io/cloud-builders/gcloud'
id: Deploy
args: ['functions', 'deploy', 'subscribeSlack', '--trigger-topic', 'cloud-builds', '--runtime', 'nodejs10', '--set-env-vars', 'SLACK_WEBHOOK_URL=${_SLACK_WEBHOOK_URL}', '--region', '${_REGION}']
env:
- 'SLACK_WEBHOOK_URL=${_SLACK_WEBHOOK_URL}'