CodeBuild failed for cross-account CI/CD pipeline - amazon-web-services

I'm trying to follow this guide to set up a cross-account pipeline: https://taimos.de/blog/create-a-cicd-pipeline-for-your-cdk-app
I have Alpha account 111111, which holds the pipeline that deploys to both the Alpha account 111111 and the Prod account 222222. I ran the bootstrap command below to establish a trust relationship with the CI/CD account and allow cross-account deployments.
cdk bootstrap --profile=eCommerceService-Prod --bootstrap-customer-key --cloudformation-execution-policies 'arn:aws:iam::aws:policy/AdministratorAccess' --trust 111111 --trust-for-lookup 111111 aws://222222/us-east-1
The pipeline is set up correctly; however, the CodeBuild step failed:
User: arn:aws:sts::111111:assumed-role/eCommerceDatabaseCdk-Pipe-PipelineBuildSynthCdkBui-JE7GJT5LX3SC/AWSCodeBuild-32197845-1f33-44c4-a9ff-d33aae93448e is not authorized to perform: sts:AssumeRole on resource: arn:aws:iam::222222:role/cdk-hnb659fds-lookup-role-222222-us-east-1 . Please make sure that this role exists in the account. If it doesn't exist, (re)-bootstrap the environment with the right '--trust', using the latest version of the CDK CLI.
I checked that arn:aws:iam::222222:role/cdk-hnb659fds-lookup-role-222222-us-east-1 exists in the Prod account, so it seems to me that a permission is missing in the CodeBuild step. However, I did not find anything specific to the CodeBuild step in this blog: https://taimos.de/blog/create-a-cicd-pipeline-for-your-cdk-app.
The role arn:aws:iam::222222:role/cdk-hnb659fds-lookup-role-222222-us-east-1 mentioned in the error message has the following trust policy:
{
"Version": "2008-10-17",
"Statement": [
{
"Effect": "Allow",
"Principal": {
"AWS": "arn:aws:iam::111111:root"
},
"Action": "sts:AssumeRole"
}
]
}
Any idea what I'm missing here?
Code
Pipeline config:
export const pipelineStageInfoList: PipelineStageInfo[] = [
{
stageName: "ALPHA",
awsAccount: "111111",
awsRegion: "us-east-1"
},
{
stageName: "PROD",
awsAccount: "222222",
awsRegion: "us-east-1"
}
]
Pipeline stack:
export class PipelineStack extends Cdk.Stack {
constructor(scope: Cdk.App, id: string, props: PipelineStackProps) {
super(scope, id, props);
// Initialize the pipeline
const pipeline = new pipelines.CodePipeline(this, "Pipeline", {
pipelineName: "eCommerceDatabasePipeline",
// Create KMS keys for the artifact buckets,
// allowing cross-account deployments
crossAccountKeys: true,
// allow the pipeline to reconfigure itself when assets or stages
// are being added to it
selfMutation: true,
// synth is expected to produce the CDK Cloud Assembly as its output
synth: new pipelines.ShellStep("Synth", {
input: pipelines.CodePipelineSource.gitHub(
"pandanyc/eCommerceDatabaseCdk",
"main",
{
authentication: Cdk.SecretValue.secretsManager('github-token')
}
),
// Install dependencies, build and run cdk synth
commands: [
'npm ci',
'npm run build',
'npx cdk synth'
],
}),
});
// Add stages to this pipeline.
pipelineStageInfoList.forEach((pipelineStage: PipelineStageInfo) => {
pipeline.addStage(
new ApplicationStage(this, pipelineStage.stageName, {
stageName: pipelineStage.stageName,
pipelineName: props.pipelineName,
env: {
account: pipelineStage.awsAccount,
region: pipelineStage.awsRegion,
},
})
);
});
}
}

If you want to perform lookups in the pipeline itself, your synth step has to have the explicit permission to assume the lookup role. You would add it like this:
synth: new pipelines.CodeBuildStep("Synth", {
input: pipelines.CodePipelineSource.gitHub(
"pandanyc/eCommerceDatabaseCdk",
"main",
{
authentication: Cdk.SecretValue.secretsManager('github-token')
}
),
// Install dependencies, build and run cdk synth
commands: [
'npm ci',
'npm run build',
'npx cdk synth'
],
rolePolicyStatements: [new iam.PolicyStatement({
effect: iam.Effect.ALLOW,
actions: ['sts:AssumeRole'],
resources: ['*'],
conditions: { StringEquals: { 'iam:ResourceTag/aws-cdk:bootstrap-role': 'lookup' } }
})]
}),
It is also worth noting, however, that this is advised against in the Best Practices. Here's an excerpt, but I suggest you check out the whole thing:
AWS CDK includes a mechanism called context providers to record a
snapshot of non-deterministic values, allowing future synthesis
operations to produce exactly the same template. The only changes in the
new template are the changes you made in your code. When you use a
construct's .fromLookup() method, the result of the call is cached in
cdk.context.json, which you should commit to version control along
with the rest of your code to ensure future executions of your CDK app
use the same value.
What this means in practice is that you should run cdk synth once locally, which will perform all the necessary lookups and store the results in cdk.context.json. You should then commit that, and your pipeline will use these cached values instead of doing the lookup every time.
As a result, your synth step also won't need to assume the lookup role.
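For illustration, here is a minimal sketch of the kind of lookup that gets cached; the VPC lookup and stack name below are hypothetical, not part of the question's code:
import * as cdk from 'aws-cdk-lib';
import * as ec2 from 'aws-cdk-lib/aws-ec2';
import { Construct } from 'constructs';

export class LookupExampleStack extends cdk.Stack {
  constructor(scope: Construct, id: string, props?: cdk.StackProps) {
    // Lookups only work when the stack has an explicit environment,
    // e.g. env: { account: '222222', region: 'us-east-1' }.
    super(scope, id, props);

    // Running `cdk synth` locally resolves this lookup (using the lookup
    // role in the target account) and caches the result in cdk.context.json.
    // With that file committed, the pipeline's synth step reuses the cached
    // value and never needs to assume the cross-account lookup role.
    const vpc = ec2.Vpc.fromLookup(this, 'DefaultVpc', { isDefault: true });
  }
}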

Related

Update config.json file for a static frontend React site, using AWS CodePipeline and CDK

I'm building a CDK Pipeline that will update another CDK template.
This CDK template is a static frontend react app.
The backend uses an AWS Lambda, API Gateway, and CloudFront Distribution to host the site.
I want to put the APIs in the config.json file as I normally would if I were building it manually, one service at a time.
The problem seems to be in the cdk pipeline-stack, which builds the static-frontend-stack.
When you initialize a new pipeline, it wants you to add shell steps first (npm i, cd into the correct folder, npm run build, etc.), which creates the distribution folder I need, as well as turning the whole thing into a CF template.
Then you can drop that into different stages you want, e.g., test and prod.
However, I won't receive CfnOutputs until the stages are built. And the CfnOutputs hold the APIs and other info I need to put into the config.json file (which was already built first, with empty values).
There is even an envFromCfnOutputs param to add to the initial CodeBuild step, but since the stages are initialized/created later, TypeScript yells at me for referencing them there. I understand why that errors, but I can't figure out a clever way to fix this issue.
import * as cdk from "aws-cdk-lib";
import { Construct } from "constructs";
import * as pipelines from "aws-cdk-lib/pipelines";
import * as codecommit from "aws-cdk-lib/aws-codecommit";
import { Stages } from "./stages";
import { Stack, Stage } from "aws-cdk-lib";
interface PipelineStackProps extends cdk.StackProps {
env: {
account: string;
region: string;
stage: string;
};
}
export class PipelineStack extends cdk.Stack {
constructor(scope: Construct, id: string, props: PipelineStackProps) {
super(scope, id, props);
/************ Grab Repo ************/
const source = codecommit.Repository.fromRepositoryName(
this,
"PreCallbackSMSSolution",
"PreCallbackSMSSolution"
);
/************ Define Pipeline & Build ShellStep (for Frontend) ************/
const Pipeline = new pipelines.CodePipeline(this, "Pipeline", {
pipelineName: `CodePipeline`,
selfMutation: true,
crossAccountKeys: true,
synthCodeBuildDefaults: {
rolePolicy: [
// #desc Policy to allow CodeBuild to use CodeArtifact
// #external https://docs.aws.amazon.com/codeartifact/latest/ug/using-npm-packages-in-codebuild.html
new cdk.aws_iam.PolicyStatement({
actions: [
"codeartifact:GetAuthorizationToken",
"codeartifact:GetRepositoryEndpoint",
"codeartifact:ReadFromRepository",
],
resources: ["*"],
}),
new cdk.aws_iam.PolicyStatement({
actions: ["sts:GetServiceBearerToken"],
resources: ["*"],
conditions: {
StringEquals: {
"sts:AWSServiceName": "codeartifact.amazonaws.com",
},
},
}),
],
},
synth: new pipelines.ShellStep("Synth", {
input: pipelines.CodePipelineSource.codeCommit(source, "master"),
installCommands: [
"cd $CODEBUILD_SRC_DIR/deployment",
"npm install -g typescript",
"npm run co:login",
"npm i",
],
env: {
stage: props.env.stage,
},
envFromCfnOutputs: {
// TODO: cfn outputs need to go here!
// CcpUrlOutput: TestStage.CcpUrlOutput,
// loginUrlOutput: TestStage.LoginUrlOutput,
// regionOutput: TestStage.RegionOutput,
// apiOutput: TestStage.ApiOutput
},
commands: [
"cd $CODEBUILD_SRC_DIR/frontend",
"pwd",
"apt-get install jq -y",
"chmod +x ./generate-config.sh",
"npm i",
"npm run build-prod",
"pwd",
"cat ./src/config-prod.json",
"cd ../deployment",
"npx cdk synth",
],
primaryOutputDirectory: "$CODEBUILD_SRC_DIR/deployment/cdk.out", // $CODEBUILD_SRC_DIR = starts root path
}),
});
/************ Initialize Test Stack & Add Stage************/
const TestStage = new Stages(this, "TestStage", {
env: { account: "***********", region: "us-east-1", stage: "test" },
}); // Aspen Sandbox
Pipeline.addStage(TestStage);
/************ Initialize Prod Stack & Add Stage ************/
const ProdStage = new Stages(this, "ProdStage", {
env: { account: "***********", region: "us-east-1", stage: "prod" },
}); // Aspen Sandbox
Pipeline.addStage(ProdStage);
/************ Build Pipeline ************/
Pipeline.buildPipeline();
/************ Manual Approve Stage ************/
const ApproveStage = Pipeline.pipeline.addStage({
stageName: "PromoteToProd",
placement: {
justAfter: Pipeline.pipeline.stage("TestStage"),
},
});
ApproveStage.addAction(
new cdk.aws_codepipeline_actions.ManualApprovalAction({
actionName: "Approve",
additionalInformation: "Approve this deployment for production.",
})
);
}
/****/
}

CodeBuild: Artifacts upload location doesn't match

Here is my CodeBuild main page, which says "Artifacts upload location" is "alpha-artifact-bucket":
Here is one of the build runs, which is not using the above bucket:
What's the difference between the two? Why does every build run use a random bucket?
Is there any way to force CodeBuild to use the specified S3 bucket "alpha-artifact-bucket"?
CDK code
CodeBuild stack: I deploy this stack to each AWS account along the pipeline first, so that the pipeline stack can just query each account, find its corresponding CodeBuild project, and add it as a "stage". The reason I'm doing this is that each AWS account has a dedicated CodeBuild stage which needs to read some values from its Secrets Manager.
export interface CodeBuildStackProps extends Cdk.StackProps {
readonly pipelineName: string;
readonly pipelineRole: IAM.IRole;
readonly pipelineStageInfo: PipelineStageInfo;
}
/**
* This stack will create CodeBuild for the target AWS account.
*/
export class CodeBuildStack extends Cdk.Stack {
constructor(scope: Construct, id: string, props: CodeBuildStackProps) {
super(scope, id, props);
// DeploymentRole will be assumed by PipelineRole to perform the CodeBuild step.
const deploymentRoleArn: string = `arn:aws:iam::${props.env?.account}:role/${props.pipelineName}-DeploymentRole`;
const deploymentRole = IAM.Role.fromRoleArn(
this,
`CodeBuild${props.pipelineStageInfo.stageName}DeploymentRoleConstructID`,
deploymentRoleArn,
{
mutable: false,
// Causes CDK to update the resource policy where required, instead of the Role
addGrantsToResources: true,
}
);
const buildspecFile = FS.readFileSync("./config/buildspec.yml", "utf-8");
const buildspecFileYaml = YAML.parse(buildspecFile, {
prettyErrors: true,
});
new CodeBuild.Project(
this,
`${props.pipelineStageInfo.stageName}ColdBuild`,
{
projectName: `${props.pipelineStageInfo.stageName}ColdBuild`,
environment: {
buildImage: CodeBuild.LinuxBuildImage.STANDARD_5_0,
},
buildSpec: CodeBuild.BuildSpec.fromObjectToYaml(buildspecFileYaml),
role: deploymentRole,
logging: {
cloudWatch: {
logGroup: new Logs.LogGroup(
this,
`${props.pipelineStageInfo.stageName}ColdBuildLogGroup`,
{
retention: Logs.RetentionDays.ONE_WEEK,
}
),
},
},
}
);
}
}
Pipeline Stack:
export interface PipelineStackProps extends CDK.StackProps {
readonly description: string;
readonly pipelineName: string;
}
/**
* This stack will contain our pipeline.
*/
export class PipelineStack extends CDK.Stack {
private readonly pipelineRole: IAM.IRole;
constructor(scope: Construct, id: string, props: PipelineStackProps) {
super(scope, id, props);
// Get the pipeline role from pipeline AWS account.
// The pipeline role will assume "Deployment Role" of each AWS account to perform the actual deployment.
const pipelineRoleName: string =
"eCommerceWebsitePipelineCdk-Pipeline-PipelineRole";
this.pipelineRole = IAM.Role.fromRoleArn(
this,
pipelineRoleName,
`arn:aws:iam::${this.account}:role/${pipelineRoleName}`,
{
mutable: false,
// Causes CDK to update the resource policy where required, instead of the Role
addGrantsToResources: true,
}
);
// Initialize the pipeline.
const pipeline = new codepipeline.Pipeline(this, props.pipelineName, {
pipelineName: props.pipelineName,
role: this.pipelineRole,
restartExecutionOnUpdate: true,
});
// Add a pipeline Source stage to fetch source code from repository.
const sourceCode = new codepipeline.Artifact();
this.addSourceStage(pipeline, sourceCode);
// For each AWS account, add a build stage and a deployment stage.
pipelineStageInfoList.forEach((pipelineStageInfo: PipelineStageInfo) => {
const deploymentRoleArn: string = `arn:aws:iam::${pipelineStageInfo.awsAccount}:role/${props.pipelineName}-DeploymentRole`;
const deploymentRole: IAM.IRole = IAM.Role.fromRoleArn(
this,
`DeploymentRoleFor${pipelineStageInfo.stageName}`,
deploymentRoleArn
);
const websiteArtifact = new codepipeline.Artifact();
// Add build stage to build the website artifact for the target AWS.
// Some environment variables will be retrieved from target AWS's secret manager.
this.addBuildStage(
pipelineStageInfo,
pipeline,
deploymentRole,
sourceCode,
websiteArtifact
);
// Add deployment stage to for the target AWS to do the actual deployment.
this.addDeploymentStage(
props,
pipelineStageInfo,
pipeline,
deploymentRole,
websiteArtifact
);
});
}
// Add Source stage to fetch code from GitHub repository.
private addSourceStage(
pipeline: codepipeline.Pipeline,
sourceCode: codepipeline.Artifact
) {
pipeline.addStage({
stageName: "Source",
actions: [
new codepipeline_actions.GitHubSourceAction({
actionName: "Checkout",
owner: "yangliu",
repo: "eCommerceWebsite",
branch: "main",
oauthToken: CDK.SecretValue.secretsManager(
"eCommerceWebsite-GitHubToken"
),
output: sourceCode,
trigger: codepipeline_actions.GitHubTrigger.WEBHOOK,
}),
],
});
}
private addBuildStage(
pipelineStageInfo: PipelineStageInfo,
pipeline: codepipeline.Pipeline,
deploymentRole: IAM.IRole,
sourceCode: codepipeline.Artifact,
websiteArtifact: codepipeline.Artifact
) {
const stage = new CDK.Stage(this, `${pipelineStageInfo.stageName}BuildId`, {
env: {
account: pipelineStageInfo.awsAccount,
},
});
const buildStage = pipeline.addStage(stage);
const targetProject: CodeBuild.IProject = CodeBuild.Project.fromProjectName(
this,
`CodeBuildProject${pipelineStageInfo.stageName}`,
`${pipelineStageInfo.stageName}ColdBuild`
);
buildStage.addAction(
new codepipeline_actions.CodeBuildAction({
actionName: `BuildArtifactForAAAA${pipelineStageInfo.stageName}`,
project: targetProject,
input: sourceCode,
outputs: [websiteArtifact],
role: deploymentRole,
})
);
}
private addDeploymentStage(
props: PipelineStackProps,
pipelineStageInfo: PipelineStageInfo,
pipeline: codepipeline.Pipeline,
deploymentRole: IAM.IRole,
websiteArtifact: codepipeline.Artifact
) {
const websiteBucket = S3.Bucket.fromBucketName(
this,
`${pipelineStageInfo.websiteBucketName}ConstructId`,
`${pipelineStageInfo.websiteBucketName}`
);
const pipelineStage = new PipelineStage(this, pipelineStageInfo.stageName, {
stageName: pipelineStageInfo.stageName,
pipelineName: props.pipelineName,
websiteDomain: pipelineStageInfo.websiteDomain,
websiteBucket: websiteBucket,
env: {
account: pipelineStageInfo.awsAccount,
region: pipelineStageInfo.awsRegion,
},
});
const stage = pipeline.addStage(pipelineStage);
stage.addAction(
new codepipeline_actions.S3DeployAction({
actionName: `DeploymentFor${pipelineStageInfo.stageName}`,
input: websiteArtifact,
bucket: websiteBucket,
role: deploymentRole,
})
);
}
}
buildspec.yml:
version: 0.2
env:
secrets-manager:
REACT_APP_DOMAIN: "REACT_APP_DOMAIN"
REACT_APP_BACKEND_SERVICE_API: "REACT_APP_BACKEND_SERVICE_API"
REACT_APP_GOOGLE_MAP_API_KEY: "REACT_APP_GOOGLE_MAP_API_KEY"
phases:
install:
runtime-versions:
nodejs: 14
commands:
- echo Performing yarn install
- yarn install
build:
commands:
- yarn build
artifacts:
base-directory: ./build
files:
- "**/*"
cache:
paths:
- "./node_modules/**/*"
I figured this out. An aws-codepipeline pipeline has a built-in artifacts bucket (see: CDK's CodePipeline or CodeBuildStep are leaving an S3 bucket behind, is there a way of automatically removing it?). That is different from the CodeBuild artifacts.
Because my pipeline role in Account A needs to assume the deployment role in Account B to perform the CodeBuild step (of Account B), I need to grant the deployment role in Account B write permission to the pipeline's built-in artifacts bucket. So I need to do this:
pipeline.artifactBucket.grantReadWrite(deploymentRole);
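For context, here is a minimal sketch of where that grant could live in the pipeline stack above; placing it inside the per-stage loop, right after the deployment role is imported, is an assumption rather than the only option:
// Inside PipelineStack's constructor, in the per-stage loop:
pipelineStageInfoList.forEach((pipelineStageInfo: PipelineStageInfo) => {
  const deploymentRole: IAM.IRole = IAM.Role.fromRoleArn(
    this,
    `DeploymentRoleFor${pipelineStageInfo.stageName}`,
    `arn:aws:iam::${pipelineStageInfo.awsAccount}:role/${props.pipelineName}-DeploymentRole`
  );
  // Allow the cross-account deployment role to read the source artifact and
  // write the website artifact in the pipeline's built-in artifact bucket.
  pipeline.artifactBucket.grantReadWrite(deploymentRole);
  // ... addBuildStage / addDeploymentStage as before ...
});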

Enabling Logs for AWS WAF WebAcl does not work in CDK

My goal is to enable logging for a regional WebAcl via AWS CDK. This seems to be possible via CloudFormation, and there are appropriate constructs in CDK. But when using the following code to create a Log Group and link it in a LoggingConfiguration ...
const webAclLogGroup = new LogGroup(scope, "awsWafLogs", {
logGroupName: `aws-waf-logs`
});
// Create logging configuration with log group as destination
new CfnLoggingConfiguration(scope, "webAclLoggingConfiguration", {
logDestinationConfigs: webAclLogGroup.logGroupArn, // Arn of LogGroup
resourceArn: aclArn // Arn of Acl
});
... I get an exception during cdk deploy, stating that the string in the LogDestinationConfigs is not a correct ARN (some parts of the ARN in the log messages have been removed):
Resource handler returned message: "Error reason: The ARN isn't valid. A valid ARN begins with arn: and includes other information separated by colons or slashes., field: LOG_DESTINATION, parameter: arn:aws:logs:xxx:xxx:xxx-awswaflogsF99ED1BA-PAeH9Lt2Y3fi:* (Service: Wafv2, Status Code: 400, Request ID: xxx, Extended Request ID: null)"
I cannot see an error in the generated CloudFormation output after cdk synth:
"webAclLoggingConfiguration": {
"id": "webAclLoggingConfiguration",
"path": "xxx/xxx/webAclLoggingConfiguration",
"attributes": {
"aws:cdk:cloudformation:type": "AWS::WAFv2::LoggingConfiguration",
"aws:cdk:cloudformation:props": {
"logDestinationConfigs": [
{
"Fn::GetAtt": [
{
"Ref": "awsWafLogs58D3FD01"
},
"Arn"
]
}
],
"resourceArn": {
"Fn::GetAtt": [
"webACL",
"Arn"
]
}
}
},
"constructInfo": {
"fqn": "aws-cdk-lib.aws_wafv2.CfnLoggingConfiguration",
"version": "2.37.1"
}
},
I'm using Cdk with Typescript and the Cdk version is currently set to 2.37.1 but it also did not work with 2.16.0.
WAF has particular requirements for the naming and format of logging destination configs, as described and shown in its docs.
Specifically, the ARN of the Log Group cannot end in :*, which unfortunately is the return value for a Log Group ARN in CloudFormation.
A workaround is to construct the required ARN format manually like this, which omits the :* suffix. Also note that logDestinationConfigs takes a list of strings, though with exactly one element in it.
const webAclLogGroup = new LogGroup(scope, "awsWafLogs", {
logGroupName: `aws-waf-logs`
});
// Create logging configuration with log group as destination
new CfnLoggingConfiguration(scope, "webAclLoggingConfiguration", {
logDestinationConfigs: [
// Construct the different ARN format from the logGroupName
Stack.of(this).formatArn({
arnFormat: ArnFormat.COLON_RESOURCE_NAME,
service: "logs",
resource: "log-group",
resourceName: webAclLogGroup.logGroupName,
})
],
resourceArn: aclArn // Arn of Acl
});
PS: I work for AWS on the CDK team.

How do I make my CloudFormation / CodePipeline update a domain name to point to an S3 bucket when using CDK?

I'm using CDK to deploy a CodePipeline that builds and deploys a React application to S3. All of that is working, but I want the deployment to update a domain name to point to that S3 bucket.
I already have the Zone defined in Route53 but it is defined by a different cloud formation stack because there are a lot of details that are not relevant for this app (MX, TXT, etc). What's the right way for my Pipeline/Stacks to set those domain names?
I could think of two solutions:
Delegate the domain to another zone, so zone example.com delegates staging.example.com.
Have my pipeline inject records into the existing zone.
I didn't try the delegated zone method; I was slightly concerned about manually maintaining the generated nameservers from staging.example.com in my CloudFormation for zone example.com.
I did try injecting the records into the existing zone, but I ran into some issues. I'm open to either solving these issues or doing this whichever way is correct.
In my stack (full pipeline at the bottom) I first define and deploy to the bucket:
const bucket = new s3.Bucket(this, "Bucket", {...})
new s3d.BucketDeployment(this, "WebsiteDeployment", {
sources: [...],
destinationBucket: bucket
})
then I tried to retrieve the zone and add the CNAME to it:
const dnsZone = route53.HostedZone.fromLookup(this, "DNS zone", {domainName: "example.com"})
new route53.CnameRecord(this, "cname", {
zone: dnsZone,
recordName: "staging",
domainName: bucket.bucketWebsiteDomainName
})
This fails due to lack of permissions to the zone, which is reasonable:
[Container] 2022/01/30 11:35:17 Running command npx cdk synth
current credentials could not be used to assume 'arn:aws:iam::...:role/cdk-hnb659fds-lookup-role-...-us-east-1', but are for the right account. Proceeding anyway.
[Error at /Pipeline/Staging] User: arn:aws:sts::...:assumed-role/Pipeline-PipelineBuildSynthCdkBuildProje-1H5AV7C28FZ3S/AWSCodeBuild-ada5ef88-cc82-4309-9acf-11bcf0bae878 is not authorized to perform: route53:ListHostedZonesByName because no identity-based policy allows the route53:ListHostedZonesByName action
Found errors
To try to solve that, I added rolePolicyStatements to my CodeBuildStep
rolePolicyStatements: [
new iam.PolicyStatement({
actions: ["route53:ListHostedZonesByName"],
resources: ["*"],
effect: iam.Effect.ALLOW
})
]
which might make more sense in the context of the whole file (at the bottom of this question). That had no effect. I'm not sure if the policy statement is wrong or I'm adding it to the wrong role.
After adding that rolePolicyStatements, I ran cdk deploy, which showed me this output:
> cdk deploy
✨ Synthesis time: 33.43s
This deployment will make potentially sensitive changes according to your current security approval level (--require-approval broadening).
Please confirm you intend to make the following modifications:
IAM Statement Changes
┌───┬──────────┬────────┬───────────────────────────────┬───────────────────────────────────────────────────────────┬───────────┐
│ │ Resource │ Effect │ Action │ Principal │ Condition │
├───┼──────────┼────────┼───────────────────────────────┼───────────────────────────────────────────────────────────┼───────────┤
│ + │ * │ Allow │ route53:ListHostedZonesByName │ AWS:${Pipeline/Pipeline/Build/Synth/CdkBuildProject/Role} │ │
└───┴──────────┴────────┴───────────────────────────────┴───────────────────────────────────────────────────────────┴───────────┘
(NOTE: There may be security-related changes not in this list. See https://github.com/aws/aws-cdk/issues/1299)
After deployment finishes, there's a role that I can see in the AWS console that has:
{
"Action": "route53:ListHostedZonesByName",
"Resource": "*",
"Effect": "Allow"
}
The ARN of the role is arn:aws:iam::...:role/Pipeline-PipelineBuildSynthCdkBuildProje-1H5AV7C28FZ3S. I'm not 100% sure the permissions are being granted to the right thing.
This is my whole CDK pipeline:
import * as path from "path";
import {Construct} from "constructs"
import * as pipelines from "aws-cdk-lib/pipelines"
import * as cdk from "aws-cdk-lib"
import * as s3 from "aws-cdk-lib/aws-s3"
import * as s3d from "aws-cdk-lib/aws-s3-deployment"
import * as iam from "aws-cdk-lib/aws-iam"
import * as route53 from "aws-cdk-lib/aws-route53";
export class MainStack extends cdk.Stack {
constructor(scope: Construct, id: string, props?: cdk.StageProps) {
super(scope, id, props)
const bucket = new s3.Bucket(this, "Bucket", {
websiteIndexDocument: "index.html",
websiteErrorDocument: "error.html",
publicReadAccess: true,
})
const dnsZone = route53.HostedZone.fromLookup(this, "DNS zone", {domainName: "example.com"})
new route53.CnameRecord(this, "cname", {
zone: dnsZone,
recordName: "staging",
domainName: bucket.bucketWebsiteDomainName
})
new s3d.BucketDeployment(this, "WebsiteDeployment", {
sources: [s3d.Source.asset(path.join(process.cwd(), "../build"))],
destinationBucket: bucket
})
}
}
export class DeployStage extends cdk.Stage {
public readonly mainStack: MainStack
constructor(scope: Construct, id: string, props?: cdk.StageProps) {
super(scope, id, props)
this.mainStack = new MainStack(this, "MainStack", {env: props?.env})
}
}
export interface PipelineStackProps extends cdk.StackProps {
readonly githubRepo: string
readonly repoBranch: string
readonly repoConnectionArn: string
}
export class PipelineStack extends cdk.Stack {
constructor(scope: Construct, id: string, props: PipelineStackProps) {
super(scope, id, props)
const pipeline = new pipelines.CodePipeline(this, id, {
pipelineName: id,
synth: new pipelines.CodeBuildStep("Synth", {
input: pipelines.CodePipelineSource.connection(props.githubRepo, props.repoBranch, {connectionArn: props.repoConnectionArn}),
installCommands: [
"npm install -g aws-cdk"
],
commands: [
// First build the React app.
"npm ci",
"npm run build",
// Now build the CF stack.
"cd infra",
"npm ci",
"npx cdk synth"
],
primaryOutputDirectory: "infra/cdk.out",
rolePolicyStatements: [
new iam.PolicyStatement({
actions: ["route53:ListHostedZonesByName"],
resources: ["*"],
effect: iam.Effect.ALLOW
})
]
},
),
})
const deploy = new DeployStage(this, "Staging", {env: props?.env})
const deployStage = pipeline.addStage(deploy)
}
}
You cannot depend on the CDK pipeline to fix itself if the synth stage is failing, since the pipeline's CloudFormation stack is changed in the SelfMutate stage, which uses the output of the synth stage. You will need to do one of the following to fix your pipeline:
Run cdk synth and cdk deploy PipelineStack locally (or anywhere outside the pipeline, where you have the required AWS IAM permissions). Edit: You will need to temporarily set selfMutation to false for this to work (Reference); see the sketch after this list.
Temporarily remove route53.HostedZone.fromLookup and route53.CnameRecord from your MainStack while still keeping the rolePolicyStatements change. Commit and push your code, let CodePipeline run once, making sure that the Pipeline self mutates and the IAM role has the required additional permissions. Add back the route53 constructs, commit, push again and check whether your code works with the new changes.
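For option 1, a minimal sketch of the temporary selfMutation toggle; the "owner/repo" source placeholder and the trimmed-down synth step are illustrative only, so keep your real synth step (including the rolePolicyStatements):
import * as cdk from "aws-cdk-lib"
import * as pipelines from "aws-cdk-lib/pipelines"
import { Construct } from "constructs"

export class PipelineStack extends cdk.Stack {
  constructor(scope: Construct, id: string, props?: cdk.StackProps) {
    super(scope, id, props)
    new pipelines.CodePipeline(this, id, {
      pipelineName: id,
      // Temporarily disable self-mutation so a local `cdk deploy PipelineStack`
      // updates the pipeline directly, instead of relying on the SelfMutate
      // stage (which depends on the currently failing synth output).
      // Revert to true once the pipeline is healthy again.
      selfMutation: false,
      synth: new pipelines.ShellStep("Synth", {
        input: pipelines.CodePipelineSource.gitHub("owner/repo", "main"),
        commands: ["npm ci", "npm run build", "npx cdk synth"],
      }),
    })
  }
}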

create resources using given template file using CDK with pipeline automation

[![codecommit containing cloudformation template][1]][1]
I have a requirement where I need to create a pipeline which takes a CloudFormation template YAML file as input and creates the resources accordingly.
The approach I took is providing the path of the template YAML file in the CodeBuild stage with this command:
"aws cloudformation deploy --template-file D:/pipeline/aws-waf.yml --stack-name waf-deployment"
export class PipelineStack extends Stack {
constructor(app: App, id: string, props: PipelineStackProps) {
super(app, id, props);
const code = codecommit.Repository.fromRepositoryName(this, 'ImportedRepo',
props.repoName);
const cdkBuild = new codebuild.PipelineProject(this, 'CdkBuild', {
buildSpec: codebuild.BuildSpec.fromObject({
version: '0.2',
phases: {
install: {
commands: 'npm install',
},
build: {
commands: [
'npm run build',
'npm run cdk synth -- -o dist',
'aws cloudformation deploy --template-file D:/pipeline/aws-waf.yml --stack-name waf-deployment',
'echo $?'
],
},
},
artifacts: {
'base-directory': 'dist',
files: [
'LambdaStack.template.json',
],
},
}),
environment: {
buildImage: codebuild.LinuxBuildImage.AMAZON_LINUX_2_3,
},
});
const sourceOutput = new codepipeline.Artifact();
const cdkBuildOutput = new codepipeline.Artifact('CdkBuildOutput');
//const lambdaBuildOutput = new codepipeline.Artifact('LambdaBuildOutput');
new codepipeline.Pipeline(this, 'Pipeline', {
stages: [
{
stageName: 'Source',
actions: [
new codepipeline_actions.CodeCommitSourceAction({
actionName: 'CodeCommit_Source',
repository: code,
output: sourceOutput,
}),
],
},
{
stageName: 'Build',
actions: [
new codepipeline_actions.CodeBuildAction({
actionName: 'CDK_Build',
project: cdkBuild,
input: sourceOutput,
outputs: [cdkBuildOutput],
}),
],
},
],
});
}
}
[![enter image description here][2]][2]
[1]: https://i.stack.imgur.com/u7rRe.jpg
[2]: https://i.stack.imgur.com/xzk6v.png
I'm not fully sure exactly what you are looking for, so maybe consider updating your question to be more specific. However, I took the question as: you are looking for the correct way to deploy CloudFormation/CDK given a template file in a CodePipeline?
The way that we handle deployments of CloudFormation via CodePipeline is by leveraging CodeBuild and CodeDeploy. The pipeline sources the file/change from a repository (optional; you could use many other triggers), CodeBuild then uploads the file to S3 using the AWS CLI, and once that file has been uploaded to S3 you can use CodeDeploy to deploy CloudFormation from the source file in S3.
So for your example above, I would update the build to upload the new artifact to S3, and then create a new step in your pipeline to use CodeDeploy to deploy that S3 template.
It's entirely possible to build a script/CodeBuild commands to do the CloudFormation deploy as well, but because CodeDeploy already supports tracking that change, error handling, etc., I would recommend using CodeDeploy for CloudFormation deploys.
Note:
If you are not using an existing CloudFormation template (JSON/YAML) and are instead using CDK, you will need to synthesize your CDK app into a CloudFormation template before uploading it to S3.
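As a hedged CDK sketch of such a deploy step: the snippet below swaps in CodePipeline's built-in CloudFormation deploy action (from aws-codepipeline-actions) rather than CodeDeploy, since that action reads a template straight from the pipeline's artifact store in S3. The artifact path aws-waf.yml and the declared variables are assumptions about the pipeline in the question, not a definitive setup:
import * as codepipeline from 'aws-cdk-lib/aws-codepipeline';
import * as codepipeline_actions from 'aws-cdk-lib/aws-codepipeline-actions';

// Assumed to exist from the pipeline in the question; the buildspec's
// 'artifacts' section would need to export aws-waf.yml for this to work.
declare const pipeline: codepipeline.Pipeline;
declare const cdkBuildOutput: codepipeline.Artifact;

pipeline.addStage({
  stageName: 'DeployWafTemplate',
  actions: [
    new codepipeline_actions.CloudFormationCreateUpdateStackAction({
      actionName: 'Deploy_WAF',
      stackName: 'waf-deployment',
      // The template is read from the pipeline's artifact bucket in S3.
      templatePath: cdkBuildOutput.atPath('aws-waf.yml'),
      adminPermissions: true,
    }),
  ],
});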