Add IAM capabilities to stack in CDK - amazon-web-services

In the bin directory in my CDK project I have this:
#!/usr/bin/env node
import 'source-map-support/register';
import * as cdk from '@aws-cdk/core';
import {PipelineStack} from "../lib/pipeline-stack";

const app = new cdk.App();
new PipelineStack(app, 'PipelineStack', {
  env: {account: '12345678912', region: 'us-east-1'},
});
app.synth();
Where PipelineStack is defined as (in my ../lib directory):
import {Construct, Stack, StackProps} from '@aws-cdk/core';
import {CodePipeline, CodePipelineSource, ShellStep} from '@aws-cdk/pipelines';
import {MyAppStage} from './my-app-stage';

/**
 * The stack that defines the application pipeline
 */
export class PipelineStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    const appName = 'MyApp';
    const pipeline = new CodePipeline(this, 'Pipeline', {
      // The pipeline name
      pipelineName: 'MyAppPipeline',
      // How it will be built and synthesized
      synth: new ShellStep('Synth', {
        // Where the source can be found
        input: CodePipelineSource.gitHub('OWNER/REPO', 'master'),
        // Install dependencies, build and run cdk synth
        commands: [
          'npm ci',
          'npm run build',
          'npx cdk synth'
        ],
      }),
    });

    pipeline.addStage(new MyAppStage(this, 'MyAppProdStage', 'Prod', appName, 'mydomain.com', {
      env: {account: '12345678912', region: 'us-east-1'}
    }));
  }
}
And MyAppStage is:
import {Construct, Stage, StageProps} from '@aws-cdk/core';
import {MyAppStack} from './my-app-stack';

/**
 * Deployable unit of web service app
 */
export class MyAppStage extends Stage {
  constructor(scope: Construct, id: string, stageName: string, appName: string, domainName: string, props?: StageProps) {
    super(scope, id, props);

    new MyAppStack(this, `${appName}${stageName}Stack`, stageName, appName, domainName, {
      stackName: `${appName}${stageName}Stack`,
    });
  }
}
And MyAppStack is a stack with my actual resources. Basically I followed this guide.
It worked fine until I added secret rotation for the RDS credentials. The MyAppStack stack then fails with:
Requires capabilities : [CAPABILITY_AUTO_EXPAND]
Which makes sense; however, I can't find a way to add the IAM capabilities to the stack through the CDK. Am I doing something wrong? Is the approach from the guide simply not meant to handle this? Can I add the capabilities somehow?

It appears that this was, indeed, a bug, and Otavio Macedo fixed it in GitHub pull request #15819.
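For anyone pinned to a CDK version that predates that fix, one workaround (a sketch only, not the fix from the PR; the artifact and template file names below are placeholders) is to deploy the affected stack through the lower-level aws-codepipeline-actions deploy action, which accepts the required capabilities explicitly:

import { CfnCapabilities } from '@aws-cdk/core';
import * as codepipeline from '@aws-cdk/aws-codepipeline';
import * as cpactions from '@aws-cdk/aws-codepipeline-actions';

// Sketch: 'cloudAssembly' stands in for the artifact produced by your build/synth action,
// and the template path is a placeholder for the synthesized template of the failing stack.
declare const cloudAssembly: codepipeline.Artifact;

const deployAction = new cpactions.CloudFormationCreateUpdateStackAction({
  actionName: 'DeployMyAppProdStack',
  stackName: 'MyAppProdStack',
  templatePath: cloudAssembly.atPath('MyAppProdStack.template.json'),
  adminPermissions: true,
  // Grant the capabilities CloudFormation asked for.
  cfnCapabilities: [CfnCapabilities.NAMED_IAM, CfnCapabilities.AUTO_EXPAND],
});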

Related

Update config.json file for a static frontend React site, using AWS CodePipeline and CDK

I'm building a CDK pipeline that will update another CDK template.
This CDK template is a static frontend React app.
The backend uses an AWS Lambda, API Gateway, and CloudFront distribution to host the site.
I want to put the APIs in the config.json file as I normally would if I were building it manually, one service at a time.
The problem seems to be in the CDK pipeline-stack, which builds the static-frontend-stack.
When you initialize a new pipeline, it wants you to add shell steps first (npm i, cd into the correct folder, npm run build, etc.), which creates the distribution folder I need and turns the whole thing into a CloudFormation template.
Then you can drop that into the different stages you want, e.g., test and prod.
However, I won't receive the CfnOutputs until the stages are built, and the CfnOutputs hold the APIs and other info I need to put into the config.json file (which was already built first, with empty values).
There is even an envFromCfnOutputs param to add to the initial CodeBuild pipeline, but since the stages are initialized/created later, TypeScript yells at me for referencing them before they exist. I understand why that errors, but I can't figure out a clever way to fix this issue.
import * as cdk from "aws-cdk-lib";
import { Construct } from "constructs";
import * as pipelines from "aws-cdk-lib/pipelines";
import * as codecommit from "aws-cdk-lib/aws-codecommit";
import { Stages } from "./stages";
import { Stack, Stage } from "aws-cdk-lib";
interface PipelineStackProps extends cdk.StackProps {
env: {
account: string;
region: string;
stage: string;
};
}
export class PipelineStack extends cdk.Stack {
constructor(scope: Construct, id: string, props: PipelineStackProps) {
super(scope, id, props);
/************ Grab Repo ************/
const source = codecommit.Repository.fromRepositoryName(
this,
"PreCallbackSMSSolution",
"PreCallbackSMSSolution"
);
/************ Define Pipeline & Build ShellStep (for Frontend) ************/
const Pipeline = new pipelines.CodePipeline(this, "Pipeline", {
pipelineName: `CodePipeline`,
selfMutation: true,
crossAccountKeys: true,
synthCodeBuildDefaults: {
rolePolicy: [
// #desc Policy to allow CodeBuild to use CodeArtifact
// #external https://docs.aws.amazon.com/codeartifact/latest/ug/using-npm-packages-in-codebuild.html
new cdk.aws_iam.PolicyStatement({
actions: [
"codeartifact:GetAuthorizationToken",
"codeartifact:GetRepositoryEndpoint",
"codeartifact:ReadFromRepository",
],
resources: ["*"],
}),
new cdk.aws_iam.PolicyStatement({
actions: ["sts:GetServiceBearerToken"],
resources: ["*"],
conditions: {
StringEquals: {
"sts:AWSServiceName": "codeartifact.amazonaws.com",
},
},
}),
],
},
synth: new pipelines.ShellStep("Synth", {
input: pipelines.CodePipelineSource.codeCommit(source, "master"),
installCommands: [
"cd $CODEBUILD_SRC_DIR/deployment",
"npm install -g typescript",
"npm run co:login",
"npm i",
],
env: {
stage: props.env.stage,
},
envFromCfnOutputs: {
// TODO: cfn outputs need to go here!
// CcpUrlOutput: TestStage.CcpUrlOutput,
// loginUrlOutput: TestStage.LoginUrlOutput,
// regionOutput: TestStage.RegionOutput,
// apiOutput: TestStage.ApiOutput
},
commands: [
"cd $CODEBUILD_SRC_DIR/frontend",
"pwd",
"apt-get install jq -y",
"chmod +x ./generate-config.sh",
"npm i",
"npm run build-prod",
"pwd",
"cat ./src/config-prod.json",
"cd ../deployment",
"npx cdk synth",
],
primaryOutputDirectory: "$CODEBUILD_SRC_DIR/deployment/cdk.out", // $CODEBUILD_SRC_DIR = starts root path
}),
});
/************ Initialize Test Stack & Add Stage************/
const TestStage = new Stages(this, "TestStage", {
env: { account: "***********", region: "us-east-1", stage: "test" },
}); // Aspen Sandbox
Pipeline.addStage(TestStage);
/************ Initialize Prod Stack & Add Stage ************/
const ProdStage = new Stages(this, "ProdStage", {
env: { account: "***********", region: "us-east-1", stage: "prod" },
}); // Aspen Sandbox
Pipeline.addStage(ProdStage);
/************ Build Pipeline ************/
Pipeline.buildPipeline();
/************ Manual Approve Stage ************/
const ApproveStage = Pipeline.pipeline.addStage({
stageName: "PromoteToProd",
placement: {
justAfter: Pipeline.pipeline.stage("TestStage"),
},
});
ApproveStage.addAction(
new cdk.aws_codepipeline_actions.ManualApprovalAction({
actionName: "Approve",
additionalInformation: "Approve this deployment for production.",
})
);
}
/****/
}
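For reference, envFromCfnOutputs expects concrete CfnOutput objects, and it is normally attached to a step that runs after the stage that produces those outputs rather than to the synth step. A hedged sketch, assuming the Stages construct exposes its outputs as public CfnOutput properties (ccpUrlOutput and apiOutput below are assumed names); this would replace the plain Pipeline.addStage(TestStage) call above:

// Sketch only: TestStage.ccpUrlOutput and TestStage.apiOutput are assumed to be
// public CfnOutput properties surfaced by the Stages construct.
Pipeline.addStage(TestStage, {
  post: [
    new pipelines.ShellStep("WriteFrontendConfig", {
      envFromCfnOutputs: {
        CCP_URL: TestStage.ccpUrlOutput,
        API_URL: TestStage.apiOutput,
      },
      commands: [
        // Prove the outputs are visible to this step; in practice you would
        // regenerate config.json from these variables here.
        'echo "API_URL=$API_URL CCP_URL=$CCP_URL"',
      ],
    }),
  ],
});

In practice that post step would rebuild config.json from the environment variables and publish it (for example, copying it to the site bucket), since the copy baked in at synth time cannot see outputs that do not exist yet.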

Can I tag my code on GitHub when building it through a CDK Pipeline on AWS?

I have some GitHub repositories with my project source code, and I build them through CDK Pipelines on AWS. I basically grab the source code, build the Docker images, and push them to ECR. I was wondering if I could tag the version of the code on GitHub through some step or code in the pipeline, so I can keep track of which build corresponds to which commit. I tried looking it up but didn't find anything, so I thought I might have more luck here if anyone has done this.
Here is a working example of a pipeline initialization with CDK v2 using a CodeCommit repo:
import { Stack, StackProps, Stage, StageProps } from 'aws-cdk-lib';
import { Repository } from 'aws-cdk-lib/aws-codecommit';
import { PolicyStatement } from 'aws-cdk-lib/aws-iam';
import { CodeBuildStep, CodePipeline, CodePipelineSource, ManualApprovalStep } from 'aws-cdk-lib/pipelines';
import { Construct } from 'constructs';
// This will be your infrastructure code
import { InfrastructureStack } from './infrastructure-stack';

export interface PipelineStackProps extends StackProps {
  readonly repository: string;
}

export class PipelineStack extends Stack {
  constructor(scope: Construct, id: string, props: PipelineStackProps) {
    super(scope, id, props);

    const pipeline = new CodePipeline(this, 'pipelineID', {
      pipelineName: "pipeline-name",
      synth: new CodeBuildStep('Synth', {
        input: CodePipelineSource.codeCommit(Repository.fromRepositoryArn(this, `repository`, props.repository), 'master'),
        commands: [
          'npm ci',
          'npm run build',
          'npx cdk synth',
        ],
        rolePolicyStatements: [
          new PolicyStatement({
            actions: ['ssm:GetParameter'],
            resources: [`*`],
          })
        ]
      }),
      dockerEnabledForSynth: true
    });

    pipeline.addStage(new InfrastructureStage(this, 'qa'));
    pipeline.addStage(new InfrastructureStage(this, 'prod'), {
      pre: [new ManualApprovalStep('PromoteToProd')]
    });
  }
}

class InfrastructureStage extends Stage {
  constructor(scope: Construct, id: string, props?: StageProps) {
    super(scope, id, props);
    new InfrastructureStack(this, "InfrastructureStack", {
      environment: id
    })
  }
}
If you have a look at the CodePipelineSource documentation, you can see that it can also interact with GitHub, using the gitHub method and GitHubSourceOptions.
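A minimal sketch of what that looks like ('github-token' is a placeholder name for a Secrets Manager secret holding a GitHub personal access token):

import { SecretValue } from 'aws-cdk-lib';
import { CodePipelineSource } from 'aws-cdk-lib/pipelines';

// Sketch: pull source from GitHub instead of CodeCommit.
const source = CodePipelineSource.gitHub('OWNER/REPO', 'main', {
  authentication: SecretValue.secretsManager('github-token'),
});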

CodeBuild: Artifacts upload location doesn't match

Here is my CodeBuild main page, which says the "Artifacts upload location" is "alpha-artifact-bucket":
Here is one of the build runs, which is not using the above bucket:
What's the difference between the two? Why does every build run use a random bucket?
Is there any way to enforce that CodeBuild uses the specified S3 bucket "alpha-artifact-bucket"?
CDK code
CodeBuild stack: I deploy this stack to each AWS account along the pipeline first, so that the pipeline stack can just query each AWS account, find its corresponding CodeBuild project, and add it as a "stage". The reason I'm doing this is that each AWS account has a dedicated CodeBuild stage, which needs to read some values from its Secrets Manager.
export interface CodeBuildStackProps extends Cdk.StackProps {
  readonly pipelineName: string;
  readonly pipelineRole: IAM.IRole;
  readonly pipelineStageInfo: PipelineStageInfo;
}

/**
 * This stack will create CodeBuild for the target AWS account.
 */
export class CodeBuildStack extends Cdk.Stack {
  constructor(scope: Construct, id: string, props: CodeBuildStackProps) {
    super(scope, id, props);

    // DeploymentRole will be assumed by PipelineRole to perform the CodeBuild step.
    const deploymentRoleArn: string = `arn:aws:iam::${props.env?.account}:role/${props.pipelineName}-DeploymentRole`;
    const deploymentRole = IAM.Role.fromRoleArn(
      this,
      `CodeBuild${props.pipelineStageInfo.stageName}DeploymentRoleConstructID`,
      deploymentRoleArn,
      {
        mutable: false,
        // Causes CDK to update the resource policy where required, instead of the Role
        addGrantsToResources: true,
      }
    );

    const buildspecFile = FS.readFileSync("./config/buildspec.yml", "utf-8");
    const buildspecFileYaml = YAML.parse(buildspecFile, {
      prettyErrors: true,
    });

    new CodeBuild.Project(
      this,
      `${props.pipelineStageInfo.stageName}ColdBuild`,
      {
        projectName: `${props.pipelineStageInfo.stageName}ColdBuild`,
        environment: {
          buildImage: CodeBuild.LinuxBuildImage.STANDARD_5_0,
        },
        buildSpec: CodeBuild.BuildSpec.fromObjectToYaml(buildspecFileYaml),
        role: deploymentRole,
        logging: {
          cloudWatch: {
            logGroup: new Logs.LogGroup(
              this,
              `${props.pipelineStageInfo.stageName}ColdBuildLogGroup`,
              {
                retention: Logs.RetentionDays.ONE_WEEK,
              }
            ),
          },
        },
      }
    );
  }
}
Pipeline Stack:
export interface PipelineStackProps extends CDK.StackProps {
  readonly description: string;
  readonly pipelineName: string;
}

/**
 * This stack will contain our pipeline.
 */
export class PipelineStack extends CDK.Stack {
  private readonly pipelineRole: IAM.IRole;

  constructor(scope: Construct, id: string, props: PipelineStackProps) {
    super(scope, id, props);

    // Get the pipeline role from the pipeline AWS account.
    // The pipeline role will assume the "Deployment Role" of each AWS account to perform the actual deployment.
    const pipelineRoleName: string =
      "eCommerceWebsitePipelineCdk-Pipeline-PipelineRole";
    this.pipelineRole = IAM.Role.fromRoleArn(
      this,
      pipelineRoleName,
      `arn:aws:iam::${this.account}:role/${pipelineRoleName}`,
      {
        mutable: false,
        // Causes CDK to update the resource policy where required, instead of the Role
        addGrantsToResources: true,
      }
    );

    // Initialize the pipeline.
    const pipeline = new codepipeline.Pipeline(this, props.pipelineName, {
      pipelineName: props.pipelineName,
      role: this.pipelineRole,
      restartExecutionOnUpdate: true,
    });

    // Add a pipeline Source stage to fetch source code from the repository.
    const sourceCode = new codepipeline.Artifact();
    this.addSourceStage(pipeline, sourceCode);

    // For each AWS account, add a build stage and a deployment stage.
    pipelineStageInfoList.forEach((pipelineStageInfo: PipelineStageInfo) => {
      const deploymentRoleArn: string = `arn:aws:iam::${pipelineStageInfo.awsAccount}:role/${props.pipelineName}-DeploymentRole`;
      const deploymentRole: IAM.IRole = IAM.Role.fromRoleArn(
        this,
        `DeploymentRoleFor${pipelineStageInfo.stageName}`,
        deploymentRoleArn
      );
      const websiteArtifact = new codepipeline.Artifact();

      // Add a build stage to build the website artifact for the target AWS account.
      // Some environment variables will be retrieved from the target account's Secrets Manager.
      this.addBuildStage(
        pipelineStageInfo,
        pipeline,
        deploymentRole,
        sourceCode,
        websiteArtifact
      );

      // Add a deployment stage for the target AWS account to do the actual deployment.
      this.addDeploymentStage(
        props,
        pipelineStageInfo,
        pipeline,
        deploymentRole,
        websiteArtifact
      );
    });
  }

  // Add a Source stage to fetch code from the GitHub repository.
  private addSourceStage(
    pipeline: codepipeline.Pipeline,
    sourceCode: codepipeline.Artifact
  ) {
    pipeline.addStage({
      stageName: "Source",
      actions: [
        new codepipeline_actions.GitHubSourceAction({
          actionName: "Checkout",
          owner: "yangliu",
          repo: "eCommerceWebsite",
          branch: "main",
          oauthToken: CDK.SecretValue.secretsManager(
            "eCommerceWebsite-GitHubToken"
          ),
          output: sourceCode,
          trigger: codepipeline_actions.GitHubTrigger.WEBHOOK,
        }),
      ],
    });
  }

  private addBuildStage(
    pipelineStageInfo: PipelineStageInfo,
    pipeline: codepipeline.Pipeline,
    deploymentRole: IAM.IRole,
    sourceCode: codepipeline.Artifact,
    websiteArtifact: codepipeline.Artifact
  ) {
    const stage = new CDK.Stage(this, `${pipelineStageInfo.stageName}BuildId`, {
      env: {
        account: pipelineStageInfo.awsAccount,
      },
    });
    const buildStage = pipeline.addStage(stage);
    const targetProject: CodeBuild.IProject = CodeBuild.Project.fromProjectName(
      this,
      `CodeBuildProject${pipelineStageInfo.stageName}`,
      `${pipelineStageInfo.stageName}ColdBuild`
    );
    buildStage.addAction(
      new codepipeline_actions.CodeBuildAction({
        actionName: `BuildArtifactForAAAA${pipelineStageInfo.stageName}`,
        project: targetProject,
        input: sourceCode,
        outputs: [websiteArtifact],
        role: deploymentRole,
      })
    );
  }

  private addDeploymentStage(
    props: PipelineStackProps,
    pipelineStageInfo: PipelineStageInfo,
    pipeline: codepipeline.Pipeline,
    deploymentRole: IAM.IRole,
    websiteArtifact: codepipeline.Artifact
  ) {
    const websiteBucket = S3.Bucket.fromBucketName(
      this,
      `${pipelineStageInfo.websiteBucketName}ConstructId`,
      `${pipelineStageInfo.websiteBucketName}`
    );
    const pipelineStage = new PipelineStage(this, pipelineStageInfo.stageName, {
      stageName: pipelineStageInfo.stageName,
      pipelineName: props.pipelineName,
      websiteDomain: pipelineStageInfo.websiteDomain,
      websiteBucket: websiteBucket,
      env: {
        account: pipelineStageInfo.awsAccount,
        region: pipelineStageInfo.awsRegion,
      },
    });
    const stage = pipeline.addStage(pipelineStage);
    stage.addAction(
      new codepipeline_actions.S3DeployAction({
        actionName: `DeploymentFor${pipelineStageInfo.stageName}`,
        input: websiteArtifact,
        bucket: websiteBucket,
        role: deploymentRole,
      })
    );
  }
}
buildspec.yml:
version: 0.2
env:
  secrets-manager:
    REACT_APP_DOMAIN: "REACT_APP_DOMAIN"
    REACT_APP_BACKEND_SERVICE_API: "REACT_APP_BACKEND_SERVICE_API"
    REACT_APP_GOOGLE_MAP_API_KEY: "REACT_APP_GOOGLE_MAP_API_KEY"
phases:
  install:
    runtime-versions:
      nodejs: 14
    commands:
      - echo Performing yarn install
      - yarn install
  build:
    commands:
      - yarn build
artifacts:
  base-directory: ./build
  files:
    - "**/*"
cache:
  paths:
    - "./node_modules/**/*"
I figured this out. An aws-codepipeline pipeline has a built-in artifacts bucket (see "CDK's CodePipeline or CodeBuildStep are leaving an S3 bucket behind, is there a way of automatically removing it?"). That is different from the CodeBuild artifacts.
Because my pipeline role in Account A needs to assume the deployment role in Account B to perform the CodeBuild step (of Account B), I need to grant the deployment role in Account B write permission to the pipeline's built-in artifacts bucket. So I need to do this:
pipeline.artifactBucket.grantReadWrite(deploymentRole);
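As an aside, if the goal is to force the pipeline to keep its artifacts in a specific bucket rather than the auto-created one, codepipeline.Pipeline accepts an artifactBucket prop. A minimal sketch, assuming alpha-artifact-bucket already exists and using the same namespaces as the stacks above:

// Sketch: point the pipeline's artifact store at an existing, named bucket.
const artifactBucket = S3.Bucket.fromBucketName(
  this,
  "AlphaArtifactBucketConstructId",
  "alpha-artifact-bucket"
);
const pipeline = new codepipeline.Pipeline(this, props.pipelineName, {
  pipelineName: props.pipelineName,
  role: this.pipelineRole,
  restartExecutionOnUpdate: true,
  artifactBucket: artifactBucket, // use the named bucket instead of the auto-generated one
});

Note that this controls the pipeline's artifact store; builds launched by the pipeline stage their output through that store rather than through the bucket shown on the CodeBuild project page.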

In AWS CDK, how should CodeStarConnectionsSourceAction be used as input to a CodePipeline?

I'm trying to have a pipeline that fetches code from GitHub and deploys it (Lambda, DynamoDB, etc., using CDK). I'm trying to make it work with CodeStarConnectionsSourceAction at the moment, and my code is failing with this error:
[Example4BePipeline/Example4BePipeline/Pipeline] Action Synth in stage Build: first stage may only contain Source actions
[Example4BePipeline/Example4BePipeline/Pipeline] Action 'Synth' is using input Artifact 'Sources', which is not being produced in this pipeline
My code producing that error is:
import * as cdk from "aws-cdk-lib"
import {Construct} from "constructs"
import {Example4BeDeployStage} from "./example4-be-deploy-stage"
import {CodeBuildStep, CodePipeline, CodePipelineFileSet} from "aws-cdk-lib/pipelines"
import {CodeStarConnectionsSourceAction} from "aws-cdk-lib/aws-codepipeline-actions";
import {Artifact} from "aws-cdk-lib/aws-codepipeline";
export class PipelineStack extends cdk.Stack {
constructor(scope: Construct, id: string, props?: cdk.StackProps) {
super(scope, id, props)
const sourceOutput = new Artifact("Sources");
const sourceAction = new CodeStarConnectionsSourceAction({
actionName: "Github",
owner: "username",
repo: "example4-be",
output: sourceOutput,
connectionArn: "arn:aws:codestar-connections:us-east-1:....:connection/...."
})
const pipeline = new CodePipeline(this, "Example4BePipeline", {
pipelineName: "Example4BePipeline",
synth: new CodeBuildStep("Synth", {
input: CodePipelineFileSet.fromArtifact(sourceOutput),
installCommands: [
"npm install -g aws-cdk"
],
commands: [
"npm ci",
"npm run build",
"npx cdk synth"
]
}
),
selfMutation: false // TODO: remove before committing.
})
}
}
What am I doing wrong here?
I figured it out; this is the working code, much shorter and more concise:
import * as cdk from "aws-cdk-lib"
import {Construct} from "constructs"
import {Example4BeDeployStage} from "./example4-be-deploy-stage"
import {CodeBuildStep, CodePipeline, CodePipelineSource} from "aws-cdk-lib/pipelines"
export class PipelineStack extends cdk.Stack {
constructor(scope: Construct, id: string, props?: cdk.StackProps) {
super(scope, id, props)
const pipeline = new CodePipeline(this, "Example4BePipeline", {
pipelineName: "Example4BePipeline",
synth: new CodeBuildStep("Synth", {
input: CodePipelineSource.connection("username/example4-be", "main", {
connectionArn: "arn:aws:codestar-connections:us-east-1:....:connection/....",
}),
installCommands: [
"npm install -g aws-cdk"
],
commands: [
"npm ci",
"npm run build",
"npx cdk synth"
]
}
),
selfMutation: false // TODO: remove before committing.
})
}
}
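The Example4BeDeployStage import is unused in both snippets; wiring the deployment in would presumably look something like the following (a sketch, assuming Example4BeDeployStage extends cdk.Stage and takes the usual scope/id/props signature):

// Sketch only: account and region are placeholders.
pipeline.addStage(new Example4BeDeployStage(this, "Deploy", {
  env: { account: "111111111111", region: "us-east-1" },
}));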

Deleting code when syncing GitHub source code to S3 with CDK

I am trying to use CodePipeline in AWS CDK to automatically deploy code from a GitHub source to an S3 bucket. The code is as follows:
import * as codepipeline from '@aws-cdk/aws-codepipeline';
import * as codepipeline_actions from '@aws-cdk/aws-codepipeline-actions';
import * as s3 from '@aws-cdk/aws-s3';
import { Construct, Stack, StackProps } from '@aws-cdk/core';

export class S3PipelineStack extends Stack {
  constructor(scope: Construct, id: string, props: StackProps = {}) {
    super(scope, id, props);

    const dagsBucket = s3.Bucket.fromBucketName(this, 'my-bucket', `test-bucket`);

    const pipeline = new codepipeline.Pipeline(this, 'my-s3-pipeline', {
      pipelineName: 'MyS3Pipeline',
    });

    const sourceOutput = new codepipeline.Artifact();
    const sourceAction = new codepipeline_actions.CodeStarConnectionsSourceAction({
      actionName: 'Source',
      owner: '***',
      repo: '***',
      connectionArn: 'arn:aws:***',
      output: sourceOutput,
      branch: 'master',
    });

    const deployAction = new codepipeline_actions.S3DeployAction({
      actionName: 'S3Deploy',
      bucket: dagsBucket,
      input: sourceOutput,
    });

    pipeline.addStage({
      stageName: 'Source',
      actions: [sourceAction],
    });

    pipeline.addStage({
      stageName: 'Deploy',
      actions: [deployAction],
    });
  }
}
This code works, but the problem is that the S3 bucket only gets files added or changed when the source changes in GitHub; it never deletes files that were deleted from the source.
I also found a note about this in the AWS docs.
Another possible solution is s3deploy.BucketDeployment, but again it doesn't support a Git source; it can only take a source from a local asset or another S3 bucket.
So does anybody know the right way to sync GitHub and the S3 bucket so that additions, changes, and deletions from the source are all reflected?
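One direction worth exploring (a sketch, not a verified answer): s3deploy.BucketDeployment prunes by default, i.e. it deletes objects in the destination that are no longer present in its source. If the site content can be packaged as a local asset at synth time (for example, in a setup where the repo has already been checked out when cdk synth runs), something like this keeps the bucket in sync, including deletions; the './source-code' path is a placeholder:

import * as s3deploy from '@aws-cdk/aws-s3-deployment';

// Sketch: deploy the checked-out content as an asset and let prune remove deleted files.
new s3deploy.BucketDeployment(this, 'SyncFromSource', {
  sources: [s3deploy.Source.asset('./source-code')], // placeholder path to the content
  destinationBucket: dagsBucket,
  prune: true, // delete bucket objects that no longer exist in the source (the default)
});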