How to grant particular/full access to Elastic Beanstalk using AWS CDK?

I have created an AWS pipeline to deploy my .NET Framework app to Elastic Beanstalk using CDK TypeScript code.
But the deploy step fails with the following error:
Insufficient permissions
The provided role does not have the elasticbeanstalk:CreateApplicationVersion permission
I am not sure how to assign permissions using AWS CDK. How should I add permissions in the code below?
Here is my deploy stage code:
const appName = "SampleDotNetMVCWebApp";

const newRole = new iam.Role(this, "Role", {
  assumedBy: new iam.ServicePrincipal("elasticbeanstalk.amazonaws.com"),
});
newRole.addToPolicy(
  new iam.PolicyStatement({
    resources: ["*"],
    actions: ["elasticbeanstalk:CreateApplicationVersion"],
  })
);

const app = new elasticbeanstalk.CfnApplication(this, "EBApplication", {
  applicationName: appName,
});

const elbEnv = new elasticbeanstalk.CfnEnvironment(this, "Environment", {
  environmentName: "SampleMVCEBEnvironment",
  applicationName: appName,
  platformArn: platform,
  solutionStackName: "64bit Windows Server 2012 R2 v2.5.0 running IIS 8.5",
});

pipeline.addStage({
  stageName: "Deploy",
  actions: [
    new ElasticBeanStalkDeployAction({
      actionName: "DeployToEB",
      applicationName: appName,
      environmentName: "SampleMVCEBEnvironment",
      input: cdkBuildOutput,
      role: newRole,
    }),
  ],
});
NOTE: In the above code, the pipeline action ElasticBeanStalkDeployAction is a custom action, since AWS CDK has not yet released a built-in deploy-to-Elastic-Beanstalk action. You can find the issue, and a sample IAction implementation, here:
https://github.com/aws/aws-cdk/issues/2516

You need to grant the ElasticBeanStalkDeployAction's role the elasticbeanstalk:CreateApplicationVersion permission. Use .addToRolePolicy:
These policies will be created with the role, whereas those added by
addToPolicy are added using a separate CloudFormation resource
(allowing a way around circular dependencies that could otherwise be
introduced).
const elasticBeanStalkDeployAction = new ElasticBeanStalkDeployAction({
  actionName: 'DeployToEB',
  applicationName: appName,
  environmentName: 'SampleMVCEBEnvironment',
  input: cdkBuildOutput,
});

elasticBeanStalkDeployAction.addToRolePolicy(new PolicyStatement({
  effect: Effect.ALLOW,
  resources: ['*'],
  actions: ['elasticbeanstalk:*'],
}));
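For reference, the statement attached above corresponds (roughly) to the following IAM policy JSON; this is an illustrative sketch of the document CDK synthesizes into the role's AWS::IAM::Policy resource, not output copied from a real synth:

```typescript
// Rough JSON shape of the policy statement added by addToRolePolicy above
// (illustrative only; CDK emits the equivalent into the synthesized template).
const statementJson = {
  Effect: "Allow",
  Action: ["elasticbeanstalk:*"],
  Resource: ["*"],
};
console.log(JSON.stringify(statementJson, null, 2));
```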
Later, use the object you created and pass it into the action:
pipeline.addStage({
  stageName: 'Deploy',
  actions: [elasticBeanStalkDeployAction],
});

Alternatively, as suggested in https://github.com/aws/aws-cdk/issues/2516, the issue can be resolved by attaching a managed policy inside the bind method of ElasticBeanStalkDeployAction (note that the AWSElasticBeanstalkFullAccess managed policy has since been deprecated by AWS in favor of AdministratorAccess-AWSElasticBeanstalk):
public bind(scope: Construct, stage: codepipeline.IStage, options: codepipeline.ActionBindOptions): codepipeline.ActionConfig {
  options.role.addManagedPolicy(
    iam.ManagedPolicy.fromAwsManagedPolicyName(
      "AWSElasticBeanstalkFullAccess"
    )
  );
  return {};
}

Related

Export CDK Generated InternetGateway Resource [AWS Cloudformation]

So I have a basic VPC setup on CDK, after running cdk synth I see that an AWS::EC2::InternetGateway construct has been generated in my template under cdk.out. The problem is that I need to export this resource to another stack, but I can't reference this.vpc.internetGatewayId in my main .ts cdk file to export.
I know the resource has been created, so I won't use CfnInternetGateway to create another one, but this.vpc.internetGatewayId seems to return undefined even though I have a public subnet and the resulting resource is in the template file.
Below is my VPC construct:
this.vpc = new ec2.Vpc(this, 'VPC', {
  cidr: CONFIG.REGIONAL_CONFIG[region].VPC_CIDR,
  maxAzs: CONFIG.REGIONAL_CONFIG[region].AVAILABILITY_ZONES,
  subnetConfiguration: [
    {
      name: 'public',
      subnetType: ec2.SubnetType.PUBLIC,
    },
    {
      name: 'private',
      subnetType: ec2.SubnetType.PRIVATE_WITH_NAT,
    },
    {
      name: 'isolated',
      subnetType: ec2.SubnetType.PRIVATE_ISOLATED,
    },
  ],
});
Create a CloudFormation output on the exporting stack with a CfnOutput:
// MyExportingStack.ts
// Cause the synth to fail if the ID is undefined. Alternatively, throw an error.
if (!vpc.internetGatewayId)
  cdk.Annotations.of(this).addError('An Internet Gateway ID is required!');

// Create the CloudFormation output
new cdk.CfnOutput(this, 'IgwId', {
  value: vpc.internetGatewayId ?? 'MISSING',
  exportName: 'my-igw-id',
});
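On the consuming side, another stack can read the export back by name with Fn.importValue. A sketch, assuming the same cdk import as above; the file name is hypothetical:

```typescript
// MyImportingStack.ts (hypothetical consuming stack)
// Fn.importValue resolves the cross-stack export created with exportName above.
const igwId = cdk.Fn.importValue('my-igw-id');
// igwId can now be passed to constructs in this stack, e.g. a CfnRoute's gatewayId.
```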

CodeBuild: Artifacts upload location doesn't match

Here is my CodeBuild main page, which says the "Artifacts upload location" is "alpha-artifact-bucket":
Here is one of the build runs, which does not use that bucket:
What's the difference between the two? Why does every build run use a random bucket?
Is there any way to force CodeBuild to use the specified S3 bucket, "alpha-artifact-bucket"?
CDK code
CodeBuild stack: I deploy this stack to each AWS account along the pipeline first, so that the pipeline stack can simply query each account, find its corresponding CodeBuild project, and add it as a stage. The reason I'm doing this is that each AWS account has a dedicated CodeBuild stage which needs to read some values from its Secrets Manager.
export interface CodeBuildStackProps extends Cdk.StackProps {
  readonly pipelineName: string;
  readonly pipelineRole: IAM.IRole;
  readonly pipelineStageInfo: PipelineStageInfo;
}

/**
 * This stack will create CodeBuild for the target AWS account.
 */
export class CodeBuildStack extends Cdk.Stack {
  constructor(scope: Construct, id: string, props: CodeBuildStackProps) {
    super(scope, id, props);

    // DeploymentRole will be assumed by PipelineRole to perform the CodeBuild step.
    const deploymentRoleArn: string = `arn:aws:iam::${props.env?.account}:role/${props.pipelineName}-DeploymentRole`;
    const deploymentRole = IAM.Role.fromRoleArn(
      this,
      `CodeBuild${props.pipelineStageInfo.stageName}DeploymentRoleConstructID`,
      deploymentRoleArn,
      {
        mutable: false,
        // Causes CDK to update the resource policy where required, instead of the Role
        addGrantsToResources: true,
      }
    );

    const buildspecFile = FS.readFileSync("./config/buildspec.yml", "utf-8");
    const buildspecFileYaml = YAML.parse(buildspecFile, {
      prettyErrors: true,
    });

    new CodeBuild.Project(
      this,
      `${props.pipelineStageInfo.stageName}ColdBuild`,
      {
        projectName: `${props.pipelineStageInfo.stageName}ColdBuild`,
        environment: {
          buildImage: CodeBuild.LinuxBuildImage.STANDARD_5_0,
        },
        buildSpec: CodeBuild.BuildSpec.fromObjectToYaml(buildspecFileYaml),
        role: deploymentRole,
        logging: {
          cloudWatch: {
            logGroup: new Logs.LogGroup(
              this,
              `${props.pipelineStageInfo.stageName}ColdBuildLogGroup`,
              {
                retention: Logs.RetentionDays.ONE_WEEK,
              }
            ),
          },
        },
      }
    );
  }
}
Pipeline Stack:
export interface PipelineStackProps extends CDK.StackProps {
  readonly description: string;
  readonly pipelineName: string;
}

/**
 * This stack will contain our pipeline.
 */
export class PipelineStack extends CDK.Stack {
  private readonly pipelineRole: IAM.IRole;

  constructor(scope: Construct, id: string, props: PipelineStackProps) {
    super(scope, id, props);

    // Get the pipeline role from the pipeline AWS account.
    // The pipeline role will assume the "Deployment Role" of each AWS account to perform the actual deployment.
    const pipelineRoleName: string =
      "eCommerceWebsitePipelineCdk-Pipeline-PipelineRole";
    this.pipelineRole = IAM.Role.fromRoleArn(
      this,
      pipelineRoleName,
      `arn:aws:iam::${this.account}:role/${pipelineRoleName}`,
      {
        mutable: false,
        // Causes CDK to update the resource policy where required, instead of the Role
        addGrantsToResources: true,
      }
    );

    // Initialize the pipeline.
    const pipeline = new codepipeline.Pipeline(this, props.pipelineName, {
      pipelineName: props.pipelineName,
      role: this.pipelineRole,
      restartExecutionOnUpdate: true,
    });

    // Add a pipeline Source stage to fetch source code from the repository.
    const sourceCode = new codepipeline.Artifact();
    this.addSourceStage(pipeline, sourceCode);

    // For each AWS account, add a build stage and a deployment stage.
    pipelineStageInfoList.forEach((pipelineStageInfo: PipelineStageInfo) => {
      const deploymentRoleArn: string = `arn:aws:iam::${pipelineStageInfo.awsAccount}:role/${props.pipelineName}-DeploymentRole`;
      const deploymentRole: IAM.IRole = IAM.Role.fromRoleArn(
        this,
        `DeploymentRoleFor${pipelineStageInfo.stageName}`,
        deploymentRoleArn
      );
      const websiteArtifact = new codepipeline.Artifact();

      // Add a build stage to build the website artifact for the target AWS account.
      // Some environment variables will be retrieved from the target account's Secrets Manager.
      this.addBuildStage(
        pipelineStageInfo,
        pipeline,
        deploymentRole,
        sourceCode,
        websiteArtifact
      );

      // Add a deployment stage for the target AWS account to do the actual deployment.
      this.addDeploymentStage(
        props,
        pipelineStageInfo,
        pipeline,
        deploymentRole,
        websiteArtifact
      );
    });
  }

  // Add a Source stage to fetch code from the GitHub repository.
  private addSourceStage(
    pipeline: codepipeline.Pipeline,
    sourceCode: codepipeline.Artifact
  ) {
    pipeline.addStage({
      stageName: "Source",
      actions: [
        new codepipeline_actions.GitHubSourceAction({
          actionName: "Checkout",
          owner: "yangliu",
          repo: "eCommerceWebsite",
          branch: "main",
          oauthToken: CDK.SecretValue.secretsManager(
            "eCommerceWebsite-GitHubToken"
          ),
          output: sourceCode,
          trigger: codepipeline_actions.GitHubTrigger.WEBHOOK,
        }),
      ],
    });
  }

  private addBuildStage(
    pipelineStageInfo: PipelineStageInfo,
    pipeline: codepipeline.Pipeline,
    deploymentRole: IAM.IRole,
    sourceCode: codepipeline.Artifact,
    websiteArtifact: codepipeline.Artifact
  ) {
    const stage = new CDK.Stage(this, `${pipelineStageInfo.stageName}BuildId`, {
      env: {
        account: pipelineStageInfo.awsAccount,
      },
    });
    const buildStage = pipeline.addStage(stage);

    const targetProject: CodeBuild.IProject = CodeBuild.Project.fromProjectName(
      this,
      `CodeBuildProject${pipelineStageInfo.stageName}`,
      `${pipelineStageInfo.stageName}ColdBuild`
    );

    buildStage.addAction(
      new codepipeline_actions.CodeBuildAction({
        actionName: `BuildArtifactForAAAA${pipelineStageInfo.stageName}`,
        project: targetProject,
        input: sourceCode,
        outputs: [websiteArtifact],
        role: deploymentRole,
      })
    );
  }

  private addDeploymentStage(
    props: PipelineStackProps,
    pipelineStageInfo: PipelineStageInfo,
    pipeline: codepipeline.Pipeline,
    deploymentRole: IAM.IRole,
    websiteArtifact: codepipeline.Artifact
  ) {
    const websiteBucket = S3.Bucket.fromBucketName(
      this,
      `${pipelineStageInfo.websiteBucketName}ConstructId`,
      `${pipelineStageInfo.websiteBucketName}`
    );

    const pipelineStage = new PipelineStage(this, pipelineStageInfo.stageName, {
      stageName: pipelineStageInfo.stageName,
      pipelineName: props.pipelineName,
      websiteDomain: pipelineStageInfo.websiteDomain,
      websiteBucket: websiteBucket,
      env: {
        account: pipelineStageInfo.awsAccount,
        region: pipelineStageInfo.awsRegion,
      },
    });
    const stage = pipeline.addStage(pipelineStage);
    stage.addAction(
      new codepipeline_actions.S3DeployAction({
        actionName: `DeploymentFor${pipelineStageInfo.stageName}`,
        input: websiteArtifact,
        bucket: websiteBucket,
        role: deploymentRole,
      })
    );
  }
}
buildspec.yml:
version: 0.2
env:
  secrets-manager:
    REACT_APP_DOMAIN: "REACT_APP_DOMAIN"
    REACT_APP_BACKEND_SERVICE_API: "REACT_APP_BACKEND_SERVICE_API"
    REACT_APP_GOOGLE_MAP_API_KEY: "REACT_APP_GOOGLE_MAP_API_KEY"
phases:
  install:
    runtime-versions:
      nodejs: 14
    commands:
      - echo Performing yarn install
      - yarn install
  build:
    commands:
      - yarn build
artifacts:
  base-directory: ./build
  files:
    - "**/*"
cache:
  paths:
    - "./node_modules/**/*"
I figured this out. An aws-codepipeline pipeline has a built-in artifacts bucket (see: CDK's CodePipeline or CodeBuildStep are leaving an S3 bucket behind, is there a way of automatically removing it?). That is different from the CodeBuild artifacts.
Because my pipeline role in Account A needs to assume the deployment role in Account B to perform Account B's CodeBuild step, I need to grant the deployment role in Account B write permission to the pipeline's built-in artifacts bucket. So I need to do this:
pipeline.artifactBucket.grantReadWrite(deploymentRole);

AWS CodePipeline and CodeCommit in different regions (or AWS accounts) using CDK

Basically, I want to build a pipeline that lives in one AWS account, uses CodeCommit from another account, and deploys something in a third. I have this code for deploying my pipeline:
import * as codecommit from '@aws-cdk/aws-codecommit';
import * as codepipeline from '@aws-cdk/aws-codepipeline';
import * as codepipeline_actions from '@aws-cdk/aws-codepipeline-actions';
import * as s3 from '@aws-cdk/aws-s3';

const repoArn = `arn:aws:codecommit:${vars.sourceRegion}:${vars.sourceAccount}:${vars.sourceRepo}`;
const repo = codecommit.Repository.fromRepositoryArn(this, 'Source-repo', repoArn);
const sourceArtifact = new codepipeline.Artifact();

const trigger = codepipeline_actions.CodeCommitTrigger.EVENTS;
const sourceAction = new codepipeline_actions.CodeCommitSourceAction({
  branch: vars.sourceBranch,
  actionName: 'Source',
  trigger: trigger,
  output: sourceArtifact,
  repository: repo,
  variablesNamespace: 'SourceVariables',
  codeBuildCloneOutput: true,
});

const pipelineBucket = s3.Bucket.fromBucketArn(this, 'pipelineBucket', BucketArn);
const pipeline = new codepipeline.Pipeline(this, 'CodePipeline', {
  artifactBucket: pipelineBucket,
  crossAccountKeys: true,
  role: roles.codePiplineRole,
  pipelineName: name,
  stages: [
    {
      stageName: 'Source',
      actions: [sourceAction],
    },
  ],
});
If I run this, I get the error: Source action 'Source' must be in the same region as the pipeline. But they are both in the same region, and even in the same account.
If I change codecommit.Repository.fromRepositoryArn to codecommit.Repository.fromRepositoryName, then there are no errors.
Is there any way to import an existing repo from an ARN?

ECS task unable to pull secrets or registry auth

I have a CDK project that creates a CodePipeline which deploys an application on ECS. I had it all previously working, but the VPC was using a NAT gateway, which ended up being too expensive. So now I am trying to recreate the project without requiring a NAT gateway. I am almost there, but I have now run into issues when the ECS service is trying to start tasks. All tasks fail to start with the following error:
ResourceInitializationError: unable to pull secrets or registry auth: execution resource retrieval failed: unable to retrieve secret from asm: service call has been retried 5 time(s): failed to fetch secret
At this point I've kind of lost track of the different things I have tried, but I will post the relevant bits here as well as some of my attempts.
const repository = ECR.Repository.fromRepositoryAttributes(
  this,
  "ecr-repository",
  {
    repositoryArn: props.repository.arn,
    repositoryName: props.repository.name,
  }
);

// vpc
const vpc = new EC2.Vpc(this, this.resourceName(props, "vpc"), {
  maxAzs: 2,
  natGateways: 0,
  enableDnsSupport: true,
});

const vpcSecurityGroup = new SecurityGroup(this, "vpc-security-group", {
  vpc: vpc,
  allowAllOutbound: true,
});

// tried this to allow the task to access secrets manager
const vpcEndpoint = new EC2.InterfaceVpcEndpoint(this, "secrets-manager-task-vpc-endpoint", {
  vpc: vpc,
  service: EC2.InterfaceVpcEndpointAwsService.SSM,
});

const secrets = SecretsManager.Secret.fromSecretCompleteArn(
  this,
  "secrets",
  props.secrets.arn
);

const cluster = new ECS.Cluster(this, this.resourceName(props, "cluster"), {
  vpc: vpc,
  clusterName: `api-cluster`,
});

const ecsService = new EcsPatterns.ApplicationLoadBalancedFargateService(
  this,
  "ecs-service",
  {
    taskSubnets: {
      subnetType: SubnetType.PUBLIC,
    },
    securityGroups: [vpcSecurityGroup],
    serviceName: "api-service",
    cluster: cluster,
    cpu: 256,
    desiredCount: props.scaling.desiredCount,
    taskImageOptions: {
      image: ECS.ContainerImage.fromEcrRepository(
        repository,
        this.ecrTagNameParameter.stringValue
      ),
      secrets: getApplicationSecrets(secrets), // returns
      logDriver: LogDriver.awsLogs({
        streamPrefix: "api",
        logGroup: new LogGroup(this, "ecs-task-log-group", {
          logGroupName: `${props.environment}-api`,
        }),
        logRetention: RetentionDays.TWO_MONTHS,
      }),
    },
    memoryLimitMiB: 512,
    publicLoadBalancer: true,
    domainZone: this.hostedZone,
    certificate: this.certificate,
    redirectHTTP: true,
  }
);

const scalableTarget = ecsService.service.autoScaleTaskCount({
  minCapacity: props.scaling.desiredCount,
  maxCapacity: props.scaling.maxCount,
});
scalableTarget.scaleOnCpuUtilization("cpu-scaling", {
  targetUtilizationPercent: props.scaling.cpuPercentage,
});
scalableTarget.scaleOnMemoryUtilization("memory-scaling", {
  targetUtilizationPercent: props.scaling.memoryPercentage,
});

secrets.grantRead(ecsService.taskDefinition.taskRole);
repository.grantPull(ecsService.taskDefinition.taskRole);
I read somewhere that it probably has something to do with Fargate version 1.4.0 vs 1.3.0, but I'm not sure what I need to change to allow the tasks to access what they need to run.
You need to create interface endpoints for Secrets Manager, ECR (two endpoint types), and CloudWatch Logs, as well as a gateway endpoint for S3.
Refer to the documentation on the topic.
Here's an example in Python, it'd work the same in TS:
vpc.add_interface_endpoint(
    "secretsmanager_endpoint",
    service=ec2.InterfaceVpcEndpointAwsService.SECRETS_MANAGER,
)
vpc.add_interface_endpoint(
    "ecr_docker_endpoint",
    service=ec2.InterfaceVpcEndpointAwsService.ECR_DOCKER,
)
vpc.add_interface_endpoint(
    "ecr_endpoint",
    service=ec2.InterfaceVpcEndpointAwsService.ECR,
)
vpc.add_interface_endpoint(
    "cloudwatch_logs_endpoint",
    service=ec2.InterfaceVpcEndpointAwsService.CLOUDWATCH_LOGS,
)
vpc.add_gateway_endpoint(
    "s3_endpoint",
    service=ec2.GatewayVpcEndpointAwsService.S3,
)
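For completeness, a TypeScript sketch of the same endpoints, assuming the question's `EC2` namespace import and a `Vpc` instance named `vpc`:

```typescript
// Same set of endpoints as the Python example, via Vpc's helper methods.
vpc.addInterfaceEndpoint("secretsmanager-endpoint", {
  service: EC2.InterfaceVpcEndpointAwsService.SECRETS_MANAGER,
});
vpc.addInterfaceEndpoint("ecr-docker-endpoint", {
  service: EC2.InterfaceVpcEndpointAwsService.ECR_DOCKER,
});
vpc.addInterfaceEndpoint("ecr-endpoint", {
  service: EC2.InterfaceVpcEndpointAwsService.ECR,
});
vpc.addInterfaceEndpoint("cloudwatch-logs-endpoint", {
  service: EC2.InterfaceVpcEndpointAwsService.CLOUDWATCH_LOGS,
});
vpc.addGatewayEndpoint("s3-endpoint", {
  service: EC2.GatewayVpcEndpointAwsService.S3,
});
```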
Keep in mind that interface endpoints cost money as well, and may not be cheaper than a NAT.

How to specify AWS pipeline as source provider for codebuild project in CDK?

When I create a CodeBuild project in the AWS console, I can select AWS CodePipeline as the source provider. See the screenshot below.
But in CDK (https://docs.aws.amazon.com/cdk/api/latest/docs/#aws-cdk_aws-codebuild.Source.html), I can't find how to specify AWS CodePipeline as the source provider. How can I achieve this in CDK?
It seems you assign a CodeBuild action to the CodePipeline, rather than adding a CodePipeline to the CodeBuild project's source.
https://docs.aws.amazon.com/cdk/api/latest/docs/aws-codebuild-readme.html#codepipeline
To add a CodeBuild Project as an Action to CodePipeline, use the PipelineProject class instead of Project. It's a simple class that doesn't allow you to specify sources, secondarySources, artifacts or secondaryArtifacts, as these are handled by setting input and output CodePipeline Artifact instances on the Action, instead of setting them on the Project.
https://docs.aws.amazon.com/cdk/api/latest/docs/aws-codepipeline-actions-readme.html#build--test
Example of a CodeBuild Project used in a Pipeline, alongside CodeCommit:
import * as codebuild from '@aws-cdk/aws-codebuild';
import * as codecommit from '@aws-cdk/aws-codecommit';
import * as codepipeline from '@aws-cdk/aws-codepipeline';
import * as codepipeline_actions from '@aws-cdk/aws-codepipeline-actions';

const repository = new codecommit.Repository(this, 'MyRepository', {
  repositoryName: 'MyRepository',
});
const project = new codebuild.PipelineProject(this, 'MyProject');

const sourceOutput = new codepipeline.Artifact();
const sourceAction = new codepipeline_actions.CodeCommitSourceAction({
  actionName: 'CodeCommit',
  repository,
  output: sourceOutput,
});
const buildAction = new codepipeline_actions.CodeBuildAction({
  actionName: 'CodeBuild',
  project,
  input: sourceOutput,
  outputs: [new codepipeline.Artifact()], // optional
  executeBatchBuild: true, // optional, defaults to false
});

new codepipeline.Pipeline(this, 'MyPipeline', {
  stages: [
    {
      stageName: 'Source',
      actions: [sourceAction],
    },
    {
      stageName: 'Build',
      actions: [buildAction],
    },
  ],
});