A simple buildspec like:
version: 0.2
phases:
  install:
    commands:
      - (cd lambda/src; npm install)
      - aws cloudformation package --template-file lambda/sam.yml --s3-bucket skynet-lambda --output-template-file SkynetLambdaPackaged.yml
artifacts:
  type: zip
  files:
    - SkynetLambdaPackaged.yml
This works fine when I have one action in my build stage. But what if I want more build actions? For example, I want to build my API server and frontend files in parallel. How do I model this?
UPDATE
In CodePipeline I can create actions that run in parallel, but how is this modeled in the buildspec? Or is it impossible?
You can use two different CodeBuild projects from the same source as two separate parallel actions in your CodePipeline.
For this to happen, you can use two buildspec files in your source.
e.g.
buildspec-frontend.yml
phases:
  install:
    commands:
      - (cd frontend/src; npm run build)
      - aws s3 sync frontend/dist s3://<insert s3 bucket url here>/ --delete
buildspec-backend.yml
phases:
  install:
    commands:
      - (cd lambda/src; npm install)
      - aws cloudformation package --template-file lambda/sam.yml --s3-bucket skynet-lambda --output-template-file SkynetLambdaPackaged.yml
Then, create a frontend CodeBuild project that uses the frontend buildspec. Repeat for the backend.
Then, when you go to your Build stage in your CodePipeline, use the two CodeBuild projects as parallel actions.
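If you define the two CodeBuild projects in CloudFormation, each project can point at its own buildspec filename via the Source.BuildSpec property. A minimal sketch, assuming both projects are driven by CodePipeline; the role, image, and logical names here are placeholders:

# Hypothetical fragment: each project reads a different buildspec from the same source.
FrontendBuild:
  Type: AWS::CodeBuild::Project
  Properties:
    ServiceRole: !GetAtt CodeBuildRole.Arn        # placeholder role
    Source:
      Type: CODEPIPELINE
      BuildSpec: buildspec-frontend.yml           # alternate buildspec path in the repo
    Artifacts:
      Type: CODEPIPELINE
    Environment:
      Type: LINUX_CONTAINER
      ComputeType: BUILD_GENERAL1_SMALL
      Image: aws/codebuild/standard:5.0

BackendBuild:
  Type: AWS::CodeBuild::Project
  Properties:
    ServiceRole: !GetAtt CodeBuildRole.Arn
    Source:
      Type: CODEPIPELINE
      BuildSpec: buildspec-backend.yml
    Artifacts:
      Type: CODEPIPELINE
    Environment:
      Type: LINUX_CONTAINER
      ComputeType: BUILD_GENERAL1_SMALL
      Image: aws/codebuild/standard:5.0

The same filename can also be set in the console under the project's Buildspec settings by giving each project its own buildspec name.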
Update: The information below is now irrelevant since I misunderstood the question.
If your frontend can be deployed to S3, just add its deployment commands where you put your API deployment commands.
e.g.
phases:
  install:
    commands:
      - (cd lambda/src; npm install)
      - aws cloudformation package --template-file lambda/sam.yml --s3-bucket skynet-lambda --output-template-file SkynetLambdaPackaged.yml
      - (cd frontend/src; npm run build)
      - aws s3 sync frontend/dist s3://<insert s3 bucket url here>/ --delete
If your frontend is not on S3, just replace those lines with your own frontend deployment commands.
CodeBuild executes those commands in sequence. If you really need to run them in parallel, there are many ways to do it.
My preference is to put the commands in a Makefile and call them from your buildspec.yml (e.g. make --jobs 2 backend frontend).
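If you'd rather keep a single CodeBuild action, the same idea can be expressed directly in the buildspec with shell job control. A rough sketch, reusing the directories from the examples above:

version: 0.2
phases:
  build:
    commands:
      # Start both builds in the background, then fail the build if either exits non-zero.
      - |
        (cd lambda/src && npm install) & BACKEND_PID=$!
        (cd frontend/src && npm run build) & FRONTEND_PID=$!
        wait $BACKEND_PID && wait $FRONTEND_PID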
From my understanding, if you just have one source, you cannot have two buildspec files because you can only name that file buildspec. Maybe you can try the "Insert build commands" option.
When I push changes to my AWS CodeCommit repo, I want to build a JAR file with the mvn install command for that Java code and upload it to an AWS Lambda function. The location of that JAR file should be inside src/main/target. Can anyone suggest a buildspec.yaml file?
Assuming that you're using AWS SAM (Serverless Application Model), this is as simple as calling a single command in the post_build section of your buildspec.yaml. Example:
version: 0.2
phases:
  install:
    runtime-versions:
      java: corretto8
  pre_build:
    commands:
      - mvn clean
  build:
    commands:
      - mvn install
  post_build:
    commands:
      - sam deploy --stack-name lambda-java --no-confirm-changeset
artifacts:
  files:
    - target/lambda-java.jar
  discard-paths: no
Please note, though, that you'll also have to set up a mechanism that kicks off the build process when you push any changes to your repository. The easiest way to do this is to use AWS CodePipeline, as it integrates nicely with CodeCommit. Simply create a new pipeline, choose the existing CodeCommit repository where the Java-based Lambda is stored, and select CodeBuild as the build provider (skip the deploy stage).
Also note that your CodeBuild service role will need the appropriate permissions to deploy the Lambda function. As SAM is leveraged, this includes permissions to upload to S3 and to update the corresponding CloudFormation stack (see the stack-name parameter above).
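As a rough illustration only (the exact actions depend on what your SAM template creates), the statements attached to the CodeBuild service role look something like the following, expressed here as a CloudFormation-style policy document; the bucket and resource names are placeholders:

# Hypothetical policy fragment for the CodeBuild service role.
PolicyDocument:
  Version: "2012-10-17"
  Statement:
    - Effect: Allow
      Action:
        - s3:GetObject
        - s3:PutObject
        - s3:ListBucket
      Resource:
        - arn:aws:s3:::my-sam-deploy-bucket        # placeholder bucket
        - arn:aws:s3:::my-sam-deploy-bucket/*
    - Effect: Allow
      Action:
        - cloudformation:CreateChangeSet
        - cloudformation:DescribeChangeSet
        - cloudformation:ExecuteChangeSet
        - cloudformation:DescribeStacks
      Resource: "*"
    # Plus permissions for whatever the stack itself creates
    # (Lambda functions, IAM roles when CAPABILITY_IAM is used, etc.).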
From here on, whenever you push any changes to your repo, CodePipeline will trigger a build in CodeBuild, which will then deploy a new version of your Lambda via the sam deploy command in your buildspec.yaml.
I am having trouble with AWS CodePipeline. Rather than deploying the entire CodeCommit repo to an S3 bucket, I would like CodePipeline to deploy only the single file that was updated.
The same question is asked here:
Codepipeline: Deploy only specific files to s3, but I do not want a fixed set of specific files. Whatever file is updated, whether HTML or CSS, only that one should be deployed to S3.
I am using AWS CodeBuild, and my deployment YAML file is this:
version: 0.2
phases:
  install:
    commands:
      - echo Entered the install phase...
      - apt-get update -y
  build:
    commands:
      - echo Building...
artifacts:
  files:
    - index.html
  discard-paths: yes
  secondary-artifacts:
    artifact1:
      files:
        - index.html
Now, in the above code, what should I write in artifacts instead of index.html so that whichever file is changed or updated, only that single file gets updated in the S3 bucket?
Unfortunately, there is no built-in functionality for this in CodeBuild. This means that you would have to program it yourself in your CodeBuild procedure.
A possible way is to list the files in your CodeCommit repository that were changed in the last commit. You can do this using git's functionality.
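A rough sketch of that idea in a buildspec; the bucket name is a placeholder, and this assumes the build source is a full git clone (the default CodePipeline source artifact does not include the .git directory, so the git history has to be available some other way, e.g. a full-clone CodeCommit source):

version: 0.2
phases:
  build:
    commands:
      # Copy only the files touched by the most recent commit.
      # Files deleted in that commit would need separate handling.
      - |
        for f in $(git diff --name-only HEAD~1 HEAD); do
          aws s3 cp "$f" "s3://my-site-bucket/$f"
        done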
I have created a working CodePipeline for my AWS SAM application.
It uses only Source and Build stages, with the following buildspec.yaml file:
version: 0.2
phases:
  install:
    runtime-versions:
      python: 3.7
    commands:
      - pip install --user aws-sam-cli
      - USER_BASE_PATH=$(python -m site --user-base)
      - export PATH=$PATH:$USER_BASE_PATH/bin
  build:
    commands:
      - sam build
  post_build:
    commands:
      sam package --s3-bucket deploy-bucket --output-template-file deployment.yaml
      # finally:
      #   sam deploy --template-file deployment.yaml --stack-name MyStackSAM--region us-east-1 --capabilities CAPABILITY_IAM
As you can see, I have commented out the last two lines, as I want to move that action to a Deploy stage in CodePipeline.
My Deploy step looks like this:
My CloudFormationPipelineServiceRole has full admin permissions at this point; nevertheless, I'm still getting the following error as the result of executing this stage.
Action execution failed
Access Denied (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied; Request ID: XXXXFFFFFXXXX; S3 Extended Request ID: XXXXFFFFFXXXXFFFFFXXXXX=)
I am stuck as to why I'm getting this error. Any help would be greatly appreciated.
First, sam package expects a source template file that needs to be passed via the --template-file flag. I don't see that template file anywhere in your code. Which template file are you trying to package?
Second, you are not uploading the necessary artifacts to the S3 bucket. The only thing that you are uploading is the zipped code, but as you can see from the command that you have commented out:
sam deploy --template-file deployment.yaml --stack-name MyStackSAM--region us-east-1 --capabilities CAPABILITY_IAM
you also need the file deployment.yaml, but you didn't specify that in your buildspec. There is no way for CodeBuild to guess which artifacts you want to preserve.
You will need to add an artifacts section to the bottom of your buildspec file and specify those artifacts.
artifacts:
  type: zip
  files:
    - template.yaml # (where do you have this file?)
    - outputtemplate.yaml # (deployment.yaml in your case)
Note that the artifacts section needs to be on the same level as version and phases
version: 0.2
phases:
  ...
artifacts:
  ...
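Putting both points together, a corrected buildspec might look roughly like this. The template and bucket names are assumptions; depending on your SAM CLI version, sam package may also pick up the output of sam build automatically, so adjust --template-file to whichever template you actually want to package:

version: 0.2
phases:
  install:
    runtime-versions:
      python: 3.7
    commands:
      - pip install --user aws-sam-cli
      - export PATH=$PATH:$(python -m site --user-base)/bin
  build:
    commands:
      - sam build
  post_build:
    commands:
      # Package the template and preserve the result as an artifact below.
      - sam package --template-file template.yaml --s3-bucket deploy-bucket --output-template-file deployment.yaml
artifacts:
  type: zip
  files:
    - deployment.yaml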
I just created a brand new AWS Codestar project.
As far as I can tell, CodeStar is just a dashboard that integrates multiple AWS products.
There is one thing that I don't know how to configure yet, and it is branch deployments.
In my git repository, I have 3 branches: master, develop and staging
In an ideal world, master deploys to production, develop to the development environment and staging to the QA environment.
I don't know how to configure this pipeline using AWS, and I haven't been able to locate the relevant documentation in their developer portal.
This is my buildspec.yml file just in case it can be configured there:
version: 0.2
phases:
  install:
    commands:
      - echo Installing NPM Packages...
      - npm install
  build:
    commands:
      - aws cloudformation package --template template.yml --s3-bucket $S3_BUCKET --output-template template-export.yml
artifacts:
  type: zip
  files:
    - template-export.yml
This is a project that uses AWS API Gateway to route requests to AWS Lambda functions if that matters.
Sadly, AWS CodePipeline doesn't support passing in the git branch. Last year they only added support for passing the git commit SHA-1 (more can be found here).
I'd suggest you follow the CodePipeline docs here and create three pipelines, one for each branch (you can even create a special buildspec_dev.yaml or buildspec_prod.yaml; check out more examples here).
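To illustrate what actually differs between the three pipelines, here is a rough fragment of a Source-stage action for the develop pipeline; the repository and artifact names are placeholders:

# Hypothetical Source action; the staging and master pipelines differ only in
# BranchName (and, optionally, in which buildspec their CodeBuild project uses,
# e.g. buildspec_dev.yaml vs buildspec_prod.yaml).
- Name: Source
  Actions:
    - Name: CodeCommitSource
      ActionTypeId:
        Category: Source
        Owner: AWS
        Provider: CodeCommit
        Version: "1"
      Configuration:
        RepositoryName: my-app          # placeholder
        BranchName: develop             # master / staging in the other pipelines
      OutputArtifacts:
        - Name: SourceOutput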
I'm attempting to set up AWS CodePipeline with a CloudFormation SAM stack that deploys to a Lambda, and am doing this via the pipeline console. The pipeline gets through the CodeCommit clone and the CodeBuild build, but fails on the CloudFormation deploy.
I get the following error message:
Action execution failed
Invalid TemplatePath: MyAppBuild::samTemplate.yml
I've been following the documentation (http://docs.aws.amazon.com/lambda/latest/dg/automating-deployment.html) pretty closely, and in field 16 of the CodePipeline setup form (Template File) I have been putting samTemplate.yml.
I also have samTemplate.yml in the root of my repo (which is also the root of the project).
I've matched the Codebuild output artifact name with the Cloudformation input artifact name and they match exactly.
Am I missing something here? How do I get the Cloudformation deploy process to recognize the sam template?
EDIT 1 I've switched to using Codestar instead of directly using CodePipeline. Nothing special in my buildspec.yml, but it's below if anyone's interested.
version: 0.2
phases:
  install:
    commands:
      - echo "install"
  pre_build:
    commands:
      - echo "pre_build"
  build:
    commands:
      - aws cloudformation package --template template.yml --s3-bucket $S3_BUCKET --output-template template-export.json
artifacts:
  type: zip
  files:
    - template-export.json
One thing to keep in mind for anyone new to CodeBuild: the zip file that gets created as a result is of the root-level directory of your build operations, and it IS the zip file that gets transferred to the final deployment (in my case, Lambda).
You need to add the aws cloudformation package command to your buildspec.yml:
aws cloudformation package --debug --template-file <YourSamTemplate.yml> --s3-bucket <YourbucketName> --output-template-file <YourOutputSamTemplate.yml>
where YourSamTemplate.yml is the name of samTemplate.yml in your project root, and YourOutputSamTemplate.yml is the new name you want to give the output file once the cloudformation package command is done.
Then, in the next stage, you need to define your input artifact as the output of your AWS CodeBuild stage and use that input artifact to map your template:
build-output::YourOutputSamTemplate.yml
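For illustration, a CloudFormation deploy action in the pipeline could reference that template roughly like this; the stack name, change set name, and role ARN are placeholders, and build-output is assumed to be the name of the CodeBuild action's output artifact:

# Hypothetical Deploy-stage action; TemplatePath follows the
# <input artifact name>::<file name> convention.
- Name: Deploy
  Actions:
    - Name: CreateChangeSet
      ActionTypeId:
        Category: Deploy
        Owner: AWS
        Provider: CloudFormation
        Version: "1"
      InputArtifacts:
        - Name: build-output
      Configuration:
        ActionMode: CHANGE_SET_REPLACE
        StackName: my-sam-stack                  # placeholder
        ChangeSetName: my-sam-changeset          # placeholder
        TemplatePath: build-output::YourOutputSamTemplate.yml
        Capabilities: CAPABILITY_IAM
        RoleArn: arn:aws:iam::123456789012:role/cfn-deploy-role   # placeholder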
I believe the output of the build stage is what needs to be passed into the CloudFormation action.
In the tutorial the build output is called NewSamTemplate.yaml
So try updating your TemplatePath to MyAppBuild::NewSamTemplate.yml
In my case, the error was in the filename: instead of template.yml, the actual file was template.yaml.