AWS CodeBuild - copy yaml file to S3 after successful build - amazon-web-services

I'm using CodeBuild to build a Scala sbt project. During the build I perform the following steps.
Check out code from Git.
Install all sbt tools.
Build the code.
Create a Docker image.
Publish the Docker image to AWS ECR.
Update the K8s deployment file located at deployment/deployment.yaml in the project root with the latest version of the published image, using the kustomize utility.
Publish this file to an AWS S3 bucket to run further deployment on AWS EKS.
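For context, the kustomize update in the step above is roughly the following; the image name and ECR registry URI are hypothetical placeholders, not the actual values:

```yaml
# Sketch of the image-update step; 'my-app' and the registry URI are hypothetical.
post_build:
  commands:
    - cd deployment
    - kustomize edit set image my-app=<account-id>.dkr.ecr.<region>.amazonaws.com/my-app:$IMAGE_TAG
```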
Unfortunately, the last step fails and I can't move this deployment file to AWS S3 bucket.
artifacts:
  files:
    - deployment/deployment.yaml
  discard-paths: yes
  base-directory: 'deployment'
What am I doing wrong?

base-directory: 'deployment'
... would mean that CodeBuild looks inside the 'deployment' directory for "deployment/deployment.yaml" (i.e. for deployment/deployment/deployment.yaml), which is probably wrong.
Try this:
artifacts:
  files:
    - deployment/deployment.yaml
  discard-paths: yes
One trick I have found useful is to install 'tree' in my buildspec and inspect the directory tree to make sure I am specifying the correct path.
In the install phase:
- apt-get update && apt-get install -y tree
Later in the buildspec:
- tree .
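Putting the pieces together, a minimal corrected buildspec for the original question might look like this (the sbt command is a placeholder for the asker's actual build, Docker, and kustomize steps):

```yaml
version: 0.2
phases:
  install:
    commands:
      - apt-get update && apt-get install -y tree
  build:
    commands:
      - sbt assembly   # placeholder for the existing build/docker/kustomize steps
      - tree .         # inspect the directory layout to verify artifact paths
artifacts:
  files:
    - deployment/deployment.yaml
  discard-paths: yes
```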

Related

Upload jar to Lambda when I do CodeCommit in AWS

When I push changes to my AWS CodeCommit repo, I want to build a JAR file with the mvn install command for that Java code and upload it to an AWS Lambda function. That JAR file should be located inside src/main/target. Can anyone suggest a buildspec.yaml file?
Assuming that you're using AWS SAM (Serverless Application Model), this is as simple as calling a single command in the post_build section of your buildspec.yaml. Example:
version: 0.2
phases:
  install:
    runtime-versions:
      java: corretto8
  pre_build:
    commands:
      - mvn clean
  build:
    commands:
      - mvn install
  post_build:
    commands:
      - sam deploy --stack-name lambda-java --no-confirm-changeset
artifacts:
  files:
    - target/lambda-java.jar
  discard-paths: no
Please note though that you'll also have to set up a mechanism that kicks off the build process when you push changes to your repository. The easiest way to do this is AWS CodePipeline, as it integrates nicely with CodeCommit. Simply create a new pipeline, choose the existing CodeCommit repository where the Java-based Lambda is stored, and select CodeBuild as the build provider (skip the deploy stage).
Also note that your CodeBuild service role must have the appropriate permissions to deploy the Lambda function. As SAM is leveraged, this includes permissions to upload to S3 and to update the corresponding CloudFormation stack (see the --stack-name parameter above).
From here on, whenever you push changes to your repo, CodePipeline will trigger a build using CodeBuild, which will then deploy a new version of your Lambda via the sam deploy command in your buildspec.yaml.
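As a rough sketch of those permissions (resource names are placeholders, and a real sam deploy typically needs a few more actions, e.g. for IAM roles and the Lambda service itself), the service role policy includes statements along these lines:

```yaml
# Illustrative policy fragment only; tighten resources to your bucket and stack.
Statement:
  - Effect: Allow
    Action:
      - s3:PutObject
      - s3:GetObject
    Resource: arn:aws:s3:::your-sam-artifact-bucket/*
  - Effect: Allow
    Action:
      - cloudformation:CreateChangeSet
      - cloudformation:ExecuteChangeSet
      - cloudformation:DescribeChangeSet
      - cloudformation:DescribeStacks
    Resource: arn:aws:cloudformation:*:*:stack/lambda-java/*
```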

How to add files from S3 bucket to output artifacts in AWS CodePipeline? (NodeJS)

I am using an AWS CodePipeline which fetches the source from my git repository, uses a CodeBuild buildspec to build and save output artifacts to an S3 bucket, which ultimately get deployed to Elastic Beanstalk (NodeJS environment).
Everything works fine, but I need the pipeline to copy one particular file from one of my AWS S3 buckets and add it to the output artifacts before deploying them to EB.
Can it be done using the buildSpec?
artifacts:
  files:
    - '**/*'
    # - How to add a file from S3 to the artifacts?
My recommendation is, as part of the build or post_build phase, to copy the required file from S3 into your build directory.
build:
  commands:
    - echo "Build commands"
    - aws s3 cp --region=xx-xxxx-x "s3://file/in/s3" "local-file-instance-to-include"
Then you will have the file copied from s3, available for your build, and you can add it to the artifacts output.
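Put together, a buildspec sketch might look like this (bucket, region, and file names are placeholders):

```yaml
version: 0.2
phases:
  build:
    commands:
      - echo "Build commands"
      - aws s3 cp --region=us-east-1 "s3://my-config-bucket/extra-file.json" "extra-file.json"
artifacts:
  files:
    - '**/*'   # the copied file now sits in the build directory, so it is included
```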

Can we allow codepipeline to store only the updated file to S3?

I am having trouble with AWS CodePipeline: rather than deploying the entire CodeCommit repo to an S3 bucket, I would like CodePipeline to deploy only the single file that was updated.
The same question is asked here:
Codepipeline: Deploy only specific files to s3, but I do not want to name any specific file. Whatever file is updated, whether HTML or CSS, only that one should be deployed to S3.
I am using AWS CodeBuild, my deployment yaml file is this:
version: 0.2
phases:
  install:
    commands:
      - echo Entered the install phase...
      - apt-get update -y
  build:
    commands:
      - echo Building...
artifacts:
  files:
    - index.html
  discard-paths: yes
  secondary-artifacts:
    artifact1:
      files:
        - index.html
Now, in the above code, what should I write instead of index.html in artifacts so that whenever any other file is changed or updated, only that single file is updated in the S3 bucket?
Unfortunately, there is no built-in functionality for this in CodeBuild. This means that you would have to program it yourself in your CodeBuild procedure.
A possible way would be to list the files in your CodeCommit repository that were changed in the last commit. You can do this using git's functionality.
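A sketch of that git-based idea as buildspec commands (the bucket name is a placeholder; note this only works if CodeBuild clones the repository itself, since a CodePipeline source artifact does not include the .git directory):

```yaml
version: 0.2
phases:
  build:
    commands:
      # Copy only the files changed in the most recent commit to S3.
      - |
        for f in $(git diff --name-only HEAD~1 HEAD); do
          aws s3 cp "$f" "s3://my-target-bucket/$f"
        done
```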

AWS CodePipeline Action execution failed

I'm trying to hook my GitHub repo with S3 so every time there's a commit, AWS CodePipeline will deploy the ./<path>/public folder to a specified S3 bucket.
So far in my pipeline, the Source stage works (hooked to GitHub and picking up new commits), but the Deploy stage failed because: Action execution failed
BundleType must be either YAML or JSON.
This is how I set them up:
CodePipeline
Action name: Source
Action provider: GitHub
Repository: account/repo
Branch: master
GitHub webhooks
CodeDeploy
Compute type: AWS Lambda
Service role: myRole
Deployment settings: CodeDeployDefault.LambdaAllAtOnce
IAM Role: myRole
AWS Service
Choose the service that will use this role: Lambda / CodeDeploy
Select your use case: CodeDeploy
Policies: AWSCodeDeployRole
I understand that there must be a buildspec.yml file in the root folder. I've tried using a few files I could find but they don't seem to work. What did I do wrong or how should I edit the buildspec file to do what I want?
Update
Thanks to @Milan Cermak, I understand I need to do:
CodePipeline:
Stage 1: Source: hook with GitHub repo. This one is working.
Stage 2: Build: use CodeBuild to grab only the wanted folder using a buildspec.yml file in the root folder of the repo.
Stage 3: Deploy: use
Action Provider: S3
Input Artifacts: OutputArtifacts (result of stage 2).
Bucket: the bucket that hosts the static website.
CodePipeline works. However, the output contains only files (.html), not the folders nested inside the public folder.
I've checked this and figured out how to remove the path of a nested folder with discard-paths: yes, but I'm unable to get all the sub-folders inside the ./<path>/public folder. Any suggestion?
CodeBuild uses a buildspec, but CodeDeploy uses an appspec.
Is there an appspec file?
You shouldn't use CodeDeploy, as that's a service for automating deployments of applications, but rather CodeBuild, which executes commands and prepares the deployment artifact for further use in the pipeline.
These commands go in the buildspec.yml file (typically in the root directory of the repo, but the location is configurable). For your use case, it won't be too complicated, as you're not compiling anything or running tests, etc.
Try this as a starting point:
version: 0.2
phases:
  build:
    commands:
      - ls
artifacts:
  files:
    - public/*
The phases section is required; that's why it's included (and thanks to the ls command, you'll at least see which files are present in the CodeBuild environment), but it's not interesting for your case. What is interesting is the artifacts section: that's where you define the output of the CodeBuild step, i.e. what gets passed on to the next step in the pipeline.
Depending on how you want to have the files structured (for example, do you want to have the public directory also in the artifact or do you only want to have the files themselves, without the parent dir), you might want to use other configuration that's possible in the artifacts section. See the buildspec reference for details.
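For example, to keep the nested sub-folders while dropping the leading public/ path component, a sketch using those artifacts settings would be:

```yaml
artifacts:
  files:
    - '**/*'              # recursive, so nested folders are included
  base-directory: public  # artifact paths become relative to public/
```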
Remember to use the output artifact of the CodeBuild step as the input artifact of the Deploy to S3 step.
Buildspec is for CodeBuild, as t_yamo pointed out.
You are using CodeDeploy, which uses an appspec.yml file; it looks something like this for my config:
version: 0.0
os: linux
files:
  - source: /
    destination: /path/to/destination
hooks:
  BeforeInstall:
    - location: /UnzipResourceBundle.sh
  ApplicationStart:
    - location: /RestartServer.sh
      timeout: 3600
UnzipResourceBundle.sh is just a bash script which can be used to do any number of things:
#!/bin/bash
# Do something
You can find a sample for the AppSpec.yml file from Amazon Documentation here - https://docs.aws.amazon.com/codedeploy/latest/userguide/reference-appspec-file-example.html#appspec-file-example-lambda
CodePipeline recently announced a deploy to S3 action: https://aws.amazon.com/about-aws/whats-new/2019/01/aws-codepipeline-now-supports-deploying-to-amazon-s3/

CodePipeline buildspec and multiple build actions

A simple buildspec like:
version: 0.2
phases:
  install:
    commands:
      - (cd lambda/src; npm install)
      - aws cloudformation package --template-file lambda/sam.yml --s3-bucket skynet-lambda --output-template-file SkynetLambdaPackaged.yml
artifacts:
  type: zip
  files:
    - SkynetLambdaPackaged.yml
This works fine when I have one action in my build stage. But what if I want to have more build actions, for example building my API server and frontend files in parallel? How do I model this?
UPDATE
In CodePipeline I can create actions that run in parallel, like below. How is this modeled in buildspec? Or is it impossible?
You can use two different CodeBuild projects from the same source as two separate parallel actions in your CodePipeline.
For this to happen, you can use two buildspec files in your source.
e.g.
buildspec-frontend.yml
version: 0.2
phases:
  install:
    commands:
      - (cd frontend/src; npm run build)
      - aws s3 sync frontend/dist s3://<insert s3 bucket url here>/ --delete
buildspec-backend.yml
version: 0.2
phases:
  install:
    commands:
      - (cd lambda/src; npm install)
      - aws cloudformation package --template-file lambda/sam.yml --s3-bucket skynet-lambda --output-template-file SkynetLambdaPackaged.yml
Then, create a frontend CodeBuild project that uses the frontend buildspec. Repeat for the backend.
Then, when you go to your Build stage in your CodePipeline, use the two CodeBuild projects as parallel actions.
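If the pipeline itself is managed with CloudFormation, parallel actions are modeled by giving both actions the same RunOrder; the project and artifact names below are hypothetical:

```yaml
- Name: Build
  Actions:
    - Name: BuildFrontend
      RunOrder: 1
      ActionTypeId: { Category: Build, Owner: AWS, Provider: CodeBuild, Version: '1' }
      Configuration: { ProjectName: frontend-build }
      InputArtifacts:
        - Name: SourceOutput
    - Name: BuildBackend
      RunOrder: 1   # same RunOrder, so both actions run in parallel
      ActionTypeId: { Category: Build, Owner: AWS, Provider: CodeBuild, Version: '1' }
      Configuration: { ProjectName: backend-build }
      InputArtifacts:
        - Name: SourceOutput
```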
Update: The information below is now irrelevant since I misunderstood the question.
If your frontend can be deployed to s3, just add its deployment commands where you put your api deployment commands.
e.g.
phases:
  install:
    commands:
      - (cd lambda/src; npm install)
      - aws cloudformation package --template-file lambda/sam.yml --s3-bucket skynet-lambda --output-template-file SkynetLambdaPackaged.yml
      - (cd frontend/src; npm run build)
      - aws s3 sync frontend/dist s3://<insert s3 bucket url here>/ --delete
If your frontend is not on s3, just replace those lines with your own frontend deployment commands.
CodeBuild executes those commands in sequence. If you really need to run them in parallel, there are many ways to do it.
My preference is to put the commands in a Makefile and call them from your buildspec.yml (e.g. make --jobs 2 backend frontend).
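A sketch of that Makefile (the targets reuse the commands from the buildspec above):

```makefile
.PHONY: backend frontend

backend:
	cd lambda/src && npm install
	aws cloudformation package --template-file lambda/sam.yml --s3-bucket skynet-lambda --output-template-file SkynetLambdaPackaged.yml

frontend:
	cd frontend/src && npm run build
```

The buildspec then runs make --jobs 2 backend frontend so both targets execute concurrently.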
From my understanding, if you just have one source, you cannot have two buildspec files, because the file must be named buildspec. Maybe you can try the "Insert build commands" option.