How can I get AWS CodeBuild's output zip extracted in Elastic Beanstalk?

I'm trying to get AWS CodePipeline working with an S3 source, CodeBuild, and Elastic Beanstalk (Node.js environment).
My problem lies between CodeBuild and Beanstalk.
I have CodeBuild output a zip file of the final Node.js app via its artifacts. Here is my CodeBuild buildspec.yml:
version: 0.1
phases:
  install:
    commands:
      - echo Installing Node Modules...
      - npm install -g mocha
      - npm install
  post_build:
    commands:
      - echo Performing Test
      - npm test
      - zip -r app-api.zip .
artifacts:
  files:
    - app-api.zip
When I manually run CodeBuild, it successfully puts the zip into S3. When I run CodePipeline, it puts the zip on each Elastic Beanstalk instance in /var/app/current as app-api.zip.
What I would like is for app-api.zip to be extracted as /var/app/current, just like a manual deploy via the Elastic Beanstalk console interface.

First, a quick explanation: CodePipeline sends whatever files you specified as artifacts to Elastic Beanstalk. In your case, you are sending app-api.zip.
What you probably want to do instead is send all the files, but not wrap them in a zip.
Let's change your buildspec.yml to not create app-api.zip and instead send the raw files to CodePipeline:
version: 0.1
phases:
  install:
    commands:
      - echo Installing Node Modules...
      - npm install -g mocha
      - npm install
  post_build:
    commands:
      - echo Performing Test
      - npm test
      # the `zip -r app-api.zip .` line is removed
artifacts:
  files:
    - '**/*'   # replaces app-api.zip so all raw files are sent

Related

Upload jar to Lambda when I do CodeCommit in AWS

When I push changes to my AWS CodeCommit repo, I want to build a JAR file for that Java code with the mvn install command and upload it to an AWS Lambda function. The location of that JAR file should be inside src/main/target. Can anyone suggest a buildspec.yaml file?
Assuming that you're using AWS SAM (Serverless Application Model), this is as simple as calling a single command in the post_build section of your buildspec.yaml. Example:
version: 0.2
phases:
  install:
    runtime-versions:
      java: corretto8
  pre_build:
    commands:
      - mvn clean
  build:
    commands:
      - mvn install
  post_build:
    commands:
      - sam deploy --stack-name lambda-java --no-confirm-changeset
artifacts:
  files:
    - target/lambda-java.jar
  discard-paths: no
Please note, though, that you'll also have to set up a mechanism that kicks off the build process when you push any changes to your repository. The easiest way to do this is with AWS CodePipeline, as it integrates nicely with CodeCommit. Simply create a new pipeline, choose the existing CodeCommit repository where the Java-based Lambda is stored, and select CodeBuild as the build provider (skip the deploy stage).
Also note that your CodeBuild service role will have to have the appropriate permissions to deploy the Lambda function. As SAM is leveraged, this includes permissions to upload to S3 and to update the corresponding CloudFormation stack (see the stack-name parameter above).
From here on, whenever you push any changes to your repo, CodePipeline will trigger a build using CodeBuild, which will then deploy a new version of your Lambda via the sam deploy command in your buildspec.yaml.

AWS Elastic Beanstalk - deploy by CodePipeline not working - health status - Severe

I have created a pipeline in CodePipeline. The pipeline is running fine and every stage succeeds.
Even the deploy stage succeeds, but there is an issue with running the app on Elastic Beanstalk. I don't know why; maybe it's the enhanced health authorization.
When I upload the .war file manually to Elastic Beanstalk, the app runs well. But when CodePipeline uploads the .war file to Elastic Beanstalk, it's not working.
My buildspec.yml
version: 0.2
phases:
  pre_build:
    commands:
      - echo In the pre_build phase...
  build:
    commands:
      - echo Build started on `date`
      - mvn clean package
  post_build:
    commands:
      - echo Build completed on `date`
artifacts:
  files:
    - target/myapp.war
  discard-paths: yes
I would guess that this is because the artifact is deployed in the wrong format. You can check this by going to the S3 bucket the pipeline is using and inspecting the artifact it produces. In any case, I recommend uploading a zip file rather than a single .war. This will allow you to use an .ebextensions directory and to deploy more than one .war, if needed. See here for details.
You can deploy a zip file like this:
version: 0.2
phases:
  pre_build:
    commands:
      - echo In the pre_build phase...
  build:
    commands:
      - echo Build started on `date`
      - mvn clean package
      - mkdir build-output
      - cp target/myapp.war build-output/myapp.war
  post_build:
    commands:
      - echo Build completed on `date`
artifacts:
  files:
    - build-output/**/*
  base-directory: build-output
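To illustrate the .ebextensions point: once the artifact is a zip rooted at build-output, you could add a build-output/.ebextensions/ directory with a .config file in it before the artifacts are collected. The file name and the setting below are only a hypothetical example, not something from the question:

```yaml
# build-output/.ebextensions/options.config (hypothetical example)
# Sets an environment variable on the Elastic Beanstalk environment.
option_settings:
  aws:elasticbeanstalk:application:environment:
    SPRING_PROFILES_ACTIVE: production
```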

Can we allow codepipeline to store only the updated file to S3?

I am having trouble with AWS CodePipeline: rather than deploying the entire CodeCommit repo to an S3 bucket, I would like CodePipeline to deploy only the single file that was updated.
A similar question is asked here:
Codepipeline: Deploy only specific files to s3 — but I do not want one specific file. Whichever file is updated, whether it's HTML or CSS, only that one should be deployed to S3.
I am using AWS CodeBuild; my buildspec file is this:
version: 0.2
phases:
  install:
    commands:
      - echo Entered the install phase...
      - apt-get update -y
  build:
    commands:
      - echo Building...
artifacts:
  files:
    - index.html
  discard-paths: yes
  secondary-artifacts:
    artifact1:
      files:
        - index.html
Now, in the above code, what should I write instead of index.html in artifacts so that when any other file is changed or updated, only that single file is updated in the S3 bucket?
Unfortunately, there is no built-in functionality for this in CodeBuild. This means you would have to implement it yourself in your CodeBuild procedure.
A possible approach is to list the files in your CodeCommit repository that were changed in the last commit, using git's functionality.
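As a sketch of that git-based approach (this assumes your CodeBuild project pulls the repository directly from CodeCommit with full clone depth; note that a CodePipeline source artifact does not contain a .git directory, so git commands would not work on it):

```shell
#!/bin/sh
# Copy only the files touched by the most recent commit into ./deploy,
# which could then be used as the artifact base directory.
copy_changed() {
  git diff --name-only HEAD~1 HEAD | while read -r f; do
    mkdir -p "deploy/$(dirname "$f")"   # keep the repo's folder structure
    cp "$f" "deploy/$f"
  done
}
```

An alternative that skips artifacts entirely is to run aws s3 cp for each changed file directly from the build phase.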

Serverless Deploy From another Directory

Here is my root folder; I want to deploy the AWS Lambda functions in the Backend folder of my CodeCommit repo.
So I wrote this buildspec, but AWS CodeBuild gives this error: "This command can only be run in a Serverless service directory".
version: 0.1
phases:
  install:
    commands:
      - npm install -g serverless@1.20.2
  post_build:
    commands:
      - cd Backend
      - serverless deploy --region eu-west-1 --verbose
How can I deploy it from the Backend folder?
Edit: I forgot to update the buildspec version. I changed it to version: 0.2 and it works fine. (In version 0.1 each command runs in a separate instance of the default shell, so the cd Backend does not carry over to the next command; in 0.2 all commands run in the same shell.)
Can you change it to:
- cd Backend && serverless deploy --region eu-west-1 --verbose
I forgot to update the version in buildspec.yml. I changed it to version: 0.2 and now it works fine.
version: 0.2
phases:
  install:
    commands:
      - npm install -g serverless@1.20.2
  post_build:
    commands:
      - cd Backend
      - serverless deploy --region eu-west-1 --verbose

CodePipeline buildspec and multiple build actions

A simple buildspec like:
version: 0.2
phases:
  install:
    commands:
      - (cd lambda/src; npm install)
      - aws cloudformation package --template-file lambda/sam.yml --s3-bucket skynet-lambda --output-template-file SkynetLambdaPackaged.yml
artifacts:
  type: zip
  files:
    - SkynetLambdaPackaged.yml
works fine when I have one action in my build stage. But what if I want more build actions, for example building my API server and frontend files in parallel? How do I model this?
UPDATE
In CodePipeline I can create actions that run in parallel, as below. How is this modeled in buildspec? Or is it impossible?
You can use two different CodeBuild projects, built from the same source, as two separate parallel actions in your CodePipeline.
For this to happen, you can use two buildspec files in your source, e.g.
buildspec-frontend.yml:
phases:
  install:
    commands:
      - (cd frontend/src; npm run build)
      - aws s3 sync frontend/dist s3://<insert s3 bucket url here>/ --delete
buildspec-backend.yml:
phases:
  install:
    commands:
      - (cd lambda/src; npm install)
      - aws cloudformation package --template-file lambda/sam.yml --s3-bucket skynet-lambda --output-template-file SkynetLambdaPackaged.yml
Then create a frontend CodeBuild project that uses the frontend buildspec, and repeat for the backend.
Then, when you go to your Build stage in your CodePipeline, use the two CodeBuild projects as parallel actions.
Update: The information below is now irrelevant since I misunderstood the question.
If your frontend can be deployed to S3, just add its deployment commands where you put your API deployment commands, e.g.
phases:
  install:
    commands:
      - (cd lambda/src; npm install)
      - aws cloudformation package --template-file lambda/sam.yml --s3-bucket skynet-lambda --output-template-file SkynetLambdaPackaged.yml
      - (cd frontend/src; npm run build)
      - aws s3 sync frontend/dist s3://<insert s3 bucket url here>/ --delete
If your frontend is not on S3, just replace those lines with your own frontend deployment commands.
CodeBuild executes those commands in sequence. If you really need to run them in parallel, there are many ways to do it.
My preference is to put the commands in a Makefile and call them from your buildspec.yml (e.g. make --jobs 2 backend frontend).
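The Makefile route could look like the sketch below. The target names are assumptions, and the recipes simply reuse the commands already shown above; make --jobs 2 backend frontend in buildspec.yml would run the two targets in parallel:

```makefile
# Hypothetical Makefile for running the two deployments in parallel
# via `make --jobs 2 backend frontend`.
.PHONY: backend frontend

backend:
	(cd lambda/src && npm install)
	aws cloudformation package --template-file lambda/sam.yml --s3-bucket skynet-lambda --output-template-file SkynetLambdaPackaged.yml

frontend:
	(cd frontend/src && npm run build)
	aws s3 sync frontend/dist s3://<insert s3 bucket url here>/ --delete
```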
From my understanding, if you have just one source, you cannot have two buildspec files, because you can only name that file buildspec. Maybe you can try the "Insert build commands" option.