AWS Elastic Beanstalk - deploy by CodePipeline not working - health status Severe

I have created a pipeline in CodePipeline. The pipeline is running fine and every stage succeeds.
Even the deploy stage succeeds, but there is an issue with running the app on Elastic Beanstalk. I don't know why; maybe it's the enhanced health authorization.
When I upload the .war file manually to Elastic Beanstalk, the app runs well. But when CodePipeline uploads the .war file to Elastic Beanstalk, it doesn't work.
My buildspec.yml:
version: 0.2
phases:
  pre_build:
    commands:
      - echo In the pre_build phase...
  build:
    commands:
      - echo Build started on `date`
      - mvn clean package
  post_build:
    commands:
      - echo Build completed on `date`
artifacts:
  files:
    - target/myapp.war
  discard-paths: yes

I would guess that this is because the wrong artifact format is being deployed. You can check this by going to the S3 bucket the pipeline uses and looking at the artifact it produces. In any case, I recommend uploading a zip file rather than a single war. This will allow you to use .ebextensions files and deploy more than one war, if needed. See here for details.
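For illustration, such a zipped source bundle might be laid out as below. The file names and the option setting are placeholders, not something your app necessarily needs:

```
# Hypothetical bundle layout:
#   myapp.war
#   second-app.war               (optional second application)
#   .ebextensions/options.config
#
# .ebextensions/options.config — a placeholder environment option
option_settings:
  aws:elasticbeanstalk:application:environment:
    EXAMPLE_VAR: example-value
```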
You can deploy a zip file like this:
version: 0.2
phases:
  pre_build:
    commands:
      - echo In the pre_build phase...
  build:
    commands:
      - echo Build started on `date`
      - mvn clean package
      - mkdir build-output
      - cp target/myapp.war build-output/myapp.war
  post_build:
    commands:
      - echo Build completed on `date`
artifacts:
  files:
    - '**/*'
  base-directory: build-output

Related

AWS CodeBuild failing when building docker image

I'm trying to set up my CodeBuild pipeline using the AWS documentation. Everything goes fine until the moment CodeBuild starts building the Docker image with this command:
docker build -t $IMAGE_REPO_NAME:$IMAGE_TAG .
When processing this command, AWS throws this error:
Phase context status code: COMMAND_EXECUTION_ERROR Message: Error while executing command: docker build -t $IMAGE_REPO_NAME:$IMAGE_TAG .. Reason: exit status 125
The environment variables are set in CodeBuild, and building the image locally with the same command works.
My buildspec file:
version: 0.2
phases:
  pre_build:
    commands:
      - echo Logging in to Amazon ECR...
      - aws ecr get-login-password --region $AWS_DEFAULT_REGION | docker login --username AWS --password-stdin $AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com
  build:
    commands:
      - echo Build started on `date`
      - echo Building the Docker image...
      - IMAGE_TAG=$IMAGE_TAG
      - IMAGE_REPO_NAME=$IMAGE_REPO_NAME
      - docker build -t $IMAGE_REPO_NAME:$IMAGE_TAG .
      - docker tag "$IMAGE_REPO_NAME:$IMAGE_TAG" "$AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com/$IMAGE_REPO_NAME:$IMAGE_TAG"
  post_build:
    commands:
      - echo Build completed on `date`
      - echo Pushing the Docker image...
      - docker push $AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com/$IMAGE_REPO_NAME:$IMAGE_TAG

How to deploy docker image on EC2 instance using codedeploy?

I am pushing a Docker image to an ECR repo using the buildspec.yml file. This is what my buildspec.yml looks like:
version: 0.2
phases:
  install:
    runtime-versions:
      nodejs: 16
  pre_build:
    commands:
      - echo Logging in to Amazon ECR.....
      - aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin <account-id>.dkr.ecr.us-east-1.amazonaws.com
      - REPOSITORY_URI=<account-id>.dkr.ecr.us-east-1.amazonaws.com/vuejs
      - IMAGE_TAG=$(echo $CODEBUILD_RESOLVED_SOURCE_VERSION | cut -c 1-7)
  build:
    commands:
      - docker build -t $REPOSITORY_URI:$IMAGE_TAG .
  post_build:
    commands:
      - echo Pushing image now...
      - docker push $REPOSITORY_URI:$IMAGE_TAG
      - printf '[{"name":"vuejs","imageUri":"%s"}]' $REPOSITORY_URI:$IMAGE_TAG > imagedefinitions.json
artifacts:
  files:
    - imagedefinitions.json
    - appspec.yml
and this is my appspec.yml file:
version: 0.0
os: linux
files:
  - source: /
    destination: /
Now my pipeline (CodePipeline) passes, but how can I configure my appspec.yml file to deploy the Docker container on my EC2 instance? Any help will be appreciated.
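For reference, the printf line in the post_build phase above writes a one-element image definitions file. A local sketch with made-up account ID and commit hash shows what it produces:

```shell
# Hypothetical values; in CodeBuild these come from the environment and
# from $CODEBUILD_RESOLVED_SOURCE_VERSION
REPOSITORY_URI=123456789012.dkr.ecr.us-east-1.amazonaws.com/vuejs
IMAGE_TAG=$(echo "ab12cd34ef56ab12cd34ef56ab12cd34ef56ab12" | cut -c 1-7)

# Same printf as in the buildspec's post_build phase
printf '[{"name":"vuejs","imageUri":"%s"}]' "$REPOSITORY_URI:$IMAGE_TAG" > imagedefinitions.json
cat imagedefinitions.json
# → [{"name":"vuejs","imageUri":"123456789012.dkr.ecr.us-east-1.amazonaws.com/vuejs:ab12cd3"}]
```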

CodePipeline unable to find the image definition file

I have created a CodeBuild project whose buildspec.yml is as follows (following the standard template given by AWS, with minor modifications):
version: 0.2
phases:
  pre_build:
    commands:
      - echo Logging in to Amazon ECR...
      - aws --version
      - $(aws ecr get-login --region $AWS_DEFAULT_REGION --no-include-email)
      - REPOSITORY_URI=xxx.amazonaws.com/projectName
      - COMMIT_HASH=$(echo $CODEBUILD_RESOLVED_SOURCE_VERSION | cut -c 1-7)
      - IMAGE_TAG=${COMMIT_HASH:=test-cicd}
  build:
    commands:
      - echo Building the docker image...
      - docker build -t $REPOSITORY_URI:$COMMIT_HASH -t $REPOSITORY_URI:test-cicd .
      - echo Finish building the docker image.
  post_build:
    commands:
      - echo Pushing the docker images...
      - docker push $REPOSITORY_URI:$IMAGE_TAG
      - docker push $REPOSITORY_URI:test-cicd
      - echo Finish pushing the docker images.
      - echo Writing image definitions file...
      - printf '[{"name":"testcicd","imageUri":"%s"}]' $REPOSITORY_URI:test-cicd > imagedefinitions.json
      - cat imagedefinitions.json
artifacts:
  files: imagedefinitions.json
CodeBuild successfully pushes the new Docker image to ECR and creates the output artifact in S3.
Next, I tried to create a CodePipeline in which the source is ECR and the next stage performs a CodeDeploy deployment to ECS.
However, the pipeline status shows that the output artifact could not be found, even though I can see that the output artifact is indeed in S3.

Serverless Deploy From another Directory

Here is my root folder; I want to deploy the AWS Lambda functions from the Backend folder (the source is in CodeCommit).
So I wrote the buildspec below, but AWS CodeBuild gives this error: "This command can only be run in a Serverless service directory".
version: 0.1
phases:
  install:
    commands:
      - npm install -g serverless@1.20.2
  post_build:
    commands:
      - cd Backend
      - serverless deploy --region eu-west-1 --verbose
How can I deploy it from the Backend folder?
Edit: I forgot to edit the version. I changed it to version: 0.2 and now it works fine.
Can you change it to
- cd Backend && serverless deploy --region eu-west-1 --verbose
I forgot to edit the version of buildspec.yml. I changed it to version: 0.2 and now it works fine:
version: 0.2
phases:
  install:
    commands:
      - npm install -g serverless@1.20.2
  post_build:
    commands:
      - cd Backend
      - serverless deploy --region eu-west-1 --verbose
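The version matters here because in buildspec version 0.1 CodeBuild runs each command in a separate shell instance, so the cd Backend does not carry over to the serverless deploy command, while in version 0.2 the commands in a phase share one shell. A rough local sketch of the difference, using sh -c to stand in for the separate shells:

```shell
mkdir -p demo-root/Backend
cd demo-root

# v0.1-style: each command runs in its own shell, so `cd` does not persist
sh -c 'cd Backend'
sh -c 'pwd'                  # still demo-root

# v0.2-style: both commands share one shell, so `cd` persists
sh -c 'cd Backend && pwd'    # demo-root/Backend
```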

How can I get AWS CodeBuild's outputted zip extracted in ElasticBeanstalk?

I'm trying to get AWS CodePipeline working with an S3 source, CodeBuild, and Elastic Beanstalk (Node.js environment).
My problem lies between CodeBuild and Beanstalk.
I have CodeBuild outputting a zip file of the final Node.js app via the artifacts. Here is my CodeBuild buildspec.yml:
version: 0.1
phases:
  install:
    commands:
      - echo Installing Node Modules...
      - npm install -g mocha
      - npm install
  post_build:
    commands:
      - echo Performing Test
      - npm test
      - zip -r app-api.zip .
artifacts:
  files:
    - app-api.zip
When I manually run CodeBuild, it successfully puts the zip into S3. When I run CodePipeline, it puts the zip on each Elastic Beanstalk instance in /var/app/current as app-api.zip.
What I would like is for app-api.zip to be extracted into /var/app/current, just like a manual deploy via the Elastic Beanstalk console.
First, a quick explanation: CodePipeline sends whatever files you specified as artifacts to Elastic Beanstalk. In your case, you are sending app-api.zip.
What you probably want to do instead is send all the files, but not wrap them in a zip.
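To see why the zip arrives unexpanded, here is a local sketch of the double-wrapping: the pipeline bundles whatever the artifacts section lists, so listing app-api.zip produces a zip containing a zip (file names below are made up):

```shell
# Simulate the build output
mkdir -p app
echo 'console.log("hi")' > app/index.js

# What the buildspec did: create app-api.zip from the app files
(cd app && zip -qr ../app-api.zip .)

# What the pipeline then effectively does: bundle the declared artifact files
zip -qr pipeline-artifact.zip app-api.zip

# The bundle Beanstalk receives contains app-api.zip, not index.js
unzip -l pipeline-artifact.zip
```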
Let's change your buildspec.yml so it no longer creates app-api.zip and instead sends the raw files to CodePipeline:
version: 0.1
phases:
  install:
    commands:
      - echo Installing Node Modules...
      - npm install -g mocha
      - npm install
  post_build:
    commands:
      - echo Performing Test
      - npm test
      # - zip -r app-api.zip .   << Remove this line
artifacts:
  files:
    - '**/*'
    # Replace artifacts/files with the value shown above