How to get CodeBuild project name from buildspec file - amazon-web-services

This is my buildspec file to build an Angular, React, Vue or similar project through AWS CodeCommit and publish the resulting artifact to an S3 bucket:
version: 0.2
env:
  variables:
    S3_BUCKET: "my-bucket"
phases:
  install:
    runtime-versions:
      nodejs: 16
  pre_build:
    commands:
      - echo Installing source NPM dependencies...
      - npm install
  build:
    commands:
      - echo Build started on `date`
      - npm run build
  post_build:
    commands:
      - aws s3 cp dist s3://${S3_BUCKET} --recursive
      - echo Build completed on `date`
What I would like to do is publish the resulting files to a subfolder named after the project. Right now everything goes to my-bucket, but I would like it to go to my-bucket/name-of-the-project.
I could change the post_build command to something like
- aws s3 cp dist s3://${S3_BUCKET}/name-of-the-project --recursive
but then the directory name would always be the same. What I want is to obtain the name of the CodeBuild project dynamically, or to read it from package.json or similar, so that the directory matches the project name.

Here are two ways to read a project identifier from the build context at run-time:
Option 1: Read the project name from package.json:
PROJECT_NAME=$(cat package.json | jq -r '.name')
echo $PROJECT_NAME # -> name-of-the-project
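This can be verified locally; the package.json written below is a hypothetical stand-in for the project's real manifest:

```shell
# Hypothetical package.json standing in for the project's real manifest
printf '{"name": "name-of-the-project", "version": "1.0.0"}\n' > /tmp/package.json
# jq -r prints the raw string value without quotes
PROJECT_NAME=$(jq -r '.name' /tmp/package.json)
echo "$PROJECT_NAME"   # -> name-of-the-project
```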
Option 2: Extract the CodeCommit repo name from the source URL. Each CodeBuild execution exposes several environment variables, including CODEBUILD_SOURCE_REPO_URL.
echo $CODEBUILD_SOURCE_REPO_URL # -> https://git-codecommit.us-east-1.amazonaws.com/v1/repos/my-repo
REPO_NAME=$(echo $CODEBUILD_SOURCE_REPO_URL | awk -F'/' '{print $NF}') # split the URL at '/' and return the last field
echo $REPO_NAME # -> my-repo
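The URL parsing can also be checked outside CodeBuild; the URL below is a made-up value mirroring the format CodeBuild injects:

```shell
# Hypothetical value mirroring the format of CODEBUILD_SOURCE_REPO_URL
CODEBUILD_SOURCE_REPO_URL="https://git-codecommit.us-east-1.amazonaws.com/v1/repos/my-repo"
# -F'/' splits on '/', $NF is the last field, i.e. the repo name
REPO_NAME=$(echo "$CODEBUILD_SOURCE_REPO_URL" | awk -F'/' '{print $NF}')
echo "$REPO_NAME"   # -> my-repo
```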
Pass one of the captured names to the S3 command:
aws s3 cp dist s3://${S3_BUCKET}/${PROJECT_NAME} --recursive
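Put together in the buildspec's post_build phase (a sketch using option 1; option 2's REPO_NAME extraction works the same way):

```yaml
post_build:
  commands:
    - PROJECT_NAME=$(jq -r '.name' package.json)
    - aws s3 cp dist s3://${S3_BUCKET}/${PROJECT_NAME} --recursive
    - echo Build completed on `date`
```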

Related

Copy build artifacts to s3 bucket using Buildspec.yml

I am trying to understand CodeBuild, and I am trying to copy the build artifact to an S3 bucket. In one of my projects I saw that the files are copied to the S3 bucket using the "aws s3 cp" command instead of the artifacts phase.
Can't we achieve this using the artifacts phase after the post_build phase, instead of copying with the "aws s3 cp" command in the post_build phase?
What is the purpose of the artifacts section in buildspec.yml?
Can we have a buildspec.yml without artifacts?
version: 0.2
env:
  variables:
    CACHE_CONTROL: "100"
    S3_BUCKET: animation-project
    BUILD_ENV: dev
phases:
  install:
    runtime-versions:
      nodejs: 10
    commands:
      - echo Installing source NPM dependencies...
      - echo environment printing ${STAGE}
      - npm install
      - npm install -g @angular/cli
  build:
    commands:
      - echo Build started on `date`
      - ng build --configuration=${BUILD_ENV}
      # - ng test
      - echo build completed
  post_build:
    commands:
      - echo Post Build Started successfully on `date`
      - aws s3 cp dist/* s3://${S3_BUCKET} --recursive --acl public-read --cache-control "max-age=${CACHE_CONTROL}"
      - echo Build completed on `date`
artifacts:
  files:
    - '**/*'
  base-directory: 'dist*'
  discard-paths: yes
Thanks in advance for your answers.

AWS s3 cp cannot find file from a gitlab build?

I'm working on a GitLab CI project where I have to push the APK to our AWS S3 bucket. For that I have specified the keys as environment variables in the project settings of our repository. Here is my gitlab-ci.yml file:
stages:
  - build
  - deploy
variables:
  AWS_DEFAULT_REGION: us-east-2 # The region of our S3 bucket
  BUCKET_NAME: abc.bycket.info # bucket name
  FILE_NAME: ConfuRefac.apk
assembleDebug:
  stage: build
  script:
    - export ANDROID_HOME=/home/bitnami/android-sdk-linux
    - export ANDROID_NDK_HOME=/opt/android-ndk
    - export PATH=$PATH:/home/bitnami/android-sdk-linux/platform-tools/
    - export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-amd64
    - export PATH=/usr/lib/jvm/java-1.8.0-openjdk-amd64/bin:$PATH
    - chmod +x ./gradlew
    - ./gradlew assembleDebug
    - cd app/build/outputs/apk/debug
    - mv app-debug.apk ${FILE_NAME}
  artifacts:
    paths:
      - app/build/outputs/apk/debug/${FILE_NAME}
deploys3:
  image: "python:latest" # We use python because there is a well-working AWS SDK
  stage: deploy
  dependencies:
    - assembleDebug
  script:
    - pip install awscli
    - cd app/build/outputs/apk/debug/
    - ls && pwd
    - aws s3 cp ${FILE_NAME} s3://${BUCKET_NAME}/${FILE_NAME} --recursive
So when the deploy stage kicks in, it cannot find the file, even though the ls output clearly shows that the file is there.
Collecting futures<4.0.0,>=2.2.0; python_version == "2.7" (from s3transfer<0.4.0,>=0.3.0->awscli)
Using cached https://files.pythonhosted.org/packages/d8/a6/f46ae3f1da0cd4361c344888f59ec2f5785e69c872e175a748ef6071cdb5/futures-3.3.0-py2-none-any.whl
Collecting six>=1.5 (from python-dateutil<3.0.0,>=2.1->botocore==1.16.13->awscli)
Using cached https://files.pythonhosted.org/packages/65/eb/1f97cb97bfc2390a276969c6fae16075da282f5058082d4cb10c6c5c1dba/six-1.14.0-py2.py3-none-any.whl
Installing collected packages: urllib3, docutils, jmespath, six, python-dateutil, botocore, pyasn1, rsa, futures, s3transfer, PyYAML, colorama, awscli
Successfully installed PyYAML-5.3.1 awscli-1.18.63 botocore-1.16.13 colorama-0.4.3 docutils-0.15.2 futures-3.3.0 jmespath-0.10.0 pyasn1-0.4.8 python-dateutil-2.8.1 rsa-3.4.2 s3transfer-0.3.3 six-1.14.0 urllib3-1.25.9
You are using pip version 8.1.1, however version 20.1.1 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
$ cd ${PWD}/app/build/outputs/apk/debug/
$ ls && pwd
ConfuRefac.apk
/home/gitlab-runner/builds/CeGhSYCJ/0/root/confu-android/app/build/outputs/apk/debug
$ aws s3 cp ${PWD}/${FILE_NAME} s3://${BUCKET_NAME}/${FILE_NAME} --recursive
warning: Skipping file /home/gitlab-runner/builds/CeGhSYCJ/0/root/confu-android/app/build/outputs/apk/debug/ConfuRefac.apk/. File does not exist.
Running after_script
00:00
Uploading artifacts for failed job
00:01
ERROR: Job failed: exit status 1
As I indicated in the comments, the issue was caused by --recursive treating ${FILE_NAME} as a directory, not a file.
Which of course makes sense, because one can't recursively copy a single file.
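A sketch of the corrected deploy job, with --recursive dropped so the APK is copied as a single object:

```yaml
deploys3:
  image: "python:latest"
  stage: deploy
  dependencies:
    - assembleDebug
  script:
    - pip install awscli
    - cd app/build/outputs/apk/debug/
    # single file -> no --recursive
    - aws s3 cp ${FILE_NAME} s3://${BUCKET_NAME}/${FILE_NAME}
```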

How to upload a generated folder content into S3 using CodeBuild?

I am trying to configure a CodePipeline on AWS that takes my Nuxt website from GitHub, runs the command npm run generate to generate the static website, then uploads the dist folder to an S3 bucket.
Here is what my buildspec.yml looks like:
version: 0.2
phases:
  install:
    commands:
      - npm install
  build:
    commands:
      - npm run generate
  post_build:
    commands:
      - aws s3 sync dist $S3_BUCKET
The error I get is: The user-provided path dist does not exist. Does anyone know how to correct this? I read a lot about artifacts but I have never used them before…
Thanks in advance,
You can use artifacts to upload the dist folder to S3. I would suggest not using a post_build command for this, because post_build commands run even when the build has failed; this is a known limitation of CodeBuild. Just replace your buildspec with the following.
version: 0.2
phases:
  install:
    commands:
      - npm install
  build:
    commands:
      - npm run generate
artifacts:
  files:
    - '**/*'
  base-directory: 'dist'
'**/*' means it will upload all the files and folders under the base directory 'dist'. The destination bucket itself is configured in the CodeBuild project's artifact settings in the AWS console, not in the buildspec.
Also make sure that your CodeBuild IAM role has sufficient permission to access your bucket.
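If you also want the files to land under a subfolder of the bucket, the artifacts section accepts an optional name field; a sketch, where 'my-site' is a made-up value and the bucket itself still comes from the project's artifact settings:

```yaml
artifacts:
  files:
    - '**/*'
  base-directory: 'dist'
  name: my-site   # hypothetical folder name created inside the configured bucket
```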

Using codepipeline to create a build and deploy it in aws s3

I have created a CodePipeline in AWS that creates the build and copies the dist folder to my AWS S3 bucket. I am able to save all the files inside the dist folder, but the assets folder is not saved. What should the buildspec.yml file be to copy the assets folder to S3?
version: 0.2
env:
  variables:
    S3_BUCKET: bucketName
    APP_NAME: "Walter"
    BUILD_ENV: "prod"
phases:
  install:
    commands:
      - echo Installing source NPM dependencies...
      - npm install
      - npm install -g @angular/cli
  build:
    commands:
      # Builds Angular application. You can also build using a custom environment here, like mock or staging
      - echo Build started on `date`
      - ng build --${BUILD_ENV}
  post_build:
    commands:
      # Clear S3 bucket.
      - aws s3 rm s3://${S3_BUCKET} --recursive
      - echo S3 bucket is cleared.
      # Copy dist folder to S3 bucket. As of Angular 6, builds are stored inside an app folder in dist and not at the root of the dist folder
      - aws s3 cp dist/walter s3://${S3_BUCKET}/${APP_NAME} --recursive
      - echo Build completed on `date`
artifacts:
  files:
    - '**/*'
  discard-paths: yes
  base-directory: 'dist/Walter'
For someone looking to achieve this: I solved it by changing the 'discard-paths' value to 'no'.
Setting it to 'yes' discards the folder structure and copies all the files directly to the specified location; setting it to 'no' maintains the folder structure.
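The flattening behaviour of discard-paths: yes can be illustrated with plain shell; the /tmp/dist-demo tree below is a made-up stand-in for dist/Walter:

```shell
# Build a tiny stand-in for dist/Walter with a nested assets folder
mkdir -p /tmp/dist-demo/assets
touch /tmp/dist-demo/index.html /tmp/dist-demo/assets/logo.png
# discard-paths: yes keeps only each file's basename, so the assets/ prefix is lost
FLAT=$(find /tmp/dist-demo -type f -exec basename {} \; | sort)
echo "$FLAT"
# -> index.html
#    logo.png
```

With discard-paths: no, the assets/ prefix is preserved, which is what the generated index.html expects when it references files under assets/.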

AWS CodeBuild - Unable to find DockerFile during build

Started playing with AWS CodeBuild.
The goal is to have a Docker image as the final result, with Node.js, hapi and a sample app running inside.
Currently I have an issue:
"unable to prepare context: unable to evaluate symlinks in Dockerfile path: lstat /tmp/src049302811/src/Dockerfile: no such file or directory"
It appears in the BUILD stage.
Project details:
An S3 bucket is used as the source.
The ZIP file stored in the S3 bucket contains buildspec.yml, package.json, a sample *.js file and DockerFile.
aws/codebuild/docker:1.12.1 is used as the build environment.
When I build the image using Docker installed on my laptop there are no issues, so I can't understand which directory I need to specify to get rid of this error message.
Buildspec and DockerFile are attached below.
Thanks for any comments.
buildspec.yml
version: 0.1
phases:
  pre_build:
    commands:
      - echo Logging in to Amazon ECR...
      - $(aws ecr get-login --region eu-west-1)
  build:
    commands:
      - echo Build started on `date`
      - echo Building the Docker image...
      - docker build -t <CONTAINER_NAME> .
      - docker tag <CONTAINER_NAME>:latest <ID>.dkr.ecr.eu-west-1.amazonaws.com/<CONTAINER_NAME>:latest
  post_build:
    commands:
      - echo Build completed on `date`
      - echo Pushing the Docker image...
      - docker push <id>.eu-west-1.amazonaws.com/<image>:latest
DockerFile
FROM alpine:latest
RUN apk update && apk upgrade
RUN apk add nodejs
RUN rm -rf /var/cache/apk/*
COPY . /src
RUN cd /src; npm install hapi
EXPOSE 80
CMD ["node", "/src/server.js"]
Ok, so the solution was simple.
The issue was related to the Dockerfile name.
It was not accepting DockerFile (with a capital F) but Dockerfile (with a lower-case f) worked perfectly. It likely worked locally because the laptop's filesystem was case-insensitive, while the CodeBuild environment's filesystem is case-sensitive.
Can you validate that Dockerfile exists in the root of the directory? One way of doing this would be to run ls -altr as part of the pre-build phase in your buildspec (even before ecr login).
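A sketch of that check in the buildspec above (version 0.1 syntax, matching the question):

```yaml
phases:
  pre_build:
    commands:
      - ls -altr   # confirm the file is named Dockerfile, lower-case f, at the repo root
      - echo Logging in to Amazon ECR...
      - $(aws ecr get-login --region eu-west-1)
```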