I am attempting to get CodePipeline to fetch my code from GitHub and build it with CodeBuild. The first (Source) step works fine. But the second (Build) step fails during the "UPLOAD_ARTIFACTS" part. Here are the relevant log statements:
[Container] 2017/01/12 17:21:31 Assembling file list
[Container] 2017/01/12 17:21:31 Expanding MyApp
[Container] 2017/01/12 17:21:31 Skipping invalid artifact path MyApp
[Container] 2017/01/12 17:21:31 Phase complete: UPLOAD_ARTIFACTS Success: false
[Container] 2017/01/12 17:21:31 Phase context status code: ARTIFACT_ERROR Message: No matching artifact paths found
[Container] 2017/01/12 17:21:31 Runtime error (No matching artifact paths found)
My app has a buildspec.yml in its root folder. It looks like:
version: 0.1
phases:
  build:
    commands:
      - echo `$BUILD_COMMAND`
artifacts:
  discard-paths: yes
  files:
    - MyApp
It would appear that the "MyApp" in my buildspec.yml should be something different, but I'm poring through all of the AWS docs to no avail (what else is new?). How can I get it to upload the artifact correctly?
The artifacts should refer to files downloaded from your Source action or generated as part of the Build action in CodePipeline. For example, this is from a buildspec.yml I wrote:
artifacts:
  files:
    - appspec.yml
    - target/SampleMavenTomcatApp.war
    - scripts/*
When I see that you used MyApp in your artifacts section, it makes me think you're referring to the OutputArtifacts of the Source action of CodePipeline. Instead, you need to refer to the actual files the Source action downloads and stores (i.e. in S3) and/or the files your build generates and stores there.
You can find a sample of a CloudFormation template that uses CodePipeline, CodeBuild, CodeDeploy, and CodeCommit here: https://github.com/stelligent/aws-codedeploy-sample-tomcat/blob/master/codebuild-cpl-cd-cc.json The buildspec.yml is in the same forked repo.
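Applied to the buildspec.yml in the question, a corrected version might look something like the rough sketch below; target/MyApp.war is a hypothetical path and should be replaced with whatever file(s) the build command actually produces:

version: 0.1
phases:
  build:
    commands:
      - echo `$BUILD_COMMAND`
artifacts:
  discard-paths: yes
  files:
    - target/MyApp.war   # hypothetical: point at files the build actually generates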
Buildspec artifacts are information about where CodeBuild can find the build output and how CodeBuild prepares it for uploading to the Amazon S3 output bucket.
For the error "No matching artifact paths found", there are a couple of things to check:
The artifact file(s) specified in the buildspec.yml file have the correct path and file name:
artifacts:
  files:
    - 'FileNameWithPath'
If you are using a .gitignore file, make sure the file(s) specified in the artifacts section are not included in .gitignore.
Hope this helps.
In my case I received this error because I had changed directory in my build stage (the Java project I am building is in a subdirectory) and did not change back to the root. Adding cd .. at the end of the build stage did the trick.
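As a rough sketch of that pattern (the subdirectory name, build command, and artifact path below are hypothetical):

phases:
  build:
    commands:
      - cd my-java-subproject    # hypothetical subdirectory containing the project
      - ./gradlew build          # whatever build command the project uses
      - cd ..                    # return to the source root so artifact paths resolve
artifacts:
  files:
    - my-java-subproject/build/libs/*.jar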
I had a similar issue, and the solution was to package the directories and files inside the archive with no further root folder creation.
https://docs.aws.amazon.com/codebuild/latest/userguide/sample-war-hw.html
Artifacts are the stuff you want from your build process - whether compiled in some way or just files copied straight from the source. So the build server pulls in the code, compiles it as per your instructions, then copies the specified files out to S3.
In my case, using Spring Boot + Gradle, the output jar file (when I run gradle bootJar on my own system) is placed in build/libs/demo1-0.0.1-SNAPSHOT.jar, so I set the following in buildspec.yml:
artifacts:
  files:
    - build/libs/*.jar
This one file appears for me in S3, optionally in a zip and/or subfolder depending on the options chosen in the rest of the artifacts section.
Try using version 0.2 of the buildspec. Here is a typical example for Node.js:
version: 0.2
phases:
  pre_build:
    commands:
      - echo Nothing to do in the pre_build phase...
  build:
    commands:
      - npm install
      - npm run build
  post_build:
    commands:
      - echo Build completed on
artifacts:
  files:
    - appspec.yml
    - build/*
If you're like me and ran into this problem whilst using CodeBuild within a CodePipeline arrangement, you need to use the following:
- printf '[{"name":"container-name-here","imageUri":"%s"}]' $REPOSITORY_URI:$IMAGE_TAG > $CODEBUILD_SRC_DIR/imagedefinitions.json
I had the same issue as #jd96 wrote about. I needed to return to the root directory of the project to export the artifact.
build:
  commands:
    - cd tasks/jobs
    - make build
    - cd ../..
post_build:
  commands:
    - printf '[{"name":"%s","imageUri":"%s"}]' $IMAGE_REPO_NAME $REPOSITORY_URI:$IMAGE_TAG > imagedefinitions.json
artifacts:
  files: imagedefinitions.json
I have an Angular client and a Node.js server deployed into one Elastic Beanstalk environment.
The structure is that I put the Angular client files in an 'html' folder, and the proxy is defined in the .ebextensions folder.
-html
-other serverapp folder
-other serverapp folder
-.ebextensions
....
-package.json
-server.js
Every time I do a release, I build the Angular app, put it into the html folder of the Node app, zip it, and upload it to Elastic Beanstalk.
Now I want to move on to CI/CD. Basically I want to automate the above steps: use two sources (the Angular and Node apps), do the Angular build, put it into the html folder of the Node app, and generate only one output artifact.
I've got to the stage where I have a separate working pipeline for each app. I'm not very familiar with AWS yet; I just have a vague idea that I might need to use AWS Lambda.
Any help would be really appreciated.
The output artifact your CodeBuild job creates can be thought of as a directory location that you ask CodeBuild to zip as the artifact. You can use regular UNIX commands to manipulate this directory before the artifacts are packaged. The following buildspec.yml is an example:
version: 0.2
phases:
  build:
    commands:
      # build commands
      #- command
  post_build:
    commands:
      - mkdir /tmp/html
      - cp -R ./ /tmp/html
artifacts:
  files:
    - '**/*'
  base-directory: /tmp/html
Buildspec reference: https://docs.aws.amazon.com/codebuild/latest/userguide/build-spec-ref.html#build-spec-ref-syntax
In my existing AWS pipeline I have the following buildspec.yml:
version: 0.2
phases:
  build:
    commands:
      - cd media/web/front_dev
      - echo "Hello" > ../web/txt/hello.txt
artifacts:
  files:
    - ./media/web/hello.txt
And the appspec.yml has the following:
version: 0.0
os: linux
files:
  - source: /
    destination: /webserver/src/public
But the file hello.txt is not being deployed to the server during the deploy phase. Once I SSH into the machine, I look for the following file:
/webserver/src/public/media/web/hello.txt
But the file is not there. Do you have any idea why?
My pipeline initially had only a source and a deployment step; then I edited it in order to have a CodeBuild step as well.
Check your pipeline. You may have added the build step, but the deployment may still be fetching the code from version control instead of from the build output. In order to solve that, follow these steps:
Specify a name for the output artifact of the build step.
At the deploy step, select as input artifact the artifact you specified as the output artifact of the build step.
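As a rough sketch of how that wiring can look in a CloudFormation pipeline definition (the artifact, project, and application names below are hypothetical), the deploy action's input artifact should be the build action's output artifact rather than the source output:

- Name: Build
  Actions:
    - Name: CodeBuild
      ActionTypeId: { Category: Build, Owner: AWS, Provider: CodeBuild, Version: '1' }
      Configuration: { ProjectName: my-build-project }
      InputArtifacts:
        - Name: SourceOutput
      OutputArtifacts:
        - Name: BuildOutput          # name given to the build step's output artifact
- Name: Deploy
  Actions:
    - Name: CodeDeploy
      ActionTypeId: { Category: Deploy, Owner: AWS, Provider: CodeDeploy, Version: '1' }
      Configuration: { ApplicationName: my-app, DeploymentGroupName: my-deployment-group }
      InputArtifacts:
        - Name: BuildOutput          # deploy from the build output, not SourceOutput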
I've got a CodePipeline working with a Java application. I'm pulling the source from GitHub, building a package with Maven using CodeBuild, and deploying to ElasticBeanstalk in the Deploy stage. My problem is that CodeBuild is returning the artifact in a zip file:
[Container] 2019/03/21 13:23:07 Expanding target/*.war
[Container] 2019/03/21 13:23:07 Found 1 file(s)
[Container] 2019/03/21 13:23:09 Phase complete: UPLOAD_ARTIFACTS Success: true
I'm grabbing the resulting war file after the Maven package. I only want the war file to be picked up by Elastic Beanstalk. How can I force CodePipeline/CodeBuild to NOT compress the file?
You can specify any type of file, with or without compression, in the artifacts section of your buildspec.yml file.
Here is an example I am using with Docker:
artifacts:
  files: imagedefinitions.json
You will find the full doc of possible values and other examples here: https://docs.aws.amazon.com/codebuild/latest/userguide/build-spec-ref.html
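As a side note, when the CodeBuild project runs on its own (outside CodePipeline), the project-level artifacts configuration also has a Packaging setting that can be set to NONE instead of ZIP. A rough CloudFormation sketch, with hypothetical project and bucket names:

MyBuildProject:                        # hypothetical CodeBuild project
  Type: AWS::CodeBuild::Project
  Properties:
    # ...other required properties omitted...
    Artifacts:
      Type: S3
      Location: my-artifact-bucket     # hypothetical output bucket
      Packaging: NONE                  # upload files as-is instead of as a zip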
I have a simple CodeBuild spec that defines artifacts to be uploaded after tests run:
artifacts:
  files:
    - cypress/**/*.png
  discard-paths: yes
These artifacts are only generated if the test-action fails (a screenshot is captured of the failing test screen) and are being successfully uploaded to S3.
In the case that tests succeed, no .png files will be generated and the CodeBuild action fails:
[Container] 2018/09/21 20:06:34 Expanding cypress/**/*.png
[Container] 2018/09/21 20:06:34 Phase complete: UPLOAD_ARTIFACTS Success: false
[Container] 2018/09/21 20:06:34 Phase context status code: CLIENT_ERROR Message: no matching artifact paths found
Is there a way to conditionally upload files if they exist in the buildspec?
Alternatively I could use the s3 cli -- in which case I would need a way to easily access the bucket name and artifact key.
To get around this, I'm creating a placeholder file that matches the glob pattern if the build succeeds:
post_build:
  commands:
    - if [ -z "$CODEBUILD_BUILD_SUCCEEDING" ]; then echo "Build failing, no need to create placeholder image"; else touch cypress/0.png; fi
artifacts:
  files:
    - cypress/**/*.png
  discard-paths: yes
If anyone is still looking for a solution based on tgk's answer: in my case I want to upload the artifact only for the master ENV, so for anything other than master I create a placeholder and upload it to a TMP folder.
post_build:
  commands:
    # create a fake file to work around the failed upload in non-prod builds
    - |
      if [ "$ENV" = "master" ]; then
        export FOLDERNAME=myapp-$(date +%Y-%m-%d).$((BUILD_NUMBER))
      else
        touch myapp/0.tmp;
        export FOLDERNAME="TMP"
      fi
artifacts:
  files:
    - myapp/build/outputs/apk/prod/release/*.apk
    - myapp/*.tmp
  discard-paths: yes
  name: $FOLDERNAME
This is what my current buildspec looks like:
phases:
  build:
    commands:
      - ./gradlew soakTest -s
cache:
  paths:
    - '/root/.gradle/caches/**/*'
    - '.gradle/**/*'
But when this buildspec runs in CodeBuild, it prints messages that it is downloading Gradle 4.7.
It appears that other things are being cached correctly - I don't see log messages about downloading jar dependencies, for example.
What should the buildspec cache specifications look like, in order to make sure the Gradle version that the Gradle wrapper downloads gets cached?
Add the wrapper directory to the cache paths:
- '/root/.gradle/wrapper/**/*'
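Putting that together with the cache section from the question, a combined sketch might look like:

cache:
  paths:
    - '/root/.gradle/caches/**/*'
    - '/root/.gradle/wrapper/**/*'   # caches the Gradle distribution the wrapper downloads
    - '.gradle/**/*'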