I'm trying to hook my GitHub repo up to S3 so that every time there's a commit, AWS CodePipeline deploys the ./<path>/public folder to a specified S3 bucket.
So far in my pipeline, the Source stage works (it's hooked to GitHub and picks up new commits), but the Deploy stage failed with: Action execution failed
BundleType must be either YAML or JSON.
This is how I set them up:
CodePipeline
Action name: Source
Action provider: GitHub
Repository: account/repo
Branch: master
GitHub webhooks
CodeDeploy
Compute type: AWS Lambda
Service role: myRole
Deployment settings: CodeDeployDefault.LambdaAllAtOnce
IAM Role: myRole
AWS Service
Choose the service that will use this role: Lambda / CodeDeploy
Select your use case: CodeDeploy
Policies: AWSCodeDeployRole
I understand that there must be a buildspec.yml file in the root folder. I've tried a few files I could find, but they don't seem to work. What did I do wrong, or how should I edit the buildspec file to do what I want?
Update
Thanks to @Milan Cermak, I understand I need to do:
CodePipeline:
Stage 1: Source: hook with GitHub repo. This one is working.
Stage 2: Build: use CodeBuild to grab only the wanted folder using a buildspec.yml file in the root folder of the repo.
Stage 3: Deploy: use
Action Provider: S3
Input Artifacts: OutputArtifacts (result of stage 2).
Bucket: the bucket that hosts the static website.
CodePipeline works. However, the output contains only the files (.html) and not the folders nested inside the public folder.
I've checked this and figured out how to remove the path of a nested folder with discard-paths: yes, but I'm unable to get all the sub-folders inside the ./<path>/public folder. Any suggestion?
CodeBuild uses a buildspec, but CodeDeploy uses an appspec.
Do you have an appspec file?
You shouldn't use CodeDeploy, as that's a service for automating deployments of applications, but rather CodeBuild, which executes commands and prepares the deployment artifact for further use in the pipeline.
These commands go in the buildspec.yml file (typically in the root directory of the repo, but the location is configurable). For your use case, it won't be too complicated, as you're not compiling anything or running tests, etc.
Try this as a starting point:
version: 0.2
phases:
  build:
    commands:
      - ls
artifacts:
  files:
    - public/*
The phases section is required, which is why it's included (and, thanks to the ls command, you'll at least see which files are present in the CodeBuild environment), but it's not the interesting part for your case. What is interesting is the artifacts section. That's where you define the output of the CodeBuild phase, i.e. what gets passed on to the next step in the pipeline.
Depending on how you want the files structured (for example, do you want the public directory itself in the artifact, or only the files it contains, without the parent dir), you might want to use other configuration that's possible in the artifacts section; see the buildspec reference for details. One such variation is sketched below.
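For instance, to pick up everything under public, including nested sub-folders, while dropping the public/ prefix itself, something like the following should work (a minimal sketch, not tested against your repo; the recursive **/* glob combined with base-directory is the relevant part):
version: 0.2
phases:
  build:
    commands:
      - ls
artifacts:
  base-directory: public  # paths in files are resolved relative to this
  files:
    - '**/*'              # recursively include all files and sub-folders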
Remember to use the output artifact of the CodeBuild step as the input artifact of the Deploy to S3 step.
Buildspec is for CodeBuild, as t_yamo pointed out.
You are using CodeDeploy, which uses an appspec.yml file that looks something like this for my config:
version: 0.0
os: linux
files:
  - source: /
    destination: /path/to/destination
hooks:
  BeforeInstall:
    - location: /UnzipResourceBundle.sh
  ApplicationStart:
    - location: /RestartServer.sh
      timeout: 3600
UnzipResourceBundle.sh is just a bash script which can be used to do any number of things.
#!/bin/bash
# Do something
You can find a sample AppSpec.yml file in the Amazon documentation here - https://docs.aws.amazon.com/codedeploy/latest/userguide/reference-appspec-file-example.html#appspec-file-example-lambda
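Note that since your deployment group uses the AWS Lambda compute type, the appspec has a different shape from the EC2/on-premises one above: it declares the function, alias and versions rather than files and hooks. A minimal sketch, where myFunction and the version numbers are placeholder assumptions:
version: 0.0
Resources:
  - myFunction:
      Type: AWS::Lambda::Function
      Properties:
        Name: myFunction     # placeholder function name
        Alias: live          # alias whose traffic is shifted
        CurrentVersion: "1"  # version traffic shifts from
        TargetVersion: "2"   # version traffic shifts to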
CodePipeline recently announced a deploy to S3 action: https://aws.amazon.com/about-aws/whats-new/2019/01/aws-codepipeline-now-supports-deploying-to-amazon-s3/
Related
I have a codepipeline which triggers a few codebuild projects in different stages. In my codebuild project I have this configuration:
# codebuild.yml
Artifacts:
  Type: CODEPIPELINE
...
Source:
  Type: CODEPIPELINE
  BuildSpec: buildspec.yml
# buildspec.yml
version: 0.2
phases:
...
artifacts:
  name: test-result
  files:
    - '**/*'
    - '*'
In this configuration it specifies the artifact and uses CODEPIPELINE as the artifact type. So in codepipeline, how can I upload them to s3 bucket?
What I can think of is to write another codebuild project and use the aws s3 command line to upload the files, but that is too manual. Is there an automatic way to do the job?
The build artifact includes test results, and I'd like to upload the test results regardless of whether a previous stage failed or not. Is it possible to achieve this in codepipeline?
how can I upload them to s3 bucket?
There are two ways. One you already pointed out - use a CB action (the same project or a different one) to copy files to S3 using the AWS CLI, as sketched below.
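A minimal sketch of that approach, assuming a placeholder bucket my-artifact-bucket and that the CodeBuild service role has s3:PutObject on it:
version: 0.2
phases:
  build:
    commands:
      - echo Running build and tests...
  post_build:
    commands:
      # copy the workspace (or a sub-directory) to the bucket;
      # my-artifact-bucket is a placeholder name
      - aws s3 sync . s3://my-artifact-bucket/test-results/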
Second is to use the S3 deploy action. This allows you to deploy a zipped or unzipped artifact to your destination bucket.
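If your pipeline is itself defined in CloudFormation, that action looks roughly like this (a sketch; the bucket name is a placeholder and test-result is the artifact name from your buildspec):
- Name: DeployTestResults
  ActionTypeId:
    Category: Deploy
    Owner: AWS
    Provider: S3
    Version: '1'
  Configuration:
    BucketName: my-artifact-bucket  # placeholder destination bucket
    Extract: 'true'                 # unzip the artifact on deploy
  InputArtifacts:
    - Name: test-result             # output artifact of the build stage
  RunOrder: 1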
I am having trouble with AWS Codepipeline: rather than deploying the entire Codecommit repo to an S3 bucket, I would like Codepipeline to deploy only a single file that was updated.
A similar question is asked here: Codepipeline: Deploy only specific files to s3, but I do not want any one specific file. Any file, whether it's HTML or CSS, only the one that was updated should be deployed to S3.
I am using AWS CodeBuild, my deployment yaml file is this:
version: 0.2
phases:
  install:
    commands:
      - echo Entered the install phase...
      - apt-get update -y
  build:
    commands:
      - echo Building...
artifacts:
  files:
    - index.html
  discard-paths: yes
  secondary-artifacts:
    artifact1:
      files:
        - index.html
Now, in the above code, what should I write instead of index.html in artifacts, so that whenever any other file is changed or updated, only that single file is updated in the S3 bucket?
Unfortunately, there is no built-in functionality for this in CodeBuild. This means that you have to program it yourself in your CodeBuild procedure.
A possible way is to list the files in your CodeCommit repository that were changed in the last commit, which you can do with git, as in the sketch below.
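A minimal sketch of that idea, assuming the build's source actually contains the .git metadata (the default CodePipeline CodeCommit source artifact does not, so you may need the full-clone option or a git clone in the build) and a placeholder bucket my-site-bucket:
version: 0.2
phases:
  build:
    commands:
      # list files touched by the last commit and copy each one to S3
      - |
        for f in $(git diff --name-only HEAD~1 HEAD); do
          aws s3 cp "$f" "s3://my-site-bucket/$f"
        done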
I'm using CodeBuild to build a Scala sbt project. During the build I'm doing the following steps:
Checkout code from Git.
Install all sbt tools.
Build the code.
Create a Docker image.
Publish the Docker image to AWS ECR.
Update the K8s deployment file located in the project root folder, deployment/deployment.yaml, with the latest version of the published image, using the kustomize utility.
Publish this file to an AWS S3 bucket to run further deployment on AWS EKS.
Unfortunately, the last step fails and I can't move this deployment file to the AWS S3 bucket.
artifacts:
  files:
    - deployment/deployment.yaml
  discard-paths: yes
  base-directory: 'deployment'
What am I doing wrong?
base-directory: 'deployment'
... would mean that CodeBuild resolves the files paths relative to the 'deployment' directory, so it looks for "deployment/deployment.yaml" inside it, i.e. for deployment/deployment/deployment.yaml, which is probably wrong.
Try this:
artifacts:
  files:
    - deployment/deployment.yaml
  discard-paths: yes
One trick I have found useful is to install 'tree' in my buildspec and inspect the directory tree to make sure I am specifying the correct path. Put together, it looks like the sketch below.
In the install phase:
- apt-get install tree
Later in the buildspec:
- tree .
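A sketch of where those commands sit, assuming an Ubuntu-based build image where apt-get is available:
version: 0.2
phases:
  install:
    commands:
      - apt-get update -y && apt-get install -y tree  # install the tree utility
  build:
    commands:
      - tree .  # print the directory structure to the build log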
Hello, I created a codepipeline project with the following configuration:
Source: code in S3, pulled from Bitbucket.
Build with CodeBuild, generating a docker image and storing it in an Amazon ECR repository.
Deployment provider: Amazon ECS.
The whole process works until the deploy step; for some reason I am getting the following error during deployment:
Insufficient permissions Unable to access the artifact with Amazon S3
object key 'FailedScanSubscriber/MyAppBuild/Wmu5kFy' located in the
Amazon S3 artifact bucket 'codepipeline-us-west-2-913731893217'. The
provided role does not have sufficient permissions.
During the build phase, it is even able to push a new docker image to the ECR repository.
I tried everything: changed IAM roles and policies, added full access to S3, I even set the S3 bucket as public. Nothing worked. I am out of options, so if someone could help me that would be wonderful. I have little experience with AWS, so any help is appreciated.
I was able to find a solution. The real issue is that when the deployment provider is set to Amazon ECS, we need to generate an output artifact containing the container name from the task definition and the image URI, for example:
post_build:
  commands:
    - printf '[{"name":"your.task.definition.name","imageUri":"%s"}]' $AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com/$IMAGE_REPO_NAME:$IMAGE_TAG > imagedefinitions.json
artifacts:
  files: imagedefinitions.json
This happens when AWS CodeDeploy cannot find the build artifact from AWS CodeBuild. If you go into the S3 bucket and check the path, you will see that the artifact object is NOT THERE!
Even though the error talks about a permission issue, it can happen simply because the artifact object is absent.
Solution: properly configure the artifacts section in buildspec.yml, and configure the AWS Codepipeline stages properly, specifying the input and output artifact names. For example:
artifacts:
  files:
    - '**/*'
  base-directory: base_dir
  name: build-artifact-name
  discard-paths: no
Refer to this article - https://medium.com/@shanikae/insufficient-permissions-unable-to-access-the-artifact-with-amazon-s3-247f27e6cdc3
For me the issue was that my CodeBuild step was encrypting the artifacts using the default AWS-managed S3 key.
My Deploy step uses a cross-account role, so it couldn't retrieve the artifact. Once I changed the CodeBuild encryption key to my CMK, as it should have been originally, my deploy step succeeded.
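If you hit the same thing, the project's encryption key can be switched in the console or with the CLI; a sketch, with the project name and key ARN as placeholders:
# point the CodeBuild project at a customer-managed KMS key
aws codebuild update-project \
  --name my-build-project \
  --encryption-key arn:aws:kms:us-east-1:111111111111:key/your-cmk-id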
I'm attempting to set up AWS Codepipeline with a Cloudformation SAM stack that deploys to a Lambda, and am doing this via the pipeline console. The pipeline passes the clone from Codecommit and the build with Codebuild, but fails on the deploy with Cloudformation.
I get the following error message:
Action execution failed
Invalid TemplatePath: MyAppBuild::samTemplate.yml
I've been following the documentation (http://docs.aws.amazon.com/lambda/latest/dg/automating-deployment.html) pretty closely and in field 16 of the codepipeline setup form (Template File) I have been putting samTemplate.yml.
I also have samTemplate.yml in the root of my repo (which is also the root of the project).
I've matched the Codebuild output artifact name with the Cloudformation input artifact name and they match exactly.
Am I missing something here? How do I get the Cloudformation deploy process to recognize the sam template?
EDIT 1: I've switched to using Codestar instead of using CodePipeline directly. Nothing special in my buildspec.yml, but it's below if anyone's interested.
version: 0.2
phases:
  install:
    commands:
      - echo "install"
  pre_build:
    commands:
      - echo "pre_build"
  build:
    commands:
      - aws cloudformation package --template template.yml --s3-bucket $S3_BUCKET --output-template template-export.json
artifacts:
  type: zip
  files:
    - template-export.json
One thing to keep in mind for anyone new to CodeBuild: the zip file that gets created as a result is of the root-level directory of your build operations, and it IS the zip file that gets transferred to the final deployment (in my case, Lambda).
You need to add the aws cloudformation package command to your buildspec.yml:
aws cloudformation package --debug --template-file <YourSamTemplate.yml> --s3-bucket <YourbucketName> --output-template-file <YourOutputSamTemplate.yml>
where YourSamTemplate.yml is the name of the samTemplate.yml in your project root, and YourOutputSamTemplate.yml is the new name you want to give the output file once the cloudformation package command is done.
Then in the next stage you need to define your input artifact as the output of your aws codebuild stage, and use this input artifact to map your template:
build-output::YourOutputSamTemplate.yml
I believe the output of the build stage is what needs to be passed into the CloudFormation action.
In the tutorial, the build output is called NewSamTemplate.yaml.
So try updating your TemplatePath to MyAppBuild::NewSamTemplate.yml, as sketched below.
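For reference, if the pipeline is defined in CloudFormation rather than through the console, the mapping looks roughly like this (a sketch following the Lambda tutorial's change-set pattern; the stack and change set names and the role ARN are placeholders, MyAppBuild is the artifact name from the question):
- Name: Deploy
  ActionTypeId:
    Category: Deploy
    Owner: AWS
    Provider: CloudFormation
    Version: '1'
  Configuration:
    ActionMode: CHANGE_SET_REPLACE
    ChangeSetName: pipeline-changeset                   # placeholder change set name
    StackName: my-sam-stack                             # placeholder stack name
    TemplatePath: MyAppBuild::NewSamTemplate.yml        # artifactName::fileName
    Capabilities: CAPABILITY_IAM
    RoleArn: arn:aws:iam::111111111111:role/cfn-deploy  # placeholder role
  InputArtifacts:
    - Name: MyAppBuild
  RunOrder: 1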
In my case, the error was in the filename: instead of template.yml, the actual file was template.yaml.