How can I upload build artifacts to s3 bucket from codepipeline? - amazon-web-services

I have a CodePipeline that triggers a few CodeBuild projects in different stages. My CodeBuild project has this configuration:
  # codebuild.yml
  Artifacts:
    Type: CODEPIPELINE
  ...
  Source:
    Type: CODEPIPELINE
    BuildSpec: buildspec.yml
  # buildspec.yml
  version: 0.2
  phases:
    ...
  artifacts:
    name: test-result
    files:
      - '**/*'
      - '*'
In this configuration the artifacts are specified with CODEPIPELINE as the artifact type. So in CodePipeline, how can I upload them to an S3 bucket?
What I can think of is to write another CodeBuild project and use the aws s3 command line to upload the files. But that is too manual. Is there an automatic way to do the job?
The build artifacts include test results, and I'd like to upload the test results regardless of whether previous stages failed. Is it possible to achieve this in CodePipeline?

how can I upload them to s3 bucket?
There are two ways. One you already pointed out: use a CodeBuild action (the same or a different one) to copy the files to S3 using the AWS CLI.
The second is to use the S3 deploy action. This allows you to deploy a zipped or unzipped artifact to your destination bucket.
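As a rough sketch, an S3 deploy action in a CloudFormation-style pipeline definition might look like the following; the stage, action, artifact, and bucket names are placeholders, not values from the question:

```yaml
# Hypothetical deploy stage using the CodePipeline S3 deploy action.
- Name: DeployToS3
  Actions:
    - Name: CopyTestResults
      ActionTypeId:
        Category: Deploy
        Owner: AWS
        Provider: S3
        Version: '1'
      InputArtifacts:
        - Name: BuildOutput          # output artifact of the CodeBuild action
      Configuration:
        BucketName: my-destination-bucket
        Extract: 'true'              # unzip the artifact into the bucket
```

Setting Extract to 'false' uploads the artifact as a single zip object, in which case an ObjectKey must also be configured.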

Related

Move child folder content to an Amazon S3 bucket using AWS Code Deploy

I have created the below-mentioned pipeline in AWS.
CodeCommit -> CodeBuild -> CodeDeploy
I am using the ng serve command to build the Angular source code, and it creates a folder named dist and moves all the build content into it.
When deploying the build result to an Amazon S3 bucket, is it possible to move only the content inside the dist folder?
At the moment, all the source files are being transferred to the Amazon S3 bucket with the dist folder.
Other than writing a Lambda function to achieve this, is there a shortcut to achieve this within the pipeline?
I wonder why you have used CodeDeploy for S3 copying. You can achieve the same using CodeBuild itself.
CodeCommit => CodePipeline => CodeBuild => S3
After creating the build (dist), add commands for the S3 sync to your buildspec.yml file.
A few sample lines of buildspec.yml for your use case:
  - ng build   # use ng build rather than ng serve, so the command exits and writes dist/
  - ls -ltr
  - cd dist
  - aws s3 sync . s3://YOUR_BUCKET --acl bucket-owner-full-control
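Placed in context, a minimal buildspec.yml for this flow might look like the following sketch; the bucket name and the npm install step are assumptions, and `ng build` (rather than `ng serve`) is used because it actually produces `dist/` and then exits:

```yaml
# Sketch of a full buildspec.yml around the sample commands above.
# YOUR_BUCKET is a placeholder.
version: 0.2
phases:
  install:
    commands:
      - npm install        # assumed dependency install step
  build:
    commands:
      - ng build           # writes the compiled app into dist/
  post_build:
    commands:
      - cd dist
      - aws s3 sync . s3://YOUR_BUCKET --acl bucket-owner-full-control
```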

How to add files from S3 bucket to output artifacts in AWS CodePipeline? (NodeJS)

I am using an AWS CodePipeline which fetches the source from my Git repository, uses a CodeBuild buildspec to build and save output artifacts to an S3 bucket, which ultimately gets deployed to Elastic Beanstalk (NodeJS environment).
Everything works fine, but I require the pipeline to copy one particular file from one of my AWS S3 buckets and add it to the output artifacts before deploying it to EB.
Can it be done using the buildSpec?
  artifacts:
    files:
      - '**/*'
      # - How to add a file from S3 to the artifacts?
My recommendation is, as part of the build or post_build phase, to copy the required file from S3 into your build directory.
  build:
    commands:
      - echo "Build commands"
      - aws s3 cp --region=xx-xxxx-x "s3://file/in/s3" "local-file-instance-to-include"
Then you will have the file copied from s3, available for your build, and you can add it to the artifacts output.
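Putting the two pieces together, a buildspec sketch for this might look as follows; the paths and region placeholders are carried over from the answer as-is:

```yaml
# Sketch combining the S3 copy step with the artifacts section.
version: 0.2
phases:
  build:
    commands:
      - echo "Build commands"
      - aws s3 cp --region=xx-xxxx-x "s3://file/in/s3" "local-file-instance-to-include"
artifacts:
  files:
    - '**/*'   # the glob now matches the copied file too, so it ships with the artifact
```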

AWS CodePipeline Action execution failed

I'm trying to hook my GitHub repo with S3 so every time there's a commit, AWS CodePipeline will deploy the ./<path>/public folder to a specified S3 bucket.
So far in my pipeline, the Source works (hooked to GitHub and picks up new commits) but the Deploy failed because: Action execution failed
BundleType must be either YAML or JSON.
This is how I set them up:
CodePipeline
Action name: Source
Action provider: GitHub
Repository: account/repo
Branch: master
GitHub webhooks
CodeDeploy
Compute type: AWS Lambda
Service role: myRole
Deployment settings: CodeDeployDefault.LambdaAllAtOnce
IAM Role: myRole
AWS Service
Choose the service that will use this role: Lambda / CodeDeploy
Select your use case: CodeDeploy
Policies: AWSCodeDeployRole
I understand that there must be a buildspec.yml file in the root folder. I've tried using a few files I could find but they don't seem to work. What did I do wrong or how should I edit the buildspec file to do what I want?
Update
Thanks to @Milan Cermak. I understand I need to do:
CodePipeline:
Stage 1: Source: hook with GitHub repo. This one is working.
Stage 2: Build: use CodeBuild to grab only the wanted folder using a buildspec.yml file in the root folder of the repo.
Stage 3: Deploy: use
Action Provider: S3
Input Artifacts: OutputArtifacts (result of stage 2).
Bucket: the bucket that hosts the static website.
CodePipeline works. However, the output contains only files (.html) not folders nested inside the public folder.
I've checked this and figured out how to remove the path of a nested folder with discard-paths: yes, but I'm unable to get all the sub-folders inside the ./<path>/public folder. Any suggestions?
CodeBuild uses a buildspec, but CodeDeploy uses an appspec.
Is there an appspec file?
You shouldn't use CodeDeploy, as that's a service for automation of deployments of applications, but rather CodeBuild, which executes commands and prepares the deployment artifact for further use in the pipeline.
These commands go in the buildspec.yml file (typically in the root directory of the repo, but this is configurable). For your use case, it won't be too complicated, as you're not compiling anything or running tests, etc.
Try this as a starting point:
  version: 0.2
  phases:
    build:
      commands:
        - ls
  artifacts:
    files:
      - public/*
The phases section is required, that's why it's included (at least, thanks to the ls command, you'll see what files are present in the CodeBuild environment), but it's not interesting for your case. What is interesting is the artifacts section. That's where you define what is the output of the CodeBuild phase, i.e. what gets passed further to the next step in the pipeline.
Depending on how you want to have the files structured (for example, do you want to have the public directory also in the artifact or do you only want to have the files themselves, without the parent dir), you might want to use other configuration that's possible in the artifacts section. See the buildspec reference for details.
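For example, to keep the nested sub-folders (the issue raised in the question's update), a recursive glob combined with base-directory might look like this sketch:

```yaml
# Sketch: pick up everything under public/, including nested sub-folders,
# and strip the leading "public/" prefix from paths inside the artifact.
artifacts:
  files:
    - '**/*'
  base-directory: public
```

Unlike `public/*`, the `**/*` pattern matches recursively, so sub-folders are preserved in the artifact.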
Remember to use the output artifact of the CodeBuild step as the input artifact of the Deploy to S3 step.
Buildspec is for CodeBuild as t_yamo pointed out.
You are using CodeDeploy which uses an appspec.yml file, which looks something like this for my config.
  version: 0.0
  os: linux
  files:
    - source: /
      destination: /path/to/destination
  hooks:
    BeforeInstall:
      - location: /UnzipResourceBundle.sh
    ApplicationStart:
      - location: /RestartServer.sh
        timeout: 3600
UnzipResourceBundle.sh is just a bash script which can be used to do any number of things.
  #!/bin/bash
  # Do something
You can find a sample for the AppSpec.yml file from Amazon Documentation here - https://docs.aws.amazon.com/codedeploy/latest/userguide/reference-appspec-file-example.html#appspec-file-example-lambda
CodePipeline recently announced a deploy to S3 action: https://aws.amazon.com/about-aws/whats-new/2019/01/aws-codepipeline-now-supports-deploying-to-amazon-s3/

Codepipeline: Insufficient permissions Unable to access the artifact with Amazon S3 object key

Hello, I created a CodePipeline project with the following configuration:
Source code in S3, pulled from Bitbucket.
Build with CodeBuild, generating a Docker image and storing it in an Amazon ECR repository.
Deployment provider: Amazon ECS.
The whole process works until it tries to deploy; for some reason I am getting the following error during deployment:
Insufficient permissions Unable to access the artifact with Amazon S3
object key 'FailedScanSubscriber/MyAppBuild/Wmu5kFy' located in the
Amazon S3 artifact bucket 'codepipeline-us-west-2-913731893217'. The
provided role does not have sufficient permissions.
During the build phase, it is even able to push a new Docker image to the ECR repository.
I tried everything: changed IAM roles and policies, added full access to S3, and even set the S3 bucket as public; nothing worked. I am out of options. If someone could help me, that would be wonderful; I have little experience with AWS, so any help is appreciated.
I was able to find a solution. The true issue is that when the deployment provider is set to Amazon ECS, we need to generate an output artifact indicating the name of the task definition and the image URI, for example:
  post_build:
    commands:
      - printf '[{"name":"your.task.definition.name","imageUri":"%s"}]' $AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com/$IMAGE_REPO_NAME:$IMAGE_TAG > imagedefinitions.json
  artifacts:
    files:
      - imagedefinitions.json
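To see what that printf actually emits, here is a standalone sketch you can run locally; the account ID, region, repo name, and tag are made-up stand-ins for the CodeBuild environment variables:

```shell
# Hypothetical values standing in for CodeBuild's environment variables.
AWS_ACCOUNT_ID=123456789012
AWS_DEFAULT_REGION=us-west-2
IMAGE_REPO_NAME=my-app
IMAGE_TAG=latest

# Same printf as in the post_build phase above: a one-element JSON array
# mapping the container name to the full ECR image URI.
printf '[{"name":"your.task.definition.name","imageUri":"%s"}]' \
  "$AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com/$IMAGE_REPO_NAME:$IMAGE_TAG" \
  > imagedefinitions.json

cat imagedefinitions.json
```

The container name in the JSON must match the container name inside the ECS task definition, not the task definition's own name as the placeholder might suggest.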
This happens when AWS CodeDeploy cannot find the build artifact from AWS CodeBuild. If you go into the S3 bucket and check the path, you will actually see that the artifact object is NOT THERE!
Even though the error talks about permissions, it can happen simply because the artifact object is absent.
Solution: properly configure the artifacts section in buildspec.yml, and configure the AWS CodePipeline stages properly, specifying input and output artifact names.
  artifacts:
    files:
      - '**/*'
    base-directory: base_dir
    name: build-artifact-name
    discard-paths: no
Refer to this article - https://medium.com/@shanikae/insufficient-permissions-unable-to-access-the-artifact-with-amazon-s3-247f27e6cdc3
For me, the issue was that my CodeBuild step was encrypting the artifacts using the default AWS-managed S3 key.
My deploy step uses a cross-account role, so it couldn't retrieve the artifact. Once I changed the CodeBuild encryption key to my CMK, as it should have been originally, my deploy step succeeded.

Can AWS CodePipeline trigger AWS CodeBuild without hijacking CodeBuild's artifact settings?

I set up AWS CodeBuild to write out a large number of artifact files into S3. The CodeBuild YAML file defines which files, and the CodeBuild project settings define the S3 bucket. All of this works correctly.
It appears that when you trigger AWS CodeBuild from AWS CodePipeline, CodePipeline ignores the artifact settings of AWS CodeBuild and instead forces the artifact into a zip file in a CodePipeline S3 bucket.
Is there a way to use CodePipeline but have it respect AWS CodeBuild's artifact settings?
CodeBuild also gives you access to the AWS CLI.
You can edit the buildspec.yml file and upload these artifacts to S3. You can also create a .sh file, give it the right execute permissions, and then use the shell script to upload the artifacts to an S3 bucket.
You will also need to give the right permissions to the S3 Bucket via the Service Role for CodeBuild.
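A minimal sketch of that approach, with the bucket, prefix, and output directory as placeholders; the pipeline still receives its zipped copy, but your own bucket gets the files laid out the way the buildspec wrote them:

```yaml
# Sketch: upload build output to your own bucket from post_build,
# independent of CodePipeline's zipped artifact store.
version: 0.2
phases:
  build:
    commands:
      - ./build.sh          # assumed build step producing ./output
  post_build:
    commands:
      - aws s3 sync ./output s3://my-artifact-bucket/builds/
```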