Can AWS CodePipeline trigger AWS CodeBuild without hijacking CodeBuild's artifact settings? - amazon-web-services

I set up AWS CodeBuild to write out a large number of artifact files to S3. The buildspec YAML defines which files to publish, and the CodeBuild project settings define the S3 bucket. All of this works correctly.
It appears that when you trigger AWS CodeBuild from AWS CodePipeline, CodePipeline ignores CodeBuild's artifact settings and instead forces the artifacts into a zip file in a CodePipeline S3 bucket.
Is there a way to use CodePipeline but have it respect AWS CodeBuild's artifact settings?

CodeBuild also gives you access to aws-cli.
You can edit the buildspec.yaml file and upload these artifacts to S3 yourself. You can also create a .sh file, give it the right execute permissions, and then use that shell script to upload the artifacts to the S3 bucket.
You will also need to grant the CodeBuild service role the right permissions on the S3 bucket.
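For example, a minimal buildspec.yml sketch of this approach (the bucket name, output directory, and build command are placeholders):

version: 0.2
phases:
  build:
    commands:
      - ./build.sh                    # your existing build step
  post_build:
    commands:
      # Upload the files yourself; the zipped artifact CodePipeline produces can then be ignored
      - aws s3 sync ./output s3://YOUR_ARTIFACT_BUCKET/ --acl bucket-owner-full-control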

Related

Send an artifact (zip file) to source stage on AWS CodePipeline through Jenkins

Is there any way of sending a zip file made by a Jenkins job to an AWS CodePipeline source stage, so the pipeline gets triggered and the artifact made by the Jenkins job is passed on to CodeBuild?
In your CodePipeline, create a source action (Action Provider -> Source -> Amazon S3) pointing to an S3 bucket and object key, and have Jenkins place your zip file at the configured S3 location.
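For example, as a shell step at the end of the Jenkins job (bucket and key are placeholders; note that an S3 source action requires versioning to be enabled on the bucket):

# Package the job's output and drop it where the S3 source action is watching
zip -r artifact.zip dist/
aws s3 cp artifact.zip s3://my-pipeline-source-bucket/jenkins/artifact.zip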

Move child folder content to an Amazon S3 bucket using AWS CodeDeploy

I have created the below-mentioned pipeline in AWS.
CodeCommit -> CodeBuild -> CodeDeploy
I am using the ng build command to build the Angular source code; it creates a folder named dist and puts all the build output there.
When deploying the build result to an Amazon S3 bucket, is it possible to move only the content inside the dist folder?
At the moment, all the source files are being transferred to the Amazon S3 bucket along with the dist folder.
Other than writing a Lambda function to achieve this, is there a shortcut to achieve this within the pipeline?
I wonder why you have used CodeDeploy for S3 copying. You can achieve the same using CodeBuild itself.
CodeCommit => CodePipeline => CodeBuild => S3
After creating the build (dist), add commands for S3 sync in your buildspec.yml file
A sample of the relevant buildspec.yml lines for your use case:
- ng build                    # unlike ng serve, ng build writes the compiled app to dist/ and exits
- ls -ltr
- cd dist
- aws s3 sync . s3://YOUR_BUCKET --acl bucket-owner-full-control
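Putting those commands into a complete buildspec (one reasonable layout, assuming the Angular CLI is available to the build; the bucket name is a placeholder):

version: 0.2
phases:
  install:
    commands:
      - npm install
  build:
    commands:
      - npx ng build              # compiled app lands in dist/
  post_build:
    commands:
      - cd dist
      - aws s3 sync . s3://YOUR_BUCKET --acl bucket-owner-full-control

Because the sync runs from inside dist, only the build output reaches the bucket; the rest of the repository is never copied.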

How to write an AWS CodePipeline only for updated files?

I am very new to AWS.
I have a task where I need to write a CodePipeline that copies files from CodeCommit to an S3 bucket, but the condition is that only updated/deleted files should be copied. I have already written a pipeline, but it copies all the files to the S3 bucket.
I tried writing AWS Lambda + Python code, but it behaves the same way. I used the code from the article below:
https://medium.com/@michael.niedermayr/using-aws-codecommit-and-lambda-for-automatic-code-deployment-to-s3-bucket-b35aa83d029b
Any help or suggestion is appreciated.
Instead of using the S3 deploy action to copy your files, use a CodeBuild action in your CodePipeline.
In the CodeBuild buildspec, use the 'aws s3 sync' command with the '--delete' switch to sync files to the S3 bucket. Make sure the CodeBuild service role has permission to write to the S3 bucket.
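A minimal buildspec.yml sketch of that (the bucket name is a placeholder):

version: 0.2
phases:
  build:
    commands:
      # Mirror the repository into the bucket; --delete also removes objects
      # whose source files no longer exist in the repo
      - aws s3 sync . s3://YOUR_BUCKET --delete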

How to delete files in S3 using CodePipeline

I have created a CodePipeline that takes input(files) from CodeCommit repository and deploys files to S3 bucket. When I delete files in CodeCommit repository, pipeline doesn't delete the file in the corresponding S3 bucket. In short, I want my S3 bucket to be in sync with CodeCommit repository. How can I achieve this?
My pipeline doesn't have a CodeBuild stage.
AWS-CodeCommit --> CodePipeline --> Deploy-to-Cross-Account-S3
Unfortunately, there is currently no predefined configuration of the S3 deploy action that can delete an S3 object from the pipeline itself (see the S3 deploy action docs).
The workaround for now is to add another action to the pipeline that uses CodeBuild to clean up the objects before/after deploying them.
You can add a CodeBuild stage of:
- aws s3 sync . s3://[bucket_name] --delete
Not sure how to make it work cross-account, though.

CodePipeline: Insufficient permissions - Unable to access the artifact with Amazon S3 object key

Hello, I created a CodePipeline project with the following configuration:
Source: code in S3, pulled from Bitbucket.
Build: CodeBuild, generating a docker image and storing it in an Amazon ECR repository.
Deployment provider: Amazon ECS.
The whole process works fine until it tries to deploy; for some reason I get the following error during deployment:
Insufficient permissions Unable to access the artifact with Amazon S3
object key 'FailedScanSubscriber/MyAppBuild/Wmu5kFy' located in the
Amazon S3 artifact bucket 'codepipeline-us-west-2-913731893217'. The
provided role does not have sufficient permissions.
During the build phase, it is even able to push a new docker image to the ECR repository.
I tried everything: changed IAM roles and policies, added full access to S3, even made the S3 bucket public. Nothing worked. I am out of options; if someone could help me that would be wonderful. I have little experience with AWS, so any help is appreciated.
I was able to find a solution. The real issue is that when the deployment provider is Amazon ECS, you need to generate an output artifact listing the container name (from the task definition) and the image URI, for example:
post_build:
  commands:
    - printf '[{"name":"your-container-name","imageUri":"%s"}]' $AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com/$IMAGE_REPO_NAME:$IMAGE_TAG > imagedefinitions.json
artifacts:
  files:
    - imagedefinitions.json
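Note that imagedefinitions.json is the file name the Amazon ECS standard deploy action looks for by default, and the name field must match the container name inside your task definition, not the task definition's own name.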
This happens when the deploy stage cannot find the build artifact from AWS CodeBuild. If you go into the S3 bucket and check the path, you may find that the artifact object is actually not there!
Even though the error talks about permissions, it can be caused simply by the artifact object being absent.
Solution: properly configure the artifacts section in buildspec.yml, and configure the AWS CodePipeline stages with the correct input and output artifact names.
artifacts:
  files:
    - '**/*'
  base-directory: base_dir
  name: build-artifact-name
  discard-paths: no
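Here base-directory sets the directory the files patterns are resolved against, so the artifact contains the build output rather than the whole workspace, and discard-paths: no preserves the directory structure inside the artifact.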
Refer to this article: https://medium.com/@shanikae/insufficient-permissions-unable-to-access-the-artifact-with-amazon-s3-247f27e6cdc3
For me the issue was that my CodeBuild step was encrypting the artifacts with the default AWS-managed S3 key.
My deploy step uses a cross-account role, so it couldn't retrieve the artifact. Once I changed the CodeBuild encryption key to my CMK, as it should have been originally, my deploy step succeeded.
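A sketch of that change from the CLI (the project name and key ARN are placeholders, and the cross-account role still needs kms:Decrypt on the key):

# Point the CodeBuild project at a customer-managed KMS key instead of the default aws/s3 key
aws codebuild update-project \
  --name my-build-project \
  --encryption-key arn:aws:kms:us-west-2:111122223333:key/EXAMPLE-KEY-ID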