How to write an AWS CodePipeline that copies only updated files?

I am very new to AWS.
I have a task where I need to write a CodePipeline that copies files from CodeCommit to an S3 bucket, but the condition is that only updated/deleted files should be copied. I have already written a pipeline, but it copies all the files to the S3 bucket.
I tried writing an AWS Lambda + Python function, but it behaves the same way. I used the code from this post:
https://medium.com/@michael.niedermayr/using-aws-codecommit-and-lambda-for-automatic-code-deployment-to-s3-bucket-b35aa83d029b
Any help or suggestions would be appreciated.

Instead of using the S3 deploy action to copy your files, use a CodeBuild action in your CodePipeline.
In the CodeBuild buildspec, run 'aws s3 sync' with the '--delete' switch so that deletions in the repository are mirrored to the S3 bucket. Make sure the CodeBuild service role has permission to write to the S3 bucket.
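A minimal buildspec for that CodeBuild action might look like this sketch (my-deploy-bucket is a placeholder bucket name; adjust the source path and excludes to your repository):

```yaml
version: 0.2

phases:
  build:
    commands:
      # sync uploads only new/changed files; --delete also removes
      # objects whose source files were deleted from the repository.
      - aws s3 sync . "s3://my-deploy-bucket" --delete --exclude ".git/*"
```

One caveat: CodePipeline extracts a fresh copy of the source artifact on every run, so file timestamps change and sync may still re-upload unchanged files; adding --size-only can reduce that, at the cost of missing same-size edits.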

Related

Move child folder content to an Amazon S3 bucket using AWS Code Deploy

I have created the below-mentioned pipeline in AWS.
CodeCommit -> CodeBuild -> CodeDeploy
I am using the ng build command to build the Angular source code; it creates a folder named dist and puts all the build output there.
When deploying the build result to an Amazon S3 bucket, is it possible to move only the content inside the dist folder?
At the moment, all the source files are being transferred to the Amazon S3 bucket along with the dist folder.
Other than writing a Lambda function to achieve this, is there a shortcut to achieve this within the pipeline?
I wonder why you have used CodeDeploy for S3 copying. You can achieve the same using CodeBuild itself.
CodeCommit => CodePipeline => CodeBuild => S3
After creating the build output (dist), add commands for the S3 sync to your buildspec.yml file.
Sample lines from a buildspec.yml for your use case:
- ng build
- ls -ltr
- cd dist
- aws s3 sync . s3://YOUR_BUCKET --acl bucket-owner-full-control
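For completeness, the full buildspec.yml could look roughly like the sketch below (assuming a standard Angular CLI project; YOUR_BUCKET remains a placeholder, and the exact ng build flags depend on your Angular version):

```yaml
version: 0.2

phases:
  install:
    commands:
      - npm install
  build:
    commands:
      # ng build (not ng serve, which starts a dev server and never
      # exits) writes the production bundle into dist/
      - node_modules/.bin/ng build
  post_build:
    commands:
      # Upload only the contents of dist/, not the whole source tree
      - cd dist
      - aws s3 sync . "s3://YOUR_BUCKET" --acl bucket-owner-full-control
```

Note that newer Angular CLI versions write to dist/&lt;project-name&gt; rather than dist/ directly, so the cd path may need adjusting.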

How to delete files in S3 using CodePipeline

I have created a CodePipeline that takes input (files) from a CodeCommit repository and deploys the files to an S3 bucket. When I delete files in the CodeCommit repository, the pipeline doesn't delete them from the corresponding S3 bucket. In short, I want my S3 bucket to stay in sync with the CodeCommit repository. How can I achieve this?
My Pipeline doesn't have CodeBuild stage.
AWS-CodeCommit --> CodePipeline --> Deploy-to-Cross-Account-S3
Unfortunately, there is currently no predefined configuration of the S3 deploy action that deletes S3 objects from the pipeline itself (see the S3 deploy action docs).
The workaround for now is to add another action to the pipeline that uses CodeBuild to clean up the objects before/after deploying them.
You can add a CodeBuild stage of:
- aws s3 sync . s3://[bucket_name] --delete
I'm not sure how to make it work cross-account, though.
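The cleanup stage's buildspec can be as small as the sketch below ([bucket_name] stays a placeholder). For the cross-account case, one approach is to have the CodeBuild service role assume a role in the target account and export the temporary credentials before running the sync, but that part is untested here:

```yaml
version: 0.2

phases:
  build:
    commands:
      # --delete removes S3 objects with no matching source file,
      # keeping the bucket in sync with the CodeCommit repository.
      - aws s3 sync . "s3://[bucket_name]" --delete
```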

Can I copy files from one aws bucket to another aws bucket without downloading to local machine?

I have some huge files in bucket1, and I need to copy some of them to bucket2. I know I could download the files from bucket1 to a local machine and then upload them to bucket2.
Can I skip the download and upload steps and ask Amazon to copy the files directly? Is this even possible?
Amazon S3 has API calls that can copy objects between buckets (even between regions), which does not involve any downloading.
The easiest method is to use the AWS Command-Line Interface (CLI), which has some useful commands:
aws s3 sync s3://bucket1/ s3://bucket2/
will synchronize files between the buckets so they have the same content.
aws s3 cp --recursive s3://bucket1/ s3://bucket2/
will do something similar, but lets you be more selective about which files are copied.
See: Using High-Level s3 Commands with the AWS Command Line Interface - AWS Command Line Interface
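For example, to copy only a subset of the objects server-side (nothing is downloaded to the local machine), you can combine --recursive with filter flags; the bucket and prefix names here are placeholders:

```shell
# Server-side copy of only the .csv objects under the data/ prefix.
# --exclude "*" first excludes everything, then --include re-adds matches.
aws s3 cp s3://bucket1/data/ s3://bucket2/data/ --recursive \
    --exclude "*" --include "*.csv"
```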

Deploying React Website to s3 Bucket

New-ish to AWS and very new to CI/CD. I have a question about deploying a React website to an S3 bucket.
I've got a Git repo that contains a React web app.
I've set up an AWS CodePipeline project, which polls the master branch of the repo looking for commits.
It then triggers the AWS CodeBuild project, which builds the React app as defined in the buildspec.yml.
In the example/tutorial I followed, the buildspec.yml had the following...
post_build:
  commands:
    - aws s3 cp dist s3://${S3_BUCKET} --recursive
...which copies the build output to the destination S3 Bucket.
It all works great; however, this assumes that the S3 bucket already exists.
Question: which step should be responsible for creating the destination S3 bucket, and what should I be using to do so?
I'm thinking that perhaps it should be CodeDeploy with another CloudFormation template.
Just after a little guidance before going down the wrong route :)
Many thanks.
Ok, so I think I found the tutorial you were referring to: https://www.karelbemelmans.com/2017/01/deploying-a-hugo-website-to-amazon-s3-using-aws-codebuild/
Can you specify two post_build commands? You could just create the bucket first, which might fail if the bucket already exists, but who cares, right? Or you could check whether the bucket exists and create it only if it doesn't.
Here's the s3 command you need to create a bucket:
https://docs.aws.amazon.com/cli/latest/reference/s3api/create-bucket.html
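A sketch of that check-then-create idea as an extra post_build command (S3_BUCKET is the same environment variable used in the buildspec above; managing the bucket with CloudFormation is usually the cleaner long-term option):

```shell
# head-bucket exits non-zero if the bucket is missing (or not yours),
# in which case mb creates it.
aws s3api head-bucket --bucket "${S3_BUCKET}" 2>/dev/null \
    || aws s3 mb "s3://${S3_BUCKET}"
```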
There's an API for list buckets but I can't post it because this new user doesn't have 10 reputation yet unfortunately.
Good luck,
-Asaf

Can AWS CodePipeline trigger AWS CodeBuild without hijacking CodeBuild's artifact settings?

I set up AWS CodeBuild to write out a large number of artifact files to S3. The CodeBuild buildspec defines which files, and the CodeBuild project settings define the S3 bucket. It all works correctly.
It appears that when you trigger AWS CodeBuild from AWS CodePipeline, CodePipeline ignores the artifact settings of CodeBuild and instead forces the artifacts into a zip file in a CodePipeline S3 bucket.
Is there a way to use CodePipeline but have it respect AWS CodeBuild's artifact settings?
CodeBuild also gives you access to the AWS CLI.
You can edit the buildspec.yml file and upload these artifacts to S3 yourself. You can also create a .sh file, give it the right execute permissions, and have that shell script upload the artifacts to the S3 bucket.
You will also need to grant the right permissions on the S3 bucket via the service role for CodeBuild.
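As a sketch, the manual upload described above sits naturally in post_build, sidestepping the zipped CodePipeline output artifact entirely (the build command, output path, and bucket name are all placeholders):

```yaml
version: 0.2

phases:
  build:
    commands:
      - ./build.sh
  post_build:
    commands:
      # Upload the artifact files yourself with the AWS CLI instead of
      # relying on CodePipeline's zipped output artifact.
      - aws s3 cp output/ "s3://my-artifact-bucket/" --recursive
```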