Deploying React Website to S3 Bucket - amazon-web-services

Newish to AWS, very new to CI/CD. I have a question about deploying a React website to an S3 bucket.
I've got a Git repo that contains a React web app.
I've set up an AWS CodePipeline project, which polls the master branch of the repo looking for commits.
It then triggers the AWS CodeBuild project, which builds the React app as defined in the buildspec.yml.
In the example/tutorial I followed, the buildspec.yml had the following:
post_build:
  commands:
    - aws s3 cp dist s3://${S3_BUCKET} --recursive
...which copies the build output to the destination S3 bucket.
It all works great; however, this assumes that the S3 bucket already exists.
Question: Which step should be responsible for creating the destination S3 bucket, and what should I be using to do so?
I'm thinking that perhaps it should be a CodeDeploy step with another CloudFormation template.
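For reference, a minimal sketch of the kind of CloudFormation template I mean (the bucket name is hypothetical):

AWSTemplateFormatVersion: '2010-09-09'
Resources:
  SiteBucket:
    Type: AWS::S3::Bucket
    Properties:
      # Hypothetical name; S3 bucket names must be globally unique.
      BucketName: my-react-site-bucket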
Just after a little guidance before going down the wrong route :)
Many thanks.

Ok, so I think I found the tutorial you were referring to: https://www.karelbemelmans.com/2017/01/deploying-a-hugo-website-to-amazon-s3-using-aws-codebuild/
Can you specify two post_build commands? You could simply create the bucket first, which might fail if the bucket already exists, but who cares, right? Or you could check whether the bucket exists and create it only if it doesn't.
Here's the s3 command you need to create a bucket:
https://docs.aws.amazon.com/cli/latest/reference/s3api/create-bucket.html
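For example, a rough sketch of the post_build section, assuming S3_BUCKET is set as an environment variable on the CodeBuild project (head-bucket exits non-zero when the bucket is missing or inaccessible, and s3 mb is the shorthand for creating a bucket):

post_build:
  commands:
    # Create the bucket only if the existence check fails.
    - aws s3api head-bucket --bucket "${S3_BUCKET}" || aws s3 mb "s3://${S3_BUCKET}"
    # Then copy the build output as before.
    - aws s3 cp dist "s3://${S3_BUCKET}" --recursive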
There's also an s3api command for listing buckets (list-buckets) that you could use for the existence check, but I can't post the link because this new user doesn't have 10 reputation yet, unfortunately.
Good luck,
-Asaf

Related

How to write AWS Code pipeline only for updated files?

I am very new to AWS.
I have a task where I need to write a CodePipeline to copy files from CodeCommit to an S3 bucket, but the condition is that only updated/deleted files should get copied. I have already written the pipeline, but it copies all the files to the S3 bucket.
I tried writing AWS Lambda + Python code, but it behaves the same way. I used the chunk of code from the link below:
https://medium.com/@michael.niedermayr/using-aws-codecommit-and-lambda-for-automatic-code-deployment-to-s3-bucket-b35aa83d029b
Any help or suggestion is appreciated.
Instead of using the S3 deploy action to copy your files, use a CodeBuild action in your CodePipeline.
In the CodeBuild buildspec, use the 'aws s3 sync' command with the '--delete' switch to sync files to the S3 bucket. Make sure the CodeBuild service role has permission to write to the S3 bucket.
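A minimal buildspec sketch of that approach, assuming a hypothetical TARGET_BUCKET environment variable on the CodeBuild project:

version: 0.2
phases:
  build:
    commands:
      # --delete removes objects from the bucket that no longer exist
      # in the source, keeping the bucket in sync with the repo.
      - aws s3 sync . "s3://${TARGET_BUCKET}" --delete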

Ember application into AWS S3 bucket

I am having issues with my dependencies when uploading an Ember frontend app into AWS S3, which hosts my website. Here are the 2 methods I have tried:
Uploading to S3 from Git via CircleCI
Uploading directly from the terminal into S3
These are the steps I used to push the app into S3:
1) ember install ember-cli-deploy-build
2) ember build
3) ember install ember-cli-deploy-s3
4) ember deploy production --verbose --active=true
Using these commands, it rebuilds the application before uploading. I was just wondering if there's a way I can upload all my local files exactly as they are into S3 and avoid the rebuild process (the app works perfectly locally before deploy)?
Any help would be greatly appreciated, thank you.
Just use the AWS CLI? Once authenticated you can do `aws s3 cp your-folder s3://bucket --recursive`
see https://docs.aws.amazon.com/cli/latest/reference/s3/cp.html
and to authenticate with mfa:
https://aws.amazon.com/premiumsupport/knowledge-center/authenticate-mfa-cli/
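Putting it together, a rough sketch for uploading an already-built app as-is (my-bucket is a hypothetical bucket name):

# Upload the existing dist/ output without rebuilding.
aws s3 cp dist s3://my-bucket --recursive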

deploy specific directory(tree) of github's repo to aws s3 bucket using aws code pipeline

I've a requirement where I need to deploy a specific tree of my GitHub repo to a folder in an AWS S3 bucket. By following the linked tutorial, I was able to deploy the whole repo to the S3 bucket, but I need to deploy only one directory (tree).
Say: I've a tree called foo inside myRepo on GitHub. I want to deploy foo to myBucket in AWS S3.
PS: foo contains folders, subfolders and files.
Kindly help, thanks in advance!
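A hedged sketch of a CodeBuild post_build step that deploys only that subtree, reusing the question's foo and myBucket names:

post_build:
  commands:
    # Sync only the foo/ tree into a foo/ prefix in the bucket;
    # subfolders and files are included automatically.
    - aws s3 sync foo s3://myBucket/foo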

How to delete files in S3 using CodePipeline

I have created a CodePipeline that takes input (files) from a CodeCommit repository and deploys the files to an S3 bucket. When I delete files in the CodeCommit repository, the pipeline doesn't delete them in the corresponding S3 bucket. In short, I want my S3 bucket to be in sync with the CodeCommit repository. How can I achieve this?
My pipeline doesn't have a CodeBuild stage.
AWS-CodeCommit --> CodePipeline --> Deploy-to-Cross-Account-S3
Unfortunately, there is currently no predefined configuration of the S3 deploy action that can delete an S3 object from the pipeline itself (see the S3 deploy action docs).
The workaround for now is to add another action to the pipeline that uses CodeBuild to perform the object cleanup before/after deploying them.
You can add a CodeBuild stage that runs:
- aws s3 sync . s3://[bucket_name] --delete
Not sure how to make it work cross-account, tho.

Codepipeline: Insufficient permissions Unable to access the artifact with Amazon S3 object key

Hello, I created a CodePipeline project with the following configuration:
Source code in S3, pulled from Bitbucket.
Build with CodeBuild, generating a Docker image and storing it in an Amazon ECR repository.
Deployment provider Amazon ECS.
All the process works ok until when it tries to deploy, for some reason I am getting the following error during deployment:
Insufficient permissions Unable to access the artifact with Amazon S3
object key 'FailedScanSubscriber/MyAppBuild/Wmu5kFy' located in the
Amazon S3 artifact bucket 'codepipeline-us-west-2-913731893217'. The
provided role does not have sufficient permissions.
During the build phase, it is even able to create a new Docker image in the ECR repository.
I tried everything: changed IAM roles and policies, added full access to S3, and even set the S3 bucket to public; nothing worked. I am out of options, so if someone could help me that would be wonderful. I have little experience with AWS, so any help is appreciated.
I was able to find a solution. The real issue is that when the deployment provider is set to Amazon ECS, we need to generate an output artifact indicating the name of the task definition and the image URI, for example:
post_build:
  commands:
    - printf '[{"name":"your.task.definition.name","imageUri":"%s"}]' $AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com/$IMAGE_REPO_NAME:$IMAGE_TAG > imagedefinitions.json
artifacts:
  files:
    - imagedefinitions.json
This happens when AWS CodeDeploy cannot find the build artifact from AWS CodeBuild. If you go into the S3 bucket and check the path, you will see that the artifact object is NOT THERE!
Even though the error message mentions a permission issue, it can happen because the artifact object is absent.
Solution: properly configure the artifacts section in buildspec.yml, and configure the AWS CodePipeline stages properly, specifying input and output artifact names.
artifacts:
  files:
    - '**/*'
  base-directory: base_dir
  name: build-artifact-name
  discard-paths: no
Refer to this article: https://medium.com/@shanikae/insufficient-permissions-unable-to-access-the-artifact-with-amazon-s3-247f27e6cdc3
For me, the issue was that my CodeBuild step was encrypting the artifacts using the default AWS-managed S3 key.
My deploy step uses a cross-account role, so it couldn't retrieve the artifact. Once I changed the CodeBuild encryption key to my CMK, as it should have been originally, my deploy step succeeded.
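A hedged CLI sketch of that fix (the project name and key ARN are hypothetical; --encryption-key sets the KMS key CodeBuild uses to encrypt output artifacts):

# Switch the build project from the AWS-managed S3 key to a
# customer-managed key that the cross-account deploy role can be
# granted decrypt permissions on.
aws codebuild update-project \
  --name my-build-project \
  --encryption-key arn:aws:kms:us-west-2:111111111111:key/example-key-id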