Continuous deployment for website into AWS S3 using Bitbucket Pipelines

I have a static website built with AngularJS and Bootstrap CSS. All the framework dependencies are managed with Bower.
Currently, I run bower install and then copy the site contents into an AWS S3 bucket, publishing it as a website. Every time I make even a tiny change to a file, I delete all the contents of the bucket and upload the code again (with the changes).
I am using Bitbucket as the version control system.
I want to cut out the manual bower install and upload steps; I'd rather have the website deployed as soon as I push the code to Bitbucket. What can be done? I have no knowledge of Bitbucket Pipelines.

The script below worked for me. Step 1 installs the Bower dependencies, and step 2 deploys the changes to the S3 bucket.
image: php:7.1.1

pipelines:
  default:
    - step:
        name: Install dependencies
        image: node:8.5.0
        caches:
          - node
        script:
          - npm install
          - npm install -g bower
          - bower install --allow-root
        artifacts:
          - bower_components/**
    - step:
        # set AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY as environment variables
        name: Deploy to S3
        deployment: test # set to test, staging or production
        # trigger: manual # uncomment to have a manual step
        image: atlassian/pipelines-awscli
        script:
          - aws s3 sync --delete . s3://www.tarkshala.com
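One caveat: aws s3 sync --delete . mirrors the entire clone into the bucket, including files like bitbucket-pipelines.yml itself. If you want to keep repository metadata out of the site, a variant of the deploy script using the CLI's --exclude flag could be used (the patterns below are examples, not from the original answer; adjust them to your repo):

        script:
          # sync the site, skipping repo and build metadata (example patterns)
          - aws s3 sync --delete . s3://www.tarkshala.com --exclude "bitbucket-pipelines.yml" --exclude ".git/*" --exclude "bower.json"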


How to CI/CD deploy static Dockerized React build files to S3

I currently have a React application with an AWS CodePipeline set up that does the following:
Detect changes in GitHub repository
Build the "build" files (with CodeBuild) using the buildspec.yaml file
Push "build" files to S3 bucket
The S3 bucket is configured to serve the static files to my domain.
This setup is great because it's cheap: I don't need an EC2 server always up and running to serve these static files, which would be completely unnecessary.
Recently, however, I've Dockerized this application, which is fantastic for me when I'm working on it from different machines.
Now that it's Dockerized, it seems like a better idea to have a Docker container build the "build" files and push them to the S3 bucket, to ensure that the files built on my machine are identical to the ones pushed to the S3 bucket.
Ideally I would like this all to be automated when I push to the repo, like it currently is.
I've seen a lot of tutorials about automating the creation of Docker images, pushing them to AWS ECR, and then using ECS (Fargate) to run the container. To me, though, this is just the same as running my app on an EC2 server... why would I want to do all this and then have a container continuously running on a server? Now it would just be an ECS server...
So what I'm asking is: how can I create an automated CI/CD pipeline that builds the static files using a Docker container and then pushes them to S3, as I currently have it?
Here is the current CodeBuild buildspec.yaml file for reference:
version: 0.2

phases:
  install:
    runtime-versions:
      nodejs: 12
    commands:
      # install yarn
      - npm install yarn
      # install dependencies
      - yarn
      # so that build commands work
      - yarn add eslint-config-react-app
  build:
    commands:
      # run build script
      - yarn build
artifacts:
  # include all files required to run application
  # we include only the static build files
  files:
    - '**/*'
  base-directory: 'build'
I figured this out. It is possible to do this without modifying the Source or Deploy sections of the CodePipeline. You do not need EC2, ECR, ECS, or Fargate.
You will modify the CodeBuild section of the pipeline to use a buildspec.yml file like this:
version: 0.2

phases:
  install:
    runtime-versions:
      docker: 19
    commands:
      # log in to docker account to prevent rate limiting
      - docker login -u $DOCKER_USERNAME -p $DOCKER_PASSWORD
      # build the Docker image for the application
      - docker build -t my-react-app:latest -f Dockerfile.prod .
  build:
    commands:
      # run container from built image (builds production files)
      - docker run my-react-app:latest
      # set container id to variable
      - CONTAINER=$(docker ps -alq)
      # copy build files from container to host
      - docker cp $CONTAINER:/app/build/ $CODEBUILD_SRC_DIR/build
artifacts:
  # include all files required to run application
  # we include only the static build files
  files:
    - "**/*"
  base-directory: "build"
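For context, the buildspec above assumes a Dockerfile.prod whose default command runs the production build inside /app, so that docker run leaves the output in /app/build for the docker cp step. A minimal sketch of such a file (hypothetical, assuming a standard create-react-app layout with yarn; the author's actual Dockerfile.prod may differ) could be:

# hypothetical Dockerfile.prod; assumes sources and yarn.lock at the repo root
FROM node:12
WORKDIR /app
# copy manifests first so the dependency layer is cached between builds
COPY package.json yarn.lock ./
RUN yarn install
# copy the source; `docker run` then produces the production build in /app/build
COPY . .
CMD ["yarn", "build"]

Because the container exits after yarn build finishes, docker cp can still read /app/build from the stopped container that docker ps -alq identifies.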
There are some additional details; I've written a blog post about them here:
https://ncoughlin.com/posts/aws-codepipeline-dockerized-react-s3/

AWS amplify deployment fails - You need to enable JavaScript to run this app

I have a React app that I'm trying to deploy automatically using AWS Amplify. I connected the repo, and the build and deployment seem to be successful.
But opening the URL shows "You need to enable JavaScript to run this app." in the console.
Building and serving the app locally using
$ npm run build
$ serve -s build
works fine.
I saw in the issue here that this might be about setting the "proxy" in package.json, but I'm not sure which port AWS Amplify uses, and adding the line from the answer there (using localhost:5000) doesn't work either.
Any ideas?
EDIT:
amplify.yml:
version: 1
frontend:
  phases:
    preBuild:
      commands:
        - npm ci
    build:
      commands:
        - npm run build
  artifacts:
    baseDirectory: build
    files:
      - '**/*'
  cache:
    paths:
      - node_modules/**/*

How to configure AWS Codebuild with Webpack

I have created an AWS CodePipeline that runs in four stages: 1) pull source code from GitHub, 2) deploy the backend to Elastic Beanstalk, 3) build the frontend code with CodeBuild (using the buildspec file below), and 4) deploy the results of the webpack build to S3.
Everything works as expected so far except for the results of stage 3. CodeBuild seemingly sets the artifacts to the source files rather than the results of the webpack build. When I look in the bucket and folder for the deployed code, I expect to see a series of JS asset files and a manifest.json. Instead, I see the project files. Not quite sure what I'm configuring wrong here.
buildspec.yml
version: 0.2

phases:
  install:
    runtime-versions:
      nodejs: 12
    commands:
      - echo Installing dependencies...
      - yarn
  build:
    commands:
      - echo Building project...
      - yarn build
  post_build:
    commands:
      - echo build completed on `date`
artifacts:
  files:
    - '**/*'
cache:
  paths:
    - '/root/.npm/**/*'
    - '/node_modules/'
(Screenshots of the webpack-build and webpack-deploy CodeBuild configurations omitted.)
After a few hours of troubleshooting, I was finally able to figure out what was going on.
Running yarn build on the project bundles everything into a /dist folder. The artifacts line, however, indicates that the files that should be uploaded to S3 are all of the project files. So the fix was as simple as updating **/* to dist/**/*.
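Concretely, the corrected artifacts section of the buildspec looks like this:

artifacts:
  files:
    - 'dist/**/*'

(Alternatively, keeping files as '**/*' and setting base-directory: 'dist' uploads the bundle contents without the dist/ prefix, which is what the Nuxt buildspec in the next question does.)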

How to upload a generated folder content into S3 using CodeBuild?

I am trying to configure a CodePipeline on AWS that takes my Nuxt website from GitHub, runs npm run generate to generate the static website, then uploads the dist folder to an S3 bucket.
Here is what my buildspec.yml looks like:
version: 0.2

phases:
  install:
    commands:
      - npm install
  build:
    commands:
      - npm run generate
  post_build:
    commands:
      - aws s3 sync dist $S3_BUCKET
The error I get is: The user-provided path dist does not exist. Does anyone know how to correct this? I've read a lot about artifacts but I've never used them before…
Thanks in advance,
You can use artifacts to upload the dist folder to S3. I would suggest not using a post_build command for this, because post_build commands run even when the build fails; this is a known limitation of CodeBuild. Just replace your buildspec with the following:
version: 0.2

phases:
  install:
    commands:
      - npm install
  build:
    commands:
      - npm run generate
artifacts:
  files:
    - '**/*'
  base-directory: 'dist'
'**/*' means it will upload all the files and folders under the base directory "dist". You then need to set your bucket name in the AWS console (browser).
Also make sure that your CodeBuild IAM role has sufficient permission to access your bucket.
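As a sketch of what that permission might look like, here is a minimal policy statement written in CloudFormation-style YAML (an assumption made only for illustration; the answer does not specify how the role is managed, the bucket name is a placeholder, and in the IAM console the same statement would be expressed as JSON):

# minimal S3 access for the CodeBuild role; my-site-bucket is a placeholder
PolicyDocument:
  Version: "2012-10-17"
  Statement:
    - Effect: Allow
      Action:
        - s3:GetBucketLocation
        - s3:ListBucket
      Resource: arn:aws:s3:::my-site-bucket
    - Effect: Allow
      Action:
        - s3:GetObject
        - s3:PutObject
      Resource: arn:aws:s3:::my-site-bucket/*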

circleci 2.0 can't find awscli

I'm using CircleCI 2.0 and it can't find aws, but their documents clearly say that aws is installed by default.
When I use this circle.yml:
version: 2
jobs:
  build:
    working_directory: ~/rian
    docker:
      - image: node:boron
    steps:
      - checkout
      - run:
          name: Pre-Dependencies
          command: mkdir ~/rian/artifacts
      - restore_cache:
          keys:
            - rian-{{ .Branch }}-{{ checksum "yarn.lock" }}
            - rian-{{ .Branch }}
            - rian-master
      - run:
          name: Install Dependencies
          command: yarn install
      - run:
          name: Test
          command: |
            node -v
            yarn run test:ci
      - save_cache:
          key: rian-{{ .Branch }}-{{ checksum "yarn.lock" }}
          paths:
            - "~/.cache/yarn"
      - store_artifacts:
          path: ~/rian/artifacts
          destination: prefix
      - store_test_results:
          path: ~/rian/test-results
      - deploy:
          command: aws s3 sync ~/rian s3://rian-s3-dev/ --delete
the following error occurs:
/bin/bash: aws: command not found
Exited with code 127
So if I edit the config this way:
- deploy:
    command: |
      apt-get install awscli
      aws s3 sync ~/rian s3://rian-s3-dev/ --delete
then I get another kind of error:
Reading package lists... Done
Building dependency tree
Reading state information... Done
E: Unable to locate package awscli
Exited with code 100
Does anyone know how to fix this?
The document you are reading is for CircleCI 1.0; the documentation for 2.0 is here:
https://circleci.com/docs/2.0/
In CircleCI 2.0, you can use your favorite Docker image. The image you are currently setting is node:boron, which does not include the aws command.
https://hub.docker.com/_/node/
https://github.com/nodejs/docker-node/blob/14681db8e89c0493e8af20657883fa21488a7766/6.10/Dockerfile
If you just want to make it work for now, you can install the aws command yourself in circle.yml.
apt-get update && apt-get install -y awscli
However, to take full advantage of Docker's benefits, it is recommended that you build a custom Docker image that contains the necessary dependencies such as the aws command.
You can write your custom aws-cli Docker image something like this:
FROM circleci/python:3.7-stretch
ENV AWS_CLI_VERSION=1.16.138
RUN sudo pip install awscli==${AWS_CLI_VERSION}
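With such an image built and pushed, the job can then call aws directly. A minimal circle.yml using it might look like this (the image name is a placeholder; use whatever tag you published your custom image under):

version: 2
jobs:
  build:
    working_directory: ~/rian
    docker:
      # placeholder image name for the custom aws-cli image above
      - image: your-dockerhub-user/circleci-awscli:latest
    steps:
      - checkout
      # aws is now available without any install step
      - deploy:
          command: aws s3 sync ~/rian s3://rian-s3-dev/ --delete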
I hit this issue when deploying to AWS Lambda functions and pushing files to an S3 bucket. I finally solved it and then built a Docker image to save time on installing the AWS CLI every time. Here are links to the repo and the image:
https://github.com/wilson208/circleci-awscli
https://hub.docker.com/r/wilson208/circleci-awscli/
Open a PR or file an issue if you need anything added to the image, and I will get to it when I can.
Edit:
Also, check out the README on GitHub for examples of deploying a package to Lambda or pushing files to S3.