Deploying an Angular 2 app built with webpack using Bitbucket - amazon-web-services

I have searched high and low for an answer to this question but have been unable to find one.
I am building an Angular 2 app that I would like to host in an S3 bucket. There will (possibly) be an EC2 backend, but that's another story. Ideally, I would like to be able to check my code into Bitbucket and, by some magic that eludes me, have S3, or EC2, or whatever notice via a hook that the source has changed. Of course the source would have to be built using webpack and the distributables deployed correctly.
Now this seems like a pretty straightforward request, but I can find no solution except something pertaining to WebDeploy, which I shall investigate right now.
Any ideas anyone?

Good news: AWS Lambda was created for exactly this.
You need to set up the following scenario and code to achieve your requirement.
1-Create a Lambda function. This function should do the following steps:
1-1- Clone your latest code from GitHub or Bitbucket.
1-2- Install grunt or another builder for your Angular app.
1-3- Install the node modules.
1-4- Build your Angular app.
1-5- Copy the new build to your S3 bucket.
1-6- Finish.
2-Create an AWS API Gateway with one resource and one method pointing to your Lambda function.
3-Go to your GitHub or Bitbucket settings and add a webhook pointing to your API Gateway.
4-Enjoy life with AWS.
;)
Benefits:
1-You are only charged when there is a new build.
2-No need for any machine or server (EC2).
3-You only maintain one function on Lambda.
For more info:
https://aws.amazon.com/lambda/
https://aws.amazon.com/api-gateway/
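A minimal sketch of the wiring for steps 1-3, written as an AWS SAM template (the template, function name, bucket name and runtime are my assumptions, not part of the answer; the handler itself would contain the clone/install/build/sync logic, and Lambda's 15-minute timeout and limited /tmp space constrain how big a build it can run):
```yaml
# template.yaml -- a sketch, not a drop-in solution; names are hypothetical.
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Resources:
  BuildAndDeployFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: index.handler        # handler performs the clone/install/build/s3-copy steps
      Runtime: nodejs18.x
      MemorySize: 2048
      Timeout: 900                  # builds are slow; 900 s is the Lambda maximum
      Policies:
        - S3CrudPolicy:
            BucketName: my-angular-site-bucket   # hypothetical target bucket
      Events:
        BitbucketWebhook:           # the URL Bitbucket's webhook will POST to
          Type: Api
          Properties:
            Path: /webhook
            Method: post
```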

S3 isn't going to listen for Git hooks and fetch, build, and deploy your code. Bitbucket isn't going to build and deploy your code to S3. What you need is a service that sits between Bitbucket and S3, is triggered by a Git hook, fetches from Git, builds, and then deploys your code to S3. You need to search for Continuous Integration/Continuous Deployment services, which are designed to do this sort of thing.
AWS has CodePipeline. You could set up your own Jenkins or TeamCity server. Or you could look into a service like CodeShip. Those are just a few of the many services out there that could accomplish this task. I think any of these services will require a bit of scripting on your part in order to get them to perform the actual webpack build and copy to S3.
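For example, if you went with CodePipeline plus CodeBuild, that bit of scripting could be as small as a buildspec like the sketch below (the bucket name, and the assumption that npm run build invokes webpack and emits dist/, are mine):
```yaml
# buildspec.yml -- a sketch; adjust the build command and bucket name to your project.
version: 0.2
phases:
  install:
    runtime-versions:
      nodejs: 18
  pre_build:
    commands:
      - npm ci
  build:
    commands:
      - npm run build                 # assumed to run webpack and emit ./dist
  post_build:
    commands:
      - aws s3 sync dist/ s3://my-angular-site-bucket --delete
```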

Cloud Build with GitLab at module level

I was working with GitHub and GCP (Cloud Build for deployments) and it was working well. Below are the steps:
Created multiple Cloud Functions and used the same GitHub repository.
Created a separate Cloud Build trigger for each Cloud Function, with a separate cloudbuild.yml in each Cloud Function's folder in the repository.
Each trigger runs when there are changes in the respective Cloud Function's scripts.
Now I need to integrate Cloud Build with GitLab.
I have gone through the documentation but found that a webhook is the only option, and the trigger will be based on changes to the whole repository. It would require a separate repository for each Cloud Function or Cloud Run service. There is no option to select the repository itself.
Can experts guide me on how I can do this integration? We are planning to have one repo with multiple services/applications stored in that repository, and we want CI to run in the GCP environment itself.
Personally, I found GitLab to be the worst of the three (compared to GitHub and Bitbucket) in terms of integration with GCP Cloud Build (to run the deployment within GCP).
I don't know of an ideal solution, but I have two ideas. Neither of them is good from my point of view.
1/ Mirror the GitLab repository into a GCP repository as described here - Mirroring GitLab repositories to Cloud Source Repositories. One of the biggest drawbacks from my point of view: the integration is based on personal credentials, and there has to be a person to keep it working -
Mirroring stops working if the Google Account is closed or loses access rights to the Git repository in Cloud Source Repositories
Once mirroring is in place, you can probably work with the GCP-based repository in the ordinary way and trigger Cloud Build jobs as usual. A separate question is how to provide deployment logs to those who initiated the deployment...
2/ Use webhooks. That does not depend on any personal accounts, but it is not very granular - as you mentioned, the push is at the whole-repository level. To overcome that limitation, there might be a very tricky (inline) yaml file executed by the Cloud Build trigger. In that yaml file, we should not only fetch the code, but also parse all the changes (all commits) in that push to find out which subdirectories (and thus which separate components - Cloud Functions) were potentially modified. Then, for each affected (modified) subdirectory, we can trigger (asynchronously) some other Cloud Build job (with a yaml file for it located inside that subdirectory).
An obvious drawback: it is not clear who should get the logs from all those deployments, and how, especially if something goes wrong; and developing (and managing) such a deployment process might be time/effort consuming and not easy.
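A rough sketch of what that inline webhook-triggered config could look like (the repository URL, clone depth, directory layout, and the per-function cloudbuild.yml convention are all assumptions; the change detection here just diffs the last commit, which is cruder than parsing every commit in the push):
```yaml
# Inline build config attached to the webhook trigger -- a sketch only.
steps:
  # A webhook trigger has no connected repository, so fetch the source explicitly.
  - name: gcr.io/cloud-builders/git
    args: ['clone', '--depth', '50', 'https://gitlab.com/my-group/my-repo.git', '.']
  # Find the top-level directories touched by the latest commit and kick off the
  # per-function build defined inside each one.
  - name: gcr.io/google.com/cloudsdktool/cloud-sdk
    entrypoint: bash
    args:
      - -c
      - |
        for dir in $(git diff --name-only HEAD~1 HEAD | cut -d/ -f1 | sort -u); do
          if [ -f "$dir/cloudbuild.yml" ]; then
            gcloud builds submit "$dir" --config "$dir/cloudbuild.yml" --async
          fi
        done
```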

What is the best way on AWS to set up CI/CD for a Django app from GitHub?

I have a Django web application which is not too large and uses the default database that comes with Django. It doesn't have a large volume of requests either - probably no more than 100 requests per second.
I wanted to figure out a method of continuous deployment on AWS from my source code residing in GitHub. I don't want to use the EB CLI to deploy to Elastic Beanstalk because it needs commands on the command line and is not automated deployment. I had tried setting up workflows for my app in GitHub Actions and had set up a web server environment in EB too, but it didn't seem to work. Also, I couldn't figure out the final URL to see my app from that EB environment. I am working on a Windows machine.
Please suggest the least expensive way of doing this, or share any videos/articles you may have which will get my app to finally be visible in the browser after deployment.
You can use AWS CodePipeline, a service that builds, tests, and deploys your code every time there is a code change, based on the release process models you define. Use CodePipeline to orchestrate each step in your release process. As part of your setup, you will plug other AWS services into CodePipeline to complete your software delivery pipeline.
https://docs.aws.amazon.com/whitepapers/latest/cicd_for_5g_networks_on_aws/cicd-on-aws.html
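As a concrete starting point, the build stage of such a pipeline could use a buildspec along these lines, with CodePipeline's deploy stage then handing the artifact to Elastic Beanstalk (the Python version, test step, and requirements file are assumptions about your project):
```yaml
# buildspec.yml for the CodeBuild stage -- a sketch, not a complete pipeline.
version: 0.2
phases:
  install:
    runtime-versions:
      python: 3.11
    commands:
      - pip install -r requirements.txt
  build:
    commands:
      - python manage.py test                       # run the Django test suite
      - python manage.py collectstatic --noinput    # gather static files for serving
artifacts:
  files:
    - '**/*'          # package the whole project for the Elastic Beanstalk deploy stage
```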

How to deploy nuxt frontend with express backend on AWS?

I have a Nuxt v2 SSR application as the frontend running on port 3000
I have an express API as the backend running on port 8000
I have a python script that loads data from external APIs and needs to run continuously
Currently they are all separate projects, each with its own package.json and so on.
How do I deploy this to AWS?
The only thing I have figured out so far is that I may have to deploy express API as an Elastic Beanstalk application.
Should I have a separate docker-compose file for each, because they are currently separate projects, or should I merge them into one project with a single docker-compose file?
I saw similar questions asked about React and could really appreciate some direction here for Nuxt.
None of these similar questions are based on Nuxt:
How to deploy separated frontend and backend?
How to deploy a React + NodeJS Express application to AWS?
How to deploy backend and frontend projects if they are separate?
There are a couple of approaches depending on your workload and budget. Let's cover what the options are and which ones apply to the work at hand.
Serverless Approach with Server-side rendering (SSR)
Tutorial
By creating an API Gateway route that invokes NuxtRenderer in a Lambda, the resulting generated HTML/JS/CSS is pushed to an S3 bucket. The S3 bucket acts as a CloudFront origin. Use the CDN to cache the least-updated items and use the API call to update the cache when needed. This is the cheapest way to deploy, but if you don't have much traffic, some customers may experience a brief lag when cache updates hit. Calculator
Serverless Approach via static generation
This is the easiest way to get going. All the routes that do not need dynamic template generation can simply live in an S3 bucket. The simplest approach is to run nuxt generate and then upload the resulting contents of dist to the bucket. There is no Node backend here, which is not what the question requires, but it's worth mentioning.
Well documented on NuxtJS.org, and it fits within the free tier.
Serverless w/ Elastic Beanstalk
IMO this approach is unnecessary and slightly dated given what AWS offers in 2022. It still works, of course, but the cost-to-benefit ratio isn't attractive.
To do this you need to use the eb command line tool and set NPM_CONFIG_UNSAFE_PERM=true. AFAIK there is nothing else special that needs to happen for eb to know what to do from there. Calculator
Server-ish Approach(es)
Another alternative is to use Lightsail and a NodeJS server. Lightsail offers low (but not lower than serverless) cost for a full-time NodeJS server. In this case you would clone your project to the server and then set up a systemd unit to keep NodeJS running.
Yet another way to do this, and get some scalability, is to use ECS and Docker. In this case you would create a Dockerfile that builds the container, executes npm start, and exposes the port Nuxt runs on to the host. This example shows how to run it using Fargate, which is essentially a serverless version of an EC2 machine. Calculator
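Since the question also asks whether the three projects should share one docker-compose file, a single compose file along these lines is one way to keep them as separate projects but run them together for the Docker/ECS route (the service names, ports, and directory layout are assumptions):
```yaml
# docker-compose.yml -- a sketch; each service keeps its own Dockerfile.
version: "3.8"
services:
  frontend:
    build: ./nuxt-app            # Nuxt SSR app
    ports:
      - "3000:3000"
    environment:
      - API_URL=http://api:8000  # reach the API over the compose network
    depends_on:
      - api
  api:
    build: ./express-api         # Express backend
    ports:
      - "8000:8000"
  loader:
    build: ./python-loader       # data-loading script
    restart: unless-stopped      # keep it running continuously
```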
There are a few ways for you to deploy your stack on AWS. I can give you some options, but your best shot if you want to save some costs is using Lambda functions as your backend, S3 for your frontend, and a scheduled Lambda job for your Python script.
For your backend - https://github.com/vendia/serverless-express
For your nuxt frontend - https://nuxtjs.org/deployments/amazon-web-services
For your python job - https://docs.aws.amazon.com/AmazonCloudWatch/latest/events/RunLambdaSchedule.html
None of this is trivial to set up, but with the links above you'll probably get an idea of how you can implement your solution.
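For the scheduled Python job in particular, the CloudWatch Events / EventBridge piece can be as small as a rule like the fragment below (the rule name, schedule, and the DataLoaderFunction Lambda it references are hypothetical, and a matching AWS::Lambda::Permission is still needed so the rule may invoke the function):
```yaml
# CloudFormation fragment -- assumes a Lambda resource named DataLoaderFunction exists.
DataLoaderSchedule:
  Type: AWS::Events::Rule
  Properties:
    ScheduleExpression: rate(15 minutes)   # run the data loader every 15 minutes
    Targets:
      - Arn: !GetAtt DataLoaderFunction.Arn
        Id: data-loader-target
```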

From GitLab CI/CD to AWS EC2

It's been some time that I've been trying to figure out a really easy way to do this.
I am using GitLab CI/CD and want to move the built files from there to AWS EC2. The problem is that I found two ways, both of which are really bad ideas:
Build the project on GitLab CI/CD, then SSH into AWS, pull the project from there again, and run the npm scripts. This is really wrong and I won't go into details why.
I saw the following: How to deploy with Gitlab-Ci to EC2 using AWS CodeDeploy/CodePipeline/S3, but it's so big and complex.
Isn't there any easier way to copy built files from GitLab CI/CD to AWS EC2?
I use GitLab as well, and what has worked for me is configuring my runners on EC2 instances. A few options come to mind:
I'd suggest managing your own runners (vs. shared runners), giving them permissions to drop built files in S3, and having your instances pick them up from there. You could trigger SSM commands from the runner targeting your instances (preferably by tags) and they'll download the built files.
You could also look into S3 notifications. I've used them to trigger Lambda functions on object uploads: it's pretty fast and offers retry mechanisms. The Lambda could then push SSM commands to instances. https://docs.aws.amazon.com/AmazonS3/latest/dev/NotificationHowTo.html
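A minimal sketch of the first option as a .gitlab-ci.yml (the bucket name, instance tag, deployment path, and the assumption that the runner and instances already have the required IAM permissions are all mine):
```yaml
# .gitlab-ci.yml -- a sketch; build output path, bucket, and tags are hypothetical.
stages:
  - build
  - deploy

build:
  stage: build
  image: node:18
  script:
    - npm ci
    - npm run build
  artifacts:
    paths:
      - dist/

deploy:
  stage: deploy
  image:
    name: amazon/aws-cli
    entrypoint: [""]            # override the image entrypoint so GitLab can run the script
  script:
    # Drop the built files in S3 for the instances to pick up.
    - aws s3 sync dist/ s3://my-artifact-bucket/my-app/
    # Tell the instances (targeted by tag) to pull the new build via SSM.
    - >
      aws ssm send-command
      --document-name "AWS-RunShellScript"
      --targets "Key=tag:Role,Values=web"
      --parameters 'commands=["aws s3 sync s3://my-artifact-bucket/my-app/ /var/www/my-app"]'
```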

CodeDeploy to S3

I have a site in an S3 bucket, configured for web access, for which I run an aws s3 sync command every time I push to a specific git repository (I'm using GitLab at the moment).
So if I push to the stable branch, a GitLab runner performs the npm start build command to build the site, and then aws s3 sync to synchronize it to a specific bucket.
I want to migrate to CodeCommit and use pure AWS tools to do the same.
So far I was able to successfully set up the repository and create a CodeBuild project for building the artifact, and the artifact is being stored (not deployed) in an S3 bucket. The difference is that I can't get it to deploy to the root folder of the bucket instead of a subfolder; it seems the process is not made for that. I need it to be in the root folder because of how the web access is configured.
For the deployment process, I was taking a look at CodeDeploy, but it doesn't actually let me deploy to an S3 bucket; it only uses the bucket as an intermediary for deployment to an EC2 instance. So far I get the feeling CodeDeploy is useful only for deployments involving EC2.
This tutorial, with a similar requirement to mine, uses CodePipeline and CodeBuild, but the deployment step is actually an aws s3 sync command (same as I was doing on GitLab), and the actual deployment step in CodePipeline is disabled.
I was looking for a solution which uses AWS features made for this specific purpose, but I can't find any.
I'm also aware of LambCI, but to me it looks like it does what CodePipeline / CodeBuild is doing: storing artifacts (not deploying to the root folder of the bucket). Plus, I'm looking for an option which doesn't require me to learn or deploy new configuration files (outside AWS config files).
Is this possible with the current state of AWS features?
AWS has today announced, as a new feature, the ability to target S3 in the deployment stage of CodePipeline. The announcement is here, and the documentation contains a tutorial available here.
Using your CodeBuild/CodePipeline approach, you should now be able to choose S3 as the deployment provider in the deployment stage rather than performing the sync in your build script. To configure the stage, you provide an S3 bucket name, specify whether to extract the contents of the artifact zip, and, if so, provide an optional path for the extraction. This should allow you to deploy your content directly to the root of a bucket by omitting the path.
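If you define the pipeline in CloudFormation, the deploy stage for this provider looks roughly like the fragment below (the bucket and artifact names are assumptions; setting Extract and omitting a path deploys the unzipped artifact to the bucket root):
```yaml
# Fragment of a pipeline's Stages list -- a sketch, not a full template.
- Name: Deploy
  Actions:
    - Name: DeployToS3
      ActionTypeId:
        Category: Deploy
        Owner: AWS
        Provider: S3
        Version: '1'
      InputArtifacts:
        - Name: BuildOutput          # output artifact of the CodeBuild stage
      Configuration:
        BucketName: my-website-bucket
        Extract: 'true'              # unzip the artifact into the bucket root
```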
I was dealing with a similar issue and, as far as I was able to find out, there is no service which is suitable for deploying an app to S3.
AWS CodeDeploy is indeed for deploying code that runs as a server.
My solution was to use CodePipeline with three stages:
Source, which takes the source code from AWS CodeCommit
Build, with AWS CodeBuild
A custom Lambda function which, after a successful build, takes the artifact from the S3 artifact storage, unzips it, and copies the files to my S3 website host.
I used this AWS Lambda function from SeamusJ: https://github.com/SeamusJ/deploy-build-to-s3
Several changes had to be made; I used node-unzip-2 instead of unzip-stream for unzipping the artifact from S3.
Also, I had to change the ACLs in the website.ts file.
Uploading from CodeBuild is currently the best solution available.
There are some suggestions on how to orchestrate this deployment via CodePipeline in this answer.