I have a Nuxt v2 SSR application as the frontend running on port 3000
I have an express API as the backend running on port 8000
I have a python script that loads data from external APIs and needs to run continuously
Currently they are all separate projects, each with its own package.json and whatnot.
How do I deploy this to AWS?
The only thing I have figured out so far is that I may have to deploy the Express API as an Elastic Beanstalk application.
Should I have a separate docker-compose file for each, since they are currently separate projects, or should I merge them into one project with a single docker-compose file?
I saw similar questions asked about React and would really appreciate some direction here for Nuxt. None of these similar questions are based on Nuxt:
How to deploy separated frontend and backend?
How to deploy a React + NodeJS Express application to AWS?
How to deploy backend and frontend projects if they are separate?
There are a couple of approaches depending on your workload and budget. Let's cover what the options are and which ones apply to the work at hand.
Serverless Approach with Server-side rendering (SSR)
Tutorial
Create an API Gateway route that invokes the Nuxt renderer in a Lambda. The resulting generated HTML/JS/CSS is pushed to an S3 bucket, and the S3 bucket acts as a CloudFront origin. Use the CDN to cache the least-updated items and use the API call to refresh the cache when needed. This is the cheapest way to deploy, but with little traffic some customers may experience a brief lag when cache updates hit. Calculator
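For concreteness, here is a minimal sketch of such a Lambda, assuming Nuxt 2's programmatic API and the serverless-http wrapper (the wrapper is my assumption, not part of the original approach):

```js
// Minimal sketch, assuming Nuxt 2's programmatic API and the
// serverless-http wrapper. API Gateway proxies every request here.
const serverless = require('serverless-http');
const express = require('express');
const { Nuxt } = require('nuxt');

const app = express();
const nuxt = new Nuxt({ dev: false }); // serve the pre-built .nuxt/ output

app.use(async (req, res) => {
  await nuxt.ready();    // make sure the renderer is up on cold starts
  nuxt.render(req, res); // the Nuxt renderer produces the HTML response
});

module.exports.handler = serverless(app);
```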
Serverless Approach via static generation
This is the easiest way to get going. All the routes that do not need dynamic template generation can simply live in an S3 bucket. The simplest approach is to run nuxt generate and then upload the resulting contents of dist to the bucket. There is no Node backend here, which doesn't meet the requirement in the question, but it's worth mentioning.
This is well documented on NuxtJS.org and fits in the free tier.
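If you'd rather script the upload than click through the console, a rough sketch with the AWS SDK for JavaScript v3 could look like the following (bucket name and region are placeholders; `aws s3 sync dist s3://your-bucket` from the AWS CLI does the same in one command):

```js
// Hedged sketch: upload the output of `nuxt generate` to S3 with the
// AWS SDK for JavaScript v3. Bucket and region are placeholders, and a
// real script would also set ContentType per file.
const { S3Client, PutObjectCommand } = require('@aws-sdk/client-s3');
const fs = require('fs');
const path = require('path');

const s3 = new S3Client({ region: 'us-east-1' }); // assumed region
const BUCKET = 'my-nuxt-site';                    // placeholder bucket

async function uploadDir(dir, prefix = '') {
  for (const entry of fs.readdirSync(dir, { withFileTypes: true })) {
    const full = path.join(dir, entry.name);
    const key = path.posix.join(prefix, entry.name);
    if (entry.isDirectory()) {
      await uploadDir(full, key); // recurse into subfolders
    } else {
      await s3.send(new PutObjectCommand({
        Bucket: BUCKET,
        Key: key,
        Body: fs.readFileSync(full),
      }));
    }
  }
}

uploadDir('dist').catch(console.error);
```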
Elastic Beanstalk
IMO this approach is unnecessary and slightly dated given what AWS offers in 2022. It still works, of course, but the cost-to-benefit ratio isn't attractive.
To do this you need to use the eb command-line tool and set NPM_CONFIG_UNSAFE_PERM=true. AFAIK nothing else special needs to happen for eb to know what to do from there. Calculator
Server-ish Approach(es)
Another alternative is to use Lightsail and a NodeJS server. Lightsail offers a low-cost (though not as low as serverless) full-time NodeJS server. In this case you would clone your project to the server and then set up a systemd unit to keep the Node process running.
Yet another way to do this, with some scalability, is to use ECS and Docker. In this case you would create a Dockerfile that builds the container, executes npm start, and exposes the port Nuxt runs on to the host. This example shows how to run it using Fargate, which is essentially a serverless version of an EC2 machine. Calculator
There are several ways to deploy your stack on AWS. I can give you some options, but your best shot at saving costs is using Lambda functions for your backend, S3 for your frontend, and a scheduled Lambda job for your Python script.
For your backend - https://github.com/vendia/serverless-express
For your nuxt frontend - https://nuxtjs.org/deployments/amazon-web-services
For your Python job - https://docs.aws.amazon.com/AmazonCloudWatch/latest/events/RunLambdaSchedule.html
None of this is trivial to execute, but the links above should give you an idea of how you might implement your solution; a minimal sketch of the serverless-express piece follows.
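Here is a minimal sketch of wiring an existing Express app to Lambda with the @vendia/serverless-express library linked above (the `./app` require path is an assumption about your project layout):

```js
// Minimal sketch: wire the existing Express app to Lambda with
// @vendia/serverless-express. The './app' require path is an
// assumption about the project layout.
const serverlessExpress = require('@vendia/serverless-express');
const app = require('./app'); // your Express app, without app.listen()

// Instead of app.listen(8000), export a Lambda handler:
exports.handler = serverlessExpress({ app });
```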
Related
I have a Django web application which is not too large and uses the default database that comes with Django. It doesn't have a large volume of requests either, probably not more than 100 requests per second.
I wanted to figure out a method of continuous deployment on AWS from my source code residing in GitHub. I don't want to use the EB CLI to deploy to Elastic Beanstalk because it needs commands in the command line and is not automated deployment. I tried setting up workflows for my app in GitHub Actions and set up a web server environment in EB too, but it didn't seem to work. Also, I couldn't figure out the final URL to see my app from that EB environment. I am working on a Windows machine.
Please suggest the least expensive way of doing this, or share any videos/articles you may have which will get me to my app finally being visible in the browser after deployment.
You can use AWS CodePipeline, a service that builds, tests, and deploys your code every time there is a code change, based on the release-process models you define. Use CodePipeline to orchestrate each step in your release process. As part of your setup, you will plug other AWS services into CodePipeline to complete your software delivery pipeline.
https://docs.aws.amazon.com/whitepapers/latest/cicd_for_5g_networks_on_aws/cicd-on-aws.html
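As one hedged illustration, such a pipeline can be defined in code with AWS CDK v2 in Node.js; the repo, branch, and build commands below are placeholders, and the GitHub source assumes a token in Secrets Manager (CDK's default secret name is "github-token"):

```js
// Hedged sketch: a self-mutating CodePipeline defined with AWS CDK v2.
// Repo, branch, and commands are placeholders.
const cdk = require('aws-cdk-lib');
const pipelines = require('aws-cdk-lib/pipelines');

const app = new cdk.App();
const stack = new cdk.Stack(app, 'PipelineStack');

new pipelines.CodePipeline(stack, 'Pipeline', {
  synth: new pipelines.ShellStep('Synth', {
    input: pipelines.CodePipelineSource.gitHub('your-user/your-repo', 'main'),
    commands: [
      'npm install -g aws-cdk', // CDK CLI for the synth step
      'npx cdk synth',
    ],
  }),
  // Deployment stages for the Django app itself would be added here
  // with pipeline.addStage(...).
});

app.synth();
```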
So, I have a separate Angular frontend and Node JS backend. After doing a bunch of research, I've decided to publish my frontend to AWS S3 and my backend to Heroku. Now, I want to understand how (or if) I can automate integration and deployment of both apps together, or whether I will have to do it separately. So either I keep CI/CD of the Angular side on S3 and the Node side on Heroku, or I somehow manage both apps' build/deploy automation at once.
Not sure, but here's a thought: if I deploy the frontend on S3 and the backend on Elastic Beanstalk, does that make it easier to manage, since AWS would somehow know to CI/CD both together?
Or, ultimately, I'd have to keep them both separate, and when I make changes to the Node side, the Angular side wouldn't even know to re-build and re-deploy.
Also, please don't suggest keeping them in one folder, because I need to keep them separate.
I'm even open to hearing about more technologies to host my backend and frontend other than those mentioned above, like Netlify or Azure (of course with explained advantages).
Well, there's no right or wrong way to implement this. Both approaches, keeping one shared pipeline or two separate ones, have their own pros and cons. In my approach you should have two pipelines: one for the backend and one for the frontend. Each of them should be independent of the other, so when one is triggered, the other doesn't care.
You can host the Node app on AWS Lambda or AWS Fargate+ECS. Don't use AWS CodePipeline, as it's a very bad CI solution; https://buildkite.com/, CircleCI, GitHub Actions, etc. are much better choices.
I have my services split into Cloud Run services, say e.g. in a blog application I have user, post and comment services.
For each service I get a separate http endpoint on deploy, but what I want is to have api.mydomain.com act as a gateway for accessing all of them via their respective routes (/user*, /post*, etc).
Is there a standard (i.e. GCP-managed and serverless-ey) way to do this?
Things I've tried/thought of and their issues:
Firebase hosting with rewrites - this is the 'suggested' solution, but it's not very flexible, and more problematically I think it leads to double-wrapping CDNs on every request. Correct me if I'm wrong, but Cloud Run endpoints already sit behind a CDN, and then Firebase Hosting runs through Fastly on top of that. It seems silly to needlessly add cost and latency like that.
nginx on a constantly running instance - works ok but not managed and not serverless; requires scaling interventions
nginx on Cloud Run - this seems like it would have highly variable performance since there are (a) two possible cold starts, and (b) again double wrapping CDN.
using Cloud LB/CDN directly - seemingly not supported with Cloud Run
Any ideas? For me this kind of makes Cloud Run unusable for microservices. Hopefully there's a way around it.
I searched online but can't find a good answer to what the key differences are between AWS Lambda and Heroku. I can, for example, write Node JS, but when should I use Lambda and when Heroku?
AWS Lambda is for creating server side apps based on serverless architecture, while Heroku is a platform as a service (PaaS) which can be used to build and run server apps.
Heroku is a service which provides tools to deploy, manage, and scale server applications.
You can use node.js, go, python, etc. for your server app which runs on a Heroku instance.
Here are the key differences:
On Lambda you pay for the computation time, while on Heroku you pay monthly.
Lambda instances are created on demand, while a Heroku instance is always running.
Heroku provides add-ons which are easy to use to integrate third party services, such as MailGun and Redis, in your service.
If your service is just going to do something simple, such as converting an image and saving it to an S3 bucket, you can use AWS Lambda; but if you are creating a complex service that does multiple different tasks, it is better to use Heroku and choose a language/framework which fits your needs.
You can learn more on serverless architecture here.
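To make the difference concrete, here is a hedged sketch of the same trivial task in both models (the /resize endpoint and handler body are invented for illustration):

```js
// Hedged illustration of the programming-model difference.

// On Heroku you run a long-lived server process:
const express = require('express');
const app = express();
app.get('/resize', (req, res) => res.send('resized'));
app.listen(process.env.PORT || 3000); // Heroku injects PORT

// On Lambda you export a handler that runs per invocation, and you
// are only billed while it runs:
exports.handler = async (event) => {
  // e.g. fetch the image from the event, convert it, save it to S3
  return { statusCode: 200, body: 'resized' };
};
```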
I have searched high and low for an answer to this question but have been unable to find one.
I am building an Angular 2 app that I would like hosted in an S3 bucket. There will be an EC2 (possibly) backend, but that's another story. Ideally, I would like to be able to check my code into Bitbucket, and by some magic that eludes me, I would like S3, or EC2, or whatever to notice via a hook, for instance, that the source has changed. Of course the source would have to be built using webpack and the distributables deployed correctly.
Now this seems like a pretty straightforward request, but I can find no solution except something pertaining to WebDeploy, which I shall investigate right now.
Any ideas anyone?
Good news: AWS Lambda was created for you.
You need to create the following scenario and code to achieve your requirement.
1-Create a Lambda function. This function should do the following steps:
1-1- Clone your latest code from GitHub or Bitbucket.
1-2- Install grunt or another builder for your Angular app.
1-3- Install node modules.
1-4- Build your Angular app.
1-5- Copy the new build to your S3 bucket.
1-6- Finish.
2-Create an AWS API Gateway with one resource and one method pointing to your Lambda function.
3-Go to your GitHub or Bitbucket settings and add a webhook pointing to your API Gateway endpoint.
4-Enjoy life with AWS.
;)
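A very rough sketch of the function in step 1 (hedged: git and npm are not in the standard Lambda runtime, so in practice you would need a layer or a container image, and every name below is a placeholder):

```js
// Very rough sketch of step 1. /tmp space and the 15-minute Lambda
// timeout limit how big a build this can handle.
const { execSync } = require('child_process');
const { S3Client, PutObjectCommand } = require('@aws-sdk/client-s3');
const fs = require('fs');
const path = require('path');

const s3 = new S3Client({});
const BUCKET = 'my-angular-site'; // placeholder

exports.handler = async () => {
  const work = '/tmp/build';
  // 1-1: clone the latest code (placeholder repo URL)
  execSync(`rm -rf ${work} && git clone https://bitbucket.org/you/repo.git ${work}`);
  // 1-2 / 1-3: install the builder and node modules
  execSync('npm install', { cwd: work });
  // 1-4: build the Angular app
  execSync('npm run build', { cwd: work });
  // 1-5: copy the new build to S3 (flat walk for brevity)
  const dist = path.join(work, 'dist');
  for (const f of fs.readdirSync(dist)) {
    await s3.send(new PutObjectCommand({
      Bucket: BUCKET,
      Key: f,
      Body: fs.readFileSync(path.join(dist, f)),
    }));
  }
  return { statusCode: 200 }; // 1-6: finish
};
```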
Benefits:
1-You are only charged when there is a new build.
2-No need for any machine or server (EC2).
3-You only maintain one function on Lambda.
For more info:
https://aws.amazon.com/lambda/
https://aws.amazon.com/api-gateway/
S3 isn't going to listen for Git hooks and fetch, build and deploy your code. BitBucket isn't going to build and deploy your code to S3. What you need is a service that sits in-between BitBucket and S3 that is triggered by a Git hook, fetches from Git, builds, and then deploys your code to S3. You need to search for Continuous Integration/Continuous Deployment services which are designed to do this sort of thing.
AWS has CodePipeline. You could set up your own Jenkins or TeamCity server. Or you could look into a service like CodeShip. Those are just a few of the many services out there that could accomplish this task. I think any of these services will require a bit of scripting on your part in order to get them to perform the actual webpack build and copy to S3.