Easier way to update files on Amazon EC2 instance? - amazon-web-services

Major newbie when it comes to Amazon EC2 servers and web development in general.
At the moment I have a web app that is hosted on Parse. Everything is done on the client side in the browser, and I want to change it to a client-server model by writing a server in Node.js.
I've looked into Amazon EC2 and set up an instance, and it looks good. My question, however, is:
Is there an easier way to update files on the instance? At the moment I'm pushing all the files from my computer to a GitHub repo and then pulling them onto the instance, which seems very long-winded. When using Parse, all I needed to type was 'parse deploy' into the command line to update and deploy my application. Is there something like this for Amazon EC2?
Thank you

I typically install or enable FTP on my EC2 instances and then just use the FTP client of my choice to update files.
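Worth noting: most FTP clients (FileZilla, Cyberduck, etc.) also speak SFTP, which runs over the SSH access the instance already has, so you don't strictly need to install an FTP server. A minimal command-line sketch of that route, with hypothetical key, host, and directory names:

    # Copy the local app directory to the instance over SSH (SFTP/scp), no FTP server needed
    scp -i ~/.ssh/my-key.pem -r ./my-app ec2-user@ec2-203-0-113-10.compute-1.amazonaws.com:/home/ec2-user/my-app
    # Or sync only the changed files on each update
    rsync -avz -e "ssh -i ~/.ssh/my-key.pem" ./my-app/ ec2-user@ec2-203-0-113-10.compute-1.amazonaws.com:/home/ec2-user/my-app/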

Related

AWS 3-tier architecture issues

Guys, I am trying to implement a 3-tier architecture to host a web app on AWS.
The requirements given to me are as follows.
The app will leverage a 3-tier architecture:
A Web Server that will be running on S3
An application tier running on ECS Cluster on Fargate or a fleet of EC2s with ASG (your choice)
A data tier running on RDS Aurora PostgreSQL latest supported version
I understand what to do for the 2nd and 3rd requirements, the application and database tiers.
What I don't get is the “web server running on S3”. Is it possible to have a web server on S3?
What I know is, I can have a web server running on EC2.
Please, I need some explanation here.
Yes and no. S3 is a static file host: if all you need is to send HTML, CSS, and JS files to the browser as-is, then absolutely, yes, S3 can be used as the file-serving web tier: https://docs.aws.amazon.com/AmazonS3/latest/userguide/WebsiteHosting.html
However, if your website does real-time HTML generation, something like SSR (server-side rendering), S3 won't cut it. S3 does not process the code in any way and only sends the files as-is to the frontend. In that case, you need a more traditional server on EC2/ECS/EKS.
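For the static case, a rough AWS CLI sketch of the S3 website setup (the bucket name and build directory are hypothetical; the bucket's Block Public Access settings and bucket policy also need to allow public reads):

    # Create the bucket, enable static website hosting, and upload the built files
    aws s3 mb s3://my-web-tier-bucket
    aws s3 website s3://my-web-tier-bucket --index-document index.html --error-document error.html
    aws s3 sync ./build s3://my-web-tier-bucket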

Should I use docker (nginx) for serving a SPA?

I only have one JavaScript file (bundle.js, packed by webpack) and one HTML file. It's basically an SPA.
I'm wondering how to host this SPA. I already have one clean VM on Amazon EC2.
I was planning to set up Docker (Nginx) on this EC2 instance. However, as I said, this VM is clean and only this SPA will use it.
So I'm confused by this situation. Should I use Docker (Nginx) or just install Nginx directly on the EC2 instance to serve this SPA?
The AWS S3 service is capable of serving static files. You just need to upload your files to a bucket, make them public, and note the public URL.
As a side note, containerizing apps and using a microservices architecture will give you advantages, some of them being:
Ease of Upgrade
Fault Containment
Ease of technology change
Increased Security
Efficient Resource Utilization
S3 is cheap enough for static files, almost free compared to EC2, unless you have a backend in place. You can use Cyberduck for S3, and if you want to go the FTP route one day, the same app gives you a common UX for uploading your files.
Though a Docker setup would be over-engineering for serving static files on IaaS, if you go that route you would need to build an image that contains nginx and your files, as in the KyleAMathews/docker-nginx project.
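If you do go the Docker route anyway, a minimal sketch with the stock nginx image is enough for an index.html plus bundle.js; the host directory below is a hypothetical location on the EC2 instance:

    # Serve the static files from the official nginx image's default docroot
    docker run -d --name spa -p 80:80 \
        -v /home/ec2-user/spa:/usr/share/nginx/html:ro nginx:alpine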

Should multiple ec2 instances share an EFS or should code be downloaded to instance on spin-up?

I am playing around with the idea of having an Auto Scaling Group for my website, which receives a lot of traffic. I need each server to be running an identical web service, so I have come up with several ideas to make this happen.
Idea 1: Use Code Commit + User Data
I will keep my web server code in a git repo in CodeCommit. Then, when my EC2 instances spin up, they will install apache2 and then pull from the git repo.
Idea 2: Use Elastic File System
After a server spins up, it will mount a central EFS volume that has my web server code on it. The instance will install apache2 and then serve the proper PHP files etc. from EFS.
Idea 3: Use AWS S3
Like the above with apache2, but download the web server code from S3.
Which option is advised? Why?
I suggest you keep a reference machine that is used for creating images. Keep it updated with the latest version of your code and, when you are happy with it, create an image (AMI) from it, create a launch configuration that uses that image, and change the ASG configuration so that it uses the new launch configuration. You can then stop the reference machine and leave the job to the ASG instances.
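A rough CLI sketch of that image-based rollout (all IDs and names below are hypothetical, and launch templates can be substituted for launch configurations):

    # 1. Bake an AMI from the up-to-date reference instance
    aws ec2 create-image --instance-id i-0123456789abcdef0 --name "web-2024-06-01" --no-reboot
    # 2. Create a launch configuration that uses the new AMI
    aws autoscaling create-launch-configuration --launch-configuration-name web-2024-06-01 \
        --image-id ami-0abcdef1234567890 --instance-type t3.micro --key-name my-key
    # 3. Point the Auto Scaling Group at it; newly launched instances use the new image
    aws autoscaling update-auto-scaling-group --auto-scaling-group-name my-web-asg \
        --launch-configuration-name web-2024-06-01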

How to deploy code from Bitbucket to an Amazon EC2 server

I have created a Laravel application and now I have to push my code to a staging server.
In my Amazon AWS account I have created an EC2 server. I'm looking for a simple way to do this and find it confusing.
I want to run this project on that server.
Can someone please point me in the right direction?
You can use a startup script (user data) on your EC2 instance.
Your startup script should do the following steps (a rough sketch follows the list):
1. Install the required software and services on the new machine.
2. Download or clone your latest app code from git and build it if necessary.
3. Download other assets, software, or data from an S3 bucket (Java, Tomcat, or WARs).
4. Configure and start the services.
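A rough user-data sketch along those lines for a Laravel app, assuming an Ubuntu instance and a hypothetical Bitbucket repository (package names, the PHP-FPM service name, and the .env handling will vary):

    #!/bin/bash
    # Step 1: install required software
    apt-get update -y
    apt-get install -y git nginx php-fpm php-mysql php-mbstring php-xml php-curl composer
    # Step 2: clone the latest app code; a private repo needs credentials provisioned separately
    git clone https://bitbucket.org/your-team/your-app.git /var/www/your-app
    cd /var/www/your-app
    composer install --no-dev
    cp .env.example .env && php artisan key:generate
    php artisan migrate --force
    # Step 3 (optional): pull extra assets from S3, e.g. aws s3 cp s3://my-assets-bucket/extra.tar.gz /tmp/
    # Step 4: configure and start services
    chown -R www-data:www-data storage bootstrap/cache
    systemctl restart nginx php8.1-fpm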
You should strongly consider using the AWS CodeDeploy service to manage the deployment of code on your instance.
It is easy to set up, and the service itself is free; you pay for instance usage only.
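For reference, a hedged sketch of what a CodeDeploy deploy looks like from the command line, once you have a CodeDeploy application, a deployment group, the agent running on the instance, and an appspec.yml in the project (names below are hypothetical):

    # Bundle the working directory, upload it to S3, and register it as a revision
    aws deploy push --application-name my-laravel-app \
        --s3-location s3://my-deploy-bucket/my-laravel-app.zip --source .
    # Roll the revision out to the instances in the deployment group
    aws deploy create-deployment --application-name my-laravel-app \
        --deployment-group-name staging \
        --s3-location bucket=my-deploy-bucket,key=my-laravel-app.zip,bundleType=zip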

Drupal with Amazon Web Services?

I'm not sure if this is the right place to ask, but this is the only site I know where I get my questions answered... anyways
I want to install Drupal, but where should I host it? Can Amazon Web Services host such an application, or do I need to go somewhere else to host it? I do have an account with InMotion Hosting, but I was thinking: if Amazon does the job, why not just use it? Any thoughts and opinions?
You can install Drupal on AWS EC2 if you have sysadmin experience. Otherwise you will need to use a managed platform, like Cloudways, for that. Configuring a web server like Apache or Nginx, a cache like Varnish or Memcached, and other features on AWS is a little difficult. Many managed platforms have those features available out of the box, so you don't have to configure anything or go through the long process of installing the application on AWS yourself.
Amazon Web Services (AWS) will host Drupal no problem.
The service you're looking for is Amazon Elastic Compute Cloud (Amazon EC2). It's pretty much equivalent to a private server with which you can do almost whatever you want (Web hosting included). The downside is that you have to do all the setup yourself.
If you don't know how to install Apache or configure your own Linux machine, you'd probably be better off with managed hosting where they'll set everything up for you.
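If you do take the do-it-yourself EC2 route, here is a hedged sketch of the initial setup on an Ubuntu instance (package names and paths are assumptions and vary by distro and Drupal version):

    sudo apt-get update
    sudo apt-get install -y apache2 mysql-server php libapache2-mod-php \
        php-mysql php-gd php-xml php-mbstring composer
    # Create the Drupal codebase with Composer, the officially documented method
    sudo composer create-project drupal/recommended-project /var/www/drupal
    # Point an Apache virtual host at /var/www/drupal/web, create a database,
    # then finish the installer in the browser.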
You can also just use AWS CloudFormation to set up your Drupal environment. It's a service that is part of AWS and will set up your stack for you. You may still need to know how to handle your config files, but at least you do not have to install the DB, Apache, etc. all manually.
http://aws.amazon.com/cloudformation/
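A minimal sketch of launching such a stack from the CLI, assuming you already have a Drupal CloudFormation template (the template file and parameter names here are hypothetical):

    aws cloudformation create-stack --stack-name drupal-stack \
        --template-body file://drupal-template.yaml \
        --parameters ParameterKey=KeyName,ParameterValue=my-key \
        --capabilities CAPABILITY_IAM
    # Watch progress until the stack reaches CREATE_COMPLETE
    aws cloudformation describe-stacks --stack-name drupal-stack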
Bitnami provides a free (Apache-licensed) pre-built Drupal image for AWS that you can launch easily. It is great for quickly testing something, but if you choose the right instance type for your expected load, it works for production as well (disclaimer: I am a cofounder of Bitnami, though as I mentioned the image is open source).
Drupal can be deployed and hosted automatically on Jelastic PaaS. You won't need to configure it from scratch. And if you wish to make some custom settings during installation, you can also easily install it manually. Both variants are described in the guide.
As a result, you'll get automatic scaling, pay-per-use pricing, management via an intuitive UI, a wide choice of local service providers from different countries, and other options to run your Drupal site effectively.