So I have a website that was still being served even though I did not have an EC2 instance running on the us-east-1 dashboard.
I did have a load balancer running. When I terminated the load balancer, the website was no longer served.
My question is this...
Even though I had a load balancer, there were no EC2 instances running. Where was the website being served from?
Doesn't an EC2 instance need to be running?
Not necessarily. First of all, check whether you have an instance running in a different AWS region. If not, your site could be running on several other AWS services, such as ECS or EKS, or it could be deployed serverless (for example, if the website is a single-page application built with React, Angular, or Vue).
So to answer your question: no, you don't need an EC2 instance running on AWS to host a website, and load balancers can be deployed in front of many other services that are not running on EC2.
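If you want to rule out the other-region case quickly, here is a minimal boto3 sketch (it assumes AWS credentials are already configured and only uses the standard describe_regions/describe_instances calls) that lists running EC2 instances in every region:

    import boto3

    # Any region works for listing regions; us-east-1 is just a starting point.
    ec2 = boto3.client("ec2", region_name="us-east-1")
    regions = [r["RegionName"] for r in ec2.describe_regions()["Regions"]]

    for region in regions:
        regional_ec2 = boto3.client("ec2", region_name=region)
        reservations = regional_ec2.describe_instances(
            Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
        )["Reservations"]
        running = [i["InstanceId"] for r in reservations for i in r["Instances"]]
        if running:
            print(region, running)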
Related
I am pretty new to AWS deployment (please send any helpful guides). I read that Elastic Beanstalk comes with an Elastic Load Balancer, but I've also heard that a lot of people put NGINX on an EC2 instance to use as a load balancer.
Do people commonly use one or the other? Having two seems redundant.
Nginx on an EC2 instance for load balancing would be a single point of failure: if that EC2 instance went down, your app would be down. An AWS load balancer is actually multiple load balancer nodes distributed across multiple AWS Availability Zones to provide high availability. The EC2 instance would also be something you have to manage yourself, whereas an AWS load balancer is managed for you by Amazon.
You mention Elastic Beanstalk in your question title. Elastic Beanstalk will use both. It uses a Load Balancer for distributing traffic across multiple instances, and it uses Nginx on each instance as a reverse proxy to your application.
Recently somebody manually deleted all Elastic Load Balancers on an AWS account I am working with. All the load balancers had been provisioned from Elastic Beanstalk configs.
I rebuilt all the Elastic Beanstalk instances from previous configs to restore the deleted load balancers. The various applications are now running correctly apart from two, which are failing to send traffic to each other. I will call them App A and App B.
App A is sending traffic to App B using its Elastic Beanstalk URL; however, the messages are failing to send. If I SSH into App A, I can manually send JSON messages to App B using curl and the EC2 private IP. When I ping the EB URL from App B, it shows me an IP I do not recognise, which is not allocated to any EC2 instance running on the account.
App B is in a private subnet with a network load balancer.
How can I get the Elastic Beanstalk URL to point at the correct IP?
I have recently inherited this environment and did not configure the original setup, so perhaps I am missing a step or aspect of how AWS Elastic Beanstalk is intended to work in this regard.
Additionally, I am certain this is not a programming error (the code has not changed since the instances were rebuilt) or a firewall setting, as I am able to manually send traffic and get a response with a curl script.
It is the Beanstalk URL that appears to be incorrect.
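To make the mismatch concrete, here is a hedged boto3/DNS sketch (the environment name "app-b-env" is a placeholder) that compares what the Beanstalk CNAME currently resolves to with the DNS names of the load balancers actually present in the account:

    import socket
    import boto3

    eb = boto3.client("elasticbeanstalk")
    elbv2 = boto3.client("elbv2")

    # What does App B's Elastic Beanstalk URL resolve to right now?
    env = eb.describe_environments(EnvironmentNames=["app-b-env"])["Environments"][0]
    print("EB CNAME:", env["CNAME"], "->", socket.gethostbyname(env["CNAME"]))

    # What do the load balancers in the account resolve to?
    for lb in elbv2.describe_load_balancers()["LoadBalancers"]:
        print(lb["LoadBalancerName"], lb["DNSName"], "->",
              socket.gethostbyname(lb["DNSName"]))

If the CNAME does not resolve to the new network load balancer, the DNS record (or a stale reference baked into App A's configuration) is the likely culprit.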
I have an Elastic Beanstalk instance that is running a Flask app. I want to know if there is any way through AWS to automatically block IP addresses that are doing unusual activity on my site.
This could be a range of things, for example:
Sending several GET requests over and over
Trying to POST without a CSRF token
And more. Any ideas? Thanks.
Generally, for that you would front your EB environment with an Application Load Balancer and AWS Web Application Firewall (WAF); see the sketch after the links below.
This setup is documented in the AWS blog and other sources:
How do I protect my Elastic Beanstalk environment against attacks from known unwanted hosts?
Setting up AWS Web Application Firewall (WAF) with Elastic Beanstalk
Guidelines for Implementing AWS WAF
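As a rough illustration of the WAF side, here is a minimal boto3 sketch (all names and the load balancer ARN are placeholders) of a rate-based rule that blocks any single IP exceeding 2,000 requests per five minutes and attaches the web ACL to the ALB in front of the environment; CSRF validation itself still belongs in the Flask app:

    import boto3

    wafv2 = boto3.client("wafv2", region_name="us-east-1")

    acl = wafv2.create_web_acl(
        Name="eb-protection",
        Scope="REGIONAL",  # REGIONAL for ALBs; CLOUDFRONT would be for a CDN distribution
        DefaultAction={"Allow": {}},
        Rules=[{
            "Name": "rate-limit-per-ip",
            "Priority": 1,
            "Statement": {"RateBasedStatement": {"Limit": 2000, "AggregateKeyType": "IP"}},
            "Action": {"Block": {}},
            "VisibilityConfig": {
                "SampledRequestsEnabled": True,
                "CloudWatchMetricsEnabled": True,
                "MetricName": "RateLimitPerIP",
            },
        }],
        VisibilityConfig={
            "SampledRequestsEnabled": True,
            "CloudWatchMetricsEnabled": True,
            "MetricName": "EbProtection",
        },
    )

    # Placeholder ARN of the application load balancer in front of the EB environment.
    wafv2.associate_web_acl(
        WebACLArn=acl["Summary"]["ARN"],
        ResourceArn="arn:aws:elasticloadbalancing:us-east-1:123456789012:"
                    "loadbalancer/app/my-alb/0123456789abcdef",
    )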
I am creating the first betas of a project. I need a Spring Boot server connecting to MongoDB in AWS.
MongoDB is already deployed as a replica set across different EC2 instances. I was exploring AWS Elastic Beanstalk as an environment to deploy the Spring Boot server. However, I am not yet ready to deploy a load balancer, because it is costly.
I am looking for the way to deploy a Single-Instance Environment (https://docs.aws.amazon.com/elasticbeanstalk/latest/dg/using-features-managing-env-types.html?ref_=pe_395030_31184250_9#single-instance-environ) in a VPC (which would be shared with MongoDB).
Does AWS Elastic Beanstalk allow you to configure an EC2 instance within a VPC but without a load balancer?
If not, I am planning to deploy an EC2 instance in the VPC myself without Beanstalk.
Another temporary solution would be accessing MongoDB over the internet, with the right security group rules, but I do not think that is good practice at all, so I am not considering it.
All Elastic Beanstalk environments are in a VPC, unless you have a really old AWS account that still supports EC2 classic. What you are looking for is the EB Single-Instance Environment type.
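For illustration, here is a minimal boto3 sketch (application name, solution stack, and VPC/subnet IDs are placeholders) of creating such a single-instance environment inside an existing VPC; no load balancer is created for this environment type:

    import boto3

    eb = boto3.client("elasticbeanstalk")

    eb.create_environment(
        ApplicationName="my-springboot-app",
        EnvironmentName="my-springboot-single",
        # Placeholder; list currently supported stacks with eb.list_available_solution_stacks().
        SolutionStackName="64bit Amazon Linux 2 v3.4.9 running Corretto 11",
        OptionSettings=[
            {"Namespace": "aws:elasticbeanstalk:environment",
             "OptionName": "EnvironmentType", "Value": "SingleInstance"},
            {"Namespace": "aws:ec2:vpc",
             "OptionName": "VPCId", "Value": "vpc-0123456789abcdef0"},
            {"Namespace": "aws:ec2:vpc",
             "OptionName": "Subnets", "Value": "subnet-0123456789abcdef0"},
        ],
    )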
I have a Spring Boot application deployed on an EC2 server, which runs fine.
But after a certain amount of time, the task is killed automatically. I have tried to search for a solution but couldn't find any, or maybe I didn't know exactly what to search for. Hence the question on SO.
Is there anything I can implement that will keep pinging the instance and keep it alive?
If yes, how do I implement it? And are there any other drawbacks to consider in doing this?
Option 1. Elastic Container Service
Dockerize your app. Create an ECS cluster and publish your service. Your Docker container will still run on an EC2 instance, but ECS will automatically keep the desired number of app instances alive. If your container gets killed (because the main Java process got killed), ECS will automatically restart the container.
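As a rough sketch of the ECS side (the cluster and the registered task definition are assumed to already exist, and all names are placeholders), the service's desiredCount is what makes ECS restart the container when the Java process dies:

    import boto3

    ecs = boto3.client("ecs")

    ecs.create_service(
        cluster="my-cluster",
        serviceName="springboot-service",
        taskDefinition="springboot-task:1",  # previously registered task definition revision
        desiredCount=1,                      # ECS keeps one copy of the task running at all times
        launchType="EC2",
    )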
Option 2. Elastic Beanstalk
Publish your Java app with Elastic Beanstalk and it will automatically monitor the "health" of your web app and replace "unhealthy" EC2 instances. This only works for web apps, since Elastic Beanstalk does HTTP(S) checks by calling a URL you specify.
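For illustration, a small boto3 sketch (the environment name is a placeholder, and /actuator/health assumes Spring Boot Actuator is on the classpath) of setting the health check URL that Elastic Beanstalk will call:

    import boto3

    eb = boto3.client("elasticbeanstalk")

    eb.update_environment(
        EnvironmentName="my-springboot-env",
        OptionSettings=[{
            "Namespace": "aws:elasticbeanstalk:application",
            "OptionName": "Application Healthcheck URL",
            "Value": "/actuator/health",  # path EB will probe over HTTP(S)
        }],
    )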
Option 3. Use Elastic Load Balancer with Auto Scaling Group
The ELB can do HTTP(S) checks on your app. If an instance doesn't respond, it will be automatically replaced. You will need to create an AMI or a cloud-init script to bootstrap new instances.
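Here is a minimal boto3 sketch of that setup (launch template, target group ARN, and subnet IDs are placeholders); HealthCheckType="ELB" is what makes the Auto Scaling group terminate and replace instances that fail the load balancer's checks:

    import boto3

    autoscaling = boto3.client("autoscaling")

    autoscaling.create_auto_scaling_group(
        AutoScalingGroupName="springboot-asg",
        LaunchTemplate={"LaunchTemplateName": "springboot-template", "Version": "$Latest"},
        MinSize=1,
        MaxSize=2,
        DesiredCapacity=1,
        TargetGroupARNs=[
            "arn:aws:elasticloadbalancing:us-east-1:123456789012:"
            "targetgroup/springboot-tg/0123456789abcdef"
        ],
        HealthCheckType="ELB",        # use the load balancer's HTTP(S) health checks
        HealthCheckGracePeriod=300,   # give new instances time to boot the app
        VPCZoneIdentifier="subnet-0123456789abcdef0,subnet-0fedcba9876543210",
    )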