Running a Docker loop device on AWS

I'm new to AWS and am having some issues getting my mobile app running again. Forgive me if this question seems vague.
For a school project we created a mobile app on AWS and deployed it using Docker containers (another student managed these tasks). While trying to get my own key pair to SSH into my EC2 instance, I detached the volume associated with my instance and reattached it after getting my own key pair. Now I can SSH into my instance, but my front end can't talk to my web server.
So my question is: do I create a new application on Elastic Beanstalk to deploy my app? Even though, when I run lsblk, it shows I have a Docker loop device, and when I run docker images I see several images that match the name of my application? Or do I somehow get the container running again? docker run doesn't seem to be working.

No need, just upload a new update to Elastic Beanstalk. AWS will handle the rest.
FYI, here is the Elastic Beanstalk single Docker container update process (simplified, under the hood); a short CLI sketch follows the steps:
You upload the update to AWS.
AWS will put it in your S3 bucket.
Inside your EC2 instance, there is an Elastic Beanstalk agent. It will check for a new update.
If there is an update, the agent will download the update file and extract it.
The agent will build a new Docker image.
If the build succeeds, it will generate a new config to point Nginx (the web proxy) at the new web server container.
Nginx will be reloaded.
Your old Docker container will be destroyed.
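As mentioned, a rough CLI sketch of triggering such an update; the application, environment, and bucket names below are placeholders, and it assumes the EB CLI or AWS CLI is installed and configured:

# With the EB CLI, from a project directory previously set up with eb init:
eb deploy my-env

# Or with the plain AWS CLI: upload the bundle, register it as a new application version, and roll it out
aws s3 cp app-v2.zip s3://my-eb-bucket/app-v2.zip
aws elasticbeanstalk create-application-version \
    --application-name my-app \
    --version-label v2 \
    --source-bundle S3Bucket=my-eb-bucket,S3Key=app-v2.zip
aws elasticbeanstalk update-environment \
    --environment-name my-env \
    --version-label v2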
Don't change anything inside the EC2 instances of an Elastic Beanstalk environment unless you know what you are doing. Elastic Beanstalk is designed to automate deployment and scaling, so anything you change in your EC2 instance manually might be lost. Of course, you can modify your EC2 instance, but you need to automate the change using .ebextensions or bake it into an image.

Related

Running an EC2 instance from another server provider's existing image

I have installed resources on a Scaleway server (scaleway.com, a French server and cloud provider). This server uses Ubuntu as its OS and runs a Python app with many installations and configs.
In Scaleway, you can run a new server from a server image you created earlier or one that already exists. In EC2, too, you can launch an AWS EC2 instance from an existing AMI: https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/AMIs.html.
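For reference, launching an instance from an existing AMI is a single AWS CLI call; the AMI ID, key pair, and security group here are placeholders:

aws ec2 run-instances \
    --image-id ami-0123456789abcdef0 \
    --instance-type t3.micro \
    --key-name my-key-pair \
    --security-group-ids sg-0123456789abcdef0 \
    --count 1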
But is there a way to launch an EC2 instance from a Scaleway image? It would save a lot of time in deploying the app.
Thanks.
It seems that there is no way:
https://www.reddit.com/r/Scaleway/comments/jwha9q/is_there_a_way_to_download_an_image/

How to deploy multiple services to AWS ECS?

I have a docker-compose setup made up of 2+ containers that I use for local development. I'm trying to deploy it to AWS ECS; I successfully deployed the first service, but I'm struggling to deploy more. It seems like I'm missing something in the configuration.
To deploy the first service to ECS, I followed these steps:
Deployed docker image to ECR
Created an ECS cluster
Created a task that I attached my ECR image to
Configured a VPC
Configured ELB
Successfully launched a new instance inside my ECS cluster
To deploy the second service, I'm trying to follow steps 1, 3 and 6, and it seems like it's deployed successfully. However, when I try to hit any endpoint of the second service, it results in a 404 from the first one.
As I understand it, my current ELB configuration routes all traffic to the first instance, so my question is: which steps should I also repeat while deploying the 2nd service? Should it have a separate ELB and security group?
I tried googling it, but all the articles are about deploying a single instance, which I had no problems with.
You don't need to create a cluster for the second service. You need to create another task definition with the new image of the second service (deployed to ECR). The next step is to create a service in the cluster and set up Service Discovery in the Configure network step.
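A rough sketch of those two steps with the AWS CLI, assuming the second image is already in ECR; the cluster, service, task definition file, and registry ARN are placeholders, and the Service Discovery registry is what the console's Configure network step sets up for you:

# Register a task definition for the second service (the JSON file points at the ECR image)
aws ecs register-task-definition --cli-input-json file://second-service-taskdef.json

# Create the service in the existing cluster, attached to a Cloud Map (Service Discovery) registry
aws ecs create-service \
    --cluster my-cluster \
    --service-name second-service \
    --task-definition second-service:1 \
    --desired-count 1 \
    --service-registries registryArn=arn:aws:servicediscovery:us-east-1:123456789012:service/srv-xxxxxxxx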
Documentation
You might want to consider this: https://docs.docker.com/engine/context/ecs-integration/. It's still in beta but it can deploy your compose file directly to ECS for you.
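A minimal sketch of how that integration is used, assuming a recent Docker release with the ECS context feature and AWS credentials already configured (the context name is a placeholder):

# Create a Docker context backed by ECS and switch to it
docker context create ecs myecscontext
docker context use myecscontext

# Bring the compose file up on ECS instead of locally
docker compose up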

AWS ECS repository creation with EC2

Hello,
I have a running EC2 instance with applications installed, and I want to migrate the app into a Docker container. So, can we create an ECS Docker repository using a running EC2 instance?
The biggest part of your question is dockerizing your apps. The complexity varies with the technologies used in the app, the app server as a Docker image, and many other things that come into the picture. It's not a direct conversion; depending on the app's complexity, you may need to do a lot of work to turn your apps into Docker images.
ECR is a Docker registry where you store Docker images; you don't need an EC2 instance for that.
Once you have your Docker images in ECR, you need to create an ECS cluster to run your app's Docker containers.
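As a sketch, pushing a dockerized app's image to ECR looks roughly like this; the account ID, region, and repository name are placeholders:

# Create the ECR repository (one time)
aws ecr create-repository --repository-name my-app

# Authenticate the local Docker client against ECR
aws ecr get-login-password --region us-east-1 | \
    docker login --username AWS --password-stdin 123456789012.dkr.ecr.us-east-1.amazonaws.com

# Build, tag, and push the image
docker build -t my-app .
docker tag my-app:latest 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app:latest
docker push 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app:latest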

How to connect to specific instance behind Elastic Load Balancer

I'm deploying my app via Elastic Beanstalk, which creates an Elastic Load Balancer and puts all my instances behind it (3 or more).
Is there a way to contact each of these instances directly? I want to trigger a specific command on each instance (a git pull to synchronize with the latest code in my remote repo).
I have the list of IP addresses and public DNS names of the instances from the PHP SDK, but since the firewall rules restrict the source IP address to the Elastic Load Balancer's IP on port 80, I can't seem to access them directly.
Is there a way around it?
P.S. The SSH port seems to be open to all traffic, but how can I create a trigger with that? I'm hoping to create a PHP script to automate this with a webhook on the remote repo.
I highly suggest you use the EB CLI with git integration for all deployments, no matter how small. It is great because you can map a git branch to an environment with eb use YOUR_ENV; then, when you run eb deploy with that branch checked out, it will deploy to that environment.
There is a lot of work involved in ensuring multiple servers pull the correct code and everything is working as expected. What if a server is in the process of spinning up but is not ready for SSH, so your script skips it and it never gets the new code?
Also, what happens when a new server spins up but runs the old application, because that's what is in EB? You could have your kickstart script do a git pull, but then what happens when you are not ready to push and a new server starts up alone with the new code?
I could probably find 5 more edge cases without breaking a sweat. Look into eb deploy, you will be happy you did.
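As a sketch, the branch-to-environment workflow looks roughly like this (the branch and environment names are placeholders):

# One-time setup in the repository
eb init

# Map the currently checked-out branch to an environment
git checkout staging
eb use my-staging-env

# Deploy whatever is committed on this branch to that environment
eb deploy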
You need to set up a CI system (or make a simple web service) and create a webhook in your repository. Your CI needs to get all instances under your Elastic Beanstalk environment and then call git pull via SSH.
Or just create a cron job on all your instances via an .ebextensions script.
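A rough sketch of the first approach, assuming the webhook handler can shell out, the environment name, instance ID, and key path are placeholders, and the app actually lives as a git checkout under /var/app/current:

# List the EC2 instance IDs behind the Elastic Beanstalk environment
aws elasticbeanstalk describe-environment-resources \
    --environment-name my-env \
    --query "EnvironmentResources.Instances[].Id" --output text

# Resolve one instance ID to its public IP and run git pull over SSH (repeat per instance)
IP=$(aws ec2 describe-instances --instance-ids i-0123456789abcdef0 \
    --query "Reservations[].Instances[].PublicIpAddress" --output text)
ssh -i ~/.ssh/my-key.pem ec2-user@"$IP" "cd /var/app/current && git pull"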
I think it's not good practice in Elastic Beanstalk to run git pull to synchronize your app with your git repo, because it misuses the semantic meaning of the Application Version. Sometimes you can't tell which app version is on your instances from the Application Version. It's better to create a new Application Version in Elastic Beanstalk to deploy a new app version.
If you host your repo on GitHub, you can take a look at CodeDeploy.

Pointing Amazon AWS Elastic Beanstalk to an existing EC2 instance

I was wondering if someone can help with the AWS question below; it seems like a basic item, but I can't find any answers and am getting very frustrated.
1) I have an EC2 instance running that has a third-party process running in the background, and when called from the command line it spits out a number.
2) I have a Java web app that runs this command line and uses the output for the web GUI, etc.
But for the life of me, I cannot figure out how to deploy my Java web app on the SAME existing EC2 instance that's running the process; every time I try to create an Elastic Beanstalk environment, it creates a new EC2 instance.
How do I make Elastic Beanstalk run off the same existing EC2 instance I already have? I understand there are workarounds to pass the data remotely, but this seems like a fundamental requirement that is missing from AWS: you cannot run your web app and backend/batch processes on the same EC2 instance?
Thank you
Elastic Beanstalk is basically a higher abstraction layer on top of EC2, and it's tightly coupled with it. That means, at a minimum, every time you deploy your application it will spin up an EC2 server.
The advantage is that you don't need to manage your EC2 instances; for example, it will autoscale automatically depending on your traffic demand.
The disadvantage is that it theoretically doesn't allow you to tweak little things in the EC2 instance, because you may mess up how Elastic Beanstalk interprets your app. Also, I believe you cannot force your Elastic Beanstalk deployment to use a specific AMI.
If you want more flexibility in your app (which it sounds like you do), I recommend doing your own deployment for your application (no Elastic Beanstalk). That way you can run your app and your jobs on the same EC2 instance.
You can use a custom AMI with Elastic Beanstalk.
AWS documentation has a guide on how to create and use a customized AMI: Using Custom AMIs
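For example, an existing environment can be pointed at a custom AMI through the launch configuration namespace; the environment name and AMI ID here are placeholders:

aws elasticbeanstalk update-environment \
    --environment-name my-env \
    --option-settings Namespace=aws:autoscaling:launchconfiguration,OptionName=ImageId,Value=ami-0123456789abcdef0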
But then again, nobody's stopping you from running your background processes on the standard Elastic Beanstalk instance. I run background cron jobs and a Flask application on one Elastic Beanstalk instance, for example:
files:
  "/tmp/cronjob-for-foobar":
    mode: "000777"
    owner: ec2-user
    group: ec2-user
    content: |
      # skip
      # clean up files created by above cronjob
      30 23 * * * rm $HOME/cron*.log

container_commands:
  70-foobar-cronjobs:
    command: crontab /tmp/cronjob-for-foobar
Obviously, you can have anything scheduled in cron, as long as you stay within your instance limits.