Deploy Django Code with Docker to AWS EC2 instance without ECS

Here is my setup:
1. My Django code is hosted on GitHub.
2. Docker is set up on my EC2 instance.
3. My deployment process is manual: I have to run git pull, docker build, and docker run on every single code change, using a dedicated docker service account.
How do I automate step 3 with AWS CodeDeploy or something similar?
Every example I am seeing on the internet involves ECS or Fargate, which I am not ready to use yet.
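One way to approach it, sketched below under assumptions (the paths, image tag, container name, and port mapping are placeholders, not values from the question): collapse the manual commands into a single idempotent script on the instance, then have CodeDeploy invoke that script from an appspec.yml lifecycle hook (e.g. AfterInstall), or trigger it from a GitHub webhook.

    #!/usr/bin/env bash
    # deploy.sh -- hypothetical redeploy script; CodeDeploy (or a webhook)
    # would invoke this instead of you typing the commands by hand.
    set -euo pipefail

    APP_DIR=/home/dockerservice/app   # assumed checkout location
    IMAGE=django-app:latest           # assumed image tag
    CONTAINER=django-app              # assumed container name

    cd "$APP_DIR"
    git pull                          # fetch the latest code
    docker build -t "$IMAGE" .        # rebuild the image

    # Replace the running container; the '|| true' guards make the
    # script safe to run even when no container exists yet.
    docker stop "$CONTAINER" 2>/dev/null || true
    docker rm "$CONTAINER" 2>/dev/null || true
    docker run -d --name "$CONTAINER" -p 80:8000 "$IMAGE"

With CodeDeploy, the appspec.yml in your repository would simply point a lifecycle hook at this script; the CodeDeploy agent running on the instance does the rest.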

Check out this guide on using Docker images from a private registry (e.g. Docker Hub) for your build environment: "How to Use Docker Images from a Private Registry for Your Build Environment".

Related

How to run commands in a Fargate task

I have a requirement where I have to create a Fargate task that can clone a GitLab repository (source code) and run a Maven build command to build the code.
There would then be another Fargate task that creates a Docker image out of the build output.
GitLab is on an EC2 instance.
Since we do not have exec access into the containers on Fargate, what would be the best way to do this? (I have multiple repos on GitLab, so the repo that I want to clone and build is not going to be the same every time.)
I have been reading about the Amazon Elastic Container Service (ECS) / Fargate plugin for Jenkins, but I'm not sure whether Jenkins can be used to get into a Fargate container and run commands.
Nowadays you can use ECS Exec. Here's how to set it up: https://aws.amazon.com/blogs/containers/new-using-amazon-ecs-exec-access-your-containers-fargate-ec2/
Or, in short:
https://www.ernestchiang.com/en/posts/2021/using-amazon-ecs-exec/
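For reference, a minimal ECS Exec sketch with the AWS CLI (the cluster, service, and container names are placeholders; the task role also needs SSM permissions, and the Session Manager plugin must be installed locally):

    # Enable ECS Exec on an existing service (forces replacement tasks)
    aws ecs update-service \
        --cluster my-cluster \
        --service my-service \
        --enable-execute-command \
        --force-new-deployment

    # Open an interactive shell in a running Fargate container
    aws ecs execute-command \
        --cluster my-cluster \
        --task <task-id> \
        --container my-container \
        --interactive \
        --command "/bin/sh"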

Deploying a Dockerfile to AWS Fargate (building Docker images on AWS)

I have the end goal of deploying a Docker container on AWS Fargate. As it happens, my Dockerfile has no local dependencies and my upload connection is very slow, so I want to build the image in the cloud. What would be the easiest way to build it on AWS? Creating an EC2 Linux instance, installing Docker and the AWS CLI on it, building the image, and then uploading it to AWS ECR, if that's possible?
The easiest way is to use AWS CodeBuild - it will do everything for you, even push the image to AWS ECR.
Basic instructions: here
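To give a feel for it, here is roughly what the build commands in a CodeBuild buildspec might run (the account ID, region, and repository name are placeholders, not values from the question):

    # Commands a CodeBuild job might run to build and push an image
    ACCOUNT_ID=123456789012   # placeholder AWS account
    REGION=us-east-1          # placeholder region
    REPO=my-app               # placeholder ECR repository name

    # Authenticate Docker to ECR
    aws ecr get-login-password --region "$REGION" \
      | docker login --username AWS --password-stdin \
        "$ACCOUNT_ID.dkr.ecr.$REGION.amazonaws.com"

    # Build, tag, and push
    docker build -t "$REPO:latest" .
    docker tag "$REPO:latest" "$ACCOUNT_ID.dkr.ecr.$REGION.amazonaws.com/$REPO:latest"
    docker push "$ACCOUNT_ID.dkr.ecr.$REGION.amazonaws.com/$REPO:latest"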

How to deploy a spring boot application jar from Jenkins to an EC2 machine

I'm seeing so many different sources on how to achieve CI with Jenkins and EC2, and strangely none seem to fit my needs.
I have 2 EC2 Ubuntu instances. One is empty and the other has Jenkins installed on it.
I want to perform a build on the Jenkins machine and copy the jar to the other Ubuntu machine. Once the jar is there I want to run mvn spring-boot:run.
That's it - a very simple flow, and I can't find a good source to follow that doesn't involve agents, Docker, etc.
AWS CodeDeploy lets you take the artifact Jenkins builds and deploy it onto your EC2 instances.
A quick Google search turns up very detailed instructions on how to set up a pipeline with AWS CodeDeploy.
The pipeline uses the GitHub -> Jenkins -> EC2 flow, just as you need.
Set up Jenkins to do a build, then scp the artifact to the other machine (a sketch follows below).
There's an existing answer, "how to setup ssh keys for jenkins to publish via ssh", about setting up the keys for SSH.
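A minimal sketch of that scp approach, assuming the Jenkins user's SSH key is already authorized on the target machine (the hostnames, paths, and jar name are placeholders):

    # From the Jenkins machine: copy the freshly built jar across
    scp target/myapp.jar ubuntu@other-instance:/opt/myapp/app.jar

    # Restart the app remotely. Running the jar directly is the usual
    # substitute for 'mvn spring-boot:run' on a machine without the source.
    ssh ubuntu@other-instance 'pkill -f app.jar || true; nohup java -jar /opt/myapp/app.jar > /opt/myapp/app.log 2>&1 &'

In a Jenkins job these two commands would typically live in a post-build shell step, or be handled by the Publish Over SSH plugin.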

What triggers Elastic Beanstalk to pull in an updated Docker image

I have an Elastic Beanstalk application running and configured to serve a Docker container ("generic Docker" configuration) and linked to a private image on Docker Hub.
How can I prompt the Elastic Beanstalk application to download the latest version of the Docker Hub image after pushing up a new version with docker push?
Do I need to "restart the app server," "rebuild the environment," something else, or is it "supposed" to pull it in automatically? I'm not seeing this addressed in the docs.
EDIT:
To be clear, eb deploy does NOT pull in an updated Docker image, but it does push up the files from your application directory to your EC2 instances.
So at the end of the day I'm probably not going to use docker push for deployments, just to keep the image up to date; that way, when you actually need to make ENVIRONMENT configuration changes (not code changes), or when bringing on a new developer, you can use docker pull.
Currently eb deploy my-environment-name is working great for Docker-based Elastic Beanstalk deployments.
You just need to run eb deploy from the command line. Here is a nice tutorial: http://victorlin.me/posts/2014/11/26/running-docker-with-aws-elastic-beanstalk.
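For context, the basic EB CLI flow looks like this (the application and environment names are placeholders):

    # One-time: initialize the EB application in your project directory
    eb init -p docker my-app

    # Each deploy: zips the application directory (including the
    # Dockerfile or Dockerrun.aws.json) and ships it to the instances,
    # which then build or pull the image
    eb deploy my-environment-name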

How to configure Amazon container service without Docker Hub integration

I am trying to set up a new Spring Boot + Docker (microservices) project. Deployment is targeted at AWS. Every service has a Dockerfile associated with it. I am thinking of using the Amazon container service for deployment, but as far as I can see it only pulls images from Docker Hub. I don't want ECS to pull from Docker Hub; rather, I want it to build the images from the Dockerfiles and then take over deploying those containers. Is this possible? If yes, how?
This is not possible yet with the Amazon EC2 Container Service (ECS) alone - while ECS meanwhile supports private registries (see also the introductory blog post), it doesn't yet offer an image build service (as usual, AWS is expected to add such notable additional features over time, see e.g. the Feature Request: ECS container dream service for more on this).
However, it can already be achieved with AWS Elastic Beanstalk's built in initial support for Single Container Docker Configurations:
Docker uses a Dockerfile to create a Docker image that contains your source bundle. [...] Dockerfile is a plain text file that contains instructions that Elastic Beanstalk uses to build a customized Docker image on each Amazon EC2 instance in your Elastic Beanstalk environment. Create a Dockerfile when you do not already have an existing image hosted in a repository. [emphasis mine]
In an ironic twist, Elastic Beanstalk has now added Multicontainer Docker Environments based on ECS, but this highly desired, more versatile Docker deployment option in turn doesn't offer the ability to build images:
Building custom images during deployment with a Dockerfile is not supported by the multicontainer Docker platform on Elastic Beanstalk. Build your images and deploy them to an online repository before creating an Elastic Beanstalk environment. [emphasis mine]
As mentioned above, I would expect this to be added to ECS in a not too distant future due to AWS' well known agility (see e.g. the most recent ECS updates), but they usually don't commit to roadmap details, so it is hard to estimate how long we need to wait on this one.
Meanwhile, Amazon has introduced the EC2 Container Registry (ECR): https://aws.amazon.com/ecr/
It is a private Docker registry for those who don't want to use Docker Hub, and it is nicely integrated with the ECS service.
However, it does not build your Docker images, so it does not solve the entire problem.
I use a Bamboo server for building images (the source is in Git repositories on Bitbucket); Bamboo pushes the images to Amazon's container registry.
I am hoping Bitbucket Pipelines will make the process smoother, with less build-server configuration. From the videos I have seen, all your build configuration sits right in your repository. It is still in closed beta, so I guess we will have to wait a bit more to see what it ends up being.