Installation of Jenkins in AWS EC2 Ubuntu 16.04 LTS - amazon-web-services

I am trying to implement a CI/CD pipeline for my Spring Boot application deployment using Jenkins on an AWS EC2 machine, and I am using containerized deployment of microservices with Docker. While exploring the installation of Jenkins I found that we can use the Jenkins Docker image, and we can also install it normally. I found the following link as an example of a normal installation of Jenkins.
wget -q -O - https://pkg.jenkins.io/debian/jenkins-ci.org.key | sudo apt-key add -
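(For context, the standard apt-based sequence of that era continues roughly like this; treat it as a sketch rather than the exact content of the linked guide:)
sudo sh -c 'echo deb https://pkg.jenkins.io/debian binary/ > /etc/apt/sources.list.d/jenkins.list'
sudo apt-get update
sudo apt-get install jenkins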
My confusion is this: if I am using Dockerized deployment of my microservices, can I use a normal installation of Jenkins on my VM, and can I use docker commands inside a Jenkins pipeline job?
Can anyone help me clarify this, please?

If you want to run docker commands in Jenkins pipelines on the same machine where Jenkins lives, you should run Jenkins outside a container, as that configuration will be much easier for you - you just need to add the Jenkins user to the "docker" group so it can run Docker containers.
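For reference, on an apt-installed Jenkins that typically just means (assuming Jenkins runs as the default "jenkins" user):
sudo usermod -aG docker jenkins
# restart Jenkins so the new group membership takes effect
sudo systemctl restart jenkins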
When you run Jenkins from within a container the configuration is a little harder, as you will probably need to map the host's Docker daemon socket into the Jenkins container so it can start Docker containers on the host, or use the docker-in-docker feature - but please take a look at this article first: http://jpetazzo.github.io/2015/09/03/do-not-use-docker-in-docker-for-ci/
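If you do go the containerized route, the usual socket-mapping pattern looks roughly like this (image tag, ports and volume name assumed; note the Docker CLI still has to be available inside the container):
docker run -d --name jenkins -p 8080:8080 -p 50000:50000 \
  -v jenkins_home:/var/jenkins_home \
  -v /var/run/docker.sock:/var/run/docker.sock \
  jenkins/jenkins:lts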

Related

Docker Service not running on EC2 when launched in an AWS auto-scaling group?

Currently I have a fresh Ubuntu image with just docker installed on it. I tested this image to make sure my docker container works and everything is working fine.
I create a golden AMI from this, create a launch template, and add the following to the User Data.
#!/bin/bash
sudo docker pull command
sudo docker run command
My issue is that these commands will not run. From the other testing I have done, it seems like the docker service is not running when these commands are executed. When I SSH onto the EC2 instance and check the docker service, it is running, and I can run the docker commands manually and they work.
Any idea why the docker service wouldn't be running when the instance boots up in the auto-scaling group?
I have tried adding a bash file and running that through the User Data, but same thing.
I even added a sleep to see if the docker service would come up before running the commands, but still the same thing.
Any help would be greatly appreciated.
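One thing worth checking (a sketch only, not verified against this AMI) is waiting for the Docker daemon to actually accept connections before pulling and running, rather than sleeping a fixed time:
#!/bin/bash
# wait up to ~60 seconds for the Docker daemon to become reachable
for i in $(seq 1 30); do docker info >/dev/null 2>&1 && break; sleep 2; done
docker pull <your-image>   # <your-image> is a placeholder
docker run -d <your-image>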

How to run AWS SAM CLI within a container as it requires "docker" daemon to be running?

I need to run aws-sam-cli within a Docker container with an Ubuntu 20.04 base image, but I am having trouble installing and running the "docker" daemon inside the container, as "sam" depends on Docker running.
docker in docker (dind) won't help as it only allows someone to start a container within a container. In my case, "sam" needs to build and push things into a local docker registry and then deploy the built image into AWS ECR.
If I could build on the "docker" base image and add aws-cli and aws-sam-cli to it, that might help, but I have not been successful doing so either.
Does anybody have a solution out there that allows a docker registry to run within a container?
Many thanks.
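Building on the asker's idea of extending a base image with the CLIs, a minimal, untested sketch could be to install the Docker CLI plus the AWS/SAM CLIs into an Ubuntu 20.04 image and mount the host's Docker socket at run time instead of running a daemon inside the container (image name and commands are assumptions):
cat > Dockerfile <<'EOF'
FROM ubuntu:20.04
RUN apt-get update && DEBIAN_FRONTEND=noninteractive apt-get install -y docker.io python3-pip \
 && pip3 install awscli aws-sam-cli
EOF
docker build -t sam-builder .
# the mounted socket lets "sam" talk to the host's Docker daemon
docker run -v /var/run/docker.sock:/var/run/docker.sock -v "$PWD":/work -w /work sam-builder sam build --use-container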

Is it possible to install Apache superset on a ECS container

I am working on Apache Superset. I was able to install it on a Linux EC2 instance using Docker; is there any possibility of installing it on ECS?
There are a couple of approaches to this.
First, you can take the container image here and build an ECS task definition / ECS service around it, bringing it up standalone. Make sure you enable ECS exec so you can exec into the container and launch those commands. I have not tested this, but I see no reason why it should not work.
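For example, enabling ECS exec on such a service and opening a shell into the container could look roughly like this (cluster, service, task-definition and container names are placeholders, and the task needs the usual ECS exec prerequisites in place):
aws ecs create-service --cluster superset-cluster --service-name superset \
  --task-definition superset-task --desired-count 1 --enable-execute-command
aws ecs execute-command --cluster superset-cluster --task <task-id> \
  --container superset --interactive --command "/bin/bash"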
I have also spent some time trying to make the docker compose files in the Superset GH repo work with Amazon ECS. You can read more about my findings here.

Correct approach in deploying stack to Docker for AWS

I am trying to deploy my docker-compose based stack to Docker for AWS (created via AWS CloudFormation).
My compose YAML file is managed in a Git repository and the Docker images in a private registry (GitLab).
What is the correct way of working with Manager to deploy a service?
I tried (and failed) several approaches:
Working with the local Docker client via the Docker API is not possible, because the Docker for AWS manager node does not open port 2375.
Rsyncing the compose YAML and environment file directly to the manager node is not possible, because rsync is not installed on the Amazon Docker AMI.
curl-ing the file from GitLab seems like a very inconvenient way of doing it.
Thanks
Found a way to do it more or less properly (according to a comment in the Swarm documentation):
Create SSH tunnel to manager:
$ ssh -NL localhost:2374:/var/run/docker.sock docker@<manager ip> &
Run everything locally with
$ docker -H localhost:2374 info
or define
export DOCKER_HOST=localhost:2374
and use docker as if you were running on the Swarm manager
$ docker info
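With the tunnel up, deploying from the local compose file is then a single command (stack name assumed; --with-registry-auth passes your GitLab registry credentials along so the nodes can pull the private images):
$ docker -H localhost:2374 stack deploy --compose-file docker-compose.yml --with-registry-auth mystack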
In my opinion, there are 2 options you can try
Use Jenkins with the Publish Over SSH plugin. You can use this plugin to send your compose file and then run commands like "docker stack deploy". More details can be found here.
You can use Docker Cloud to bring your swarm to your local terminal, similar to what you have already done. Follow this link.
The first approach is much better because you have automated the deployment; now you can schedule deployments, run them at the click of a button, or even trigger them on commits.
Hope it helps !!

How to deploy a spring boot application jar from Jenkins to an EC2 machine

I'm seeing so many different sources on how to achieve CI with Jenkins and EC2, and strangely none seem to fit my needs.
I have 2 EC2 ubuntu instances. One is empty and the other has Jenkins installed on it.
I want to perform a build on the Jenkins machine and copy the jar to the other Ubuntu machine. Once the jar is there I want to run mvn spring-boot:run.
That's it - a very simple flow for which I can't find a good source to follow that doesn't involve slaves, Docker, etc.
AWS CodeDeploy lets you build with Jenkins and then deploy the result to your EC2 instances.
A quick Google search gave me this very detailed guide on how to set up a pipeline with AWS CodeDeploy.
The pipeline uses a GitHub -> Jenkins -> EC2 flow, as you need.
Set up Jenkins to do a build, then scp the artifact to the other machine.
There's an answer here about setting up the SSH keys so Jenkins can publish via SSH.
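A minimal sketch of that post-build step (key path, user, jar name and target IP are placeholders; it runs the built jar directly rather than mvn spring-boot:run):
# copy the artifact produced by the Jenkins build to the target instance
scp -i ~/.ssh/deploy_key target/myapp.jar ubuntu@<target-ec2-ip>:/home/ubuntu/myapp.jar
# start the application on the target machine in the background
ssh -i ~/.ssh/deploy_key ubuntu@<target-ec2-ip> 'nohup java -jar /home/ubuntu/myapp.jar > app.log 2>&1 &'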