Docker Service not running on EC2 when launched in an AWS auto-scaling group? - amazon-web-services

Currently I have a fresh Ubuntu image with just docker installed on it. I tested this image to make sure my docker container works, and everything is fine.
I created a golden AMI from this image, then created a launch template and added the following to the User Data:
#!/bin/bash
sudo docker pull command
sudo docker run command
My issue is that the commands will not run. From the other testing I have done, it seems like the docker service is not running when these commands are executed. When I SSH onto the EC2 instance and check the docker service, it is running, and I can manually run the docker commands and they work.
Any idea why the docker service wouldn't be running when the instance boots up in the auto-scaling group?
I have tried putting the commands in a bash file and running that through the User Data, but I get the same result.
I even added a sleep to see if the docker service would come up before the commands ran, but still the same thing.
Any help would be greatly appreciated.
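For reference, a common refinement of that sleep idea is to poll the daemon instead of waiting a fixed time. This is only a sketch of that pattern, not a confirmed fix, and the image name is a placeholder:
#!/bin/bash
# User Data already runs as root, so sudo is not strictly needed.
# Poll until the docker daemon accepts connections; "docker info"
# exits non-zero until the daemon is reachable.
until docker info > /dev/null 2>&1; do
  sleep 2
done
docker pull my-image:latest
docker run -d my-image:latest
If a loop like this never exits, the daemon itself is not starting, which points at the AMI (e.g. docker not enabled with systemctl enable docker) rather than at timing.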

Related

How to run AWS SAM CLI within a container as it requires "docker" daemon to be running?

I need to run aws-sam-cli within a docker container with an Ubuntu 20.04 base image, but I am having trouble installing and running the "docker" daemon inside the container, as "sam" depends on docker being available.
docker-in-docker (dind) won't help, as it only allows someone to start a container within a container. In my case, "sam" needs to build and push things into a local docker registry and then deploy the built image into AWS ECR.
If I could build on the "docker" base image and add aws-cli and aws-sam-cli to it, that might help, but I have not been successful doing so either.
Does anybody have a solution that allows a docker registry to run within a container?
Many thanks.
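One workaround often suggested for this kind of setup (not confirmed by the poster) is to skip dind entirely and bind-mount the host's docker socket into the container, so "sam" talks to the host daemon. A sketch, with my-sam-image as a hypothetical image that has aws-sam-cli installed:
$ docker run -it \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -v "$PWD":/work -w /work \
    my-sam-image \
    sam build --use-container
Note that containers started this way are siblings of your container, not children, so any paths you mount into them must exist on the host.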

Is it possible to install Apache superset on a ECS container

I am working on Apache Superset. I was able to install it on a Linux EC2 instance using docker; is there any possibility of installing it on ECS?
There are a couple of approaches to this.
First, you can take the container image here and build an ECS task definition / ECS service around it, bringing it up standalone. Make sure you enable ECS exec to be able to execute into the container and launch those commands. I have not tested this, but I see no reason why it should not work.
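For reference, executing into a running task with ECS exec looks roughly like this (the cluster, task, and container names are placeholders):
$ aws ecs execute-command \
    --cluster my-cluster \
    --task <task-id> \
    --container superset \
    --interactive \
    --command "/bin/bash"
The service also has to be created or updated with --enable-execute-command, and the task role needs the SSM permissions that ECS exec relies on.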
I have also spent some time trying to make the docker compose files in the Superset GH repo work with Amazon ECS. You can read more about my findings here.

Installation of Jenkins in AWS EC2 Ubuntu 16.04 LTS

I am trying to implement a CI/CD pipeline for my Spring Boot application deployment using Jenkins on an AWS EC2 machine, and I am using containerized deployment of microservices with Docker. While exploring how to install Jenkins, I found that we can use the Jenkins docker image, and also that we can install it normally. I found the following example of a normal installation of Jenkins:
wget -q -O - https://pkg.jenkins.io/debian/jenkins-ci.org.key | sudo apt-key add -
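For context, the rest of that legacy apt-key based flow (implied by the key URL above; newer Jenkins docs use a different keyring mechanism) usually continues along these lines:
sudo sh -c 'echo deb https://pkg.jenkins.io/debian binary/ > /etc/apt/sources.list.d/jenkins.list'
sudo apt-get update
sudo apt-get install -y jenkins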
My confusion is this: if I am using dockerized deployment of my microservices, can I use a normal installation of Jenkins on my VM, and can I use docker commands inside a Jenkins pipeline job?
Can anyone help me clarify this, please?
If you want to run docker commands in Jenkins pipelines on the same machine where Jenkins lives, you should run Jenkins without a container, as that configuration will be much easier for you: you just need to add the Jenkins user to the "docker" group so it can run docker containers.
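Concretely, assuming the service user is named jenkins, that group change is:
$ sudo usermod -aG docker jenkins
$ sudo systemctl restart jenkins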
When you run Jenkins from within a container, configuration is a little harder, as you probably need to map the host's docker daemon socket into the Jenkins container so it can start docker containers on the host, or use the docker-in-docker feature; but please take a look at this article first: http://jpetazzo.github.io/2015/09/03/do-not-use-docker-in-docker-for-ci/
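A sketch of that socket-mapping variant, using the common defaults for the official image:
$ docker run -d --name jenkins \
    -p 8080:8080 -p 50000:50000 \
    -v jenkins_home:/var/jenkins_home \
    -v /var/run/docker.sock:/var/run/docker.sock \
    jenkins/jenkins:lts
The docker CLI binary also has to be available inside the container (the official image does not ship it), which is one more reason the non-containerized setup is simpler here.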

Running docker on a data pipeline ec2 instance

I am having issues running docker (I cannot connect after the service is started) on an EC2 instance launched as an ec2Resource shellCommand in AWS Data Pipeline.
Basically I have a pipeline, and part of it runs an ec2Resource which performs a shell command. That command installs docker (successfully, it seems) and then starts the service (again getting back an okay), but the user then cannot connect to the docker daemon (as if it were not running).
Has anyone got this working before?
Can I use, or should I be using, a different AMI (I'm running in Sydney, AU)?
Your help would be most appreciated!
Okay, solved: run HVM, not the default PV. That fixed it right away!
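If you want to check an AMI's virtualization type up front, something like this should do it (the AMI ID is a placeholder):
$ aws ec2 describe-images --image-ids ami-12345678 \
    --query "Images[].VirtualizationType" --output text
It should print hvm rather than paravirtual for a suitable image.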

Correct approach in deploying stack to Docker for AWS

I am trying to deploy my docker-compose based stack to Docker for AWS (created via AWS CloudFormation).
My compose YAML file is managed in a Git repository and my Docker images live in a private registry (GitLab).
What is the correct way of working with the manager node to deploy a service?
I tried (and failed) several approaches:
Working with the local Docker client via the Docker API is not possible, because the Docker for AWS manager node does not open port 2375.
Rsyncing the compose YAML and environment file directly to the manager node is not possible, because rsync is not installed on the Amazon Docker AMI.
curl-ing the files from GitLab seems like a very inconvenient way of doing it.
Thanks
Found a way to do it more or less properly (according to a comment in the Swarm documentation):
Create an SSH tunnel to the manager:
$ ssh -NL localhost:2374:/var/run/docker.sock docker@<manager ip> &
Run everything locally with
$ docker -H localhost:2374 info
or define
export DOCKER_HOST=localhost:2374
and use docker as if you were running it on the Swarm manager:
$ docker info
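With the tunnel in place, deploying the compose stack could then look like this (the stack name and file path are placeholders):
$ docker -H localhost:2374 stack deploy \
    --compose-file docker-compose.yml mystack
Since the images live in a private GitLab registry, running docker login first and adding --with-registry-auth lets the swarm nodes pull them.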
In my opinion, there are two options you can try.
Use Jenkins with the plugin called "Publish Over SSH". You can use this plugin to send your compose file and then run commands like "docker stack deploy". More description can be found here.
You can use Docker Cloud to bring your swarm to your local terminal, similar to what you have already done. Follow this link.
The first approach is much better because you have automated the deployment; now you can schedule deployments, run them at the click of a button, or even trigger them on commits.
Hope it helps!