Build and push a Docker image to AWS ECR using Lambda

Is it possible to automate building a Docker image from code committed to GitHub (no tests involved) and then push it to AWS ECR using a Lambda function?

You cannot do it with Lambda alone, because Lambda is not a suitable execution environment for the Docker daemon (which is necessary to build the images). However, you can use Lambda + SNS to trigger an endpoint pointing to a service you develop and host on EC2, which runs the docker build command after a git clone (you can use something similar to Python's Fabric, fabfile.org, or any framework that lets you execute commands on a server).
You can certainly extend this idea, for example by bringing the EC2 build machine up from an AMI that automates all of this.
The big point here is that you don't really have control over what's provisioned in Lambda, so you need EC2.
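To make the Lambda + SNS piece concrete, here is a minimal Python sketch of such a trigger function. It assumes a hypothetical BUILD_ENDPOINT exposed by the EC2 build host and an SNS message shape of your own design; the build service itself (the git clone, docker build, and push to ECR) is still something you have to run on EC2.

import json
import os
import urllib.request

# Hypothetical endpoint of the build service you host on EC2 (an assumption,
# not part of the original answer), e.g. configured via an environment variable.
BUILD_ENDPOINT = os.environ.get("BUILD_ENDPOINT", "http://ec2-build-host.example.com/build")


def handler(event, context):
    """Triggered by SNS when GitHub reports a new commit; asks the EC2 build
    service to clone the repo, run docker build, and push the image to ECR."""
    for record in event.get("Records", []):
        message = json.loads(record["Sns"]["Message"])
        payload = json.dumps({
            "repository": message.get("repository"),  # assumed message shape
            "commit": message.get("commit"),
        }).encode("utf-8")

        request = urllib.request.Request(
            BUILD_ENDPOINT,
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(request) as response:
            print("Build service responded with status", response.status)

    return {"status": "build requested"}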

Related

How can I deploy AWS Lambda as a Jenkins agent?

I'm going to use Jenkins for CI/CD for the first time in my project.
So I'm using the "GitHub hook trigger for GITScm polling" to automatically build the code into a Docker image whenever it is pushed to GitHub, and to upload the image to ECR once the tests pass using pytest.
However, it is burdensome to run the Docker image directly on the Jenkins master server and run pytest there, so I want to test the code in AWS Lambda whenever it is pushed (in other words, I want to use Lambda as a Jenkins agent).
I want to implement this process through a Jenkins pipeline (Jenkinsfile), but I don't know how.
Is there a way to create and deploy aws lambda through the Jenkins pipeline?
What you could try is to use the AWS Lambda Plugin.
It allows you to build and deploy Lambdas from Jenkins, so you could create a Lambda containing your logic, invoke it, and use the result afterwards.
But keep in mind that Lambda might not be the right tool for the job: execution is limited to 15 minutes, and I think you are better off using ECS/EKS.
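If you go the plugin route, the "invoke it and use the result" step boils down to a synchronous Lambda invocation. Here is a rough Python/boto3 sketch of that call; the function name and payload are made-up placeholders, and the AWS Lambda Plugin essentially performs the same invocation from the pipeline.

import json
import boto3

lambda_client = boto3.client("lambda")

# "my-test-runner" is a hypothetical function that would run your pytest logic
response = lambda_client.invoke(
    FunctionName="my-test-runner",
    InvocationType="RequestResponse",          # wait synchronously for the result
    Payload=json.dumps({"commit": "abc123"}),  # made-up payload
)

result = json.loads(response["Payload"].read())
print(result)

# Fail the build step if the function reported failing tests
if not result.get("tests_passed", False):
    raise SystemExit(1)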

Regarding Docker images of microservices in AWS Lambda

I developed a Spring Boot microservices application in which each microservice is packaged into a separate Docker container. The databases for these services are also in separate Docker containers. Currently, all of these are hosted and running in AWS ECS. If I need to migrate to Lambda, can I reuse the same Docker containers as such (of course, I will add the AWS serverless dependency in all the pom.xml files and do the repackaging)? Kindly let me know if I can run the modified Docker images as such in Lambda.
Thank You
I don't think you can share the same Docker image between your ECS task and Lambda, because they differ in a few aspects, some of which are very specific to Lambda, such as how the handler is written and how the image is packaged.
New for AWS Lambda – Container Image Support
Your wording, too, is a bit confusing:
can I reuse the same docker containers as such?
and then you say
can run the modified docker images as such in Lambda?
Example task definitions
For Spring Boot Application specifically, you can take a look here:
Running APIs Written in Java on AWS Lambda
Java AWS Lambda Container Image Support (Complete Guide)
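For a sense of what "how we write the handler" means, here is a minimal illustration of the Lambda programming model, shown in Python only for brevity; the linked guides cover the Java/Spring Boot equivalents, where the handler plays the same role instead of the app listening on a port.

# The Lambda runtime calls a handler with an event and a context object,
# rather than your application running its own embedded web server.
def handler(event, context):
    # `event` carries the request (e.g. from API Gateway); return a response.
    name = event.get("name", "world")
    return {"statusCode": 200, "body": f"Hello, {name}!"}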

Run AWS Lambda inside Docker locally

I'm new to AWS, and for learning purposes I created a free AWS account. I don't want to install all the dependencies and packages and configure them with my test account on my PC until I know them well, so I planned to create a Docker image and do the configuration later on my PC. But I can't find any good example of how to set up a Docker image for AWS Lambda. Can you please help me set up the Docker image?
P.S. I'm using Node.js
Check out https://github.com/localstack/localstack - A fully functional local AWS cloud stack (Lambda as well).
The solution will depend on the language you are going to use for your Lambdas.
Try some tutorials; for example, the following one describes how to simulate Lambda for Python:
https://aws.amazon.com/premiumsupport/knowledge-center/lambda-layer-simulated-docker/
A recent AWS article describes how to do it:
How do I create a Lambda layer using a simulated Lambda environment with Docker?
Basically, you can run a ready-made Docker image for that:
https://hub.docker.com/r/lambci/lambda/
This is the same Docker image used by AWS SAM (Serverless Application Model) when you test your Lambda function locally, so it is the closest you can get to the real Lambda environment.
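For a Node.js function, running that image comes down to mounting your function code into the container and naming the handler. A small Python wrapper around the documented docker run invocation might look like the sketch below; the handler name index.handler and the sample event are just examples.

import json
import os
import subprocess

# Invoke a local Node.js handler inside the lambci/lambda image,
# mounting the current directory as the function code.
event = {"key": "value"}  # sample test event

subprocess.run(
    [
        "docker", "run", "--rm",
        "-v", f"{os.getcwd()}:/var/task",   # your function code
        "lambci/lambda:nodejs12.x",         # runtime to emulate
        "index.handler",                    # file.exported-function
        json.dumps(event),
    ],
    check=True,
)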

AWS ECS task service restart automation

We are currently running a microservices application with a serverless-style architecture on AWS ECS. Whenever we deploy or update new artifacts in ECR, we need to restart the services by changing the task count from 0 to 1 and vice versa so that they pick up the new artifacts. This process is very manual and takes several steps, and I want to automate it. Is it possible to use AWS Lambda or CloudWatch, or any other configuration, to skip the manual process? What kind of code, language, and example of automation do I need to achieve this?
Take a look at the ecs-deploy script. Basically, it replaces an existing service with the latest (or a specific) image from ECR. So if you have automation that updates ECR with the latest image, this script will deploy that image to ECS.
A setup that could work if you have a CI/CD pipeline: upon uploading to ECR, trigger a Lambda that restarts the corresponding service, supplying any variables the Lambda needs, such as the ECR tag to pull or the service name.
ECS has an option to force a new deployment of a service (forceNewDeployment). In Python, with boto3, the call would look like this:
import boto3

client = boto3.client("ecs")  # credentials and region come from the environment

# Force a new deployment so the service pulls the latest image from ECR
updateTrigger = client.update_service(
    cluster="myClusterName",      # placeholder: your ECS cluster name
    service="serviceToUpdate",    # placeholder: the service to redeploy
    desiredCount=1,
    forceNewDeployment=True,
)
From https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/ecs.html#ECS.Client.update_service
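Putting the two previous suggestions together, a Lambda triggered on ECR image pushes (for example via an EventBridge rule on "ECR Image Action" events) could force the new deployment automatically. Here is a rough sketch; the mapping from repository name to cluster/service is an assumption you would adapt to your setup.

import boto3

ecs = boto3.client("ecs")

# Hypothetical mapping from ECR repository name to the ECS cluster/service
# that should be redeployed when a new image is pushed.
SERVICES = {
    "my-app-repo": {"cluster": "myClusterName", "service": "serviceToUpdate"},
}


def handler(event, context):
    """Triggered by an EventBridge rule on ECR image push events."""
    detail = event.get("detail", {})
    if detail.get("action-type") != "PUSH" or detail.get("result") != "SUCCESS":
        return {"status": "ignored"}

    target = SERVICES.get(detail.get("repository-name"))
    if target is None:
        return {"status": "no matching service"}

    # Same forceNewDeployment call as above, so the service pulls the new image
    ecs.update_service(
        cluster=target["cluster"],
        service=target["service"],
        forceNewDeployment=True,
    )
    return {"status": "deployment forced"}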

Continuous Deployment of Docker Compose App to AWS/EC2

I've been trying to find an efficient way to handle continuous deployment with a Docker compose setup and AWS hosting.
So far I've looked into CodeDeploy, S3 buckets, and ECS. My application is relatively small, with only 3 Docker services: a Django app, NGINX, and PostgreSQL. I was unable to find any reliable information on using CodeDeploy with Docker Compose, and because of the small scale, ECS seems impractical. I've considered an S3 bucket, but that seems no better than just deploying my application with something like git or scp.
What is a standard way of deploying a Docker Compose setup on AWS? If possible, I would like to use Bitbucket Pipelines or CircleCI to perform the deployment in a manually triggered step after running tests. But I've been unable to find a solution that would easily let me copy over the code (which is in a git repo on a production branch and is how I get the code onto the production server at the moment).
I would like to add some possibilities to #gasc's answer.
It would be better if you create a CloudFormation template for deploying your EC2 resources with all the required security groups, auto scaling, and other infrastructure.
Then create an AMI with Docker Compose installed, or anything else you require for your EC2 environment.
Then you can use a CodeDeploy pipeline. AWS also provides a private container registry (ECR) that you may want to use here.
The rest of the steps are the same: just scp the Compose file onto the EC2 instance, run the
docker-compose up
command, and you are done.
Let me know if you want more help; I'm open for discussion.
What I would do in your case is:
1 - If needed, update your docker-compose.yml file (or whatever you called it) to version 3 or higher, in order to use swarm mode.
2 - During your pipeline, build all the images needed and push them to a registry.
3 - In your pipeline, scp your Compose file to a manager node.
4 - Deploy your application using swarm (docker stack deploy -c <your-docker-compose-file> your_app_name). This way you can handle rolling updates and scale easily.
Note that if you want to use multiple nodes, you need to open a few ports on them.
I see you mentioned that ECS might seem impractical for such a small scale; in my opinion, not necessarily. It would require you to rewrite your docker-compose.yml into task and service definitions, but since there aren't many services, that shouldn't take you much time.