Elastic Beanstalk Always Updates Environment On Deploy From CodeBuild

I have a large, multi-component Django application that I am trying to deploy to Elastic Beanstalk, using the multi-container Docker environment. This is my current workflow:
A Git commit triggers AWS CodePipeline.
AWS CodeBuild builds the Docker image (docker-compose build), runs some tests, and pushes the image to Amazon Elastic Container Registry (ECR).
AWS CodeBuild calls eb deploy.
The issue I am running into is that when I call eb deploy from my local box, it simply updates the application, but when I call it from CodeBuild, it updates the entire environment every time, which takes about 30 minutes for some reason.
I ran the deploy command with -v and confirmed that the same files are being zipped. Any ideas on what is going on here? Is my setup incorrect?
I also tried to deploy the application from CodeDeploy in the pipeline and can confirm that it also updates the entire environment every time.

I think that when you use CodeBuild to update your EB environment, it replaces the environment because it is treated as a new one. From your local workstation you are reusing one single environment, just with a new application version.
I would consider replacing CodeBuild as the way you update your EB environment with the Elastic Beanstalk deploy action provider in your CodePipeline. That should simply upload your new application version to the existing EB environment.
CodePipeline natively supports a number of deploy action providers, one of them being Elastic Beanstalk:
You can configure CodePipeline to use Elastic Beanstalk to deploy your code. You can create the Elastic Beanstalk application and environment to use in a deploy action in a stage either before you create the pipeline or when you use the Create Pipeline wizard.
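For illustration, a minimal sketch of such a deploy stage in a pipeline defined with CloudFormation could look like this (the application, environment, and artifact names are placeholders, not from the asker's setup):

  - Name: Deploy
    Actions:
      - Name: DeployToEB
        ActionTypeId:
          Category: Deploy
          Owner: AWS
          Provider: ElasticBeanstalk   # the Elastic Beanstalk deploy action provider
          Version: "1"
        InputArtifacts:
          - Name: BuildOutput          # artifact produced by the CodeBuild stage
        Configuration:
          ApplicationName: my-django-app        # placeholder EB application
          EnvironmentName: my-django-app-prod   # placeholder EB environment
        RunOrder: 1

With this setup, CodeBuild only builds and tests; the pipeline's deploy action uploads the new application version to the existing environment instead of eb deploy doing it from inside the build container.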

Related

Elastic Beanstalk is not pushing updated version of source code to EC2

We have a situation where incremental source code deployment is not working in Elastic Beanstalk.
If a new Elastic Beanstalk environment is created with a source code package, it works and the EC2 instance gets the source package. But the next time I deploy a new version of the source code, the GitLab pipeline succeeds and the package reaches Elastic Beanstalk (I can see it under application versions); however, the new source code is not deployed to the existing EC2 instances.
We are using a deployment policy and rolling update type of "Immutable" in the Elastic Beanstalk configuration. One thing we observed: if we terminate the existing EC2 instance, the ASG spins up a new EC2 instance, and that one does contain the updated source code.
Also, manual code deployment from the Elastic Beanstalk console works, and the revised code version gets updated on EC2.
We are using Terraform to build the EB infrastructure and a GitLab pipeline for package deployment.
The ASG is set to min_size = 1, max_size = 2.
Any help is appreciated.
Did you configure a deploy token on GitLab? Did you follow these instructions for more details on how to deploy with GitLab CD pipelines?
I think it is an environment variables problem! Hope this helps.
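For comparison, here is a sketch of what such a GitLab deploy job often looks like when it drives Elastic Beanstalk through the AWS CLI (the job name, bucket, and resource names are placeholders). Note that registering a new application version by itself does not roll it out; the environment also has to be updated to that version label, or the existing instances keep running the old code:

  deploy:
    script:
      # upload the bundle to S3 (placeholder bucket)
      - aws s3 cp app.zip s3://my-deploy-bucket/app-$CI_COMMIT_SHORT_SHA.zip
      # register the bundle as a new application version
      - aws elasticbeanstalk create-application-version --application-name my-app --version-label $CI_COMMIT_SHORT_SHA --source-bundle S3Bucket=my-deploy-bucket,S3Key=app-$CI_COMMIT_SHORT_SHA.zip
      # without this step the version appears in the console but never reaches the instances
      - aws elasticbeanstalk update-environment --environment-name my-app-env --version-label $CI_COMMIT_SHORT_SHA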

Elastic Beanstalk CLI: publishing a .NET Core 5 Web API to an existing Linux environment - multiple projects in one solution

I'm trying to deploy a .NET Core 5 application to Elastic Beanstalk.
I can deploy the application via the AWS Toolkit, but I'm trying to deploy via the EB CLI, because GitHub Actions works with the EB CLI.
When I run eb deploy, the artifact is created from the solution root directory. How can I point the eb deploy command at the KKWebApi project rather than the whole solution?
Many thanks.
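One possible approach (a sketch, not from the original thread): publish the single project to a zip first (e.g. with dotnet publish on the KKWebApi project), then point the EB CLI at that artifact in .elasticbeanstalk/config.yml, so eb deploy uploads the prebuilt bundle instead of zipping the solution root:

  deploy:
    # placeholder path; the zip would be produced beforehand, e.g. by
    # `dotnet publish KKWebApi/KKWebApi.csproj -c Release` plus zipping the output
    artifact: artifacts/KKWebApi.zip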

Deploy Django Code with Docker to AWS EC2 instance without ECS

Here is my setup.
My Django code is hosted on GitHub.
I have Docker set up on my EC2 instance.
My deployment process is manual: I have to run git pull, docker build, and docker run on every single code change. I am using a dockerservice account to perform this step.
How do I automate step 3 with AWS CodeDeploy or something similar?
Every example I am seeing on the internet involves ECS or Fargate, which I am not ready to use yet.
Check out this guide: How to Use Docker Images from a Private Registry (e.g. Docker Hub) for Your Build Environment.
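For the CodeDeploy route specifically, the glue would be an appspec.yml at the root of the deployment bundle plus a few hook scripts that replace the manual steps. A minimal sketch (the file layout, script names, and install location are all illustrative assumptions):

  version: 0.0
  os: linux
  files:
    - source: /
      destination: /opt/myapp          # placeholder install location
  hooks:
    ApplicationStop:
      - location: scripts/stop_container.sh    # e.g. docker stop / docker rm the old container
        timeout: 60
    AfterInstall:
      - location: scripts/build_image.sh       # e.g. docker build -t myapp /opt/myapp
        timeout: 300
    ApplicationStart:
      - location: scripts/run_container.sh     # e.g. docker run -d myapp
        timeout: 300

With that in place, CodeDeploy pulls the bundle onto the instance and runs the hook scripts in order, so the git pull / docker build / docker run sequence no longer has to be done by hand.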

AWS ElasticBeanstalk configuring or running additional commands on deployment

I am working on deploying a Laravel application to AWS Elastic Beanstalk. I configured the CLI, and I could deploy the application to an Elastic Beanstalk environment by running the deploy command. This is what I have done so far.
I created an Elastic Beanstalk application and an environment in it.
Then I initialised the application for deployment using "eb init" and deployed it using "eb deploy". But I would like to add some additional commands to be run during the deployment; for example, I might run "gulp build" or other commands. Where and how can I configure this? I know that there is an .ebextensions folder, but that does not seem to allow me to add custom commands to be run on deployment.
I know that there is an .ebextensions folder, but that does not seem to allow me to add custom commands to be run on deployment.
Not sure what you mean by saying you can't run commands in .ebextensions during deployment. The extensions are commonly used for running commands or scripts when you are deploying your app. There are special sections for that:
commands: You can use the commands key to execute commands on the EC2 instance. The commands run before the application and web server are set up and the application version file is extracted.
container_commands: You can use the container_commands key to execute commands that affect your application source code. Container commands run after the application and web server have been set up and the application version archive has been extracted, but before the application version is deployed.
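For the gulp example from the question, a sketch of a config file such as .ebextensions/01_build.config (the file name and commands are illustrative):

  container_commands:
    01_install_deps:
      command: "npm install"                    # runs in the staging directory of the extracted bundle
    02_gulp_build:
      command: "node_modules/.bin/gulp build"   # build step executed before the version is deployed

Keys under container_commands run in alphabetical order, hence the numeric prefixes.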
There are also platform hooks on Amazon Linux 2 to further fine-tune the deployment of your applications.
Finally, if none of those are suitable, you could create a dedicated build step in CodePipeline for your application. The dedicated step could be used to create a fully built deployment version of your application for EB, with a minimal amount of work left to do on the EB instances.
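Such a dedicated step could be backed by CodeBuild with a buildspec.yml along these lines (a sketch; the runtime and commands are assumptions based on the gulp example above):

  version: 0.2
  phases:
    install:
      runtime-versions:
        nodejs: 16
    build:
      commands:
        - npm install
        - gulp build          # produce the fully built bundle here, not on the EB instances
  artifacts:
    files:
      - '**/*'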

What is a good way to deploy a distributed application using CodeDeploy and a CI tool?

When using AWS, it seems a nice way to deploy an application to a newly created instance is via AWS CodeDeploy. This works as follows:
Set up an auto-scaling group for the application
Write a user-data bash script for the auto-scaling group which pulls the CodeDeploy agent from S3, installs it and starts it
Set up a CodeDeploy deployment group which deploys to the auto-scaling group
Now, when an application bundle (e.g. jar or debian package) is deployed to the deployment group, it will be deployed automatically to new instances launched in the auto-scaling group.
My question is: how can this deployment strategy fit with a CI tool like Travis CI?
Specifically:
How can CodeDeploy pick up a package built by a CI tool like Travis CI? Does the build job need to upload the package to S3?
How can CodeDeploy be used to deploy the application gradually (e.g. one instance at a time)?
Does this deployment strategy require each running instance to be shut down and replaced, or is the new version of the application deployed on the existing instances? If it is the former, machine IP addresses would change during deployment, so how can other services discover the newly deployed application (i.e. without hardcoded IP addresses)?
tl;dr version:
The build job needs to upload the package to S3.
Use the one at a time deployment config.
The new version of the application is deployed on the existing instances.
Ok, here's the long version:
I recommend you try the Deployment Walkthrough or take a look at Concepts in the documentation. It should help you get familiar with CodeDeploy faster.
You don't have to use an AutoScaling group with CodeDeploy if you don't want to. CodeDeploy with AutoScaling integration allows you to manage fleets that need to change in size dynamically separately from the code that is deployed to them, but that is not a requirement to use CodeDeploy. You can also launch some EC2 instances manually, install the host agent, and then tag them into a deployment group - but they won't get deployed to automatically on launch like the AutoScaling instances would. In either case, you can always create fleet wide deployments.
You'll have to do some work to integrate it with your CI tool. CodeDeploy doesn't directly manage your build artifacts, so your build process will need to do that. To have automatic deployments, you will need to:
Create an archive bundle with an appspec.yml, any scripts you need to handle the install/upgrade, and your build artifacts.
Upload the bundle to S3.
Create a deployment in CodeDeploy.
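With the AWS CLI, those three steps might look roughly like this (a sketch; the bucket, application, and deployment group names are placeholders):

  # bundle the current directory (which contains appspec.yml) and upload it to S3
  aws deploy push \
    --application-name MyApp \
    --s3-location s3://my-deploy-bucket/myapp.zip \
    --source .
  # create a deployment of that revision against the deployment group
  aws deploy create-deployment \
    --application-name MyApp \
    --deployment-group-name MyDeploymentGroup \
    --s3-location bucket=my-deploy-bucket,key=myapp.zip,bundleType=zip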
You might want to look at CodePipeline as an example of a continuous delivery system that's integrated with CodeDeploy.
CodeDeploy uses deployment configs to control how aggressively it deploys to the instances in your fleet. (This config gets ignored for automatic deployments, since each instance is handled separately.) CodeDeploy will fail your deployment and stop deploying to new instances if it cannot potentially fail another instance without violating the constraints in the deployment config.
There are three built-in deployment configs, and you can create your own via the CLI or API if you need a different one. To deploy to only one instance at a time, you can use the CodeDeployDefault.OneAtATime deployment config, which allows at most one unhealthy host at any given time.
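Creating a custom deployment config via the CLI is a single call, for example (the name and threshold here are illustrative):

  # require at least 75% of the fleet to stay healthy during a deployment
  aws deploy create-deployment-config \
    --deployment-config-name KeepThreeQuartersHealthy \
    --minimum-healthy-hosts type=FLEET_PERCENT,value=75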
For anyone else (like me) looking for an example of how to actually integrate Travis CI with CodeDeploy:
Configure the Application, DeploymentGroups and instances, as explained in the CodeDeploy walkthrough.
Use aws-cli commands to deploy your first revision successfully to the CodeDeploy target instance.
After you have the application deployed and running, configure Travis to trigger the deployments.
The CodeDeploy appspec.yml file and any scripts used for the deployment should be packaged inside your application bundle (latest.zip in the below example).
The following .travis.yml config worked for me:
script: npm run build
before_deploy:
  - zip -r latest dist/*
  - mkdir -p dpl_cd_upload
  - mv latest.zip dpl_cd_upload/latest.zip
deploy:
  - provider: s3
    access_key_id: "XXXX"
    secret_access_key: "YYYYY"
    bucket: "deployments-bucket-name"
    local_dir: dpl_cd_upload
    skip_cleanup: true
  - provider: codedeploy
    access_key_id: "ZZZZZ"
    secret_access_key: "WWWW"
    bucket: "deployments-bucket-name"
    key: latest.zip
    bundle_type: zip
    application: CodeDeployAppName
    deployment_group: CodeDeployDeploymentGroupName
This example was really useful:
https://github.com/travis-ci/cat-party/blob/master/.travis.yml