Elastic Beanstalk is not pushing updated version of source code to EC2 - amazon-web-services

We have a situation where incremental source code deployment is not working in Elastic Beanstalk.
If a new Elastic Beanstalk environment is created with a source code package, it works and the EC2 instance gets the source package. But the next time I want to deploy a new version of the source code, the GitLab pipeline succeeds and the package reaches Elastic Beanstalk (I can see it under Application Versions); however, the new source code is not deployed to the existing EC2 instances.
We are using deployment_policy and rolling update type "Immutable" in the Elastic Beanstalk configuration. One thing we observed: if we terminate the existing EC2 instance, the ASG spins up a new EC2 instance and that instance contains the updated source code.
Also, manual code deployment from the Elastic Beanstalk console works, and the revised code version gets updated on EC2.
We are using Terraform to build up the EB infrastructure and GitLab as the package deployment pipeline.
The ASG is set to min_size = 1, max_size = 2.
Any help is appreciated.

Did you configure a deploy token on GitLab? Did you follow the instructions on how to deploy with GitLab CD pipelines?
I think it is an environment variables problem! Hope it helps.
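Another thing worth checking, given that the new package shows up under Application Versions but is never rolled out: uploading an application version by itself does not deploy it; the environment also has to be updated to that version label. A minimal boto3 sketch of both steps, with placeholder bucket, application, environment, and version names (not taken from the question):

import boto3

eb = boto3.client("elasticbeanstalk")

# 1. Register the bundle that the pipeline already uploaded to S3 as a new version.
eb.create_application_version(
    ApplicationName="my-app",                      # placeholder
    VersionLabel="build-123",                      # placeholder, e.g. the CI pipeline ID
    SourceBundle={"S3Bucket": "my-deploy-bucket",  # placeholder
                  "S3Key": "my-app/build-123.zip"},
    Process=True,
)

# 2. Point the running environment at that version; without this call the
#    version only sits in the Application Versions list.
eb.update_environment(
    EnvironmentName="my-app-env",                  # placeholder
    VersionLabel="build-123",
)

If the pipeline already performs both steps, the environment's event log for the corresponding deployment would be the next place to look.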

Related

AWS ElasticBeanstalk with custom AMI

I have the following doubts about using a custom AMI with AWS EB.
Now I have:
1. a default platform, Python 3.6 + Amazon Linux 1.10.0; in EB configuration > Instances > AMI I see an ID that I think is the default AMI provided by AWS to launch the platform (and if that is the case, it should get modified at every platform update)
2. some platform configuration done with .ebextensions files
3. my Flask app that I deploy from the CLI (eb deploy)
So, in order to avoid the .ebextensions configuration time, I'd like to use a custom AMI that includes (1) + (2) and continue to deploy my Flask app as before.
So, to build the AMI:
Can I stop an EC2 instance of my running env and make an AMI from it via the EC2 console? If I do so, the AMI would contain even my .ebextensions files and my app; is that a problem?
If the AMI shouldn't include .ebextensions files, is SSH the only way to customize the platform before building the AMI?
After having built the AMI, do I put its ID in EB console > configuration > instances, and then EB takes care of everything, like updating the AMI ID in EC2 > Auto Scaling > launch configuration?
To do a platform update, do I first have to manually rebuild the AMI starting from the new platform and then update the AMI ID in the EB configuration? So it's not possible to update the platform from the EB console like I used to, and then save the new AMI?
When I deploy my app, should it then not contain .ebextensions files?
If I create the AMI with my app included, would EB autoscaling even save the time of deploying the app? (Of course in this case, to deploy I would have to create a new AMI first.)
Thanks for the help.
Thanks for help.
Can I stop an EC2 instance of my running env and make an AMI from it via the EC2 console? If I do so, the AMI would contain even my .ebextensions files and my app; is that a problem?
You don't have to stop it; you can make an AMI from a running instance. Also, your instance is in an ASG, so stopping it is not a good idea.
If the AMI shouldn't include .ebextensions files, is SSH the only way to customize the platform before building the AMI?
It shouldn't matter if you have a pre-existing app on the AMI. A new deployment will install your app anyway.
After having built the AMI, do I put its ID in EB console > configuration > instances, and then EB takes care of everything, like updating the AMI ID in EC2 > Auto Scaling > launch configuration?
Yes.
To do a platform update, do I first have to manually rebuild the AMI starting from the new platform and then update the AMI ID in the EB configuration? So it's not possible to update the platform from the EB console like I used to, and then save the new AMI?
Probably; you have to repeat the process.
When I deploy my app, should it then not contain .ebextensions files?
It depends on what they do. If they install software which is already on the custom AMI, you can remove them.
If I create the AMI with my app included, would EB autoscaling even save the time of deploying the app? (Of course in this case, to deploy I would have to create a new AMI first.)
The purpose of a custom AMI is to save time on installing and configuring custom software that is normally not on the AWS AMIs. It is not to replace or eliminate the need to deploy your app. You still need to do that, but you can skip installing custom packages.
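For reference, a minimal boto3 sketch of the two steps discussed above (building an AMI from a running instance and pointing the environment at it); the instance ID, environment name, and AMI name are placeholders, not values from the question:

import boto3

ec2 = boto3.client("ec2")
eb = boto3.client("elasticbeanstalk")

# Build an AMI from the running EB instance. NoReboot avoids stopping it,
# at the cost of a crash-consistent (not fully quiesced) image.
image = ec2.create_image(
    InstanceId="i-0123456789abcdef0",   # placeholder
    Name="my-eb-custom-ami-v1",         # placeholder
    NoReboot=True,
)

# Wait until the AMI is usable before handing it to the environment.
ec2.get_waiter("image_available").wait(ImageIds=[image["ImageId"]])

# Tell the environment to launch future instances from that AMI.
eb.update_environment(
    EnvironmentName="my-flask-env",     # placeholder
    OptionSettings=[{
        "Namespace": "aws:autoscaling:launchconfiguration",
        "OptionName": "ImageId",
        "Value": image["ImageId"],
    }],
)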
You can create a custom AMI from a running EC2 instance from the console, and from the CLI. Any AMI you create is a faithful copy of the instance, so if the instance has .ebextensions, then the AMI will have them too.
I think I understand that you want to create an AMI from instances being managed by Elastic Beanstalk? If so, there are certain files that need to exist on the Elastic Beanstalk EC2 instance so that Elastic Beanstalk and CloudFormation can manage the environment. The .ebextensions scripts are used to configure the environment; at least in my experience they are maintained in your repo. If your AMI has .ebextensions, then they are most likely needed.
I don't think it is typical to use a custom AMI under Elastic Beanstalk: the whole point is to let AWS manage that layer for you. I would recommend that if you really need a custom AMI, you look at doing what you want to do directly in EC2 and forgo Elastic Beanstalk. Elastic Beanstalk is really only an abstracted, 'friendly' interface to EC2 and other services (e.g. the autoscaling and load balancer are actually EC2). Maybe even consider putting your application into a Docker container?
You can create a custom AMI from the EC2 instance which is running for Elastic Beanstalk. If you go with a custom AMI, then there is no need to use .ebextensions files, because the AMI should already include all the changes that were applied when you deployed the application along with the .ebextensions files, or you make the necessary changes on the server before creating the AMI. But it is good practice to use the default AMI which AWS provides when creating the Elastic Beanstalk environment, and to use .ebextensions files to do the required tasks during deployment.

Elastic Beanstalk Always Updates Environment On Deploy From Codebuild

I have a large, multi-component Django application I am trying to deploy to Elastic Beanstalk. I am using the multi-container Docker environment. This is my current workflow:
A Git commit triggers AWS CodePipeline
AWS CodeBuild builds the Docker image (docker-compose build), runs some tests, and pushes the image to AWS Elastic Container Registry
AWS CodeBuild calls eb deploy
The issue I am running into is that when I call eb deploy from my local box, it simply updates the application, but when I call it from CodeBuild, it upgrades the environment every time, which takes about 30 minutes for some reason.
I ran the deploy command with -v and confirmed that the same files are being zipped. Any ideas on what is going on here? Is my setup incorrect?
I also tried to deploy the application from CodeDeploy in the pipeline and can confirm that it also always upgrades the entire environment.
I think that if you use CodeBuild to update your EB environment, it just replaces it, because it is treated as a new environment. On your local workstation you are using only one single environment, but with a new application version.
I would consider replacing CodeBuild for updating your EB environment with the Elastic Beanstalk deploy provider in your CodePipeline. This should successfully just upload your new application version to an existing EB environment.
CodePipeline natively supports a number of deploy action providers, one of them being Elastic Beanstalk:
You can configure CodePipeline to use Elastic Beanstalk to deploy your code. You can create the Elastic Beanstalk application and environment to use in a deploy action in a stage either before you create the pipeline or when you use the Create Pipeline wizard.
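As a sketch of what such a deploy action looks like when the pipeline is defined programmatically (for example via boto3's create_pipeline/update_pipeline), with placeholder application, environment, and artifact names:

# One stage of a CodePipeline definition using the ElasticBeanstalk
# deploy action provider instead of a CodeBuild "eb deploy" step.
deploy_stage = {
    "name": "Deploy",
    "actions": [{
        "name": "DeployToEB",
        "actionTypeId": {
            "category": "Deploy",
            "owner": "AWS",
            "provider": "ElasticBeanstalk",
            "version": "1",
        },
        "inputArtifacts": [{"name": "BuildOutput"}],   # placeholder artifact name
        "configuration": {
            "ApplicationName": "my-django-app",        # placeholder
            "EnvironmentName": "my-django-env",        # placeholder
        },
        "runOrder": 1,
    }],
}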

AWS CodeDeploy?

My app is created using the AWS Elastic Beanstalk service. Do I need to use the AWS CodeDeploy service to deploy my app?
Currently I just do:
eb deploy myApp
Then a new application version is deployed without using AWS CodeDeploy.
So, am I doing something wrong?
Elastic Beanstalk does it on your behalf. During the deployment process you define some policies and roles, which define that Elastic Beanstalk will call these services on your behalf. CodeDeploy is one of those services.
Elastic Beanstalk only automates your process and sets up the whole deployment environment for you (PHP, nginx/Apache in the case of a web app). If you look in /opt/elasticbeanstalk/, you can see a codedeploy folder there, which means you do not need to do it manually.
AWS CodeDeploy is a different approach and provides more control: how you want your changes to be pushed, whether they are pushed to all instances at once or one by one, and the minimum number of healthy instances.
Check here:
http://cloudacademy.com/blog/how-to-deploy-application-code-from-s3-using-aws-codedeploy/
http://blog.powerupcloud.com/2016/03/24/deployment-automation-using-aws-code-depoly/
https://blogs.aws.amazon.com/application-management/post/Tx33XKAKURCCW83/Automatically-Deploy-from-GitHub-Using-AWS-CodeDeploy
You can update your application with a new version. The CLI is as follows:
$ eb deploy --version
You are not doing anything wrong. eb deploy enables you to deploy apps being served from Elastic Beanstalk. AWS CodeDeploy, on the other hand, is more flexible and gives you more control; you can, for example, deploy apps you are serving from EC2 instances that are not managed by Elastic Beanstalk.
With AWS you can, for example, deploy to multiple environments, i.e. development, staging, and production.
Elastic Beanstalk and CodeDeploy are totally different AWS services, are independent of each other, and follow different deployment approaches.
What you're doing is totally correct for deploying a new version of your code.
AWS Elastic Beanstalk itself has a nice capability for deploying applications. You don't need to use AWS CodeDeploy on top of it; it would be superfluous. You can use the Beanstalk tooling itself to deploy the code.
AWS CodeDeploy is a building block service focused on helping developers deploy and update software on any instance, including Amazon EC2 instances and instances running on-premises.
AWS Elastic Beanstalk (as well as AWS OpsWorks, by the way) is an end-to-end application management solution.
When it comes to deploying a new software release on Beanstalk, you are better off using the deployment process provided to you by Beanstalk:
eb deploy myApp

What is a good way to deploy a distributed application using CodeDeploy and a CI tool?

When using AWS, it seems a nice way to deploy an application to a newly created instance is via AWS CodeDeploy. This works as follows:
Set up an auto-scaling group for the application
Write a user-data bash script for the auto-scaling group which pulls the CodeDeploy agent from S3, installs it and starts it
Set up a CodeDeploy deployment group which deploys to the auto-scaling group
Now, when an application bundle (e.g. jar or debian package) is deployed to the deployment group, it will be deployed automatically to new instances launched in the auto-scaling group.
My question is: how can this deployment strategy fit with a CI tool like Travis CI?
Specifically:
How can CodeDeploy pick up a package built by a CI tool like Travis CI? Does the build job need to upload the package to S3?
How can CodeDeploy be used to deploy the application gradually (e.g. one instance at a time)?
Does this deployment strategy require each running instance to be shut down and replaced, or is the new version of the application deployed on the existing instances? If it is the former, machine IP addresses would change during deployment, so how can other services discover the newly deployed application (i.e. without hardcoded IP addresses)?
tl;dr version:
The build job needs to upload the package to S3.
Use the one at a time deployment config.
The new version of the application is deployed on the existing instances.
Ok, here's the long version:
I recommend you try the Deployment Walkthrough or take a look at Concepts in the documentation. It should help you get familiar with CodeDeploy faster.
You don't have to use an AutoScaling group with CodeDeploy if you don't want to. CodeDeploy with AutoScaling integration allows you to manage fleets that need to change in size dynamically separately from the code that is deployed to them, but that is not a requirement to use CodeDeploy. You can also launch some EC2 instances manually, install the host agent, and then tag them into a deployment group - but they won't get deployed to automatically on launch like the AutoScaling instances would. In either case, you can always create fleet wide deployments.
You'll have to do some work to integrate it with your CI tool. CodeDeploy doesn't directly manage your build artifacts, so your build process will need to do that. To have automatic deployments, you will need to (see the sketch after this list):
Create an archive bundle with an appspec.yml, any scripts you need to handle the install/upgrade, and your build artifacts.
Upload the bundle to S3.
Create a deployment in CodeDeploy.
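A minimal boto3 sketch of the last two steps, assuming the bundle has already been built and using placeholder bucket, application, and deployment group names:

import boto3

s3 = boto3.client("s3")
cd = boto3.client("codedeploy")

# Upload the bundle (appspec.yml + scripts + build artifacts) to S3.
s3.upload_file("bundle.zip", "my-artifact-bucket", "myapp/bundle.zip")  # placeholders

# Ask CodeDeploy to deploy that revision to the deployment group.
cd.create_deployment(
    applicationName="MyApp",               # placeholder
    deploymentGroupName="MyApp-Fleet",     # placeholder
    revision={
        "revisionType": "S3",
        "s3Location": {
            "bucket": "my-artifact-bucket",
            "key": "myapp/bundle.zip",
            "bundleType": "zip",
        },
    },
)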
You might want to look at CodePipeline as an example of a continuous delivery system that's integrated with CodeDeploy.
CodeDeploy uses deployment configs to control how aggressively it deploys to the instances in your fleet. (This config gets ignored for automatic deployments, since each instance is handled separately.) CodeDeploy will fail your deployment and stop deploying to new instances if it cannot potentially fail another instance without violating the constraints in the deployment config.
There are three built-in deployment configs, and you can create your own via the CLI or API if you need a different one. To deploy to only one instance at a time, you can use the CodeDeployDefault.OneAtATime deployment config, which allows at most one unhealthy host at any given time.
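If none of the built-in configs fit, a custom one can be created; a minimal boto3 sketch, with a hypothetical config name and an illustrative minimum-healthy-hosts value:

import boto3

cd = boto3.client("codedeploy")

# Require at least 75% of the fleet to stay healthy during a deployment,
# i.e. CodeDeploy takes down at most 25% of instances at a time.
cd.create_deployment_config(
    deploymentConfigName="Custom.75PercentHealthy",   # hypothetical name
    minimumHealthyHosts={"type": "FLEET_PERCENT", "value": 75},
)

# Reference it when creating a deployment:
# cd.create_deployment(..., deploymentConfigName="Custom.75PercentHealthy")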
For anyone else (like me) looking for an example on how to actually integrate Travis-CI with CodeDeploy:
Configure the Application, DeploymentGroups and instances, as explained in the CodeDeploy walkthrough.
Use aws-cli commands to deploy your first revision successfully to the CodeDeploy target instance.
After you have the application deployed and running, configure Travis to trigger the deployments.
The CodeDeploy appspec.yml file and any scripts used for the deployment should be packaged inside your application bundle (latest.zip in the below example).
The following .travis.yml config worked for me:
script: npm run build
before_deploy:
- zip -r latest dist/*
- mkdir -p dpl_cd_upload
- mv latest.zip dpl_cd_upload/latest.zip
deploy:
  - provider: s3
    access_key_id: "XXXX"
    secret_access_key: "YYYYY"
    bucket: "deployments-bucket-name"
    local_dir: dpl_cd_upload
    skip_cleanup: true
  - provider: codedeploy
    access_key_id: "ZZZZZ"
    secret_access_key: "WWWW"
    bucket: "deployments-bucket-name"
    key: latest.zip
    bundle_type: zip
    application: CodeDeployAppName
    deployment_group: CodeDeployDeploymentGroupName
This example was really useful:
https://github.com/travis-ci/cat-party/blob/master/.travis.yml

How to deploy existing app with CodeDeploy on a new EC2 instance

I use CodeShip to deploy my app to AWS EC2 instances when a new app version is out.
CodeShip first packages my app into a .zip and puts it in S3.
Each deployment package is generated with a different file name.
When I deploy with CodeShip, it creates a new revision in CodeDeploy and deploys it to all current instances.
So every time I deploy a new version, it is deployed to all my instances that are already up; but how can I deploy the latest revision of my app from CodeDeploy to a newly launched instance?
Is there any way to achieve this using the aws-sdk (Ruby or the CLI)?
I use OpsWorks to maintain my instances and a custom Chef cookbook to build the environment.
Thanks
Since you have a group of EC2 instances on which you want the same application running, consider using an Auto Scaling group (ASG). Then create a deployment group based on the ASG (not on EC2 instance tags). The next time any new machine is launched in this ASG, CodeDeploy will automatically run and deploy the last successful deployment package. I've been using this method for many months now and it works perfectly.
In the CodeDeploy FAQ, http://aws.amazon.com/codedeploy/faqs/ , if you go to the Concepts section and look at the question "What is a deployment group?", the answer will give you a better idea of how deployment groups work.
CodeDeploy uses the tags in the deployment group to find EC2 instances when creating a new deployment. So for newly launched instances, you basically just need to tag them with the same tag(s) as in the deployment group, and then kick off a new deployment with the revision you want. CodeDeploy will find all the EC2 instances that are tagged with the tags in the deployment group.
But note that if you want to manually start up a new EC2 instance, there are several things you need to do. You can follow the steps here: http://docs.aws.amazon.com/codedeploy/latest/userguide/how-to-prepare-instances.html to launch an instance that works with CodeDeploy.
After the instance has been tagged and set up correctly, you can just kick off a new deployment with the latest revision the same way you do it now. The revision should be deployed to the new instance by CodeDeploy automatically.
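The question asks about the Ruby SDK or the CLI; as an illustration in Python/boto3 (the Ruby and CLI equivalents are analogous), a minimal sketch, with placeholder application and deployment group names, of redeploying the group's current target revision once the new instance has been tagged and prepared:

import boto3

cd = boto3.client("codedeploy")

# Look up the revision the deployment group is currently targeting
# (assumes at least one successful deployment has already happened).
group = cd.get_deployment_group(
    applicationName="MyApp",               # placeholder
    deploymentGroupName="MyApp-Group",     # placeholder
)
revision = group["deploymentGroupInfo"]["targetRevision"]

# Kick off a new deployment of that same revision; the newly tagged
# instance will be picked up along with the rest of the group.
cd.create_deployment(
    applicationName="MyApp",
    deploymentGroupName="MyApp-Group",
    revision=revision,
)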