I have a web environment and a worker environment on Elastic Beanstalk, and a cron.yaml file with a task that runs every hour. It worked fine at first, when I was deploying with the EB CLI.
Then I decided to use AWS CodePipeline to deploy code, and that's when the cron job stopped working. The way it works is that the build stage of the pipeline creates a Docker image and pushes it to my ECR repo. This image is then used (via Dockerrun.aws.json) in the next stage to run the app in both environments. But this approach does not schedule the tasks defined in cron.yaml.
I think that with the EB CLI, when Elastic Beanstalk unzips the archive the CLI uploads to S3 for deployment, it finds the cron.yaml file and schedules the task. But with ECR it's just the Docker image, and cron.yaml is not available. If this is the case, how can I provide the cron.yaml file to Elastic Beanstalk when deploying from an ECR repo?
What am I missing?
I was able to solve this issue myself. All you need to do is add the cron.yaml file to the artifacts in your buildspec.yml file, like so:
artifacts:
  files:
    - Dockerrun.aws.json
    - cron.yaml
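For context, a fuller buildspec.yml might look like the sketch below; the ECR URI and build commands are placeholders for your own setup. The key point is that anything listed under artifacts ends up in the bundle Elastic Beanstalk deploys, so cron.yaml becomes visible again and the worker tasks get scheduled.

```yaml
version: 0.2

phases:
  build:
    commands:
      # Build and push the image to ECR (URI is a placeholder)
      - docker build -t 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app:latest .
      - docker push 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app:latest

artifacts:
  files:
    # These files form the deployment bundle passed to Elastic Beanstalk
    - Dockerrun.aws.json
    - cron.yaml
```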
I have a requirement to do CI/CD using Bitbucket Pipelines.
We use Maven to build our code on Bitbucket Pipelines and push the artifacts (jars) to AWS S3. The missing link is figuring out a way to get the artifacts from S3 and deploy them to our EC2 instance.
It should all work from the Bitbucket Pipelines YAML, hopefully using Maven plugins.
For pushing the artifacts to S3 we use:
<groupId>com.gkatzioura.maven.cloud</groupId>
<artifactId>s3-storage-wagon</artifactId>
Is there a way/plugin that will download the artifact from the S3 bucket, deploy it to a specific folder on the EC2 instance, and perhaps call a shell script to run the jars?
Thank you!
Use AWS CodeDeploy (https://docs.aws.amazon.com/codedeploy/latest/userguide/welcome.html) to deploy it to the EC2 instance. The trigger for CodeDeploy would be the S3 bucket that you deploy your jars to. You will need to turn S3 versioning on to make it work. CodeDeploy has its own set of lifecycle hooks that you can use to perform any shell command or run any bat files on the EC2 instance.
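As a sketch of those hooks: CodeDeploy reads an appspec.yml at the root of the bundle. The paths, script name, and user below are hypothetical, just to show the shape:

```yaml
version: 0.0
os: linux
files:
  # Copy the jar pulled from S3 into a target folder on the instance
  - source: app.jar
    destination: /opt/myapp
hooks:
  ApplicationStart:
    # Shell script that starts the jar; runs after files are copied
    - location: scripts/run_jar.sh
      timeout: 300
      runas: ec2-user
```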
System
CI/CD using AWS CodePipeline:
1. AWS CodeBuild connected to GitHub
2. AWS Elastic Beanstalk using images from ECR, multi-container Docker platform
Error
Deployment completed, but with errors: Failed to deploy application. No ecs task definition (or empty definition file) found in environment
Elastic Beanstalk Upload Application
app.zip
├── Dockerrun.aws.json
└── nginx
    └── default.conf
Dockerrun.aws.json
Content here.
I tried deploying only through Elastic Beanstalk (after building the images and the app.zip file), and surprisingly it worked.
Thanks in advance!
I solved this issue.
I thought that EB would read the Dockerrun.aws.json file from the zip I uploaded.
However, EB on the Docker platform reads that file from the image. I added the Dockerrun file to the image and it worked.
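If you go that route, the change is just a COPY instruction in the Dockerfile; the destination path here is hypothetical and depends on your image layout:

```dockerfile
# Bake the Dockerrun file into the image alongside the app
COPY Dockerrun.aws.json /app/Dockerrun.aws.json
```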
I do not want to use the console. No manual processes. I need a command line version, something that I can code in my continuous deployment scripts.
As part of the build process, I can output a ZIP file (be it on my local machine or in a CI process, e.g. via Bitbucket Pipelines or an AWS CodeDeploy VM instance).
I want a command like:
aws eb deploy my-app ./server.zip
That is, first upload my chosen zip file and then deploy it (doesn't have to be one command).
The official eb deploy does not seem to support this, nor have I been able to find any other method to do this.
Any ideas would be much appreciated :)
I don't think the eb CLI supports uploading a ZIP and updating an environment, but you can use a combination of AWS CLI commands:
1. Upload the ZIP to S3
2. Create an application version
3. Update the environment

For example:
aws s3 cp deploy.zip s3://mybucket/deploy.zip
aws elasticbeanstalk create-application-version --application-name my-app --version-label 12345 --source-bundle S3Bucket="mybucket",S3Key="deploy.zip"
aws elasticbeanstalk update-environment --application-name my-app --environment-name MyApp-env --version-label 12345
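Those three steps can be wrapped in a small script for a deploy pipeline. This is a sketch assuming the same app, environment, and bucket names as above (all placeholders), with a timestamp-based label so each deploy gets a unique version, since create-application-version rejects duplicate labels:

```shell
#!/bin/sh
# Sketch of the three-step deploy; APP/ENV/BUCKET/ZIP are placeholders.
APP="my-app"
ENV="MyApp-env"
BUCKET="mybucket"
ZIP="deploy.zip"

# create-application-version needs a unique label per version
make_label() {
  echo "v-$(date +%Y%m%d%H%M%S)"
}

deploy() {
  LABEL=$(make_label)
  # 1. Upload the ZIP to S3
  aws s3 cp "$ZIP" "s3://$BUCKET/$ZIP"
  # 2. Create an application version pointing at the uploaded bundle
  aws elasticbeanstalk create-application-version \
    --application-name "$APP" \
    --version-label "$LABEL" \
    --source-bundle S3Bucket="$BUCKET",S3Key="$ZIP"
  # 3. Point the environment at the new version
  aws elasticbeanstalk update-environment \
    --application-name "$APP" \
    --environment-name "$ENV" \
    --version-label "$LABEL"
}

# Only run when invoked with "deploy": ./release.sh deploy
if [ "${1:-}" = "deploy" ]; then
  deploy
fi
```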
I was looking for this answer as well. I was able to find some AWS documentation that lets you use the Elastic Beanstalk CLI configuration to upload the zip.
https://docs.aws.amazon.com/elasticbeanstalk/latest/dg/eb-cli3-configuration.html#eb-cli3-artifact
Deploying an artifact instead of the project folder
You can tell the EB CLI to deploy a ZIP file or WAR file that you generate as part of a separate build process by adding the following lines to .elasticbeanstalk/config.yml in your project folder.
deploy:
  artifact: path/to/buildartifact.zip
If you configure the EB CLI in your Git repository, and you don't commit the artifact to source, use the --staged option to deploy the latest build.
~/eb$ eb deploy --staged
I tested and it did work for me!
Current Situation:
We have CI and CD as below.
We have poll SCM set up in Jenkins: when new commits come in, Jenkins starts the build through a Jenkinsfile. The Jenkinsfile looks for the pom and builds the jar; once the jar is created, a Docker image is built from it with the help of a Dockerfile, and the image is pushed to our private Docker Hub. (CD ==> we then use Portainer to deploy the latest image to the AWS Docker Swarm cluster manually.)
We are trying to achieve CD in the following fashion:
Now I have to deploy the latest image from dockerhub to aws(docker swarm cluster) automatically through Jenkins like one click deployment.
How can we achieve this deployment automatically, like build and deploy, using Ansible or Portainer?
If so, please suggest references or steps to achieve this.
Is there a better approach than Ansible?
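One common pattern for the one-click step (a sketch; the host, service, and image names are hypothetical) is to have Jenkins run docker service update against a Swarm manager over SSH right after the image push:

```shell
# Run as a Jenkins pipeline step after the push to Docker Hub.
# --with-registry-auth forwards the private registry credentials
# to the swarm nodes so they can pull the new image.
ssh deploy@swarm-manager-host \
  "docker service update --with-registry-auth \
     --image myprivatehub/myapp:latest myapp_service"
```

Ansible can drive the same command with its docker_swarm_service module if you prefer playbooks over raw SSH.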
I am developing an Elastic Beanstalk app. It is a Scala web application, built with sbt. I want to deploy the resulting WAR from the command line to an existing environment.
All I can find is the eb CLI, which appears to require you to use git: http://docs.aws.amazon.com/elasticbeanstalk/latest/dg/eb-cli3-getting-started.html
Is there not a way to simply specify a WAR and environment name to perform the deployment?
What is the best workaround otherwise? I can upload to S3 from the command line and then use the web app to choose that file, but it's a bit more painful than I wanted.
You can use Elastic Beanstalk CLI (eb) instead of AWS CLI. Just run eb create to create a new environment and eb deploy to update your environment.
You can set specific artifact (your *.war file), by configuring the EB CLI (read: http://docs.aws.amazon.com/elasticbeanstalk/latest/dg/eb-cli3-configuration.html#eb-cli3-artifact):
You can tell the EB CLI to deploy a ZIP or WAR file that you generate
as part of a separate build process by adding the following lines to
.elasticbeanstalk/config.yml in your project folder.
deploy:
  artifact: path/to/buildartifact.zip
I found a way: use the aws CLI instead. First upload to S3 (I actually use s3cmd), then create an application version:
$ aws elasticbeanstalk create-application-version --application-name untaggeddb --version-label myLabel --source-bundle S3Bucket="bucketName",S3Key="key.war"
I believe the application version can then be deployed with update-environment also using the aws CLI.
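For completeness, that update-environment call would look something like this; the environment name is a guess based on the application name above, so substitute your own:

```shell
aws elasticbeanstalk update-environment \
  --environment-name untaggeddb-env \
  --version-label myLabel
```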