I am trying to implement a blue-green deployment using a CI/CD pipeline. I am following this guide:
https://aws-quickstart.s3.amazonaws.com/quickstart-codepipeline-bluegreen-deployment/doc/blue-green-deployments-to-aws-elastic-beanstalk-on-the-aws-cloud.pdf
On page 16, the guide says to launch the CloudFormation template and then enter some parameters after uploading it. I am currently using the AWS sample PHP application, which is zipped and stored in one of my S3 buckets. The parameters are as follows:
Here elb-blue-green-sample is the bucket where the zipped sample PHP application is stored, and the Beanstalk Source Stage S3 Bucket key points to the PHP zip file in that bucket. I am leaving every other parameter blank, since they are optional and CloudFormation will create them on its own.
When I launch this template, it shows the following error and then rolls back:
If somebody has implemented this or has experience with it, please help; I am stuck here and do not know where I am going wrong.
Thanks in Advance
In your case, the Beanstalk Source Stage S3 Bucket key will be just:
php-v1 (1).zip
I have created a simple deploy pipeline using Jenkins: the CodeDeploy application, the S3 bucket, the Auto Scaling group, the AMI; everything listed in the docs. But it needs an appspec.yml. I have looked at the documentation for appspec.yml, and it's very confusing.
Is there any way to generate an appspec.yml? I am not even sure what its role is. I thought CodeDeploy would take the zip file out of the S3 bucket and deploy it to the Auto Scaling group.
Any help?
appspec.yml is the file that tells the CodeDeploy service what tasks it should perform with the code on your EC2 servers, so it needs to be written according to your workflow. The documentation and its examples will help you achieve what you want.
Is there any way to generate an appspec.yml?
You can't auto-generate the file. It must be custom designed for your specific application, and only you know what your application is, how it works, how it is configured, and what its dependencies are.
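As a shape to start from, a slightly fuller appspec.yml with lifecycle hooks might look like this; the destination path and the script names are hypothetical placeholders for whatever your own workflow needs:

```yaml
version: 0.0
os: linux
files:
  - source: /
    destination: /var/www/myapp               # hypothetical target directory
hooks:
  BeforeInstall:
    - location: scripts/install_dependencies.sh   # hypothetical script in your repo
      timeout: 300
      runas: root
  ApplicationStart:
    - location: scripts/start_server.sh           # hypothetical script in your repo
      timeout: 300
      runas: root
```

Each hook points at a script bundled in your revision, and CodeDeploy runs it at that stage of the deployment on each instance.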
Simple question: I wrote a lambda function (Golang) which I usually deploy manually, through either the AWS CLI or the console.
Recently I moved the code repository to CodeCommit and I'm thinking about using the other dev services, too.
I guess I should write a buildspec.yml for CodeBuild to create the build zip on my behalf and upload it to S3.
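Something like this is what I have in mind, if I understand buildspec.yml correctly (the bucket name is a placeholder, and the Go runtime-versions entry is my assumption):

```yaml
version: 0.2

phases:
  install:
    runtime-versions:
      golang: latest          # or pin a specific version
  build:
    commands:
      - go build -o main .
      - zip function.zip main
  post_build:
    commands:
      # bucket name is a placeholder
      - aws s3 cp function.zip s3://my-lambda-artifacts/function.zip
artifacts:
  files:
    - function.zip
```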
What I can't find enough (and clear) documentation about is CodeDeploy.
The deployment action is extremely simple as I understand it: deploy a zip stored in my S3 bucket to Lambda. However, the only official documentation I found revolves around CloudFormation, SAM, etc.
I'd prefer to keep my CI/CD as simple as possible. What do you suggest? Am I getting it wrong? Can you please point me in the right direction?
I want to automate the process so that when I push code to my private repository on Bitbucket, it is directly updated on my EC2 instances. From what I have learned so far, AWS CodeDeploy is used for that:
1. Go to IAM, create a role, and attach the policies AmazonEC2RoleForCodeDeploy and AWSCodeDeployRole.
2. Edit the trust relationship -> "Service": "codedeploy.amazonaws.com"
3. When creating the EC2 instance, attach the IAM role we just created.
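For reference, the trust relationship mentioned above would be a JSON policy document on the CodeDeploy service role, along these lines:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "codedeploy.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}
```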
But I am confused about the YAML code that is pasted into Advanced Details (at the time of instance creation).
I have gone through many blogs and YouTube tutorials but am still confused about the right implementation.
Some tutorials use S3 for CodeDeploy, which leaves me a little confused about the automation process.
My simple requirement: when I update/push code to my personal branch, it should be updated in a directory on my EC2 instance.
Could any AWS or DevOps expert kindly share the most efficient way to achieve this task? I mention "expert" because I need the best, optimal solution, not just any solution that achieves the goal.
Any help or suggestion is really appreciated. Pardon me if there are typos or grammatical errors in my query.
The basic steps would be:
1. Add the CodeDeploy add-on from Bitbucket's App Marketplace and configure it.
2. Add an IAM role that allows Bitbucket to first push code to an intermediate S3 bucket in your account.
3. Once you've configured the CodeDeploy settings from the Settings page of the Bitbucket console, you can perform a deployment to the deployment group from your branch and monitor it in the CodeDeploy console.
The blog below has clear and detailed steps for this:
https://aws.amazon.com/blogs/apn/announcing-atlassian-bitbucket-support-for-aws-codedeploy/
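As for the script you paste into Advanced Details (user data): its usual job is just to install the CodeDeploy agent when the instance launches. On Amazon Linux that is roughly the following (the region in the bucket name is an assumption; substitute your own):

```bash
#!/bin/bash
# Install the CodeDeploy agent at launch (Amazon Linux user data)
yum update -y
yum install -y ruby wget
cd /home/ec2-user
# The agent installer lives in a per-region S3 bucket; replace us-east-1 as needed
wget https://aws-codedeploy-us-east-1.s3.us-east-1.amazonaws.com/latest/install
chmod +x ./install
./install auto
```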
The YAML file you are confused about is the AppSpec file, which tells CodeDeploy what actions to perform during a deployment.
The bare minimum appspec.yml, which deploys all the files from the source package, is:
version: 0.0
os: linux
files:
  - source: /
    destination: /var/www/html
This file is placed in the root of your source code.
Appspec file example:
https://docs.aws.amazon.com/codedeploy/latest/userguide/reference-appspec-file-example.html#appspec-file-example-server
I have followed this guide on how to separately package and deploy a .NET Core serverless application on AWS.
I'm trying to deploy to different environments, which I can achieve by parameterising the CloudFormation template and creating different stages in the Azure DevOps release pipeline, each stage passing different parameters to the template. However, there is one aspect that's confusing me a little bit. From the guide, an AWS Lambda .NET Core Deployment task is added to create a deployment package in the CI part. In that task, an S3 bucket name is specified, which is where the compiled code will be uploaded. However, this means that one single S3 bucket will contain uploaded code for all environments. What if I wanted to upload the code to a different S3 bucket, depending on the environment being deployed to? Or is it normal practice to have just one S3 bucket in this scenario?
I thought of creating the bucket at the release pipeline stages, but the task to package the serverless application (during CI) requires a bucket name to be supplied.
Am I going the wrong way about this?
I have a site in an S3 bucket, configured for web access, for which I run an aws s3 sync command every time I push to a specific Git repository (I'm using GitLab at the moment).
So if I push to the stable branch, a GitLab runner performs the npm run build command to build the site, and then aws s3 sync to synchronize it to a specific bucket.
I want to migrate to CodeCommit and use pure AWS tools to do the same.
So far I have been able to successfully set up the repository and create a CodeBuild project for building the artifact, and the artifact is being stored (not deployed) in an S3 bucket. The difference is that I can't get it to deploy to the root folder of the bucket instead of a subfolder; it seems the process is not made for that. I need it in the root folder because of how the web access is configured.
For the deployment process, I was taking a look at CodeDeploy, but it doesn't actually let me deploy to an S3 bucket; it only uses the bucket as an intermediary for deployment to an EC2 instance. So far I get the feeling CodeDeploy is useful only for deployments involving EC2.
This tutorial, which has a similar requirement to mine, uses CodePipeline and CodeBuild, but the deployment step is actually an aws s3 sync command (the same as I was doing on GitLab), and the actual deployment step in CodePipeline is disabled.
I was looking for a solution that uses AWS features made for this specific purpose, but I can't find any.
I'm also aware of LambCI, but to me it looks like what CodePipeline/CodeBuild is already doing: storing artifacts (not deploying to the root folder of the bucket). Plus, I'm looking for an option that doesn't require me to learn or deploy new configuration files (outside AWS config files).
Is this possible with the current state of AWS features?
Today AWS announced a new feature: the ability to target S3 in the deployment stage of CodePipeline. The announcement is here, and the documentation contains a tutorial, available here.
Using your CodeBuild/CodePipeline approach, you should now be able to choose S3 as the deployment provider in the deployment stage rather than performing the sync in your build script. To configure the action, you provide an S3 bucket name, specify whether to extract the contents of the artifact zip, and, if so, provide an optional path for the extraction. This lets you deploy your content directly to the root of a bucket by omitting the path.
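In CloudFormation terms, that deploy stage would look roughly like this (the bucket and artifact names are placeholders; Extract: 'true' with no ObjectKey unpacks the artifact at the bucket root):

```yaml
- Name: Deploy
  Actions:
    - Name: S3Deploy
      ActionTypeId:
        Category: Deploy
        Owner: AWS
        Provider: S3
        Version: '1'
      InputArtifacts:
        - Name: BuildOutput          # placeholder artifact name
      Configuration:
        BucketName: my-website-bucket   # placeholder bucket
        Extract: 'true'
```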
I was dealing with a similar issue, and as far as I was able to find out, there was no service suitable for deploying an app to S3.
AWS CodeDeploy is indeed for deploying code that runs on servers.
My solution was to use CodePipeline with three stages:
- Source, which takes the source code from AWS CodeCommit
- Build, with AWS CodeBuild
- A custom Lambda function which, after a successful build, takes the artifact from the S3 artifact store, unzips it, and copies the files to my S3 website host.
I used this AWS Lambda function from SeamusJ: https://github.com/SeamusJ/deploy-build-to-s3
Several changes had to be made: I used node-unzip-2 instead of unzip-stream for unzipping the artifact from S3, and I also had to change the ACLs in the website.ts file.
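The Lambda in that repo is Node.js; for anyone preferring Python, the same unzip-and-copy idea can be sketched with boto3's S3 client and the stdlib zipfile module (the function name and structure here are mine, not taken from that repo):

```python
import io
import zipfile

def deploy_zip(s3, artifact_bucket, artifact_key, site_bucket):
    """Copy every file inside an S3-hosted zip artifact to the root of a
    website bucket. `s3` is a boto3 S3 client (or anything compatible that
    implements get_object/put_object)."""
    obj = s3.get_object(Bucket=artifact_bucket, Key=artifact_key)
    data = io.BytesIO(obj["Body"].read())
    uploaded = []
    with zipfile.ZipFile(data) as zf:
        for name in zf.namelist():
            if name.endswith("/"):  # skip directory entries
                continue
            s3.put_object(Bucket=site_bucket, Key=name, Body=zf.read(name))
            uploaded.append(name)
    return uploaded
```

In a real CodePipeline invoke action, the handler would additionally read the artifact location from the job event and report back with put_job_success_result or put_job_failure_result on the codepipeline client.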
Uploading from CodeBuild is currently the best solution available.
There are some suggestions on how to orchestrate this deployment via CodePipeline in this answer.