Creating a simple pipeline (CodeCommit repository)

I have created a simple deploy pipeline using Jenkins. I have created the CodeDeploy application, the S3 bucket, the Auto Scaling group, the AMI — everything listed in the docs. But it needs an appspec.yml. I have looked at the documentation for appspec.yml, and it's very confusing.
Is there any way to generate an appspec.yml? I am not even sure what its role is. I thought CodeDeploy would take the zip file out of the S3 bucket and deploy it to the Auto Scaling group.
Any help?

appspec.yml is the file that tells the CodeDeploy service what tasks it should perform with the code on your EC2 servers, so it needs to be built according to your workflow. The documentation and its examples will help you achieve what you want.

Is there any way to generate an appspec.yml?
You can't auto-generate the file. It must be custom-designed for your specific application; only you know what your application is, how it works, how it is configured, what its dependencies are, and so on.
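For a sense of what it looks like, here is a hedged sketch of a typical EC2 appspec.yml (the destination path and the script names are placeholders for whatever your application actually needs):

version: 0.0
os: linux
files:
  # Copy everything from the revision bundle (the zip in S3) into the web root.
  - source: /
    destination: /var/www/html
hooks:
  # Each hook runs one of your scripts at that point in the deployment.
  BeforeInstall:
    - location: scripts/install_dependencies.sh
      timeout: 300
      runas: root
  ApplicationStart:
    - location: scripts/start_server.sh
      timeout: 300
      runas: root

CodeDeploy does take the zip out of the S3 bucket and push it to the instances in your deployment group, as you expected; the appspec is how the agent on each instance knows where the files should go and which of your scripts to run at each lifecycle event.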

Related

AWS Lambda deployment through CodeDeploy

Simple question: I wrote a lambda function (Golang) which I usually deploy manually, through either the AWS CLI or the console.
Recently I moved the code repository to CodeCommit and I'm thinking about using the other dev services, too.
I guess I should write a buildspec.yml for CodeBuild to create the build zip on my behalf and upload it to S3.
What I can't find enough (and clear) documentation about is CodeDeploy.
The deployment action is extremely simple as I understand it: deploy to Lambda a zip stored in my S3 bucket. Yet the only official documentation I found revolves around CloudFormation, SAM, etc.
I'd prefer to keep my CI/CD as simple as possible. What do you suggest? Do I get it wrong? Can you please point me in the right direction?
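For what it's worth, the buildspec part can stay very simple. A sketch for a Go Lambda (the handler name "main", the runtime version, and the artifact name are assumptions to adapt):

version: 0.2
phases:
  install:
    runtime-versions:
      golang: latest
  build:
    commands:
      # Build a Linux binary named after the function's configured handler,
      # then zip it up as the deployment package.
      - GOOS=linux GOARCH=amd64 go build -o main main.go
      - zip function.zip main
artifacts:
  files:
    - function.zip

On the CodeDeploy side: its Lambda integration does not copy a zip anywhere; it shifts traffic between two published function versions, driven by its own AppSpec file. If that is more machinery than you want, having a pipeline action or build command call aws lambda update-function-code with the new zip is a common lightweight alternative.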

Setting up CodePipeline with Terraform

I am new to Terraform and building a CI setup. When I want to create a CodePipeline that is going to be connected to a GitHub repo, do I run specific commands inside my Terraform codebase that will reach out to AWS and create the CodePipeline config/instance for me? Or would I set this CodePipeline up manually inside AWS console and hook it up to Terraform after the fact?
do I run specific commands inside my Terraform codebase that will reach out to AWS and create the CodePipeline config/instance for me?
Yes. You use the aws_codepipeline resource, which will create a new pipeline in AWS when you run terraform apply.
Or would I set this CodePipeline up manually inside AWS console and hook it up to Terraform after the fact?
You can also import existing resources into Terraform's state with the terraform import command.
I see you submitted this eight months ago, so I am pretty sure you have your answer by now, but for those who come across this question while searching, here are my thoughts on it.
As most of you have researched, Terraform is infrastructure as code (IaC). As IaC, it needs to be executed somewhere: either locally or inside a pipeline. A pipeline typically consists of Docker containers that emulate a local environment and run commands to deploy your code for you. There is more to it than that, but the premise of how Terraform runs remains the same.
So, to the magic question: Terraform is code, and if you intend to use a pipeline (Jenkins, AWS, GitLab, and more), then you need a code repository to put all your code into. In this case, that is a repository where you can store your Terraform code so a pipeline can consume it when deploying. There are other reasons why you should use a code repository, but your question is directed at Terraform and its usage with a pipeline.
Now for the magnificent chicken-or-egg argument: when to create your pipeline and how to do it. To your original question, you could do both. You could store all your Terraform code in a repository (which I recommend), clone it down, and run Terraform locally to create your pipeline. This would be ideal for saving time and leveraging automation. Newbies: you will have to research Terraform state files, which you need to back up in some form once the pipeline has been deployed for you.
If you are not so comfortable with Terraform, the GUI in AWS is also fine, and you can easily configure it to hook your pipeline into GitHub to run jobs.
In both scenarios, you must set up Terraform and AWS either locally on your machine or within the pipeline to deploy your code. This article is pretty good and will give you a basic understanding of setting up Terraform.
Don't forget to configure AWS credentials on your local machine. For you newbies using a pipeline, you can leverage some of the pipeline links to get you started. Remember one thing: within AWS CodePipeline, you have to use IAM roles, not access keys. That will make more sense once you have gone through the first link. Please also go to YouTube and search for "Terraform for beginners in AWS"; various videos can provide a lot more substance to help you get started.
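As a concrete sketch of "Terraform running inside a pipeline", here is a minimal GitLab CI job (GitLab being just one of the pipeline options named above; the image tag and the assumption that AWS credentials arrive via protected CI/CD variables are mine):

image:
  name: hashicorp/terraform:light
  entrypoint: [""]
stages:
  - deploy
deploy_pipeline:
  stage: deploy
  script:
    # AWS credentials are expected as protected CI/CD variables, never in the repo.
    - terraform init
    - terraform apply -auto-approve

The same two commands are what you would run locally; the pipeline simply executes them in a disposable container instead of on your machine.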

How to automate code deployment from Bitbucket to an Amazon EC2 instance

I want to automate the process so that when I push my code to my private repository on Bitbucket, it is directly updated on my EC2 instances. From what I have gathered:
AWS CodeDeploy is used for that. What I have learned is:
Go to IAM, create the role, and attach the policies AmazonEC2RoleforAWSCodeDeploy and AWSCodeDeployRole.
Edit the trust relationship -> "Service": "codedeploy.amazonaws.com"
At the time of creation of the EC2 instance, attach the IAM role we just created.
But I am confused about the YAML code that is pasted into Advanced Details (at the time of instance creation).
I have gone through many blogs and YouTube tutorials but am getting confused about the right implementation.
Some tutorials use S3 for CodeDeploy. I am a little confused about the automation process.
My simple requirement is: when I update/push the code on my personal branch, it should be updated in a directory on my EC2 instance.
Any AWS or DevOps expert, kindly share the most efficient manner to achieve this task. I say "expert" because I need the best and optimum solution, not just any solution that achieves the goal.
Any help or suggestion is really appreciated. Pardon me if there are typos or grammatical errors in my query.
The basic steps would be:
1. Add the CodeDeploy add-on from Bitbucket's App Marketplace and configure it.
2. Add an IAM role that allows Bitbucket to first push code to an intermediate S3 bucket in your account.
3. Once you've configured the CodeDeploy settings from the Settings page of the Bitbucket console, you can perform a deployment to the deployment group from your branch and monitor it in the CodeDeploy console.
The blog below has clear and detailed steps for this:
https://aws.amazon.com/blogs/apn/announcing-atlassian-bitbucket-support-for-aws-codedeploy/
Two YAML files are involved here. The one pasted into Advanced Details at instance-creation time is EC2 user data, commonly used to install the CodeDeploy agent on boot. The file CodeDeploy itself needs is the AppSpec file, which tells CodeDeploy what actions to perform during a deployment.
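A hedged cloud-init sketch of that user data (the installer URL is region-specific, with us-east-1 shown, and the package names assume Amazon Linux):

#cloud-config
packages:
  - ruby
  - wget
runcmd:
  # Download and run the CodeDeploy agent installer for the instance's region.
  - cd /home/ec2-user
  - wget https://aws-codedeploy-us-east-1.s3.us-east-1.amazonaws.com/latest/install
  - chmod +x ./install
  - ./install auto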
The bare-minimum appspec.yml file you can use to deploy all the files from the source package:
version: 0.0
os: linux
files:
  - source: /
    destination: /var/www/html
This file is placed in the root of your source code.
Appspec file example:
https://docs.aws.amazon.com/codedeploy/latest/userguide/reference-appspec-file-example.html#appspec-file-example-server

Is it possible to use AWS CodePipeline with Lightsail?

I've been working on this all day and couldn't find the answer. So I'm asking you guys: is it possible to use AWS CodePipeline with AWS Lightsail?
My objective is to store the code inside CodeCommit and use CodeBuild, CodeDeploy, CodePipeline and S3 to create a Continuous Deployment inside a Lightsail instance.
Those are the steps I think I have to follow to accomplish the task:
[x] setup a Lightsail instance
[x] create an IAM user and set permissions
[x] transfer my repository to CodeCommit
[x] create an S3 bucket to hold the build artifacts
[x] create a CodeBuild project to build the artifacts
[x] create a buildspec.yml file with my build steps
[ ] create a CodeDeploy project to deploy my application
[ ] create a CodePipeline project to trigger the build when I commit to a certain branch
As you can see, I'm almost there. But I couldn't find any way to use my Lightsail instance with CodeDeploy. So, my question is: is it possible? Is there some limitation? Did I miss something really basic? Is there any other way to do CD with Lightsail? Sorry, I'm getting a little crazy right here ahhaha.
Today, 08/16/2017, it's not possible to integrate them.
I asked the same question on AWS forums and they replied that those technologies are not integrated yet since they are separated from each other.
Well I guess I'll have to find another way.
I'm not a total expert here, but I think the way to do it would be with a custom script in CodeBuild, rather than with CodeDeploy.
CodeDeploy has a lot of custom stuff going on to support rollbacks and that sort of advanced functionality (which means you have to install the agent on your target server, etc.).
CodeBuild is just made for running scripts, so I think it'd be reasonable to add a deploy script (that runs after your tests) that connects to your Lightsail instance via SSH and deploys any changed files (similar to how you'd do it in open source using Travis CI, etc.).
Specifically, I've used the dploy package on npm to do the actual SFTP upload before. It's Git-aware, so it only uploads changes since the last revision (but you could just rsync if you didn't care about that).
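A rough buildspec.yml sketch of that approach (assuming a Node-style project, that the build image has ssh and rsync available, and that the private key and the host details below — all placeholders — were provisioned to the build container out of band, e.g. from Secrets Manager):

version: 0.2
phases:
  build:
    commands:
      - npm install
      - npm test
      # If the tests above fail, the phase stops and the deploy below never runs.
      - chmod 600 deploy_key
      - rsync -az --exclude '.git' -e "ssh -i deploy_key -o StrictHostKeyChecking=no" ./ ubuntu@YOUR_LIGHTSAIL_IP:/var/www/html/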
I recently had the same challenge and got it working.
It is necessary to register the Lightsail instance as an on-premises instance with CodeDeploy. On the instance itself, the CodeDeploy agent needs to be installed and configured.
I have written a post about how to set this up on my blog.
https://scratchpad.blog/howto/how-to-use-codedeploy-with-aws-lightsail/
Following those steps, you can register the Lightsail instance as an on-premises instance and configure CodeDeploy to deploy to it.
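For reference, the agent's on-premises configuration lives in /etc/codedeploy-agent/conf/codedeploy.onpremises.yml and looks roughly like this (the credentials, account ID, user name, and region are placeholders for the IAM user you create for the instance):

---
aws_access_key_id: YOUR_ACCESS_KEY_ID
aws_secret_access_key: YOUR_SECRET_ACCESS_KEY
iam_user_arn: arn:aws:iam::111122223333:user/LightsailCodeDeploy
region: us-east-1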

Deploying an Angular 2 app built with webpack using Bitbucket

I have searched high and low for an answer to this question but have been unable to find one.
I am building an Angular 2 app that I would like hosted in an S3 bucket. There will (possibly) be an EC2 backend, but that's another story. Ideally, I would like to be able to check my code into Bitbucket and, by some magic that eludes me, have S3, or EC2, or whatever notice via a hook that the source has changed. Of course the source would have to be built using webpack and the distributables deployed correctly.
Now this seems like a pretty straightforward request, but I can find no solution except something pertaining to WebDeploy, which I shall investigate right now.
Any ideas anyone?
Good news: AWS Lambda was created for this.
You need to create the following scenario and code to achieve your requirement:
1- Create a Lambda function. This function should do the following steps:
1-1- Clone your latest code from GitHub or Bitbucket.
1-2- Install grunt or another builder for your Angular app.
1-3- Install the node modules.
1-4- Build your Angular app.
1-5- Copy the new build to your S3 bucket.
1-6- Finish.
2- Create an AWS API Gateway with one resource and one method pointing to your Lambda function.
3- Go to your GitHub or Bitbucket settings and add a webhook with your API Gateway URL.
4- Enjoy life with AWS.
;)
Benefits:
1- You are only charged when there is a new build.
2- No need for any machine or server (EC2).
3- You only maintain one function on Lambda.
For more info:
https://aws.amazon.com/lambda/
https://aws.amazon.com/api-gateway/
S3 isn't going to listen for Git hooks and fetch, build, and deploy your code. Bitbucket isn't going to build and deploy your code to S3. What you need is a service that sits between Bitbucket and S3, is triggered by a Git hook, fetches from Git, builds, and then deploys your code to S3. You need to search for Continuous Integration/Continuous Deployment (CI/CD) services, which are designed to do this sort of thing.
AWS has CodePipeline. You could set up your own Jenkins or TeamCity server. Or you could look into a service like CodeShip. Those are just a few of the many services out there that could accomplish this task. I think any of these services will require a bit of scripting on your part to get them to perform the actual webpack build and copy to S3.
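For instance, if you went the CodePipeline/CodeBuild route, that scripting would typically be a buildspec.yml along these lines (the bucket name and the ./dist output directory are assumptions, and the CodeBuild role is assumed to be allowed to write to the bucket):

version: 0.2
phases:
  install:
    commands:
      - npm install
  build:
    commands:
      # Produce the production bundle; webpack's output directory is assumed to be ./dist.
      - npm run build
      # Push the built site to the bucket that serves the app.
      - aws s3 sync ./dist s3://your-angular-app-bucket --delete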