I have a React application that I am bundling with Webpack. The app relies on a MongoDB database and a Node/Express server that handles the backend (API requests, etc.).
I want to set up continuous integration/deployment (CI/CD), but am not sure where to start. Since my app's Git repo is on Bitbucket and I have had experience with AWS in the past, it would be good to set up CI/CD using these. How do I go about this?
You can use Jenkins to build your project from Bitbucket.
Make use of AWS CodePipeline and AWS CodeDeploy for continuous delivery on AWS.
Jenkins gives you the flexibility to work with any source control system, and has plugins for AWS CodePipeline.
From AWS CodePipeline, you can configure a stage to call a Jenkins build job.
I've been using this system in production for quite some time now, without any issues.
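For reference, a Jenkins build stage like that can be declared with the AWS CDK roughly as follows. This is only a sketch: the server URL, provider name, and Jenkins project name are placeholders, and the S3 source stage is a stand-in (Jenkins itself pulls the code from Bitbucket):

```typescript
import { Stack, StackProps } from 'aws-cdk-lib';
import { Construct } from 'constructs';
import * as s3 from 'aws-cdk-lib/aws-s3';
import * as codepipeline from 'aws-cdk-lib/aws-codepipeline';
import * as actions from 'aws-cdk-lib/aws-codepipeline-actions';

export class JenkinsPipelineStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    // Register the Jenkins server as a build provider for CodePipeline.
    const jenkinsProvider = new actions.JenkinsProvider(this, 'JenkinsProvider', {
      providerName: 'MyJenkins',                       // placeholder
      serverUrl: 'http://my-jenkins.example.com:8080', // placeholder
      version: '1',
    });

    const sourceBucket = new s3.Bucket(this, 'SourceBucket', { versioned: true });
    const sourceOutput = new codepipeline.Artifact();

    new codepipeline.Pipeline(this, 'Pipeline', {
      stages: [
        {
          stageName: 'Source',
          actions: [
            // Placeholder source stage; Jenkins pulls from Bitbucket itself.
            new actions.S3SourceAction({
              actionName: 'Source',
              bucket: sourceBucket,
              bucketKey: 'source.zip',
              output: sourceOutput,
            }),
          ],
        },
        {
          stageName: 'Build',
          actions: [
            // The stage that calls the Jenkins build job.
            new actions.JenkinsAction({
              actionName: 'JenkinsBuild',
              jenkinsProvider,
              projectName: 'MyProject', // the Jenkins job name, placeholder
              type: actions.JenkinsActionType.BUILD,
              inputs: [sourceOutput],
            }),
          ],
        },
      ],
    });
  }
}
```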
I want to deploy Next.js on AWS using the AWS CDK for a POC and was looking at options. The Next.js docs say that we can just create an instance and run npm run build && npm start, and it will start the service for us. However, this is not the most optimized way of deploying.
Vercel deploys this in the most optimized way possible.
How can I do the same with AWS? How can I serve the static assets and pages via the CloudFront CDN, and the server-side rendered pages and APIs via either Lambda or ECS? Is there a step-by-step guide I can follow to split out the build files accordingly?
Other options I explored
AWS Amplify: as it is a premium service, I feel doing all this myself would be a lot cheaper and give me more flexibility in CDK (I am not sure how Amplify works behind the scenes to deploy the Next.js assets on an S3 + CloudFront + Lambda stack).
Serverless Framework: there is a plugin to deploy Next.js, but I want to have full control over the deployment and don't want to depend on any external framework. I want to do it natively using the AWS CDK.
Any pointers to do this natively using AWS CDK would be helpful. Thanks.
Deploying Next.js as a serverless application requires a bunch of services when you don't want to pack the whole Next.js server into a single Lambda.
My current setup of AWS services to achieve this consists of 3 main resources:
CloudFront
This works as a serverless reverse proxy that routes the traffic from the Internet to S3 (JavaScript, prerendered pages) or Lambda (server-rendered pages).
When using the image optimization capabilities of Next.js you also need an extra service that provides the API for it.
S3
Since you don't want to invoke Lambdas just to serve static content, you need an S3 bucket where those files are stored and served from.
Lambda
The Lambdas are then used to serve the server-generated pages (SSR & API).
They contain a minimal version of the Next.js server (e.g. without the static files that are served from S3).
I built this setup with Terraform, so there is no native CDK solution available at this time.
But most of it can be translated to CDK fairly directly, since the model behind Terraform and CDK is much the same.
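As a starting point, a CDK translation of the three resources might look roughly like this. It is only a sketch: the asset path, handler name, runtime, and path pattern are assumptions about your build layout, not the Terraform module's actual configuration:

```typescript
import { Stack, StackProps } from 'aws-cdk-lib';
import { Construct } from 'constructs';
import * as s3 from 'aws-cdk-lib/aws-s3';
import * as lambda from 'aws-cdk-lib/aws-lambda';
import * as cloudfront from 'aws-cdk-lib/aws-cloudfront';
import * as origins from 'aws-cdk-lib/aws-cloudfront-origins';

export class NextJsStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    // S3: static build output (.next/static, public/) served without Lambda.
    const staticAssets = new s3.Bucket(this, 'StaticAssets');

    // Lambda: a minimal Next.js server for SSR pages and API routes.
    // 'dist/ssr' and 'server.handler' are assumptions about your bundling.
    const ssrFunction = new lambda.Function(this, 'SsrFunction', {
      runtime: lambda.Runtime.NODEJS_18_X,
      handler: 'server.handler',
      code: lambda.Code.fromAsset('dist/ssr'),
    });
    const ssrUrl = ssrFunction.addFunctionUrl({
      authType: lambda.FunctionUrlAuthType.NONE,
    });

    // CloudFront: the serverless reverse proxy in front of both origins.
    new cloudfront.Distribution(this, 'Cdn', {
      defaultBehavior: {
        origin: new origins.FunctionUrlOrigin(ssrUrl),
        allowedMethods: cloudfront.AllowedMethods.ALLOW_ALL,
        cachePolicy: cloudfront.CachePolicy.CACHING_DISABLED,
      },
      additionalBehaviors: {
        '/_next/static/*': { origin: new origins.S3Origin(staticAssets) },
      },
    });
  }
}
```

The image optimization API mentioned above would need an additional origin and behavior on top of this.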
Source code of the Terraform module is available on GitHub: https://github.com/milliHQ/terraform-aws-next-js
I am looking to integrate an enterprise Bitbucket Server with AWS CI/CD pipeline features.
I have tried creating a project within AWS CodeBuild but do not see any option for Bitbucket Enterprise.
If this is not possible, then what is the long route using API Gateway, webhooks, etc.?
AWS CodeBuild only supports Bitbucket Cloud. To integrate with a self-hosted Bitbucket instance, you will need to create an API Gateway + Lambda, and then add the gateway address as a webhook in the Bitbucket repo. The Lambda is then responsible for processing the incoming events from the Bitbucket server. There are two routes from here.
One way is to download the zip for the particular commit and upload it to an S3 bucket, then add S3 as a source trigger for the build project. You lose the ability to run any git-specific commands in that case, though, as the source is just a zip file containing a specific version of the files.
The second option is to pass the relevant info to CodeBuild by invoking it directly from Lambda, passing details like the commit ID, event type (PR or push), and branch as environment variables (see the sketch below). Based on this info, run a git clone in CodeBuild before running the other build steps. This way you have access to git-specific commands.
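A minimal sketch of that second route with the AWS SDK: the CodeBuild project name is a placeholder, and the payload fields assume Bitbucket Server's repo:refs_changed event shape, so verify them against your server version:

```typescript
import { CodeBuildClient, StartBuildCommand } from '@aws-sdk/client-codebuild';
import type { APIGatewayProxyEvent, APIGatewayProxyResult } from 'aws-lambda';

const codebuild = new CodeBuildClient({});

// Lambda behind API Gateway that receives Bitbucket Server webhook
// events and starts a CodeBuild build with the commit details attached.
export const handler = async (
  event: APIGatewayProxyEvent,
): Promise<APIGatewayProxyResult> => {
  const body = JSON.parse(event.body ?? '{}');
  // Assumed repo:refs_changed payload shape.
  const change = body.changes?.[0];

  await codebuild.send(
    new StartBuildCommand({
      projectName: 'my-bitbucket-build', // placeholder CodeBuild project
      environmentVariablesOverride: [
        { name: 'COMMIT_ID', value: change?.toHash ?? '', type: 'PLAINTEXT' },
        { name: 'BRANCH', value: change?.ref?.displayId ?? '', type: 'PLAINTEXT' },
        { name: 'EVENT', value: body.eventKey ?? 'push', type: 'PLAINTEXT' },
      ],
    }),
  );

  return { statusCode: 202, body: 'Build started' };
};
```

The buildspec can then read COMMIT_ID and BRANCH to run the git clone and checkout before the remaining build steps.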
Here is an example workflow from AWS (it is for CodePipeline, but you can adapt it for CodeBuild).
What I'm trying to do is create the following CI flow with standard AWS tools: run a build of a commit when a pull request in GitHub is created or updated, or run a build of any branch on demand. Very similar to what Codeship, Travis, and many other CI services offer.
Is it possible with CodeBuild + CodePipeline? I noticed that I have to specify an exact branch in CodePipeline and, unfortunately, could not find how to integrate GitHub pull requests into it. Maybe I overlooked it?
CodeBuild now directly supports building GitHub pull requests (without the Lambda intermediate step), if you're looking to simply run a build as part of the PR. For running more steps with CodePipeline as part of a PR, you'll still need to set up some scaffolding, as the other answers suggest.
https://aws.amazon.com/about-aws/whats-new/2017/09/aws-codebuild-now-supports-building-github-pull-requests/
CodePipeline does support basic, fully-managed integrations with both GitHub and CodeBuild, as listed in Product and Service Integrations with AWS CodePipeline. With these integrations, it is possible to use CodeBuild with CodePipeline to run a build of a commit when a commit is pushed to a branch on GitHub. See Use AWS CodePipeline with AWS CodeBuild to Run Builds for details on integrating CodeBuild with CodePipeline as a Build action provider, and see the Four-Stage Pipeline Tutorial for details on integrating GitHub with CodePipeline as a Source action provider.
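For illustration, here is roughly how those two integrations fit together in a CDK app. The repo owner/name, branch, and Secrets Manager secret name are placeholders; the build project assumes a buildspec.yml in the repo:

```typescript
import { SecretValue, Stack, StackProps } from 'aws-cdk-lib';
import { Construct } from 'constructs';
import * as codebuild from 'aws-cdk-lib/aws-codebuild';
import * as codepipeline from 'aws-cdk-lib/aws-codepipeline';
import * as actions from 'aws-cdk-lib/aws-codepipeline-actions';

export class GitHubBuildPipelineStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    const sourceOutput = new codepipeline.Artifact();

    new codepipeline.Pipeline(this, 'Pipeline', {
      stages: [
        {
          stageName: 'Source',
          actions: [
            // GitHub as the Source action provider; note the fixed branch.
            new actions.GitHubSourceAction({
              actionName: 'GitHub',
              owner: 'my-org', // placeholder
              repo: 'my-repo', // placeholder
              branch: 'main',
              oauthToken: SecretValue.secretsManager('github-token'), // placeholder secret
              output: sourceOutput,
            }),
          ],
        },
        {
          stageName: 'Build',
          actions: [
            // CodeBuild as the Build action provider.
            new actions.CodeBuildAction({
              actionName: 'Build',
              project: new codebuild.PipelineProject(this, 'BuildProject'),
              input: sourceOutput,
            }),
          ],
        },
      ],
    });
  }
}
```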
Currently, the pull request feature in GitHub is not supported by the official CodePipeline integration; you did not overlook anything. For an interesting AWS-ecosystem open-source project (not yet v1.0) that does support GitHub pull request integration (though not yet CodePipeline), you might want to check out LambCI.
It looks like this can be done somewhat manually by using Lambda and S3 - https://aws.amazon.com/blogs/devops/integrating-git-with-aws-codepipeline/
Webhooks notify a remote service by issuing an HTTP POST when a commit is pushed to the repository. AWS Lambda receives the HTTP POST through Amazon API Gateway, and then downloads a copy of the repository. It places a zipped copy of the repository into a versioned S3 bucket. AWS CodePipeline can then use the zip file in S3 as a source; the pipeline will be triggered whenever the Git repository is updated.
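The final step of that flow, dropping the zipped snapshot into the versioned bucket, is a single S3 call; a sketch with the AWS SDK, where the bucket and key names are placeholders that must match the pipeline's S3 source configuration:

```typescript
import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';

const s3 = new S3Client({});

// Upload the zipped repository snapshot to the versioned source bucket
// that CodePipeline watches; each new version triggers the pipeline.
export async function publishSnapshot(zip: Buffer): Promise<void> {
  await s3.send(
    new PutObjectCommand({
      Bucket: 'my-pipeline-source-bucket', // must have versioning enabled
      Key: 'repo/source.zip',              // the pipeline's S3 source key
      Body: zip,
    }),
  );
}
```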
You could try https://www.deploytoproduction.com for GitHub pull request build status integration with AWS CodeBuild. It is free for a single GitHub repository, with a subscription plan available for multiple repositories.
The service doesn't currently integrate with CodePipeline, but that is coming soon.
If you wanted to build something yourself, you could make a new integration on GitHub that uses the webhook functionality to trigger a Lambda function, which in turn triggers your CodeBuild jobs or pushes an artifact to S3 to start a CodePipeline.
Full disclosure: I am the author of this service.
I am a novice at Jenkins. My demo project is built on GitHub, and with AWS CodeDeploy I can run it successfully. If I use AWS CodePipeline without Jenkins, whatever changes in GitHub are automatically integrated and the project runs. Now I want to use Jenkins, so that the pipeline only proceeds if the code has built successfully. But when I add Jenkins to my AWS CodePipeline and integrate it with my Jenkins server, the process does not run; it just sits processing in the build stage. What is the error, or is it not integrated with Jenkins? What should I do? Kindly help me.
If your project is a simple single HTML page, then there is no need for a build provider.
If your project is based on Maven or Gradle, then Jenkins will build the job and generate the output artifact as a zip file stored in the Jenkins workspace. This output artifact is then taken as the input artifact for the next stage, usually deployment.
To use Jenkins as a build provider in AWS CodePipeline, you should use an IAM role to grant access between the Jenkins server and AWS CodePipeline (a sketch follows the list below).
Purpose of the IAM role:
The Jenkins server will get input artifact files from the source provider, such as an AWS S3 bucket or GitHub.
The Jenkins server will poll SCM based on the build trigger in your job.
After a successful build, the Jenkins server will store the output artifact as a zip in the Jenkins workspace, as mentioned earlier.
This output artifact is taken as input for the next stage; for example, the artifact might be deployed with AWS CodeDeploy.
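As a rough sketch in CDK, assuming Jenkins runs on an EC2 instance; the action list below is the set commonly documented for CodePipeline job workers, so verify it against the current docs:

```typescript
import * as iam from 'aws-cdk-lib/aws-iam';
import { Construct } from 'constructs';

// Role assumed by the EC2 instance running Jenkins, allowing the
// CodePipeline plugin on the server to poll for jobs and report results.
export function jenkinsWorkerRole(scope: Construct): iam.Role {
  const role = new iam.Role(scope, 'JenkinsWorkerRole', {
    assumedBy: new iam.ServicePrincipal('ec2.amazonaws.com'),
  });
  role.addToPolicy(
    new iam.PolicyStatement({
      actions: [
        'codepipeline:PollForJobs',
        'codepipeline:AcknowledgeJob',
        'codepipeline:GetJobDetails',
        'codepipeline:PutJobSuccessResult',
        'codepipeline:PutJobFailureResult',
      ],
      resources: ['*'], // scope this down in a real setup
    }),
  );
  return role;
}
```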
Thanks
I build artifacts with Jenkins builds on CloudBees, and for the dev and test environments (which are on RUN@cloud) the deployments are done from Jenkins.
However, for production deployments I need to download the artifact (via URL) on the production machine. Is there a way to set this up so that it does not ask for a CloudBees login?
If you don't want artifacts to be public, you need some credentials to access CloudBees Jenkins (this isn't a FOSS project, is it?).
You can use a Jenkins API token to authenticate, so you don't need to store your CloudBees password on the production server.
See https://[account].ci.cloudbees.com/user/[me@mycompany.com]/configure to retrieve the token; then you can access Jenkins using:
wget http://[me%40mycompany.com]:[token]@[account].ci.cloudbees.com/...