How to automatically replicate CodeCommit repo to Bitbucket on-prem

I want to push changes to CodeCommit and have an on-prem Bitbucket repo automatically updated as a result.
I've seen a few guides/answers on how to replicate from Bitbucket to CodeCommit, but is there any way to do the opposite?

You could use EventBridge to monitor CodeCommit repository state change events and trigger an appropriate AWS service to push the data to the on-premises server.
Some possibilities (see the sketch after this list):
A Lambda function that connects to the Bitbucket REST API
A CodeBuild project that does not actually build anything, but clones the repository and runs a script that pushes it to Bitbucket
DataSync, if it is possible to copy the data directly at the file-system level (for example, over SMB)
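As a minimal sketch of the EventBridge wiring (Python/boto3; the rule name and Lambda ARN are placeholders for whichever mirroring service you choose):

```python
import json
import boto3

events = boto3.client("events")

# Match CodeCommit pushes (branch created or updated); add a "resources"
# filter to narrow this to a single repository.
pattern = {
    "source": ["aws.codecommit"],
    "detail-type": ["CodeCommit Repository State Change"],
    "detail": {"event": ["referenceCreated", "referenceUpdated"]},
}

events.put_rule(Name="mirror-to-bitbucket", EventPattern=json.dumps(pattern))
events.put_targets(
    Rule="mirror-to-bitbucket",
    Targets=[{
        "Id": "mirror-fn",
        # Placeholder ARN for the Lambda (or other service) doing the mirroring.
        "Arn": "arn:aws:lambda:us-east-1:123456789012:function:mirror-to-bitbucket",
    }],
)
```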

Related

configure bitbucket pipeline to detect in which folder changes were made

I am managing my AWS Lambda code changes with Bitbucket Pipelines.
I have a Bitbucket Pipelines file that runs a Lambda deploy file; the deploy file deploys the new Lambda code to AWS.
At the moment I have a lot of Lambdas and a lot of repos for them, and I want to unite all the Lambda files into a single repo.
I thought I could have Bitbucket Pipelines detect in which folder a change was made,
and then pass the folder name as a variable to the Lambda deploy file.
I tried to pass the folder in which the change occurred as an argument to the lambda-deploy file, but I couldn't find a way to do it.
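One possible sketch of that (nothing Bitbucket-specific: it just diffs the commit that triggered the build inside the pipeline step; the lambda-deploy invocation and its --folder flag are hypothetical stand-ins for the asker's deploy file):

```python
import subprocess

def changed_folders():
    # List files touched by the latest commit, then reduce them to their
    # top-level folders (assuming one folder per Lambda).
    out = subprocess.run(
        ["git", "diff", "--name-only", "HEAD~1", "HEAD"],
        capture_output=True, text=True, check=True,
    ).stdout
    return sorted({path.split("/")[0] for path in out.splitlines() if "/" in path})

for folder in changed_folders():
    # Hypothetical invocation of the asker's deploy file, once per changed folder.
    subprocess.run(["python", "lambda-deploy.py", "--folder", folder], check=True)
```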

How to integrate Azure Repo with AWS CodeCommit

I want to implement CI/CD in AWS CodeCommit.
I know it's possible to manually kickstart the process once the code reaches CodeCommit. But I am using an Azure DevOps repo as my source code repo and want to automate the process.
The deployment is done using AWS SAM. I am looking for a setup where, when I push code to the Azure repo, it reaches AWS CodeCommit and the CI/CD runs without any further manual intervention.
Is there any way to do that?
Azure Repos and CodeCommit are both standard Git repositories, and Git allows you to specify multiple remotes. This is useful if you are maintaining a mirror or, as in your use case, need to do something in different environments.
You can read about setting multiple remotes here (provided by github; even though you’re not using github the process and commands should be the same).
Once you have your multiple remotes set up (a sketch is below), you can configure your CI/CD pipeline to deploy your SAM template on push: when you push your code changes, they will be sent to both your Azure repo and your CodeCommit repo, and the CI/CD pipeline that is monitoring your CodeCommit repo will see the change and kick off its execution.
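One way to wire that up, sketched here with plain git calls via Python's subprocess (both URLs are placeholders): keep origin fetching from Azure, and register both repos as push URLs so a single git push origin updates them together.

```python
import subprocess

# Placeholder remote URLs -- substitute your own repositories.
AZURE_URL = "https://dev.azure.com/myorg/myproject/_git/myrepo"
CODECOMMIT_URL = "https://git-codecommit.us-east-1.amazonaws.com/v1/repos/myrepo"

def git(*args):
    subprocess.run(["git", *args], check=True)

# Fetch from Azure as usual.
git("remote", "set-url", "origin", AZURE_URL)
# Adding explicit push URLs means "git push origin" sends to both remotes.
git("remote", "set-url", "--add", "--push", "origin", AZURE_URL)
git("remote", "set-url", "--add", "--push", "origin", CODECOMMIT_URL)
```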
It's worth pointing out that you'll need to properly set up and configure your CI/CD pipeline. AWS provides a number of services to support this, including AWS CodePipeline, AWS CodeBuild, and AWS CodeDeploy.

build and push docker image to AWS ECR using lambda

Is it possible to automate building a Docker image from code committed to GitHub (no tests involved) and then pushing it to AWS ECR using a Lambda function?
You cannot do it with Lambda alone, as Lambda is not a suitable execution environment for the Docker daemon (which is necessary to build the images). However, you can use Lambda + SNS to trigger an endpoint that points to a service you developed, hosted on EC2, which runs the docker build command after a git clone (you can use something similar to Python's fabfile.org, or any framework that allows you to execute server commands).
You can certainly extend this idea, perhaps bringing the EC2 build machine up from an AMI that automates all of this.
The big point here is that you don't really have control over what's provisioned in Lambda, so you need EC2. A sketch of the EC2-side build script is below.
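A minimal sketch of that EC2-side build script (Python; the repository URL, image URI, and region are placeholders):

```python
import base64
import subprocess
import boto3

REPO_URL = "https://github.com/example/app.git"  # placeholder
IMAGE = "123456789012.dkr.ecr.us-east-1.amazonaws.com/app:latest"  # placeholder

def build_and_push():
    # Fetch the latest code and build the image with the local docker daemon.
    subprocess.run(["git", "clone", "--depth", "1", REPO_URL, "app"], check=True)
    subprocess.run(["docker", "build", "-t", IMAGE, "app"], check=True)

    # Authenticate the docker daemon against ECR, then push.
    ecr = boto3.client("ecr", region_name="us-east-1")
    auth = ecr.get_authorization_token()["authorizationData"][0]
    user, password = base64.b64decode(auth["authorizationToken"]).decode().split(":", 1)
    subprocess.run(
        ["docker", "login", "--username", user, "--password", password,
         auth["proxyEndpoint"]],
        check=True,
    )
    subprocess.run(["docker", "push", IMAGE], check=True)

if __name__ == "__main__":
    build_and_push()
```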

CodeDeploy to S3

I have a site in an S3 bucket, configured for web access, for which I run an aws s3 sync command every time I push to a specific git repository (I'm using GitLab at the moment).
So if I push to the stable branch, a GitLab runner performs the npm start build command for building the site, and then aws s3 sync to synchronize to a specific bucket.
I want to migrate to CodeCommit and use pure AWS tools to do the same.
So far I have been able to successfully set up the repository and create a CodeBuild project for building the artifact, and the artifact is being stored (not deployed) in an S3 bucket. The difference is that I can't get it deployed to the root folder of the bucket instead of a subfolder; it seems the process is not made for that. I need it in the root folder because of how web access is configured.
For the deployment process I was taking a look at CodeDeploy, but it doesn't actually let me deploy to an S3 bucket; it only uses the bucket as an intermediary for deployment to an EC2 instance. So far I get the feeling CodeDeploy is useful only for deployments involving EC2.
This tutorial, with a similar requirement to mine, uses CodePipeline and CodeBuild, but the deployment step is actually an aws s3 sync command (the same as I was doing on GitLab), and the actual deployment step on CodePipeline is disabled.
I was looking into a solution which involves using AWS features made for this specific purpose, but I can't find any.
I'm also aware of LambCI, but to me it looks like what CodePipeline / CodeBuild is doing: storing artifacts, not deploying to the root folder of the bucket. Plus, I'm looking for an option which doesn't require me to learn or deploy new configuration files (outside AWS config files).
Is this possible with the current state of AWS features?
AWS has just announced, as a new feature, the ability to target S3 in the deployment stage of CodePipeline. The announcement is here, and the documentation contains a tutorial available here.
Using your CodeBuild/CodePipeline approach, you should now be able to choose S3 as the deployment provider in the deployment stage rather than performing the sync in your build script. To configure the phase, you provide an S3 bucket name, specify whether to extract the contents of the artifact zip, and if so provide an optional path for the extraction. This should allow you to deploy your content directly to the root of a bucket by omitting the path.
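As a rough sketch of what the deploy stage in the pipeline definition looks like with the new provider (the bucket and artifact names are placeholders):

```python
# Fragment of a CodePipeline definition (as passed to boto3's
# create_pipeline/update_pipeline); only the deploy stage is shown.
deploy_stage = {
    "name": "Deploy",
    "actions": [{
        "name": "DeployToS3",
        "actionTypeId": {
            "category": "Deploy",
            "owner": "AWS",
            "provider": "S3",
            "version": "1",
        },
        "inputArtifacts": [{"name": "BuildOutput"}],  # placeholder artifact name
        "configuration": {
            "BucketName": "my-website-bucket",  # placeholder bucket
            # Unzip the artifact; with no ObjectKey path set, the files
            # land at the root of the bucket.
            "Extract": "true",
        },
    }],
}
```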
I was dealing with a similar issue, and as far as I was able to find out, there is no service purpose-built for deploying an app to S3.
AWS CodeDeploy is indeed for deploying code that runs as a server.
My solution was to use CodePipeline with three stages:
A Source stage which takes the source code from AWS CodeCommit
A Build stage with AWS CodeBuild
A custom Lambda function which, after a successful build, takes the artifact from the S3 artifact store, unzips it, and copies the files to my S3 website host (a sketch of such a function is below)
I used this AWS Lambda function from SeamusJ: https://github.com/SeamusJ/deploy-build-to-s3
Several changes had to be made: I used node-unzip-2 instead of unzip-stream for unzipping the artifact from S3.
I also had to change the ACLs in the website.ts file.
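The linked function is written in Node; as a rough Python equivalent of what such a Lambda does (the website bucket name is a placeholder):

```python
import io
import mimetypes
import zipfile
import boto3

s3 = boto3.client("s3")
codepipeline = boto3.client("codepipeline")

SITE_BUCKET = "my-website-bucket"  # placeholder

def handler(event, context):
    job = event["CodePipeline.job"]
    location = job["data"]["inputArtifacts"][0]["location"]["s3Location"]
    try:
        # Fetch the zipped build artifact from the pipeline's artifact store.
        obj = s3.get_object(Bucket=location["bucketName"], Key=location["objectKey"])
        with zipfile.ZipFile(io.BytesIO(obj["Body"].read())) as archive:
            for name in archive.namelist():
                if name.endswith("/"):
                    continue  # skip directory entries
                # Copy each file to the root of the website bucket with a
                # guessed content type and a public-read ACL for web hosting.
                content_type = mimetypes.guess_type(name)[0] or "binary/octet-stream"
                s3.put_object(Bucket=SITE_BUCKET, Key=name, Body=archive.read(name),
                              ContentType=content_type, ACL="public-read")
        codepipeline.put_job_success_result(jobId=job["id"])
    except Exception as exc:
        # Report failure back so the pipeline shows the stage as failed.
        codepipeline.put_job_failure_result(
            jobId=job["id"],
            failureDetails={"type": "JobFailed", "message": str(exc)})
```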
Uploading from CodeBuild is currently the best solution available.
There are some suggestions on how to orchestrate this deployment via CodePipeline in this answer.

Deploying an Angular 2 app built with webpack using Bitbucket

I have searched high and low for an answer to this question but have been unable to find one.
I am building an Angular 2 app that I would like hosted in an S3 bucket. There will (possibly) be an EC2 backend, but that's another story. Ideally, I would like to be able to check my code into Bitbucket, and by some magic that eludes me, have S3, or EC2, or whatever notice via a hook, for instance, that the source has changed. Of course, the source would have to be built using webpack and the distributables deployed correctly.
Now this seems like a pretty straightforward request, but I can find no solution except something pertaining to WebDeploy, which I shall investigate right now.
Any ideas anyone?
Good news: AWS Lambda was created for you.
You need to create the following scenario and code to achieve your requirement.
1- Create a Lambda function; this function should do the following steps (a sketch appears after the links below):
1-1- Clone your latest code from GitHub or Bitbucket.
1-2- Install grunt or another builder for your Angular app.
1-3- Install the node modules.
1-4- Build your Angular app.
1-5- Copy the new build to your S3 bucket.
1-6- Finish.
2- Create an AWS API Gateway with one resource and one method pointing to your Lambda function.
3- Go to your GitHub or Bitbucket settings and add a webhook pointing at your API Gateway.
4- Enjoy life with AWS.
;)
Benefits:
1- You are only charged when you have a new build.
2- No need for any machine or server (EC2).
3- You only maintain one function on Lambda.
for more info:
https://aws.amazon.com/lambda/
https://aws.amazon.com/api-gateway/
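A very rough sketch of what step 1 might look like in Python (the repo URL and bucket are placeholders; note that git and node would have to be bundled into the deployment package or a layer, and the clone, install, and build all have to fit within Lambda's /tmp space and timeout):

```python
import mimetypes
import os
import shutil
import subprocess
import boto3

REPO_URL = "https://bitbucket.org/example/angular-app.git"  # placeholder
BUCKET = "my-angular-site"                                  # placeholder
s3 = boto3.client("s3")

def handler(event, context):
    # Lambda only allows writes under /tmp.
    workdir = "/tmp/app"
    shutil.rmtree(workdir, ignore_errors=True)
    subprocess.run(["git", "clone", "--depth", "1", REPO_URL, workdir], check=True)
    subprocess.run(["npm", "install"], cwd=workdir, check=True)
    subprocess.run(["npm", "run", "build"], cwd=workdir, check=True)

    # Upload the webpack output to S3 with a best-guess content type.
    dist = os.path.join(workdir, "dist")
    for root, _, files in os.walk(dist):
        for name in files:
            path = os.path.join(root, name)
            key = os.path.relpath(path, dist)
            content_type = mimetypes.guess_type(name)[0] or "binary/octet-stream"
            s3.upload_file(path, BUCKET, key,
                           ExtraArgs={"ContentType": content_type})
    return {"statusCode": 200, "body": "deployed"}
```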
S3 isn't going to listen for Git hooks and fetch, build, and deploy your code, and Bitbucket isn't going to build and deploy your code to S3. What you need is a service that sits in between Bitbucket and S3, is triggered by a Git hook, fetches from Git, builds, and then deploys your code to S3. You should search for Continuous Integration / Continuous Deployment services, which are designed to do this sort of thing.
AWS has CodePipeline. You could set up your own Jenkins or TeamCity server. Or you could look into a service like CodeShip. Those are just a few of the many services out there that could accomplish this task. I think any of these services will require a bit of scripting on your part to get them to perform the actual webpack build and copy to S3.