Configure Bitbucket Pipelines to detect in which folder changes were made - amazon-web-services

I am managing my AWS Lambda code changes with Bitbucket Pipelines.
I have a Bitbucket Pipelines file that runs a Lambda deploy file; the deploy file pushes the new Lambda code to AWS.
At the moment I have a lot of Lambdas and a lot of repos for them, and I want to unite all the Lambda code into a single repo.
I thought I could have Bitbucket Pipelines detect in which folder a change was made,
and then pass the folder name as a variable to the Lambda deploy file.
I tried to pass the folder in which the change occurred as an argument to the lambda-deploy file, but I couldn't find a way to do it.
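One way this is often handled, sketched here purely as an illustration (lambda-deploy.sh is a placeholder name for the deploy file, and each Lambda is assumed to live in its own top-level folder): diff the last commit inside the pipeline step and loop over the changed top-level folders.

    pipelines:
      branches:
        master:
          - step:
              name: Deploy changed Lambdas
              script:
                - |
                  # Reduce the files touched by the last commit to their top-level folders.
                  # (Only the latest commit is inspected here; adjust the diff range as needed.)
                  CHANGED_DIRS=$(git diff --name-only HEAD~1 HEAD | cut -d/ -f1 | sort -u)
                  # Run the deploy file once per changed folder, passing the folder name as an argument.
                  for dir in $CHANGED_DIRS; do
                    ./lambda-deploy.sh "$dir"
                  done

Bitbucket Pipelines also supports per-step condition/changesets includePaths filters, which can be an alternative when the set of folders is small and known ahead of time.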

Related

How to automatically replicate CodeCommit repo to Bitbucket on-prem

I want to push changes to CodeCommit and have a Bitbucket on-prem repo automatically updated as a result.
I've seen a few guides/answers on how to replicate from Bitbucket to CodeCommit, but is there any way to do the opposite?
You could use EventBridge to monitor repository state change events and trigger an appropriate AWS service to send the data to the on-premises server.
Some possibilities might be:
A Lambda function that connects to the Bitbucket REST API
A CodeBuild project that does not actually build anything but checks out the repository and runs a script that pushes the repo to Bitbucket (a sketch follows this list)
DataSync (if it is possible to copy the data directly at the file-system level, for example through SMB)
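As a rough, hypothetical sketch of the CodeBuild option (the repository URL and credential variables are placeholders): it assumes the CodeBuild project uses the CodeCommit repo as its source and is started by an EventBridge rule matching the "CodeCommit Repository State Change" detail type, so the working directory is already a clone of the repo.

    version: 0.2
    phases:
      build:
        commands:
          # CodeBuild has already cloned the CodeCommit source into the working directory,
          # so add the on-prem Bitbucket server as a second remote and push to it.
          - git remote add bitbucket "https://$BITBUCKET_USER:$BITBUCKET_TOKEN@bitbucket.example.com/scm/myproject/myrepo.git"
          - git push bitbucket --all
          - git push bitbucket --tags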

Version Control and Pipeline for AWS CloudFormation

I'm trying to figure out a way to come up with a CI/CD pipeline for CloudFormation. We use the CloudFormation console directly to deploy our infrastructure and app to the cloud.
Does anyone have any examples of how they have created a CI/CD pipeline using Jenkins or other CI tools to do some type of linting, CI, version control, and artifact deployment to Artifactory (or a similar toolset)? I'd like to execute a pipeline once a new version of the CloudFormation templates is uploaded to Artifactory.
You can always use CodePipeline.
see docs:
CodePipeline
CI:
I am using GitHub, so before I can merge a pull request, my code must pass three tests.
Those tests run in three CodeBuild containers.
CD:
After my code is merged, it invokes a CodePipeline that mainly uses CodeDeploy and CodeBuild.
About your goal:
I'd like to execute a pipeline once a new version of the cloud formation templates is uploaded to Artifactory.
I don't really think you need a pipeline for this.
Let's assume your artifacts are uploaded to an S3 bucket called artifact-bucket.
You can create a CloudWatch Events rule that executes a Step Functions state machine when a file is added to artifact-bucket.
see docs:
Trigger StepFunctions from S3
You can easily deploy a stack with Step Functions.
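As an illustrative sketch only (resource and bucket names are placeholders, and object-level S3 events reach CloudWatch Events/EventBridge only if CloudTrail data events are enabled for the bucket), such a rule could look roughly like this in CloudFormation:

    ArtifactUploadRule:
      Type: AWS::Events::Rule
      Properties:
        # Match PutObject calls against the artifact bucket, as recorded by CloudTrail.
        EventPattern:
          source:
            - aws.s3
          detail-type:
            - AWS API Call via CloudTrail
          detail:
            eventSource:
              - s3.amazonaws.com
            eventName:
              - PutObject
            requestParameters:
              bucketName:
                - artifact-bucket
        Targets:
          - Id: deploy-state-machine
            Arn: !Ref DeployStateMachine          # AWS::StepFunctions::StateMachine returns its ARN on Ref
            RoleArn: !GetAtt EventsInvokeRole.Arn # a role allowed to call states:StartExecution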

AWS Cloud9: deploy only one Lambda function at a time

I am trying to deploy Lambda functions using AWS Cloud9. When I press deploy, all of my functions are deployed/synced at the same time rather than just the one I selected when deploying. The same thing happens when right-clicking the function and pressing deploy. I find this quite annoying and am wondering if there is any workaround.
When you click deploy, Cloud9 runs aws cloudformation package and aws cloudformation deploy on your template.yaml file in the background (source: I developed the Lambda integration for AWS Cloud9).
Because all your files are bundled into one serverless application and there is only one CloudFormation stack, they can only be deployed all at once with CloudFormation.
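For reference, those two commands look roughly like this when run by hand (the bucket and stack names below are placeholders):

    # Upload local code referenced in the template to S3 and rewrite the references.
    aws cloudformation package \
        --template-file template.yaml \
        --s3-bucket my-artifact-bucket \
        --output-template-file packaged.yaml
    # Create or update the stack from the packaged template.
    aws cloudformation deploy \
        --template-file packaged.yaml \
        --stack-name my-serverless-app \
        --capabilities CAPABILITY_IAM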
If you're only making a code change for one function and are not modifying any configuration settings you could update that one function from the command line with the command:
zip -r /tmp/function.zip . && aws lambda update-function-code --function-name <function-name> --zip-file fileb:///tmp/function.zip
Run this in the same folder as your template.yaml file, replacing <function-name> with its full generated name, e.g. cloud9-myapp-myfunction-ABCD1234 (you can see the full name under the remote functions list in the AWS Resources panel).
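If you'd rather look up the generated name from the command line than from the AWS Resources panel, one way (the stack name cloud9-myapp is a placeholder inferred from the example name above) is:

    # List the physical names of all Lambda functions in the application's stack.
    aws cloudformation describe-stack-resources \
        --stack-name cloud9-myapp \
        --query "StackResources[?ResourceType=='AWS::Lambda::Function'].PhysicalResourceId" \
        --output text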
In AWS Cloud9, Lambda functions are created within serverless applications and are therefore deployed via CloudFormation. With CloudFormation, the whole stack is deployed at once, so all functions are deployed together (see this discussion for more info).

CodeDeploy to S3

I have a site in an S3 bucket, configured for web access, for which I run an aws s3 sync command every time I push to a specific git repository (I'm using GitLab at the moment).
So if I push to the stable branch, a GitLab runner performs the npm start build command to build the site, and then aws s3 sync to synchronize to a specific bucket.
I want to migrate to CodeCommit and use pure AWS tools to do the same.
So far I was able to successfully set up the repository and create a CodeBuild project for building the artifact, and the artifact is being stored (not deployed) in an S3 bucket. The difference is that I can't get it to deploy to the root folder of the bucket instead of a subfolder; it seems like the process is not made for that. I need it to be in the root folder because of how the web access is configured.
For the deployment process, I was taking a look at CodeDeploy, but it doesn't actually let me deploy to an S3 bucket; it only uses the bucket as an intermediary for deployment to an EC2 instance. So far I get the feeling CodeDeploy is useful only for deployments involving EC2.
This tutorial, with a similar requirement to mine, uses CodePipeline and CodeBuild, but the deployment step is actually an aws s3 sync command (the same as I was doing on GitLab), and the actual deployment step on CodePipeline is disabled.
I was looking into a solution which involves using AWS features made for this specific purpose, but I can't find any.
I'm also aware of LambCI, but to me it looks like what CodePipeline / CodeBuild is doing: storing artifacts (not deploying to the root folder of the bucket). Plus, I'm looking for an option which doesn't require me to learn or deploy new configuration files (outside AWS config files).
Is this possible with the current state of AWS features?
Today AWS announced a new feature: the ability to target S3 in the deployment stage of CodePipeline. The announcement is here, and the documentation contains a tutorial available here.
Using your CodeBuild/CodePipeline approach, you should now be able to choose S3 as the deployment provider in the deployment stage rather than performing the sync in your build script. To configure the phase, you provide an S3 bucket name, specify whether to extract the contents of the artifact zip, and if so provide an optional path for the extraction. This should allow you to deploy your content directly to the root of a bucket by omitting the path.
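As a rough sketch of how such a deploy stage might look in a CloudFormation pipeline definition (the bucket and artifact names are placeholders):

    - Name: Deploy
      Actions:
        - Name: DeployToS3
          ActionTypeId:
            Category: Deploy
            Owner: AWS
            Provider: S3
            Version: "1"
          InputArtifacts:
            - Name: BuildOutput
          Configuration:
            BucketName: my-website-bucket
            Extract: "true"   # unzip the build artifact into the bucket
            # Omit ObjectKey so the extracted files land at the bucket root.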
I was dealing with a similar issue and, as far as I was able to find out, there is no service which is suitable for deploying an app to S3.
AWS CodeDeploy is indeed for deploying code running on a server.
My solution was to use CodePipeline with three stages:
A Source stage which takes the source code from AWS CodeCommit
A Build stage with AWS CodeBuild
A custom Lambda function which, after a successful build, takes the artifact from the S3 artifact storage, unzips it and copies the files to my S3 website host.
I used this AWS Lambda function from SeamusJ: https://github.com/SeamusJ/deploy-build-to-s3
Several changes had to be made; I used node-unzip-2 instead of unzip-stream for unzipping the artifact from S3.
I also had to change the ACLs in the website.ts file.
Uploading from CodeBuild is currently the best solution available.
There are some suggestions on how to orchestrate this deployment via CodePipeline in this answer.

AWS CodePipeline adding artifacts to S3 in a less useful format than running steps individually

I've set up a CodePipeline with the end goal of having a core service reside on S3 as a private maven repo for other pipelines to rely on. When the core service is updated and pushed to AWS CodeCommit, the pipeline should run, test it, build a jar using a maven docker image, then push the resulting jar to S3 where it can be accessed by other applications as needed.
Unfortunately, while the CodeBuild service works exactly how I want it to, uploading XYZCore.jar to /release on the bucket, the automated pipeline itself does not. Instead, it uploads to a "XYZCorePipeline" folder, which contains the input and output artifacts of the build. The output artifact itself is a zip file whose name is just a random string of characters. I checked the pipeline and it's using the service correctly, but pipeline-specific builds always output there, while standalone builds of the CodeBuild service output how I'd like them, allowing me to take advantage of things like versioning. What's the best way for me to fix the settings so the two builds match?
Unfortunately, CodePipeline does not support this use case.
As a workaround, you could upload the artifact to S3 yourself by invoking the AWS CLI (aws s3 cp ...) from the post_build phase of your buildspec.yml.
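A minimal sketch of that workaround (the bucket name and build command are placeholders; the jar name and /release prefix come from the question):

    version: 0.2
    phases:
      build:
        commands:
          - mvn package
      post_build:
        commands:
          # Copy the jar to a predictable, versioned location yourself instead of
          # relying on the pipeline's opaque output artifact zip.
          - aws s3 cp target/XYZCore.jar s3://my-maven-bucket/release/XYZCore.jar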