Create dependent triggers in GCP cloud build - google-cloud-platform

I need to create a dependent trigger in Cloud Build. Currently I have two triggers, as shown in the image below, both of which fire on a push event to the master branch in their respective repos.
'app-engine-test' is triggered on pushing code to a Cloud Source repository, whereas 'seleniumTest' is triggered on pushing code to a Git repository.
However, I want the 'seleniumTest' trigger to fire once the 'app-engine-test' build has completed. I could not find any such setting in the GCP UI.
Can anyone please help?

You may be able to do this by using a Pub/Sub message as the trigger for your dependent build.
When a Cloud Build build runs, it publishes status messages to a Pub/Sub topic named cloud-builds; see https://cloud.google.com/build/docs/subscribe-build-notifications.
So if you have builds app and test, app would be triggered when you push to source control, and test would be triggered when a message is published on the cloud-builds topic.
I haven't tested this myself, but I need something similar, so I will update this answer as I go. If it turns out you can't subscribe to the cloud-builds topic, then at the end of the app build you could publish a message to your own Pub/Sub topic and use that to trigger the second build.
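I haven't verified this end to end, but a minimal sketch of the Pub/Sub side might look like the following, assuming a background Cloud Function subscribed to the cloud-builds topic; the project ID and trigger IDs are placeholders for your own values.

```python
import base64
import json

from google.cloud.devtools import cloudbuild_v1

PROJECT_ID = "my-project"                      # placeholder: your project ID
APP_TRIGGER_ID = "app-engine-test-trigger-id"  # placeholder: first trigger's ID
TEST_TRIGGER_ID = "selenium-test-trigger-id"   # placeholder: second trigger's ID


def on_build_event(event, context):
    """Background Cloud Function fired by messages on the cloud-builds topic."""
    build = json.loads(base64.b64decode(event["data"]).decode("utf-8"))

    # Only react when the upstream build finished successfully.
    if build.get("buildTriggerId") != APP_TRIGGER_ID or build.get("status") != "SUCCESS":
        return

    # Run the dependent trigger against the branch the upstream build used.
    client = cloudbuild_v1.CloudBuildClient()
    client.run_build_trigger(
        project_id=PROJECT_ID,
        trigger_id=TEST_TRIGGER_ID,
        source=cloudbuild_v1.RepoSource(branch_name="master"),
    )
```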
Another solution in your case might be to merge the two projects and simply run the Selenium tests as a final build step once you've successfully deployed the code.

Related

Trigger AWS CodePipeline manually and not on every commit using Bitbucket CodeStar connection

I am not able to find a way to stop the pipeline from being automatically triggered whenever I push code to Bitbucket.
My assumption is that you want more control over when your pipeline does certain things.
Rather than achieving this through stopping the pipeline from getting triggered, I'd recommend using either stage transitions or manual approvals to achieve this control inside the pipeline.
Stage transitions are better when you want to "turn off" a pipeline and have the latest thing run through when you turn it back on.
Manual approvals are better when you want the version to be locked while waiting for approval so you can run tests without worrying that the version will change.
You mentioned in your comment that you want to run your pipeline only at certain times, so one way to do that is to enable and disable the stage transition after Source on a schedule (see the sketch after the links below).
https://docs.aws.amazon.com/codepipeline/latest/userguide/transitions.html
https://docs.aws.amazon.com/codepipeline/latest/userguide/approvals.html
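For example, a scheduled Lambda (or any small script) could toggle that transition with boto3; the pipeline and stage names here are hypothetical placeholders.

```python
import boto3

codepipeline = boto3.client("codepipeline")

def close_deployment_window():
    # Block new revisions from flowing out of Source into the next stage.
    codepipeline.disable_stage_transition(
        pipelineName="my-pipeline",   # placeholder names
        stageName="Build",
        transitionType="Inbound",
        reason="Outside the allowed deployment window",
    )

def open_deployment_window():
    # Re-enable the transition; the latest revision then flows through.
    codepipeline.enable_stage_transition(
        pipelineName="my-pipeline",
        stageName="Build",
        transitionType="Inbound",
    )
```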
You can disable the DetectChanges parameter on your Source action, as explained here. Extract with the relevant context:
DetectChanges: Controls automatically starting your pipeline when a new commit is made on the configured repository and branch. If unspecified, the default value is true, and the field does not display by default.
This works for Bitbucket, GitHub, and GitHub Enterprise Server actions. I have a CloudFormation template configured with this option, and it works. I am not sure the same option exists in the AWS console, because some configurations are only available from CloudFormation or the AWS CLI. As the documentation says, "this field does not display by default".
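If the console doesn't expose the field, one workaround (a sketch, with a hypothetical pipeline name) is to edit the pipeline definition through the API, since DetectChanges lives in the source action's configuration there too:

```python
import boto3

codepipeline = boto3.client("codepipeline")

definition = codepipeline.get_pipeline(name="my-pipeline")  # placeholder name
pipeline = definition["pipeline"]

for stage in pipeline["stages"]:
    for action in stage["actions"]:
        if action["actionTypeId"]["provider"] == "CodeStarSourceConnection":
            # Configuration values are strings in the pipeline definition.
            action["configuration"]["DetectChanges"] = "false"

codepipeline.update_pipeline(pipeline=pipeline)
```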

cloud-builds pub/sub topic appears to be unlisted or inaccessible

I'm attempting to create an integration between Bitbucket Repo and Google Cloud Build to automatically build and test upon pushes to certain branches and report status back (for that lovely green tick mark). I've got the first part working, but the second part (reporting back) has thrown up a bit of a stumbling block.
Per https://cloud.google.com/cloud-build/docs/send-build-notifications, Cloud Build is supposed to automatically publish update messages to a Pub/Sub topic entitled "cloud-builds". However, trying to find it (both through the web interface and via the gcloud command-line tool) has turned up nothing. Copious amounts of web searching turned up https://github.com/GoogleCloudPlatform/google-cloud-visualstudio/issues/556, which seems to suggest that the topic referenced in that doc is now being filtered out of results; however, that issue seems to be specific to the Visual Studio tools and not GCP as a whole. Moreover, https://cloud.google.com/cloud-build/docs/configure-third-party-notifications suggests that it's still accessible, but perhaps only to Cloud Functions? And maybe only manually via the command line, since the web interface for Cloud Functions also does not display this phantom "cloud-builds" topic?
Any guidance as to where I can go from here? Near as I can tell, the two possibilities are that something is utterly borked in my GCP project and the Pub/Sub topic is either not visible just for me or has somehow been deleted, or I'm right and this topic just isn't accessible anymore.
I was stuck with the same issue; after a while, I created the cloud-builds topic manually and created a Cloud Function that subscribed to that topic.
Build details are pushed to the topic as expected after that, and my Cloud Function gets triggered by new events.
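For reference, the manual creation step this answer describes can be done with the Pub/Sub client library (a sketch; substitute your own project ID):

```python
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()

# Cloud Build publishes build status messages to this exact topic name.
topic_path = publisher.topic_path("my-project", "cloud-builds")  # placeholder project
publisher.create_topic(name=topic_path)
```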
You can check for the existence of the cloud-builds topic outside the UI by downloading the gcloud command-line tool and, after running gcloud init, running gcloud pubsub topics list to list all topics for the configured project. If the topic projects/{your project}/topics/cloud-builds is not listed, I would suggest filing a bug with the Cloud Build team here.
Creating the cloud-builds topic manually won't work, since it's a special topic that Google manages.
In this case, you have to go to the API console, disable the Cloud Build API, and then enable it again; the cloud-builds topic will then be created for you. (Enable and disable the Cloud Build API.)

Code pipeline to build a branch on pull request

I am trying to make a CodePipeline that will build my branch when I make a pull request to the master branch in AWS. There are many developers in my organisation, and they all work on their own branches. I am not very familiar with creating Lambda functions. Hoping for a solution.
You can dynamically create pipelines every time a new pull request is created. Look for the CodeCommit triggers (in the old CodePipeline UI); you need Lambda for this.
Basically it works like this: copy the existing pipeline and update the source branch, as in the sketch below.
It is not the best, but as far as I know it is the only way to do what you want.
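The copy step inside such a Lambda might look roughly like this (a sketch; the pipeline name is hypothetical, and this assumes a CodeCommit source action whose branch lives under the BranchName configuration key):

```python
import boto3

codepipeline = boto3.client("codepipeline")

def create_branch_pipeline(template_name, branch):
    # Fetch the existing pipeline to use as a template.
    pipeline = codepipeline.get_pipeline(name=template_name)["pipeline"]

    # Give the copy a unique name and retarget the source branch.
    pipeline["name"] = f"{template_name}-{branch}"
    source_action = pipeline["stages"][0]["actions"][0]
    source_action["configuration"]["BranchName"] = branch

    codepipeline.create_pipeline(pipeline=pipeline)
```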
I was there and would not recommend it, for the following reasons:
I hit the limit of 20 in my region: "Maximum number of pipelines with change detection set to periodically checking for source changes". You definitely want this feature (https://docs.aws.amazon.com/codepipeline/latest/userguide/limits.html).
The branch-deleted trigger does not work correctly, so you cannot delete the created pipeline when the branch has been merged into master.
I would recommend using GitHub.com if you need a workflow like the one you described. Sorry about that.
I have recently implemented an approach that uses CodeBuild's GitHub webhook support to run initial unit tests and a build, and then publishes the source repository and built artefacts as a zipped archive to S3.
You can then use the S3 archive as a source in CodePipeline, where you can then transition your PR artefacts and code through Integration testing, Staging deployments etc...
This is quite a powerful pattern, although one trap here is that if you have a lot of pull requests being created at once, CodePipeline executions can be superseded, given that only one execution can proceed through a given stage at a time. (This is actually a really important property, especially if your integration tests run against shared resources and you don't want multiple instances of your application running data setup/teardown tasks at the same time.)
To overcome this, I publish an S3 notification to an SQS FIFO queue when CodeBuild publishes the S3 artefact, and then poll the queue, copying each artefact to a different S3 location that triggers CodePipeline, but only if there are currently no executions waiting to execute after the first CodePipeline source stage. A sketch of that queue-draining step is below.
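This is a rough sketch of the idea, with hypothetical queue, bucket, and pipeline names, and assuming the standard S3 event notification format on the queue:

```python
import json
import boto3

sqs = boto3.client("sqs")
s3 = boto3.client("s3")
codepipeline = boto3.client("codepipeline")

QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/pr-artefacts.fifo"  # placeholder

def pipeline_is_idle(name="pr-pipeline"):
    # Busy if any stage after Source still has an execution in progress.
    state = codepipeline.get_pipeline_state(name=name)
    return not any(
        stage.get("latestExecution", {}).get("status") == "InProgress"
        for stage in state["stageStates"][1:]
    )

def release_next_artefact():
    if not pipeline_is_idle():
        return
    response = sqs.receive_message(QueueUrl=QUEUE_URL, MaxNumberOfMessages=1)
    for message in response.get("Messages", []):
        record = json.loads(message["Body"])["Records"][0]["s3"]
        # Copying into the watched location is what actually starts the pipeline.
        s3.copy_object(
            Bucket="pipeline-trigger-bucket",  # placeholder
            Key="source.zip",
            CopySource={
                "Bucket": record["bucket"]["name"],
                "Key": record["object"]["key"],
            },
        )
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=message["ReceiptHandle"])
```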
We can very well have dynamic branching support with the following approach.
One of the limitations of AWS CodePipeline is that we have to specify the branch name while creating the pipeline. We can, however, overcome this issue using the architecture shown below.
flow diagram
Create a Lambda function that takes the GitHub webhook data as input and, using boto3, integrates it with the pipeline (pull the pipeline definition and update it); put an API Gateway in front of the Lambda function so it can be called as a REST endpoint; and, finally, create a webhook on the GitHub repository pointing at that endpoint. A sketch of the Lambda is below.
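This sketch assumes a GitHub (version 1) source action, whose branch lives under the Branch configuration key; the pipeline name is a placeholder:

```python
import json
import boto3

codepipeline = boto3.client("codepipeline")

def handler(event, context):
    # API Gateway proxies the GitHub push payload in the request body.
    payload = json.loads(event["body"])
    branch = payload["ref"].rsplit("/", 1)[-1]  # "refs/heads/feature-x" -> "feature-x"

    # Pull the pipeline definition, retarget the source branch, and push it back.
    pipeline = codepipeline.get_pipeline(name="my-pipeline")["pipeline"]  # placeholder
    pipeline["stages"][0]["actions"][0]["configuration"]["Branch"] = branch
    codepipeline.update_pipeline(pipeline=pipeline)

    return {"statusCode": 200, "body": json.dumps({"branch": branch})}
```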
External links:
https://aws.amazon.com/quickstart/architecture/git-to-s3-using-webhooks/
https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/codepipeline.html
Related thread: Dynamically change branches on AWS CodePipeline

Queue CodeBuild tasks

Is there a way to make a CodeBuild project execute build tasks one at a time (max concurrency = 1)?
I know one of the selling points of CodeBuild is that you can run builds concurrently and I like that feature.
However, for this one specific project, I NEED to make sure only one CodeBuild build task for this project executes at a time. If an incoming "startBuild" request arrives while a previous build is still running, I want it to be queued and to wait until the previous build task is finished.
As additional info on the project, this project runs integration tests across our various APIs (serverless APIs and legacy APIs on EC2) and running those tests concurrently may cause the tests to fail due to their setup and teardown procedures.
I am from the AWS CodeBuild team. Thanks for your feedback. At this point the feature you requested isn't supported. We'll pass along the feedback to our product management team so they may consider adding it to our future roadmap.
However, you may be able to implement something on your end by using CodeBuild's build notifications feature. At a high level, you could listen to the CloudWatch Events sent by CodeBuild to find out when a build completes, and at that point "release" a new build from a queue that you maintain on your end (sketched below).
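For example (a sketch, not an official pattern): a Lambda subscribed to the CodeBuild "Build State Change" CloudWatch Events rule could start the next queued build; the queue URL and project name here are hypothetical.

```python
import boto3

sqs = boto3.client("sqs")
codebuild = boto3.client("codebuild")

QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/pending-builds"  # placeholder

def handler(event, context):
    detail = event["detail"]
    # Only act when a build of our project reaches a terminal state.
    if detail["project-name"] != "integration-tests":  # placeholder project
        return
    if detail["build-status"] not in ("SUCCEEDED", "FAILED", "STOPPED"):
        return

    # Release at most one queued request now that the slot is free.
    response = sqs.receive_message(QueueUrl=QUEUE_URL, MaxNumberOfMessages=1)
    for message in response.get("Messages", []):
        codebuild.start_build(projectName="integration-tests")
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=message["ReceiptHandle"])
```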
Update (2021-02-17)
AWS CodeBuild just added the ability to define a maximum number of concurrent builds per project.
Reference: https://docs.aws.amazon.com/codebuild/latest/userguide/create-project-console.html#create-project-console-project-config
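If you prefer to set it programmatically, the same setting is exposed in the API as concurrentBuildLimit (a sketch, with a hypothetical project name):

```python
import boto3

codebuild = boto3.client("codebuild")

# With a limit of 1, additional startBuild requests queue until the
# currently running build finishes, which matches the behaviour asked for here.
codebuild.update_project(
    name="integration-tests",  # placeholder: your project name
    concurrentBuildLimit=1,
)
```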

Trigger deployment button in Jenkins pipeline

I'm setting up a Continuous Delivery pipeline for my team with Jenkins. As a final step, we want to deploy to AWS.
I came across this while searching:
The last step is a button you can click to trigger the deployment. Very nice! However, I searched through the Jenkins plugins page, but I don't think it is there (or it is under a vague name).
Any ideas what it could be?
I'm not sure about the specific plugin you are looking for, but there is a Jenkins plugin for CodeDeploy, which can automatically create a deployment as a post-build action. See: https://github.com/awslabs/aws-codedeploy-plugin
It really depends on what kind of requirements you have for the actual deployment procedure. One thing to keep in mind if you do infrastructure as code to set up your pipelines automatically (e.g. through JobDSL or Jenkins Job Builder) is that the particular plugins must be supported. For that reason it can sometimes be more convenient to just script your deployments instead of relying on plugins. I've implemented multiple deployment jobs from Jenkins to AWS using plain AWS CLI commands, e.g. triggering CloudFormation creation/updates.
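As an illustration of the scripted approach, here is the same idea with boto3 rather than the CLI (a sketch; the stack and template names are placeholders):

```python
import boto3
from botocore.exceptions import ClientError

cloudformation = boto3.client("cloudformation")

def deploy(stack_name, template_path):
    with open(template_path) as handle:
        template = handle.read()
    try:
        cloudformation.update_stack(StackName=stack_name, TemplateBody=template)
        waiter_name = "stack_update_complete"
    except ClientError as error:
        message = str(error)
        if "No updates are to be performed" in message:
            return  # nothing changed since the last deployment
        if "does not exist" not in message:
            raise
        # First deployment: the stack has to be created instead.
        cloudformation.create_stack(StackName=stack_name, TemplateBody=template)
        waiter_name = "stack_create_complete"
    cloudformation.get_waiter(waiter_name).wait(StackName=stack_name)

deploy("my-app", "template.yaml")  # placeholder stack and template
```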
It turns out that there is a button to trigger the operation in the plugin. It was hard to spot, as the plugin's UI had been redesigned and the button had become smaller.