I am trying to figure out how to trigger a CI/CD pipeline from a non-source-control trigger.
My plan is to use a Google Form to collect all of the variables needed in my scripts, keeping the onboarding process easy enough for non-technical staff. Using the Google Forms Script Editor, I take the submitted response JSON and do a PUT to an S3 bucket/object.
I would like that PUT (a write operation) to trigger a CI/CD pipeline.
The CI/CD tool is not important; it seems all CI/CD tools can only use outgoing webhooks to push to something like a Slack channel, and cannot ingest anything, such as an API call or a POST/PUT event.
My Question:
Is it possible to trigger a Pipeline using a PUT or POST?
Tools I would ideally like to use would be GitLab CI, or even Jenkins if it opens up more possibilities.
I have done a lot of reading and am having a hard time coming up with a solution. I would have thought this was something people would use often, rather than just a simple commit or merge to a source control branch...
From what I have gathered, the API endpoints of CI tools can only process a source control trigger.
Please share any input on how to achieve this. I am willing to figure out how to create an API, if that somehow helps.
I would like to focus on AWS at the moment, but the goal would be to also use this solution, or its equivalent, in Azure.
In the Jenkins job settings, scroll to the Build Triggers section and find the checkbox named "Trigger builds remotely (e.g., from scripts)". You need to provide a token (so only people who know the token may trigger your job). Once this is done, you can trigger a pipeline using curl:
curl 'myjenkins.mycompany.net/job/MyJobName/build?token=myverysecrettoken&cause=Cause+I+Decided+So'
curl 'myjenkins.mycompany.net/job/MyJobName/buildWithParameters?PARAM1=string1&PARAM2=string2&token=myverysecrettoken'
See also Generic Webhook Trigger Plugin for examples.
For those new to pipelines like me, who are looking for similar guidance with GitLab CI:
The same kind of curl request can be made to trigger a pipeline.
However, for my specific question, I was looking to trigger the pipeline by sending a POST to the GitLab CI API directly via an HTTPS endpoint; the curl command did not fit my needs.
To achieve this, you can use the GitLab trigger webhook URL (e.g., as a webhook from other projects):
Just fill in the ref (branch name) and the GitLab project ID.
Example:
https://gitlab.com/api/v4/projects/19577683/ref/master/trigger/pipeline?token=4785b192773907c280845066093s93
To use the curl command to hit the GitLab project trigger API, similar to Jenkins:
Simply supply the trigger token, which you create under the project's Settings > CI/CD > Pipeline triggers section in GitLab, and specify the ref, which is a branch name or tag.
Example:
curl -X POST \
-F token=4785b192773907c280845066093s93 \
-F ref=master \
https://gitlab.com/api/v4/projects/19577683/trigger/pipeline
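One way to wire the original S3-based flow to this trigger API would be an S3 event notification that invokes a Lambda function, which then forwards the PUT to GitLab. Below is a minimal sketch, assuming a Lambda subscribed to the bucket's ObjectCreated:Put events; the project ID, trigger token, ref, and variable names are placeholders:

# Sketch: Lambda handler subscribed to s3:ObjectCreated:Put events on the form bucket.
# The project ID, trigger token, ref, and variable names are placeholders, not real values.
import json
import urllib.parse
import urllib.request

GITLAB_TRIGGER_URL = "https://gitlab.com/api/v4/projects/19577683/trigger/pipeline"
TRIGGER_TOKEN = "4785b192773907c280845066093s93"  # keep this in Secrets Manager in practice

def lambda_handler(event, context):
    # Pass the bucket/key of the uploaded form response to the pipeline as CI variables
    record = event["Records"][0]["s3"]
    form = {
        "token": TRIGGER_TOKEN,
        "ref": "master",
        "variables[FORM_BUCKET]": record["bucket"]["name"],
        "variables[FORM_KEY]": record["object"]["key"],
    }
    body = urllib.parse.urlencode(form).encode()
    request = urllib.request.Request(GITLAB_TRIGGER_URL, data=body)  # data makes this a POST
    with urllib.request.urlopen(request) as resp:
        return json.loads(resp.read())

The same handler could just as easily call the Jenkins buildWithParameters URL from the Jenkins answer above.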
Background
I want to create the following CI/CD flow in AWS and GitHub, for a React app using Amplify:
A single main branch, with short-lived feature branches and PRs into main.
Each PR triggers its own test environment in Amplify, with its own temporary subdomain, which gets torn down when the PR is merged, as described here.
Merging into main does not automatically trigger a deploy to production.
Instead, there is a separate mechanism (a web page, or amplify command, or even triggers based on git tags) for manually selecting a commit from main to deploy to production.
Questions
It's not clear to me if...
Support for this flow is already built into Amplify (based on the docs I've read, I think the answer is "no", but I'm not sure).
Support for this flow is already built into AWS CodePipeline, or if it can be configured there.
There is another AWS tool that solves this.
I'm looking for answers to those questions, or specific references in the docs which address them.
The answers for Amplify are Yes, Yes, Yes, Partially.
(1) A single main branch, with short-lived feature branches and PRs into main.
Yes. Feature branch deploys. You can define which branch patterns, such as feature/*, you wish to auto-deploy.
(2) Each PR triggers its own test environment in Amplify, with its own temporary subdomain,
Yes. Web Previews for PRs. "A web preview deploys every pull request made to your GitHub repository to a unique preview URL which is completely different from the URL your main site uses."
(3) Merging into main does not automatically trigger a deploy to production.
Yes. Disable automatic builds on main.
(4) Instead, there is a separate mechanism ... for manually selecting a commit from main to deploy to production.
Partially (HEAD only?). Call the StartJob API to manually trigger a build from, say, Lambda. The job type RELEASE starts a new job with the latest change from the specified branch. I am not sure whether jobType: MANUAL with a commitId starts a job from an arbitrary commit hash.
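As a reference for that Lambda-based approach, here is a minimal sketch using boto3; the app ID, branch name, and job reason are placeholder values:

# Sketch: manually trigger an Amplify build for a branch via the StartJob API.
# appId and branchName are placeholders; jobType RELEASE builds the branch HEAD.
import boto3

amplify = boto3.client("amplify")

def lambda_handler(event, context):
    response = amplify.start_job(
        appId="d1a2b3c4example",           # your Amplify app ID
        branchName="main",
        jobType="RELEASE",                 # latest change on the specified branch
        jobReason="Manual production deploy",
    )
    return response["jobSummary"]["jobId"]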
Another workaround for 3+4 is to skip the build for an arbitrary commit. Amplify will skip building if [skip-cd] appears at the end of a commit message.
In my experience, I don't think there is any easy way to meet your requirement.
If you are using GitLab, you can try GitLab Review Apps to achieve that (I tried it before with some scripts).
Support for this flow is already built into Amplify (based on the docs I've read, I think the answer is "no", but I'm not sure).
Check the links below to see if they help:
https://www.youtube.com/watch?v=QV2WS535nyI
https://dev.to/rajandmr/deploying-react-app-using-aws-amplify-with-ci-cd-pipeline-setup-3lid
Support for this flow is already built into AWS CodePipeline, or if it can be configured there.
For this, you need to create your own full pipeline. Yes, you can configure it there.
There is another AWS tool that solves this.
If you are okay with Jenkins, then Jenkins will help you achieve this.
You can deploy Jenkins as a Docker container on AWS EC2 and create your pipeline. You can also use parameterized builds for selecting your environment and Git branch.
I'm using an AWS Lambda function to kick off a build in AWS CodeBuild when a pull request is created or updated in AWS CodeCommit, which is working well.
However, I'd like to be able to prevent the merging of that pull request into the master branch of the repository until the latest build for that PR has completed successfully.
Does anyone know if there's a way this can be done in AWS? E.g. so that the Merge button is disabled or unavailable, as when not enough approvals have been obtained?
I was looking into this myself and from what I understand, it is currently not possible to directly create this rule, but I think it should be doable with a different approach.
Instead of requiring a custom rule that disables merging (which doesn't exist today), you could make the PR require a review from a specific IAM user. With that, you could use a fixed "build" user and fire an automatic approval for the PR once the build finishes successfully. This will, in turn, satisfy that rule on the PR and allow it to be merged after the build succeeds.
Since approval can be done via the CLI, I'm sure it is also possible via the API. For example, you could use the pull request approval API to automatically mark any given PR as approved by the calling user, then ensure the service calling it is the same user registered in the "build" approval rule template.
Besides the HTTP web API, there are also other ways to call these CodeCommit actions, such as the AWS SDK (C# example: https://www.nuget.org/packages/AWSSDK.CodeCommit/).
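A rough sketch of that approval step with boto3, assuming it runs under the dedicated "build" IAM user and receives the pull request ID from the build trigger (all names here are placeholders):

# Sketch: approve a CodeCommit pull request as the "build" user after a successful build.
# The pull request ID would come from the CodeBuild/Lambda trigger; nothing here is a real value.
import boto3

codecommit = boto3.client("codecommit")

def approve_pull_request(pull_request_id: str) -> None:
    # The revision ID identifies the current state of the PR and is required for approval.
    pr = codecommit.get_pull_request(pullRequestId=pull_request_id)
    revision_id = pr["pullRequest"]["revisionId"]
    codecommit.update_pull_request_approval_state(
        pullRequestId=pull_request_id,
        revisionId=revision_id,
        approvalState="APPROVE",
    )

Combined with an approval rule template that requires one approval from the "build" user, this effectively blocks the merge until the build has passed.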
I've seen many discussions online about Sonar webhooks to send scan results to Jenkins, but as a CodePipeline acolyte, I could use some basic help with the steps to supply Sonar scan results (e.g., quality gate pass/fail status) to the pipeline.
Is the Sonar webhook the right way to go, or is it possible to use Sonar's API to fetch the status of a scan for a given code project?
Our code is in Bitbucket. I'm working with the AWS admin who will create the CodePipeline that fires when a push to the repo is attempted. sonar-scanner will be run, and then we'd like the pipeline to stop if the quality does not pass the quality gate.
If I were to use a Sonar webhook, I imagine the value for host would be, what, the AWS instance running CodeBuild?
Any pointers, references, examples welcome.
I created a PowerShell script to use with Azure DevOps that could possibly be migrated to a shell script that runs in the CodeBuild activity:
https://github.com/michaelcostabr/SonarQubeBuildBreaker
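The same idea in Python, for illustration: query the SonarQube web API for the project's quality gate status and fail the build step if the gate is not green. The server URL, token, and project key are placeholders, and a real setup would typically check the specific analysis reported by the scanner rather than just the latest one:

# Sketch: fail a build step if the SonarQube quality gate for a project is not OK.
# SONAR_URL, SONAR_TOKEN and PROJECT_KEY are placeholders for your own values.
import sys
import requests

SONAR_URL = "https://sonarqube.example.com"
SONAR_TOKEN = "your-sonar-token"
PROJECT_KEY = "my-project"

resp = requests.get(
    f"{SONAR_URL}/api/qualitygates/project_status",
    params={"projectKey": PROJECT_KEY},
    auth=(SONAR_TOKEN, ""),  # token as username, empty password
)
resp.raise_for_status()
status = resp.json()["projectStatus"]["status"]
print(f"Quality gate status: {status}")
if status != "OK":
    sys.exit(1)  # a non-zero exit code fails the CodeBuild phase and stops the pipeline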
I'm setting up a Continuous Delivery pipeline for my team with Jenkins. As a final step, we want to deploy to AWS.
I came across this while searching:
The last step is a button you can click to trigger the deployment. Very nice! However, I searched through the Jenkins plugins page, but I don't think it is there (or it is under a vague name).
Any ideas what it could be?
I'm not sure about the specific plugin you are looking for, but there is a Jenkins plugin for CodeDeploy, which can automatically create a deployment as a post-build action. See: https://github.com/awslabs/aws-codedeploy-plugin
It really depends on what kind of requirements you have on the actual deployment procedure. One thing to keep in mind, if you use infrastructure as code to set up your pipelines automatically (e.g. through JobDSL or Jenkins Job Builder), is that the particular plugins must be supported. For that reason it can sometimes be more convenient to just script your deployments instead of relying on plugins. I've implemented multiple deployment jobs from Jenkins to AWS by just using plain AWS CLI commands, e.g. triggering CloudFormation creation/updates, as sketched below.
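For illustration, here is that scripted approach using boto3 instead of the raw AWS CLI (stack name, template file, and capabilities are placeholders); the CLI equivalent would be aws cloudformation create-stack / update-stack:

# Sketch: a scripted deployment step that creates or updates a CloudFormation stack.
# Stack name, template file, and capabilities are placeholders; the "no updates to
# perform" case is not handled in this sketch.
import boto3
from botocore.exceptions import ClientError

cfn = boto3.client("cloudformation")
STACK_NAME = "my-app-stack"

with open("template.yaml") as f:
    template_body = f.read()

try:
    cfn.update_stack(StackName=STACK_NAME, TemplateBody=template_body,
                     Capabilities=["CAPABILITY_IAM"])
    waiter = cfn.get_waiter("stack_update_complete")
except ClientError as err:
    if "does not exist" in str(err):
        # First deployment: the stack is not there yet, so create it instead.
        cfn.create_stack(StackName=STACK_NAME, TemplateBody=template_body,
                         Capabilities=["CAPABILITY_IAM"])
        waiter = cfn.get_waiter("stack_create_complete")
    else:
        raise
waiter.wait(StackName=STACK_NAME)  # block the Jenkins job until the stack settles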
It turns out that there is a button to trigger an operation in the plugin. It was hard to detect as the UI of the plugin is redesigned and the button became smaller.
I have an SCM that only allows HTTP push/pull/poll requests. Without modifying my SCM, I would like Jenkins to trigger a build (as soon as possible) when new code is checked in.
Developers usually get notified of new code via a RSS Feed.
Is there a recommended Jenkins plugin that can help with this?
If you are using Git with GitHub, there is the GitHub plugin; GitHub will notify Jenkins whenever there is a new commit.
If you are using SVN, use the Poll SCM feature: specify a time interval so that Jenkins looks for new commits on the configured schedule.
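If neither plugin fits an HTTP-only SCM, another option (not from either plugin) is a small polling script that watches the RSS feed and calls the Jenkins remote-trigger URL described earlier; the feed URL, Jenkins URL, and token below are placeholders:

# Sketch: poll the SCM's RSS feed and trigger a Jenkins job via its remote-trigger token
# when a new entry appears. Feed URL, Jenkins URL, and token are placeholders.
import time
import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "https://scm.example.com/changes.rss"
JENKINS_TRIGGER = "https://myjenkins.mycompany.net/job/MyJobName/build?token=myverysecrettoken"

def latest_entry_id() -> str:
    with urllib.request.urlopen(FEED_URL) as resp:
        root = ET.fromstring(resp.read())
    # RSS 2.0: take the <guid> of the first <item>; adjust the path for Atom feeds.
    return root.findtext("./channel/item/guid", default="")

last_seen = latest_entry_id()
while True:
    time.sleep(60)  # poll once a minute
    current = latest_entry_id()
    if current and current != last_seen:
        last_seen = current
        urllib.request.urlopen(JENKINS_TRIGGER)  # kick off the build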