I have build triggers set up in Google Cloud Container Builder that fire on specific branches and use the cloudbuild.yml config in the repo. For about the first day, pushing commits to any of these branches triggered a container build that completed successfully. Since then, the triggers have only worked intermittently.
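For reference, the cloudbuild.yml these triggers point at is an ordinary build config along these lines (a minimal sketch; the builder step and image name here are placeholders, not the actual config):

```yaml
# Minimal cloudbuild.yml sketch; builder step and image name are placeholders.
steps:
  - name: 'gcr.io/cloud-builders/docker'
    args: ['build', '-t', 'gcr.io/$PROJECT_ID/my-app:$SHORT_SHA', '.']
images:
  - 'gcr.io/$PROJECT_ID/my-app:$SHORT_SHA'
```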
Sometimes Google Cloud Container Builder doesn't detect the commit at all (I have checked that the commit is in Bitbucket and on the right branch). At that point, I try manually triggering a build through the Google Cloud Console, but it uses the older commit it last built, not the latest one. So then I push small changes to the repo or make an empty commit.
Sometimes that triggers the build, sometimes not. The interesting thing is that when the build finally triggers on a branch after a while, it will trigger builds on the other branches if they have a recent commit that hasn't been built.
I have no idea how to resolve this. Has anyone experienced a similar problem?
UPDATE:
I solved my problem. I originally added each Bitbucket repository to Source Repositories in Google Cloud. After that, I added the build triggers for each of those repos in Container Registry. When adding a trigger, I had to go through the same process of connecting to the Bitbucket repositories that I had already gone through when adding the source repositories. I later realized that this had automatically created a separate connection in the Source Repositories section for each of these repos, so I had two connections to each Bitbucket repo listed in Source Repositories. Once I deleted the duplicates, the triggers started working consistently.
In summary, make sure you don't have any duplicate connections in the Source Repository.
I created a repository (without linking it to Git) for Google Dataform, along with a workspace.
I initialized the setup and pushed those first files.
Where can I see the repo and all the commits I make in there?
I looked in Cloud Storage, Artifact Registry, and Cloud Source Repositories, but can't find it.
Dataform does not expose all the functionality of Git, so without linking to GitHub you won't be able to browse the repository. For your requirement, you can create Dataform repositories (essentially Git repositories containing Dataform code) and create code workspaces attached to those repositories. You edit code in those workspaces and push the results to the relevant Git repository. You can also compile the repository/workspace Dataform code into Directed Acyclic Graphs (DAGs) of executable SQL and execute the compiled DAGs against BigQuery.
The repositories are listed here; please see the official documentation for more context.
You may also explore the possibility of connecting to a remote repository, as the Dataform repository doesn't meet your requirement for viewing commits.
My team has been running into issues with our CodePipeline where, because of our Docker image patching, features were pushed out into production when they shouldn't have been. A little background on our architecture: our pipeline has two sources, one for the source code and one for the Docker image builder. The Docker image is built via CodeBuild and deployed to dev, test, and then prod environments, with manual approval steps in between.
Our Docker image receives monthly patching, which triggers the pipeline to execute and is what caused the features to be pushed out. We redesigned our Git branching strategy so that our master branch only contains stable releases, but I could still see this issue occurring again if a specific release date is specified. Is there a way to push out the image patching without pushing out the latest commit?
Can CodePipeline Use a Specific Commit?
This is an often-requested feature, but unfortunately CodePipeline always uses the latest commit from the selected branch in the Source action.
Being tied to a single Git branch is really a feature of CodePipeline, as the design leans towards trunk-based development [0]. Also, per the designers of this service, CodePipeline is meant for post-merge/release validation: once your change is ready to be released to production and is merged into your master/main branch, CodePipeline takes over and automatically tests and releases the final merged set of changes. CodePipeline has a lot of features, such as stage locking and superseding versions, which don't work well when you want to test a change in isolation before it's merged (e.g. feature branch testing or pull request testing). Therefore there currently isn't a recommended way to do this in CodePipeline.
[0] https://trunkbaseddevelopment.com/
Having said that, there is a way to hack this with an S3 Source action in the pipeline instead of a GitHub/CodeCommit source action. Essentially, your pipeline's S3 source action is tied to an S3 bucket/key. You can then upload a zip of any specific commit to that bucket/key and trigger the pipeline.
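For illustration, here is a rough sketch of what such an S3 Source action could look like inside a CloudFormation pipeline definition (the bucket name, object key, and artifact name are placeholders, not values from the question):

```yaml
# Sketch of an S3-backed Source stage in an AWS::CodePipeline::Pipeline definition.
# Bucket name, object key, and artifact name below are placeholders.
- Name: Source
  Actions:
    - Name: SourceFromS3
      ActionTypeId:
        Category: Source
        Owner: AWS
        Provider: S3
        Version: '1'
      Configuration:
        S3Bucket: my-release-bucket        # upload a zip of the desired commit here
        S3ObjectKey: releases/source.zip
      OutputArtifacts:
        - Name: SourceOutput
```

Uploading a zip of whichever commit you want to that bucket/key then releases exactly that snapshot, regardless of what is currently at the tip of the branch.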
I have created a build pipeline.
I have master, develop and feature/* branches in my Azure repo.
I have created a branch policy to require a build for feature/* branches.
How do I trigger an automatic build on pull request? Or even how do I queue a build manually on the pull request?
I can't see such an option on my pull request screen in DevOps.
As far as I know, the build policy should appear above Work Items on the right-hand side. My policy does not appear there, and I don't even have the option to trigger the build manually.
I am not sure what I am doing wrong, or what is missing.
The screenshot you provided shows that the PR is for the develop branch. If you want a PR for develop to trigger a build, then set a policy on the develop branch.
Branch policies apply to the target branch, not the source branch.
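As a side note, here is a hedged sketch of what the CI triggers for this branch layout might look like in azure-pipelines.yml; for Azure Repos Git, the pull request validation build comes from the branch policy on the target branch, not from this file:

```yaml
# Sketch of CI triggers for this branch layout (azure-pipelines.yml).
# For Azure Repos Git, PR validation is configured via branch policy, not here.
trigger:
  branches:
    include:
      - master
      - develop
      - feature/*
```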
I have defined a simple configuration in Google Cloud Build that mirrors a GitHub repository and triggers when I push to master. However, for some time now, the build has not been triggered when I push, and when I trigger the build manually, an old commit is built.
Deleting and recreating the trigger didn't help.
How can I fix this?
As far as I can tell, this is a bug on Google's side, but here's the workaround I used to fix it.
First, delete your Cloud Build trigger.
Then, navigate to Google Cloud Source Repositories. You should be able to find the repo that is mirrored from GitHub. Click the settings icon next to the repo and then click "Disconnect this repository".
Now, recreate the trigger from scratch.
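If you'd rather script that last step than click through the console, the recreated trigger can also be expressed as a spec and imported with `gcloud beta builds triggers import --source=trigger.yaml`. A sketch, where the project, repo name, branch regex, and config filename are all assumptions:

```yaml
# Sketch of a Cloud Build trigger spec (trigger.yaml); all values below are assumptions.
name: build-on-master-push
description: Rebuild on every push to master
triggerTemplate:
  projectId: my-project               # placeholder project ID
  repoName: github_myorg_myrepo       # name of the mirrored repo in Cloud Source Repositories
  branchName: ^master$
filename: cloudbuild.yaml
```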
I created a new build pipeline for my latest project in Azure DevOps Server 2019. I use the provided Git repository with branch policies to protect master and develop from changes made without a pull request.
Each pull request requires a successful build. Unfortunately, Azure DevOps Server always creates a new build for each pull request (highlighted red in the image below).
I don't need that build. The last build status of the source branch is enough. Is there any way to disable the pull request build and use the last branch status instead?
No, the new build is required because it verifies whether the build still works after merging the code from the source branch into the target branch (master/develop).
The build for the source branch only verifies that branch on its own, not the merged code (there may be bugs/issues that only show up after the merge).