How to pass branch/commit to Cloud Run instance? - google-cloud-platform

I have several projects that run on Google Cloud Run. Cloud Build connects each instance to a corresponding branch of a Git repository. Each time a commit is pushed to a branch, a build is triggered to update the Cloud Run instance.
I'd like to be able to show information about the build within the Cloud Run application (e.g. branch and commit that the build has been built from). How can I pass this information from the repo/commit/build to the instance?

As @guillaume blaquiere stated in his comment:
You have the information in Cloud Build when it runs. You can get the data and paste them in your container somewhere. Then you have to serve them. Depends on your implementation.
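For example, here is a minimal cloudbuild.yaml sketch of that idea, using Cloud Build's built-in $BRANCH_NAME and $COMMIT_SHA substitutions (available for trigger-based builds); the service name my-app and the region are illustrative:

    steps:
      # Build and push the image, tagged with the commit that triggered the build
      - name: 'gcr.io/cloud-builders/docker'
        args: ['build', '-t', 'gcr.io/$PROJECT_ID/my-app:$COMMIT_SHA', '.']
      - name: 'gcr.io/cloud-builders/docker'
        args: ['push', 'gcr.io/$PROJECT_ID/my-app:$COMMIT_SHA']
      # Deploy to Cloud Run, exposing the build metadata as environment variables
      - name: 'gcr.io/google.com/cloudsdktool/cloud-sdk'
        entrypoint: 'gcloud'
        args: ['run', 'deploy', 'my-app',
               '--image=gcr.io/$PROJECT_ID/my-app:$COMMIT_SHA',
               '--region=us-central1',
               '--update-env-vars=GIT_BRANCH=$BRANCH_NAME,GIT_COMMIT=$COMMIT_SHA']
    images: ['gcr.io/$PROJECT_ID/my-app:$COMMIT_SHA']

The application can then read GIT_BRANCH and GIT_COMMIT from its environment and serve them however it likes.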

Related

In GCP, how can I trigger the automatic deployment of a Cloud Function into a Production project from source files present in a repo of the DEV project

I want to automate the deployment of a Cloud Function into a production project through Cloud Build, where the source files live in a Cloud Source Repository of the DEV project. How can I ensure that the moment I push code to the production branch of the DEV project's Cloud Source Repository, the Cloud Function gets created in the production project?
If I understand correctly, you are trying to trigger a build from a repository stored in another project.
This is not possible: build triggers must be in the same project as the repository.
I think my answer will help here: How to pass API parameters to GCP cloud build triggers
Basically, as Claudio recommended, use the examples there to build your steps. I believe what you want is a trigger that fires when you push changes to the production branch of the DEV project's repository, with a step that then deploys the Cloud Function (or calls the REST API to run a build by its ID). See my example above.
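As a rough sketch of what that step might look like: a cloudbuild.yaml in the DEV project (run by a trigger on its production branch) could deploy straight into the production project. The function name, runtime, and project ID below are illustrative, and the Cloud Build service account would need deploy permissions on the production project:

    steps:
      # Deploy the function into the production project; the trigger itself
      # lives in the DEV project, next to the repository
      - name: 'gcr.io/cloud-builders/gcloud'
        args: ['functions', 'deploy', 'my-function',
               '--project=prod-project-id',
               '--runtime=nodejs18',
               '--trigger-http',
               '--source=.']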

Can I run Cloud Build on my own VM instances

Cloud Build uses a worker pool of VMs that is not able to access my on-prem Compute Engine resources. Is there any way to run Cloud Build on my own VMs, or any other solution for this?
While waiting for the custom worker-pool feature you mentioned in your previous question to become available to the public, you can use the custom builder remote-builder.
You'll need to first build the builder image, which you'll then be able to use in your Cloud Build steps. When using the remote-builder image, the following will happen:
- A temporary SSH key will be created in your Container Builder workspace
- An instance will be launched with your configured flags
- The workspace will be copied to the remote instance
- Your command will be run inside that instance's workspace
- The workspace will be copied back to your Container Builder workspace
The build steps using this builder image will therefore run on a VM instance in your project's network and will be able to access other resources, provided your network configuration allows it.
Edit: The cos image used in the example cloudbuild.yaml file seems to include it, so you'd be able to run it directly. In case you'd like to customize your instances with specific software, you have several options (see the sketch after this list):
- create an instance template (based on a custom image that includes the software, or with a startup script that will install it at boot time) and specify that instance template in INSTANCE_ARGS in your cloudbuild.yaml;
- use a standard image and just pass the startup script installing the software as INSTANCE_ARGS;
- install it within a shell script executed in your build step.
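Here is a minimal sketch of a build step using that builder, assuming you have built and pushed the remote-builder image to your project as gcr.io/$PROJECT_ID/remote-builder per its README; the COMMAND and instance flags are illustrative:

    steps:
      # Runs COMMAND on a temporary VM inside your project's network;
      # the workspace is copied over first and copied back afterwards
      - name: 'gcr.io/$PROJECT_ID/remote-builder'
        env:
          - 'COMMAND=./build.sh'
          - 'INSTANCE_ARGS=--image-project cos-cloud --image-family cos-stable'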
Why can't you just fix the access issue? You can configure Cloud Build to create build workers within your VPC, inside your own cloud infrastructure. The following video explains how this works:
https://youtu.be/IUKCbq1WNWc?t=820
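With private worker pools (the productized form of this), an individual build can opt in from its cloudbuild.yaml; the pool resource name and the internal address below are illustrative:

    options:
      pool:
        name: 'projects/my-project/locations/us-central1/workerPools/my-pool'
    steps:
      # This step runs on a worker inside your VPC, so it can reach
      # private addresses that your network configuration exposes
      - name: 'gcr.io/cloud-builders/curl'
        args: ['http://10.128.0.5/healthz']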
Hope this helps.

Google Cloud Function - What is the default update behaviour for functions that rely on code in a Repo?

I have a Google Cloud Function that runs from source code stored in a Google Cloud Source Repository. If I update the source code in the repo, do I have to manually update the cloud function or is this done automatically?
There is no automatic deployment. You will have to run whatever command line you would normally run to deploy the code to Cloud Functions.
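If you do want pushes to the repo to redeploy the function automatically, you can wire that up yourself with a Cloud Build trigger; a minimal cloudbuild.yaml sketch, where the function name and runtime are illustrative:

    steps:
      # Re-run the usual deploy command on every push to the watched branch
      - name: 'gcr.io/cloud-builders/gcloud'
        args: ['functions', 'deploy', 'my-function',
               '--runtime=nodejs18',
               '--trigger-http',
               '--source=.']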

How come Google Cloud Build triggers can't find custom-named .yaml files?

The Problem
When using cloudbuild.yaml files specifically named for their build environment, such as cloudbuild-dev.yaml and cloudbuild-prod.yaml, and configured/targeted in the trigger settings, they aren't found/recognized when GCB reacts to a GitHub event (push etc.).
However, it works just fine when manually running the trigger from the GCB console.
When using an ordinarily named cloudbuild.yaml in the root of the project, Cloud Build correctly runs the expected steps.
The Workaround
In short, there isn't an easy one (imo). To get it to run, you need to use just a single cloudbuild.yaml.
However, to effectively re-use that for both dev and prod environments, one is blocked by this issue.
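For reference, the usual approach to re-using a single config is a user-defined substitution that each trigger overrides (subject to the issue mentioned above); a minimal sketch, with _ENV as an illustrative name:

    steps:
      # One shared config; the dev and prod triggers each override _ENV
      - name: 'gcr.io/cloud-builders/docker'
        args: ['build', '-t', 'gcr.io/$PROJECT_ID/app-${_ENV}:$SHORT_SHA', '.']
    substitutions:
      _ENV: 'dev'  # default; the prod trigger sets this to "prod"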

Google Cloud Container Builder not always triggering from Bitbucket

I have build triggers set up in the Google Cloud Container Builder that are set to trigger on specific branches and use the cloudbuild.yml config in the repo. For about the first day that I pushed commits to any of these branches, it triggered a container build and completed successfully. Since then, the triggers have only worked intermittently.
Sometimes Google Cloud Container Builder doesn't detect the commit at all (I have checked that the commit is in Bitbucket and on the right branch). At that point, I've tried manually triggering a build through the Google console, but it uses the older commit that it last built, not the latest one. So then I try pushing small changes to the repo or making an empty commit.
Sometimes that triggers the build, sometimes not. The interesting thing is that when the build finally triggers on a branch after a while, it will trigger builds on the other branches too, if they have a recent commit that hasn't been built.
I have no idea how to resolve this. Has anyone experienced a similar problem?
UPDATE:
I solved my problem. I originally added each Bitbucket repository to the Source Repositories in Google Cloud. After that, I added the build triggers for each of those repos in the Container Registry. When adding the triggers, I had to go through the same process of connecting to the repositories in Bitbucket as when adding the source repositories. I later realized this had automatically created a separate connection in the Source Repositories section for each of those repos, so I had two connections to each Bitbucket repo listed in Source Repositories. Once I deleted the duplicates, the triggers started working consistently.
In summary, make sure you don't have any duplicate connections in the Source Repository.