Google Cloud Function - What is the default update behaviour for functions that rely on code in a repo?

I have a Google Cloud Function that runs from source code stored in a Google Cloud Source Repository. If I update the source code in the repo, do I have to manually update the cloud function or is this done automatically?

There is no automatic deployment. You will have to run whatever command line you would normally run to deploy the code to Cloud Functions.
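For example, a minimal sketch of such a redeploy for an HTTP-triggered function; every angle-bracketed value, the runtime, and the region below are placeholders, not values from the question:

# Re-run the same deploy command after pushing to the repo; it picks up the latest source
gcloud functions deploy <MY-FUNCTION-NAME> \
  --source=https://source.developers.google.com/projects/<MY-PROJECT>/repos/<MY-REPO> \
  --trigger-http \
  --runtime=nodejs18 \
  --region=us-central1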

Related

How to pass branch/commit to Cloud Run instance?

I have several projects that run on Google Cloud Run. Cloud Build connects each instance to a corresponding branch of a Git repository. Each time a commit is pushed to a branch, a build is triggered to update the Cloud Run instance.
I'd like to be able to show information about the build within the Cloud Run application (e.g. branch and commit that the build has been built from). How can I pass this information from the repo/commit/build to the instance?
As @guillaume blaquiere stated in his comment:
You have the information in Cloud Build when it runs. You can get the data and paste them in your container somewhere. Then you have to serve them. Depends on your implementation.
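As a hedged sketch of that approach: a trigger-invoked cloudbuild.yaml can forward the built-in substitutions $BRANCH_NAME and $COMMIT_SHA into the image as Docker build args (the image name and arg names below are placeholders; your Dockerfile would need matching ARG/ENV lines to expose them to the app at runtime):

# cloudbuild.yaml (sketch): bake the trigger's branch/commit into the image
steps:
- name: 'gcr.io/cloud-builders/docker'
  args:
  - 'build'
  - '--build-arg=BRANCH_NAME=$BRANCH_NAME'
  - '--build-arg=COMMIT_SHA=$COMMIT_SHA'
  - '-t'
  - 'gcr.io/$PROJECT_ID/my-service:$COMMIT_SHA'
  - '.'
images:
- 'gcr.io/$PROJECT_ID/my-service:$COMMIT_SHA'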

How to re-deploy (update) a Cloud Function via CLI?

I have created an HTTP-triggered Cloud Function via the GUI, which uses source code from a repository of mine. If the source code changes, I can fairly easily re-deploy (update) the Cloud Function manually via the UI by clicking Edit -> Code -> Deploy.
I would like to set up a CI/CD pipeline, using Cloud Build, with a trigger on the repo, so that when the source code changes (on the master branch), the Cloud Function is re-deployed. In order to do this, I need to figure out how to re-deploy the Cloud Function via the CLI. Reading the cloud functions deploy docs, it says "create or update a Google Cloud Function". However much I try, though, I don't manage to update my existing Cloud Function, only to create new ones.
I've tried updating it as specified in this SO answer, by running:
gcloud functions deploy <MY-FUNCTION-NAME> --source=https://source.developers.google.com/projects/<MY-PROJECT>/repos/<MY-REPO>/paths/<FOLDER-PATH>
which gives the error One of arguments [--trigger-topic, --trigger-bucket, --trigger-http, --trigger-event] is required: You must specify a trigger when deploying a new function. Notice the ... when deploying a new function.
Any ideas of how I can re-deploy (update) my existing one, and then (automatically), also use the latest source code?
After a lot of testing different stuff, I finally figured it out. In order to re-deploy the same Cloud Function, I needed to specify all the arguments that define my Cloud Function; the required ones alone were not enough. To re-deploy:
gcloud functions deploy <MY-FUNCTION-NAME> --source=https://source.developers.google.com/projects/<MY-PROJECT>/repos/<MY-REPO>/moveable-aliases/<MY-BRANCH>/paths/<FOLDER-PATH> --trigger-http --runtime=<RUNTIME> --allow-unauthenticated --entry-point=<ENTRY-POINT> --region=<REGION> --memory=<MEMORY>
You need to pass all the arguments you used in the first deploy again, so that gcloud concludes it is the same function. Take a look here for more on creating CI pipelines for functions with Cloud Build.
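As a concrete sketch of such a pipeline, here is a cloudbuild.yaml that a repo trigger could run on every push to master; every value below (function name, runtime, entry point, region) is a placeholder, and the Cloud Build service account must be allowed to deploy functions (e.g. the Cloud Functions Developer role):

# cloudbuild.yaml (sketch): re-deploy the function from the checked-out source
steps:
- name: 'gcr.io/cloud-builders/gcloud'
  args:
  - 'functions'
  - 'deploy'
  - 'my-function'
  - '--source=.'
  - '--trigger-http'
  - '--runtime=nodejs18'
  - '--entry-point=handler'
  - '--region=us-central1'
  - '--allow-unauthenticated'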

In GCP, how can I trigger the automatic deployment of a Cloud Function into a Production project from the source files present in the repo of a DEV project?

I want to automate the deployment of a Cloud Function into a Production project through Cloud Build, where the source files are in a Cloud Source Repository of a DEV project. How can I ensure that the moment I push the code to the production branch of the DEV project's Cloud Source Repository, the Cloud Function gets created in the Production project?
If I understand correctly, you are trying to trigger a build from a repository stored in another project.
This is not possible; build triggers must be in the same project as the repositories.
I think my answer here will help: How to pass API parameters to GCP cloud build triggers
Basically, as Claudio recommended, use the examples there to build your steps. I believe what you want is a trigger that fires when you push changes to the production branch of the DEV repo. When the trigger runs, add a step that either deploys the Cloud Function or uses the REST API to run the build in the production project by its trigger ID. See the example in the linked answer.
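As a hedged sketch of that REST route, an existing trigger in the production project can be run by ID (PROD_PROJECT_ID and TRIGGER_ID are placeholders, and the caller needs Cloud Build permissions in that project):

# Run an existing Cloud Build trigger in another project via the REST API
curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  "https://cloudbuild.googleapis.com/v1/projects/PROD_PROJECT_ID/triggers/TRIGGER_ID:run" \
  -d '{"branchName": "production"}'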

Is it possible to call Google Cloud Build via API with cloudbuild.yaml in source repo

I have a Cloud Build trigger that is set up to fetch a specific branch from a remote repository and use the cloudbuild.yaml file from it to run the build. I need to do the same programmatically via an API call, and the only way I can see this happening is with the API call here: https://cloud.google.com/cloud-build/docs/api/reference/rest/v1/projects.builds/create.
Looking at the required request body, which is of type Build (https://cloud.google.com/cloud-build/docs/api/reference/rest/Shared.Types/Build), it seems that while I can provide the source repository (branch, commit or tag), I'm not sure how to provide the path to cloudbuild.yaml, which in this case really needs to sit with the repo source code. Also, a steps param seems to be required, which would have to contain the steps from the cloudbuild file.
Is there any way that I can create a build using the same resources that I would otherwise set up in a build trigger (the path to cloudbuild.yaml instead of the steps, which I don't have access to in this case)?
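For what it's worth, a sketch of the shape projects.builds.create actually accepts: the Build resource inlines its steps, and as far as I know there is no field that points at a cloudbuild.yaml inside the repo (that mapping lives on the trigger). PROJECT_ID, REPO_NAME and the single step below are placeholders:

# Sketch of a direct builds.create call; steps must be inlined in the request
curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  "https://cloudbuild.googleapis.com/v1/projects/PROJECT_ID/builds" \
  -d '{
    "source": {"repoSource": {"repoName": "REPO_NAME", "branchName": "master"}},
    "steps": [{"name": "gcr.io/cloud-builders/gcloud", "args": ["version"]}]
  }'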

Do I need to deploy a function in gcloud in order to have OCR?

This GCloud tutorial has a "Deploying the function" step, such as:
gcloud functions deploy ocr-extract --trigger-bucket YOUR_IMAGE_BUCKET_NAME --entry-point
But the Quickstart: Using Client Libraries does not mention it at all; all it needs is
npm install --save @google-cloud/storage
then a few lines of code will work.
So I'm confused, do I need the "deploy" in order to have OCR, in other words what do/don't I get from "deploy"?
The command
npm install --save @google-cloud/storage
is an example of installing the Google Cloud Client Library for Node.js in your development environment, in this case for the Cloud Storage API. This example is part of the Setting Up a Node.js Development Environment tutorial.
Once you have coded, tested and set all the configurations for the app as described in the tutorial, the next step is deployment, in this example as a Cloud Function:
gcloud functions deploy ocr-extract --trigger-bucket YOUR_IMAGE_BUCKET_NAME --entry-point
So note that these commands are two different steps for running OCR with Cloud Functions, Cloud Storage and other Cloud Platform components in the tutorial's Node.js example: one installs the client library locally, the other deploys the function.
While a Cloud Function (CF) is easy to understand, this specifically answers my own question of what the "Deploy" actually does:
For the code to work for you, it must be deployed/uploaded to Google Cloud. For people like me who have never used GCF, this is new. My understanding was that all I needed to supply was credentials, satisfying whatever server/backend (sorry, cloud) settings applied when my local app called the remote Web API; that's where I got stuck. The key I missed is that the sample app itself is a set of server/backend event-handler trigger functions, and Google therefore requires them to be "deployed", just as we deploy something during a staging or production release in a traditional corporate environment. So it's a real deploy. If you still don't get it, go to your GC admin page, open the menu, then Cloud Functions, "Overview" tab; you will see them there. Which leads to the next point:
The 3 gcloud deploy commands used in Deploying the Functions use ocr-extract, ocr-save and ocr-translate; these are not switches, they are function names that you can set to anything. Now, still in the admin page, click on any of the 3, then "Source". Bang, they are there, deployed (uploaded).
Google: as this is a tutorial and no one has dug into the command reference book yet, I recommend adding a note telling readers that those 3 ocr-* names can be anything you want.