Google Kubernetes Engine CI/CD with Git

I'm relatively new to Google Kubernetes Engine and Google Cloud Platform.
I managed to use and connect the following services:
Cloud Source Repositories
Cloud Build and Container Registry
Kubernetes Engine
I'm currently using git bash on my local machine to push my code to Cloud Source Repositories. Cloud Build then builds the image and creates a new artifact. Each time I change my app and push the changes, a new artifact is created. I would then manually copy the new artifact into my Kubernetes workload and perform a rolling update.
Is there a better way to automate this, e.g. CI/CD without the manual update step?

You can set the rolling update strategy in your deployment spec from the beginning.
You can then use Cloud Build to push new images to your cluster once the image has been built, instead of manually going to the GKE console and updating the image.
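A minimal sketch of both pieces, assuming a trigger-driven build (the deployment, cluster, zone, and image names below are placeholders, not from the question):

# deployment.yaml (excerpt): declare the rolling update strategy up front
spec:
  strategy:
    type: RollingUpdate
    rollingUpdate:
      maxSurge: 1
      maxUnavailable: 0

# cloudbuild.yaml (excerpt): after the build and push steps, retarget the
# deployment at the freshly built image; the kubectl builder reads the
# target zone and cluster from these environment variables
- name: gcr.io/cloud-builders/kubectl
  args: ['set', 'image', 'deployment/my-app', 'my-app=gcr.io/$PROJECT_ID/my-app:$SHORT_SHA']
  env:
  - 'CLOUDSDK_COMPUTE_ZONE=us-central1-a'
  - 'CLOUDSDK_CONTAINER_CLUSTER=my-cluster'

The kubectl set image step then triggers the RollingUpdate declared in the spec, so pods are replaced gradually instead of all at once.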


How to auto-deploy to Compute Engine from Artifact Registry?

Context
I'm working on an app. The code is in a Cloud Source Repository. I've set up a Build Trigger with Cloud Build so that when I push new commits, the app is automatically built: it's containerized and the image is pushed to the Artifact Registry.
I also have a Compute Engine VM instance with a Container-Optimized OS. It's set up to use my app's container image. So when I start the VM, it pulls the latest image from the Artifact Registry and runs the container.
Issue
So, currently, deploying involves two steps:
Pushing new commits, which updates the container in the Artifact Registry.
Restarting my VM, which pulls the new container from the Artifact Registry.
Is there a way to combine these two steps?
Build Triggers detect code changes to trigger builds. Is there a similar way to automatically trigger deployments from the Artifact Registry to Compute Engine?
Thank you.
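One way to fold the restart into the build itself (a sketch, not from the original post; the instance name, zone, and image path are placeholders) is to append a step to the existing Cloud Build trigger that points the Container-Optimized OS VM at the new image and restarts its container:

# cloudbuild.yaml (excerpt): hypothetical final step after the image push
- name: gcr.io/cloud-builders/gcloud
  args:
  - compute
  - instances
  - update-container
  - my-vm
  - --zone=us-central1-a
  - --container-image=us-central1-docker.pkg.dev/$PROJECT_ID/my-repo/my-app:latest

gcloud compute instances update-container pulls the named image and restarts the container without recreating the VM, which collapses the two steps into a single push.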

Deploy to GKE when there is a new image in the Container Registry

I have a very interesting task.
I'm using 5 different triggers to build 5 images on every push to a git branch.
After they are built, I push them to 5 different GCP container registries.
I want to automate the GKE deployment, so that when there is a new image in a registry it is deployed to GKE automatically.
How can I achieve that?
I propose something very similar. Instead of using Container Registry, use Artifact Registry: create a Docker repository and use it exactly as you use Container Registry.
Then, activate the audit logs on Artifact Registry.
You only need the data write audit logs in this case.
Finally, create a Cloud Logging sink to Pub/Sub with this filter:
protoPayload.serviceName="artifactregistry.googleapis.com"
protoPayload.methodName="Docker-PutManifest"
protoPayload.authorizationInfo.resource="projects/<PROJECT_ID>/locations/<YOUR_LOCATION>/repositories/<REPOSITORY_NAME>"
Now, you only have to listen for this event with a Cloud Function or a Cloud Run service and do what you want, for example deploy a new revision.
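For instance, wiring the sink to Pub/Sub could look like this (a sketch; the topic and sink names are placeholders, and the filter is the one above):

gcloud pubsub topics create ar-push-events

gcloud logging sinks create ar-push-sink \
  pubsub.googleapis.com/projects/<PROJECT_ID>/topics/ar-push-events \
  --log-filter='protoPayload.serviceName="artifactregistry.googleapis.com"
AND protoPayload.methodName="Docker-PutManifest"
AND protoPayload.authorizationInfo.resource="projects/<PROJECT_ID>/locations/<YOUR_LOCATION>/repositories/<REPOSITORY_NAME>"'

Remember to grant the sink's writer identity the Pub/Sub Publisher role on the topic, then subscribe your Cloud Function or Cloud Run service to it.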

How to deploy a container to multiple GCP projects and host with Cloud Run?

We have a requirement to deploy our application to multiple GCP projects, with new projects being provisioned by Terraform. We have configured Cloud Build in each project with Terraform, but we run into issues when Cloud Build attempts to access the Source Repo in our centralized project.
We would prefer not to clone the repo, but rather instruct Cloud Build to consume and deploy from the central repo. It is also important that Cloud Build updates each project as new code is deployed.
You should use a central project to run a single Cloud Build trigger that builds the container image, pushes it to a registry in that project, and deploys to the Cloud Run services in the other projects.
In order for the Cloud Build trigger to be allowed to deploy to Cloud Run in other projects, follow these instructions to grant the Cloud Build service agent the appropriate permissions on the other projects.
In order for Cloud Run to be able to import images from the central project, make sure you follow these instructions for the service agents of each project.
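A sketch of such a central cloudbuild.yaml (the image, service, region, and target project IDs are placeholders; repeat the deploy step once per target project):

steps:
# build and push the image inside the central project
- name: gcr.io/cloud-builders/docker
  args: ['build', '-t', 'gcr.io/$PROJECT_ID/my-app:$SHORT_SHA', '.']
- name: gcr.io/cloud-builders/docker
  args: ['push', 'gcr.io/$PROJECT_ID/my-app:$SHORT_SHA']
# deploy the central image into another project's Cloud Run service
- name: gcr.io/cloud-builders/gcloud
  args:
  - run
  - deploy
  - my-app
  - --image=gcr.io/$PROJECT_ID/my-app:$SHORT_SHA
  - --region=us-central1
  - --platform=managed
  - --project=target-project-1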

Can I use Cloud Build to perform CI/CD tasks in VM instances?

I'm using Google Cloud Platform and exploring its CI/CD tools.
I have an app deployed in a VM instance and I'm wondering if I can use a GCP tool such as Cloud Build to do CI/CD instead of using Jenkins.
From what I've learned over several resources, Cloud Build seems to be a nice tool for Cloud Run (deploying Docker images) and Cloud Functions.
Can I use it for apps deployed in VM instances?
When you create a job in Cloud Build, you set up a cloudbuild.yaml file in which you specify the build steps. How do you write a step so that it SSHes into a Linux VM, logs in as a particular user, cds into a directory, pulls the master branch of the project repo, and starts running main.py (say it's a Python project)?
You can do it like this:

- name: gcr.io/cloud-builders/gcloud
  entrypoint: "bash"
  args:
  - "-c"
  - |
    # quote the remote commands so they all execute on the VM, not locally
    gcloud compute ssh --zone us-central1-a my_user@oracle -- 'whoami; ls -la; echo "cool"'
However, it's not a cloud-native solution for deploying an app. VMs aren't "pets" but "cattle"; that means, when you no longer need one, kill it, no emotion!
So, a modern way to use the cloud is to create a new VM with the new version of your app. Optionally, you can keep the previous VM stopped (so you pay nothing for it) in case of rollback. To achieve this, you can add a startup script which installs all the required packages and libraries plus your app on the VM, and starts it.
An easier way is to create a container. That way, all the system dependencies are inside the container and the VM doesn't need any customization: it simply downloads the container and runs it.
Cloud Build lets you create a VM with a startup script via the gcloud CLI, and stop the previous one. Do you have a persistent disk to reuse (for the data shared between versions)? With Cloud Build you can also clone it and attach it to the new VM, or detach it from the previous one and attach it to the new one!
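A sketch of that replace-instead-of-patch flow, run from a Cloud Build step (the instance names and zone are placeholders; $PROJECT_ID and $SHORT_SHA are Cloud Build substitutions):

# create a fresh VM that runs the newly built container image
gcloud compute instances create-with-container my-app-v2 \
  --zone=us-central1-a \
  --container-image=gcr.io/$PROJECT_ID/my-app:$SHORT_SHA

# keep the old VM around, stopped, in case you need to roll back
gcloud compute instances stop my-app-v1 --zone=us-central1-a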

Deploying a custom build of Datalab to Google Cloud platform

For a project we are trying to extend Google Cloud Datalab and deploy the modified version to the Google Cloud Platform. As I understand it, the deployment process normally consists of the following steps:
Build the Docker image
Push it to the Container Registry
Use the container parameter with the Google Cloud deployer to specify the correct Docker image, as explained here.
Since the default container registry, i.e. gcr.io/cloud_datalab/datalab:<tag> is off-limits for non-Datalab contributors, we pushed the Docker image to our own container registry, i.e. to gcr.io/<project_id>/datalab:<tag>.
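For reference, pushing to our own registry looked roughly like this (a sketch; <project_id> and <tag> are the placeholders used above, and the gcloud docker wrapper was the era-appropriate way to push to gcr.io):

docker build -t gcr.io/<project_id>/datalab:<tag> .
gcloud docker -- push gcr.io/<project_id>/datalab:<tag>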
However, the Google Cloud deployer only pulls directly from gcr.io/cloud_datalab/datalab:<tag> (with the tag specified by the container parameter) and does not seem to allow specification of the source container registry. The deployer does not appear to be open source, leaving us with no way to deploy our image to Google Cloud.
We have looked into creating a custom deployment similar to the example listed here, but this never starts Datalab, so we suspect the start script is more complicated.
Question: How can we deploy a Datalab image from our own container registry to Google Cloud?
Many thanks in advance.
The deployment parameters can be guessed, but it is easier to get the Google Cloud Datalab deployment script by SSHing into the temporary compute node that is responsible for deployment and browsing the /datalab folder. This contains a runtime configuration file for use with the App Engine flexible environment. Using this configuration file, the gcloud preview app deploy command (which accepts an --image parameter for Docker images) will deploy this to App Engine correctly.
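Based on that answer, the final invocation would look something like this (a sketch; the app.yaml path stands in for the recovered runtime configuration file, and the flag spelling comes from the answer above and may vary by SDK version):

gcloud preview app deploy ./app.yaml \
  --image gcr.io/<project_id>/datalab:<tag>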