Deploy to GKE when there is a new image in Container Registry - google-cloud-platform

I have a very interesting task.
I'm using 5 different triggers to build 5 images on every push to a git branch.
After they are built, I push them to 5 different GCP Container Registry repositories.
I want to automate the GKE deployment, so that when there is a new image in the registry it is deployed to GKE automatically.
How can I achieve that?

I propose something very similar. Instead of using Container Registry, use Artifact Registry: create a Docker repository and use it exactly as you use Container Registry.
Then, activate the audit logs on Artifact Registry.
You only need the Data Write audit logs in this case.
Finally, create a Cloud Logging sink to Pub/Sub with this filter:
protoPayload.serviceName="artifactregistry.googleapis.com"
protoPayload.methodName="Docker-PutManifest"
protoPayload.authorizationInfo.resource="projects/<PROJECT_ID>/locations/<YOUR_LOCATION>/repositories/<REPOSITORY_NAME>"
Now, you only have to listen to this event with a Cloud Function or a Cloud Run service and do what you want, for example deploy a new revision.
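A minimal sketch of such a listener, written as a background Cloud Function subscribed to the sink's Pub/Sub topic. The function name is illustrative, and the deployment step at the end is left as a placeholder:

import base64
import json

def on_artifact_push(event, context):
    # The Pub/Sub message body is the LogEntry exported by the sink
    entry = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
    payload = entry.get("protoPayload", {})
    # e.g. "projects/<PROJECT_ID>/locations/<LOCATION>/repositories/<REPO>"
    resource = payload.get("authorizationInfo", [{}])[0].get("resource", "")
    print(f"New manifest pushed to {resource}")
    # ...trigger your deployment here (GKE rollout, new Cloud Run revision, etc.)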

Related

Gitlab & AWS parameter store

We want to save all our AWS account credentials in AWS Parameter Store for better security.
Now the question is:
How can we use the credentials stored in AWS Parameter Store in GitLab for deployment?
In your project, you can configure .gitlab-ci.yml to do many things, one of them being deploying your application. There are many ways to do that; one of them is to:
Build a Docker image of your project
Push the image to ECR
Create a new ECS task definition with the new version of your Docker image
Update the ECS service with the new version of the task definition
To do so, you effectively need the AWS credentials that you have configured in your GitLab repository.
Beyond that, there are many ways to deploy from GitLab to AWS; it depends on your company and what tools you are using.
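As a minimal sketch of reading those credentials from Parameter Store at deploy time with boto3 (the region and parameter name below are placeholders):

import boto3

ssm = boto3.client("ssm", region_name="eu-west-1")  # placeholder region

# SecureString values are decrypted transparently with WithDecryption=True
param = ssm.get_parameter(
    Name="/ci/aws_secret_access_key",  # placeholder parameter name
    WithDecryption=True,
)
secret_value = param["Parameter"]["Value"]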

How to auto-deploy to Compute Engine from Artifact Registry?

Context
I'm working on an app. The code is in a Cloud Source Repository. I've set up a Build Trigger with Cloud Build so that when I push new commits, the app is automatically built: it's containerized and the image is pushed to the Artifact Registry.
I also have a Compute Engine VM instance with a Container-Optimized OS. It's set up to use my app's container image. So when I start the VM, it pulls the latest image from the Artifact Registry and runs the container.
Issue
So, currently, deploying involves two steps:
Pushing new commits, which updates the container in the Artifact Registry.
Restarting my VM, which pulls the new container from the Artifact Registry.
Is there a way to combine these two steps?
Build Triggers detect code changes to trigger builds. Is there a similar way to automatically trigger deployments from the Artifact Registry to Compute Engine?
Thank you.
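One way to collapse step 2 into the pipeline is to reset the VM programmatically, for example from a final Cloud Build step. A minimal sketch with the Compute Engine API client, where project, zone, and instance names are placeholders:

from googleapiclient import discovery

# Resetting the instance makes the Container-Optimized OS pull
# the newest image on boot. All identifiers below are placeholders.
compute = discovery.build("compute", "v1")
compute.instances().reset(
    project="my-project",
    zone="us-central1-a",
    instance="my-app-vm",
).execute()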

Deploy a container to Cloud Run after pushing to Artifact Registry automatically

I want to set up a connection (without writing code) between Google Cloud Artifact Registry and Cloud Run. So on every push, I want to create/update a service on Cloud Run with the same name.
Is this possible?
No, you can't. The system that pushes the image must itself trigger the deployment of a new Cloud Run revision with the new image (or the latest tag if you prefer) right after the push.
Nothing automatic is possible on a push event to Artifact Registry.
I would suggest looking at Container Registry instead, which publishes push notifications to a Pub/Sub topic, so you can wire up automatic events.
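For the push-side trigger described above, a minimal sketch that redeploys the service right after the image push by shelling out to gcloud; the service, image, and region are placeholders:

import subprocess

# Deploying with the same service name creates a new revision
# pointing at the freshly pushed image.
subprocess.run(
    [
        "gcloud", "run", "deploy", "my-service",
        "--image", "europe-west1-docker.pkg.dev/my-project/my-repo/my-app:latest",
        "--region", "europe-west1",
    ],
    check=True,
)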

AWS-ECS Task Services restart automation

We are currently running a microservices application with a serverless architecture using AWS ECS. Whenever we deploy or update an artifact in ECR, we need to restart the services by changing the task count from 0 to 1 and back to pick up the new artifact. This process is very manual and takes several steps, and I want to automate it. Is it possible to use AWS Lambda or CloudWatch, or any other configuration, to skip the manual process? What kind of code, language, and automation examples do I need to achieve this?
Take a look at the ecs-deploy script. Basically, it will replace an existing service with the latest (or a specific) image from ECR. So if you have automation to update ECR with the latest image, this script will deploy that image to ECS.
A setup which could work if you have a CI/CD pipeline: upon uploading to ECR, trigger a Lambda which restarts the corresponding service, supplying any variables to the Lambda such as the ECR tag to pull or the service name. A minimal sketch follows.
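This sketch assumes the Lambda is wired to the EventBridge "ECR Image Action" event and that the ECR repository name matches the ECS service name; both are assumptions to adjust to your setup:

import boto3

ecs = boto3.client("ecs")

def handler(event, context):
    # EventBridge delivers ECR push events with details under "detail"
    detail = event.get("detail", {})
    if detail.get("action-type") != "PUSH":
        return
    service = detail["repository-name"]  # assumption: repo name == service name
    ecs.update_service(
        cluster="my-cluster",  # placeholder cluster name
        service=service,
        forceNewDeployment=True,
    )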
ECS has an option to restart services with ForceNewDeployment. In Python, the call would look like this:

import boto3

client = boto3.client("ecs")

# myClusterName and serviceToUpdate hold your cluster and service names
updateTrigger = client.update_service(
    cluster=myClusterName,
    service=serviceToUpdate,
    desiredCount=1,
    forceNewDeployment=True,
)
From https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/ecs.html#ECS.Client.update_service

Google Kubernetes Engine CI/CD with Git

I'm relatively new to Google Kubernetes Engine and Google cloud platform.
I managed to use and connect the following services:
Source Repositories
Cloud Build and Container Registry
Kubernetes Engine
I'm currently using git bash on my local machine to push changes to Google Source Repositories. Cloud Build builds the image and creates a new artifact. Each time I change my app and push the changes to the repository, a new artifact is created. I would then copy the new artifact into my Kubernetes workload by hand via a rolling update.
Is there a better way to automate this, e.g. CI/CD without the manual update step?
You can set the rolling update strategy in your deployment spec from the beginning.
You can then use Cloud Build to push new images to your cluster once the image has been built, instead of manually going to the GKE console and updating the image.
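A minimal sketch of what that Cloud Build step boils down to, using the kubernetes Python client; deployment, namespace, and image names are placeholders. Changing the image in the pod template is what triggers the rolling update defined in the spec:

from kubernetes import client, config

config.load_kube_config()  # or load_incluster_config() when run in-cluster
apps = client.AppsV1Api()

# Patch only the container image; the Deployment's rolling update
# strategy handles the rollout.
patch = {"spec": {"template": {"spec": {"containers": [
    {"name": "my-app", "image": "gcr.io/my-project/my-app:NEW_TAG"},
]}}}}
apps.patch_namespaced_deployment(name="my-app", namespace="default", body=patch)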