Google Cloud Build - How to backup builds? - google-cloud-platform

I have Google Cloud Build and Kubernetes Engine set up in my project, and I want to back up my builds to another project. I am doing this in order to have a backup in case of a disaster, so I will be able to restore the builds.
I noticed that all of the builds are saved into a bucket called: artifacts.{project-id}.appspot.com
Option I came up with
Transferring this bucket to another project.
This will physically back up these builds.
Questions
If the original project gets deleted, will this be enough for me to restore the builds? How would I do that?
What other ways are there to back up these builds?

Cloud Build creates a Docker image and uploads it to Google Container Registry.
Answer to 1:
Yes. If the bucket is transferred from project A to project B, then even if project A is deleted, the images in project B will not be affected.
Answer to 2:
You can copy it from one Container Registry location to another, or download it to your local computer.
To copy the Docker image in Container Registry to another location, you can use the following command from Cloud Shell:
gcloud container images add-tag \
[SOURCE_HOSTNAME]/[SOURCE_PROJECT-ID]/[SOURCE_IMAGE]:[SOURCE_TAG] \
[DESTINATION_HOSTNAME]/[DESTINATION_PROJECT-ID]/[DESTINATION_IMAGE]:[DESTINATION_TAG]
The hostnames will be one of: gcr.io, eu.gcr.io, us.gcr.io, asia.gcr.io.
The project IDs are the source and destination project IDs, and the image names and tags are the ones you choose for the image.
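For example, a minimal sketch with placeholder project IDs (my-source-project and my-backup-project) and a placeholder image name (my-app):
gcloud container images add-tag \
gcr.io/my-source-project/my-app:v1 \
gcr.io/my-backup-project/my-app:v1
This tags the image into the destination project's registry without pulling it locally, provided you have push access to that project.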

Related

Google Cloud Container Registry Subdirectory Create

I have an image in registry1 and want to copy it over to registry2.
I assume I would get that done by running this command (as stated here):
gcloud container images add-tag \
gcr.io/registry1/image1:tag1 \
gcr.io/registry2/subdirectory/image1:tag1
But I want to create a new subdirectory within registry2 that the image is moved to, such as:
gcr.io/registry2/subdirectory/image1:tag1. How can I create it?
Also, I'm super new to GCR. Are these called subdirectories, folders?
These are called repositories, not folders or directories! See here.
BTW, Google Container Registry (gcr) doesn't provide full control over these repositories; moreover, you can't assign specific IAM roles to a specific repository. To get these options, move to Artifact Registry.
To create a repository in gcr, you need to have write access to the registry. And don't do it with gcloud; that is, don't re-tag the source image while it is still in your source repo.
To do that, just pull the source image locally, then re-tag it using docker with the full tag
gcr.io/registry2/repository/image1:tag1
then push it, and that's it! Your image should be pushed to the destination repository in registry2. See Pushing an image to a registry.
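A minimal sketch of that pull/re-tag/push flow, using the image and registry names from the question (you need pull access to registry1 and push access to registry2):
docker pull gcr.io/registry1/image1:tag1
docker tag gcr.io/registry1/image1:tag1 gcr.io/registry2/repository/image1:tag1
docker push gcr.io/registry2/repository/image1:tag1
The repository path comes into existence on the first push; there is no separate "create repository" step.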

Backup AWS Workspace file system

I have a WorkSpace in AWS WorkSpaces with a lot of configuration files, installed software, and files with templates, shell scripts, and code, so it's fully configured.
My problem is that when I try to create an image, I lose everything but the installed software. So, does anybody know how I can create a backup of my AWS WorkSpace, to avoid having to reconfigure the desktop in the terrible case where my images and my workspaces are accidentally removed?
Thanks.
As per the official docs,
A custom image contains only the OS, software, and settings for the WorkSpace. A custom bundle is a combination of both that custom image and the hardware from which a WorkSpace can be launched.
It seems the image does not carry forward personal settings like the wallpaper or browser settings. I experienced this myself.
However, if you are worried about losing your configuration when the WorkSpace becomes unhealthy, you can use the Rebuild or Restore option.
By default, AWS takes automatic snapshots of the root and user volumes of your WorkSpace every 12 hours.
You can read more about this here.
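As a sketch, both options can also be triggered from the AWS CLI; ws-0123456789abcdef0 below is a placeholder WorkSpace ID, and you should check the current CLI reference before relying on these subcommands:
# Rebuild: recreates the root volume from the bundle image
aws workspaces rebuild-workspaces --rebuild-workspace-requests WorkspaceId=ws-0123456789abcdef0
# Restore: rewinds the volumes to the last healthy snapshots
aws workspaces restore-workspace --workspace-id ws-0123456789abcdef0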
in terrible case where my images and my workspaces was accidentally removed
If your WorkSpace is deleted/terminated, no data can be retrieved.

How to create a folder in Google Drive Sync created cloud directory?

This question assumes you have used Google Drive Sync, or at least have knowledge of what files it creates in your cloud drive.
While using rclone to sync a local Ubuntu directory to a Google Drive (a.k.a. gdrive) location, I found that rclone wasn't able to create a directory there (error googleapi: Error 500: Internal Error, internalError; the Google Cloud Platform API console revealed that the gdrive API call drive.files.create was failing).
By location I mean the root of the directory structure that the Google Drive Sync app creates in the cloud (e.g. in Computers/laptopName/(syncedFolder1, syncedFolder2, ...)). In the current case, the gdrive sync app (famously unavailable on Linux) was running on a separate Windows machine. It was in this location that rclone wasn't able to create a dir.
Forget rclone: trying to manually create the folder in the web app also fails, as follows.
Working...
Could not execute action
Why is this happening, and how can I achieve this: making a directory in the cloud location where gdrive sync has put all my synced folders?
Basically, you can't. I found an explanation here.
If I am correct in my suspicion, there are a few things you have to understand:
Even though you may be able to create folders inside the Computers isolated containers, doing so will immediately create that folder not only in your cloud, but on that computer/device. Any changes to anything inside the Computers container will automatically be synced to the device/computer the container is linked to- just like any change on the device/computer side is also synced to the cloud.
It is not possible to create anything at the "root" level of each container in the cloud. If that were permitted then the actual preferences set in Backup & Sync would have to be magically altered to add that folder to the preferences. Thus this is not done.
So while folders inside the synced folders may be created, no new modifications may be made in the "root" dir.

Can I run Cloud Build on my own VM instances?

Cloud Build uses a worker pool of VMs that is not able to access my on-prem Compute Engine resources. So, is there any way to run Cloud Build on my own VMs, or any other solution for this?
While waiting for the custom worker-pool feature you mentioned in your previous question to become publicly available, you can use the custom builder remote-builder.
You'll need to first build the builder image that you'll then be able to use in your Cloud Build steps. When using the remote-builder image, the following will happen:
A temporary SSH key will be created in your Container Builder workspace
An instance will be launched with your configured flags
The workspace will be copied to the remote instance
Your command will be run inside that instance's workspace
The workspace will be copied back to your Container Builder workspace
The build steps using this builder image will therefore run on a VM instance in your project's network and will be able to access other resources, provided your network configuration allows it.
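As a rough sketch, usage could look like the following shell snippet. COMMAND, ZONE and INSTANCE_ARGS are options described by the remote-builder project; ./build.sh, the zone, and my-network are placeholders of mine:
# One-time: build the remote-builder image from the community builders repo
git clone https://github.com/GoogleCloudPlatform/cloud-builders-community.git
(cd cloud-builders-community/remote-builder && gcloud builds submit --config cloudbuild.yaml .)
# Then reference it in your own build config
cat > cloudbuild.yaml <<'EOF'
steps:
- name: gcr.io/$PROJECT_ID/remote-builder
  env:
  - COMMAND=./build.sh
  - ZONE=us-central1-b
  - INSTANCE_ARGS=--network my-network
EOF
gcloud builds submit --config cloudbuild.yaml .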
Edit: The cos image used in the example cloudbuild.yaml file seems to include it, so you'd be able to run it directly. In case you'd like to customize your instances with specific software, you have several options:
you can create an instance template (based on a custom image that includes the software or with a startup script that will install it at boot time) and specify that instance template in INSTANCE_ARGS in your cloudbuild.yaml.
you can use a standard image and just pass the startup script installing the software as INSTANCE_ARGS.
you can install it within a shell script executed in your build step.
Why can't you just fix the access issue? You can configure Cloud Build to create build workers within the VPC of your cloud infrastructure:
See the following video, which explains how this works:
https://youtu.be/IUKCbq1WNWc?t=820
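In current gcloud this corresponds to private worker pools peered to your VPC; a sketch of creating one, where the pool name, region, project and network are placeholders and flag spellings may have changed since this was written:
gcloud builds worker-pools create my-pool \
--region=us-central1 \
--peered-network=projects/my-project/global/networks/my-network
A build can then target the pool via the --worker-pool flag of gcloud builds submit.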
Hope this helps.

GCP: Duplicate an existing project in Google Cloud Platform

Is there a way to duplicate an entire project?
The project contains:
2x Cloud SQL: main + backup
1x Cloud Storage
4x Google Compute Engine
We already have the exact same project built up and configured, so it would be much easier for us if we could just make a copy of it.
The projects are not under the same account.
There is no way to replicate a project as-is.
However, you can use Terraformer, starting from your current project: this CLI tool generates Terraform template files from the existing infrastructure (reverse Terraform). Then, you can use these files to re-create the target resources within a second GCP project in a programmatic fashion (see https://cloud.google.com/community/tutorials/getting-started-on-gcp-with-terraform).
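A hedged sketch of that workflow, assuming Terraformer is installed with its google provider; the project ID and region are placeholders, and the exact names accepted by --resources may vary by Terraformer version:
terraformer import google \
--resources=gcs,instances,cloudsql \
--projects=my-source-project \
--regions=us-central1
The generated Terraform files can then be pointed at the second GCP project and applied with the usual terraform init/plan/apply cycle.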
Disclaimer: Comments and opinions are my own and not the views of my employer.