Can Google Cloud Repositories be shared cross projects? - google-cloud-platform

I am trying to set up a continuous development system for building an app, and I would like to know whether this idea is feasible within GCP:
Project A - Hosts Cloud Source Repository
Project B - Cloud Run for the app
On project B, I have the Cloud Run option 'Continuously deploy new revisions from a source repository', which I would like to point at the CSR in project A.
My question is: can CSR be shared cross-project, or do I need to use GitHub or Bitbucket to share code between projects?

You can access your Cloud Source Repository from any project as long as your account (service or user) has permission to access it.
However, you can't configure a Cloud Build trigger on a Cloud Source Repository that is in another project (the continuous deployment option on Cloud Run configures a Cloud Build trigger behind the scenes for you; it's simply a shortcut).
But you can create a Cloud Build trigger in the project that hosts the Cloud Source Repository and grant the Cloud Build service account permission to deploy the Cloud Run service into the target project.
Because the continuous deployment option on Cloud Run is just a shortcut that configures a Cloud Build trigger and deployment pipeline, you can do the same manually. It's longer and requires more skill and experience with GCP, but it's not impossible!
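A minimal sketch of that cross-project grant, assuming project A hosts the repository and trigger and project B hosts the Cloud Run service (the project IDs and project numbers below are placeholders):

```shell
# Cloud Build in project A runs as PROJECT_A_NUMBER@cloudbuild.gserviceaccount.com.
PROJECT_A_NUMBER=123456789012    # hypothetical project number for project A
PROJECT_B=my-app-project         # hypothetical project ID for project B
PROJECT_B_NUMBER=987654321098    # hypothetical project number for project B
CB_SA="${PROJECT_A_NUMBER}@cloudbuild.gserviceaccount.com"

# Allow project A's Cloud Build service account to manage Cloud Run
# services in project B.
gcloud projects add-iam-policy-binding "$PROJECT_B" \
  --member="serviceAccount:${CB_SA}" \
  --role="roles/run.admin"

# Deploying to Cloud Run also requires permission to act as the service's
# runtime service account (here, project B's default compute account).
gcloud iam service-accounts add-iam-policy-binding \
  "${PROJECT_B_NUMBER}-compute@developer.gserviceaccount.com" \
  --member="serviceAccount:${CB_SA}" \
  --role="roles/iam.serviceAccountUser" \
  --project="$PROJECT_B"
```

With those bindings in place, a trigger in project A can run `gcloud run deploy ... --project=$PROJECT_B` from its build steps.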

Related

Mirror Bitbucket Repositories to Google Cloud Source Repositories

I am trying to build CI/CD using Cloud Build in GCP. As part of that, I am trying to mirror repositories from Bitbucket into CSR, but the mirroring fails. I am able to view the repositories that exist in Bitbucket after authorizing Bitbucket access from GCP.
https://cloud.google.com/build/docs/automating-builds/create-manage-triggers
https://cloud.google.com/source-repositories/docs/mirroring-a-bitbucket-repository
IAM Permissions:
I have Admin access for Source Repositories in GCP, along with the Cloud Build Service Account role.
I have Admin access for the bitbucket repository and the workspace. The workspace in bitbucket is private.
Per the Cloud Source Repositories documentation:
If you are mirroring your Bitbucket repository to Cloud Source Repositories to integrate with Cloud Build and do not need any other Cloud Source Repositories features, follow the Cloud Build instructions on building repositories from Bitbucket Cloud instead.
The referenced guide, Building repositories from Bitbucket Cloud, mentions that you need to create an SSH key in order to authenticate your connection to Bitbucket Cloud.
Bitbucket documentation also confirms that the connection fails if there is no SSH key.
I have since learned that the Bitbucket repositories are in a private workspace and are IP-restricted, so adding the set of Google Cloud public IPs to the allowlist solved this issue.
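For the SSH key step, a minimal sketch (the key file name and comment are placeholders; the public key must then be registered on the Bitbucket side):

```shell
# Generate an SSH key pair for authenticating to Bitbucket.
# -t rsa -b 4096: RSA key, 4096 bits; -N "": empty passphrase, required
# for non-interactive use; -f: output file path (placeholder name).
ssh-keygen -t rsa -b 4096 -N "" -C "cloud-build@example.com" -f bitbucket_key

# The public half (bitbucket_key.pub) is added under the Bitbucket
# repository's access keys; the private half is what the build
# environment uses to authenticate.
cat bitbucket_key.pub
```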

How to fully automate the Google Cloud Build trigger creation

I am trying to fully automate Cloud Build trigger creation via a shell script.
As the source I use GitHub.
So far it's possible to create the trigger:
gcloud beta builds triggers create github \
--repo-name=organisation/repo \
--repo-owner=organisation \
--branch-pattern="^main$" \
--build-config=cloudbuild.yaml
BUT each repo has to be authorized manually beforehand, otherwise you get the error:
ERROR: (gcloud.beta.builds.triggers.create.github) FAILED_PRECONDITION: Repository mapping does not exist. Please visit https://console.cloud.google.com/cloud-build/triggers/connect?project=********* to connect a repository to your project
which links me to the UI to create the authorization manually.
Is there a way to also automate that step?
Currently there is no way to connect external repositories using the API, but there is an ongoing feature request for this to be implemented.
There are two options you can adopt now:
Connect all the repositories at once from the Cloud Console. This way, you will be able to automate the creation of triggers for those repositories.
Use Cloud Source Repositories, which are connected to Cloud Build by default, as indicated here. Check this documentation on how to create a remote repository in CSR from a local git repository.
If you use another hosted Git provider, such as GitHub or Bitbucket, and still need to mirror the repository into Cloud Source Repositories, you must have the cloudbuild.builds.create permission for the Google Cloud project with which you're working. This permission is typically granted through the cloudbuild.builds.editor role.
Here are some links to this information.
Creating and managing build triggers
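The second option above can be sketched with gcloud and git (the repository and project names are placeholders):

```shell
# Create an empty Cloud Source Repository in the project (placeholder names).
gcloud source repos create my-repo --project=my-project

# Add it as a remote of an existing local repository and push the branch.
git remote add google \
  https://source.developers.google.com/p/my-project/r/my-repo
git push google main
```

Because CSR repositories are visible to Cloud Build in the same project by default, triggers on `my-repo` can then be created with `gcloud builds triggers create cloud-source-repositories` without any manual authorization step.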

How to integrate cloudbuild (GCP) with CodeCommit

I have my source code in CodeCommit and my new client is on GCP. They want to connect CodeCommit to Google Cloud Build; is there an option for that?
Given that GCP and AWS are competing cloud providers, I would say you will not find a way to trigger Google Cloud Build from AWS CodeCommit, which is what I believe you mean by "integrating" both products.
What I would do in your scenario is replicate your CodeCommit repository in its GCP equivalent, which is Google Cloud Source Repositories. You can find a tutorial on how to set up build triggers from Cloud Source Repositories in this documentation. Another option is pushing a container ready to be deployed into Container Registry and deploying that instead; you can follow these steps for that.
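Replicating the CodeCommit repository into CSR can be sketched by pushing to a second remote (all names below are placeholders, and both remotes must already be authenticated):

```shell
# Clone the existing CodeCommit repository (placeholder region and name).
git clone https://git-codecommit.us-east-1.amazonaws.com/v1/repos/my-repo
cd my-repo

# Create the CSR counterpart and add it as a second remote (placeholders).
gcloud source repos create my-repo --project=my-project
git remote add google \
  https://source.developers.google.com/p/my-project/r/my-repo

# Push the full history; re-run (or script) this push to keep CSR in sync.
git push google --all
```

The sync push could be run from a scheduled job or a CodeCommit-side hook, so CSR stays current enough for Cloud Build triggers to fire.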

Cloud build trigger doesn't see Cloud Source Repository from another project

I want to create a Cloud Build trigger linked to a Cloud Source Repository in another project.
But when I'm on a step where I am supposed to choose a repository, the list is empty.
I tried some different permissions, but without luck.
Could someone tell me whether such a configuration is possible, and how to do it?
A Cloud Build trigger can only see repositories that are in the same project.
We ran into the same issue with Bitbucket repos that we are mirroring into the Cloud Source Repos in our projects.
What we discovered was that we needed to mirror the repo into BOTH projects so that the Cloud Build trigger could see it. I am not sure how this would work with a repo that lives only in Cloud Source Repositories.
When project A has a trigger that builds a container and places it in a repository owned by project B, you must add an IAM binding on project B that allows image creation by a service account from project A. When you use triggers, a service account is created in project A named PROJECT_A_NUMBER@cloudbuild.gserviceaccount.com. On project B, you must then use IAM to grant this service account permission to create containers, for example by adding the role "Cloud Build Editor".
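A minimal sketch of that binding, with hypothetical project values:

```shell
# Project A's Cloud Build service account (placeholder project number).
PROJECT_A_NUMBER=123456789012
CB_SA="${PROJECT_A_NUMBER}@cloudbuild.gserviceaccount.com"

# Grant it the Cloud Build Editor role on project B (placeholder ID).
gcloud projects add-iam-policy-binding project-b \
  --member="serviceAccount:${CB_SA}" \
  --role="roles/cloudbuild.builds.editor"
```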
This appears to be quite well documented in the following Cloud Build docs:
Configuring access control
Setting service account permissions

What would be the best way to manage cloud credentials as part of an Azure DevOps build pipeline?

We are going to be creating build/deploy pipelines in Azure DevOps to provision infrastructure in Google Cloud Platform (GCP) using Terraform. In order to execute the Terraform provisioning script, we have to provide the GCP credentials so it can connect to our GCP account. I have a credential file (JSON) that can be referenced in the Terraform script. However, being new to build/deploy pipelines, I'm not clear on exactly what to do with the credential file. That is something we don't want to hard-code in the TF script and we don't want to make it generally available to just anybody that has access to the TF scripts. Where exactly would I put the credential file to secure it from prying eyes while making it available to the build pipeline? Would I put it on an actual build server?
I'd probably use secret pipeline variables, or store the credentials in Azure Key Vault and pull them at deployment time. Storing secrets on the build agent is worse, because then you are locked into that particular build agent.
https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/deploy/azure-key-vault?view=azure-devops
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch
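A minimal sketch of consuming such a secret from a pipeline's bash step, assuming the JSON key is stored in a secret variable exposed to the step as GCP_CREDENTIALS (a hypothetical name; in Azure DevOps, secret variables must be mapped into the environment explicitly):

```shell
# Write the key from the secret environment variable to a temp file and
# point the Google provider at it, then run Terraform. The variable name
# GCP_CREDENTIALS is an assumption, not a built-in.
KEY_FILE="$(mktemp)"
printf '%s' "$GCP_CREDENTIALS" > "$KEY_FILE"
export GOOGLE_APPLICATION_CREDENTIALS="$KEY_FILE"

terraform init
terraform apply -auto-approve

# Remove the key file so it does not linger on the agent after the run.
rm -f "$KEY_FILE"
```

This keeps the credential out of source control and off the agent's disk between runs, while any agent with the right pipeline permissions can still execute the deployment.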