In GCP, how can I trigger the automatic deployment of a Cloud Function into a Production project from source files in a repo of a DEV project - google-cloud-platform

I want to automate the deployment of a Cloud Function into a Production project through Cloud Build, where the source files live in a Cloud Source Repository of a DEV project. How can I ensure that the moment I push code to the production branch of the DEV project's Cloud Source Repository, the Cloud Function gets created in the Production project?

If I understand correctly, you are trying to trigger a build from a repository stored in another project.
This is not possible: build triggers must be in the same project as the repository they watch.

I think my answer here will help: How to pass API parameters to GCP cloud build triggers
Basically, as Claudio recommended, use the examples there to build your steps. I believe what you want is a trigger on the production branch of the DEV project's repository; when it fires, add a step that either deploys the Cloud Function or uses the REST API to start another build by its ID. See the example in the linked answer.
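As an illustration only, a cross-project deploy step might look like the sketch below. The function name, production project ID, region and runtime are placeholders, and it assumes the DEV project's Cloud Build service account has been granted permission to deploy functions in the production project.

```yaml
# cloudbuild.yaml in the DEV project's repository (production branch)
steps:
  # Deploy the function from this repo's source into the production project.
  # Assumes the DEV project's Cloud Build service account has the
  # Cloud Functions Developer role (and Service Account User) on prod-project-id.
  - name: 'gcr.io/cloud-builders/gcloud'
    args:
      - functions
      - deploy
      - my-function            # placeholder function name
      - --project=prod-project-id
      - --region=us-central1
      - --runtime=python39
      - --trigger-http
      - --source=.
```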

Related

How to pass branch/commit to a Cloud Run instance?

I have several projects that run on Google Cloud Run. Cloud Build connects each instance to a corresponding branch of a Git repository. Each time a commit is pushed to a branch, a build is triggered to update the Cloud Run instance.
I'd like to be able to show information about the build within the Cloud Run application (e.g. branch and commit that the build has been built from). How can I pass this information from the repo/commit/build to the instance?
As @guillaume blaquiere stated in his comment:
You have the information in Cloud Build when it runs. You can get that data and put it in your container somewhere; then you have to serve it. It depends on your implementation.
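One common way to do this (sketched below as an assumption, not taken from the answer above) is to use Cloud Build's default substitutions $BRANCH_NAME and $COMMIT_SHA and hand them to the Cloud Run service as environment variables; the service name, image path and region are placeholders.

```yaml
# cloudbuild.yaml sketch: tag the image with the commit and expose
# branch/commit to the running service as environment variables.
steps:
  - name: 'gcr.io/cloud-builders/docker'
    args: ['build', '-t', 'gcr.io/$PROJECT_ID/my-service:$COMMIT_SHA', '.']
  - name: 'gcr.io/cloud-builders/docker'
    args: ['push', 'gcr.io/$PROJECT_ID/my-service:$COMMIT_SHA']
  - name: 'gcr.io/cloud-builders/gcloud'
    args:
      - run
      - deploy
      - my-service
      - --image=gcr.io/$PROJECT_ID/my-service:$COMMIT_SHA
      - --region=us-central1
      - --set-env-vars=GIT_BRANCH=$BRANCH_NAME,GIT_COMMIT=$COMMIT_SHA
images:
  - 'gcr.io/$PROJECT_ID/my-service:$COMMIT_SHA'
```

The application can then read GIT_BRANCH and GIT_COMMIT from its environment and display them.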

How to do multi-project, multi-environment deployment with Google Deployment Manager and Google Cloud Build

Currently we have a dev environment in a GCP project. We're using GDM templates and other resources, along with a repo in Bitbucket. Whenever we push any changes to Bitbucket, it builds and deploys to this dev environment. Now we've decided to add a new GCP project as a test environment, and we want to deploy to it automatically, like the dev environment. Our preference is to deploy to this environment from the Cloud Build execution in the dev environment. Can you suggest a guideline that will help us set things up in one place so that it automatically deploys to multiple projects as multiple environments?
You can use Terraform to achieve this.
There's a lot of information on how to start here.
However, I would suggest having projects in separate deployments. This way you limit the blast radius and protect production from errors occurring in other environments.
You need separate calls for separate projects. Like almost all Google API resources, deploymentmanager/deployments lives inside a project (https://www.googleapis.com/deploymentmanager/v2/projects/[PROJECT]/global/deployments), so you cannot deploy to multiple projects in one call.
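As a sketch of what those separate calls could look like when driven from one Cloud Build run in the dev project, something like the following might work; the project IDs, deployment name and config file are placeholders, and the build's service account would need Deployment Manager permissions on each target project.

```yaml
# cloudbuild.yaml sketch: one Deployment Manager call per target project.
steps:
  - name: 'gcr.io/cloud-builders/gcloud'
    args:
      - deployment-manager
      - deployments
      - update
      - my-deployment
      - --config=config.yaml
      - --project=dev-project-id    # placeholder dev project
  - name: 'gcr.io/cloud-builders/gcloud'
    args:
      - deployment-manager
      - deployments
      - update
      - my-deployment
      - --config=config.yaml
      - --project=test-project-id   # placeholder test project
```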

How do I put AWS Amplify project into CodeCommit?

I am just starting to use AWS Amplify but can't figure out how you are supposed to commit the project to a source code repository so that others can work on the same project.
I created a React serverless project 'web_app', have created a few APIs and a simple front-end application, and now want to commit this to CodeCommit so it can be accessed by others.
Things get a bit confusing now because for CI/CD it seems one should create a repository for the front-end application - usually the source files are in the 'web_app/src' folder.
But Amplify seems to have already created a git repository at the 'web_app' folder level so am I supposed to create a CodeCommit repository and push the 'web_app' local repo to the remote repository and then separately create another repository for the front end in order to be able to use the CI/CD functions in AWS?
For some reason, if I do try to push anything to AWS CodeCommit I always get a 403 error.
OK - I'll answer this myself.
You just commit the entire project to a repo in CodeCommit. The project folder contains both the backend and the frontend code. The frontend code is usually in the /src folder and the backend code (CloudFormation files) is usually in the amplify folder.
Once you have the CodeCommit repo setup you can use the Amplify Console or the amplify-cli to create a new backend or frontend environment. Amplify is smart enough to know where to find the backend and frontend code.
Bear in mind that the backend amplify-cli code creates a bunch of files that are placed in the frontend folder (/src), including the graphql mutations and queries that will be used in the frontend code.
If you have set up CI/CD then any 'git push' will result in a new build for the environment you are in. You can modify the build script to include or exclude rebuilding the backend - I think by default it will rebuild the backend if there are changes.
You can also manually rebuild the backend by using the amplify-cli 'amplify push' command.
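For what it's worth, the build behaviour mentioned above is controlled by an amplify.yml at the repository root when building through the Amplify Console; below is a minimal sketch with typical defaults (adjust the commands and artifact directory to your project). Dropping the backend section is one way to skip rebuilding the backend on push.

```yaml
# amplify.yml sketch (Amplify Console build spec at the repository root)
version: 1
backend:
  phases:
    build:
      commands:
        - amplifyPush --simple   # rebuilds the backend; remove this section to skip it
frontend:
  phases:
    preBuild:
      commands:
        - npm ci
    build:
      commands:
        - npm run build
  artifacts:
    baseDirectory: build         # typical create-react-app output folder
    files:
      - '**/*'
  cache:
    paths:
      - node_modules/**/*
```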
Take care because things can get out of sync, and it seems old files can be left lying around that cause problems. Fortunately it doesn't take long to delete and rebuild an entire environment. Of course you may have to back up and reload your data first. Having some scripts to automatically load any seed data for development or testing is useful.
There is a lot of documentation out there but a lot of it seems to be quite confusing.

Is it possible to call Google Cloud Build via API with cloudbuild.yaml in source repo

I have a Cloud Build trigger that is set up to fetch a specific branch from a remote repository and use the cloudbuild.yaml file from it to run the build. I need to do the same programmatically via an API call, and the only way I can see this happening is with an API call here: https://cloud.google.com/cloud-build/docs/api/reference/rest/v1/projects.builds/create.
Looking at the required request body, which is of type Build https://cloud.google.com/cloud-build/docs/api/reference/rest/Shared.Types/Build, it seems that while I can provide the source repository (branch, commit or tag), I'm not sure how to provide the path to cloudbuild.yaml, which in this case really needs to sit with the repo source code. Also, it seems a steps param is required, which actually has to be the steps from the cloudbuild file.
Is there any way that I can create a build using the same resources that I would otherwise set up in a Cloud Build trigger (a path to cloudbuild.yaml instead of the steps, which I don't have access to in this case)?
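For reference, a minimal projects.builds.create request body of the kind the question describes might look like the sketch below (shown as YAML; the API takes the equivalent JSON, and the project, repo, branch and the single step are placeholders). As the question notes, the Build resource takes the steps inline, so they end up duplicating whatever cloudbuild.yaml contains.

```yaml
# Sketch of a Build resource for POST /v1/projects/{projectId}/builds
source:
  repoSource:
    projectId: my-project-id    # placeholder
    repoName: my-repo           # placeholder Cloud Source Repository
    branchName: master          # placeholder branch
steps:
  # Inline steps mirroring what the repo's cloudbuild.yaml would contain
  - name: 'gcr.io/cloud-builders/gcloud'
    args: ['functions', 'deploy', 'my-function', '--runtime=python39', '--trigger-http', '--source=.']
```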

Google Cloud Function - What is the default update behaviour for functions that rely on code in a Repo?

I have a Google Cloud Function that runs from source code stored in a Google Cloud Source Repository. If I update the source code in the repo, do I have to manually update the cloud function or is this done automatically?
There is no automatic deployment. You will have to run whatever command line you would normally run to deploy the code to Cloud Functions.
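As a concrete (assumed) example, the redeploy is just the usual gcloud functions deploy; if you want it to run on every push, it can be wrapped in a Cloud Build trigger step like this sketch, where the function name, runtime and trigger type are placeholders.

```yaml
# cloudbuild.yaml sketch: redeploy the function on each push via a trigger
steps:
  - name: 'gcr.io/cloud-builders/gcloud'
    args:
      - functions
      - deploy
      - my-function        # placeholder function name
      - --runtime=python39
      - --trigger-http
      - --source=.
```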