If I'm describing and managing the resources in my GCP projects via Google Deployment Manager (GDM), is it possible to propagate a change to multiple Google Cloud projects?
Specifically, I'm looking at defining a BigQuery schema (a set of datasets and tables) via GDM, and then syncing this schema across numerous projects (each of which will have its own BigQuery datasets). So ideally, if I make a schema update like adding a column to a table, I only need to update a single "deployment" for the schema change to propagate to all the relevant projects.
Google Cloud Deployment Manager operates within a single project. To make changes across multiple projects, you have to make one deployment call per project. Consequently, a schema change deployed in one project will not be reproduced on BigQuery tables found in other projects.
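You can script that per-project fan-out yourself. Here is a minimal sketch using the google-api-python-client, assuming the caller can create deployments in each project; the project IDs and the schema config below are illustrative placeholders, not part of the original question:

```python
# Sketch: fanning one Deployment Manager schema config out to several projects.
# Project IDs and the schema below are illustrative placeholders.
from googleapiclient import discovery

SCHEMA_CONFIG = """
resources:
- name: analytics-dataset
  type: bigquery.v2.dataset
  properties:
    datasetReference:
      datasetId: analytics
- name: events-table
  type: bigquery.v2.table
  properties:
    # $(ref...) makes the table depend on the dataset, so it is created second
    datasetId: $(ref.analytics-dataset.datasetReference.datasetId)
    tableReference:
      tableId: events
    schema:
      fields:
      - {name: event_id, type: STRING}
      - {name: created_at, type: TIMESTAMP}
"""

dm = discovery.build("deploymentmanager", "v2")
for project_id in ["proj-dev", "proj-staging", "proj-prod"]:
    body = {"name": "bq-schema", "target": {"config": {"content": SCHEMA_CONFIG}}}
    # insert() creates the deployment; for later schema changes you would
    # call dm.deployments().update() with the revised config instead.
    dm.deployments().insert(project=project_id, body=body).execute()
```

With a loop like this, a single config file remains the source of truth and the "propagation" is just replaying the same deployment against each project.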
I've recently been added to a new GCP project which has literally tons and tons of pods, workloads and bases.
I want to visualize all of it in a schema or model.
Is there any tool or plugin that I can use to model the project?
Probably the best mechanism would be to use Cloud Console and view the project's resources through the various pages built into the console.
Google provides a great many APIs (services), each of which may contain multiple resource (types) and, as you've seen, there can be many instances of those resources.
I think anything that enumerates all of a project's resources could be somewhat overwhelming, whereas the Console provides structure.
Choose your project in the Console, or append a query string project=... to:
https://console.cloud.google.com
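That said, if you do want a raw enumeration to build your own model from, the Cloud Asset Inventory API can list every resource in a project. A minimal sketch with the google-cloud-asset Python client, assuming the Cloud Asset API is enabled ("my-project" is a placeholder):

```python
# Sketch: enumerating every resource in a project with Cloud Asset Inventory.
# "my-project" is a placeholder project ID.
from google.cloud import asset_v1

client = asset_v1.AssetServiceClient()
for resource in client.search_all_resources(request={"scope": "projects/my-project"}):
    print(resource.asset_type, resource.display_name)
```

The output is flat, though, which is exactly the "overwhelming" problem mentioned above; the Console's grouping by service is usually easier to reason about.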
I have a project in Google Cloud using the following resources:
- BigQuery, Cloud Functions (Python), Cloud Storage, Cloud Scheduler
Is it possible to save the whole project as code and share it, so someone else can just take that code and deploy it in their own tenant?
The reason I am asking: I have published all the code and SQL queries on GitHub, but some users find it very hard to reproduce, as they are not necessarily familiar with Google Cloud. In an ideal situation, they would just get a file and click deploy.
When you create a solution for GCP, you will commonly find that it consists of code, data, and configuration. The code and data you can save in a source repository like GitHub ... but what of the configuration? What if your "solution" expects to have BigQuery datasets and tables, GCS buckets, or Scheduler jobs defined? This is where you can create "Infrastructure as Code" (IaC) definitions. Google supports its own IaC technology called Deployment Manager, but you can also use the popular Terraform, as it too has a GCP provider. The definitions for these IaC coordinators are typically text / YAML files that you can package alongside your code.
Sprinkle in some Make, Chef, or Puppet for building apps and pushing code to deployment environments and you have a "build it from source" story. Study also the concepts of CI/CD, and you will commonly find that the steps you perform for CI/CD overlap with the steps needed for trivial deployment.
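To give a flavor of what such a definition looks like, here is a minimal sketch of a Deployment Manager template written in Python (Deployment Manager also accepts Jinja/YAML); the resource names and properties are illustrative placeholders:

```python
# bq_and_bucket.py - a minimal Deployment Manager template written in Python.
# Deployment Manager imports this file and calls generate_config(); the
# resource names and properties below are illustrative placeholders.

def generate_config(context):
    project = context.env["project"]  # the project being deployed into
    return {
        "resources": [
            {
                "name": "analytics-dataset",
                "type": "bigquery.v2.dataset",
                "properties": {"datasetReference": {"datasetId": "analytics"}},
            },
            {
                "name": "data-bucket",
                "type": "storage.v1.bucket",
                "properties": {
                    # bucket names are global, so namespace with the project ID
                    "name": project + "-data-bucket",
                    "location": "US",
                },
            },
        ]
    }
```

A template like this lives alongside your application code in the repository, which is what makes the "get a file and click deploy" experience possible for your users.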
There are also projects such as Terraformer that can do some kind of a job of reverse-engineering an existing configuration into an IaC description that, when run elsewhere, will recreate the configuration.
Is there a way to duplicate an entire project?
The project contains:
2x Cloud SQL: main + backup
1x Cloud Storage
4x Google Compute Engine
We already have an identical project built and configured, so it would be much easier for us if we could just make a copy of it.
The projects are not under the same account.
There is no built-in way to replicate a project as-is.
However, you can use Terraformer on your current project: this CLI tool will generate Terraform template files from the existing infrastructure (reverse Terraform). Then, you can use these files to re-create the target resources within a second GCP project in a programmatic fashion (see https://cloud.google.com/community/tutorials/getting-started-on-gcp-with-terraform).
Disclaimer: Comments and opinions are my own and not the views of my employer.
I am setting up a relationship where two Google App Engine applications (A and B) need to share data. B needs to read data from A, but A is not directly accessible to B. Both A and B currently use Google Datastore (NOT persistent disk).
I have an idea where I take a snapshot of A's state and upload it to a separate Google Cloud Storage location. This location can be read by B.
Is it possible to take a snapshot of A using Google App Engine and upload this snapshot (perhaps in JSON) to a separate Google Cloud Storage location to be read from by B? If so, how?
What you're looking for is the Datastore managed export/import service:
This page describes how to export and import Cloud Firestore in Datastore mode entities using the managed export and import service. The managed export and import service is available through the gcloud command-line tool and the Datastore mode Admin API (REST, RPC).
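For illustration, a minimal sketch of triggering such an export from A via the Datastore Admin API with google-api-python-client; the project ID and bucket path are placeholders:

```python
# Sketch: exporting app A's Datastore entities to a bucket that B can read.
# The project ID and bucket path are placeholders.
from googleapiclient import discovery

datastore = discovery.build("datastore", "v1")
operation = datastore.projects().export(
    projectId="project-a",
    body={
        # B's service account needs read access on this bucket
        "outputUrlPrefix": "gs://shared-export-bucket/datastore-export",
    },
).execute()
print(operation["name"])  # a long-running operation you can poll for completion
```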
You can see a couple of examples described in a bit more detail in these more or less related posts:
Google AppEngine Getting 403 forbidden trying to update cron.yaml
Transferring data from product datastore to local development environment datastore in Google App Engine (Python)
You may need to take extra precautions:
- if you need data consistency (exports are not atomic)
- to handle potential conflicts in entity key IDs, especially if using manually-generated ones or referencing them in other entities
If A not being directly accessible to B isn't actually intentional and you'd be OK with allowing B to access A, then that's also possible. The datastore can be accessed from anywhere, even from outside Google Cloud (see How do I use Google datastore for my web app which is NOT hosted in google app engine?). It might be a bit tricky to set up, but once that's done it's IMHO a smoother sharing approach than the export/import one.
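A sketch of that direct-access approach, assuming B's service account has been granted Datastore access on project A; the key file path, project ID, and entity kind are placeholders:

```python
# Sketch: app B reading A's datastore directly with a service account that
# has Datastore access on project A. Paths, IDs, and kinds are placeholders.
from google.cloud import datastore
from google.oauth2 import service_account

creds = service_account.Credentials.from_service_account_file("b-reader-key.json")
client = datastore.Client(project="project-a", credentials=creds)

for entity in client.query(kind="Snapshot").fetch(limit=10):
    print(entity.key, dict(entity))
```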
I am looking for some information related to Deployment... [Not Deployment manager]
After I have designed my BigQuery table schemas, if I want to use the same model in a different project which is considered the production environment, how should I move it?
Should I save the schemas from the non-production project and deploy or create them in the production project? Is this approach correct? And is this model of separate non-production and production projects a good versioning practice?
I am not able to find any resource related to this.
I do not really understand what you are trying to do. If you have a look at the BigQuery quickstart, you will see that there is no "deployment" step for BigQuery. Consider BigQuery to be a store of tables.
If you want to have a backup, you can export the data to Cloud Storage in several formats (for example, CSV).
If you want to exactly duplicate your project, follow this official documentation. To do it programmatically, follow this guide, written by a Google Developer.
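If what you actually want is to recreate the same table schemas in the production project, here is a minimal sketch with the google-cloud-bigquery client library; all project, dataset, and table names are placeholders:

```python
# Sketch: recreating a table's schema in a production project.
# Project, dataset, and table names are placeholders.
from google.cloud import bigquery

src = bigquery.Client(project="my-dev-project")
dst = bigquery.Client(project="my-prod-project")

# read the schema from the non-prod table...
schema = src.get_table("my-dev-project.analytics.events").schema

# ...and create an empty table with the same schema in production
dst.create_dataset("analytics", exists_ok=True)
dst.create_table(bigquery.Table("my-prod-project.analytics.events", schema=schema))
```

Keeping a script like this (or the schema JSON it reads) in version control gives you the non-prod/prod versioning story you were asking about.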