I want to copy the data from my User_Log kind in my Test GCP project to my Live project. I have exported the User_Log kind from Datastore to the Google Cloud Storage bucket for the Test project. But when I go to import it into the Live project using the GCP Console GUI, I can't see the Test project's buckets, even though I have granted Storage Admin access to testProject@appspot.gserviceaccount.com in my Live project and, vice versa, Storage Admin access to LiveProject@appspot.gserviceaccount.com in the Test project.
From what I have read it should be possible to transfer files from one project's bucket to another.
Thanks
TimN
It looks like you can't import/export from one project to another using the GCP Console GUI, but you can with gcloud, using the commands in the post: Export GCP Datastore and import to a different GCP Project
You are correct; the Cloud Console UI only allows you to select buckets that exist in your current project. However, if the overall_export_metadata file is located in another project, you'll have to use other methods, like the gcloud tool or the REST API, for the import - link
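For concreteness, a minimal sketch of the gcloud route, with placeholder bucket and project IDs (the export path follows the <YYYYMMDD-######> form that Datastore exports produce):

# Export from the Test project (already done in this case):
$ gcloud datastore export gs://<test-bucket> --project=<test-project-id>
# Import into the Live project, pointing at the metadata file in the Test
# project's bucket; the Live project's service account needs read access there:
$ gcloud datastore import gs://<test-bucket>/<YYYYMMDD-######>/<YYYYMMDD-######>.overall_export_metadata --project=<live-project-id>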
I'm using the project-per-environment method to manage my staging and production environments on GCP, but I'm not sure how I can make sure the two environments have the same configuration.
For example, can I export the IAM config from one project and import it into another project? Or is there a way that I can make sure that the configuration of the two projects is close enough?
Thanks.
You can use “get-iam-policy” and “set-iam-policy” in your projects to perfectly duplicate the policies from one project onto another (the command is singular, but it copies all parts of the policy; you do not need to iterate through the roles or anything of the sort).
Here are the links where you can find more information on the gcloud commands mentioned:
https://cloud.google.com/sdk/gcloud/reference/projects/get-iam-policy
https://cloud.google.com/sdk/gcloud/reference/projects/set-iam-policy
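As a minimal sketch, assuming hypothetical project IDs source-project and target-project:

# Dump the full IAM policy of the source project to a file:
$ gcloud projects get-iam-policy source-project --format=json > policy.json
# Apply it to the target project (you may need to remove the etag field
# from policy.json first, since it belongs to the source project's policy):
$ gcloud projects set-iam-policy target-project policy.json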
I have a project in Google Cloud using the following resources:
- BigQuery, Cloud Functions (Python), Cloud Storage, Cloud Scheduler
Is it possible to save the whole project as code and share it, so someone else can just take that code and deploy it in their own tenant?
The reason I am asking: I have published all the code and SQL queries on GitHub, but some users find it very hard to reproduce; they are not necessarily very familiar with Google Cloud. In an ideal situation, they would just need to get a file and click deploy.
When you create a solution for GCP, you will commonly find that it consists of code, data, and configuration. The code and data you can save in a source repository like GitHub ... but what of the configuration? What if your "solution" expects to have BQ datasets and tables, or GCS buckets, or Scheduler jobs defined?
This is where you can create "Infrastructure as Code" definitions. Google supports its own IaC technology called Deployment Manager, but you can also use the popular Terraform, as it too has a GCP provider. The definitions for these IaC coordinators are typically text/YAML files that you can also package with your code. Sprinkle in some Make, Chef, or Puppet for building apps and pushing code to deployment environments and you have a "build it from source" story. Study also the concepts of CI/CD, and you will commonly find that the steps you perform for building CI/CD overlap with the steps for trivial deployment.
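As a small illustration, a hedged sketch of deploying a Deployment Manager configuration, assuming a hypothetical config.yaml that declares the BQ datasets, GCS buckets, and Scheduler jobs the solution expects:

# Create all declared resources in the caller's own project in one step:
$ gcloud deployment-manager deployments create my-solution --config=config.yaml

A user can later remove everything again with the matching deployments delete command, which is part of what makes the "get a file and click deploy" experience feasible.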
There are also projects such as Terraformer that can do some kind of a job of reverse engineering an existing configuration to create an IaC description that, when run elsewhere, will recreate the configuration.
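For instance (a hedged sketch; the project ID is a placeholder, and the exact set of supported resource names depends on the Terraformer version):

# Generate Terraform files from the existing infrastructure:
$ terraformer import google --resources=gcs,bigQuery,cloudFunctions --projects=<project-id> --regions=us-central1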
Is there a way to duplicate an entire project?
The project contains:
2x Cloud SQL: main + backup
1x Cloud Storage
4x Google Compute Engine
We have exactly the same project already built up and configured, so it would be much easier for us if we could just make a copy of it.
The projects are not under the same account.
There is no way to replicate a project as-is.
However, you can use Terraformer, starting from your current project: this CLI tool will generate Terraform template files from the existing infrastructure (reverse Terraform). Then, you can use these files to re-create the target resources within a second GCP project in a programmatic fashion (see https://cloud.google.com/community/tutorials/getting-started-on-gcp-with-terraform).
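A rough sketch of the workflow, assuming placeholder project IDs and region, and assuming you edit the generated templates so the google provider points at the new project before applying (resource names depend on the Terraformer version):

# Reverse-engineer the source project into Terraform files:
$ terraformer import google --resources=cloudsql,gcs,instances --projects=<source-project-id> --regions=<region>
# After pointing the generated provider config at the target project:
$ terraform init
$ terraform plan
$ terraform apply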
Disclaimer: Comments and opinions are my own and not the views of my employer.
I have a daily export that backs up my Datastore to a Cloud Storage bucket on Google Cloud Platform. I followed the directions as described here: https://cloud.google.com/datastore/docs/schedule-export
I do not specify by kind or namespace. As such, the storage bucket contains a folder structure looking like
Buckets/<bucket-name>/<YYYYMMDD-######>/default_namespace/all_kinds/
In which are the output-### files.
I want to import from this backup, but I only want one Kind of entity. According to this reference: https://cloud.google.com/sdk/gcloud/reference/datastore/import
I should be able to do that with this command: $ gcloud datastore import --kinds='Customer','Order'
However, when I run the following with the variables filled in:
$ gcloud datastore import --kinds='<kind>' gs://<bucket-name>/YYYYMMDD-######/YYYYMMDD-######.overall_export_metadata
ERROR: (gcloud.datastore.import) INVALID_ARGUMENT: The requested kinds/namespaces are not available
I get the above error. I have tried with multiple Kinds which I am sure are part of the Datastore.
Am I able to import by specific Kind if I exported without heed to Kind?
Thanks.
Hello, I'm working at Google Cloud. I've been able to reproduce your case, and it seems that it is an issue in Google Cloud Datastore. I created an entry for you in the issue tracker; you can stay tuned here. Thank you for reporting.
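For context, if I read the current export/import documentation correctly, specific kinds can only be imported from an export that was itself created with an entity filter, so one possible workaround (a hedged sketch, not a confirmed fix for the issue above) is to export the kind you care about on its own and import that whole:

$ gcloud datastore export gs://<bucket-name> --kinds='<kind>'
$ gcloud datastore import gs://<bucket-name>/<YYYYMMDD-######>/<YYYYMMDD-######>.overall_export_metadata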
We have two projects, PROJECT-1 and PROJECT-2, in GCP.
What we are trying to do: GAE (standard, in Python) running in PROJECT-1 will generate some data which should be inserted into Datastore and BigQuery in PROJECT-2.
I have tried to find documentation for this, but no luck so far. So first: is it possible to write into a different project from the one where GAE is running, and if yes, how? Is there documentation describing this?
It is not possible to use the Cloud Datastore ndb or db libraries to write from GAE standard in Python to a Cloud Datastore database in another project.
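As a hedged sketch of one alternative (outside the ndb/db libraries): the Cloud Datastore REST API takes an explicit project ID in the URL, so a caller whose credentials have been granted Datastore access on PROJECT-2 can commit an entity there directly. The kind and property names below are hypothetical:

# Insert one entity into PROJECT-2's Datastore via the v1 REST API:
$ curl -X POST \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json" \
    "https://datastore.googleapis.com/v1/projects/PROJECT-2:commit" \
    -d '{
      "mode": "NON_TRANSACTIONAL",
      "mutations": [{
        "insert": {
          "key": {
            "partitionId": {"projectId": "PROJECT-2"},
            "path": [{"kind": "LogEntry"}]
          },
          "properties": {"message": {"stringValue": "hello from PROJECT-1"}}
        }
      }]
    }'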