Write into a different project's Datastore and GBQ using GAE

We have two projects, PROJECT-1 and PROJECT-2, in GCP.
What we are trying to do: GAE (standard, in Python) running in PROJECT-1 will generate some data which should be inserted into DATASTORE & GBQ in PROJECT-2.
I have tried to find documentation for this, but no luck so far. So first, is it possible to write into a different project from the one where GAE is running, and if yes, how? Is there documentation describing this?

It is not possible to use the Cloud Datastore ndb or db libraries to write from GAE standard in Python to a Cloud Datastore database in another project.
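That said, the standalone google-cloud-datastore and google-cloud-bigquery client libraries do accept an explicit project argument, so a sketch of a cross-project write could look like the following, assuming the App Engine service account in PROJECT-1 has been granted the appropriate IAM roles in PROJECT-2 (roughly Cloud Datastore User and BigQuery Data Editor); the project, kind, dataset and table names below are hypothetical:

    from google.cloud import bigquery, datastore

    # Both clients point at PROJECT-2, not the project the code runs in.
    ds_client = datastore.Client(project="project-2")
    bq_client = bigquery.Client(project="project-2")

    # Write an entity into PROJECT-2's Datastore (auto-assigned ID).
    entity = datastore.Entity(key=ds_client.key("GeneratedData"))
    entity["value"] = 42
    ds_client.put(entity)

    # Stream a row into a PROJECT-2 BigQuery table (hypothetical IDs).
    errors = bq_client.insert_rows_json(
        "project-2.my_dataset.my_table", [{"value": 42}]
    )
    assert errors == [], errors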

Related

How to copy a GCP Datastore Entity to another project

I want to copy the data from my User_Log kind from my Test GCP project to my Live project. I have exported the User_Log kind from the Datastore to the Google Cloud Storage bucket for the Test project. But when I go to import it into the Live project using the GCP GUI, I cannot see the Test project buckets, even though I have given Storage Admin access to testProject@appspot.gserviceaccount.com in my Live project and, vice versa, Storage Admin access to LiveProject@appspot.gserviceaccount.com in the Test project.
From what I have read it should be possible to transfer files from one project's bucket to another.
Thanks
TimN
It looks like you can't import/export from one project to another using the GCP Console GUI, but you can if you use gcloud, with the commands in this post: Export GCP Datastore and import to a different GCP Project
You are correct, the Cloud Console UI only allows you to select the buckets that exist in your current project. However, if the overall_export_metadata file is located in another project, you'll have to use other methods like the gcloud tool or the REST API for the import - link
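For completeness, a minimal sketch of that cross-project import using the Datastore Admin API's Python client (the datastore_admin_v1 module of google-cloud-datastore); the project ID and metadata path are hypothetical:

    from google.cloud import datastore_admin_v1

    client = datastore_admin_v1.DatastoreAdminClient()

    # Import into the Live project, reading the export that lives in the
    # Test project's bucket (hypothetical path to overall_export_metadata).
    operation = client.import_entities(
        request={
            "project_id": "live-project",
            "input_url": "gs://test-bucket/2020-01-01/2020-01-01.overall_export_metadata",
        }
    )
    operation.result()  # block until the long-running import finishes

The caller's credentials need the Datastore Import Export Admin role in the Live project plus read access to the Test project's bucket.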

Can we deploy a whole project in Google Cloud using only code?

I have a project in Google Cloud using the following resources:
- BigQuery, Cloud Functions (Python), Cloud Storage, Cloud Scheduler
Is it possible to save the whole project as code and share it, so someone else can just take that code and deploy it in their own tenant?
The reason I am asking: I have published all the code and SQL queries on GitHub, but some users find it very hard to reproduce; they are not necessarily very familiar with Google Cloud. In an ideal situation, they would just get a file and click deploy.
When you create a solution for GCP, you will commonly find that it consists of code, data and configuration. The code and data you can save in a source repository like GitHub ... but what of the configuration? What if your "solution" expects to have BigQuery datasets and tables, GCS buckets or Scheduler jobs defined? This is where you can create "Infrastructure as Code" definitions. Google supports its own IaC technology called Deployment Manager, but you can also use the popular Terraform, as it too has a GCP provider. The definitions for these IaC coordinators are typically text / YAML files that you can also package with your code. Sprinkle in some Make, Chef or Puppet for building apps and pushing code to deployment environments and you have a "build it from source" story. Study also the concepts of CI/CD and you will commonly find that the steps you perform for CI/CD overlap with the steps for trivial deployment.
There are also projects such as terraformer that can, to some extent, reverse-engineer an existing configuration into an IaC description that, when run elsewhere, will recreate the configuration.
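As a flavour of what such a definition looks like, Deployment Manager templates can be written in Python; the following is a minimal, hypothetical sketch that declares a BigQuery dataset as code (all names are made up):

    # bq_dataset.py - a Deployment Manager Python template (hypothetical).
    # Deployment Manager calls GenerateConfig() and creates the resources
    # it returns, so the dataset below becomes part of the deployment.

    def GenerateConfig(context):
        """Declares a single BigQuery dataset as an infrastructure resource."""
        resources = [{
            'name': context.env['name'],      # deployment-scoped resource name
            'type': 'bigquery.v2.dataset',    # Deployment Manager type for BQ
            'properties': {
                'datasetReference': {
                    'datasetId': context.properties['datasetId'],
                },
                'location': context.properties.get('location', 'US'),
            },
        }]
        return {'resources': resources}

A user would reference this template from a small YAML config and run gcloud deployment-manager deployments create, which is about as close to "get a file and click deploy" as it gets.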

How can I snapshot the state of my Google App Engine application and upload it to a separate Google Cloud Storage?

I am setting up a relationship where two Google App Engine applications (A and B) need to share data. B needs to read data from A, but A is not directly accessible to B. Both A and B currently use Google Datastore (NOT persistent disk).
I have an idea where I take a snapshot of A's state and upload it to a separate Google Cloud Storage location. This location can be read by B.
Is it possible to take a snapshot of A using Google App Engine and upload this snapshot (perhaps in JSON) to a separate Google Cloud Storage location to be read from by B? If so, how?
What you're looking for is the Datastore managed export/import service:
This page describes how to export and import Cloud Firestore in Datastore mode entities using the managed export and import service. The managed export and import service is available through the gcloud command-line tool and the Datastore mode Admin API (REST, RPC).
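For example, A could trigger its own export programmatically; a minimal sketch with the Admin API's Python client (the bucket name is hypothetical and must be writable by A and readable by B):

    from google.cloud import datastore_admin_v1

    client = datastore_admin_v1.DatastoreAdminClient()

    # Export all of A's entities to a shared Cloud Storage location.
    operation = client.export_entities(
        request={
            "project_id": "project-a",                # hypothetical project ID
            "output_url_prefix": "gs://shared-export-bucket",
        }
    )
    response = operation.result()  # wait for the managed export to finish
    print(response.output_url)     # location of the overall_export_metadata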
You can see a couple of examples described in a bit more detail in these more or less related posts:
Google AppEngine Getting 403 forbidden trying to update cron.yaml
Transferring data from product datastore to local development environment datastore in Google App Engine (Python)
You may need to take extra precautions:
- if you need data consistency (exports are not atomic)
- to handle potential conflicts in entity key IDs, especially if using manually-generated ones or referencing them in other entities
If A not being directly accessible to B isn't actually intentional and you'd be OK with allowing B to access A, then that's also possible. The datastore can be accessed from anywhere, even from outside Google Cloud (see How do I use Google datastore for my web app which is NOT hosted in google app engine?). It might be a bit tricky to set up, but once that's done it's IMHO a smoother sharing approach than export/import.
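A sketch of that direct approach, assuming B holds a key for a service account from A's project that has the Cloud Datastore User role there (all names are hypothetical):

    from google.cloud import datastore
    from google.oauth2 import service_account

    # Hypothetical key file for a service account defined in A's project.
    credentials = service_account.Credentials.from_service_account_file(
        "project-a-datastore-user.json"
    )

    # B's code instantiates a client pointed at A's project, not its own.
    client = datastore.Client(project="project-a", credentials=credentials)

    # Read one of A's entities directly (hypothetical kind and ID).
    entity = client.get(client.key("SharedRecord", 12345))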

Proper Design of Prod and Non Prod for Big Query and Deployment Management

I am looking for some information related to Deployment... [Not Deployment manager]
After I have designed my BigQuery table schemas, if I want to use the same model in a different project which is considered the production environment, how should I move it?
Should I save the schema from the non-prod project and deploy or create it in the production project? Is this approach correct? And is this model of separate non-production and production projects a good one?
I am not able to find any resources related to this.
I do not really understand what you are trying to do. If you have a look at the BigQuery quickstart, you will see that there is no "deployment" step for BigQuery. Consider BigQuery as a table store.
If you want to have a backup, you can export the data to Cloud Storage in several formats (for example, CSV).
If you want to exactly duplicate your project, follow this official documentation. To do it programmatically, follow this guide, written by a Google Developer.
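If the goal is simply to recreate a table's schema in the production project, a minimal sketch with the google-cloud-bigquery Python client (project, dataset and table names are hypothetical, and the target dataset is assumed to already exist in production):

    from google.cloud import bigquery

    nonprod = bigquery.Client(project="my-project-nonprod")
    prod = bigquery.Client(project="my-project-prod")

    # Read the schema of the non-prod table...
    source = nonprod.get_table("my-project-nonprod.analytics.events")

    # ...and create an empty table with the same schema in production.
    table = bigquery.Table("my-project-prod.analytics.events",
                           schema=source.schema)
    prod.create_table(table)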

WSO2 EMM mysql database setup

I am using WSO2 EMM 1.1.0. The documents talk about using MySQL instead of H2: https://docs.wso2.com/display/EMM110/Setting+up+MySQL. They talk about editing the master-datasources.xml file and updating the WSO2_CARBON_DB, WSO2_EMM_DB and WSO2AM_DB databases, then give steps for priming those DBs. But the master-datasources.xml file also contains the WSO2_IDENTITY_DB, SOCIAL_CACHE, SOCIAL_CASSANDRA_DB and JAGH2 datasources. I expect all of those can be moved to MySQL as well, but I don't see the database scripts to set them up. What is the proper procedure to set up a system that uses MySQL instead of H2? Not to mention that the EMM setup script has the database name hard-coded ("USE WSO2EMM_DB"), thus bypassing whatever is configured in master-datasources.xml.
Thanks,
Brian
This is covered in the documentation [1], under the topic 'How to migrate from H2 to MySQL'.
[1] - https://docs.wso2.com/display/EMM110/Upgrading+from+a+Previous+Release
You need to configure WSO2EMM_DB, WSO2AM_DB, WSO2CARBON_DB and WSO2IDENTITY_DB if you are going ahead with a larger deployment. H2 is set up just to make the out-of-the-box experience better. You can create those DBs, configure master-datasources.xml properly for all of the above DBs, and then run the server with the -Dsetup flag. It will get the configuration done automatically.
If that fails, you can also go to the SERVER_HOME/dbscripts folder and find the scripts for all of the above databases. Run them separately, then start the server in the usual way as mentioned in the documentation.
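For orientation, a hedged sketch of what a single MySQL datasource entry in master-datasources.xml typically looks like in WSO2 Carbon products (URL, credentials and pool settings are placeholders; one such entry is needed per database):

    <datasource>
        <name>WSO2_EMM_DB</name>
        <jndiConfig>
            <name>jdbc/WSO2EMM_DB</name>
        </jndiConfig>
        <definition type="RDBMS">
            <configuration>
                <!-- placeholder host, database, user and password -->
                <url>jdbc:mysql://localhost:3306/WSO2EMM_DB</url>
                <username>emmuser</username>
                <password>changeme</password>
                <driverClassName>com.mysql.jdbc.Driver</driverClassName>
                <maxActive>50</maxActive>
                <testOnBorrow>true</testOnBorrow>
                <validationQuery>SELECT 1</validationQuery>
            </configuration>
        </definition>
    </datasource>

Remember to drop the MySQL JDBC driver JAR into SERVER_HOME/repository/components/lib before starting the server.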