Monitoring performance metrics across more than one GCP environment - google-cloud-platform

My requirement is to monitor performance metrics of GCP environments. We have to monitor more than one GCP environment (service account). The problem is how to set all of the service accounts' JSON files in the environment variable "GOOGLE_APPLICATION_CREDENTIALS", since it can only point to one file.
I am creating the MetricServiceClient as below, after setting a JSON file in the environment variable:
MetricServiceClient client = MetricServiceClient.create();
Is there another way to create a MetricServiceClient using credentials directly?

I suggest you use a Stackdriver Workspace and add all the GCP projects you want to monitor to that Workspace. Here is the detailed guide: https://cloud.google.com/monitoring/workspaces/guide.
By using a single Workspace for all GCP projects, you will have all the metrics/logging data in a single place, and you can then use one set of credentials to access the monitoring data of every project.
If a single Workspace is not a feasible option, you can create a single GCP service account and grant it the Stackdriver-related permissions in all the projects. You can then use this service account to interact with Stackdriver metrics.
Note: always follow the principle of least privilege.
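To answer the direct question: a MetricServiceClient can be built with explicit credentials instead of relying on GOOGLE_APPLICATION_CREDENTIALS, so each environment's JSON key can be loaded separately. A minimal sketch using the google-cloud-monitoring Java client (the key file path is a placeholder, and the helper name is my own):

```java
import com.google.api.gax.core.FixedCredentialsProvider;
import com.google.auth.oauth2.GoogleCredentials;
import com.google.cloud.monitoring.v3.MetricServiceClient;
import com.google.cloud.monitoring.v3.MetricServiceSettings;
import java.io.FileInputStream;
import java.io.IOException;
import java.util.Collections;

public class PerEnvironmentMetricClients {
  // Build a MetricServiceClient from an explicit service account key file,
  // so GOOGLE_APPLICATION_CREDENTIALS is not needed and each monitored
  // environment can use its own JSON key.
  static MetricServiceClient clientFor(String keyFilePath) throws IOException {
    GoogleCredentials credentials =
        GoogleCredentials.fromStream(new FileInputStream(keyFilePath))
            .createScoped(
                Collections.singleton("https://www.googleapis.com/auth/cloud-platform"));
    MetricServiceSettings settings =
        MetricServiceSettings.newBuilder()
            .setCredentialsProvider(FixedCredentialsProvider.create(credentials))
            .build();
    return MetricServiceClient.create(settings);
  }
}
```

With a helper like this you can keep one client per environment, e.g. `clientFor("/keys/prod-sa.json")` and `clientFor("/keys/staging-sa.json")`.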
Hope this helps.

Related

How do you deploy Cloud Identity or Organisation Policies in GCP via Terraform?

I'm new to GCP and we use Terraform for our IaC. I've managed to build most of the initial organisation config in Terraform with no problem, with the exception of Cloud Identity and Organisation Policies. I'm using gcloud-provided login credentials. Whenever I try to build for those two services I get this:
Error creating Group: googleapi: Error 403: Your application has authenticated using end user credentials from the Google Cloud SDK or Google Cloud Shell which are not supported by the cloudidentity.googleapis.com. We recommend configuring the billing/quota_project setting in gcloud or using a service account through the auth/impersonate_service_account setting. For more information about service accounts and how to use them in your application, see https://cloud.google.com/docs/authentication/. If you are getting this error with curl or similar tools, you may need to specify 'X-Goog-User-Project' HTTP header for quota and billing purposes. For more information regarding 'X-Goog-User-Project' header, please check https://cloud.google.com/apis/docs/system-parameters.
In this case I am indeed using the Google Cloud SDK, so the error makes sense. However, the two options it presents don't work:
Setting a quota project makes no difference
I can't create a service account at the organisational level (and when I create one within a project it can't configure these organisational level constructs)
So how do I go about Terraforming these services?
Thanks.

Difference between Google managed service account and default service account in GCP

I've been reading the Google Cloud documentation and can't exactly figure out what the difference between these two is. I know that both of them are automatically created in GCP, but I really don't know much more.
You aren't alone, and that's why Google has started a new video series on this topic. To summarize:
Google-managed service accounts are accounts created on Google's side (managed by Google; you can't delete them), but you can grant them roles on your project to allow them to perform actions. They are also named service agents. They are used when you use a serverless product, such as Cloud Build, or Cloud Run (to pull the image, not to run the instance).
The default service accounts (mainly the Compute Engine default service account and the App Engine default service account) are service accounts created automatically in YOUR project (so managed by you; you can delete them if you want) when you activate certain APIs. They are used by default when you create a service instance.

Cloud Dataflow job reading from one Bigquery project and writing to another BigQuery project

I'm implementing a Cloud Dataflow job on GCP that needs to deal with 2 GCP projects.
Both input and output are BigQuery partitioned tables.
The issue I'm running into is that I must read data from project A and write it into project B.
I haven't seen anything related to cross-project service accounts, and I can't give Dataflow two different credential keys either, which is a bit annoying.
I don't know if someone else has gone through this kind of architecture, or how you dealt with it.
I think you can accomplish this with the following steps:
Create a dedicated service account in the project running the Dataflow job.
Grant the service account the Dataflow Worker and BigQuery Job User roles. The service account might need additional roles based on the full resource needs of the Dataflow job.
In Project A, grant the service account the BigQuery Data Viewer role to either the entire project or to specific datasets.
In Project B, grant the service account the BigQuery Data Editor role to either the entire project or to specific datasets.
When you start the Dataflow job, override the serviceAccount pipeline option, supplying the new service account.
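The steps above can be sketched as a Beam pipeline; the cross-project part is just using fully qualified table references and the serviceAccount option (project IDs, dataset/table names, and the service account email are placeholders):

```java
import org.apache.beam.runners.dataflow.DataflowRunner;
import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class CrossProjectBigQuery {
  public static void main(String[] args) {
    DataflowPipelineOptions options =
        PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
    options.setRunner(DataflowRunner.class);
    // Project that runs the Dataflow job (and hosts the dedicated SA).
    options.setProject("job-project");
    // Step 5: override the controller service account.
    options.setServiceAccount("dataflow-sa@job-project.iam.gserviceaccount.com");

    Pipeline p = Pipeline.create(options);
    p.apply("ReadFromProjectA",
            // Read from project A (needs BigQuery Data Viewer there).
            BigQueryIO.readTableRows().from("project-a:dataset_a.input_table"))
     .apply("WriteToProjectB",
            // Write to project B (needs BigQuery Data Editor there).
            BigQueryIO.writeTableRows()
                .to("project-b:dataset_b.output_table")
                .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_NEVER)
                .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));
    p.run();
  }
}
```

Since the table specs name their projects explicitly, the single service account is the only credential the job needs.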
It is very simple: you need to grant the required permissions/access to your service account in both projects.
So you only need one service account which has the required access/permissions in both projects.
Hope it helps.

Can I monitor multiple projects in one place using Google Cloud Platform?

I have multiple projects in GCP and I am trying to read all my projects' logs in one place.
Any suggestions?
Unlike monitoring, the Stackdriver Logging UI does not provide a multi-project view of logs.
It is possible to query the logs from multiple projects using the API. See the resourceNames (or projectIds) field https://cloud.google.com/logging/docs/reference/v2/rest/v2/entries/list
It's also possible to export logs from multiple projects to one place: https://cloud.google.com/logging/docs/export/
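For example, with the v2 Logging API's Java client, the resourceNames field lets one entries.list request span several projects (project IDs and the filter are placeholders):

```java
import com.google.cloud.logging.v2.LoggingClient;
import com.google.logging.v2.ListLogEntriesRequest;
import com.google.logging.v2.LogEntry;

public class MultiProjectLogs {
  public static void main(String[] args) throws Exception {
    try (LoggingClient client = LoggingClient.create()) {
      // One request can list entries from multiple projects at once.
      ListLogEntriesRequest request =
          ListLogEntriesRequest.newBuilder()
              .addResourceNames("projects/project-a")
              .addResourceNames("projects/project-b")
              .setFilter("severity>=ERROR")  // optional filter
              .build();
      for (LogEntry entry : client.listLogEntries(request).iterateAll()) {
        System.out.println(entry.getLogName());
      }
    }
  }
}
```

The credentials used must have log-viewing access in every project listed in resourceNames.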
You can check the documentation about monitoring multiple projects using a single Stackdriver account, which can contain up to 100 GCP projects, for more details.
A project organizes all your Google Cloud Platform resources. A project consists of a set of users; a set of APIs; and billing, authentication, and monitoring settings for those APIs. So, for example, all of your Cloud Storage buckets and objects, along with user permissions for accessing them, reside in a project. You can have one project, or you can create multiple projects and use them to organize your Google Cloud Platform resources, including your Cloud Storage data, into logical groups.
Users can only view and list projects they have access to via IAM roles. The Organization Admin can view and list all projects in the organization.
For logging you have to pass the project id:
projects/[PROJECT_ID]/logs/
Reference: https://cloud.google.com/logging/docs/
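The log name format above is just string concatenation; a small helper (the method name is my own) makes the pattern explicit:

```java
public class LogNames {
  // Build the fully qualified log name: projects/[PROJECT_ID]/logs/[LOG_ID]
  static String logName(String projectId, String logId) {
    return "projects/" + projectId + "/logs/" + logId;
  }
}
```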

Want to create StackDriver dashboard from template

Is it possible to create StackDriver dashboards programmatically from a declarative template (similar to how Google Cloud Deployment allows you to create and manage cloud resources)?
We use a different GCP project for each of our environments (master, staging, prod) and want to have StackDriver dashboards with the same metrics in each project.
I checked the StackDriver Monitoring API docs as well as the Cloud Deployment supported resources and did not find any evidence of support for this use case.
I'm a Product Manager with Stackdriver. There is not currently a publicly available dashboard API. Until that's available, one solution many of our customers use is to create a canonical dashboard template and then copy it each time a new one is needed, modifying the input data.