I have a deployment manifest which includes, among other properties, a zone property. I want the user to be able to select the zone from a dropdown. I know this is possible in AWS CloudFormation. Is there an equivalent method for GCP Deployment Manager? There are a number of properties for which such a selection would be useful.
When I launch a deployment from the Marketplace, these dropdowns are populated, so it is definitely possible.
It seems that the additional functionality provided by Deployment Manager AutoGen to generate a package that can be deployed on the GCP Marketplace (with additional APIs) is only available to Google Technology Partners, and there is an approval process for that.
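For reference, the Marketplace dropdowns are driven by annotations in the solution's Deployment Manager schema file. Below is a sketch of what such a schema looks like; the `x-googleProperty` extension and its `GCE_ZONE` / `GCE_MACHINE_TYPE` types are taken from Marketplace partner tooling, and the property names here are illustrative, not from your manifest:

```yaml
# Sketch of a Marketplace-style Deployment Manager schema fragment.
# x-googleProperty tells the Marketplace UI to render a picker
# (e.g. a zone dropdown) instead of a free-text field.
properties:
  zone:
    type: string
    default: us-central1-a
    x-googleProperty:
      type: GCE_ZONE
  machineType:
    type: string
    x-googleProperty:
      type: GCE_MACHINE_TYPE
      zoneProperty: zone   # machine-type list is filtered by the chosen zone
```

Outside the Marketplace, a plain `gcloud deployment-manager` deployment has no UI layer, so there is nothing to render a dropdown; the schema only constrains and documents the inputs.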
Problem: Google Cloud Run only provides a subset of documented metadata
I have a simple JVM based application running on Google Cloud Run that queries http://metadata.google.internal for available metadata.
The only metadata available is at the following paths:
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts
http://metadata.google.internal/computeMetadata/v1/instance/zone
http://metadata.google.internal/computeMetadata/v1/project/project-id
http://metadata.google.internal/computeMetadata/v1/project/numeric-project-id
As per the documentation, I was expecting more than this and hoping that I would be able to query the metadata server for the name of the Cloud Run service and the metadata required to configure Stackdriver Monitoring for a generic_node.
One clue I have found: the Server header in the response from the metadata server has the value Metadata Server for Serverless.
Theory: Cloud Run is in beta and the Metadata Server for Serverless is separate from the typical metadata server and is a work in progress.
Question(s):
Is this theory valid?
Is this limitation documented somewhere?
Is there a roadmap for adding additional metadata?
Is there an alternative for determining the metadata needed to configure Stackdriver?
The Compute Metadata service you linked is only available to Compute Engine products (such as GCE and GKE). Many of its endpoints concern VM details, VM metadata/tags, VM startup scripts, etc.
These concepts don't apply to serverless compute environments. Therefore I don't think a feature request here will succeed.
Serverless products such as App Engine, Cloud Functions and Cloud Run support a minimal version of the metadata service to provide basic functionality to SDKs (such as Google Cloud client libraries, Stackdriver or OpenTelemetry/OpenCensus clients, or gcloud CLI). Using these endpoints, Google’s own client libraries can automatically get auth tokens, discover project IDs etc.
Also, these serverless products don't run on GCE, and don't have the same concepts. That's why a full metadata service isn't available for these products.
The endpoints applicable to serverless environments are the ones you listed in your question.
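As a concrete sketch, querying one of those available endpoints looks like the following. The only required detail is the `Metadata-Flavor: Google` header; without it the metadata server rejects the request. Note the fetch itself can only succeed inside a GCP environment, where `metadata.google.internal` resolves:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

class MetadataClient {
    static final String BASE = "http://metadata.google.internal/computeMetadata/v1/";

    // Builds the full URL for a metadata path such as "project/project-id".
    static String metadataUrl(String path) {
        return BASE + path;
    }

    // Fetches a single metadata value; only succeeds inside a GCP
    // environment, where metadata.google.internal resolves.
    static String fetch(String path) throws Exception {
        HttpURLConnection conn =
                (HttpURLConnection) new URL(metadataUrl(path)).openConnection();
        conn.setRequestProperty("Metadata-Flavor", "Google"); // required header
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream()))) {
            return in.readLine();
        }
    }
}
```

For example, `MetadataClient.fetch("project/project-id")` returns the project ID when run on Cloud Run, and paths outside the serverless subset return 404.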
I don't think you will find much information to validate your theory, as that concerns the product's architecture, which I don't think Google will share for the moment. However, the theory does seem valid based on the evidence you found.
What you can do is open a feature request with Google so that they consider adding more information to the metadata server to cover your needs. As the product is in beta, they should be open to making changes.
Hope you find this useful.
My requirement is to monitor performance metrics of GCP environments. We have to monitor more than one GCP environment (service account). The problem is how to set all of the service accounts' JSON key files in the single environment variable GOOGLE_APPLICATION_CREDENTIALS.
I am creating the MetricServiceClient as below, after setting the JSON file in the environment variable:
MetricServiceClient client = MetricServiceClient.create();
Is there another way to create a MetricServiceClient using credentials directly?
I suggest you use a Stackdriver Workspace and add all the GCP projects you want to monitor to that workspace. Here is the detailed guide: https://cloud.google.com/monitoring/workspaces/guide.
By using a single Workspace for all GCP projects, you will have all the metrics/logging data in a single place, and you can use one set of credentials to access the monitoring data of all the projects.
If a single workspace is not a feasible option, you can create a single GCP service account and grant it the Stackdriver-related permissions in all of the projects. Then you can use this service account to interact with Stackdriver metrics.
Note: Always try to use the principle of least privilege
Hope this helps.
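On the direct question of creating a MetricServiceClient without GOOGLE_APPLICATION_CREDENTIALS: yes, the google-cloud-monitoring Java client accepts credentials programmatically. A sketch, assuming one JSON key file per monitored environment (the file path and class name are illustrative):

```java
import com.google.api.gax.core.FixedCredentialsProvider;
import com.google.auth.oauth2.GoogleCredentials;
import com.google.cloud.monitoring.v3.MetricServiceClient;
import com.google.cloud.monitoring.v3.MetricServiceSettings;
import java.io.FileInputStream;

class PerEnvironmentMetricClients {
    // Creates a client bound to one service account key file, so several
    // clients (one per monitored environment) can coexist in one JVM
    // without touching GOOGLE_APPLICATION_CREDENTIALS.
    static MetricServiceClient forKeyFile(String keyPath) throws Exception {
        GoogleCredentials credentials =
                GoogleCredentials.fromStream(new FileInputStream(keyPath));
        MetricServiceSettings settings = MetricServiceSettings.newBuilder()
                .setCredentialsProvider(FixedCredentialsProvider.create(credentials))
                .build();
        return MetricServiceClient.create(settings);
    }
}
```

Each call such as `forKeyFile("/secrets/env-a.json")` yields an independent client; remember to close each client when done.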
I have a host project with 2 VPCs, both of which are shared with a service project that has no VPCs of its own. In the console everything works great, but I want to automate this. I am not able to list the VPCs from the service project. I am trying to use
https://www.googleapis.com/compute/v1/projects/{project}/aggregated/subnetworks/listUsable
From the documentation:
Retrieves an aggregated list of all usable subnetworks in the project. The list contains all of the subnetworks in the project and the subnetworks that were shared by a Shared VPC host project.
but I am getting an empty result set. What am I missing?
You have to be relatively careful with the permissions and what user you authenticate as. You will only be able to see subnetworks where the calling user has the appropriate compute.subnetworks.* permissions.
If you're looking at the Cloud Console, you will be acting with your Google Account, which most likely has the owner role or at least roles/compute.networkUser.
Depending on how you authenticate your API calls, you are most likely using a service account. Ensure that this service account has the required roles as well.
For further debugging, you can also try using the gcloud CLI tool. It has a handy option: --log-http that will show you all HTTP calls done. This is often a great help when piecing together functionality in external code.
I have looked at how the GCP console does it:
1. It queries to see if there is a host project.
2. If there is a host project, it sends a query to the host project to list the subnets.
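Following the console's behavior above, the listUsable call should target the Shared VPC host project rather than the service project. A small sketch of building that request URL (the project ID is a placeholder; authentication is handled separately):

```java
class ListUsableRequest {
    // Per the console's behavior, listUsable is sent to the Shared VPC
    // *host* project to return the shared subnetworks.
    static String urlFor(String hostProject) {
        return "https://www.googleapis.com/compute/v1/projects/"
                + hostProject + "/aggregated/subnetworks/listUsable";
    }
}
```

The calling identity still needs the relevant compute.subnetworks permissions on the host project, as noted in the answer above.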
I have multiple projects in GCP and I am trying to read all my projects' logs in one place.
Any suggestions?
Unlike monitoring, the Stackdriver Logging UI does not provide a multi-project view of logs.
It is possible to query the logs from multiple projects using the API. See the resourceNames (or projectIds) field https://cloud.google.com/logging/docs/reference/v2/rest/v2/entries/list
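For example, a single POST to https://logging.googleapis.com/v2/entries:list can read from several projects at once. A sketch of the request body (project IDs and filter are placeholders):

```json
{
  "resourceNames": [
    "projects/my-project-dev",
    "projects/my-project-prod"
  ],
  "filter": "severity>=ERROR",
  "orderBy": "timestamp desc"
}
```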
It's also possible to export logs from multiple projects to one place: https://cloud.google.com/logging/docs/export/
For more details, you can check the documentation about monitoring multiple projects with a single Stackdriver account, which can contain up to 100 GCP projects.
A project organizes all your Google Cloud Platform resources. A project consists of a set of users; a set of APIs; and billing, authentication, and monitoring settings for those APIs. So, for example, all of your Cloud Storage buckets and objects, along with user permissions for accessing them, reside in a project. You can have one project, or you can create multiple projects and use them to organize your Google Cloud Platform resources, including your Cloud Storage data, into logical groups.
Users can only view and list projects they have access to via IAM roles. The Organization Admin can view and list all projects in the organization.
For logging you have to pass the project id:
projects/[PROJECT_ID]/logs/
Reference: https://cloud.google.com/logging/docs/
Is it possible to create StackDriver dashboards programmatically from a declarative template (similar to how Google Cloud Deployment allows you to create and manage cloud resources)?
We use a different GCP project for each of our environments (master, staging, prod) and want to have StackDriver dashboards with the same metrics in each project.
I checked the StackDriver Monitoring API docs as well as the Cloud Deployment supported resources and did not find any evidence of support for this use case.
I'm a Product Manager with Stackdriver. There's currently no publicly available dashboard API. Until one is available, a solution many of our customers use is to create a canonical dashboard template and then copy it each time a new dashboard is needed, modifying the input data.