How do I get the query quotas from Deployment Manager via the API? - google-cloud-platform

Over at https://console.cloud.google.com/apis/api/deploymentmanager.googleapis.com/quotas or https://console.cloud.google.com/iam-admin/quotas?service=deploymentmanager.googleapis.com, I am able to see the query as well as the write quotas and can determine whether I'm going to hit any limits.
Unfortunately, there seems to be no way to get these values programmatically using the Deployment Manager APIs (using Go) or using gcloud.
Am I missing something here, or are there other ways of getting at these values, possibly not via the APIs directly?

Currently, there's no way to get these quotas programmatically or with gcloud (apart from the Compute Engine quotas). However, there's a feature request to get/set project quotas via the API. I suggest starring that issue to track it and asking for updates there.
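For the Compute Engine quotas that are exposed, a minimal Go sketch of reading them through projects.get could look like this (it assumes Application Default Credentials, and "my-project-id" is a placeholder); the Deployment Manager query/write quotas themselves are not returned by this call:

```go
package main

import (
	"context"
	"fmt"
	"log"

	compute "google.golang.org/api/compute/v1"
)

func main() {
	ctx := context.Background()

	// Uses Application Default Credentials; run `gcloud auth application-default login` first.
	svc, err := compute.NewService(ctx)
	if err != nil {
		log.Fatal(err)
	}

	// "my-project-id" is a placeholder; substitute your own project.
	project, err := svc.Projects.Get("my-project-id").Context(ctx).Do()
	if err != nil {
		log.Fatal(err)
	}

	// Each quota entry carries the metric name, the limit and the current usage.
	for _, q := range project.Quotas {
		fmt.Printf("%-30s usage=%v limit=%v\n", q.Metric, q.Usage, q.Limit)
	}
}
```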

Knowing of no API that could be used to do so, I guess one could only limit the quota per user; see the documentation.
There are several similar questions concerning other APIs (all with the same answer).

Related

How can I get virtual machines specifications from Google cloud?

I would like to know if there is a way to collect the technical specifications of a virtual machine from Google Cloud (CPU, frequency, memory, storage)?
I am using the billing API (https://cloudbilling.googleapis.com/v1/services/6F81-5844-456A/skus?key=API_KEY) to get the pricing information but there is no technical specifications.
Is there a way to get that data? Maybe using the products' SKUs or something? I know the AWS and Azure SDKs/APIs allow developers to get this technical information, but I did not find the GCP equivalent.
I searched for a while for something like this; it seems a lot of people had the same issue, but no one had a working answer.
The Compute API offers several operations that you can use to obtain the desired information.
In particular, consider reviewing the instances.get operation; it will give you all the details of an instance.
You can obtain the list of instances in a given zone, if required, using the instances.list operation.
If you are trying to obtain information about machine families, GCP provides a rich API for that as well (see the sketch below).
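As a rough illustration, a Go sketch along these lines (the project, zone, instance and machine type names are placeholders) pulls per-instance details via instances.get and resolves vCPU and memory figures via machineTypes.get:

```go
package main

import (
	"context"
	"fmt"
	"log"

	compute "google.golang.org/api/compute/v1"
)

func main() {
	ctx := context.Background()
	svc, err := compute.NewService(ctx) // Application Default Credentials
	if err != nil {
		log.Fatal(err)
	}

	// Placeholder identifiers; substitute your own project, zone and instance name.
	project, zone, instance := "my-project-id", "europe-west1-b", "my-instance"

	// instances.get: returns disks, network interfaces and the machine type URL.
	inst, err := svc.Instances.Get(project, zone, instance).Context(ctx).Do()
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("machine type URL:", inst.MachineType)
	for _, d := range inst.Disks {
		fmt.Printf("disk %s: %d GB\n", d.DeviceName, d.DiskSizeGb)
	}

	// machineTypes.get: resolves vCPU count and memory for a machine type name.
	mt, err := svc.MachineTypes.Get(project, zone, "e2-standard-4").Context(ctx).Do()
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("%s: %d vCPUs, %d MB memory\n", mt.Name, mt.GuestCpus, mt.MemoryMb)
}
```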

Google Cloud Healthcare API not supporting PlanDefinition/$apply and ActivityDefinition/$apply

I am trying to create a CarePlan as output by performing the $apply operation on PlanDefinition and ActivityDefinition.
For example:
https://healthcare.googleapis.com/v1beta1/projects/<project_id>/locations/<location_name>/datasets/<dataset_name>/fhirStores/<fhirstore_name>/fhir/PlanDefinition/<plandefintion_id>/$apply?subject=Patient/
This endpoint returns "Error 404 (Not Found)!!1".
I tried many ways of making this endpoint work as per the FHIR R4 documentation, but it seems GCP does not yet support the $apply operation on PlanDefinition and ActivityDefinition in either the v1 or the v1beta1 version of the Healthcare API.
Need help on this.
On their documentation page they indicate that you can view your server's capabilities by retrieving the CapabilityStatement. If this does not list the $apply operation, it is not supported.
And actually further down on the page, the documentation mentions: "Most extended operations aren't implemented". If you need that functionality, maybe you can reach out to them to see if it can be integrated, or look into using a different FHIR server implementation.
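As a sketch of that check: the standard FHIR metadata interaction returns the store's CapabilityStatement, which you can then inspect for an "apply" operation entry. The project, dataset and store names below are placeholders, and the request is a plain authenticated GET rather than a dedicated client call:

```go
package main

import (
	"context"
	"fmt"
	"io"
	"log"

	"golang.org/x/oauth2/google"
)

func main() {
	ctx := context.Background()

	// HTTP client authenticated with Application Default Credentials.
	client, err := google.DefaultClient(ctx, "https://www.googleapis.com/auth/cloud-platform")
	if err != nil {
		log.Fatal(err)
	}

	// Placeholder resource names; the FHIR "metadata" interaction returns
	// the store's CapabilityStatement, including its supported operations.
	url := "https://healthcare.googleapis.com/v1/projects/my-project/locations/us-central1/" +
		"datasets/my-dataset/fhirStores/my-fhir-store/fhir/metadata"

	resp, err := client.Get(url)
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	body, err := io.ReadAll(resp.Body)
	if err != nil {
		log.Fatal(err)
	}
	// Inspect the returned JSON for an "apply" operation entry.
	fmt.Println(string(body))
}
```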

(GCloud SQL) Is there a way to opt in to maintenance notifications via CLI or API?

I'm looking for a way to toggle this notification via gcloud CLI or API call since I need to automate it.
Is there a way of doing it? If not, is this going to be available in the future?
With a lot of environments, it is hard to keep track of all of them via the UI.
I have checked the Cloud SQL Admin API and it seems this is not possible yet. The best way to proceed in these cases is to create a Feature Request in the Public Issue Tracker. I searched for an existing one, but I didn't find any. When you submit a Feature Request, the Engineering team gets more visibility into your needs, and they can prioritize those requests by the number of users affected.

Google Cloud APIs usage data by projects

Is there any way to programmatically get data similar to the APIs overview of the Google Cloud dashboard? Specifically, I'm interested in the list of APIs enabled for the project and their usage/error stats for some predefined timeframe. I believe there's an API for that, but I'm struggling to find it.
There's currently no API that gives you a report similar to the one you can see through the Google Cloud Console.
The Compute API can retrieve some quotas with the get method, but it's somewhat limited (only Compute Engine quotas) and, from what I understood of your question, not quite what you're looking for.
However, I've found in Google's Issue Tracker a feature request that's close to what you're asking for.
If you need something more specific or want to file the feature request yourself, check the "Report feature requests" documentation and create your own. The GCP team will take a look at it to evaluate and consider it for implementation.

Restrict access to Google Cloud Functions to a given network?

I'm looking through the Google Cloud Functions docs and I wonder whether it is possible to restrict access to an HTTP Cloud Function to a given network. I would like to prevent anyone from exhausting the free quota.
Are there any firewall rules or a similar mechanism for Cloud Functions?
I don't believe there are any built-in security restrictions at the moment.
In terms of avoiding quota exhaustion, you could pass a header or parameter with some kind of shared secret. Even a fixed string value would help avoid this problem.
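A minimal Go sketch of that idea (the X-Api-Key header name and the SHARED_SECRET environment variable are arbitrary choices for this example, not anything Cloud Functions mandates):

```go
package secretcheck

import (
	"net/http"
	"os"
)

// Handler rejects requests that don't carry the shared secret in a header.
// Rejected calls still execute this check, so they consume a small amount of quota.
func Handler(w http.ResponseWriter, r *http.Request) {
	expected := os.Getenv("SHARED_SECRET")
	if expected == "" || r.Header.Get("X-Api-Key") != expected {
		http.Error(w, "forbidden", http.StatusForbidden)
		return
	}
	w.Write([]byte("hello, authorized caller"))
}
```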
You can add authentication to a Cloud Function by using Firebase Authentication. Here's a GitHub example of how to do it: https://github.com/firebase/functions-samples/tree/master/authorized-https-endpoint
Note however that the authentication code is executed by your function, so rejecting unauthorized access would still consume a small portion of your free resource allowance.
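The linked sample is Node.js; a rough Go equivalent using the Firebase Admin SDK (assuming the firebase.google.com/go/v4 module and an ID token passed as "Authorization: Bearer <token>") might look like this:

```go
package authfn

import (
	"context"
	"net/http"
	"strings"

	firebase "firebase.google.com/go/v4"
)

// Handler verifies a Firebase ID token before serving the request.
func Handler(w http.ResponseWriter, r *http.Request) {
	ctx := context.Background()

	// Initialise the Firebase Admin SDK with default credentials.
	app, err := firebase.NewApp(ctx, nil)
	if err != nil {
		http.Error(w, "internal error", http.StatusInternalServerError)
		return
	}
	authClient, err := app.Auth(ctx)
	if err != nil {
		http.Error(w, "internal error", http.StatusInternalServerError)
		return
	}

	// Extract and verify the bearer token; reject the request if it is invalid.
	idToken := strings.TrimPrefix(r.Header.Get("Authorization"), "Bearer ")
	token, err := authClient.VerifyIDToken(ctx, idToken)
	if err != nil {
		http.Error(w, "unauthorized", http.StatusUnauthorized)
		return
	}
	w.Write([]byte("hello, " + token.UID))
}
```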
The Google Function Authorizer module might be what you're looking for. It provides "a simple user authentication and management system for Google Cloud HTTP Functions." It doesn't seem to have a lot of users yet, but the project seems simple enough that you could at least use it as a basis to modify or implement your own solution if you prefer.
This article was helpful for me.
https://cloud.google.com/solutions/authentication-in-http-cloud-functions
Anyone can still invoke the function, but the request must contain credentials from a user that has access to the resources accessed by the function.
Before that I was doing something very simple that is probably not great for production but does provide a little more security than just leaving it open publicly: I call my function with a password in the payload, and if it doesn't match one of the passwords I hardcoded in the function, it just fails with a 403.
If you need to restrict access to an IP range, you can follow the instructions here: https://sukantamaikap.com/posts/load-balancing-cloud-functions
The Google Cloud UI has unfortunately changed, so you need to do some searching to get everything done, but I managed to set it up. Note that the related services will cost roughly 25 EUR per month at minimum.
You can estimate the pricing here:
https://cloudpricingcalculator.appspot.com/
You need to search for "Cloud Load Balancing and Network Services" and then enable "Cloud Load Balancing", "Google Cloud Armor", and "IP addresses".
Alternatively, in some cases it might be sufficient to make the name of the function, or some suffix of the name, complex enough that it effectively acts as a sort of password, something like MyGoogleCloudFunc-abracadabra. That does not restrict the network, but outsiders would presumably not know the secret name anyway.