GCP Storage API Read Limits - google-cloud-platform

I have looked everywhere for documentation of a rate limit on read requests to the Storage API, but didn't manage to find any info. Any idea where it might be?
I would like to see information on rate limits per service account, project, session, or anything of that sort.

Related

Throttling Google Cloud Endpoints

I am implementing a public API using Google Cloud Endpoints and Google Cloud Functions. This API will later be used in a web application, and I need a way to throttle the number of requests to prevent people from flooding the API, which could increase project maintenance costs. I don't care if the API becomes unavailable due to throttling; protecting myself from those costs takes higher priority.
What should I do, or which tools should I use, to achieve this on Google Cloud?
If we look at the Google Cloud Endpoints documentation in the section called About Quotas, we find a description of a capability to limit the number of requests from calling applications. The article then goes into depth on how to set it all up, which amounts to adding additional attributes to your API's exposed OpenAPI spec.
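For a concrete picture, here is a minimal sketch of what those attributes look like, based on the documented x-google-management and x-google-quota extensions; the metric name read-requests and the limit values are placeholders invented for this example:

```yaml
# Excerpt from an Endpoints OpenAPI spec; "read-requests" and the
# limit values are placeholder names for this sketch.
x-google-management:
  metrics:
    - name: "read-requests"
      displayName: "Read requests"
      valueType: INT64
      metricKind: DELTA
  quota:
    limits:
      - name: "read-requests-limit"
        metric: "read-requests"
        unit: "1/min/{project}"   # per consumer project, per minute
        values:
          STANDARD: 1000          # 1000 calls per minute

paths:
  "/items":
    get:
      operationId: "listItems"
      x-google-quota:
        metricCosts:
          "read-requests": 1      # each call costs one unit of the metric
```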
There is also a great article called Rate-limiting strategies and techniques that provides a rich set of alternatives and considerations. My suggestion would be to read this article in depth; it will arm you with an overview of each of the choices at your disposal. There is also a rich set of additional references at the end of the article for further reading.

How do I get the query quotas from Deployment Manager via the API?

Over at https://console.cloud.google.com/apis/api/deploymentmanager.googleapis.com/quotas or https://console.cloud.google.com/iam-admin/quotas?service=deploymentmanager.googleapis.com, I am able to see the query as well as the write quotas and can determine whether I'm going to hit any limits.
Unfortunately, there seems to be no way to get these values programmatically using the Deployment Manager APIs (using Go) or using gcloud.
Am I missing something here, or are there other ways of getting at these values, possibly not via the APIs directly?
Currently, there's no way to get the quotas programmatically or with gcloud (apart from the Compute Engine quotas). However, there's a feature request to get/set project quotas via the API. I suggest starring that issue to track it and to ask for updates.
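For the Compute Engine quotas that are readable today, a minimal Go sketch using the google.golang.org/api/compute/v1 client would look roughly like this (my-project stands in for a real project ID):

```go
package main

import (
	"context"
	"fmt"
	"log"

	compute "google.golang.org/api/compute/v1"
)

func main() {
	ctx := context.Background()
	// NewService picks up application default credentials
	// (gcloud auth application-default login).
	svc, err := compute.NewService(ctx)
	if err != nil {
		log.Fatal(err)
	}
	// "my-project" is a placeholder project ID.
	proj, err := svc.Projects.Get("my-project").Do()
	if err != nil {
		log.Fatal(err)
	}
	// Project-level quotas come back as metric / usage / limit triples.
	for _, q := range proj.Quotas {
		fmt.Printf("%-24s %6.0f of %6.0f used\n", q.Metric, q.Usage, q.Limit)
	}
}
```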
I know of no API which could be used to do so; I guess one could only limit the quota per user (see the documentation).
There are several questions concerning other APIs, all with the same answer.

Collecting client-side browser metrics with Stackdriver?

I have seen some solutions for capturing client-side errors and reporting them to Stackdriver.
Does anybody know if it's possible to utilize Stackdriver in some way to collect page load timing metrics and report those? I couldn't find any example of how I might be able to do that in the documentation.
I believe a better approach is to send this information to your back end and have it forward the data to Stackdriver.
Otherwise, you have to either share credentials with the client so it can hit the Stackdriver endpoint, or open the endpoint up publicly. Both options are horrible, as someone could start hammering your logging to hide information or drive up costs for you.
If you still want to go the "client logging directly" way, it's simply a matter of hitting the monitoring.googleapis.com endpoint with authenticated calls (the auth is the hard part here).
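To give a feel for the back-end forwarding approach, here is a hedged Go sketch that writes one browser-reported timing as a custom metric via the Cloud Monitoring client library; the metric name custom.googleapis.com/page_load_ms and the global resource type are assumptions for this example:

```go
package metrics

import (
	"context"
	"time"

	monitoring "cloud.google.com/go/monitoring/apiv3"
	"github.com/golang/protobuf/ptypes/timestamp"
	metricpb "google.golang.org/genproto/googleapis/api/metric"
	monitoredrespb "google.golang.org/genproto/googleapis/api/monitoredres"
	monitoringpb "google.golang.org/genproto/googleapis/monitoring/v3"
)

// WritePageLoadMetric forwards one page-load timing (in milliseconds),
// reported by the browser to our back end, on to Stackdriver Monitoring.
func WritePageLoadMetric(ctx context.Context, projectID string, loadMillis int64) error {
	client, err := monitoring.NewMetricClient(ctx)
	if err != nil {
		return err
	}
	defer client.Close()

	now := &timestamp.Timestamp{Seconds: time.Now().Unix()}
	return client.CreateTimeSeries(ctx, &monitoringpb.CreateTimeSeriesRequest{
		Name: "projects/" + projectID,
		TimeSeries: []*monitoringpb.TimeSeries{{
			// custom.googleapis.com/page_load_ms is a placeholder metric name.
			Metric: &metricpb.Metric{Type: "custom.googleapis.com/page_load_ms"},
			Resource: &monitoredrespb.MonitoredResource{
				Type:   "global",
				Labels: map[string]string{"project_id": projectID},
			},
			Points: []*monitoringpb.Point{{
				Interval: &monitoringpb.TimeInterval{EndTime: now},
				Value: &monitoringpb.TypedValue{
					Value: &monitoringpb.TypedValue_Int64Value{Int64Value: loadMillis},
				},
			}},
		}},
	})
}
```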

AWS WAF Rate Limit by Request URL Component

I want to start by saying that I'm a total newcomer to AWS.
I'm investigating using AWS WAF for dynamic rate limiting based on a component of the request URL. The AWS website has a tutorial for doing this by IP address, but I have no idea if it can be modified to do what I need.
So, with that in mind, please tell me what, if any, of the following is actually possible:
- Rate limit by a component of the URL (an API key in this case)
- Determine the limit dynamically (different behaviour for different keys)
- Perform some non-blocking action on the first instance of exceeding the limit, then block if the limit is exceeded consistently
- Log both of the above actions and do something with the output logs (i.e. forward them somewhere)
Again, I'm not looking for detailed how-tos here, as they would probably warrant separate questions - I just want to know if this is possible.
API Gateway is probably the right fit for what you are looking to implement. It has throttling implemented out of the box.
Take a look at API Gateway Usage Plans for implementation details for your specific use case.
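As a rough sketch of what that looks like with aws-sdk-go (the API ID, stage, key ID, and limit numbers below are placeholders), a usage plan is created with throttle and quota settings and then tied to an API key:

```go
package main

import (
	"log"

	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/apigateway"
)

func main() {
	svc := apigateway.New(session.Must(session.NewSession()))

	// Create a usage plan with throttling and a daily quota.
	// "abc123" / "prod" are placeholder API and stage names.
	plan, err := svc.CreateUsagePlan(&apigateway.CreateUsagePlanInput{
		Name: aws.String("basic-tier"),
		ApiStages: []*apigateway.ApiStage{{
			ApiId: aws.String("abc123"),
			Stage: aws.String("prod"),
		}},
		Throttle: &apigateway.ThrottleSettings{
			RateLimit:  aws.Float64(10), // steady-state requests per second
			BurstLimit: aws.Int64(20),   // short-burst ceiling
		},
		Quota: &apigateway.QuotaSettings{
			Limit:  aws.Int64(10000), // requests per period
			Period: aws.String("DAY"),
		},
	})
	if err != nil {
		log.Fatal(err)
	}

	// Attach an existing API key, so the plan's limits apply per key;
	// keys can be attached to different plans with different limits.
	_, err = svc.CreateUsagePlanKey(&apigateway.CreateUsagePlanKeyInput{
		UsagePlanId: plan.Id,
		KeyId:       aws.String("your-api-key-id"), // placeholder key ID
		KeyType:     aws.String("API_KEY"),
	})
	if err != nil {
		log.Fatal(err)
	}
}
```

Per-key usage plans cover rate limiting by API key and different limits for different keys; the warn-then-block escalation and log forwarding would still need custom logic, for example around CloudWatch metrics and alarms.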

Restrict access to Google Cloud Functions to a given network?

I'm looking through the Google Cloud Functions docs, and I wonder if it is possible to restrict access to an HTTP Cloud Function to a given network? I would like to prevent anyone from exhausting the free quota.
Are there any firewall rules or similar mechanisms for Cloud Functions?
I don't believe there are any built-in security restrictions at the moment.
In terms of avoiding quota exhaustion, you could pass a header or parameter with some kind of shared secret. Even a fixed string value would help avoid the problem.
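A minimal sketch of that shared-secret idea as a Go HTTP function, where the header name X-Api-Secret and the SHARED_SECRET environment variable are arbitrary choices for this example:

```go
package function

import (
	"fmt"
	"net/http"
	"os"
)

// Guarded rejects any request that doesn't carry the shared secret,
// so casual callers can't burn through the free quota.
func Guarded(w http.ResponseWriter, r *http.Request) {
	// X-Api-Secret and SHARED_SECRET are placeholder names.
	if r.Header.Get("X-Api-Secret") != os.Getenv("SHARED_SECRET") {
		http.Error(w, "Forbidden", http.StatusForbidden)
		return
	}
	fmt.Fprintln(w, "Hello!")
}
```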
You can add authentication to a Cloud Function by using Firebase Authentication. Here's a GitHub example of how to do it: https://github.com/firebase/functions-samples/tree/master/authorized-https-endpoint
Note however that the authentication code is executed by your function, so rejecting unauthorized access would still consume a small portion of your free resource allowance.
The Google Function Authorizer module might be what you're looking for. It provides "a simple user authentication and management system for Google Cloud HTTP Functions." It doesn't seem to have a lot of users yet, but the project seems simple enough that you could at least use it as a basis to modify or implement your own solution if you prefer.
This article was helpful for me: https://cloud.google.com/solutions/authentication-in-http-cloud-functions
Anyone can still invoke the function but it must contain credentials from a user that has access to the resources accessed by the function.
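In the spirit of that article, here is a hedged Go sketch in which the function validates a Google-signed ID token itself using google.golang.org/api/idtoken; the audience URL is a placeholder for the function's real URL:

```go
package function

import (
	"fmt"
	"net/http"
	"strings"

	"google.golang.org/api/idtoken"
)

// Authed serves only callers that present a valid Google-signed ID token
// in the Authorization header.
func Authed(w http.ResponseWriter, r *http.Request) {
	// The audience must match the function's own URL; this one is a placeholder.
	const audience = "https://us-central1-my-project.cloudfunctions.net/Authed"

	token := strings.TrimPrefix(r.Header.Get("Authorization"), "Bearer ")
	payload, err := idtoken.Validate(r.Context(), token, audience)
	if err != nil {
		http.Error(w, "Unauthorized", http.StatusUnauthorized)
		return
	}
	fmt.Fprintf(w, "Hello, %v!\n", payload.Claims["email"])
}
```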
Before that, I was doing something very simple that is probably not great for production but does provide a little more security than just leaving it open publicly: I call my function with a password in the payload, and if it doesn't match one of the passwords hardcoded in the function, it just fails with a 403.
If you need to restrict access to an IP range, you can follow the instructions here: https://sukantamaikap.com/posts/load-balancing-cloud-functions
The Google Cloud UI has unfortunately changed since that post was written, and you need to do some searching to get it all done, but I managed to set it up. Note, however, that the related services will cost roughly 25 EUR per month at minimum.
You can estimate the pricing here:
https://cloudpricingcalculator.appspot.com/
You need to search for "Cloud Load Balancing and Network Services" and then enable "Cloud Load Balancing", "Google Cloud Armor", and "IP addresses".
Alternatively, in some cases it might be sufficient to make the function's name, or some suffix of it, complex enough that it effectively acts as a sort of password, something like MyGoogleCloudFunc-abracadabra. This doesn't restrict the network, but outsiders presumably wouldn't know the secret name anyway.