I need to access the Google Docs audit activity for my domain. The limit is 1,000 records per API call, and the number of API calls per day is capped at 10,000.
How can I increase the daily API call limit? Google Support was unable to answer this question and redirected me to Stack Overflow.
You may want to refer to this thread regarding quota increases for the Reports API:
There are several quotas for the Google Analytics APIs and Google APIs in general.
requests/day: 50,000
requests/100seconds/user: 100
requests/view/day: 10,000
Your application can make 50,000 requests per day by default. This can be extended, but it takes a while to get permission, so when you are getting close to this limit (around 80%) it's best to request an extension.
Each user can make at most 100 requests per 100 seconds, which appears to have gone up; last I knew it was only 10 requests per second. A user is denoted by IP address. There is no way to extend this quota beyond the max; you can't apply for it or pay for it.
Then there is the last quota, the one you asked about: you can make at most 10,000 requests per day against a view. This isn't just application based; if a user runs my application and your application, together we still have only 10,000 requests that can be made against that view. This quota is a pain, if you ask me. Now for the bad news: there is no way to extend this quota. You can't apply for it, you can't pay for it, and you can't beg the Google Analytics dev team (I have tried).
Answer: No, you can't extend the per-view per-day quota limit.
If you encounter an error, it is recommended to catch the exception and, using an exponential backoff algorithm, wait for a small delay before retrying the failed call.
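For example, a minimal retry sketch, assuming a Reports API service object built with google-api-python-client (the activities().list() parameters here are only illustrative):

# A minimal sketch of retrying a Reports API call with exponential backoff.
# The list() parameters are illustrative; adapt them to your own
# Admin SDK Reports client.
import random
import time

from googleapiclient.errors import HttpError


def list_activities_with_backoff(service, max_retries=5):
    for attempt in range(max_retries):
        try:
            return service.activities().list(
                userKey='all', applicationName='drive', maxResults=1000
            ).execute()
        except HttpError as error:
            # Retry only on rate-limit and transient server errors.
            if error.resp.status not in (403, 429, 500, 503):
                raise
            # Exponential backoff with jitter: ~1s, 2s, 4s, 8s, 16s.
            time.sleep((2 ** attempt) + random.random())
    raise RuntimeError('Retries exhausted')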
Related
I have 400 resources and if I add one more I get an error:
'Maximum number of Resources for this API has been reached.'
What is the maximum number? 500? 800?
I want to know if I can extend it by another 200-300 resources, or whether I need to create another API. Thank you!
As per the documentation, the default quota for Resources per API is 300. Reviewing the documentation further, we can see that this limit can be increased, which I suspect has already happened on your account.
If you would like to increase this further, you can use the console again and request a service increase; a useful guide for this is here.
As for the upper limit, it is not listed and most likely won't be, as it is at the AWS service team's discretion. Based on my experience, you can usually get 100-150% more than the default quota just by requesting a service increase in the console. If you want more than this, you may have to create a support case and give justification for the request, but, as long as it is reasonable, it will usually be accepted.
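If you prefer to script the lookup and the request rather than click through the console, here is a hedged sketch using boto3's Service Quotas client; the quota code below is a placeholder that you would replace with the real code for "Resources per API", and the desired value is only an example:

# A minimal sketch of requesting a quota increase with boto3's Service Quotas
# API. The QuotaCode is a placeholder; list_service_quotas() prints the real
# codes for the apigateway service so you can pick the right one.
import boto3

client = boto3.client('service-quotas')

# Find the quota you want to raise.
for quota in client.list_service_quotas(ServiceCode='apigateway')['Quotas']:
    print(quota['QuotaCode'], quota['QuotaName'], quota['Value'])

# Request an increase (QuotaCode and DesiredValue are placeholders).
response = client.request_service_quota_increase(
    ServiceCode='apigateway',
    QuotaCode='L-XXXXXXXX',
    DesiredValue=600.0,
)
print(response['RequestedQuota']['Status'])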
I've been trying for a while to figure out if Google Cloud has a mechanism for a "crowbar" limit on API usage, as a security measure.
The scenario I'm concerned about is, say I have an API keyfile on a server, and a hacker steals it and plugs it into their system that (for the purposes of some nebulous scam) is running an unbounded number of translate requests as quickly as possible. I then receive a $25,000 bill. I don't see any mechanisms to really prevent this.
I can limit the roles for that key, but if the hacker's interest is in the same API I use for my project, that doesn't help. I can set a billing limit, but that just sends me an email, and I am not a person who notices incoming email, so in practice it would take me days to realize this had happened.
I can set a quota, but all the quotas in the dashboard seem to be per-minute or per-day limits. I need a per-month quota, and I don't see any way to do it. Am I just missing something, or does Google simply provide no option for this?
I understand that "well just don't let your API key get compromised" is probably the common wisdom, but that's unacceptable to my use cases for a number of reasons, so I'm not interested in discussing it. Thanks!
Edit: I should note, Google's documentation says "you can set requests per day caps" - but there are no instructions on that page for how to do this, and I can see no mechanism for it. I don't want a per-day cap, I want a per-month cap, but I'd take a per-day cap if I could find one. These are the only quotas I see for Cloud Vision, for instance:
[Screenshots: Cloud Vision quota pages, parts 1 and 2]
As per link 1, there is no hard monthly quota limit for the Vision API. If you need this feature, you can request it using link 2.
In the meantime, as a workaround, you can control your Vision API budget by using the Cloud Billing Budget API, following link 3.
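As an illustration of link 3, here is a minimal sketch of creating a monthly budget with the google-cloud-billing-budgets Python client; the billing account ID, display name, and amount are placeholders, and keep in mind that a budget sends alerts rather than hard-capping usage:

# A minimal sketch of creating a budget with the Cloud Billing Budget API,
# assuming the google-cloud-billing-budgets client library. The billing
# account ID, display name, and amount are placeholders.
from google.cloud.billing import budgets_v1
from google.type import money_pb2

client = budgets_v1.BudgetServiceClient()

budget = budgets_v1.Budget(
    display_name='vision-api-monthly-cap',  # placeholder name
    amount=budgets_v1.BudgetAmount(
        specified_amount=money_pb2.Money(currency_code='USD', units=100)
    ),
    threshold_rules=[
        budgets_v1.ThresholdRule(threshold_percent=0.9),
        budgets_v1.ThresholdRule(threshold_percent=1.0),
    ],
)

created = client.create_budget(
    parent='billingAccounts/XXXXXX-XXXXXX-XXXXXX',  # placeholder account ID
    budget=budget,
)
print(created.name)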
I ask because the errors don't seem to line up with what the documentation says.
We have a daily limit of 1 million requests. Upon reaching about 995,000 requests, we started getting errors about hitting the limit and requests flat-lined. This happened around 7:20pm. The dashboard says "Daily quotas reset at midnight Pacific Time (PT)". However, requests started going through again at about 8:30pm.
Since 995,000 < 1,000,000 and 8:30pm != midnight, this leaves me with the feeling that we can't actually predict and prevent hitting the rate limit, or know when it will reset.
What do I need to know in order to do occasional heavy-use of the API while staying under the limits? Nothing in the documentation or dashboards gives me what I need.
You can see all your API quotas on the Google Cloud Console's "IAM & admin" page, under the "Quotas" tab. It should show how much you have used and whether or not you hit the limit. If you only used 995,000 out of 1,000,000 and it shows as such on your quota page (give it a bit of time to update, since it might be delayed) and it still doesn't work, something is wrong and it should be reported to us directly.
I suspect you might be hitting a rate limit, which temporarily prevented you from making further requests. You just have to bring your rate below the limit to have it working again. The quota page contains information on the rate limit (per 100 seconds) as well.
The official support channel for the Civic Information API is here.
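If it is the per-100-seconds rate limit, a simple client-side throttle can keep occasional heavy use under the ceiling. A minimal sketch follows; the WindowThrottle class and the limit values are illustrative, not part of any Google API:

# A minimal sketch of a client-side throttle that keeps requests under a
# per-window rate limit. The limits here are illustrative.
import time
from collections import deque


class WindowThrottle:
    def __init__(self, max_requests=100, window_seconds=100):
        self.max_requests = max_requests
        self.window_seconds = window_seconds
        self.timestamps = deque()

    def wait(self):
        now = time.monotonic()
        # Drop request timestamps that have fallen out of the window.
        while self.timestamps and now - self.timestamps[0] >= self.window_seconds:
            self.timestamps.popleft()
        if len(self.timestamps) >= self.max_requests:
            # Sleep until the oldest request in the window expires.
            time.sleep(self.window_seconds - (now - self.timestamps[0]))
        self.timestamps.append(time.monotonic())


throttle = WindowThrottle()
# Call throttle.wait() immediately before each API request to stay at or
# below 100 requests in any rolling 100-second window.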
Around 90 or 100 calls per second to
pubsub_client.projects().topics().publish(topic='projects/xxxx',body=body).execute(num_retries=0)
from a Google App Engine app to Google Cloud Pub/Sub results in
HttpError: <HttpError 429 when requesting https://pubsub.googleapis.com/v1/projects/xxxx:publish?alt=json returned "Request throttled due to user QPS limit being reached.">
I know there is a 100 QPS limit on administrative operations, but surely publishing to a topic is not an administrative operation? Pub/Sub is supposed to support millions of operations per second, so I know something is wrong.
Any help or insight would be appreciated. I need to get up to at least 300 publishes per second, as I'm trying to streamline an existing implementation using Pub/Sub. I think this may be a bug in the implementation.
I am running this code on Google App Engine python 2.7 -- using the appengine runtime, not the flexible one as that's not approved for production code yet.
Note that publisher quota is not in terms of QPS, but in terms of throughput. The default limit is 100MB/s. See the Quotas documentation for more details. Depending on the message size you are sending, you may be running into these limits.
The "user QPS limit being reached" message on a publish usually means one of three things:
You are publishing at a throughput higher than the default 100MB/s quota. If that is the case, you can apply for more quota by clicking "Apply for higher quota" on the Pub/Sub Quota page.
You are not authenticated against the correct Cloud project. If you are authenticated in, or running your Google App Engine instances in, a Cloud project that differs from the one your topic is defined in, the quota you run into may not be defined in the project you expect. More information can be found on the Google Application Default Credentials page.
You have manually set quota in the Quota page and that is the limit you are running into.
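One general mitigation, if request count rather than throughput is the constraint, is to batch several messages into a single publish call, since the publish body accepts a list of messages. A rough sketch in the same google-api-python-client style as the question; the topic path and payloads are placeholders:

# A rough sketch of batching several messages into one publish request to
# reduce request-level QPS. Assumes the same pubsub_client as in the question;
# the topic path and payloads are placeholders.
import base64

topic = 'projects/xxxx/topics/my-topic'  # placeholder topic path
payloads = [b'message-1', b'message-2', b'message-3']

body = {
    'messages': [
        {'data': base64.b64encode(p).decode('utf-8')} for p in payloads
    ]
}

response = pubsub_client.projects().topics().publish(
    topic=topic, body=body
).execute(num_retries=0)
print(response.get('messageIds'))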
I am planning to get an app developed, but the developer has told me that there is a limit of 600 calls per 600 seconds per IP. The app has plenty of scenarios in which this would not suffice. Is there a way to increase the limit somehow? Or does Facebook offer a premium account or something, perhaps with a yearly fee, that does not have such a limit?
Thanks.
If you exceed, or plan to exceed, any of the following thresholds please contact us as you may be subject to additional terms: (>5M MAU) or (>100M API calls per day) or (>50M impressions per day).
Pulled from: https://developers.facebook.com/policy/
The 100M API queries per day threshold applies to a single app, so that is what would restrict you, but I don't think it matters in practice.
Another thing is the limit you mentioned in your question, which I have read about elsewhere as well:
I've found 600 calls per 600 seconds, per token & per IP to be about where they stop you.
Pulled from: http://www.quora.com/Whats-the-Facebook-Open-Graph-API-rate-limit
Note that it is per token. Every user has a different access token, and usually a different IP as well. If it happens to be a cron job running from your server, I still don't think they would catch you on the IP as long as you keep changing the tokens.
Another thing to implement is the Real Time Updates API, which will ping you when something changes so you don't have to run a 24x7 monitoring script.
P.S.: Real Time Updates is buggy! I have experienced it myself.
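For reference, the Real Time Updates callback has to answer Facebook's verification GET with the hub.challenge value before notifications start arriving. A minimal sketch of such an endpoint, assuming Flask; the verify token and route are placeholders:

# A minimal sketch of a Real Time Updates callback endpoint, assuming Flask.
# VERIFY_TOKEN is a placeholder you choose and also supply when subscribing.
from flask import Flask, request

app = Flask(__name__)
VERIFY_TOKEN = 'my-verify-token'  # placeholder


@app.route('/fb-callback', methods=['GET', 'POST'])
def fb_callback():
    if request.method == 'GET':
        # Facebook's subscription verification handshake.
        if request.args.get('hub.verify_token') == VERIFY_TOKEN:
            return request.args.get('hub.challenge', '')
        return 'Invalid verify token', 403
    # POST: a change notification; inspect the JSON payload.
    changes = request.get_json(silent=True) or {}
    print(changes)
    return 'OK'


if __name__ == '__main__':
    app.run(port=8080)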