I would like to know whether there are limits when using the Facebook API for web applications. I am especially interested in the batch size limit (the FacebookClient.Batch method): how many requests can I safely execute at once?
Thanks for help.
Limits
We currently limit the number of batch requests to 50.
https://developers.facebook.com/docs/reference/api/batch/
[edit: the batch limit was raised from 20 to 50 since this was first posted]
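Since a single batch is capped at 50 requests, a larger workload has to be split into chunks before submitting. A minimal Python sketch of the chunking step (the queued request dicts below mirror the Graph API batch format, but the queue contents are illustrative):

```python
def chunked(items, size=50):
    """Split items into lists of at most `size` elements (Facebook's batch cap)."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

# Example: 120 queued Graph API requests end up in 3 batches of 50, 50, and 20,
# each of which can then be submitted as one batch call.
requests_queue = [{"method": "GET", "relative_url": f"me/friends?offset={i}"}
                  for i in range(120)]
batches = list(chunked(requests_queue))
```

Each element of `batches` is then small enough to send in a single batch call.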
Related
I have 400 resources in my API, and if I add one more I get the error
'Maximum number of Resources for this API has been reached.'
What is the maximum number? 500? 800?
I want to know whether I can extend it by another 200-300 resources, or whether I need to create another API. Thank you!
As per the documentation, the default quota for Resources per API is 300. Reviewing the documentation further, we can see that this limit can be increased, which I suspect has already happened on your account.
If you would like to increase it further, you can use the console again and request a service quota increase; a useful guide for this is here.
As for the upper limit, this is not listed and most likely won't be, as it is at the AWS service team's discretion. Based on my experience, you can usually get 100-150% more than the default quotas just by requesting a service increase in the console. If you would like more than this, you may have to create a support case and give justification for the request, but as long as it is reasonable, it will usually be accepted.
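Besides the console, the same increase can be requested with the AWS CLI's Service Quotas commands. A sketch (the `L-XXXXXXXX` quota code is a placeholder; look up the real code for "Resources per API" in the output of the first command):

```shell
# List the current quotas for API Gateway to find the quota code
# for "Resources per API":
aws service-quotas list-service-quotas --service-code apigateway

# Request an increase, e.g. to 600 resources per API:
aws service-quotas request-service-quota-increase \
    --service-code apigateway \
    --quota-code L-XXXXXXXX \
    --desired-value 600
```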
I am a beginner in AWS OpenSearch. I have one important question about it:
how much data (in MB or GB) can I insert in a single bulk request in AWS OpenSearch?
I tried to find the answer on the AWS website but couldn't. Please let me know if you can help.
The amount of data you can insert using the bulk operations will depend on the cluster, its configuration, and the data (among other factors), but I've found this to be a good recommendation:
"Start with the bulk request size of 5 MiB to 15 MiB. Then, slowly increase the request size until the indexing performance stops improving."
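A sketch of how that recommendation might be applied when building `_bulk` request bodies: pack documents until the next one would push the body past a byte budget, then start a new body. The 5 MiB starting budget comes from the quote above; the helper itself is illustrative, not part of any official client.

```python
import json

def pack_bulk_bodies(docs, index, max_bytes=5 * 1024 * 1024):
    """Group documents into _bulk request bodies no larger than max_bytes.

    Each document contributes an action line plus a source line, both
    newline-terminated, as the _bulk API expects.
    """
    bodies, current, current_size = [], [], 0
    for doc in docs:
        action = json.dumps({"index": {"_index": index}}) + "\n"
        source = json.dumps(doc) + "\n"
        entry_size = len(action.encode()) + len(source.encode())
        if current and current_size + entry_size > max_bytes:
            bodies.append("".join(current))
            current, current_size = [], 0
        current.append(action + source)
        current_size += entry_size
    if current:
        bodies.append("".join(current))
    return bodies
```

Each returned body can be POSTed to `_bulk` as-is; raising `max_bytes` step by step lets you find the point where indexing performance stops improving.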
I want to transcribe 1000 audios, each 15 minutes long.
I already tried it for one audio and it worked.
So my question is: can I run multiple long transcription commands at the same time without google-cloud-speech failing, or should I do it one by one instead?
And if I can run multiple commands at the same time, what is the maximum number of commands google-cloud-speech can handle?
Yes, you can do this by submitting long-running audio processing operations. You can submit all 1000 audio files and return immediately, save the operation names in a list, and poll the operations to check whether they have completed. This is exactly what Google describes in its documentation.
Regarding your question about the API limits, I would suggest you take a look at the speech-to-text quotas page. The request limit is 900 requests per 60 seconds, and the amount of speech you can process per day is 460 hours. Your specific workload will probably stay within those limits.
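The submit-then-poll pattern described above can be sketched like this. The `client` here is a simplified stand-in assumed to expose `long_running_recognize(uri)` returning an operation with `done()` and `result()`; the real google-cloud-speech client is shaped similarly but is not reproduced exactly.

```python
import time

def transcribe_all(client, audio_uris, poll_interval=30.0):
    """Submit every file as a long-running operation, then poll until all done.

    Submitting returns immediately, so all 1000 operations can be in flight
    at once; results come back in the same order as audio_uris.
    """
    operations = [client.long_running_recognize(uri) for uri in audio_uris]
    results = [None] * len(operations)
    pending = set(range(len(operations)))
    while pending:
        for i in list(pending):
            if operations[i].done():
                results[i] = operations[i].result()
                pending.discard(i)
        if pending:
            time.sleep(poll_interval)
    return results
```

In practice you would also cap the submission rate to stay under the 900-requests-per-60-seconds quota.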
Currently, Cloud Run has a request size limit of 32 MB, which makes it impossible to upload files like videos (which upload fine when placed directly into GCP Storage). Meanwhile, the All Quotas page doesn't list this limitation as one you can ask support to increase. So the question is: does anyone know how to increase this limit, or how to make uploading videos and other large files to Cloud Run possible given this limitation?
Google's recommended best practice is to use Signed URLs to upload files, which is likely to be more scalable and reliable (over flaky networks) for file uploads.
See this URL for further information:
https://cloud.google.com/storage/docs/access-control/signed-urls
As per official GCP documentation, the maximum request size limit for Cloud Run (which is 32 MB) cannot be increased.
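Since the 32 MB limit can't be raised, a common workaround is to route uploads by size: small payloads can go through the Cloud Run service directly, while anything larger should be PUT straight to Cloud Storage via a signed URL. A minimal sketch (the strategy names are illustrative; only the 32 MB figure comes from the documentation):

```python
CLOUD_RUN_HTTP1_LIMIT = 32 * 1024 * 1024  # documented 32 MB HTTP/1 request cap

def choose_upload_strategy(file_size_bytes):
    """Decide how a client should upload a file.

    Files within Cloud Run's request cap can be sent to the service;
    larger files should be uploaded to GCS with a signed URL instead.
    """
    if file_size_bytes <= CLOUD_RUN_HTTP1_LIMIT:
        return "direct-to-service"
    return "signed-url"
```

The service would then only hand out a signed URL (via the Cloud Storage client's `generate_signed_url`) for the large-file path, never receiving the bytes itself.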
Update since the other answers were posted: the request size is unlimited if you use HTTP/2.
See https://cloud.google.com/run/quotas#cloud_run_limits which says "Request Maximum HTTP/1 request size, in MB 32 if using HTTP/1 server. No limit if using HTTP/2 server."
I need to access Google Docs Audit Activity for my domain. The limit for the same is 1000 records in a single API call. Also, the number of API calls per day is 10K.
What is the way to increase the limits for API calls per day? Google Support is unable to answer this question and redirected me to Stack Overflow.
You may want to refer to this thread regarding a quota increase for the Reports API:
There are several quotas for the Google Analytics APIs and Google APIs in general.
requests/day: 50,000
requests/100seconds/user: 100
requests/perView: 10,000
Your application can make 50,000 requests per day by default. This can be extended, but it takes a while to get permission, so when you are getting close to this limit (around 80%) it's best to request an extension then.
Each user can make at most 100 requests per second, which must have gone up recently; last I knew it was only 10 requests per second. A user is denoted by IP address. There is no way to extend this quota beyond the maximum: you can't apply for it or pay for it.
Then there is the last quota, the one you asked about. You can make at most 10,000 requests per day against a view. This isn't just application-based: if the same user runs my application and your application, then together we share those 10,000 requests. This quota is a pain, if you ask me. Now for the bad news: there is no way to extend this quota. You can't apply for it, you can't pay for it, and you can't beg the Google Analytics dev team (I have tried).
Answer: No, you can't extend the per-view per-day quota limit.
If you encounter an error, it is recommended to catch the exception and, using an exponential backoff algorithm, wait for a small delay before retrying the failed call.
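The backoff pattern described above can be sketched like this (a generic helper, not tied to any particular Google client library; the doubling base and jitter are conventional choices, not mandated values):

```python
import random
import time

def call_with_backoff(fn, max_attempts=5, base_delay=1.0, max_delay=32.0, jitter=1.0):
    """Retry fn() with exponential backoff plus jitter.

    The delay doubles on each failed attempt (1s, 2s, 4s, ...) up to
    max_delay, with a random jitter added so that many clients hitting
    the same quota don't all retry in lockstep.
    """
    for attempt in range(max_attempts):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # out of attempts; surface the last error
            delay = min(base_delay * (2 ** attempt), max_delay)
            time.sleep(delay + random.uniform(0, jitter))
```

A call such as `call_with_backoff(lambda: service.activities().list(...).execute())` then absorbs transient quota errors instead of failing on the first one.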