Maximum number of automatic posts to a page allowed? - facebook-graph-api

I have an app where a user can enter a message in our CMS and select a list of pages it should be posted to. The pages are all company pages (local branches of the company) and I have the page access token.
Is there a hard limit on how many posts I can send out?
Is there a difference if I send out all the posts at once or if I put a delay between posts?

The limit is flexible; it is based on Facebook's internal algorithms that score how naughty or nice your app is. There is no hard number to give you. Just monitor the exceptions coming back from the API and follow what they tell you when you do get one (see the sketch after the policy quote below).

FB Policy
If you exceed 5M MAU, 100M API calls per day, or 50M impressions per day, you may be subject to additional terms.
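A minimal sketch of that "monitor the exceptions" advice, assuming the standard `/PAGE_ID/feed` publishing endpoint and the commonly reported rate-limit error codes (4, 17, 613); treat both the codes and the retry numbers as assumptions rather than a documented contract:

```typescript
// Sketch: post one message to many pages, backing off when the Graph API
// signals a rate limit. Error codes 4, 17 and 613 are commonly reported
// throttling codes (an assumption, not a documented contract).

type PageTarget = { pageId: string; accessToken: string };

async function postToPages(message: string, pages: PageTarget[]): Promise<void> {
  for (const page of pages) {
    let delayMs = 1000;                                   // first pause on a rate limit
    for (let attempt = 0; attempt < 5; attempt++) {
      const res = await fetch(`https://graph.facebook.com/${page.pageId}/feed`, {
        method: "POST",
        body: new URLSearchParams({ message, access_token: page.accessToken }),
      });
      const body: any = await res.json();
      if (res.ok) break;                                  // post accepted, next page

      const code = body?.error?.code;
      if (code === 4 || code === 17 || code === 613) {
        // Rate limited: wait, then retry with exponential backoff.
        await new Promise((r) => setTimeout(r, delayMs));
        delayMs *= 2;
      } else {
        throw new Error(`Graph API error: ${JSON.stringify(body?.error)}`);
      }
    }
  }
}
```

Whether you post all at once or with a delay, a pattern like this means a throttled call degrades into a slower post rather than a failed one.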

Related

Using geobytes web service to get cities list

I wanted a free web service to get a cities list and found Geobytes. It's good. I wanted to know: what is the meaning of 50,000 requests? It makes an HTTP request on every key press, so do they count it that way?
but if you expect to be performing more than 50,000 requests per day (your average unique visitors X 5), then please tell us
Anyone who has used this, please help.
I would imagine what it means is that going over 50,000 requests can be penalized in some way. A key press is not a request, but entering a city and fetching that city's details would constitute one of the 50,000 requests.
Hope this helps.
I am the author and administrator of Geobytes' AutoCompleteCity API, and there is now no practical limit on genuine use; the reference to 50,000 lookups per day has been removed from the web site. I say practical because it does have DoS attack prevention measures, but as the API is intended to be called from the browser (as opposed to a server; for that you would use the GetCityDetails API), its DoS protection measure of "1024 lookups per IP address, per hour" should never kick in under any circumstances that I can imagine.
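Since the API is meant to be called from the browser on key presses, a short debounce keeps traffic far below any per-IP hourly cap. The endpoint URL below is an illustrative assumption only; check Geobytes' documentation for the real one:

```typescript
// Browser-side sketch: debounce keystrokes so only one lookup fires after the
// user pauses typing. AUTOCOMPLETE_URL is an assumed placeholder.

const AUTOCOMPLETE_URL = "https://gd.geobytes.com/AutoCompleteCity"; // assumed URL

function debounce<T extends (...args: any[]) => void>(fn: T, waitMs: number): T {
  let timer: ReturnType<typeof setTimeout> | undefined;
  return ((...args: any[]) => {
    clearTimeout(timer);
    timer = setTimeout(() => fn(...args), waitMs);
  }) as T;
}

const lookupCity = debounce(async (prefix: string) => {
  if (prefix.length < 3) return;                     // skip very short prefixes
  const res = await fetch(`${AUTOCOMPLETE_URL}?q=${encodeURIComponent(prefix)}`);
  const cities: string[] = await res.json();         // e.g. ["Paris, France", ...]
  console.log(cities);
}, 300);                                             // fire 300 ms after the last key press

// Wire it to an <input id="city"> element:
document.getElementById("city")?.addEventListener("input", (e) => {
  lookupCity((e.target as HTMLInputElement).value);
});
```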

Limit on number of Graph API calls

I am planning to get an app developed, but the developer has told me that there is a limit of 600 calls per 600 seconds per IP. The app has plenty of scenarios in which this would not suffice. Is there a way to increase the limit somehow? Or does Facebook offer a premium account, perhaps with a yearly fee, that does not have such a limit?
Thanks.
If you exceed, or plan to exceed, any of the following thresholds please contact us as you may be subject to additional terms: (>5M MAU) or (>100M API calls per day) or (>50M impressions per day).
Pulled from : https://developers.facebook.com/policy/
100M API queries per day should be for a single app. So that should restrict you, but I don't think that matters.
Another thing is what you mentioned in your question, which I have read elsewhere as well:
I've found 600 calls per 600 seconds, per token & per IP to be about where they stop you.
Pulled from : http://www.quora.com/Whats-the-Facebook-Open-Graph-API-rate-limit
Note that it is per token. Every other user has a different access token, and a different IP as well. If it happens to be a cron job running from the server, I still don't think they would catch you on the IP as long as you keep changing the tokens. A simple client-side throttle (sketched below) also helps you stay under the ceiling.
Another thing to implement is the Real-time Updates API, which will ping you when something changes so you don't have to run a 24x7 monitoring script; a minimal receiver is sketched after the P.S.
P.S.: Real-time Updates is buggy! I have experienced it myself.
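A sketch of such a client-side guard, keeping your own traffic under roughly 600 calls per 600 seconds per token; the numbers mirror the Quora figure above and are not an official contract:

```typescript
// Hypothetical sliding-window throttle: never issue more than maxCalls
// requests inside any windowMs-long window for a given token.

class SlidingWindowThrottle {
  private timestamps: number[] = [];

  constructor(private maxCalls = 600, private windowMs = 600_000) {}

  // Resolves once another call is allowed inside the window.
  async acquire(): Promise<void> {
    for (;;) {
      const now = Date.now();
      this.timestamps = this.timestamps.filter((t) => now - t < this.windowMs);
      if (this.timestamps.length < this.maxCalls) {
        this.timestamps.push(now);
        return;
      }
      // The oldest call leaves the window first; sleep until then.
      const waitMs = this.windowMs - (now - this.timestamps[0]);
      await new Promise((r) => setTimeout(r, waitMs));
    }
  }
}

// Usage: one throttle instance per access token.
const throttle = new SlidingWindowThrottle();
async function graphGet(path: string, token: string) {
  await throttle.acquire();
  const res = await fetch(`https://graph.facebook.com/${path}?access_token=${token}`);
  return res.json();
}
```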
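For reference, a minimal Real-time Updates receiver looks roughly like this (Express, with the `hub.*` verification handshake; the verify token and route path are your own choices):

```typescript
// Minimal sketch of a Real-time Updates (webhooks) receiver.
// Facebook first verifies the callback URL with a GET handshake,
// then POSTs change notifications to the same URL.

import express from "express";

const app = express();
app.use(express.json());

const VERIFY_TOKEN = "my-secret-verify-token"; // your own chosen value

// Subscription verification handshake.
app.get("/fb-updates", (req, res) => {
  if (
    req.query["hub.mode"] === "subscribe" &&
    req.query["hub.verify_token"] === VERIFY_TOKEN
  ) {
    res.send(req.query["hub.challenge"]);    // echo the challenge back
  } else {
    res.sendStatus(403);
  }
});

// Change notifications; fetch the details with the Graph API afterwards.
app.post("/fb-updates", (req, res) => {
  console.log("change notification:", JSON.stringify(req.body));
  res.sendStatus(200);                       // acknowledge quickly, process async
});

app.listen(3000);
```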

Why is there a limit on the number of posts downloadable?

I am not able to download all the posts on a wall. I have specified 99999 as the limit, but it stops anywhere between 150 and 300!
Why is that?
e.g. /some_page_id/feed
It returns a different number of posts for different pages, and I never get all the posts.
Does anyone know why?
Because, like most of the APIs out there, clients have quotas. It's the same for Twitter or Netflix: when you request something you are draining Facebook's (or Twitter's) bandwidth, so it makes sense for them to limit the number of requests you can make.
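Whatever the per-request cap turns out to be, you can usually retrieve more posts by following the paging cursors instead of asking for one huge `limit`. A sketch, assuming the standard Graph API `data`/`paging.next` response shape:

```typescript
// Sketch: the server caps the page size regardless of the requested limit,
// so walk paging.next until it disappears or we have enough posts.

async function fetchFeed(pageId: string, token: string, maxPosts = 1000) {
  const posts: unknown[] = [];
  let url: string | undefined =
    `https://graph.facebook.com/${pageId}/feed?limit=100&access_token=${token}`;

  while (url && posts.length < maxPosts) {
    const res = await fetch(url);
    const body: any = await res.json();
    if (body.error) throw new Error(JSON.stringify(body.error));
    posts.push(...(body.data ?? []));
    url = body.paging?.next;                 // undefined when there are no more pages
  }
  return posts;
}
```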

Facebook Graph API - Account suspension

I have a .NET application that takes a list of names/email addresses and finds their matches on Facebook using the Graph API. During testing, my list had 900 names. I was checking Facebook matches for each name in a loop. The process completed. After that, when I opened my Facebook page, it gave me a message that my account had been suspended due to suspicious activity.
What am I doing wrong here? Doesn't Facebook allow a large number of search requests to their server? And 900 doesn't seem like a big number either.
Per the platform policies (https://developers.facebook.com/policy/), this may be a suspected breach of their "Principles" section.
See Policies I.5
If you exceed, or plan to exceed, any of the following thresholds
please contact us by creating confidential bug report with the
"threshold policy" tag as you may be subject to additional terms: (>5M
MAU) or (>100M API calls per day) or (>50M impressions per day).
Also IV.5
Facebook messaging (i.e., email sent to an @facebook.com address) is
designed for communication between users, and not a channel for
applications to communicate directly with users.
Then the biggie, V. Enforcement. No surprise, it's both automated and also monitored by humans. So they may be seeing 900+ requests coming from your app.
What I'd recommend doing:
Store what you can client side (in a cache or data store) so you make fewer calls to the API.
Put logging on your API calls so you, the developer, can see exactly what is happening. You might be surprised at what you find there. A sketch of both ideas follows.
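A rough sketch of both recommendations combined: cache results per name so repeats cost nothing, and log every outgoing call so you can see the real request volume. The `/search` endpoint and query parameters are assumed here for illustration:

```typescript
// Cache + logging wrapper around a Graph API name lookup (endpoint assumed).

const cache = new Map<string, unknown>();

async function findPerson(name: string, token: string): Promise<unknown> {
  if (cache.has(name)) return cache.get(name);   // no API call for repeated names

  const url =
    `https://graph.facebook.com/search?type=user` +
    `&q=${encodeURIComponent(name)}&access_token=${token}`;

  // Log the call without leaking the access token.
  console.log(new Date().toISOString(), "GraphAPI GET", url.split("&access_token")[0]);
  const res = await fetch(url);
  const body: any = await res.json();
  console.log("  -> status", res.status, "items", body?.data?.length ?? 0);

  cache.set(name, body);
  return body;
}
```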

How do sites count other sites' visitors and "value", and how can they tell users' location?

Hi, actually this is a simple question, but it came up out of curiosity...
I recently came across an online web evaluation tool called teqpad.com. I have lots of queries about it:
1. How do they do it? E.g. page views, daily visitors, etc., without instrumenting the real website?
2. Website worth... does this come anywhere close for any site?
3. I don't know how they got the daily revenue.
4. I like the traffic by country... it looks just like Google Analytics. How did they get that info?
5. Another one is the ISP info and the Google Maps location of the server.
Has anyone here written similar scripts? If so, what is your opinion?
1. They may be tracking user browser stats like Alexa does (more info on Wikipedia). A group of users installs a plug-in that reports which sites each user visits, much like TV ratings work in most (all?) countries. This method is obviously not very reliable, and is often nowhere near the actual number of visitors.
2. This is usually based on bullshit pseudo-scientific calculations and is never a viable basis for evaluating the "value" of a web site, even though it may be possible to guesstimate the approximate ad revenue a site yields (see 3). But that is only one revenue stream; it says nothing about how expensive the site's daily maintenance is: servers, staff, content creation...
3. It should be possible to very roughly estimate daily revenue by taking the guesses on daily visitors/page views, counting the frequency with which ads are shown, and looking at what those ads usually yield per page view. It is probably pretty easy to get some rough numbers on what an ad view is worth on a big site if you're in the market; a toy calculation follows.
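As a toy illustration of that arithmetic (all numbers are made-up placeholders, not data about any real site):

```typescript
// Back-of-the-envelope revenue estimate with placeholder inputs.

const dailyPageViews = 100_000;          // guessed traffic
const adsPerPage = 3;                    // ad slots shown per page view
const revenuePerThousandAdViews = 1.5;   // assumed effective CPM in dollars

const dailyAdViews = dailyPageViews * adsPerPage;
const estimatedDailyRevenue = (dailyAdViews / 1000) * revenuePerThousandAdViews;

console.log(`~$${estimatedDailyRevenue.toFixed(0)} per day`); // ~$450 per day
```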
4. and 5. It is possible to track most IP addresses down to the visitor's country and sometimes even the city. See the Geo targeting article on Wikipedia.
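For example, a country/city lookup can be done offline with the geoip-lite npm package, which bundles a (rough) GeoLite database; this is a sketch of the technique, not teqpad's actual method:

```typescript
// GeoIP lookup sketch: map an IP address to country/region/city.
// Accuracy is coarse, especially at the city level, as noted above.

import geoip from "geoip-lite";              // npm install geoip-lite

const ip = "8.8.8.8";                        // example address
const info = geoip.lookup(ip);               // null if the IP is not in the database

if (info) {
  console.log(info.country, info.region, info.city, info.ll); // country, region, city, [lat, lon]
} else {
  console.log("no geo data for", ip);
}
```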