I am trying to get all likes for a post on Facebook using the Graph API. I used the following two URLs (access token hidden):
https://graph.facebook.com/29576928167_10151158219658168/likes?since=1&limit=25&access_token=AAA...
https://graph.facebook.com/29576928167_10151158219658168/likes?since=1&limit=750&access_token=AAA...
Notice that the only change is the limit parameter. With a limit of 25, I receive 42 pages of 25 likes each and a final page of 7 likes, for a total of 1057 likes. With a limit of 750, I get a page of 99 likes, then a page of 307, then a page of 0 likes; my script stops when it encounters an empty page, for a total of 406. In both cases I follow the URLs returned by Facebook in paging.next.
Does anybody know why changing the limit changes the total number of results?
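For reference, the usual way to drive this kind of pagination is to keep following paging.next until a page comes back with no data. A minimal sketch of that loop (collect_all and fetch_page are hypothetical helper names, not part of any SDK; the fetch function is injected so the paging logic is easy to test without network access):

```python
import json
import urllib.request

def fetch_page(url):
    """Fetch one Graph API page and decode it as JSON (network helper)."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

def collect_all(url, fetch=fetch_page):
    """Follow paging.next until a page has no data, accumulating results."""
    results = []
    while url:
        page = fetch(url)
        data = page.get("data", [])
        if not data:
            break
        results.extend(data)
        # paging.next is absent on the last page, which also ends the loop
        url = page.get("paging", {}).get("next")
    return results
```

Whether this returns every like still depends on the server honoring the limit you asked for, which is exactly what the question above observes going wrong.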
Related
I am trying to extract all the posts from a private FB group of which I am an admin, using a Python script to access the data. I see the exact same problem whether I use the Graph API Explorer in the browser or my Python script. I am able to gather the first 6 pages of the feed, each page containing 25 posts. The very first request looks like this:
https://graph.facebook.com/<groupID>/feed?access_token=<accessToken>
That will return, as I stated, the latest 25 posts on the group page.
At the bottom of the JSON that is returned for each request is a section like this:
"paging": {
    "previous": "https://graph.facebook.com/v13.0/<pageID>/feed?access_token=<tokenID>&pretty=0&until&__previous=1&since=1649789940&__paging_token=<paging_token>",
    "next": "https://graph.facebook.com/v13.0/<pageID>/feed?access_token=<tokenID>&pretty=0&until=1647885515&since&__paging_token=<paging_token>&__previous"
}
I use the value in next to launch the next query. This works until I get to the 6th request. At that point when I request the URL in next it spins for about 15 seconds and then I get the following error:
{
    "error": {
        "code": 1,
        "message": "Please reduce the amount of data you're asking for, then retry your request"
    }
}
How exactly do I reduce the data I'm requesting? I've tried adding feed.limit() to the request, and it works for the very first request, but that limit is never carried over into the next URL. Adding it back in myself via the script still always returns 25 posts, not whatever the limit was on the first request: if I set feed.limit(7), I get 7 posts on the first request, but when I follow the next link I get 25.
I've set the limit to 100; the first request works, and next works the first time but not the second. If I set the limit to 120, the first query works but next doesn't. So it seems like there is a built-in barrier around 125; it won't give me any more data than that. Any help would be greatly appreciated.
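One workaround worth trying is to force your own limit back onto each paging.next URL before following it, since (as described above) the API does not carry the original limit forward. A sketch under that assumption, with with_limit as a hypothetical helper name:

```python
from urllib.parse import urlparse, parse_qs, urlencode, urlunparse

def with_limit(next_url, limit):
    """Re-apply a limit= parameter to a paging.next URL, overriding
    whatever limit (if any) the URL already carries."""
    parts = urlparse(next_url)
    # keep_blank_values preserves bare parameters like the "since" / "until"
    # placeholders Facebook sometimes emits with no value
    query = parse_qs(parts.query, keep_blank_values=True)
    query["limit"] = [str(limit)]
    flat = {key: values[0] for key, values in query.items()}
    return urlunparse(parts._replace(query=urlencode(flat)))
```

You could then follow with_limit(next_url, smaller_limit) instead of next_url, shrinking the limit whenever the "reduce the amount of data" error comes back.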
I have a list of football team names, my_team_list = ['Bayern Munich', 'Chelsea FC', 'Manchester United', ...], and I am trying to search for each team's official Facebook page to get its fan_count using the Python Facebook API client. This is my code so far:
club_list = []
for team in my_team_list:
    # team is already a string; team[0] would only pass its first letter
    data = graph.request('/pages/search?q=' + team)
    for page in data['data']:
        details = graph.get_object(id=page['id'], fields='id,name,fan_count,is_verified')
        if details['is_verified']:
            club_list.append([team, details['name'], details['fan_count'], details['id']])
However, because my list contains over 3000 clubs, this code quickly hits the following rate-limit error:
GraphAPIError: (#4) Application request limit reached
How can I reduce the number of calls, e.g. by getting more than one club's page per call (batch calls)?
As the comment on the OP states, batching will not save you. You need to actively watch the rate limiting:
"All responses to calls made to the Graph API include an X-App-Usage HTTP header. This header contains the current percentage of usage for your app. This percentage is equal to the usage shown to you in the rate limiting graphs. Use this number to dynamically balance your call load to avoid being throttled."
https://developers.facebook.com/docs/graph-api/advanced/rate-limiting#application-level-rate-limiting
On your first run through, you should save all of the valid page ids so in the future you can just query those ids instead of doing a search.
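The header check described above can be sketched roughly like this; throttle_from_usage is a hypothetical helper, and call_count, total_time, and total_cpu are the percentage fields documented for X-App-Usage:

```python
import json
import time

def throttle_from_usage(headers, threshold=75, pause=60):
    """Pause when the X-App-Usage response header reports usage at or above
    the threshold percentage. Returns the seconds slept (0 if under it)."""
    raw = headers.get("X-App-Usage")
    if not raw:
        return 0
    usage = json.loads(raw)
    worst = max(usage.get("call_count", 0),
                usage.get("total_time", 0),
                usage.get("total_cpu", 0))
    if worst >= threshold:
        time.sleep(pause)
        return pause
    return 0
```

Calling this after every response, with a threshold comfortably below 100, lets the script slow itself down before Facebook starts rejecting requests.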
I am sending queries to my RESTclient for example
url = "http://yboss.yahooapis.com/ysearch/web?q="Yorkshire%20Capital"ANDfraud&format=xml&abstract=long"
and I am noticing that there is a maximum of 50 results being returned each time. How can I change the query in order to get all the results?
You can't, by design.
http://developer.yahoo.com/boss/search/#pricing
Query Type   Definition                 Price/1000 Queries (USD)   Max. Results/Query
Full Web     Web Search results only.   $0.80                      50
Currently I am using the following API call to retrieve post likes and post comments for a Facebook Page (PageId). Here I make only one API call and retrieve ALL posts along with their total comment counts:
1). https://graph.facebook.com/PageId/posts?access_token=xyz&method=GET&format=json
But, as per the "July 2013 Breaking Changes", comment counts are no longer available with the above API call. So, per the roadmap documentation, I am using the following call to retrieve the comment count (total_count) for one particular post ID:
2). https://graph.facebook.com/post_ID/?summary=true&access_token=xyz&method=GET&format=json
So, with the second API call, I am able to retrieve the comment count per post. But as you can see, I need to iterate through each post and retrieve its comment count one by one, then sum them all up to find the total comment count, and that requires too many API calls.
My question is: is it possible to retrieve Page -> Posts -> total count of ALL comments in a single API call, considering the 10 July breaking changes?
Is there any alternative to my second API call for retrieving the total comment count across all posts of a Facebook page?
Hmm, well, I don't believe there is a way to bundle this all into a single API call. But you can batch requests so that it happens in what is effectively one HTTP call (which saves time), though each batched request still counts against your rate limits separately (my example below would be 4 calls against the limits).
Example batch call (JSON encoded), with the post ID stored in the PHP variable $postId:
[{"method":"GET","relative_url":"' . $postId . '"},
{"method":"GET","relative_url":"' . $postId . '/likes?limit=1000&summary=true"},
{"method":"GET","relative_url":"' . $postId . '/comments?filter=stream&limit=1000&summary=true"},
{"method":"GET","relative_url":"' . $postId . '/insights"}]
I'm batching 4 queries in this single call: first to get the post info, second to get likes (up to 1000, plus the total count), third to get all the comments plus the summary count, and finally insights (if it's the page's own post).
You can drastically simplify this batch call if you don't want all the details I'm pulling.
In this case you still need to iterate through all the posts. But Facebook allows up to 50 sub-requests per batch request, I believe, so you could include multiple post IDs in the same batch call to speed things up too.
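To make that concrete, here is a rough Python sketch of how the payload above could be assembled and flattened across several posts. build_post_batch and batch_payload are hypothetical names; the batch itself is sent as a single POST to the Graph API root with a batch parameter and an access token:

```python
import json

def build_post_batch(post_id):
    """The four sub-requests from the example above, for one post."""
    return [
        {"method": "GET", "relative_url": post_id},
        {"method": "GET", "relative_url": f"{post_id}/likes?limit=1000&summary=true"},
        {"method": "GET", "relative_url": f"{post_id}/comments?filter=stream&limit=1000&summary=true"},
        {"method": "GET", "relative_url": f"{post_id}/insights"},
    ]

def batch_payload(post_ids, access_token):
    """Flatten batches for several posts into one POST body.
    Facebook caps a batch at 50 sub-requests, so 12 posts fit per call here."""
    batch = [req for pid in post_ids for req in build_post_batch(pid)]
    if len(batch) > 50:
        raise ValueError("at most 50 sub-requests per batch")
    return {"batch": json.dumps(batch), "access_token": access_token}
```

The resulting dict would be form-posted to https://graph.facebook.com/, and the response is a JSON array with one entry per sub-request.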
I am requesting https://graph.facebook.com/me/statuses?limit=250000, but it shows only statuses since 2011-09-19.
I also tried to use https://graph.facebook.com/me/statuses?since=1199167200, but result was the same.
Is there any option, how to get all my statuses? Thanks a lot.
Yes, a limit greater than 100 will always return only 100 results, so just request limit=100 together with an offset.
Example
For the first 100 statuses:
https://graph.facebook.com/me/statuses?limit=100&offset=0
For the second 100 statuses:
https://graph.facebook.com/me/statuses?limit=100&offset=100
and so on, until you get an empty response.
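Putting that loop together, a minimal sketch might look like this. fetch_all_statuses is a hypothetical helper; the fetch function is injected so the offset-paging logic stays testable without network access:

```python
def fetch_all_statuses(base_url, fetch, limit=100):
    """Page through an endpoint with limit/offset until an empty response.
    `fetch` takes a URL and returns the decoded JSON page as a dict."""
    statuses, offset = [], 0
    while True:
        page = fetch(f"{base_url}?limit={limit}&offset={offset}")
        data = page.get("data", [])
        if not data:
            return statuses
        statuses.extend(data)
        offset += limit
```

The access token would be appended to base_url (or handled inside fetch) in real use.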