I am using https://graph.facebook.com/me/statuses?limit=250000, but it shows only statuses since 2011-09-19.
I also tried https://graph.facebook.com/me/statuses?since=1199167200, but the result was the same.
Is there any way to get all my statuses? Thanks a lot.
Yes, a limit greater than 100 will always return only 100 results, so just request limit=100 with an offset.
Example:
For the first 100 statuses:
https://graph.facebook.com/me/statuses?limit=100&offset=0
For the second 100 statuses:
https://graph.facebook.com/me/statuses?limit=100&offset=100
And so on, until you get an empty response.
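The loop above can be sketched in Python. This is a minimal sketch, not a full client: the access token is a placeholder, error handling is omitted, and the only assumption is the behavior described in the answer (an empty data array marks the end).

```python
import json
import urllib.request

GRAPH = "https://graph.facebook.com/me/statuses"
ACCESS_TOKEN = "..."  # placeholder: a valid user access token goes here

def page_url(offset, limit=100):
    """Build the URL for one page of statuses."""
    return f"{GRAPH}?limit={limit}&offset={offset}&access_token={ACCESS_TOKEN}"

def fetch_all_statuses():
    """Walk pages of 100 until the API returns an empty data list."""
    statuses, offset = [], 0
    while True:
        with urllib.request.urlopen(page_url(offset)) as resp:
            page = json.load(resp)
        data = page.get("data", [])
        if not data:  # empty response signals the end
            break
        statuses.extend(data)
        offset += 100
    return statuses
```

Each iteration advances the offset by the page size, so no statuses are skipped or duplicated as long as the feed doesn't change mid-crawl.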
Related
I am trying to extract all the posts from a private FB group of which I am an admin. I am using a Python script to access the data. Whether I use the Graph API Explorer via the web, or my Python script, I am having the exact same problem. I am able to gather the first 6 pages of the feed, each page containing 25 posts. The very first request looks like this:
https://graph.facebook.com/<groupID>/feed?access_token=<accessToken>
That will return, as I stated, the latest 25 posts on the group page.
At the bottom of the JSON that is returned for each request is a section like this:
"paging": {
  "previous": "https://graph.facebook.com/v13.0/<pageID>/feed?access_token=<tokenID>&pretty=0&until&__previous=1&since=1649789940&__paging_token=<paging_token>",
  "next": "https://graph.facebook.com/v13.0/<pageID>/feed?access_token=<tokenID>&pretty=0&until=1647885515&since&__paging_token=<paging_token>&__previous"
}
I use the value in next to launch the next query. This works until I get to the 6th request. At that point when I request the URL in next it spins for about 15 seconds and then I get the following error:
{
  "error": {
    "code": 1,
    "message": "Please reduce the amount of data you're asking for, then retry your request"
  }
}
How exactly do I reduce the data I'm requesting? I've tried adding feed.limit() to the request, and it works for the very first request, but that limit is never included in the next URL. Adding it in myself via the script still always returns 25 posts, not whatever the limit was on the first try. So if I set feed.limit(7), it returns 7 posts on the first request, but then when I use the next link I get 25.
I've set the limit to 100: the first request works, next works the first time, but not the second. If I set the limit to 120, the first query works, but now next doesn't. So it seems like there is a built-in barrier at 125; it won't give me any more data than that. Any help would be greatly appreciated.
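For reference, here is a minimal sketch of the paging loop described above; GROUP_ID and ACCESS_TOKEN are placeholders, the limit is applied only to the first request, and subsequent pages follow the returned next link verbatim (which is where the behavior in the question appears).

```python
import json
import urllib.request

GROUP_ID = "<groupID>"        # placeholder, as in the question
ACCESS_TOKEN = "<accessToken>"  # placeholder

def first_page_url(limit=25):
    """Build the initial feed request, with an explicit page size."""
    return (f"https://graph.facebook.com/{GROUP_ID}/feed"
            f"?limit={limit}&access_token={ACCESS_TOKEN}")

def fetch_feed(limit=25):
    """Collect posts by following paging.next until it disappears."""
    posts, url = [], first_page_url(limit)
    while url:
        with urllib.request.urlopen(url) as resp:
            page = json.load(resp)
        posts.extend(page.get("data", []))
        # Follow the next link as-is; it embeds its own paging token.
        url = page.get("paging", {}).get("next")
    return posts
```

This mirrors the script in the question: the limit only influences the first URL, since the loop never rewrites the next links.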
I am sending queries through my REST client, for example:
url = "http://yboss.yahooapis.com/ysearch/web?q=%22Yorkshire%20Capital%22%20AND%20fraud&format=xml&abstract=long"
and I am noticing that there is a maximum of 50 results being returned each time. How can I change the query in order to get all the results?
You can't, by design.
http://developer.yahoo.com/boss/search/#pricing
Query Type   Definition                 Price/1000 Queries (USD)   Max. Results/Query
Full Web     Web Search results only.   $0.80                      50
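Even with the 50-results-per-query cap, later pages can still be requested one query at a time. The following is a sketch under the assumption that the BOSS web endpoint accepts start/count offset parameters (an assumption about this API, not confirmed by the answer above); it also shows building a properly encoded query string, which sidesteps the broken quoting in the original URL.

```python
import urllib.parse

BASE = "http://yboss.yahooapis.com/ysearch/web"

def boss_url(query, start=0, count=50):
    """Build one page URL; quotes and spaces are encoded automatically."""
    params = {
        "q": query,
        "start": start,   # assumed offset parameter
        "count": count,   # capped at 50 per the pricing table above
        "format": "xml",
        "abstract": "long",
    }
    return BASE + "?" + urllib.parse.urlencode(params)

# Page 0 covers results 0-49, page 1 covers 50-99, and so on.
urls = [boss_url('"Yorkshire Capital" AND fraud', start=i * 50) for i in range(3)]
```

Note that each page counts as a separate billable query under the pricing shown above.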
I have two otherwise identical posts on a Facebook page that I administer. One post we'll call "full" returns the full range of insight values (31) I'd expect even when the values are zero, while the other which we'll call "subset" returns only a very limited subset of values (7). See below for the actual values returned.
Note that I've confirmed this is the case by using both the GUI-driven export to Excel and the Facebook Graph API Explorer (https://developers.facebook.com/tools/explorer).
My first thought was that the API suppresses certain values such as post_negative_feedback if they are zero (i.e., nobody has clicked hide or report as spam/abusive), but this is not the case: the "full" post has no such reports (or at the very least, the return values for all the post_negative_* fields are zero).
I've even tried intentionally reporting the post with no negative return values as spam, and then re-pulling what I thought was a real-time field (i.e., post_negative_feedback), but the data still comes back empty:
{
  "data": [
  ],
  (paging data)
}
What gives?
Here is the more limited subset returned for the problematic post:
post_engaged_users
post_impressions
post_impressions_fan
post_impressions_fan_unique
post_impressions_organic
post_impressions_organic_unique
post_impressions_unique
And here is the full set returned for most other posts (with asterisks added to show the subset returned above):
post_consumptions
post_consumptions_by_type
post_consumptions_by_type_unique
post_consumptions_unique
*post_engaged_users
*post_impressions
post_impressions_by_story_type
post_impressions_by_story_type_unique
*post_impressions_fan
post_impressions_fan_paid
post_impressions_fan_paid_unique
*post_impressions_fan_unique
*post_impressions_organic
*post_impressions_organic_unique
post_impressions_paid
post_impressions_paid_unique
*post_impressions_unique
post_impressions_viral
post_impressions_viral_unique
post_negative_feedback
post_negative_feedback_by_type
post_negative_feedback_by_type_unique
post_negative_feedback_unique
post_stories
post_stories_by_action_type
post_story_adds
post_story_adds_by_action_type
post_story_adds_by_action_type_unique
post_story_adds_unique
post_storytellers
post_storytellers_by_action_type
The issue (besides "why does this happen?") is that I've tried giving negative feedback to the post that fails to report any count whatsoever for this, and I still receive no data (I would expect "1" or something around there). I started out waiting the obligatory 15 minutes (it's a real-time field), and when that didn't work I gave it a full 24 hours. What gives?
I am trying to get all likes for a post on Facebook, using the Graph API. I used the following two URLs (access token hidden):
https://graph.facebook.com/29576928167_10151158219658168/likes?since=1&limit=25&access_token=AAA...
https://graph.facebook.com/29576928167_10151158219658168/likes?since=1&limit=750&access_token=AAA...
Notice the only change is the limit parameter. When I use a limit of 25, I receive 42 pages of 25 likes each, and a final page of 7 likes, for a total of 1057 likes. Using a limit of 750, I get a page of 99 likes, then a page with 307 likes, then a page with 0 likes. My script stops when it encounters a page of 0 results, so the total is 406. In both cases, I follow the URLs returned by Facebook in paging.next.
Does anybody know why changing the limit changes the total number of results?
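For context, the counting loop described in the question can be sketched as follows. The post ID is the one from the question, the token is a placeholder, and the loop stops on an empty page exactly as described (which is why the two limits can report different totals if the API returns a short or empty page early).

```python
import json
import urllib.request

POST_ID = "29576928167_10151158219658168"  # from the question
ACCESS_TOKEN = "AAA..."                     # placeholder

def likes_url(limit):
    """Build the initial likes request for a given page size."""
    return (f"https://graph.facebook.com/{POST_ID}/likes"
            f"?since=1&limit={limit}&access_token={ACCESS_TOKEN}")

def count_likes(limit):
    """Follow paging.next, stopping at the first empty page."""
    total, url = 0, likes_url(limit)
    while url:
        with urllib.request.urlopen(url) as resp:
            page = json.load(resp)
        data = page.get("data", [])
        if not data:  # empty page: treat as the end, as in the question
            break
        total += len(data)
        url = page.get("paging", {}).get("next")
    return total
```

Running count_likes(25) and count_likes(750) against the same post is the comparison the question describes.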
I have been querying Geonames for parks per state. Mostly there are under 1000 parks per state, but I just queried Connecticut, and there are just under 1200 parks there.
I already got the 1-1000 with this query:
http://api.geonames.org/search?featureCode=PRK&username=demo&country=US&style=full&adminCode1=CT&maxRows=1000
But increasing maxRows to 1200 gives an error that I am querying for too many at once. Is there a way to query for rows 1000-1200?
I don't really see how to do it with their API.
Thanks!
You should be using the startRow parameter in the query to page results. It takes an integer value (0-based indexing), and the documentation describes it as:
Used for paging results. If you want to get results 30 to 40, use startRow=30 and maxRows=10. Default is 0.
So to get the next batch of rows (1000-1999), you should change your query to:
http://api.geonames.org/search?featureCode=PRK&username=demo&country=US&style=full&adminCode1=CT&maxRows=1000&startRow=1000
I'd also suggest reducing maxRows to something more manageable: something that puts less load on their servers and makes for quicker responses to your queries.
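Putting startRow and a smaller maxRows together, here is a short sketch that builds six pages of 200 rows, enough to cover the ~1200 Connecticut parks. It only uses the parameters already shown in the question and answer; the page size of 200 is an arbitrary "manageable" choice.

```python
import urllib.parse

BASE = "http://api.geonames.org/search"

def parks_url(start_row, max_rows=200):
    """Build one page of the parks query from the question."""
    params = {
        "featureCode": "PRK",
        "username": "demo",     # the example username from the question
        "country": "US",
        "style": "full",
        "adminCode1": "CT",
        "maxRows": max_rows,
        "startRow": start_row,  # 0-based offset into the result set
    }
    return BASE + "?" + urllib.parse.urlencode(params)

# Six pages of 200 cover rows 0-1199.
urls = [parks_url(i * 200) for i in range(6)]
```

A loop over these URLs replaces the single maxRows=1000 request and never asks for more than 200 rows at once.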