Yahoo BOSS: return more than 50 results - rest-client

I am sending queries with my REST client, for example:
url = "http://yboss.yahooapis.com/ysearch/web?q=%22Yorkshire%20Capital%22%20AND%20fraud&format=xml&abstract=long"
and I notice that a maximum of 50 results is returned each time. How can I change the query in order to get all the results?

You can't, by design.
http://developer.yahoo.com/boss/search/#pricing
Query Type: Full Web
Definition: Web Search results only.
Price/1000 Queries (USD): $0.80
Max. Results/Query: 50
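The 50-results-per-query cap can't be raised, but results beyond the first 50 can still be collected 50 at a time by paging. Below is a minimal Python sketch of that approach; the start and count parameters are assumptions based on the BOSS v2 documentation, and the OAuth signing that BOSS requires is deliberately left out.

# Sketch: page Yahoo BOSS web search results 50 at a time.
# Assumes BOSS v2 "start"/"count" parameters; OAuth signing omitted.
import requests

BASE = "http://yboss.yahooapis.com/ysearch/web"
query = '"Yorkshire Capital" AND fraud'

results = []
for start in range(0, 200, 50):          # first 200 hits, 50 per request
    params = {
        "q": query,
        "format": "xml",
        "abstract": "long",
        "count": 50,                      # hard cap per query
        "start": start,                   # offset into the result set
    }
    resp = requests.get(BASE, params=params)  # add OAuth auth here
    resp.raise_for_status()
    results.append(resp.text)             # XML payload for this page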

Related

How to search for pages by name in Facebook Graph API with least amount of calls using Python?

I have a list of football team names, my_team_list = ['Bayern Munich', 'Chelsea FC', 'Manchester United', ...], and I am trying to search for each club's official Facebook page to get its fan_count using the Python facebook-api. This is my code so far:
club_list = []
for team in my_team_list:
    # search pages by club name (team is already a string, not a list)
    data = graph.request('/pages/search?q=' + team)
    for i in data['data']:
        likes = graph.get_object(id=i['id'], fields='id,name,fan_count,is_verified')
        if likes['is_verified'] is True:
            club_list.append([team, likes['name'], likes['fan_count'], likes['id']])
However, because my list contains over 3000 clubs, this code hits the following rate limit error:
GraphAPIError: (#4) Application request limit reached
How can I reduce the number of calls, e.g. by fetching more than one club's page per call (batch requests)?
As the comment on the OP states, batching will not save you. You need to be actively watching the rate limiting:
"All responses to calls made to the Graph API include an X-App-Usage HTTP header. This header contains the current percentage of usage for your app. This percentage is equal to the usage shown to you in the rate limiting graphs. Use this number to dynamically balance your call load to avoid being throttled."
https://developers.facebook.com/docs/graph-api/advanced/rate-limiting#application-level-rate-limiting
On your first run through, you should save all of the valid page ids so in the future you can just query those ids instead of doing a search.
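A minimal sketch of that kind of dynamic throttling is below. It calls the Graph API directly with requests so that the X-App-Usage header is visible (the facebook-sdk client in the question doesn't expose response headers); the exact JSON fields inside the header (call_count and friends) are taken from the rate-limiting docs linked above, so treat them as an assumption, and ACCESS_TOKEN is a placeholder.

# Sketch: throttle Graph API calls based on the X-App-Usage header.
import json
import time
import requests

GRAPH = "https://graph.facebook.com"
ACCESS_TOKEN = "..."  # placeholder: a valid app/user token

def graph_get(path, **params):
    # make the call, then inspect the X-App-Usage header before returning
    params["access_token"] = ACCESS_TOKEN
    resp = requests.get(GRAPH + path, params=params)
    resp.raise_for_status()
    usage = json.loads(resp.headers.get("X-App-Usage", "{}"))
    # back off when any reported usage percentage gets close to 100
    if usage and max(usage.values()) > 90:
        time.sleep(60)
    return resp.json()

data = graph_get("/pages/search", q="Bayern Munich")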

Vtiger. Change query limit

The vtiger wiki says:
Query always limits its output to 100 records; client applications can use the limit operator to get different records.
This query does not work:
doQuery("select * from Leads limit='200';")
How to specify the operator in a query?
The "limit" clause only works if the number given is no higher than 100; you can't get more than 100 records with "limit" in a single request.
To get more than 100 records from the vtiger web services you need to make several requests, using an offset in the "limit" clause.
If you read the wiki documentation carefully, you'll see that you need to use:
select *
from Leads
limit 200;
Drop the unnecessary single quotes ('200') - limit expects a numeric value, so there's no point in turning it into a string -
and drop the equals sign too; it isn't shown in the docs anywhere.
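To illustrate the offset-based paging mentioned above, here is a rough Python sketch against the vtiger web service query operation. The webservice.php endpoint, the operation=query call, and the MySQL-style "limit <offset>, <count>" syntax are assumptions based on the vtiger web services docs; an already-authenticated sessionName and the endpoint URL are placeholders.

# Sketch: page through Leads 100 records at a time via the vtiger web service.
import requests

ENDPOINT = "https://example.com/vtigercrm/webservice.php"  # hypothetical URL
session_name = "..."  # obtained earlier from the login operation

all_leads = []
offset = 0
while True:
    query = "select * from Leads limit %d, 100;" % offset
    resp = requests.get(ENDPOINT, params={
        "operation": "query",
        "sessionName": session_name,
        "query": query,
    })
    rows = resp.json().get("result", [])
    if not rows:
        break              # no more records
    all_leads.extend(rows)
    offset += 100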

Get Lowest Offers for Entire Inventory on Amazon

We are just getting started with MWS. We'd like to use the lowest offers on each product to help calculate our price. There is a GetLowestOfferListingsForSKU operation, but it only returns data for a single SKU, and the throttle limit means it would take several days to get all the data.
Does anybody know a way to get that data for multiple products in a single request?
You can fetch data on up to 20 SKUs using GetLowestOfferListingsForSKU by adding a SellerSKUList.SellerSKU.n parameter for each product (where n is a number from 1 to 20). The request looks something like this:
https://mws.amazonservices.com/Products/2011-10-01
?AWSAccessKeyId=AKIAJGUVGFGHNKE2NVUA
&Action=GetLowestOfferListingsForSKU
&SellerId=A2NK2PX936TF53
&SignatureVersion=2
&Timestamp=2012-02-07T01%3A22%3A39Z
&Version=2011-10-01
&Signature=MhSREjubAxTGSldGGWROxk4qvi3sawX1inVGF%2FepJOI%3D
&SignatureMethod=HmacSHA256
&MarketplaceId=ATVPDKIKX0DER
&SellerSKUList.SellerSKU.1=SKU1
&SellerSKUList.SellerSKU.2=SKU2
&SellerSKUList.SellerSKU.3=SKU3
Here's some relevant documentation which explains this: http://docs.developer.amazonservices.com/en_US/products/Products_ProcessingBulkOperationRequests.html
You might also find the MWS scratchpad helpful for testing:
https://mws.amazonservices.com/scratchpad/index.html
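As a rough illustration of the batching described above, this Python sketch chunks a SKU list into groups of 20 and builds the SellerSKUList.SellerSKU.n parameters for each GetLowestOfferListingsForSKU call. Request signing (SignatureMethod/Signature) is deliberately left out - in practice an MWS client library would handle it - and the seller and marketplace IDs are placeholders.

# Sketch: build batched GetLowestOfferListingsForSKU requests, 20 SKUs each.
SELLER_ID = "YOUR_SELLER_ID"        # placeholder
MARKETPLACE_ID = "ATVPDKIKX0DER"

def chunk(seq, size=20):
    # yield consecutive slices of at most `size` items
    for i in range(0, len(seq), size):
        yield seq[i:i + size]

def build_params(skus):
    params = {
        "Action": "GetLowestOfferListingsForSKU",
        "SellerId": SELLER_ID,
        "MarketplaceId": MARKETPLACE_ID,
        "Version": "2011-10-01",
    }
    # SellerSKUList.SellerSKU.n, where n runs from 1 to 20
    for n, sku in enumerate(skus, start=1):
        params["SellerSKUList.SellerSKU.%d" % n] = sku
    return params

all_skus = ["SKU1", "SKU2", "SKU3"]  # your full inventory list
requests_to_send = [build_params(batch) for batch in chunk(all_skus)]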

Facebook Graph API Explorer won't show all statuses

I am querying https://graph.facebook.com/me/statuses?limit=250000, but it only shows statuses going back to 2011-09-19.
I also tried https://graph.facebook.com/me/statuses?since=1199167200, but the result was the same.
Is there any way to get all my statuses? Thanks a lot.
Yes - a limit greater than 100 will still return only 100 results, so just request limit=100 together with an offset.
Example:
For the first 100 statuses:
https://graph.facebook.com/me/statuses?limit=100&offset=0
For the second 100 statuses:
https://graph.facebook.com/me/statuses?limit=100&offset=100
and so on, until you get an empty response.
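A small Python sketch of that loop is below. It uses requests against the Graph API directly; the access token and the exact shape of the JSON response (a "data" list) are assumptions rather than something stated in the answer above.

# Sketch: page through /me/statuses 100 at a time until the data list is empty.
import requests

ACCESS_TOKEN = "..."   # placeholder: a valid user access token
URL = "https://graph.facebook.com/me/statuses"

statuses = []
offset = 0
while True:
    resp = requests.get(URL, params={
        "access_token": ACCESS_TOKEN,
        "limit": 100,
        "offset": offset,
    })
    data = resp.json().get("data", [])
    if not data:
        break             # empty response - we've reached the end
    statuses.extend(data)
    offset += 100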

Querying geonames for rows 1000-1200

I have been querying Geonames for parks per state. Mostly there are under 1000 parks per state, but I just queried Connecticut, and there are just under 1200 parks there.
I already got the 1-1000 with this query:
http://api.geonames.org/search?featureCode=PRK&username=demo&country=US&style=full&adminCode1=CT&maxRows=1000
But increasing the maxRows to 1200 gives an error that I am querying for too many at once. Is there a way to query for rows 1000-1200 ?
I don't really see how to do it with their API.
Thanks!
You should use the startRow parameter in the query to page through the results. The documentation notes that it takes an integer value (zero-based indexing) and is
Used for paging results. If you want to get results 30 to 40, use startRow=30 and maxRows=10. Default is 0.
So to get the next 1000 data points (1000-1999), you should change your query to
http://api.geonames.org/search?featureCode=PRK&username=demo&country=US&style=full&adminCode1=CT&maxRows=1000&startRow=1000
I'd also suggest reducing maxRows to something more manageable - something that puts less load on their servers and makes for quicker responses to your queries.
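For completeness, here is a short Python sketch of that paging loop. It uses the JSON variant of the endpoint (searchJSON) and the totalResultsCount / geonames fields of the response, which are assumptions based on the GeoNames docs rather than anything shown above; "demo" is the placeholder username from the question.

# Sketch: page GeoNames park results for Connecticut in blocks of 200.
import requests

URL = "http://api.geonames.org/searchJSON"
params = {
    "featureCode": "PRK",
    "country": "US",
    "adminCode1": "CT",
    "style": "full",
    "username": "demo",   # replace with your own GeoNames username
    "maxRows": 200,       # smaller pages are easier on the servers
}

parks = []
start_row = 0
while True:
    resp = requests.get(URL, params=dict(params, startRow=start_row))
    data = resp.json()
    rows = data.get("geonames", [])
    parks.extend(rows)
    start_row += len(rows)
    if not rows or start_row >= data.get("totalResultsCount", 0):
        break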