I'm developing a WP7 app that loads the latest 20 results from a web service, and I'm wondering how to load the next 20 when the user scrolls to the end of the ListBox.
I have found some topics on how to detect when the user reaches the end of the list, but I'm struggling with how to call the web service again for the next entries.
EDIT:
So okay, here is the thing. My API gives me two options:
- take some number of results (like 10, 20, 30) and show them all in the list
- ask the API for, say, 3 pages of 20 records each
Thinking about the second option: I could display just 1 of the 3 pages and, when the user scrolls down, show the next page (already stored on the phone), but that makes no sense because the user downloads all the records even if he doesn't want to see more than the top 5.
The only idea left is to request the next results on demand, but I don't know how to re-call the web service at that point.
Your problem seems more web-service related than Windows Phone related. If you are getting data from a web service, the provider should ideally supply documentation on how to fetch the next/previous records or entries.
Here are two links from the Twitter API which give you some idea of fetching data in pages.
Getting the home_timeline data
Working with Timelines
Here is another link which gives an idea of how to implement paging in a Silverlight application (I am not sure how compatible this method is with a WP app).
If this doesn't answer your question, update it with some additional details, such as which URL you are using to fetch the first 20 records.
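As a sketch of the paging idea (the parameter names `offset`/`limit` are hypothetical; substitute whatever your API documents, e.g. a page number or a cursor):

```python
def page_params(page, page_size=20):
    """Translate a 1-based page number into offset/limit query parameters."""
    if page < 1:
        raise ValueError("page numbers start at 1")
    return {"offset": (page - 1) * page_size, "limit": page_size}

def fetch_page(fetch, page, page_size=20):
    """fetch is any callable taking the query parameters and returning a list.
    In the real app it would wrap the web-service call; it is injected here
    so the paging logic stays self-contained."""
    return fetch(**page_params(page, page_size))
```

When the ListBox reports that the user has scrolled to the end, increment the page counter, call the service again with the next page's parameters, and append the results to the bound collection instead of replacing it.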
I've been self-studying web design and want to implement something, but I'm really not sure how to accomplish it, or even whether I can.
The only frontend I have dealt with is Angular 4, and the only backend is Django REST Framework. I have managed to build the user models in DRF, authenticate the user on the frontend with JSON Web Tokens, and make various kinds of GET and POST requests.
What I want to do: the frontend has a button; when it is clicked, it sends a GET request that runs a text-mining algorithm producing a list. The algorithm may take 20-30 seconds to fully complete, but I don't want the user to wait that long for a single response containing the fully compiled list.
Is it possible to, say, create a table in Angular and have the backend send another response every couple of seconds containing more data, which the frontend then appends to the table? Something like:
- 00.00s: button -> GET request
- 01.00s: DRF starts the analysis
- 05.00s: DRF returns the first estimated 10% of the overall list
- 09.00s: DRF finds 10% more and returns an estimated 20% of the overall list
and so on until the algorithm has stopped. The list will be very small: around 20 strings of about 15 words each.
I already tried sending multiple responses from a for loop in Django, but the Angular frontend just receives the first one and then stops listening.
No, that's not possible. Each request gets exactly one response, never multiple.
You have two options:
- Start your algorithm via an endpoint like /start, then poll the state at an interval on an endpoint like /state
- Read about WebSockets, or try Firebase (or AngularFire), which gives you two-way communication
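A minimal sketch of the first option, with the state-fetching call injected so the loop runs without a server (the /start and /state endpoint names are the hypothetical ones from above; in the real app `fetch_state` would issue a GET to /state):

```python
import time

def poll_until_done(fetch_state, interval=2.0, sleep=time.sleep):
    """Repeatedly fetch the job state and accumulate partial results.

    fetch_state returns a dict like {"done": bool, "items": [...]};
    polling stops once the backend reports the job is done."""
    results = []
    while True:
        state = fetch_state()
        results.extend(state.get("items", []))
        if state["done"]:
            return results
        sleep(interval)
```

On the Angular side the same loop is an `interval()`/`setInterval` that hits /state and pushes each batch of items into the table's data array.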
I'm currently running into a problem using a web service to load products into Magento.
I'm using the REST API in conjunction with OAuth to create products and assign a category. It works: in the admin I can see the products and see that they are properly assigned to the correct category. When I open category management in the management console I can see I have (for example) 106 items assigned to the category.
However, the problem is that they do not show up on the site, even after refreshing every cache and index.
When I open one article in the management console and save it without changing any other property, I can suddenly see the item in the frontend webshop.
I'm lost as to why this occurs. Also, with 19k product updates it is becoming an annoying amount of work, since no bulk update method does the same thing as editing one product at a time.
Any help is much appreciated.
In the end I discovered the answer myself. I thought it might be nice to list it here as well.
In the 'rights' tab I added all the access rights for the user the API uses. This allowed me to read products etc. A very stupid mistake, but somehow I overlooked it at first.
If you'd expect security errors: you won't get any, just empty lists and null responses.
I am using Amazon's Product Advertising API. When I search products by keyword with an ItemSearch operation I get only 10 results; is there any way to get all results in one single call?
No - Paging Through Results explains some of the details:
It is possible to create a request that returns many thousands of items in a response. This is problematic for several reasons. Returning all of the item attributes for those items would dramatically impact the performance of Product Advertising API in a negative way. Also, posting a thousand responses on a web page is impractical.
...
This example shows that 9729 items matched the search criteria. Also, it shows that those results are on 973 (~9729/10) pages. You might try putting in an ItemPage value over 10. If you do, Product Advertising API returns the following error.
...
So, how do you get that 973rd page? You cannot. A better approach is to submit a new request that is more targeted and yields fewer items in the response.
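The arithmetic from the quoted documentation can be sketched like this (the 10-items-per-page default and the ItemPage cap of 10 are what the documentation above describes; the function names are mine):

```python
import math

MAX_ITEM_PAGE = 10  # the API rejects ItemPage values above this

def total_pages(total_results, per_page=10):
    """Number of result pages the search reports (TotalPages)."""
    return math.ceil(total_results / per_page)

def reachable_pages(total_results, per_page=10):
    """Pages you can actually request before hitting the ItemPage cap."""
    return min(total_pages(total_results, per_page), MAX_ITEM_PAGE)
```

For the 9729-item example that is 973 pages reported but only 10 reachable, which is why the documentation's advice is to narrow the search rather than page deeper.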
I have an application where I want to show items that your friends have shared. This is basically a subset of data that would appear on your Facebook News Feed, so I am grabbing /me/home and then filtering out some things that I don't need.
The problem is that /me/home is extremely slow. I'm seeing response times between 1200 and 10000 milliseconds, with an average around 4 seconds.
Even with cached connections and a HTTP library that does SSL correctly these request times do not change much.
Does anyone know a better way to grab the News Feed? When I open Facebook in my browser, the News Feed appears pretty much immediately. So I am wondering if there is some Graph API call that is optimized for this data or has this result cached already.
Is there maybe an FQL alternative for this?
You can do this in FQL. This query should get you started:
SELECT post_id, actor_id, target_id, message, attachment FROM stream WHERE filter_key = 'others'
In the Graph API explorer on my feed, I get ~1000ms response times for the FQL query vs. ~2500ms for me/home.
For Facebook's home page, keep in mind that they use a series of AJAX queries to fill each of the boxes on your page a little at a time. I was on a very slow connection in a hotel last week and watched these fill box by box. The news feed fills first, five posts at a time, followed by the other boxes on the page. If page load performance could be an issue, you may want to move to an asynchronous model.
FQL will definitely help with that, as you'll be able to filter the data before it is returned by FB more finely than you can with just the Graph API.
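For reference, an FQL query like the one above is issued against the Graph API's fql endpoint by URL-encoding it into the q parameter; a sketch of building that request URL (the access token is a placeholder you'd replace with a real user token):

```python
from urllib.parse import urlencode

FQL = ("SELECT post_id, actor_id, target_id, message, attachment "
       "FROM stream WHERE filter_key = 'others'")

def fql_url(query, access_token):
    """Build the Graph API fql request URL for a given query string."""
    # access_token is a placeholder; pass a real token from your app
    return "https://graph.facebook.com/fql?" + urlencode(
        {"q": query, "access_token": access_token})
```

Fetching that URL with your HTTP client of choice returns the rows as JSON, already filtered server-side.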
I have a cronjob that runs every hour and parses 150,000+ records. Each record is summarized individually in MySQL tables. I use two web services to retrieve the user information:
User demographics (IP, country, city, etc.)
Phone information (whether landline or cell phone, and if cell phone, which carrier)
For each record I check whether I already have this information, and if not I call these web services. After tracing my code I found that both of these calls take 2 to 4 seconds, which makes my cronjob very slow, and I can't compile statistics on time.
Is there a way to make these web service faster?
Thanks
Simple: get the data locally and use Melissa Data:
for ip: http://w10.melissadata.com/dqt/websmart/ip-locator.htm
for phone: http://www.melissadata.com/fonedata.html
You can also cache the results using memcache or APC, which will make it faster since you don't have to request the data from the API or the database every time.
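The caching suggestion can be sketched as a small look-aside wrapper; the cache here is a plain dict so the logic runs stand-alone, but the pattern is the same with a memcache or APC client (which use set() rather than item assignment):

```python
def cached_lookup(key, fetch, cache):
    """Return the cached value for key, calling the slow fetch only on a miss."""
    value = cache.get(key)
    if value is None:
        value = fetch(key)   # the slow web-service call, made at most once per key
        cache[key] = value
    return value
```

With 150,000+ records an hour and many repeated IPs and phone numbers, a cache hit replaces a 2-4 second call with a sub-millisecond lookup.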
A couple of ideas: if the same users are returning, caching the data in another table would be very helpful; you would only look each user up once and have the data for returning users. On re-reading the question, it looks like you are already doing that.
Another option is to spawn new threads when you need to do the look-ups. This could be a new thread per request, or, if that is not feasible, n service threads ready to do the look-ups and update the results.
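A sketch of the worker-thread idea using Python's concurrent.futures (the lookup function here is a stand-in for the slow web-service call):

```python
from concurrent.futures import ThreadPoolExecutor

def lookup_all(records, lookup, workers=8):
    """Run the slow per-record look-up on a pool of threads instead of serially.

    lookup is the 2-4 second web-service call; results come back
    in the same order as records."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lookup, records))
```

Since the time is spent waiting on the network rather than on CPU, 8 concurrent workers cut wall-clock time roughly eightfold, up to whatever rate the web services tolerate.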