Better way to get the last modified time for a Google contact using the Google People API

I use the Java Google People API to sync Google contacts with a desktop application. To do that, I need the last modified date/time of each Google contact. So far I have been using the updateTime in the Source field, which I get by iterating over person.metadata.sources.
However, it sometimes returns a date/time that is more recent than when I actually updated the contact, which causes the contact to be synced unnecessarily. I wonder whether Google periodically touches my contacts in the background, which would bump the updateTime. If so, is there a better way to get the "real" update time for a contact?
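For reference, here is roughly what I'm doing today, shown as a Python sketch rather than my actual Java code (the creds object is assumed to be an OAuth credential with a contacts scope):

```python
from googleapiclient.discovery import build

# Sketch only: `creds` is assumed to be an already-obtained OAuth2 credential
# with a contacts scope (e.g. https://www.googleapis.com/auth/contacts.readonly).
service = build("people", "v1", credentials=creds)

response = service.people().connections().list(
    resourceName="people/me",
    personFields="metadata",
    pageSize=200,
).execute()

for person in response.get("connections", []):
    for source in person.get("metadata", {}).get("sources", []):
        # This is the updateTime I read today; it is sometimes newer than my last edit.
        if "updateTime" in source:
            print(person["resourceName"], source.get("type"), source["updateTime"])
```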
Thanks.

Related

What is the cheapest way to send fast website update alerts to users as push notifications?

I'm trying to help an animal shelter deliver faster updates when a new pet is added to their website. This is likely to happen between 0-20 times a day.
The website is a simple data dump: animals are listed in tables with row delineation (easy to parse) and have unique IDs. When a new pet is added, ideally this would trigger a mobile notification to subscribed users (it could also be an email message). The faster updates are sent, the better, but checking every 30 minutes or so would be fine. Because this is for a charity, I want to spend as little as possible on resources (especially because I also want to be able to scale this up for other shelters that might want to use it).
For instant mobile notifications, Twitter seems to be a good candidate. It looks like my needs won't run into fees/restrictions.
The part that I'm stuck on is how best to ping the site for updates and publish those updates to Twitter. The two options I've come up with are:
Build my own system: use a web crawler like Scrapy to periodically crawl the site and check for new pet IDs. Using AWS, I think I could get by with a nano instance (~$57 a year), and using DynamoDB to cache existing pet IDs seems like a small additional cost. I'd then use the Twitter API to post updates (a rough sketch of this is below).
Use an RSS feed generator like Feedity. These seem to be pretty expensive: Feedity is $180/year for hourly updates and $390/year for 15-minute updates. It has an API that integrates with Twitter.
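For what it's worth, the first option would boil down to something like the sketch below (the URL and the ID-extraction pattern are placeholders, post_update is a stub where the Twitter call would go, and a local JSON file stands in for DynamoDB):

```python
import json
import re
import requests

SHELTER_URL = "https://example-shelter.org/pets"  # placeholder URL
SEEN_FILE = "seen_pet_ids.json"                   # local stand-in for the DynamoDB cache


def load_seen():
    try:
        with open(SEEN_FILE) as f:
            return set(json.load(f))
    except FileNotFoundError:
        return set()


def save_seen(ids):
    with open(SEEN_FILE, "w") as f:
        json.dump(sorted(ids), f)


def post_update(pet_id):
    # Stub: this is where the Twitter API call (or an email) would go.
    print("New pet listed:", pet_id)


def check_once():
    html = requests.get(SHELTER_URL, timeout=30).text
    # Placeholder extraction: assumes the table rows carry something like data-pet-id="1234".
    current_ids = set(re.findall(r'data-pet-id="(\d+)"', html))
    seen = load_seen()
    for pet_id in sorted(current_ids - seen):
        post_update(pet_id)
    save_seen(seen | current_ids)


if __name__ == "__main__":
    check_once()  # run from cron every 30 minutes or so
```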
I'd like to know if there are any better/simpler/cheaper/more obvious options I may be overlooking. Thanks!

Integrating Json2HTML with my webstore for real-time data delivery

I'm running a few webstores with PrestaShop for the cart. I've found a really cool service, but I'm trying to figure out how it can work for me.
I want to transform JSON to HTML on my store's backend, within seconds of it arriving, before it reaches the customer, since this is going to be a real-time update service for my descriptions and such.
I can send the script I have to use to pull this data down to anyone who thinks they can help.
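To make it concrete, this is the kind of transformation I'm after, sketched in Python for illustration (a PrestaShop module would really be PHP, and the payload fields are made up):

```python
import json
from jinja2 import Template  # third-party templating library, used here for illustration

# Made-up example payload; the real JSON would come from the feed service.
payload = json.loads('{"name": "Blue T-Shirt", "price": "19.99", "features": ["100% cotton", "Machine washable"]}')

# Turn the JSON into an HTML snippet for the product description.
template = Template("""
<h2>{{ name }}</h2>
<p>Price: ${{ price }}</p>
<ul>
{% for feature in features %}  <li>{{ feature }}</li>
{% endfor %}</ul>
""")

print(template.render(**payload))
```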

Fetch data from a website in real time

OK, basically I'm fetching data from a website using curl and parsing the contents using CkHtmlToText.
My issue is how to fetch the new data that the website writes.
For example, the website contents are as follows:
-test1
-test2
After 1 second, the contents are:
-test1
-test2
-test3
How do I fetch only the next line the website wrote that I haven't got yet, which is "test3"?
Any ideas? Thank you.
The language I'm using is Visual C++.
HTTP requests are stateless. You make a request, you get a result, then you make another completely independent request, you get another result, and so on. If the resource you are trying to access is changing over time, you need to make multiple requests, where each time you will get the full updated resource.
I imagine you may be describing a web page that automatically updates while you are looking at it (like a Twitter feed, for example). In that case, the response contains a script that allows the browser to fetch new data and inject it into the DOM. Unless you also plan to build the DOM and use a JavaScript engine (basically implementing a web browser) to run the script, this is probably not useful to you. Instead, you are better off finding an API that gives you data in a format that is easy to parse and get updates for. If this API is a REST API (built on HTTP), then you will still need to make independent requests to get updates.
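To make the polling idea concrete, here is a minimal sketch (in Python rather than Visual C++, and with a placeholder URL) that re-requests the resource and prints only the lines added since the previous poll:

```python
import time
import requests

URL = "https://example.com/feed"  # placeholder for the page you currently fetch with curl

previous_lines = []
while True:
    # Each request returns the full, current version of the resource.
    current_lines = requests.get(URL, timeout=30).text.splitlines()

    # If the page only ever appends lines, the new content is the suffix beyond
    # what we saw on the previous poll (e.g. "-test3" in the example above).
    if len(current_lines) > len(previous_lines):
        for line in current_lines[len(previous_lines):]:
            print("new line:", line)

    previous_lines = current_lines
    time.sleep(1)  # poll interval
```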

How to update a Fusion Table dynamically from Python

I am working on a health care project. We have a device which continuously generates values for the fields ACTIVITY and FREQUENCY, and these values need to be pushed continuously from Python to a Google Fusion Table.
The question is quite broad; you probably want to have a look at the documentation of the Google Fusion Tables API if you haven't so far: https://developers.google.com/fusiontables/docs/v1/using
Also it may be worth checking the quota section to make sure that Google Fusion Tables is indeed what you want to use:
https://developers.google.com/fusiontables/docs/v1/using#quota
I'll be glad to try to help if you come up with more specific questions :)
EDIT: since there are quite a few questions around the topic, I'll add some "hints".
A (Google Fusion) table belongs to a Google account, so your script must include a step where it asks for your permission to modify data attached to your Google Account. You can think of your script as a web application which needs authorization to achieve its goal. This web application will use the Google Fusion Tables API, so it must be registered in the Google API Console. You will find details about the registration and authentication process with a Python script here:
https://developers.google.com/fusiontables/docs/articles/oauthfusiontables?hl=fr
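In outline, that authorization step looks roughly like this with the oauth2client library (a sketch only; the two file names are placeholders, and client_secrets.json is the file you download from the Google API Console):

```python
import httplib2
from googleapiclient.discovery import build
from oauth2client import tools
from oauth2client.client import flow_from_clientsecrets
from oauth2client.file import Storage

# client_secrets.json is the file you download from the Google API Console after
# registering your application; both file names below are placeholders.
flow = flow_from_clientsecrets(
    "client_secrets.json",
    scope="https://www.googleapis.com/auth/fusiontables",
)

storage = Storage("fusiontables_credentials.dat")
credentials = storage.get()
if credentials is None or credentials.invalid:
    # Opens a browser window asking for permission to edit your Fusion Tables.
    credentials = tools.run_flow(flow, storage, tools.argparser.parse_args([]))

http = credentials.authorize(httplib2.Http())
service = build("fusiontables", "v1", http=http)  # the v1 API referenced above
```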
I just checked that this works and that you can insert rows into a table afterwards, so you may want to have a quick look at my script. Note that you can use neither my application credentials (which, by the way, are not included) nor my table, as you are not authorized to edit it (it's mine!). So you must download your own application credentials from the Google API Console after registering, and adapt the script so it loads them. Also, the script does not create a table (as of now), so as a first step you can create a table with two columns in the UI and copy-paste the table ID into the script so it knows which table to write to. Here's the script (sorry it's a bit of a mess right now, I'll clean it up as soon as I can):
https://github.com/etiennecha/master_code/blob/master/code_fusion_tables/code_test_fusion_tables.py
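Once authorized, inserting a row comes down to a single SQL statement against your table. A minimal sketch, reusing the service object from the authorization sketch above (TABLE_ID is a placeholder for the table ID you copy from the UI, and ACTIVITY/FREQUENCY are the two columns from the question):

```python
# Sketch of the insert step; `service` is the authorized Fusion Tables client
# from the authorization sketch above. TABLE_ID is a placeholder for the table
# ID you copy from the UI; ACTIVITY and FREQUENCY are the two columns from the question.
TABLE_ID = "your-table-id-here"

def insert_row(activity, frequency):
    sql = "INSERT INTO %s (ACTIVITY, FREQUENCY) VALUES ('%s', '%s')" % (
        TABLE_ID, activity, frequency)
    return service.query().sql(sql=sql).execute()

insert_row("walking", "72")
```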
Hope this helps.

Get personal Amazon purchase history and similar titles

I'm writing my own service to track my growing library and notify me when books become available. I'm in the middle of 5 series waiting for the next book to come out. I also pick some up locally and would like to grab similar titles from Amazon. How can I get my purchase history and similar titles? Is there an API for these? I haven't found anything in my searches.
I don't think Amazon exposes an API for order history.
The closest thing seems to be the product advertising API: http://docs.aws.amazon.com/AWSECommerceService/latest/DG/Welcome.html
That would allow you to search for items, for example using ItemSearch:
http://docs.aws.amazon.com/AWSECommerceService/latest/DG/ItemSearch.html
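For example, with the third-party bottlenose wrapper in Python, an ItemSearch call looks roughly like this (an untested sketch; the credentials are placeholders you get when signing up for the Product Advertising API):

```python
import xml.etree.ElementTree as ET

import bottlenose  # third-party wrapper around the Product Advertising API

# Placeholder credentials: you get these by signing up for the Product
# Advertising API through the Amazon Associates programme.
amazon = bottlenose.Amazon("AWS_ACCESS_KEY", "AWS_SECRET_KEY", "ASSOCIATE_TAG")

# Search the Books index for a keyword (the author name is just an example).
response = amazon.ItemSearch(SearchIndex="Books", Keywords="Brandon Sanderson")

# The response is raw XML; pull out the item titles without worrying about the namespace.
root = ET.fromstring(response)
for element in root.iter():
    if element.tag.endswith("Title"):
        print(element.text)
```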
Alternatively, you could probably write a script to scrape the data by navigating through the order history pages, or one to help you capture each page of results as you manually navigate your order history. You're on your own for this option, though.