I'm currently building a web app for a UK company with many outlets. I want to implement a 'find my nearest' feature based on the following:
Postcode
Landmarks
So the user could enter either to get a list of their nearest outlets. I've done this before by storing postcode data in a database and then using Pythagoras' theorem to find the nearest ones.
Ideally I would like to use a web service for this, but I can't seem to find any at all.
My question is: what would be the best way to implement such a service? A third-party service, or doing it myself?
There are various options listed very sensibly, with UK-specific notes, here.
You should externalize the geocoding (from postcode to coordinates) and then intersect the store locations with the coordinates of the postcode or location to get the nearest one. There is another example here.
The reason to externalize the geocoding is to relieve you from the need to update such a database continuously which might not even be feasible given your constraints.
Take a look at the Geokit gem, for ideas if not for implementation.
Also, in the UK you can get quite markedly incorrect answers if you use Pythagoras. You really want Haversine, and someone else has almost certainly done the hard work no matter what platform you're on.
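To illustrate, here is a minimal Python sketch of the Haversine distance plus a nearest-outlets lookup. The outlet tuple layout and the `n=5` cutoff are invented for the example:

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in kilometres."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

def nearest_outlets(origin, outlets, n=5):
    """Return the n outlets closest to origin; each outlet is (name, lat, lon)."""
    return sorted(
        outlets,
        key=lambda o: haversine_km(origin[0], origin[1], o[1], o[2]),
    )[:n]
```

Over UK-scale distances the difference from a flat Pythagorean calculation is large enough to reorder results, which is why the Haversine form is worth the few extra lines.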
Related
I've been working with a logistics company and using Here services for Geocoding, Routing, and Telematics. We've been facing a couple of issues regarding the precision of the geocoding API when compared to Google results.
Here are some examples:
1) Returning wrong address even though the information is complete
That's a use case from Florianópolis (the capital of Santa Catarina), on a very well-known street.
If I try to geocode the following address:
Rua João Pio Duarte Silva, 526
It returns an address that is almost 500 m away from the requested street number. This happens in the API, and also in Here Maps, which led me to think the precision in Brazil is not trustworthy. That's just one scenario, but we've faced several similar situations.
Here comes my first question, what's the expected precision of the Here Geocoding API in South Brazil?
2) Effect of leading zeros in the address
Some of the services we use to obtain the address return leading zeros in the house number. We thought it wouldn't be an issue until we faced the following scenario:
The only thing that changed in the request is the house number, from 59 to 00059.
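As a defensive workaround (a sketch, not part of any Here SDK), the leading zeros can be stripped from the house number before the request is built:

```python
def normalize_house_number(number: str) -> str:
    """Strip leading zeros from a house number so '00059' becomes '59'.

    Keeps a single '0' if the field is all zeros, and leaves
    non-numeric values (e.g. '59A', 'S/N') untouched.
    """
    stripped = number.strip()
    if stripped.isdigit():
        return stripped.lstrip("0") or "0"
    return stripped
```

This doesn't fix the underlying geocoder behaviour, but it keeps upstream data quirks from changing the result.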
Here's the difference when displaying the coordinates returned by the Here Geocoding API for the cases above on Google Maps:
SUMMARY
I've been using the Here Geocoding API for a while, and I feel it's not meeting our expectations, as we require a very precise service so our drivers can be more productive and less exposed to errors. Is there a known issue with geocoding in Brazil, especially in the South? How do the results compare to Google's? Is there anything we could do to overcome the issues above (especially #1)?
Thanks in advance.
Could you please try using the mapview parameter if you are querying specifically within Brazil? Attaching one example reference:
developer.here.com/documentation/examples/rest/geocoder/latitude-longitude-by-mapview-parameter
For coverage details, refer to:
developer.here.com/documentation/geocoder/dev_guide/topics/coverage-geocoder.html
Thanks for reporting the discrepancy. The map, however, gets loaded with refreshed data. Please use one of the following:
1) Map Creator (locate the POI and make the change; the Here team will review and approve it)
2) The Map Feedback API:
developer.here.com/documentation/map-feedback/dev_guide/topics/what-is.html
Why doesn't geocoding allow me to create markers for more than 11 addresses? I have hundreds of addresses in a database, but no lat/long information. I need to mark all these addresses on a map, yet somehow only the first 11 markers are displayed.
I know this question has been asked before, and the suggested solution is to set an interval between markers. I was able to display all of them by using a time interval between the markings, but this solution is obviously too slow. Is there a better solution now?
Regards
Your question isn't very clear to me, but I understand that you are trying to show address locations on a map without knowing their coordinates. Using Google Maps, for example, you don't actually need latitude/longitude. But do you know the addresses are correct? Or, if you aren't using Google Maps but have a different use case entirely, then perhaps you do need the coordinates.
I work for SmartyStreets where we perform both of these services (verifying addresses and geocoding them, meaning supplying lat/lon information).
Getting lat/lon can be tricky, especially considering that addresses are often so different and anything but "normalized" or standardized. Google and similar services approximate addresses but do not verify them, and their lat/lon is sometimes equally a best-guess.
Judging from your question, it seems like something like the LiveAddress API would suit you well -- and it's free for low volume use. It's quite accurate, but in cases where it's "really" far off (meaning, a city block usually; still not too bad), it does return the level of resolution for each query.
If you have further questions or clarifications, feel free to respond (I'll help you out).
Geocoding has some limitations on converting addresses into lat/long; once you exceed the quota, requests fail with an OVER_QUERY_LIMIT status.
Client-side geocoding has a limit of about 20 queries per minute (or per second); server-side geocoding also has limits, but only after 2,500 queries.
I have worked on this issue, and I used tips based on this solution with PHP/JavaScript and AJAX:
http://webstutorial.com/google-server-side-geocoding-php-infobox/website-tweaks/google#comment-429147758
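A sketch of that throttling pattern in Python: pace the requests and back off when the service reports OVER_QUERY_LIMIT. The response shape (`status` and `location` keys) is an assumption for the sketch; adapt it to the actual geocoder you call:

```python
import time

def geocode_all(addresses, geocode, delay=0.2, max_retries=5):
    """Geocode a list of addresses, pausing between requests and backing
    off exponentially when the service reports OVER_QUERY_LIMIT.

    `geocode(address)` is assumed to return a dict with a "status" key
    and, on success, a "location" key.
    """
    results = {}
    for address in addresses:
        wait = delay
        for _ in range(max_retries):
            response = geocode(address)
            if response["status"] == "OVER_QUERY_LIMIT":
                time.sleep(wait)   # back off and retry
                wait *= 2
                continue
            if response["status"] == "OK":
                results[address] = response["location"]
            break                  # success or a non-retryable error
        time.sleep(delay)          # pace requests to stay under the limit
    return results
```

The fixed inter-request delay keeps you under the per-second cap, while the exponential backoff absorbs occasional quota rejections without failing the whole batch.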
I am looking for a geocoding service where I can make a request with an address or intersection, not necessarily split into separate fields (street, city, state, etc.), and get the latitude and longitude, along with suggestions and corrections for misspelled or ambiguous queries.
I really like the Google Geocoding API, but the terms of use say that I am not allowed to store the responses or use the service for any purpose other than showing the result on one of their maps. I am planning to use it for a lightweight, mobile-friendly website that may have the option of displaying results with text only, so this would not work, assuming I am interpreting their terms correctly.
The Yahoo PlaceFinder API looks nice but it comes with similar restrictions.
I am trying to decide what would be a good choice. The Bing API looks good. I don't see any sort of restriction in their terms but am I missing something?
Does anyone know what would be a good choice? I have very limited funding, so I would prefer something that is free or cheap, at least for the near future.
You could try Nominatim; it's a tool to search OpenStreetMap data by name and address.
MapQuest provides a free API as long as you give appropriate credit.
I'm not sure how well it handles misspellings or ambiguous queries, though!
I'm really hoping there's an existing service for something like this. I have a location (could be GPS coordinates or a street address, I can use geocoding or reverse geocoding services to switch between them) and I want to find a business that's listed as being approximately at that place.
If this service doesn't already exist, I'm thinking the best way to do what I want is to get a list of businesses close to a location, go through those and single out the closest one to the point I want, and say I'm "in" it if the distance is less than such and such.
If you have some pointers for which services I should look into (for either pinpointing one business or getting a list proximate to a location) or you think my methodology is stupid, please let me know!
Edit: it looks like Yahoo Local Search can pretty much do what I want. I'm going to start tinkering with that.
Google Maps doesn't offer this yet. It does reverse geocoding from a lat/long to an address, but not to a business or point of interest.
I'm looking this up myself to see who offers this but the two I know of so far are GeoAPI (recently purchased by twitter) and SimpleGeo.
What you're looking for is Google Places which also allows you to specify the business type as well.
This is just a hunch, but have you checked out the Google Maps API?
I'm working on building intelligence around link propagation, and because I need to deal with many short URL services where a reverse-lookup from an exact URL address is required, I need to be able to resolve multiple approximate versions of the same URL.
An example would be a URL like http://www.example.com?ref=affil&hl=en&ct=0
Of course, changing GET params in certain circumstances can refer to a completely different page, especially if the GET params in question refer to a profile or content ID.
But a quick parse of the page would determine how similar the pages are to each other. Using a bit of machine learning, it could quickly become clear which GET params don't affect the content of the pages returned for a given site.
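Once you know (or have learned) which parameters are irrelevant for a given site, collapsing near-duplicate URLs is straightforward. Here is a sketch using only Python's standard library; the per-site `irrelevant_params` set is assumed to come from that learning step:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def canonicalize(url, irrelevant_params):
    """Drop query parameters known not to affect page content, so that
    near-duplicate URLs collapse to a single canonical key."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k not in irrelevant_params]
    kept.sort()  # make the key insensitive to parameter order
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), ""))
```

Two URLs that differ only in stripped or reordered parameters then map to the same string, which can serve as a lookup key for the reverse-resolution index.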
I'm assuming a service to send a URL and get a list of very similar URLs could only be offered by the likes of Google or Yahoo (or Twitter), but they don't seem to offer this feature, and I haven't found any other services that do.
If you know of any services that do cluster together groups of almost identical URLs in the aforementioned way, please let me know.
My bounty is a hug.
Every URL is akin to an "address" for a location of data on the internet. The "host" part of the URL (in your example, "www.example.com") is a web-server, or a set of web-servers, somewhere in the world. If we think of a URL as an "address", then the host could be a "country".
The country itself might keep track of every piece of mail that enters it. Some do, some don't. I'm talking about web-servers! Of course real countries don't make note of every piece of mail you get! :-)
But even if that "country" keeps track of every piece of mail - I really doubt they have any mechanism in place to send that list to you.
As for organizations that might do that harvesting themselves, I think the best bet would be Google, but even there the situation is rather grim. You see, because Google isn't the owner of every web-server ("country") in the world, they cannot know of every URL that accesses that web-server.
But they can do the reverse. Since they can index every page they encounter, they can get a pretty good idea of every URL that appears in public HTML pages on the web. Of course, this won't include URLs people send to each other in chats, SMSs, or e-mails. But still, they can get a pretty good idea of what URLs exist.
I guess what I'm trying to say is that what you're looking for doesn't exist, really. The only way you can get all the URLs used to access a single website, is to be owner of that website.
Sorry, mate.
It sounds like you need to create some sort of discrete similarity rank between pages. This could be done by finding the number of words two pages have in common, normalizing the value to a bounded range, and then mapping portions of the range to different similarity ranks.
You would also need to know for each pair that you compare what GET parameters they had in common or how close they were. This information would become the attributes that define each of your instances (stored along side the rank mentioned above). After you have amassed a few hundred pairs of comparisons you could perhaps do some feature subset selection to identify the GET parameters that most identify how similar two pages are.
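A minimal sketch of that ranking idea: Jaccard word overlap between two pages, mapped into discrete rank buckets. The bin edges here are arbitrary placeholders that would need tuning on real data:

```python
def similarity_rank(text_a, text_b, bins=(0.2, 0.5, 0.8)):
    """Jaccard word overlap between two pages, mapped to a discrete rank.

    Returns 0 (dissimilar) up to len(bins) (near-duplicate).
    """
    words_a, words_b = set(text_a.split()), set(text_b.split())
    union = words_a | words_b
    score = len(words_a & words_b) / len(union) if union else 1.0
    return sum(score >= edge for edge in bins)
```

The resulting rank, stored alongside the shared GET parameters for each pair, gives you the labelled instances needed for the feature-selection step.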
Of course, this could end up not finding anything useful at all as this dataset is likely to contain a great deal of noise.
If you are interested in this approach, you should look into Infogain and feature subset selection in general. This is a link to my professor's lecture notes, which may come in handy. http://stuff.ttoy.net/cs591o/FSS.html