According to Nominatim and MapQuest, the following end-points should provide the same data:
Nominatim: http://nominatim.openstreetmap.org/reverse
MapQuest: http://open.mapquestapi.com/nominatim/v1/reverse
In fact, MapQuest's documentation states:
The Nominatim Search Service is similar to our MapQuest Search Service with its simple interface and powerful capabilities, but instead relies solely on data contributed to OpenStreetMap.
Therefore I would expect the two services to return the same data; nevertheless, the following queries for the same coordinates (41.904163, 12.485597) yield different results:
Nominatim: http://nominatim.openstreetmap.org/reverse?lat=41.904163&lon=12.485597&addressdetails=1&format=xml&zoom=18
MapQuest: http://open.mapquestapi.com/nominatim/v1/reverse.php?lat=41.904163&lon=12.485597&addressdetails=1&format=xml&zoom=18
Results:
Nominatim: Via Francesco Crispi
MapQuest: Via Gregoriana
Why?
NOTE: as of 12:44 PM UTC (July 1st, 2013), OSM is returning the same results as MapQuest. There are a couple of related discussions on GitHub:
Issue 66
Issue 67
This is a bug in Nominatim, which is going to be fixed:
Nominatim was no longer searching for objects below street level (house numbers, POIs, etc.) due to this commit. It only concerned the instance on osm.org.
See issue #66 on GitHub: https://github.com/twain47/Nominatim/issues/66
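If you want to re-run the comparison yourself, here is a minimal sketch (mine, not from the original posts) that queries both reverse endpoints with the Python requests library. Note that Nominatim's usage policy asks for an identifying User-Agent, and the MapQuest endpoint may now require an API key.

import requests

params = {
    "lat": 41.904163,
    "lon": 12.485597,
    "addressdetails": 1,
    "format": "json",  # JSON is easier to inspect programmatically than XML
    "zoom": 18,
}
endpoints = {
    "Nominatim": "https://nominatim.openstreetmap.org/reverse",
    "MapQuest": "https://open.mapquestapi.com/nominatim/v1/reverse.php",
}
for name, url in endpoints.items():
    # Nominatim's usage policy asks for a descriptive User-Agent header.
    r = requests.get(url, params=params,
                     headers={"User-Agent": "reverse-geocode-comparison"})
    r.raise_for_status()
    print(name, "->", r.json().get("address", {}).get("road"))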
I am told that the following list of "puppy" image URLs is from ImageNet.
https://github.com/asharov/cute-animal-detector/blob/master/data/puppy-urls.txt
How do I download the URLs for another category, e.g. "cats"?
Where can I get the entire list of ImageNet categories, along with their explanations, in CSV?
Unfortunately, ImageNet is no longer as easily accessible as it previously was. You now have to create a free account and then request access to the database using an email address that demonstrates your status as a non-commercial researcher. The following is an excerpt of the announcement posted on March 11, 2021 (it does not specifically address the requirements to obtain an account and request access permission, but it explains some of the reasons for changing the website generally):
We are proud to see ImageNet's wide adoption going beyond what was originally envisioned. However, the decade-old website was burdened by growing download requests. To serve the community better, we have redesigned the website and upgraded its hardware. The new website is simpler; we removed tangential or outdated functions to focus on the core use case—enabling users to download the data, including the full ImageNet dataset and the ImageNet Large Scale Visual Recognition Challenge (ILSVRC).
ORIGINAL ANSWER (LINKS NO LONGER VALID):
You can interactively explore the available synsets (categories) at http://www.image-net.org/explore; each synset page has a "Downloads" tab where you can download the category's image URLs.
Alternatively, you can use the ImageNet API. You can download image URLs for a particular synset using the synset id or wnid. The image URL download link below uses the wnid n02121808 for domestic cat, house cat, Felis domesticus, Felis catus.
http://www.image-net.org/api/text/imagenet.synset.geturls?wnid=n02121808
You can find the wnid for a particular synset using the explore link above (the id for a selected synset will be displayed in the browser address bar).
You can retrieve a list of all available synsets (by id) from:
http://www.image-net.org/api/text/imagenet.synset.obtain_synset_list.
You can retrieve the words associated with any synset id as follows (another cat example).
http://www.image-net.org/api/text/wordnet.synset.getwords?wnid=n02121808
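Putting the pieces above together, these text endpoints were typically scripted along the following lines. This is a minimal, illustrative sketch only (the links no longer work); the "cats" folder name and the 50-image cap are just for the example.

import os
import requests

wnid = "n02121808"  # domestic cat, house cat, Felis domesticus, Felis catus
list_url = f"http://www.image-net.org/api/text/imagenet.synset.geturls?wnid={wnid}"

resp = requests.get(list_url, timeout=30)
resp.raise_for_status()
urls = [line.strip() for line in resp.text.splitlines() if line.strip()]

os.makedirs("cats", exist_ok=True)
for i, url in enumerate(urls[:50]):  # just the first 50 for the example
    try:
        img = requests.get(url, timeout=10)
        img.raise_for_status()
        with open(os.path.join("cats", f"{i:05d}.jpg"), "wb") as f:
            f.write(img.content)
    except requests.RequestException:
        continue  # many of the contributed URLs are dead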
Alternatively, you can download a smaller version of ImageNet, mini-ImageNet:
https://github.com/yaoyao-liu/mini-imagenet-tools
See also: https://github.com/dragen1860/LearningToCompare-Pytorch/issues/4
See also: https://github.com/twitter/meta-learning-lstm/tree/master/data/miniImagenet
You can easily use the Python package MLclf to download and transform the mini-ImageNet data for a traditional image classification task or a meta-learning task. Just use:
pip install MLclf
For more details, see:
https://pypi.org/project/MLclf/
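A minimal sketch of the MLclf workflow follows, assuming the download and conversion helpers described on that PyPI page. The function names and arguments below are taken from the project's README as best I recall them, so verify them against the link above before relying on this.

from MLclf import MLclf

# Assumed helper from the MLclf README: downloads the mini-ImageNet archive.
MLclf.miniimagenet_download(Download=True)

# Assumed helper from the MLclf README: splits the data into train/val/test
# sets for a traditional image-classification task.
train_ds, val_ds, test_ds = MLclf.miniimagenet_clf_dataset(
    ratio_train=0.6, ratio_val=0.2, shuffle=True, save_clf_data=True
)
print(len(train_ds), len(val_ds), len(test_ds))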
I'm taking my first steps with the Google Places API and am currently experimenting with different types. I was wondering which type I have to use if I want to search for tram/cable car/light-rail stations.
What I want is to get a list of subway, bus, and tram stations within a defined radius of an arbitrary coordinate.
Subway and bus seem to be easy (types=subway_station or types=bus_station), but there does not seem to be an equivalent for trams.
Just for experimenting:
Search for the tram station "Agnes-Bernauer-Platz" in Munich (coordinates: 48.1398418,11.496119; a good example because there are no subway or bus stations in the direct vicinity). If you browse Google Maps interactively, the station is found (with a "tram" icon), but the Places API does not find it:
https://maps.googleapis.com/maps/api/place/nearbysearch/xml?location=48.1398418,11.496119&radius=100&key=....
Any ideas?
Thanks in advance!
Update:
types=light_rail_station
OK, it seems there is already a type that is not yet documented at developers.google.com/places/supported_types: types=light_rail_station does the job.
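For example, here is a minimal sketch (mine, not from the original answer) of a Nearby Search request with that type, using the coordinates from the question. YOUR_API_KEY is a placeholder, and current versions of the Places API take the singular type parameter.

import requests

resp = requests.get(
    "https://maps.googleapis.com/maps/api/place/nearbysearch/json",
    params={
        "location": "48.1398418,11.496119",  # Agnes-Bernauer-Platz, Munich
        "radius": 100,
        "type": "light_rail_station",        # older examples used "types"
        "key": "YOUR_API_KEY",               # placeholder
    },
)
resp.raise_for_status()
for place in resp.json().get("results", []):
    print(place["name"], place["geometry"]["location"])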
Why is it that certain common addresses won't resolve using the API, but they show up fine in Google Maps?
1400 Welton Street, Denver, CO
http://maps.googleapis.com/maps/api/geocode/json?sensor=false&address=1400+Welton+Street,+Denver,+CO
Google Places doesn't find it either. This is a common location, which seems strange.
Seems like an error in Google's map data. When you put that address into Google Maps, it does zoom to the general location, but you'll notice that it does not show an exact point on the map. If you play with the street numbers, it looks like other addresses around it (1399 and 1401, for example) do in fact return relevant results from the API call and show a dot in Google Maps.
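To illustrate, here is a minimal sketch (not part of the original answer) that checks the neighbouring street numbers through the Geocoding API. YOUR_API_KEY is a placeholder; current versions of the API expect a key parameter rather than sensor.

import requests

for number in (1399, 1400, 1401):
    resp = requests.get(
        "https://maps.googleapis.com/maps/api/geocode/json",
        params={"address": f"{number} Welton Street, Denver, CO",
                "key": "YOUR_API_KEY"},  # placeholder
    )
    resp.raise_for_status()
    results = resp.json().get("results", [])
    # location_type is e.g. ROOFTOP or RANGE_INTERPOLATED when a match is found
    precision = results[0]["geometry"]["location_type"] if results else "NO RESULT"
    print(number, precision)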
Bing Maps says there's a "Pi Kitchen and Bar" at this location, so I assume it's a real location and this is just wrong map data.
I suggest reporting it to Google so they can fix it: https://support.google.com/maps/answer/3094088
I'm familiar with the geonames API, whose findNearbyPlaceName resource takes an optional radius parameter and can return multiple places for a given lat/lng pair.
When doing a similar lookup in YDN PlaceFinder, using the R gflag to do a reverse lookup from a lat/lng, will there ever be more than one result returned? I have not found any instances yet, and from what I can tell, the PlaceFinder API doesn't allow for a radius (though you can request an offset from the street).
Yes, the PlaceFinder service will only return a single location. (In the case of an error or location not found, you would get 0 results.) It's designed to go between a place description and location (e.g., street address to lat/long).
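For reference, a reverse lookup against PlaceFinder looked roughly like the sketch below. The Yahoo service has since been retired, so this is illustrative only; the endpoint, parameter names, and response fields are from the old documentation as best I recall, and YOUR_APP_ID is a placeholder.

import requests

resp = requests.get(
    "http://where.yahooapis.com/geocode",
    params={
        "q": "40.7141667,-74.0063889",  # any lat,lng pair
        "gflags": "R",                  # R = reverse geocode from a lat/lng
        "flags": "J",                   # J = JSON output
        "appid": "YOUR_APP_ID",         # placeholder
    },
)
result_set = resp.json().get("ResultSet", {})
results = result_set.get("Results") or [{}]
# PlaceFinder returned at most one location for a reverse lookup.
print(result_set.get("Found"), results[0].get("line1"))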
I'm not sure what your needs are, but you could also check out the Yahoo GeoPlanet API which provides more hierarchical data around the point of interest. PlaceFinder and GeoPlanet can be used together in this way.
Here are a couple of hack projects that help demonstrate each of these APIs:
Placefinder Explorer
GeoPlanet Explorer
Once, there was this awesome SNIPPETS library of C (and C++) code. I used it from 1995 on, but its origin is much older and it was updated until at least 2007. From time to time, I found that I needed a piece of code not from a library, but something I could cut and paste into my own projects.
To my horror, it's now gone. There are traces of it, dead links on SO for instance.
Can anybody find me a mirror or an archive of it? I had an early version on my hard drive for years, but not any more.
Unfortunately, Bob Stout (the guy who put it together and was ultimately its sole maintainer) died in February, and when the registration on the site's domain expired, it appears that nobody renewed it. As far as I know, the site worked up until then, so it may still be there on the servers, just with no registration to make the name visible. If you could find a way to get hold of him, Jon Guthrie would probably be the person who could get it up and running again (IIRC, he was largely responsible for putting it up on the web in the first place).
I believe all the "released" versions of Snippets are available from IFDC FileGate in the PDNCEE area. If there's enough interest, I could probably sort out the code that's still reasonably interesting from the basically obsolete (purely MS-DOS), and put it up on GitHub or SourceForge or some such.
Other mirrors of the (1997) version of Snippets:
http://www8.cs.umu.se/~isak/snippets/
http://www.brokersys.com/snippets/
TL;DR
Full GitHub mirror of the code (edited to point to the GitHub site, since the first mirror died).
Since your ServerFault question was off topic, I thought I'd post this here.
*puts on detective hat*
Well, DNSHistory.org reports that the domain snippets.org used to point to 206.251.38.37 up until 2011-04-02 (when did the domain go?).
Using curl to send the 'Host' header to that server:
[samr@ocelot ~]$ curl -I -H "Host:snippets.org" http://206.251.38.37/
HTTP/1.1 200 OK
Date: Thu, 24 Nov 2011 15:12:16 GMT
Server: Apache/2.2.9 (Debian) PHP/4.4.4-8+etch6 mod_ssl/2.2.9 OpenSSL/0.9.8g
X-Powered-By: PHP/4.4.4-8+etch6
Content-Type: text/html
gives us a response. Next step: what does that page look like?
Well, just getting the HTML and opening it in lynx:
[samr@ocelot ~]$ curl -H "Host:snippets.org" http://206.251.38.37/ > snippets.org.html
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 10319 0 10319 0 0 29500 0 --:--:-- --:--:-- --:--:-- 52583
[samr@ocelot ~]$ lynx snippets.org.html
Gives the wonderful response of:
SNIPPETS sponsors [MicroFirm.gif] Firmware, system, and sensors
development [Sponsor.jpg] Click here to learn more [dmlogo.gif]
"Free C/C++ Compilers" [188x44_bss14.jpg] "High-Performance Version
Control"
[logo.gif]
Free educational resources on the Internet
______________________________________________________________________________________________________________________
Thursday, 24-Nov-2011, 15:13:22 GMT
Last modified: 01-Apr-2007, 05:50:42 GMT
395594 visitors since 15-Jul-2002
__________________________________________________________________________
Mission:
SNIPPETS.ORG is dedicated to providing free informational and
educational resources on the World Wide Web. Currently, the two
principle topics are programming and do-it-yourself (DIY) audio.
The fields covered by SNIPPETS.ORG are generally technology and arts.
As soon as enough free material is collected, new sections will be
added related to photography and digital imaging.
The one common factor in everything you'll find on this site is
that it's all free. Programming source code is free. Tools and
utilities are free. And, of course, information is always free.
While SNIPPETS.ORG provides many links to commercial sites, it is a
not-for-profit operation - nothing here is for sale! If you wish
to contribute content, information, or entire web sites to
SNIPPETS.ORG, please contact me.
[snip]
So to answer your question, the domain used to point to '206.251.38.37', and the site (appears to) still exist.
Next thing: mirroring. The wget tool provides a --mirror flag to recursively download a website to a directory, which looks to be just what we're after.
I started creating a mirror on my home server; here's the command I'm using:
wget --header="Host:snippets.org" --mirror -p --convert-links -P ./snippets.org/ http://206.251.38.37/
Then I extracted the files from the code directories and uploaded them to my mirror site http://mirror.rmg.io/snippets.org/
embedded.snippets.org reports as 'down for maintenance' so couldn't be spidered.
The link you point to is not dead; it's alive at archive.org: http://web.archive.org/web/20080217222203/http://c.snippets.org/
Now, the latest copy of the SNIPPETS archive, as found on archive.org, lives on at GitHub.