I am looking for a GeoIP database (similar to MaxMind GeoLite2 Country and City) that will allow me to identify the US state a user is coming from, in order to target specific content to that user.
Does anyone know how or where I could find such a database/service or solution?
Don't expect high accuracy unless you're satisfied with country- or city-level precision. It is, after all, IP-based geolocation data, and the accuracy varies; it depends on the ISPs and the companies that manage the databases (commercial databases may be more accurate). Look at an IP location web tool (like http://geoipinfo.org/) and you'll see approximately where it places you; it also reports percentage accuracy at the city and country levels. That tool uses the ip2location database for its lookups and precision data.
This thread is old, but as of today I'm using http://api.ipstack.com and it's working perfectly. They have very extensive help and examples on their site, but basically you make the call, parse the response, and take what you want.
First, be sure you have the necessary using directives (System.Xml, System.Net, etc.).
Second, be sure not to test with a private network IP (192.168.x.x or 10.x.x.x), because that will always return blank/empty fields and you'll think something is coded wrong.
Third, you will need an Access Key from ipstack.com. You can set up a FREE account (10,000 requests a month, I think) and get an access key to plug into the API call string below. I filled out the form and was up and running in 10 minutes, for free.
This worked for me to track visitors to any page:
string IP = "";
string strHostInfo = "";
string strMyAccessKeyForIPStack = "THEYGIVEYOUTHISWHENYOUSETUPFREEACCOUNT";
// Resolve an address to look up. Note that AddressList can contain
// IPv6 entries and multiple IPv4 entries, so indexing into it is
// fragile; adjust the index to pick the right address on your machine.
string strHostName = System.Net.Dns.GetHostName();
IPHostEntry ipEntry = System.Net.Dns.GetHostEntry(strHostName);
IPAddress[] addr = ipEntry.AddressList;
IP = addr[2].ToString();
// Build the request URL. The access key has to be concatenated in,
// not embedded inside the string literal.
string strMyIPToLocate = "http://api.ipstack.com/" + IP + "?access_key=" + strMyAccessKeyForIPStack + "&output=xml";
XmlDocument doc = new XmlDocument();
doc.Load(strMyIPToLocate);
XmlNodeList nodeLstCity = doc.GetElementsByTagName("city");
XmlNodeList nodeLstState = doc.GetElementsByTagName("region_name");
XmlNodeList nodeLstZIP = doc.GetElementsByTagName("zip");
XmlNodeList nodeLstLAT = doc.GetElementsByTagName("latitude");
XmlNodeList nodeLstLON = doc.GetElementsByTagName("longitude");
strHostInfo = "IP is from " + nodeLstCity[0].InnerText + ", " + nodeLstState[0].InnerText + " (" + nodeLstZIP[0].InnerText + ")";
// Then do what you want with strHostInfo; I put it in a DB myself, but whatever.
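One caveat: Dns.GetHostName() resolves the machine the code runs on, so the lookup above locates the server rather than the visitor. If this runs under ASP.NET and you want the visitor's address, something along these lines may be closer (a minimal sketch, assuming System.Web is available; the X-Forwarded-For handling depends on your proxy setup):
// Sketch: get the visitor's IP under ASP.NET (assumes System.Web).
// Behind a proxy or load balancer the client IP often arrives in the
// X-Forwarded-For header instead of UserHostAddress.
string visitorIP = HttpContext.Current.Request.Headers["X-Forwarded-For"]
    ?? HttpContext.Current.Request.UserHostAddress;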
Hello all data pipeline experts!
Currently, I'm about to set up data ingestion from an MQTT source. All my MQTT topics contain float values, except a few from RFID scanners that contain UUIDs, which should be read in as strings. The RFID topics have "RFID" in their topic name; specifically, they are of the format "/+/+/+/+/RFID".
I would like to convert all topics EXCEPT the RFID topics to float and store them in the InfluxDB measurement "mqtt_data". The RFID topics should be stored as strings in the measurement "mqtt_string".
Yesterday I fiddled around a lot with processors and got no results other than a headache. Today I had a first success:
[[outputs.influxdb_v2]]
  urls = ["http://localhost:8086"]
  organization = "xy"
  bucket = "bucket"
  token = "ExJWOb5lPdoYPrJnB8cPIUgSonQ9zutjwZ6W3zDRkx1pY0m40Q_TidPrqkKeBTt2D0_jTyHopM6LmMPJLmzAfg=="

[[inputs.mqtt_consumer]]
  servers = ["tcp://127.0.0.1:1883"]
  qos = 0
  connection_timeout = "30s"
  name_override = "mqtt_data"
  ## Topics to subscribe to
  topics = [
    "+",
    "+/+",
    "+/+/+",
    "+/+/+/+",
    "+/+/+/+/+/+",
    "+/+/+/+/+/+/+",
    "+/+/+/+/+/+/+/+",
    "+/+/+/+/+/+/+/+/+",
  ]
  data_format = "value"
  data_type = "float"

[[inputs.mqtt_consumer]]
  servers = ["tcp://127.0.0.1:1883"]
  qos = 0
  connection_timeout = "30s"
  name_override = "mqtt_string"
  topics = ["+/+/+/+/RFID"]
  data_format = "value"
  data_type = "string"
As you can see, in the first mqtt_consumer I left out the five-level wildcard, so the RFID topics aren't consumed twice, but every other topic with five hierarchy levels is missed as well. And listing every possible number of hierarchy levels isn't nice either.
My question would be:
Is there a way to formulate a regex that negates the second mqtt_consumer block, i.e. selects all topics that are NOT of the form "+/+/+/+/RFID"? Or is there a completely different, more elegant approach that I'm not aware of?
Although I have worked with regexes before, I got stuck at this point. Thanks for any hints!
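One direction that might be worth exploring (a sketch, not a verified solution): Telegraf's metric filtering supports tagdrop with glob patterns, and the mqtt_consumer input tags each metric with the topic it arrived on, so the float consumer could subscribe to everything and drop the RFID metrics. One caveat: the value parser would still try to parse RFID payloads as floats before the filter applies, so this may trade the topic-list problem for parse errors in the log:
# Sketch: subscribe to all topic depths and drop the RFID topics.
# Assumes each metric carries its MQTT topic in the "topic" tag and
# that tagdrop glob matching applies to that tag.
[[inputs.mqtt_consumer]]
  servers = ["tcp://127.0.0.1:1883"]
  name_override = "mqtt_data"
  topics = ["#"]              # MQTT multi-level wildcard: every topic
  data_format = "value"
  data_type = "float"
  [inputs.mqtt_consumer.tagdrop]
    topic = ["*/RFID"]        # drop anything ending in /RFID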
I have a set of documents, each of which has a server name plus the start and end timestamps for that server, e.g.:
[
{
serverName: "Houston",
startTimestamp: "2018/03/07 17:52:13 +000",
endTimestamp: "2018/03/07 18:50:10 +000"
},
{
serverName: "Canberra",
startTimestamp: "2018/03/07 18:48:09 +000",
endTimestamp: "2018/03/07 20:10:00 +000"
},
{
serverName: "Melbourne",
startTimestamp: "2018/03/08 01:43:13 +000",
endTimestamp: "2018/03/08 12:09:10 +000"
}
]
With this data, given a timestamp, I need to get the list of servers that were active at that point in time.
For example, for TS="2018/03/07 18:50:00 +000", the list of active servers from the above data is ["Houston", "Canberra"].
Is it possible to achieve this using only CouchDB views? If so, how do I go about it?
Note: Initially I tried the following approach. In the map function I emit two rows per document:
one with key=doc.startTimestamp and value={"station_add": doc.serverName}
one with key=doc.endTimestamp and value={"station_rem": doc.serverName}
My intention was to iterate through these in the reduce function, adding the stations present in "station_add" and removing those in "station_rem". But I found that CouchDB does not guarantee anything about the ordering of the values passed to the reduce function.
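For reference, the attempted map function would look something like this (a sketch reconstructed from the description above, using the serverName field from the example documents):
// Sketch of the attempted approach: one row at the start timestamp,
// one at the end timestamp, meant to be folded together in a reduce.
function (doc) {
  emit(doc.startTimestamp, { station_add: doc.serverName });
  emit(doc.endTimestamp, { station_rem: doc.serverName });
}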
If you can live with fixed periods and don't mind the extra disk space that might be needed for the view results, you can create a view of active servers per hour, for example.
Iterate over the periods between start and end, and emit the time that each server was online during each period:
function (doc) {
  var start = new Date(doc.startTimestamp).getTime()
  var end = new Date(doc.endTimestamp).getTime()
  var msPerPeriod = 60*60*1000

  var msOfflineInFirstPeriod = start % msPerPeriod
  var firstPeriod = start - msOfflineInFirstPeriod

  var msOnlineInLastPeriod = end % msPerPeriod
  var lastPeriod = end - msOnlineInLastPeriod

  if (firstPeriod === lastPeriod) {
    // The server was only online within one period.
    emit([new Date(firstPeriod), doc.serverName], [1, msOnlineInLastPeriod - msOfflineInFirstPeriod])
  } else {
    // The server was online over multiple periods.
    emit([new Date(firstPeriod), doc.serverName], [1, msPerPeriod - msOfflineInFirstPeriod])
    for (var period = firstPeriod + msPerPeriod; period < lastPeriod; period += msPerPeriod) {
      emit([new Date(period), doc.serverName], [1, msPerPeriod])
    }
    emit([new Date(lastPeriod), doc.serverName], [1, msOnlineInLastPeriod])
  }
}
If you want the total without the server names, just add a reduce function with the built-in shortcut _sum. You'll get the number of servers online during the period as the first number and the milliseconds that the servers were online in that period as the second number.
You can play with the view if you emit the year, month, and day as the first keys. Then you can use group_level at query time to get a finer or coarser overview.
Bear in mind that this view might get large on disk, as each row has to be stored, and also the intermediate results for each group level are stored. So you shouldn't set the period duration too small – emitting a row for each second would take a lot of disk space, for example.
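To make that concrete, a design document combining the map function above with the built-in _sum reduce could look something like this (a sketch; the design document and view names are made up for illustration):
{
  "_id": "_design/servers",
  "views": {
    "active_per_hour": {
      "map": "function (doc) { /* the map function above */ }",
      "reduce": "_sum"
    }
  }
}
Querying it with group_level=1, e.g. GET /db/_design/servers/_view/active_per_hour?group_level=1, would then group the sums by the first key element, i.e. per period.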
I'm very new to Python.
I get serial data on a COM port in a fixed format, as a string like this:
"21-12-2015 10:12:05 005 100 10.5 P"
The format is 'date time id count data data'.
Here I don't require the count and the first data; instead I want to add one more piece of data and send it out again through another COM port.
I want to rearrange this and give output as
21-12-2015 10:12:05
SI.NO: 1451
Result: 10.5 P
My attempt:
ip = '21-12-2015_10:12:05_005_100_10.5 P'
dt = ip[0]+ip[1]+ip[3]+..... #save date as dt
tm = ip[9]+ip[10]+ip[11]+.... etc
and at the end
Result = dt + tm +"\n" + " "+ "SI.NO"+.......
Please suggest a good approach for doing this in Python 2.7.11.
If you can mention some ideas, I will search for the code.
Thank you
You can split up your string on whitespace into fields with split and build a new string using Python's string formatting syntax:
ip = "21-12-2015 10:12:05 005 100 10.5 P"
fields = ip.split()
s = '{date} {time}\n SI.NO: {sino}\n Result: {x} {y}'.format(
date=fields[0],
time=fields[1],
sino=1451, # Provide your own counter here
x=fields[4],
y=fields[5])
print s
21-12-2015 10:12:05
SI.NO: 1451
Result: 10.5 P
It isn't clear from your question whether your fields are separated by spaces or underscores. In the latter case, use fields = ip.split('_').
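Since the data arrives on one COM port and leaves through another, pyserial can tie the two together. A minimal sketch (assuming pyserial is installed, and that the port names COM1/COM2 and the baud rate 9600 match your hardware):
import serial  # pyserial

ser_in = serial.Serial('COM1', 9600, timeout=1)  # incoming data
ser_out = serial.Serial('COM2', 9600)            # outgoing data

serial_number = 1451  # your own running counter
while True:
    line = ser_in.readline().strip()
    if not line:
        continue
    fields = line.split()  # or line.split('_') for underscores
    msg = '{0} {1}\n SI.NO: {2}\n Result: {3} {4}\n'.format(
        fields[0], fields[1], serial_number, fields[4], fields[5])
    ser_out.write(msg)
    serial_number += 1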
I am trying to display the client's timezone beside the timestamp, e.g. 4:13 PST.
I tried using GetTimeZoneInfo(), but the only way I could think of was getting the offset in hours and then determining the zone from an array of hard-coded values.
The other way I found was to use the java.util.TimeZone class. The following is the code I have tried:
<cfset tz = CreateObject("java", "java.util.TimeZone")>
<cfset tz = tz.getDefault()>
<cfoutput>TimeZone:#tz.getDisplayName(false, 1)#</cfoutput>
This gives me the output "Central Standard Time".
Any further help would be appreciated...
The code you mention above gets the server's TZ, not the client's.
If you want the client's TZ, you should read the comments against this other, similar question. They all revolve around using JavaScript's Date.getTimezoneOffset() method. That only gives you the offset from UTC, though, not the more familiar GMT / BST etc.
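For illustration, a minimal JavaScript sketch of that approach (the cookie name is made up; the value would be read server-side on the next request):
// getTimezoneOffset() returns minutes behind UTC, e.g. 480 for PST.
var offsetMinutes = new Date().getTimezoneOffset();
// Stash it in a cookie (or a hidden form field) so the server can
// adjust displayed times for this client.
document.cookie = 'tzoffset=' + offsetMinutes + '; path=/';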
If you are allowing your users to select their time zone (instead of getting it from the browser, which can be inaccurate), or the zones come from database values such as a time zone per city, or you simply need to extract the abbreviation from any datetime value, you can parse it out of the return value of LSDateTimeFormat() with the "long" mask.
function tzabbr(required date dttm, string tz = "", string locale = GetLocale()) {
    var str = tz == ""
        ? LSDateTimeFormat(dttm, "long", locale)
        : LSDateTimeFormat(dttm, "long", locale, tz)
    return ListLast(str, " ")
}
// Usage Examples
dttm = Now()
tzServ = tzabbr(dttm)
tzWest = tzabbr(dttm, "US/Pacific")
tzEast = tzabbr(dttm, "US/Eastern")
https://trycf.com/gist/144aa0399ea80127a3aa1d11a74fc79b/acf2021?theme=monokai
I need to find the Facebook place for the city for many lat/long points. The actual points refer to personal addresses, so there are no exact place IDs to look for, as there would be for a business.
For testing, I was looking for the town of Red Feather Lakes, CO.
The graph search function will return a lot of places, but it does not return cities. Example
Raw FQL does not let you search by lat/long, and has no concept of "nearby" anyway. Example
An FQL query by ID reveals that there is at least a "Display Subtext" field which indicates that an object is a city. Example
Thanks for any help. I have over 80 years of dated and geotagged photos of my dad that he would love to see on his timeline!
EDIT
Cities are not in the place table, they are only in the page table.
There is an undocumented distance() FQL function, but it only works in the place table. (Via this SO answer.)
This works:
SELECT name,description,geometry,latitude,longitude, display_subtext
FROM place
WHERE distance(latitude, longitude, "40.801985", "-105.593719") < 50000
But this gives an error "distance is not valid in table page":
SELECT page_id,name,description,type,location
FROM page
WHERE distance(
location.latitude,location.longitude,
"40.801985", "-105.593719") < 50000
It's a glorious hack, but this code works. The trick is to make two queries. First we look for places near our point; this returns a lot of business places. We then take the city of one of those places and use it to look up that city's page in the page table. There seems to be a standard naming convention for city pages, though it differs between US and non-US cities.
Some small cities have various spellings in the place table, so the code loops through the returned places until it finds a match in the page table.
$fb_token = 'YOUR_TOKEN';

// Pick one test location (the later assignment wins):
// Red Feather Lakes, Colorado
$lat = '40.8078';
$long = '-105.579';
// Karlsruhe, Germany
$lat = '49.037868';
$long = '8.350124';

$states_arr = array('AL'=>"Alabama",'AK'=>"Alaska",'AZ'=>"Arizona",'AR'=>"Arkansas",'CA'=>"California",'CO'=>"Colorado",'CT'=>"Connecticut",'DE'=>"Delaware",'FL'=>"Florida",'GA'=>"Georgia",'HI'=>"Hawaii",'ID'=>"Idaho",'IL'=>"Illinois",'IN'=>"Indiana",'IA'=>"Iowa",'KS'=>"Kansas",'KY'=>"Kentucky",'LA'=>"Louisiana",'ME'=>"Maine",'MD'=>"Maryland",'MA'=>"Massachusetts",'MI'=>"Michigan",'MN'=>"Minnesota",'MS'=>"Mississippi",'MO'=>"Missouri",'MT'=>"Montana",'NE'=>"Nebraska",'NV'=>"Nevada",'NH'=>"New Hampshire",'NJ'=>"New Jersey",'NM'=>"New Mexico",'NY'=>"New York",'NC'=>"North Carolina",'ND'=>"North Dakota",'OH'=>"Ohio",'OK'=>"Oklahoma",'OR'=>"Oregon",'PA'=>"Pennsylvania",'RI'=>"Rhode Island",'SC'=>"South Carolina",'SD'=>"South Dakota",'TN'=>"Tennessee",'TX'=>"Texas",'UT'=>"Utah",'VT'=>"Vermont",'VA'=>"Virginia",'WA'=>"Washington",'DC'=>"Washington D.C.",'WV'=>"West Virginia",'WI'=>"Wisconsin",'WY'=>"Wyoming");

// Query 1: find places near the point. These are mostly businesses,
// but their location records include the city name.
$place_search = json_decode(file_get_contents('https://graph.facebook.com/search?type=place&center=' . $lat . ',' . $long . '&distance=10000&access_token=' . $fb_token));

foreach ($place_search->data as $result) {
    if ($result->location->city) {
        $city = $result->location->city;
        $state = $result->location->state;
        $country = $result->location->country;
        if ($country == 'United States') {
            $city_name = $city . ', ' . $states_arr[$state]; // e.g. 'Chicago, Illinois'
        } else {
            $city_name = $city . ', ' . $country; // e.g. 'Rome, Italy'
        }
        // Query 2: look up the city's page by its conventional name.
        $fql = 'SELECT name,page_id,description,type,location FROM page WHERE type="CITY" and name="' . $city_name . '"';
        $result = json_decode(file_get_contents('https://graph.facebook.com/fql?q=' . rawurlencode($fql) . '&access_token=' . $fb_token));
        if (count($result->data) > 0) {
            // We found it!
            print_r($result);
            break;
        } else {
            // No luck, try the next place
            print("Couldn't find " . $city_name . "\n");
        }
    }
}
I found this solution worked for me when looking for the page of the closest city to a specified latitude/longitude. For some reason LIMIT 1 didn't return the closest city, so I bumped up the limit and then took the first result.
SELECT page_id
FROM place
WHERE is_city and distance(latitude, longitude, "<latitude>", "<longitude>") < 100000
ORDER BY distance(latitude, longitude, "<latitude>", "<longitude>")
LIMIT 20