Using GeoIP for US locations is not 100% accurate - geoip

I have a module built to use the GeoIP database. I'm using it to get the location of the user visiting the website so we can show the current weather for their city (USA only), pulled from YQL weather. I found https://stackoverflow.com/questions/12365078/best-way-to-get-location-weather-details-accurately, which says this is not going to be very accurate without letting the user enter their location.
So far it has displayed the correct location about 90% of the time; sometimes we have to renew the DHCP lease to get the correct location. But my client still cannot get the right city in their office. I don't know much about their network connection, but I wonder if the traffic is being forwarded, etc.
Here is how I am obtaining the user's IP on the module:
function getIpAddresses() {
    $ipAddresses = array();
    if (isset($_SERVER["HTTP_X_FORWARDED_FOR"])) {
        // Behind a proxy: the connecting address is the proxy itself.
        $ipAddresses['proxy'] = isset($_SERVER["HTTP_CLIENT_IP"]) ? $_SERVER["HTTP_CLIENT_IP"] : $_SERVER["REMOTE_ADDR"];
        // X-Forwarded-For may hold a comma-separated chain of addresses;
        // the original client is the first entry.
        $forwarded = explode(',', $_SERVER["HTTP_X_FORWARDED_FOR"]);
        $ipAddresses['user'] = trim($forwarded[0]);
    } else {
        $ipAddresses['user'] = isset($_SERVER["HTTP_CLIENT_IP"]) ? $_SERVER["HTTP_CLIENT_IP"] : $_SERVER["REMOTE_ADDR"];
    }
    return $ipAddresses;
}
Is there a better way to get the IP, or to improve the accuracy, without asking the user to enter their location?
Thanks!

You have a couple of options:
Look for a better GeoIP data provider. There are several free data providers; in my experience they're pretty good at the country level, but as you get more granular you need to look at commercial solutions.
Record which IP each visitor uses. Even if they renew their DHCP lease, they're probably staying within a particular subnet; hard-code that subnet to the correct location.
Incorporate a JavaScript geolocation API and send that data to the server. If the client machine is a laptop, it may provide better location information than GeoIP alone.
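The third option can be sketched in a few lines of browser JavaScript. This is only a sketch: /weather-location is a hypothetical endpoint you would implement server-side, and it only works if the user accepts the browser's permission prompt.

```javascript
// Build the query string to send back to the server.
// Kept as a separate function so the formatting logic is easy to test.
function coordsToQuery(latitude, longitude) {
  return '/weather-location?lat=' + latitude.toFixed(4) +
         '&lon=' + longitude.toFixed(4);
}

// Ask the browser for the user's position and report it to the server.
// If the browser lacks support or the user declines, nothing is sent
// and the server keeps falling back to its GeoIP guess.
function reportLocation() {
  if (!navigator.geolocation) return; // unsupported browser
  navigator.geolocation.getCurrentPosition(function (pos) {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', coordsToQuery(pos.coords.latitude, pos.coords.longitude));
    xhr.send();
  });
}
```

The server can then resolve the coordinates to a city itself rather than trusting the IP.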

Related

Google Maps URLs, how to add a personal name to destination point?

I'm using Google Maps URLs from my web site to provide directions to my users. It works fine and automatically launches the Google Maps app on mobile phones. To do that, I'm using the lat/long destination coordinates extracted from my database. Each destination point has a name in my database, like PA_1155 or LDN_078... I would like to display that name on the destination point on the map, but I couldn't find any way to do that. Is there an additional parameter that I missed?
As of this writing, the Maps URLs documentation does not include information on any facility/parameter you could use to pass this arbitrary "destination name" and have it be displayed in the resulting Google Maps view.
Purely speculatively, this is likely a deliberate design choice to prevent manipulation of the resulting maps to show inaccurate information to end users. In the interest of protecting its users, I would say it's unlikely that Google would ever enable this sort of functionality.
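For reference, the kind of URL in question, built from stored coordinates, looks roughly like this; the helper below is just an illustration, and as described above there is simply no parameter where a custom label could go:

```javascript
// Build a Google Maps directions URL from destination coordinates
// using the cross-platform Maps URLs scheme (api=1). The scheme
// accepts a destination, but offers no parameter for attaching a
// custom label such as "PA_1155" to the destination pin.
function directionsUrl(lat, lng) {
  return 'https://www.google.com/maps/dir/?api=1&destination=' + lat + ',' + lng;
}
```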

Google Analytics not filtering internal traffic

I know there have been similar questions in the past, but I have tried many solutions given online to no avail. I am just not able to hide internal traffic in Google Analytics on my Django site.
I am setting the filter from Admin -> View -> Filters. I have tried both Predefined and Custom filters, with a fixed IP as well as a regex pattern. (Yes, I have double-checked my IP on whatismyip.com, and I am using the right one.)
I read somewhere that it takes time for filters to come into effect, so I even waited 24 hours, but I still see a lot of internal traffic.
Google Tag Assistant is also tracking the pages when I access them from the internal IP (not sure if it's supposed to know about the filters).
Not sure where I could be going wrong.
(I am using a reverse proxy, but hopefully that shouldn't change anything, since the Google Analytics code runs on the client side.)
Do not use any filter on the default view (called 'All Website Data'). Create a separate view and then create a filter on it. That will work.
(After struggling with it for a few days, this response helped me with the above fix)
I struggled with this as well, so here is what I found out.
Note that real-time reporting can take up to two hours to catch up to and reflect analytics configuration changes, such as the addition of filters.
Possible solutions
1) As suggested in the other answer, leave the default view alone and create an additional view for the filters:
The default view collects all traffic. You need to create a new view to which you can apply your filter. Check out item 3 here: https://support.google.com/analytics/answer/1009618?hl=en
How to add a new view: https://support.google.com/analytics/answer/1009714?hl=en
2) Filter the IPv6 address, not the IPv4:
Exclude the IPv6 address, as mentioned in the post above. That is the address "what is my IP address" sites return; it is not in the IPv4 syntax (xxx.xxx.xxx.xxx). I have noticed that wired machines that stay connected seem to keep the same IPv6 address (the 31-digit sequence), whereas wireless devices (mobile phones, tablets) tend to be dynamic. However, as posted above, if you use just the first 15 digits of the sequence with the "begins with" filter type, it will match the devices using the same shared router (i.e. the internet router in your home).
About filtering only the first 15 digits: I think this is meant to filter the first four blocks, so if your IPv6 address looks like 2601:191:c001:2f9:5c5a:1c20:61b6:675a, you filter IPs that begin with 2601:191:c001:2f9:.

How to change users while preserving the store?

I want to implement a "fast login".
I'm developing enterprise software where a lot of users in the same organization work with the same data on the same computer, and I want to be able to know who did what and when. Right now they have to log out and log back in, and the data has to be loaded into the store all over again.
What I want is for them to be able to click on a user from the organization, enter that user's password, and have the user switched, without logging out and while preserving the store.
Any idea how I can accomplish this?
I'm using ember-simple-auth v1.1.0 and ember v2.10.2
The simplest solution would be to disable the page reload when the user logs out. As far as I know, it's the reload that causes the data loss from the store, not the logging out itself. To do this, you need to override the sessionInvalidated method in your application route. For example:
sessionInvalidated() {
  this.transitionTo('/login');
},
But remember: you lower security with this method. If someone logs out and leaves the page open, another person could extract the data (given enough technical background to at least install the Ember Inspector).
The other solution would require heavy research. I think it should be possible to implement a custom authenticator that authenticates a new user without logging out the previous one, by simply replacing the tokens in the session store. But I don't know how hard it would be to implement or what obstacles you would meet. You would need to read ember-simple-auth's sources a lot, that's for sure.
I was actually able to solve it by simply calling authenticate() with another user and never calling invalidateSession(), which is the function that triggers sessionInvalidated(). That method looks like this:
sessionInvalidated() {
  if (!testing) {
    if (this.get('_isFastBoot')) {
      this.transitionTo(Configuration.baseURL);
    } else {
      window.location.replace(Configuration.baseURL);
    }
  }
}
Since sessionInvalidated() is never called, the user isn't redirected and the page isn't refreshed, so the new user can keep using the store without switching pages.

Updating a hit counter when an image is accessed in Django

I am working on some simple analytics for a Django website (v1.4.1). Seeing as this data will be gathered on pretty much every server request, I figured the right way to do this would be with a piece of custom middleware.
One important metric for the site is how often given images are accessed. Since each image is its own object, I thought about using django-hitcount, but figured that was unnecessary for what I was trying to do. If it proves easier, I may use it though.
The current conundrum I face is that I don't want to query the database and look up a given object for every HttpRequest that occurs. Instead, I would like to wait for a successful response (indicated by an HttpResponse status of 200 or whatever) and then query the database and update a hit field for the corresponding image. The problem is that the only way to access the path of the image is in process_request, while the only way to access the status code is in process_response.
So, what do I do? Is it as simple as creating a class variable that can hold the path and then looking up the file once the response code of 200 is returned, or should I just use django-hitcount?
Thanks for your help
Set up a cron task to parse your Apache/Nginx/whatever access logs on a regular basis, perhaps with something like pylogsparser.
You could use memcache to store the counters and then periodically persist them to the database. There is a risk that memcache will evict a value before it has been persisted, but that may be acceptable to you.
This article provides more information and highlights a risk that arises when using hosted memcache with keys distributed over multiple servers: http://bjk5.com/post/36567537399/dangers-of-using-memcache-counters-for-a-b-tests
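The batching pattern is independent of memcache itself. Here is a rough sketch, with a plain in-memory object standing in for the cache and persistToDatabase a hypothetical callback you would supply to perform the batched write:

```javascript
// Accumulates hit counts in memory and hands them to a persistence
// callback in batches, instead of writing to the database on every hit.
function HitCounter(persistToDatabase) {
  this.counts = {};            // stand-in for memcache
  this.persist = persistToDatabase;
}

// Record one hit for an image path; O(1), no database access.
HitCounter.prototype.hit = function (imagePath) {
  this.counts[imagePath] = (this.counts[imagePath] || 0) + 1;
};

// Called periodically by a scheduled job. Note the same risk as with
// memcache: counts buffered here are lost if the process dies before
// flush() runs.
HitCounter.prototype.flush = function () {
  var batch = this.counts;
  this.counts = {};
  this.persist(batch);         // e.g. UPDATE ... SET hits = hits + n
};
```

A cron task, Celery beat job, or setInterval timer would call flush() on whatever interval matches your tolerance for losing buffered counts.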

A way to bypass the per-ip limit retrieving profile picture?

My app downloads all of the user's friends' pictures.
All the requests are of this kind:
https://graph.facebook.com/<friend id>/picture?type=small
But, after a certain limit is reached, instead of the picture I get:
{"error""message":"(#4) Application request limit reached","type":"OAuthException"}}
Actually, the only way I have found to prevent this is to change the server IP (manually).
Isn't there a better way?
For the record:
The limit applies to the Graph API only, and the graph.facebook.com/<user>/picture URL is a Graph API call that returns a redirect.
So, to avoid the daily limit, simply fetch all the image URLs with FQL, like:
SELECT uid, pic_small, pic_big, pic, pic_square FROM user WHERE uid = me() or IN (SELECT uid2 FROM friend WHERE uid1=me())
These are direct URLs to the images, e.g.:
http://profile.ak.fbcdn.net/hprofile-ak-snc4/275716_1546085527_622218197_q.jpg
Don't store them, though, since they change continuously.
If it's for an online app, a better way is not to download those images at all but to use the online versions. There are a couple of reasons for doing so:
Users change pictures (some frequently); don't you need an updated version?
Facebook's servers are probably faster than yours, and your users' friends' pictures are probably already cached in their browsers.
Update:
Since the limit you are hitting is the Graph API call limit, not image retrieval, another solution that comes to mind is to use the user's friends connection in the Graph API and specify picture in the fields argument, e.g. https://graph.facebook.com/me/friends?fields=picture. This returns direct URLs for the friends' pictures, so you can make just one call per user to get everything needed to download the images.
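Sketched in JavaScript, assuming the response shape described above (a data array whose entries carry a picture URL directly; in later Graph API versions picture is itself an object with a nested url, so extractPictureUrls would need adjusting):

```javascript
// Build the single Graph API call that returns picture URLs for all
// friends at once, instead of one /picture request per friend.
function friendsPictureUrl(accessToken) {
  return 'https://graph.facebook.com/me/friends' +
         '?fields=picture&access_token=' + encodeURIComponent(accessToken);
}

// Pull the direct image URLs out of the response. Assumed shape:
// { data: [ { id: ..., picture: <url> }, ... ] }
function extractPictureUrls(response) {
  return response.data.map(function (friend) {
    return friend.picture;
  });
}
```

One HTTP request to friendsPictureUrl(token), then extractPictureUrls on the parsed JSON, yields every image URL; only that first call counts against the API limit.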