Evercookie browser security - cookies

I've just discovered the Evercookie project on GitHub.
Evercookie is a JavaScript API that produces extremely persistent cookies in a browser. Its goal is to identify a client even after they've removed standard cookies, Flash cookies (Local Shared Objects, or LSOs), and others.
This is accomplished by storing the cookie data in as many browser storage mechanisms as possible. If the cookie data is removed from any of them, Evercookie aggressively re-creates it in each mechanism as long as at least one copy is still intact.
If the LSO mechanism is available, Evercookie may even propagate cookies between different browsers on the same client machine!
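To make the trick concrete, here is a minimal sketch of the redundancy idea (this is not Evercookie's actual code, which uses far more storage mechanisms than the three shown here):

    // Minimal sketch of the evercookie idea: mirror one identifier across
    // several stores and restore it from whichever copy survives.
    function getCookie(name) {
        const match = document.cookie.match(new RegExp('(?:^|; )' + name + '=([^;]*)'));
        return match ? decodeURIComponent(match[1]) : null;
    }

    function setEverywhere(name, value) {
        document.cookie = name + '=' + encodeURIComponent(value) + '; max-age=31536000; path=/';
        localStorage.setItem(name, value);
        sessionStorage.setItem(name, value);
    }

    function restoreId(name) {
        const survivors = [getCookie(name), localStorage.getItem(name), sessionStorage.getItem(name)];
        const id = survivors.find(v => v) || crypto.randomUUID(); // all copies wiped: new identity
        setEverywhere(name, id); // aggressively re-create every copy
        return id;
    }

    const visitorId = restoreId('evc_id');

Clearing only one of the stores does nothing; on the next page load the surviving copies reseed the rest, which is exactly the behavior described above.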
I tested it online, on this example page. I clicked the "Create evercookie" button, deleted all browsing data, and refreshed the page. The cookies I had just deleted were all back.
Where is the browser security in all of this? Is there any protection against it?

If you want to disable Flash-based cookies, use Adobe's "Global Storage Settings" panel here:
http://www.macromedia.com/support/documentation/en/flashplayer/help/settings_manager03.html
Perform all of these steps:
Uncheck "Allow 3rd Party Flash Content to store data on your computer"
Check "Never ask again" (a non-obvious, but important step)
Click the 2nd to last tab: "Website Storage Settings"
Delete all existing data
Chrome bundles its own Flash plugin on Windows and Mac OS X. The settings and disk storage are separate from the plugin packaged directly by Adobe, so you may need to perform the above steps twice if you use Chrome. On the plus side, the separate storage location prevents Flash from being used to synchronize cookies to or from Chrome and other browsers.
I recommend testing with my personal site:
http://noc.to
The "Zombie Cookie" section can show you exactly how cookies are being restored and help you determine if the above steps (or any tools you use) are working.

In order to create an Evercookie, all you need is:
The ability to run JavaScript (or other active content, like Flash and perhaps Java); and
The ability to access the various client-side locations where copies of the cookie data are stored.
Totally disabling access to all of these storage mechanisms would render most of them useless; their whole reason for being is to let scripts use them. So the only even remotely feasible option is restricting access by domain. I'm not sure which browsers (if any) allow that kind of granularity, though. Most can allow or block JS as a whole for certain domains, but as for controlling which features a given domain's scripts can use...? I'm not seeing that ability in Chrome 26 or IE 10, at least.

Well, it doesn't seem to work that well.
Created the evercookie
Closed the window
Emptied the whole Firefox cache (just by going to "Clear Recent History" and checking everything except site preferences)
Closed the window
Came back to the page
Finally realized the cookie hadn't been restored
What is strange is that I didn't explicitly remove Flash cookies in Flash's "Website Storage Settings" panel. Maybe clearing them is integrated into Firefox, or I may have disabled them earlier.
I think there are several other ways to store identifiers and trace you. Facebook is already tracking you all over the web, even when you're logged out. Google too (do you use Chrome?). Moreover, your IPv4 address can be used to re-identify you (even right after you've emptied your cache!), and you can be linked to your previous sessions as soon as you log back in to any site.
I suggest:
Using Firefox; even if it's slower than Chrome, it's still more respectful of privacy
Removing the whole Internet cache on window close (sorry, you'll have to log in again on your preferred sites)
Checking third-party cookie options
Using browser add-ons with care
Checking Flash & Silverlight cookie options
Avoiding website reputation checking (provided that you can recognize a phishing attempt)
Using private browsing mode when you don't want to share your digital life

Related

What exactly does Safari ITP do?

I am very confused as to how Safari ITP 2.3 works in certain respects, and why sites can’t easily circumvent it. I don’t understand under what circumstances limits are applied, what the exact limits are, to what they are applied, and for how long.
To clarify my question I broke it down into several cases. I will be referring to Apple’s official blog post about ITP 2.3 [1] which you can quote from, but feel free to link to any other authoritative or factually correct sources in your answer.
For third-party sites loaded in iframes:
Why can’t they just use localStorage to store the values of cookies, and send this data back and forth not as actual browser cookie headers 🍪, but as data in the body of the request or a header like Set-AuxCookie? Similarly, they can parse the response to update localStorage. What limits does ITP actually place on localStorage in third-party iframes?
If the localStorage is frequently purged (see question 1), why can’t they simply use postMessage to tell a script on the enclosing website to store some information (perhaps encrypted) and then spit it back whenever it loads an iframe?
For sites that use link decoration
I still don’t understand what the limits on localStorage are for third-party sites in iframes which did NOT get classified as link-decorator sites. But let’s say they are link-decorator sites. According to [1], Apple only starts limiting things further if there is a query string or fragment. But can’t a website rather trivially store this information in the URL path before the query string, i.e. /in/here instead of ?in=here … surely large companies like Google could trivially choose to do that?
In the case where a site has been labeled a tracking site, does that mean all its non-cookie data is limited to 7 days? What about cookies set by the server – aren’t they exempted? If so, then simply make a request to your server to set the cookie instead of using JavaScript. After all, the operator of the site is very likely to also have access to its HTTP server and app code.
For all sites
Why can’t a service like Google Analytics or Facebook’s widgets simply convince a site to additionally add a CNAME to their DNS, putting Google’s and Facebook’s servers under a subdomain like gmail.mysite.com or analytics.mysite.com? And then, boom, they can read and set cookies again, in some cases even on the top-level domain for website owners who don’t know better. Doesn’t this completely defeat the goals of Apple’s ITP, since Google and Facebook have now become a “second party” in some sense?
Here on Stack Overflow, when we log out on iOS Safari, the Stack Exchange network is able to log us out of multiple sites at once … how is that even accomplished if no one can track users across websites? I have heard it said that “second party cookies” can still be stored, but what exactly makes a second-party cookie different from a third-party one?
My question is broken down into six cases, but the overall theme is the same: how does Apple’s latest ITP work in each case, and how does it actually block all potentially malicious tracking (to the point where a well-funded company can’t just use the workarounds above) while at the same time allowing legitimate use cases?
[1] https://webkit.org/blog/9521/intelligent-tracking-prevention-2-3/
I am not sure if the answers below are correct; please comment if they are not:
1. It seems applications can use localStorage with no problem, for up to 7 days. But it won’t be persisted across multiple enclosing domains. I would even recommend using sessionStorage, since the goal is just to have nothing more than a seamless session. You can then roll your own cookie mechanism using a different set of headers (see the sketch after this list); the only thing you can’t implement is HttpOnly cookies.
2. They can, but ITP won’t let the JavaScript on the enclosing page store cookies (at least, not if your third-party domain was flagged as a tracker by Safari).
3. Yeah, the description of “link decoration” technically doesn’t mention this workaround, but Apple probably has updated, or will update, its classifier to handle it.
4. Yes, if a first-party webpage sends a request to the server and it sets a cookie in the response headers, then these aren’t blocked by ITP, even if the page has an iframe to a tracking site. They say that’s not their goal.
5. Yes; in fact your first-party site can just redirect to google.com and back quickly (as with OAuth) and thereby inform Google of whatever you wanted, without cookies. Google’s JavaScript can do this as well, if you allow it. Then that JavaScript can just load your Google-hosted subdomain in an iframe and set a cookie that persists for years, tracking the user. However, ITP 2.3 seems to have also added a mitigation for this, so you might use A records instead? https://cookiesaver.io/archives/analytics-guides/cname-cloaking-mitigation-eliminates-safari-itp-workarounds/
6. Probably the Stack Exchange network uses a version of #5.
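As a rough illustration of answer 1, a third-party iframe could shuttle its session token in a custom header instead of a cookie. The X-Aux-Cookie header name and the server behavior here are assumptions for illustration, not anything ITP or any spec defines:

    // Sketch: keep the session token in sessionStorage and move it in a
    // custom header rather than as a browser cookie.
    async function apiFetch(url, options = {}) {
        const token = sessionStorage.getItem('aux_session') || '';
        const response = await fetch(url, {
            ...options,
            headers: { ...(options.headers || {}), 'X-Aux-Cookie': token },
        });
        // Persist whatever the server hands back, like a Set-Cookie would.
        // Cross-origin, the server must send
        // Access-Control-Expose-Headers: X-Aux-Cookie for this read to work.
        const updated = response.headers.get('X-Aux-Cookie');
        if (updated) sessionStorage.setItem('aux_session', updated);
        return response;
    }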

Efficient way to inspect Cookie creation in real time

I want to inspect what happens to cookies when I browse a certain website, in order to figure out how it tracks some data.
I guess I can use the Chrome developer tools: open the Network tab, look within each HTTP request one by one (there are about 100 of them), check its Cookies tab, and look at the "response cookies", right?
Is there an easier way to do this? Some tool that simply shows all cookies as they are created or changed?
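One lighter-weight option, if a Chromium-based browser is acceptable: paste a listener for the Cookie Store API into the DevTools console. Note it won't show HttpOnly cookies, so the Network tab is still needed for those:

    // Chromium-only Cookie Store API: fires an event for every scriptable
    // cookie change on the current page, as it happens.
    cookieStore.addEventListener('change', (event) => {
        for (const cookie of event.changed) {
            console.log('set/updated:', cookie.name, '=', cookie.value);
        }
        for (const cookie of event.deleted) {
            console.log('deleted:', cookie.name);
        }
    });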

Parallel website running to my original website

We have been working on a gaming website. Recently, while making note of the major traffic sources, I noticed a website that turned out to be a carbon copy of ours. It uses our logo and everything else is the same as ours, but it has a different domain name. It can't simply be their domain name pointing at our domain, because in several places its links look like ccwebsite/our-links, and it even links to some images as ccwebsite/our-images.
What has happened? How could they have done that? What can I do to stop this?
There are a number of things they might have done to copy your site, including but not limited to:
Using a tool to scrape a complete copy of your site and place it on their server
Pointing their DNS name at your site
Manually re-creating your site as their own
Responding to requests to their site by scraping yours in real time and returning that as the response
etc.
What can I do to stop this?
Not a whole lot. You can try to prevent direct linking to your content by requiring Referer headers for your images and other resources, so that requests have to come from pages you serve. But (1) those headers can be faked, and (2) not all browsers send them, so you'd break a small percentage of legitimate users. This also won't stop anybody from copying content, just from "deep linking" to it.
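If you do want to try the Referer check, here is a minimal sketch assuming a Node/Express server (the question doesn't say what the site actually runs on; example.com stands in for your domain):

    // Only serve images when the Referer points at our own pages. Referers
    // can be faked or absent, so this only discourages casual deep-linking.
    const express = require('express');
    const app = express();

    app.use('/images', (req, res, next) => {
        const referer = req.get('Referer') || '';
        if (!referer.startsWith('https://example.com/')) {
            return res.status(403).send('Forbidden'); // no Referer, or not one of our pages
        }
        next();
    });
    app.use('/images', express.static('images'));

    app.listen(8080);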
Ultimately, by having a website you are exposing that information to the internet. On a technical level anybody can get that information. If some information should be private you can secure that information behind a login or other authorization measures. But if the information is publicly available then anybody can copy it.
"Stopping this" is more of a legal/jurisdictional/interpersonal concern than a technical one I'm afraid. And Stack Overflow isn't in a position to offer that sort of advice.
You could run your site with some lightweight authentication. Just issue a cookie passively when they pull a page, and require the cookie to get access to resources. If a user visits your site and then the parallel site, they'll still be able to get in, but if a user only knows about the parallel site and has never visited the real site, they will just see a crap ton of broken links and images. This could be enough to discourage your doppelganger from keeping his site up.
Another (similar but more complex) option is to implement a CSRF mitigation. Even though this isn't a CSRF situation, the same mitigation will work. Essentially you'd issue a cookie as described above, but in addition insert the cookie value in the URLs for everything and require them to match. This requires a bit more work (you'll need a filter or module inserted into the pipeline) but will keep out everybody except your own users.
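A minimal sketch of the passive-cookie gate described above, assuming Node/Express (an assumption, since the question doesn't state a stack):

    // Pages hand out a token passively; resources refuse to load without it.
    // Visitors who only ever see the copycat site never get the cookie, so
    // the copycat's embedded links to our resources break.
    const crypto = require('crypto');
    const express = require('express');
    const app = express();

    app.get('/', (req, res) => {
        const cookies = req.headers.cookie || '';
        if (!cookies.includes('gate=')) {
            // Issued passively on any real page view.
            res.setHeader('Set-Cookie',
                'gate=' + crypto.randomBytes(16).toString('hex') + '; Path=/; HttpOnly');
        }
        res.sendFile(__dirname + '/index.html');
    });

    app.use('/assets', (req, res, next) => {
        if (!(req.headers.cookie || '').includes('gate=')) {
            return res.status(403).end(); // visitor has never loaded one of our pages
        }
        next();
    }, express.static('assets'));

    app.listen(8080);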

How to uniquely identify computers that access a website

I have to uniquely identify computers that access my website. The only solution I have found so far is storing an ID in a cookie or something similar to identify the browser, but my problem is that the cookie could be copied and used in another browser. Is there any way to detect whether the cookie has been copied, or some better way to identify the computer?
There is no reliable way to uniquely identify a browser, because there is nothing that prevents two computers from having exactly the same configuration (operating system, browser, location, cookies, Flash configuration, etc.).
The best we can do is gather as much information about the browser as possible. This is a well-known approach called browser/device fingerprinting. Although the result is not perfect, it is quite good. A browser fingerprint typically includes the browser name, operating system, fonts, installed plugins, etc. You can test how unique your browser is using https://panopticlick.eff.org/.
PHP has an array called $_SERVER, an associative array with information such as the visitor's IP address, which can be used in conjunction with cookies. JavaScript also offers the navigator object, which contains information about the browser. You can save that information in a cookie, and if it doesn't match the browser they use on the next visit, that would indicate they are not using the same browser.
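For illustration, a rough client-side fingerprint might hash a few navigator and screen properties. Real fingerprinting tools use many more signals than this sketch:

    // Combine a handful of browser properties into one hash. Two machines
    // with identical configurations will still collide, as noted above.
    async function roughFingerprint() {
        const signals = [
            navigator.userAgent,
            navigator.language,
            screen.width + 'x' + screen.height + 'x' + screen.colorDepth,
            Intl.DateTimeFormat().resolvedOptions().timeZone,
            navigator.hardwareConcurrency,
        ].join('|');
        const bytes = new TextEncoder().encode(signals);
        const digest = await crypto.subtle.digest('SHA-256', bytes);
        return [...new Uint8Array(digest)]
            .map(b => b.toString(16).padStart(2, '0')).join('');
    }

    roughFingerprint().then(fp => console.log('fingerprint:', fp));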

Is there something a site can do to incorporate third party cookies

I work for an e-commerce site. Part of what we do is offer customized items to some clients. Recently, some non-technical management promised that we could incorporate our check-out process into one such client's website. The only way we've figured out how to do this is by using an iframe (I know, I don't like it either). The issue is that most customers of this site are unable to check out: we use cookies to determine which custom items to display, browsers recognize our cookies as third-party, and almost everybody has third-party cookies turned off, as they should. I'm going to be shocked if the answer is yes, but is there any workaround for this? I.e., can the site hosting our iframe somehow supply the necessary cookie?
Try an invisible, interstitial page.
Essentially, the hosting site would issue a redirect to a page within your domain, which is then free to set cookies (because at that point it is actually the first party). Then your page immediately redirects back to the hosting site. At this point your newly-created cookies will be invisible to the hosting site but visible to your iframed page henceforth.
Unfortunately, the hosting site will have to do this every time a cookie is to be updated, but the double redirect can happen so quickly they'll hardly notice. Hopefully your system only needs the cookies to be set once.
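A minimal sketch of the interstitial, assuming Express on the checkout (iframed) domain; the route and parameter names are made up for illustration:

    // The host site redirects the top-level window here; we set our cookie
    // as a *first* party, then bounce straight back.
    const express = require('express');
    const app = express();

    app.get('/cookie-interstitial', (req, res) => {
        res.setHeader('Set-Cookie',
            'custom_items=' + encodeURIComponent(req.query.items || '') + '; Path=/');
        // In real code, validate return_to against an allowlist so this
        // doesn't become an open redirect.
        res.redirect(req.query.return_to || '/');
    });

    app.listen(8080);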
Instead of using a cookie, pass the information in each URL request as name/value pairs.
It is a bit of a pain to add the name/value pairs to every URL... I know... oh well... it will work.
I'm going to be shocked if the answer is yes, but is there any workaround for this? ie can the site hosting our iframe somehow supply the necessary cookie?
Your iframed page itself, which is the third party in this scenario, could send a P3P cookie-policy header – some browsers then accept third-party cookies by default, whereas others (mainly Safari) will not be convinced to do so at all unless the user changes the default settings themselves.
What you could also do is pass the session ID not (only) in a cookie, but as a GET or POST parameter as well – e.g., under PHP this can be done quite easily by configuring the session options. You should consider whether that's worth the slightly increased risk of session stealing.
The interstitial page solution should work but it might be a lot of trouble for your hosting site, so here's another solution that will allow you to work cookieless.
Write an HttpModule that responds to the BeginRequest event, reads the querystring, and inserts corresponding cookie headers into the Context.HttpRequest object (note: you can't use AddCookie, you have to use AddHeader, because cookies added directly by a module are disposed of before they hit your application proper). That way the hosting site can simply issue a request (within the iframe) that contains the necessary value in the querystring; the module will convert it into a cookie (one that only exists in memory, not on the wire), and your application will be deceived into thinking there's a cookie there. No code changes required; you just need to add the module in web.config.
This only works if you are using IIS 7.0+ in integrated pipeline mode. If you're on an earlier version of IIS or if you have to run in classic mode, you'll need an ISAPI filter instead.
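For what it's worth, the same trick translated into Node/Express terms (an assumption on my part; the answer above is ASP.NET-specific) looks like this, with the sid parameter and session cookie name being placeholders:

    // Synthesize an in-memory Cookie header from the querystring before any
    // downstream session code looks for it. It never travels as a cookie.
    const express = require('express');
    const app = express();

    app.use((req, res, next) => {
        const cookies = req.headers.cookie || '';
        if (req.query.sid && !cookies.includes('session=')) {
            req.headers.cookie = (cookies ? cookies + '; ' : '') + 'session=' + req.query.sid;
        }
        next();
    });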
Ryan, John:
For the Chrome v80 update with SameSite flags, we want to set SameSite=None; Secure for the site hosting our iframe and somehow supply the necessary SameSite=None; Secure cookie. We have an Apache 2.2 and Tomcat 6 setup, so I would appreciate a solution and advice on how to make it work. Currently, with the flag enabled, the iframe is not punching out successfully.
Thanks