What exactly does Safari ITP do? - cookies

I am very confused as to how Safari ITP 2.3 works in certain respects, and why sites can’t easily circumvent it. I don’t understand under what circumstances limits are applied, what the exact limits are, to what they are applied, and for how long.
To clarify my question I broke it down into several cases. I will be referring to Apple’s official blog post about ITP 2.3 [1] which you can quote from, but feel free to link to any other authoritative or factually correct sources in your answer.
For third-party sites loaded in iframes:
Why can’t they just use localStorage to store the values of cookies, and send this data back and forth not as actual browser Cookie headers, but as data in the body of the request or in a header like Set-AuxCookie? Similarly, they can parse the response to update localStorage. What limits does ITP actually place on localStorage in third-party iframes?
If the localStorage is frequently purged (see question 1), why can’t they simply use postMessage to tell a script on the enclosing website to store some information (perhaps encrypted) and then spit it back whenever it loads an iframe?
For sites that use link decoration:
I still don’t understand what the limits on localStorage are for third-party sites in iframes which did NOT get classified as link decorator sites. But let’s say they are link decorator sites. According to [1], Apple only starts limiting things further if there is a query string or fragment. But can’t a website rather trivially store this information in the URL path before the query string, i.e. /in/here instead of ?in=here? Certainly large companies like Google could trivially choose to do that?
In the case where a site has been labeled as a tracking site, does that mean all its non-cookie data is limited to 7 days? What about cookies set by the server, aren’t they exempted? So then simply make a request to your server to set the cookie instead of using JavaScript. After all, the operator of the site is very likely to also have access to its HTTP server and app code.
For all sites:
Why can’t a service like Google Analytics or Facebook’s widgets simply convince a site to add an additional CNAME record to their DNS and get Google’s and Facebook’s servers under a subdomain like gmail.mysite.com or analytics.mysite.com? And then boom, they can read and set cookies again, in some cases even on the top-level domain for website owners who don’t know better. Doesn’t this completely defeat the goals of Apple’s ITP, since Google and Facebook have now become a “second party” in some sense?
Here on StackOverflow, when we log out on iOS Safari, the StackOverflow network is able to log us out of multiple sites at once… how is that even accomplished if no one can track users across websites? I have heard it said that “second party cookies” can still be stored, but what exactly makes a second-party cookie different from a third-party one?
My question is broken down into 6 cases but the overall theme is, in each case: how does Apple’s latest ITP work in that case, and how does it actually block all cases of potentially malicious tracking (to the point where a well-funded company can’t just do the workarounds above) while at the same time allowing legitimate use cases?
[1] https://webkit.org/blog/9521/intelligent-tracking-prevention-2-3/

I am not sure if the answers below are correct; please comment if they are not:
It seems applications can use localStorage with no problem, for up to 7 days. But it won’t be persisted across multiple enclosing domains. I would even recommend using sessionStorage, since the goal is just to have nothing more than a seamless session. You can then roll your own cookie mechanism using a different set of headers; the only thing you can’t implement is HttpOnly cookies.
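For illustration, here is a minimal browser-side sketch of that "roll your own cookie mechanism" idea, running inside the third-party iframe. The X-Aux-Cookie / X-Set-Aux-Cookie header names are hypothetical (mirroring the question's Set-AuxCookie example), and for cross-origin fetches the server would need to expose the response header via Access-Control-Expose-Headers.

```typescript
// Runs inside the third-party iframe: localStorage stands in for the cookie
// jar, and the values travel in custom headers instead of Cookie/Set-Cookie.
const STORAGE_KEY = "auxCookies";

function loadAuxCookies(): Record<string, string> {
  try {
    return JSON.parse(localStorage.getItem(STORAGE_KEY) ?? "{}");
  } catch {
    return {};
  }
}

function saveAuxCookies(cookies: Record<string, string>): void {
  localStorage.setItem(STORAGE_KEY, JSON.stringify(cookies));
}

// Attach the stored values to each request and pick up updates from the
// response, much like a browser does with Cookie / Set-Cookie.
async function fetchWithAuxCookies(url: string, init: RequestInit = {}): Promise<Response> {
  const cookies = loadAuxCookies();
  const headers = new Headers(init.headers);
  headers.set("X-Aux-Cookie", JSON.stringify(cookies));

  const response = await fetch(url, { ...init, headers });

  const update = response.headers.get("X-Set-Aux-Cookie");
  if (update) {
    saveAuxCookies({ ...cookies, ...JSON.parse(update) });
  }
  return response;
}
```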
They can, but ITP won’t let the JavaScript on the enclosing page store cookies (at least, not if your third party domain was flagged as a tracker by Safari).
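As a rough sketch of that postMessage handoff (not any vendor's actual implementation), the two halves below show the iframe asking the enclosing page to persist a value in the enclosing page's own storage and to echo it back later. The origins and message shapes are made-up assumptions, and the two halves would normally live in separate scripts on the two different pages.

```typescript
// --- Inside the third-party iframe ---------------------------------------
const PARENT_ORIGIN = "https://publisher.example"; // assumption

// Ask the parent page to store a value on our behalf.
function storeOnParent(key: string, value: string): void {
  window.parent.postMessage({ type: "store", key, value }, PARENT_ORIGIN);
}

// Ask the parent page to send a value back; the reply arrives as a message.
function requestFromParent(key: string): void {
  window.parent.postMessage({ type: "load", key }, PARENT_ORIGIN);
}

window.addEventListener("message", (event) => {
  if (event.origin !== PARENT_ORIGIN) return;
  if (event.data?.type === "value") {
    console.log("parent returned", event.data.key, "=", event.data.value);
  }
});

// --- On the enclosing first-party page ------------------------------------
const IFRAME_ORIGIN = "https://widgets.example"; // assumption

window.addEventListener("message", (event) => {
  if (event.origin !== IFRAME_ORIGIN) return;
  const source = event.source as Window;
  if (event.data?.type === "store") {
    localStorage.setItem(event.data.key, event.data.value);
  } else if (event.data?.type === "load") {
    source.postMessage(
      { type: "value", key: event.data.key, value: localStorage.getItem(event.data.key) },
      IFRAME_ORIGIN
    );
  }
});
```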
Yeah, the description of “link decoration” technically doesn’t mention this workaround, but Apple has probably updated, or will update, its classifier to handle it.
Yes; if a first-party webpage sends a request to its server and the server sets a cookie in the response headers, then that isn’t blocked by ITP, even if the page has an iframe to a tracking site. They say blocking that isn’t their goal.
Yes; in fact your first-party site can just redirect to google.com and back quickly (like with OAuth) and thereby inform Google of whatever you wanted, without cookies. Google’s JavaScript can do this as well, if you allow it. Then that JavaScript can just load your Google-hosted subdomain in an iframe and set a cookie that persists for years, tracking the user. However, ITP 2.3 seems to have also added mitigations for this, so you might use A records instead? https://cookiesaver.io/archives/analytics-guides/cname-cloaking-mitigation-eliminates-safari-itp-workarounds/
Probably the StackExchange network uses a version of #5

Related

Can code in a (third-party) iframe set first-party cookies?

Is it possible for any code loaded in an iframe to set first-party cookies?
E.g. I have a site www.my-website.com and I need to load some content from a third-party provider www.third-party-site.com for legitimate purposes. But (for obvious security reasons) I do not want to allow them to set (or read) any first-party cookies (i.e. cookies with the domain www.my-website.com – they are welcome to set any cookies on their own domain www.third-party-site.com).
Is the above possible under certain conditions or not possible at all:
iframe is not sandboxed?
if the iframe code loads, say, an image whose response sets cookies via headers
any other conditions?
do some browsers allow it while others do not?
My understanding is that this is not possible at all, and most answers on SO etc. seem to support this – but some point to examples where Facebook has a workaround under certain conditions, etc. Hence I thought to clarify.
By design, no. That's not to say that workarounds have not been found, or that bugs have permitted it in the past, but they are very much bugs and not things you should expect or try to use – leaking a first-party cookie to a third party would qualify as a major security problem.
To reduce exposure, you should ensure that appropriate cookie flags are set: Secure, to prevent cookies from being sent over insecure connections; HttpOnly, to prevent JavaScript from accessing them; and, if available, SameSite, to mitigate CSRF attacks. You should also set HTTP headers to control framing of your own site, to avoid clickjacking, and other headers like CSP to keep tighter control over content sources.
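As a hedged example of what those flags look like in practice, here is a small Node/Express handler (Express is only an assumption for illustration; the same Set-Cookie attributes and headers apply to any server):

```typescript
import express from "express";

const app = express();

app.get("/login", (_req, res) => {
  // Secure: only sent over HTTPS; httpOnly: invisible to JavaScript;
  // sameSite: not sent on cross-site requests, which mitigates CSRF.
  res.cookie("session", "opaque-session-id", {
    secure: true,
    httpOnly: true,
    sameSite: "lax",
  });

  // Framing and content-source controls mentioned above.
  res.set("X-Frame-Options", "DENY");
  res.set("Content-Security-Policy", "default-src 'self'");
  res.send("logged in");
});

app.listen(3000);
```

On the wire this produces something along the lines of Set-Cookie: session=opaque-session-id; HttpOnly; Secure; SameSite=Lax.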
There is one very simple way of avoiding all the negative consequences of third-party cookies: don't have any. It's possible to do a great many things without them, and it means you may not need to display cookie notifications or seek consent.
Since you're asking an abstract question about this, you might get a better answer on Security Stack Exchange.

Parallel website running alongside my original website

We have been working on a gaming website. Recently, while making note of the major traffic sources, I noticed a website that is a carbon copy of our website. It uses our logo, and everything is the same as ours, but it has a different domain name. It can't just be that their domain name points to our domain, because in several places the links look like ccwebsite/our-links. That website even has links to some images as ccwebsite/our-images.
What has happened? How could they have done that? What can I do to stop this?
There are a number of things they might have done to copy your site, including but not limited to:
Using a tool to scrape a complete copy of your site and place it on their server
Using their DNS name to point to your site
Manually re-creating your site as their own
Responding to requests to their site by scraping yours in real time and returning that as the response
etc.
What can I do to stop this?
Not a whole lot. You can try to prevent direct linking to your content by requiring referrer headers for your images and other resources so that requests need to come from pages you serve, but 1) those can be faked and 2) not all browsers will send those so you'd break a small percentage of legitimate users. This also won't stop anybody from copying content, just from "deep linking" to it.
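If you do want to try the referrer check anyway, a minimal sketch (assuming a Node/Express server; the hostname is a placeholder) might look like this, with the caveats above still applying:

```typescript
import express from "express";

const app = express();
const ALLOWED_HOST = "www.my-website.example"; // your own hostname (placeholder)

// Naive hotlink check: only serve images when the Referer is one of our pages.
// As noted above, the header can be faked and some browsers omit it entirely.
app.use("/images", (req, res, next) => {
  const referer = req.get("Referer");
  let refererHost = "";
  try {
    refererHost = referer ? new URL(referer).hostname : "";
  } catch {
    // malformed Referer header; treat it as missing
  }
  if (refererHost !== ALLOWED_HOST) {
    res.status(403).send("Forbidden");
    return;
  }
  next();
});

app.use("/images", express.static("public/images"));

app.listen(3000);
```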
Ultimately, by having a website you are exposing that information to the internet. On a technical level anybody can get that information. If some information should be private you can secure that information behind a login or other authorization measures. But if the information is publicly available then anybody can copy it.
"Stopping this" is more of a legal/jurisdictional/interpersonal concern than a technical one I'm afraid. And Stack Overflow isn't in a position to offer that sort of advice.
You could run your site with some lightweight authentication. Just issue a cookie passively when they pull a page, and require the cookie to get access to resources. If a user visits your site and then the parallel site, they'll still be able to get in, but if a user only knows about the parallel site and has never visited the real site, they will just see a crap ton of broken links and images. This could be enough to discourage your doppelganger from keeping his site up.
Another (similar but more complex) option is to implement a CSRF mitigation. Even though this isn't a CSRF situation, the same mitigation will work. Essentially you'd issue a cookie as described above, but in addition insert the cookie value in the URLs for everything and require them to match. This requires a bit more work (you'll need a filter or module inserted into the pipeline) but will keep out everybody except your own users.
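A rough sketch of that cookie-plus-URL-token idea, assuming a Node/Express stack with cookie-parser (purely illustrative, not a drop-in module):

```typescript
import express from "express";
import cookieParser from "cookie-parser";
import crypto from "node:crypto";

const app = express();
app.use(cookieParser());

// Passively issue the cookie when a page is served, and embed the same token
// in every resource URL rendered into the page.
app.get("/", (req, res) => {
  const token = req.cookies.visitToken ?? crypto.randomUUID();
  res.cookie("visitToken", token, { httpOnly: true, sameSite: "lax" });
  res.send(`<img src="/assets/logo.png?t=${token}">`);
});

// Resources are only served when the URL token matches the cookie, so a
// visitor who has only ever seen the copycat site gets 403s.
app.use("/assets", (req, res, next) => {
  if (!req.cookies.visitToken || req.query.t !== req.cookies.visitToken) {
    res.status(403).send("Forbidden");
    return;
  }
  next();
});

app.use("/assets", express.static("public/assets"));

app.listen(3000);
```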

Is there something a site can do to incorporate third party cookies

I work for an e-commerce site. Part of what we do is to offer customized items to some clients. Recently some non-technical management promised that we could incorporate our check-out process into one such client's website. The only way we've figured out how to do this is by using an iframe (I know, I don't like it either). The issue is that most customers of this site are unable to check out because we use cookies to determine which custom items to display. Browsers are recognizing our cookies as third party and almost everybody has third party cookies turned off, as they should. I'm going to be shocked if the answer is yes, but is there any workaround for this? ie can the site hosting our iframe somehow supply the necessary cookie?
Try an invisible, interstitial page.
Essentially the hosting site would issue a redirect to a site within your domain, which is then free to set cookies (because at this point it is actually the first party). Then your site immediately redirects back to the hosting site. At this point your newly-created cookies will be invisible to the hosting site but visible to your iframed page henceforth.
Unfortunately the hosting site will have to do this every time a cookie is to be updated but the double-redirect can happen so quickly they'll hardly notice. Hopefully your system only needs the cookies to be set once.
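A minimal sketch of the interstitial endpoint on the checkout provider's own domain (Express is an assumption, and the whitelisted hostname is a placeholder):

```typescript
import express from "express";
import crypto from "node:crypto";

const app = express();

// The hosting site sends the user here in a top-level navigation. At that
// moment this domain is the first party, so it can set its cookie before
// bouncing straight back to the page that embeds the iframe.
const ALLOWED_RETURN_HOSTS = new Set(["client-shop.example"]); // placeholder

app.get("/cookie-handshake", (req, res) => {
  let target: URL;
  try {
    target = new URL(String(req.query.returnTo ?? ""));
  } catch {
    res.status(400).send("Bad returnTo");
    return;
  }
  if (!ALLOWED_RETURN_HOSTS.has(target.hostname)) {
    res.status(400).send("returnTo not allowed");
    return;
  }

  res.cookie("checkoutSession", crypto.randomUUID(), {
    secure: true,
    sameSite: "none", // so the cookie is later sent inside the iframe as well
  });
  res.redirect(302, target.toString());
});

app.listen(3000);
```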
Instead of using a cookie, pass the information in each URL request as name/value pairs.
It is a bit of a pain to add the name/value pair to every URL... I know... oh well... it will work.
I'm going to be shocked if the answer is yes, but is there any workaround for this? ie can the site hosting our iframe somehow supply the necessary cookie?
Your iframed page itself, which is the third party in this scenario, could send a P3P cookie policy header – some browsers then accept third-party cookies by default, whereas others (mainly Safari) will not be convinced to do so at all unless the user changes the default settings themselves.
What you could also do is pass the session id not (only) by cookie, but as a GET or POST parameter as well – e.g. under PHP this can be done quite easily by configuring the session options. You should consider whether that's worth the slightly increased risk of session stealing.
The interstitial page solution should work but it might be a lot of trouble for your hosting site, so here's another solution that will allow you to work cookieless.
Write an HttpModule that responds to the BeginRequest event, reads the querystring, and inserts corresponding cookie headers into the Context.HttpRequest object (Note: you can't use AddCookie, you have to use AddHeader, because cookies added by a module directly are disposed of before they hit your application proper). That way the hosting site can simply issue a request (within the iFrame) that contains the necessary value in the querystring, the module will convert it into a cookie (that only exists in memory, not on the wire), and your application will be deceived into thinking that there's a cookie there. No code changes required, you just need to add the module in web.config.
This only works if you are using IIS 7.0+ in integrated pipeline mode. If you're on an earlier version of IIS or if you have to run in classic mode, you'll need an ISAPI filter instead.
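For readers not on IIS, a rough analogue of the same trick as a Node/Express middleware (illustrative only; this is not the ASP.NET module described above):

```typescript
import express from "express";

const app = express();

// If the iframe URL carries ?sid=..., rewrite it into an in-memory Cookie
// header before the rest of the app sees the request, so downstream code
// that expects a cookie keeps working unchanged. Nothing extra is sent back
// to the client on the wire.
app.use((req, _res, next) => {
  const sid = req.query.sid;
  if (typeof sid === "string" && !req.headers.cookie?.includes("sid=")) {
    req.headers.cookie = [req.headers.cookie, `sid=${sid}`]
      .filter(Boolean)
      .join("; ");
  }
  next();
});

app.get("/checkout", (req, res) => {
  res.send(`cookies seen by the app: ${req.headers.cookie ?? "(none)"}`);
});

app.listen(3000);
```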
Ryan, John,
For the Chrome v80 update with the SameSite flags, we want to set SameSite=None; Secure for the site hosting our iframe and somehow supply the necessary SameSite=None; Secure cookie. We have an Apache 2.2 and Tomcat 6 setup, so I would appreciate a solution and advice on how to make it work. Currently, with the flag enabled, the iframe is not punching out successfully.
Thanks

Bypass specific URL from Akamai if certain cookie exists

I would like Akamai not to cache certain URLs if a specified cookie exists (i.e. if the user is logged in on specific pages). Is there any way we can do this with Akamai?
The good news is that I have done exactly this in the past for the Top Gear site (www.topgear.com/uk). The logic goes that if a cookie is present (in this case "TGCACHEKEY"), then the Akamai cache is to be bypassed for certain URL paths. This basically turns off Akamai caching of HTML pages when logged in.
The bad news is that you require an Akamai consultant to make this change for you.
If this isn't an option for you, then Peter's suggestions are all good ones. I considered all of these before implementing the cookie based approach for Top Gear, but in the end none were feasible.
Remember also that Akamai strips cookies for cached resources by default. That may or may not affect you in your situation.
The Edge Server doesn't check for a cookie before it makes the request to your origin server, and I have never seen anything like that in any of their menus, configuration screens, or documentation.
However, there are a few ways I can think of that you can get the effect that I think you're looking for.
You can specify in the configuration settings for the respective digital property what path(s) or URL(s) you don't want it to cache. If you're talking about a logged on user, you might have a path that only they would get to or you could set up such a thing server side. E.g. for an online course you would have www.course.com/php.html that anybody could get to whereas you might use www.course.com/student/php-lesson-1.html for the actual logged on lessons content. Specifying that /student/* would not be cached would solve that.
If you are serving the same pages to both logged on and not-logged on users and can't do it that way, you could check server-side if they're logged on and if so add a cache-breaker to the links so when they follow a link a cache-breaker is automatically added. You could also do this client side if you want, but it would be more secure and faster to do it server-side. As a note on this, this could be userid-random#. That would keep it unique enough when combined with the page that nobody else would request it and get the earlier 'cache-broken' page.
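A small sketch of that cache-breaker idea (the function name is a placeholder) for whatever layer renders links server-side:

```typescript
// Append a per-user token to internal links for logged-in users so Akamai
// treats those URLs as distinct, uncacheable objects.
function addCacheBreaker(href: string, userId: string): string {
  const url = new URL(href, "https://www.course.com");
  // userid-random#, as suggested above, keeps the URL unique per user/request.
  url.searchParams.set("cb", `${userId}-${Math.floor(Math.random() * 1e9)}`);
  return url.toString();
}

// e.g. while rendering a page for a logged-in user:
// `<a href="${addCacheBreaker("/student/php-lesson-1.html", user.id)}">Lesson 1</a>`
```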
If neither of the above is workable, there is one other way I can think of, which is a bit unconventional to say the least, but it would work. Create a symbolically linked directory in your document root with another name so that you can apply the first option and exempt it from caching. Then you check if the user is logged on and, if so, prepend the extra directory to the links. From Akamai's point of view www.mysite.com/logged-on/page.html can be exempt from cache whereas www.mysite.com/content/page.html is cached. On your server, if /logged-on/ symbolically links to /content/ then you're all set.
When they log in you could send them to a subdomain which is set up as a ServerAlias, so on your side it's the same, but Akamai has different cache-handling rules for it.
Following the same approach as @llevera's answer, you can use cookies on Cloudflare without needing engineers to make the change for you.
Using that sort of cookie to bypass caching is a technique that is becoming more popular over time, and even big companies like Magento are using it for the Magento 2 platform.
But the solutions above are still valid. Maybe Akamai supports that already now – we are in 2017!

Cross Domain User Tracking

We have several websites on different domains and I'd like to be able to track users' movements on these sites.
Obviously cookies are not feasible, because they don't cross domain borders.
I could look at a combination of IP address and User Agent, but there are some cases where that does not work.
I don't want to use flash or other plugins.
Any ideas? Or am I doomed to rely on the IP/User_Agent combination?
You can designate one domain or subdomain for tracking and have it serve a 1x1 pixel image which you include in all pages you would like to track. Serve a cookie with the image, look at the tracking domain's server logs, voilà.
This solution requires no JavaScript. Note, however, that the cookie served with the pixel is still a third-party cookie from the point of view of the pages embedding it, so browsers that block third-party cookies will block it too.
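A minimal sketch of such a pixel endpoint, assuming a Node/Express server on the dedicated tracking domain (all names are placeholders):

```typescript
import express from "express";
import crypto from "node:crypto";

const app = express();

// Smallest transparent GIF, served from the dedicated tracking domain.
const PIXEL = Buffer.from(
  "R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7",
  "base64"
);

app.get("/pixel.gif", (req, res) => {
  // Re-use the visitor id if the cookie comes back, otherwise mint one.
  const existing = req.headers.cookie?.match(/visitorId=([^;]+)/)?.[1];
  const visitorId = existing ?? crypto.randomUUID();

  res.cookie("visitorId", visitorId, {
    secure: true,
    sameSite: "none", // required for a cookie used in third-party contexts
    maxAge: 365 * 24 * 3600 * 1000,
  });

  // The server log line is the actual tracking record.
  console.log(visitorId, req.headers.referer ?? "(no referer)");

  res.set("Cache-Control", "no-store");
  res.type("image/gif").send(PIXEL);
});

app.listen(3000);

// Embedded on every page to be tracked:
// <img src="https://tracking.example/pixel.gif" width="1" height="1" alt="">
```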
First, let's make sure the user agent is sending cookies:
If getCookie("c") == null then setCookie("c", "anyValue")
Then let the request finish (aka wait for next request)
Let's call our tracker cookie uaid.
If GET http://child.com/any-page and getCookie("c") is not null and getCookie("uaid") is null...
Redirect to http://parent.com/give-me-a-uaid?returnTo=http://child.com/any-page
On http://parent.com/give-me-a-uaid, check for cookie uaid
If not exists, create it and add it to response. If it exists, get its value.
Redirect to http://child.com/any-page?uaid=valueOfParentsUAIDCookie
Child.com sets cookie uaid with valueOfParentsUAIDCookie
Redirect to http://child.com/any-page
And of course, you are validating input, and white-listing your redirect URLs :)
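A rough sketch of the parent.com endpoint in that flow, assuming a Node/Express server with cookie-parser (hostnames are taken from the steps above; everything else is illustrative):

```typescript
import express from "express";
import cookieParser from "cookie-parser";
import crypto from "node:crypto";

const app = express();
app.use(cookieParser());

// Only redirect back to whitelisted child sites, per the note above.
const ALLOWED_CHILDREN = new Set(["child.com", "www.child.com"]);

app.get("/give-me-a-uaid", (req, res) => {
  let returnTo: URL;
  try {
    returnTo = new URL(String(req.query.returnTo ?? ""));
  } catch {
    res.status(400).send("Bad returnTo");
    return;
  }
  if (!ALLOWED_CHILDREN.has(returnTo.hostname)) {
    res.status(400).send("returnTo not allowed");
    return;
  }

  // Reuse the existing uaid cookie on parent.com, or mint a new one.
  const uaid = req.cookies.uaid ?? crypto.randomUUID();
  res.cookie("uaid", uaid, { maxAge: 365 * 24 * 3600 * 1000 });

  // Hand the value back to child.com in the URL; child.com then sets its own
  // uaid cookie and redirects to the clean URL.
  returnTo.searchParams.set("uaid", uaid);
  res.redirect(302, returnTo.toString());
});

app.listen(3000);
```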
This question is closely related to the Question Accessing Domain Cookies within an iFrame on Internet Explorer.
For Internet Explorer I need to take P3P Policies into account and set an additional P3P HTTP-Header to allow images to set cookies across domain borders. Then I can use simon's suggestion.
You can follow the same concept used by Google Analytics: injecting JavaScript into the pages you want to track.
You do not give any context for your situation – just the basic problem. So it is difficult to give an answer that clearly fits. However, here are some techniques/mechanisms for passing information from one page to another, regardless of what domain is involved.
include a hyperlink to a 1x1 transparent GIF image (sometimes called a "beacon")
rely on referrer information in HTTP request headers to identify the page a hyperlink is on
include extra parameters in hyperlinks to the other site – assuming you run both sites
buy the services of a company like Akamai to do user tracking for you
possibly use a cross-domain cookie mechanism in the future, if a standard is ever approved
Which technique to use really comes down to whether you can place software on all of the sites (servers) of interest that the user will visit – or whether you cannot place your software on all of them.