Can code in a third-party iframe set first-party cookies?

Is it possible for any code loaded in an iframe to set first-party cookies?
E.g. I have a site, www.my-website.com, and I need to load some content from a third-party provider, www.third-party-site.com, for legitimate purposes. But (for obvious security reasons) I do not want them to be able to set (or read) any first-party cookies, i.e. cookies with the domain www.my-website.com; they are welcome to set any cookies on their own domain, www.third-party-site.com.
Is the above possible under certain conditions, or not possible at all?
- if the iframe is not sandboxed?
- if the iframe code loads, say, an image whose response sets cookies via headers?
- under any other conditions?
- do some browsers allow it while others do not?
My understanding is that this is not possible at all, and most answers on SO etc. seem to support this - but some point to examples where Facebook has a workaround under certain conditions, etc. Hence I thought to clarify.

By design, no. That's not to say that workarounds have not been found, or that bugs have permitted it in the past, but they are very much bugs and not things you should expect or try to use - leaking a first-party cookie to a third party would qualify as a major security problem.
To reduce exposure, you should ensure that appropriate cookie flags are set: Secure, to prevent cookies from being sent over insecure connections; HttpOnly, to prevent JavaScript from accessing them; and, if available, SameSite, to mitigate CSRF attacks. You should also set HTTP headers to control framing of your own site, to avoid clickjacking, and other headers like CSP to keep tighter control over sources.
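As a rough illustration, here is a minimal sketch of those flags and headers, assuming a Node.js server (any server-side stack can set the same headers):

```typescript
// Minimal sketch: the cookie flags and framing/CSP headers described above,
// using Node's built-in http module.
import * as http from "http";

http.createServer((req, res) => {
  // Secure: only sent over HTTPS. HttpOnly: invisible to client-side JavaScript.
  // SameSite: withheld from most cross-site requests, which mitigates CSRF.
  res.setHeader(
    "Set-Cookie",
    "session=opaque-session-id; Secure; HttpOnly; SameSite=Lax; Path=/"
  );
  // Control who may frame the site, to avoid clickjacking...
  res.setHeader("X-Frame-Options", "DENY");
  // ...and use CSP to keep tighter control over sources.
  res.setHeader(
    "Content-Security-Policy",
    "frame-ancestors 'none'; default-src 'self'"
  );
  res.end("ok");
}).listen(8080);
```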
There is one very simple way of avoiding all the negative consequences of third-party cookies: don't have any. It's possible to do a great many things without them, and it means you may not need to display cookie notifications or seek consent.
Since you're asking an abstract question about this, you might get a better answer on Security Stack Exchange.


What exactly does Safari ITP do?

I am very confused as to how Safari ITP 2.3 works in certain respects, and why sites can’t easily circumvent it. I don’t understand under what circumstances limits are applied, what the exact limits are, to what they are applied, and for how long.
To clarify my question I broke it down into several cases. I will be referring to Apple’s official blog post about ITP 2.3 [1] which you can quote from, but feel free to link to any other authoritative or factually correct sources in your answer.
For third-party sites loaded in iframes:
Why can’t they just use localStorage to store the values of cookies, and send this data back and forth not as actual browser cookie headers 🍪, but as data in the body of the request or in a header like Set-AuxCookie? Similarly, they can parse the response to update localStorage. What limits does ITP actually place on localStorage in third-party iframes?
If the localStorage is frequently purged (see question 1), why can’t they simply use postMessage to tell a script on the enclosing website to store some information (perhaps encrypted) and then spit it back whenever it loads an iframe?
For sites that use link decoration:
I still don’t understand what the limits on localStorage are for third-party sites in iframes which did NOT get classified as link-decorator sites. But let’s say they are link-decorator sites. According to [1], Apple only starts limiting things further if there is a querystring or fragment. But can’t a website rather trivially store this information in the URL path before the querystring, i.e. /in/here without ?in=here … certainly large companies like Google can trivially choose to do that?
In the case where a site has been labeled a tracking site, does that mean all its non-cookie data is limited to 7 days? What about cookies set by the server, aren’t they exempted? So then simply make a request to your server to set the cookie instead of using JavaScript. After all, the operator of the site is very likely to also have access to its HTTP server and app code.
For all sites:
Why can’t a service like Google Analytics or Facebook’s widgets simply convince a site to additionally add a CNAME to its DNS and get Google’s and Facebook’s servers under a subdomain like gmail.mysite.com or analytics.mysite.com? And then, boom, they can read and set cookies again, in some cases even on the top-level domain for website owners who don’t know better. Doesn’t this completely defeat the goals of Apple’s ITP, since Google and Facebook have now become a “second party” in some sense?
Here on StackOverflow, when we log out on iOS Safari, the StackOverflow network is able to log us out of multiple sites at once … how is that even accomplished if no one can track users across websites? I have heard it said that “second-party cookies” can still be stored, but what exactly makes a second-party cookie different from a third-party one?
My question is broken down into 6 cases but the overall theme is, in each case: how does Apple’s latest ITP work in that case, and how does it actually block all cases of potentially malicious tracking (to the point where a well-funded company can’t just do the workarounds above) while at the same time allowing legitimate use cases?
[1] https://webkit.org/blog/9521/intelligent-tracking-prevention-2-3/
I am not sure if the answers below are correct; please comment if they are not:
It seems applications can use localStorage with no problem, for up to 7 days. But it won’t be persisted across multiple enclosing domains. I would even recommend using sessionStorage, since the goal is just to have nothing more than a seamless session. You can then roll your own cookie mechanism using a different set of headers, as in the first sketch after this list; the only thing you can’t implement is HttpOnly cookies.
They can, but ITP won’t let the JavaScript on the enclosing page store cookies (at least, not if your third-party domain was flagged as a tracker by Safari). A sketch of this postMessage relay follows this list.
Yeah, the description of “link decoration” technically doesn’t mention this workaround, but Apple probably has updated, or will update, its classifier to handle it.
Yes: if a first-party webpage sends a request to the server and the server sets a cookie in the response headers, those aren’t blocked by ITP, even if the page has an iframe to a tracking site. They say that’s not their goal.
Yes; in fact your first-party site can just redirect to google.com and back quickly (like with OAuth) and thereby inform Google of whatever you wanted, without cookies. Google’s JavaScript can do this as well, if you allow it. Then the JavaScript can just load your Google-hosted subdomain in an iframe and set a cookie that persists for years, tracking the user. However, ITP 2.3 seems to have also added mitigations to this, so you might use A records instead? https://cookiesaver.io/archives/analytics-guides/cname-cloaking-mitigation-eliminates-safari-itp-workarounds/
Probably the StackExchange network uses a version of #5.
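For #1, here is a minimal sketch of such a hand-rolled cookie mechanism. The Set-AuxCookie response header comes from the question; the X-Aux-Cookie request header and the localStorage key are made up for illustration, and cross-origin the server would have to list Set-AuxCookie in Access-Control-Expose-Headers for the script to read it:

```typescript
// Browser-side sketch: emulate cookies with localStorage plus custom headers,
// so no real Cookie/Set-Cookie headers are involved at all.
async function fetchWithAuxCookie(url: string): Promise<Response> {
  // Send the stored value with the request instead of a Cookie header.
  const aux = localStorage.getItem("auxCookie");
  const res = await fetch(url, {
    headers: aux !== null ? { "X-Aux-Cookie": aux } : {},
  });
  // Parse the response for an updated value instead of honoring Set-Cookie.
  const updated = res.headers.get("Set-AuxCookie");
  if (updated !== null) {
    localStorage.setItem("auxCookie", updated);
  }
  return res;
}
```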
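And for #2, the postMessage relay the question describes would look roughly like this (the origins reuse the example domains from the first question, the message shape is made up, and as noted above ITP limits how long script-written storage on the enclosing page survives):

```typescript
// In the third-party iframe: ask the enclosing page to store a value
// under the first-party origin, and request it back on the next load.
window.parent.postMessage(
  { type: "store", key: "visitorId", value: "abc123" },
  "https://www.my-website.com" // only talk to the expected parent origin
);

// On the enclosing page: persist values for the iframe and echo them back.
window.addEventListener("message", (event) => {
  if (event.origin !== "https://www.third-party-site.com") return;
  if (event.data?.type === "store") {
    localStorage.setItem(event.data.key, event.data.value);
  } else if (event.data?.type === "load") {
    (event.source as Window | null)?.postMessage(
      { type: "loaded", key: event.data.key, value: localStorage.getItem(event.data.key) },
      event.origin
    );
  }
});
```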

Bypass specific URLs from Akamai if a certain cookie exists

I would like Akamai not to cache certain URLs if a specified cookie exists (i.e. if the user is logged in on specific pages). Is there any way we can do this with Akamai?
The good news is that I have done exactly this in the past for the Top Gear site (www.topgear.com/uk). The logic goes that if a cookie is present (in this case "TGCACHEKEY") then the Akamai cache is to be bypassed for certain URL paths. This basically turns off Akamai caching of HTML pages when logged in.
The bad news is that you require an Akamai consultant to make this change for you.
If this isn't an option for you, then Peter's suggestions are all good ones. I considered all of these before implementing the cookie based approach for Top Gear, but in the end none were feasible.
Remember also that Akamai strips cookies for cached resources by default. That may or may not affect you in your situation.
The Edge Server doesn't check for a cookie before it makes the request to your origin server, and I have never seen anything like that in any of their menus, conf screens, or documentation.
However, there are a few ways I can think of that you can get the effect that I think you're looking for.
You can specify, in the configuration settings for the respective digital property, which path(s) or URL(s) you don't want it to cache. If you're talking about a logged-on user, you might have a path that only they would get to, or you could set up such a thing server-side. E.g. for an online course you would have www.course.com/php.html that anybody could get to, whereas you might use www.course.com/student/php-lesson-1.html for the actual logged-on lesson content. Specifying that /student/* is not to be cached would solve that.
If you are serving the same pages to both logged-on and not-logged-on users and can't do it that way, you could check server-side whether they're logged on and, if so, automatically add a cache-breaker to the links they follow (see the sketch after this list). You could also do this client-side if you want, but it would be more secure and faster to do it server-side. As a note, the cache-breaker could be userid-random#. That would keep it unique enough, when combined with the page, that nobody else would request it and get the earlier 'cache-broken' page.
If neither of the above is workable, there is one other way I can think of, which is a bit unconventional to say the least, but it would work. Create a symbolically linked directory in your document root with another name, so that you can apply the first option and exempt it from caching. Then you check whether the user is logged on and, if so, prepend the extra directory to the links. From Akamai's point of view, www.mysite.com/logged-on/page.html can be exempt from cache where www.mysite.com/content/page.html is cached. On your server, if /logged-on/ symbolically links over to /content/ then you're all set.
When they log in you could send them to a subdomain which is set up as a ServerAlias, so on your side it's the same, but Akamai has different cache-handling rules for it.
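To make the cache-breaker option concrete, here is a sketch of the userid-random# scheme described above (the function name and query parameter are illustrative):

```typescript
// Server-side sketch: append a per-user cache-breaker to links served to
// logged-on users, so the CDN never hands them a cached public page.
function addCacheBreaker(url: string, userId: string): string {
  // userid-random#: unique enough, combined with the page, that nobody
  // else will request the same URL and receive the 'cache-broken' copy.
  const breaker = `${userId}-${Math.floor(Math.random() * 1e9)}`;
  return url + (url.includes("?") ? "&" : "?") + "cb=" + breaker;
}

// e.g. addCacheBreaker("/student/php-lesson-1.html", "u42")
//   => "/student/php-lesson-1.html?cb=u42-123456789"
```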
Following the same answer as #llevera, you can use cookies on CloudFlare without the intervention of engineers to make the change for you.
Having that sort of cookie to bypass caching is a technique that is becoming more popular with time, and even big companies like Magento are using it for the Magento 2 platform.
But the solutions above are still valid. Maybe Akamai supports that already now - we are in 2017!

Do we still need to worry about users turning off cookies?

I've noticed that a lot of sites don't bother anymore with work-arounds so users who have turned their cookies off can still get the same experience on the site. Has that problem just gone away in modern web development? Have we gotten to a point where nobody does it, so we don't need to bother?
I think I put this in the same category as JavaScript. Most people will have cookies enabled, but there will be a few who have them turned off. There isn't the scare there was in the mid-'90s about evil corporations tracking you all over the net, etc. People have become more accepting of how the web works and what is required to have the convenience of websites remembering who you are, etc.
Some people still turn off cookies every once in a while, usually because they wanted to test something and then forgot to turn them back on. Nowadays most web apps require cookies, so I think it's perfectly acceptable that, instead of complex workarounds to provide the same user experience with or without cookies, you live with just a simple check and a message stating that without cookies the user won't be able to use the site.
There are lots of major websites that behave this way.
My 2c: cookies are good by default and JavaScript is evil by default.
As to what general user sentiment is... I'd still do cookie detection, so that you can display a meaningful error rather than simply not working if your users are blocking cookies for whatever reason. Don't bother trying to work around it, though.
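A simple detection check could look something like this (a sketch; the probe cookie name and the message are arbitrary):

```typescript
// Detect cookie support by actually trying to set one;
// navigator.cookieEnabled alone is not reliable in every browser.
function cookiesEnabled(): boolean {
  try {
    document.cookie = "cookietest=1; SameSite=Lax";
    const ok = document.cookie.indexOf("cookietest=") !== -1;
    // Clean up the probe cookie by expiring it.
    document.cookie = "cookietest=; expires=Thu, 01 Jan 1970 00:00:00 GMT";
    return ok;
  } catch {
    return false;
  }
}

if (!cookiesEnabled()) {
  // Show a meaningful error instead of silently misbehaving.
  document.body.textContent =
    "This site requires cookies. Please enable them and reload the page.";
}
```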
I'm going to guess that it'd be worthwhile running a test to see specifically whether your visitors have cookies turned off, because different groups will have different issues (whether it's paranoia, security restrictions, etc.). A website catering to government employees might see higher percentages of cookie non-acceptance than other sites.
As some browsers (or plugins) allow customizing your acceptance of cookies by server or domain, it's possible that even two sites with identical user populations might have different levels of 'trust', if the users believe that one site seems shady.

What are the risks of storing a user password in a Cookie, when the connection is via https?

A Note
I have a very good understanding of sessions and the theory of secure web-based authentication, etc., so please don't start with the basics, or give ambiguous answers. I am not looking for Best Practices, because I am aware of them. I am looking for the real risks behind them, that make the Best Practices what they are.
I have read, and agree with, the principle that nothing more than a Session identifier should be stored in a Cookie at any given time.
The Story
However... I've inherited a rusty old app that stores the Username, Password, and an additional ID, in a Cookie, which is checked throughout the site as verification/authorization.
This site is always (can only be) accessed via HTTPS, and depending on your stance, is a "low-risk" website.
The application, in its current state, cannot be re-written in such a way as to handle Sessions - to properly implement such a thing would require, essentially, re-writing the entire application.
The Question
When suggesting to the-powers-that-be that storing their users' IDs/passwords in plaintext, in a Cookie, is an extremely bad idea, what real risks are involved, considering the connection is always initiated and conducted via HTTPS?
For example: is the only obvious way to compromise this information via Physical Access to the machine containing the Cookie? What other real risks exist?
HTTPS just protects against a man-in-the-middle attack by encrypting the data that goes across the wire. The information would still be in plain text on the client. So anything on the client's computer can go through that cookie information and extract the pertinent information.
Some other risks include cross-site scripting attacks which can enable cookie theft and who knows what kind of browser vulnerabilities which can enable cookie theft.
A given browser's "cookie jar" might not be stored securely, i.e., an attacker might be able to read it without physical access to the machine, over a LAN, or from a distributed filesystem (e.g., if the machine's storing user homes on a storage server, to allow for roaming), or via an application running on the machine.
Some browsers keep cookies in a file that can be displayed on the computer. IE6 comes to mind.
It seems to me that cookies are not all that restricted to a single site. Lots of advertising uses cookies across multiple sites. If I go to NextTag and look for a Nikon D700 camera, then I see NextTag advertisements on slashdot.org. This is an example of a cross-site cookie. Most users use the same password all over the web, so if you store the password to one site and make it even a little easy to get to, then malicious folks will sooner or later get to it.
To summarize: this would be a very, very, very bad idea. On sites that I work on we don't save users' passwords at all. We convert them to a hash key and save the hash key. That way we can validate the user, but if we lose the content then there is no exposure of passwords. And this is on the server side, not the browser side!
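For illustration, hashing along those lines might look like this (a sketch using Node's built-in scrypt; the salt and key lengths are just reasonable defaults):

```typescript
// Server-side only: store a salted hash of the password, never the password,
// so a leaked database does not expose the passwords themselves.
import { randomBytes, scryptSync, timingSafeEqual } from "crypto";

function hashPassword(password: string): string {
  const salt = randomBytes(16).toString("hex");
  const hash = scryptSync(password, salt, 64).toString("hex");
  return `${salt}:${hash}`; // store this value
}

function verifyPassword(password: string, stored: string): boolean {
  const [salt, hash] = stored.split(":");
  const candidate = scryptSync(password, salt, 64);
  // Constant-time comparison avoids leaking information via timing.
  return timingSafeEqual(candidate, Buffer.from(hash, "hex"));
}
```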
Most cookies are limited-time credentials. For example, session identifiers expire after a couple of hours or are forgotten when the browser window closes. Even if the attacker gains access to the session cookie, they are guaranteed neither continued access to the account nor the ability to prevent the original account holder from logging in. Preventing long-term account compromise is one of the reasons users are asked for their old password before being allowed to enter a new one.
A cookie containing a username and password, if disclosed, is much longer lived. Also, many users share their passwords between websites. As others have pointed out, the cookie could easily be disclosed via Cross-Site Scripting.
Finally, is the cookie marked with the "Secure" flag? If it's not, an active network attacker can easily force the browser to disclose it, even if HTTPS is used to serve the entire site.
People here already mentioned the "man in the middle" attack. The thing is that even with HTTPS it is still possible. There are different ways to do this - some of them rely on physical access to the network, some of them do not.
The bottom line here is that even with HTTPS it is still possible for somebody to insert themselves between your app and the browser. Everything will be passed through and will look exactly the same from the browser, EXCEPT the server certificate. The intruder will have to send their own instead of the real one.
The browser will detect that there are problems with the certificate - usually it will either be issued to a different DNS name or, more likely, it will not be verified.
And here is the problem: how this violation is presented to the end user, and how the end user will react. In older versions of IE, the only indication of the problem was a small broken-lock icon on the right side of the status bar - something which many people would not even notice.
How much risk this introduces depends on the environment and who (and how trainable) the users are.
The two main vulnerabilities are cross-site scripting attacks and someone accessing the user's machine.
Have you thought about just storing a password hash in the cookie instead of the raw password? It would require some coding changes but not nearly as many as swapping out your entire authentication system.

Is there much of an anti-cookie movement anymore?

I'm not sure whether this belongs on StackOverflow or on ServerFault, so I've picked SO for as first go.
A number of years ago, there was a highly visible discussion about misuse of HTTP cookies, leading to various cookie-filtering proxies and eventually to active cookie filtering in browsers like Firefox and Opera. Even now, Google will admit that currently about 7% of end-users reject their tracking cookies, which is quite a lot, actually.
I still vet all cookies that get set in my browser. I have for years. I personally do not know anyone else who does this, but it has given me a few interesting insights into web tracking. For instance, there are many, many more sites using Google Analytics than there were even two years ago. And there are still sites (extremely few, fortunately) which malfunction hideously if you don't let them set cookies. But advertisers in particular are still setting cookies to track your way across the web.
So is there much of an anti-cookie movement anymore? Has anyone tried to take Google to task for setting so many with Analytics? Is anyone trying to vilify sites like eBay and PayPal, which use a dodgy cross-site cookie to let you log in?
Or am I making too much of a stupidly small problem?
Nowadays, there are other ways to block these annoyances. Rick752's EasyList has the EasyPrivacy list, which blocks most of them with no work at all other than adding the subscription once to Adblock Plus. NoScript can (with a little configuration, mostly removing some misguided entries on the default whitelist) easily block the ones which depend on JavaScript.
That said, I set up my browser to empty all the cookies on logout. Then they can track you only for the duration of a session, which will be short unless you tend to keep your browser open for a long time (or use the session save/restore all the time).
If you use Flash, know that it also has its own kind of cookies, and the interface to manage them is most probably poorer than your browser's.
There are always people who misunderstand cookies - on both sides. Ultimately, it's up to the browsers to properly identify the sites for cookies. As long as the site's cookies are being set properly and the browser is respecting that, it's just not much of a problem. I think that, with the increased use of web toolkits that take care of the programmatic details (and better, slightly more security-conscious browsers), it's not much of an issue now for end-users.
Beyond that, with the proliferation of DHTML and XML-based partial-page-loading mechanisms (as well as database backends and the like), the need to track session state between stateless pages is reduced now. Your web app can very easily keep state without the need for cookies, and that may well have been partially driven by the number of [generally misinformed] end-users who blocked cookies altogether.
In shorter words: "IMHO, no".
I gave up both as user and developer.
As a user the convenience of staying logged into sites is just too tempting, the pain of some sites not working too annoying. And I'm not that sensitive about my privacy, so I stopped caring and let all cookies through.
As a developer I always try to be as RESTful as possible, but I don't know any decent way of handling authentication without cookies. HTTP Basic Auth is just too broken, I can't assume HTTPS all the time, and mangling URLs is painful and inelegant. What's left is form-based authentication with cookies. So my applications have one auth cookie -- I don't need any more than that, but that by itself requires the user to have cookies on if they want to authenticate themselves. Maybe OpenID and other federated identity services might fix that one day, but at the moment I can't rely on any of these yet.
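For what it's worth, that single auth cookie amounts to something like this sketch (the in-memory session store and paths are illustrative; a real app would verify the password against a stored hash):

```typescript
// Form-based login that issues exactly one auth cookie: an opaque,
// random session identifier, with nothing sensitive stored client-side.
import * as http from "http";
import { randomBytes } from "crypto";

const sessions = new Map<string, string>(); // session id -> username

http.createServer((req, res) => {
  if (req.method === "POST" && req.url === "/login") {
    let body = "";
    req.on("data", (chunk) => (body += chunk));
    req.on("end", () => {
      const user = new URLSearchParams(body).get("username") ?? "";
      // ...password check against a server-side hash would go here...
      const id = randomBytes(32).toString("hex");
      sessions.set(id, user);
      res.setHeader(
        "Set-Cookie",
        `auth=${id}; HttpOnly; Secure; SameSite=Lax; Path=/`
      );
      res.end("logged in");
    });
  } else {
    res.end("ok");
  }
}).listen(8080);
```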
My biggest annoyance with cookies is that I want to block Analytics cookies but at the same time I need to login to analytics to manage some customer sites. As far as I can tell they are the same cookie (in fact it may be the same cookie across all google services).
I really don't trust the Google cookie. They were apparently one of the first large companies to set cookie expiration to 2038 (the maximum), and their business model is almost entirely advertising-based (targeted advertising at that). I suspect they know more about the day-to-day online activities and interests of people than any government or organisation on the planet.
That's not to say it's all evil or anything but that really is a lot of trust to be given one entity. They may claim it's all anonymised but I'm pretty sure that claim would be hard to verify. At any rate there is no guarantee that this data won't be stolen, legally acquired or otherwise misused at some future point for other purposes.
It isn't impossible that one day this kind of profiling could be used to target people for more serious things than ads. How hard would it be for some future Hitler to establish the IP addresses, bank accounts, schools, employers, club memberships etc. of some arbitrary class of person for incarceration or worse?
So my answer is that this is not a small problem and history has already taught us many times over what can happen when you start classifying and tracking people. Cookies are not the only means but they are certainly a part of the problem and I recommend blocking them and clearing at every convenient opportunity.
I am also one of the hold-outs who doesn't automatically accept cookies. I do appreciate sites that need fewer, and I am more likely to return to those sites and allow cookies from them in the future.
That said, I do think that being vigilant about cookies is not (rationally) worth the effort. (In other words, I expect I will keep doing what I'm doing because it makes me feel better, even though I don't have evidence of commensurate tangible benefit.)
Every now and again I clear all my cookies. It's a pain, as I then have to log in to sites again (or set preferences), but this is also a good test of whether either I or my browser can remember the login details.