Domain Level Cookies in an Akamai setup

Has anyone had a problem running domain-level cookies with an Akamai implementation?
The site issues a domain-level cookie containing two values that are used by other apps.
With Akamai in the mix, the cookie never gets generated. When I take Akamai out of the mix, everything works fine. Not sure if anyone else has seen this behavior. I am not clear on how Akamai handles cookies.

Akamai, by default, strips cookies from cached resources.
The logic (quite sensibly) is that cookies are designed to be specific to each browser/user, so caching them makes no sense.
My advice:
1. Check if the resource in question is being cached. You can use the Akamai browser plugins for this
2. Think carefully why you would want cookies in a cached resource
3. If you are sure you do want these cookies, contact Akamai. They can change this behaviour for you

As an alternative, you can still cache those pages: you'd need to define the cookies in an uncached URL, which is called from inside the cached pages, for example via a script tag.
That way you can do redirects, AJAX calls, or DOM manipulation from JS depending on cookies from within cached pages.
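A rough sketch of that idea (the endpoint name and cookie values here are made up): the uncached URL can be a tiny script that sets the cookies, and the cached page simply references it.

// Served from an uncached URL, e.g. /cookie-setter.js (hypothetical name).
// Assumes the page is served from a host under example.com.
document.cookie = 'appData=value1|value2; domain=.example.com; path=/';

// Inside the cached page, reference it with:
// <script src="/cookie-setter.js"></script>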

Related

Cookie “PHPSESSID” will be soon treated as cross-site cookie against <file> because the scheme does not match

I've just noticed my console is littered with this warning, appearing for every single linked resource. This includes all referenced CSS files, javascript files, SVG images, and even URLs from ajax calls (which respond in JSON). But not images.
The warning, for example in case of a style.css file, will say:
Cookie “PHPSESSID” will be soon treated as cross-site cookie against “http://localhost/style.css” because the scheme does not match.
But what doesn't the scheme match? The document's? Because it does match that.
The URL of my site is http://localhost/.
The site and its resources are all on http (no https on localhost)
The domain name is definitely not different because everything is referenced relative to the domain name (meaning the filepaths start with a slash href="/style.css")
The Network inspector just reports a green 200 OK response, showing everything as normal.
It's only Mozilla Firefox that is complaining about this. Chromium seems to not be concerned by anything. I don't have any browser add-ons. The warnings seem to originate from the browser, and each warning links to view the corresponding file source in Debugger.
Why is this appearing?
The exact same thing was happening to me. The issue was that Firefox kept showing me cookies from different websites hosted at the same URL ("localhost:<port number>") that were stored in the browser's memory.
In my case, I have two projects configured to run at http://localhost:62601. When I run the first project, it saves its cookie in the browser. When I then run the second project at the same URL, that cookie shows up in the second project's console as well.
What you can do is delete all of the cookies from the browser.
@Paramjot Singh's answer is correct and got me most of the way to where I needed to be. I also wasted a lot of time staring at those warnings.
But to clarify a little, you don't have to delete ALL of your cookies to resolve this. In Firefox, you can delete individual site cookies, which will keep your settings on other sites.
To do so, click the hamburger menu in the top right, then Options->Privacy & Security (or Settings->Privacy & Security).
From here, scroll down about half-way and find Cookies and Site Data. Don't click Clear Data. Instead, click Manage Data. Then search for the site you are having the notices on, highlight it, and click Remove Selected.
Simple, I know, but I made the mistake of clearing everything the first time; maybe this will prevent someone from doing the same.
The warning is given because, according to MDN web docs:
Standards related to the Cookie SameSite attribute recently changed such that:
The cookie-sending behaviour if SameSite is not specified is SameSite=Lax. Previously the default was that cookies were sent for all requests.
Cookies with SameSite=None must now also specify the Secure attribute (they require a secure context/HTTPS).
Which indicates that a secure context/HTTPS is required in order to allow cross site cookies by setting SameSite=None Secure for the cookie.
According to Mozilla, you should explicitly communicate the intended SameSite policy for your cookie (rather than relying on browsers to apply SameSite=Lax automatically), otherwise you might get a warning like this:
Cookie “myCookie” has “SameSite” policy set to “Lax” because it is missing a “SameSite” attribute, and “SameSite=Lax” is the default value for this attribute.
The suggestion to simply delete localhost cookies does not actually solve the problem. The solution is to properly set the SameSite attribute on the cookies being set by the server, and to use HTTPS if needed.
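As a minimal sketch of that fix (using Node.js here purely for illustration; the cookie name is made up, and the original question is about PHP's session cookie):

const http = require('http');

http.createServer((req, res) => {
  // State the SameSite policy explicitly instead of relying on the
  // browser's default of SameSite=Lax.
  res.setHeader('Set-Cookie', 'sessionId=abc123; Path=/; HttpOnly; SameSite=Lax');
  res.end('ok');
}).listen(8080);

// A cookie that genuinely needs to be cross-site must also be Secure
// and served over HTTPS:
// Set-Cookie: sessionId=abc123; Path=/; HttpOnly; SameSite=None; Secure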
Firefox is not the only browser making these changes. Apparently the version of Chrome I am using (84.0.4147.125) has already implemented the changes, as I got a similar message in its console.
The previously mentioned MDN article and this article by Mike Conca have great information about changes to SameSite cookie behavior.
I guess you are using WAMP or LAMP etc. The first thing you need to do is enable SSL on WAMP, as you will find many references saying you need to adjust the cookie settings to SameSite=None; Secure, which entails your local connection being secure. There are instructions at https://articlebin.michaelmilette.com/how-to-add-ssl-https-to-wampserver/ as well as some YouTube videos.
The important thing to note is that when creating the SSL certificate you should use SHA-256 as the signature algorithm, since SHA-1 is now deprecated and will throw another warning.
There is a good explanation of SameSite cookies on https://web.dev/samesite-cookies-explained/
I was struggling with the same issue and solved it by making sure the Apache 2.4 headers module was enabled and then adding one line of code:
Header always edit Set-Cookie ^(.*)$ $1;HttpOnly;Secure
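If you also want to state the SameSite policy explicitly, a similar (untested) edit should be able to append it too:
Header always edit Set-Cookie ^(.*)$ $1;HttpOnly;Secure;SameSite=None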
I wasted lots of time staring at the same sets of warnings in the Inspector until it dawned on me that the cookies were persisting and needed purging.
Apparently Chrome was going to enforce the new rules by now, but the rollout was postponed because of Covid-19, as a lot of websites might have been broken while people worked from home. The major browsers are working together on the SameSite attribute, so it will be in force soon.

Third party code on subdomain

As the owner of the domain example.com, which hosts a lot of content, what security risks arise from providing a subdomain to a third-party company? We don't want to share any of our content, and the third party would have complete control over the application and the machine hosting the subdomain site.
I'm concerned mainly about:
Shared cookies
We have cookies on .example.com, so they will also be sent in requests to the subdomain. Is it possible for us to point the A record to a reverse proxy where we strip the cookies and forward the request to the third-party provider without them?
Content loading from main domain
Is it possible to set document.domain to example.com and do an XMLHttpRequest to example.com?
Cross site scripting
I guess that this would be no problem because of the same-origin policy. Is a subdomain treated as a separate domain?
Any other security issues?
We have cookies on .example.com, so they will also be sent in requests to the subdomain. Is it possible for us to point the A record to a reverse proxy where we strip the cookies and forward the request to the third-party provider without them?
Great idea; yes, you could do this. However, you will also need to set the HttpOnly flag, otherwise the third party's JavaScript running on the subdomain would still be able to read the cookies.
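A minimal sketch of that proxy, assuming Node.js and a made-up upstream host (not production-ready, just the idea):

const http = require('http');

http.createServer((req, res) => {
  // Copy the incoming headers, but drop the .example.com cookies
  // and point the Host header at the third party.
  const headers = { ...req.headers, host: 'thirdparty.example.net' };
  delete headers.cookie;

  const upstream = http.request(
    { host: 'thirdparty.example.net', path: req.url, method: req.method, headers },
    (proxied) => {
      res.writeHead(proxied.statusCode, proxied.headers);
      proxied.pipe(res);
    }
  );
  req.pipe(upstream);
}).listen(8080);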
Is it possible to set document.domain to example.com and do an XMLHttpRequest to example.com?
No, subdomains for Ajax are treated as a different Origin. See this answer.
I guess that this would be no problem because of the same-origin policy. Is a subdomain treated as a separate domain?
JavaScript code on different subdomains can interact, but only with the cooperation of your site: both pages would need to set document.domain = 'example.com';. If you do not do this, you are secure against this threat.
See here:
When using document.domain to allow a subdomain to access its parent securely, you need to set document.domain to the same value in both the parent domain and the subdomain. This is necessary even if doing so is simply setting the parent domain back to its original value. Failure to do this may result in permission errors.
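In other words, both pages have to opt in (a sketch, with example.com standing in for your domain):

// Run this on both https://www.example.com/ and https://sub.example.com/
// before the two windows can script each other:
document.domain = 'example.com';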
Any other security issues?
You need to be aware of cookie poisoning. If evil.example.com sets a non host-only cookie at .example.com that your domain believes it has set itself, then the evil cookie may be used for your site.
For example, if you display the contents of the cookie as HTML, this may introduce XSS. Also, if you're using the double-submit cookie CSRF prevention method, an evil subdomain may be able to set its own cookie value to achieve CSRF. See this answer.
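As a concrete sketch of the poisoning risk (the cookie name here is made up):

// On evil.example.com: plant a parent-domain cookie that www.example.com
// cannot distinguish from one it set itself.
document.cookie = 'csrf_token=attacker_chosen_value; domain=.example.com; path=/';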

Basic issue with setting HTTP cookies

I'd like to set an HTTP cookie for my users, in order to not bother them with having to log in every time.
What I want to know is this: if I set the cookie at a page other than the homepage for my website, then will that cookie be available when the user comes to my homepage the next time?
More generally, is it the case that I can set the cookie at any page of my website, and the cookie will be available to me whenever I want?
Thanks!
Cookies can be configured to be available on specific subdomains, specific paths and specific protocols (HTTPS only, for instance). Without knowing which language you're using, it's hard to tell the default behavior of your Set-Cookie function, but most frameworks default to making the cookie available to the whole domain and to all paths (path=/).
So yes, if you set a cookie on a random page, it should be available to the home page, too.
Yes - once you set a cookie it will be accessible from the server as long as it is stored in the user's browser (hasn't expired or been deleted).
I found that if the cookie is being set via JavaScript, the path can be controlled via a simple parameter.
The example JS code (from here) sets a cookie that is available across the whole site:
$.cookie('the_cookie', 'the_value', {path: '/'});
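If you'd rather not depend on the jQuery cookie plugin, the plain-JavaScript equivalent is:

// Same effect without the plugin: the cookie is sent for every path on the site.
document.cookie = 'the_cookie=the_value; path=/';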

Cookies not working in IE7

I have two pages on two different domains example1.blogspot.com (a Blogspot blog) and example2.com (my own domain, static page). Both pages contain an iframe which loads the same document from a third domain, example.org. The iframe's document contains a small JS web app which calls example.org via AJAX, one of the calls is a POST request and the server sets a cookie with the response.
Upon reloading the pages, the cookie on example1.blogspot.com seems gone, i.e. jQuery's $.cookie() returns null. On example2.com, everything is fine. This happens only in IE7; IE6, Safari and Firefox all behave as expected. What's wrong with IE7?
Thanks, Simon
edit:
Oh well, stupid me ;-) It looks like I have a race condition between some event handlers and a window.setTimeout call when deciding whether to check for cookies...sorry!
So if $.cookie() returns null, what does document.cookie show? Also, have you taken a look at IE7's list of cookies to see if the cookie is actually there? And check that the PATH and DOMAIN settings on the cookie are correct.

Does every web request send the browser cookies?

Does every web request send the browser's cookies?
I'm not talking page views, but a request for an image, .js file, etc.
Update
If a web page has 50 elements, that is 50 requests. Why would it send the SAME cookie(s) for each request? Doesn't it cache them, or know that it already sent them?
Yes, as long as the URL requested is within the same domain and path defined in the cookie (and all of the other restrictions hold: Secure, not expired, etc.), the cookie will be sent with every request.
As others have said, if the cookie's host, path, etc. restrictions are met, it'll be sent, 50 times.
But you also asked why: because cookies are an HTTP feature, and HTTP is stateless. HTTP is designed to work without the server storing any state between requests.
In fact, the server doesn't have a solid way of recognizing which user is sending a given request; there could be a thousand users behind a single web proxy (and thus IP address). If the cookies were not sent every request, the server would have no way to know which user is requesting whatever resource.
Finally, the browser has no clue if the server needs the cookies or not, it just knows the server instructed it to send the cookie for any request to foo.com, so it does so. Sometimes images need them (e.g., dynamically-generated per-user), sometimes not, but the browser can't tell.
Yes. Every request sends the cookies that belong to the same domain. They're not cached, as HTTP is stateless, which means every request must carry enough information for the server to figure out what to do with it. Say you have images that are only accessible to certain users; you must send your auth cookie with every one of those 50 requests, so the server knows it's you, and not someone else or a guest, among the pool of requests it's getting.
Having said that, cookies might not be sent given other restrictions mentioned in the other responses, such as the HTTPS setting, path or domain. In particular, an important thing to note: cookies are not shared between domains. That helps with reducing the size of HTTP calls for static files, such as the images and scripts you mentioned.
Example: you have 4 cookies at www.stackoverflow.com; if you make a request to www.stackoverflow.com/images/logo.png, all those 4 cookies will be sent.
However, if you request stackoverflow.com/images/logo.png (notice the subdomain change) or images.stackoverflow.com/logo.png, those 4 cookies won't be present - but maybe those related to these domains will.
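If you want to see this for yourself, a tiny Node.js server (a sketch, with made-up values) will show the same Cookie header arriving with the page, the images and the scripts:

const http = require('http');

http.createServer((req, res) => {
  // Every matching request, whether for the page, an image or a script,
  // carries the same Cookie header.
  console.log(req.url, '->', req.headers.cookie);
  res.end('ok');
}).listen(8080);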
You can read more about cookies and images requesting, for example, at this StackOverflow Blog Post.
No, not every request sends the cookies. It depends on the cookie configuration and the client-server connection.
For example, if your cookie's Secure option is set to true, then it must be transmitted over a secure HTTPS connection. This means that when you visit that website over the plain HTTP protocol, browsers won't send these cookies, since the Secure flag is true.
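For example (with a made-up cookie name), a cookie set like this will only ever travel over HTTPS:

// Browsers will omit this cookie from plain http:// requests.
document.cookie = 'token=abc123; Secure; path=/';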
3 years have passed
There's another reason why a browser wouldn't send cookies. You can add a crossorigin attribute to your <script> tag and set its value to "anonymous". This will prevent cookies from being sent to the destination server. 99.9% of the time your JavaScript files are static, and you don't generate that JS code based on the request's cookies. If you have 1KB of cookies and 200 resources on your page, your user is uploading 200KB, which might take some time on 3G and have zero effect on the resulting page. Visit HTML attribute: crossorigin for reference.
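For example, injecting such a script tag from JavaScript (the CDN URL here is made up):

// The browser fetches this script without credentials, i.e. without cookies.
const s = document.createElement('script');
s.src = 'https://cdn.example.net/lib.js';
s.crossOrigin = 'anonymous';
document.head.appendChild(s);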
Cookies have a "path" property. If "path=/", the answer is yes.
I know this is an old thread, but I've just noticed that most browsers won't send cookies for a domain if you add a trailing dot. For example, http://example.com. won't receive cookies set for .example.com. Apache, on the other hand, treats them as the same host. I find this useful to make cross-domain tracking more difficult for external resources I include, but you could also use it for performance reasons. Note this breaks validation of HTTPS certificates. I've run a few tests using Browsershots and my own devices. The hack works on almost all browsers except Safari (mobile and desktop), which will include cookies in the request.
Short answer is yes. The lines below are from the MDN documentation:
Cookies were once used for general client-side storage. While this was legitimate when they were the only way to store data on the client, it is now recommended to use modern storage APIs. Cookies are sent with every request, so they can worsen performance (especially for mobile data connections).