Safari 5 not sending all cookies back to the server - cookies

I'm seeing this issue with Safari 5 (actually it has been prevalent since at least 4.0.5), where it seems that it does not set cookies properly. The site works perfectly fine for Firefox, IE, and Chrome.
However, I can verify that the cookie is being properly set by looking at Safari's cookies plist that it uses. On Windows 7 it's stored here: C:\Users\\AppData\Roaming\Apple Computer\Safari\Cookies\Cookies.plist
Now the site that I am working on uses cookies quite extensively, and they are stored across different subdomains.
Let's say for example, the site is www.foo.com.
The login cookie is set to domain .foo.com.
In addition, other cookies are stored to domain www.foo.com, a.foo.com, b.foo.com, etc.
Safari seems to get confused when going to www.foo.com and "forgets" to send all of the cookies that it should (www.foo.com, .foo.com).
I am not sure if this is an error on my side (invalid characters in the cookie) or if this is a bug with Safari. I'd like to know if anybody has come across this and has found a solution.
Thanks!
Edit: The one cookie that I am concerned about is quite big, a little less than 4k in length. It contains encoded information and the characters that it consists of are alphanumerics and / + _ =
Now what's strange is that, depending on the domain the cookie gets set to, the cookie works fine by itself. However, when the user logs in, another cookie gets added and Safari seems to ignore the first one. Safari sends the login cookie fine and it consists of alphanumerics along with [ ] _ \
My original thought was that it could be a domain conflict. The first cookie was being set to www.foo.com while the login cookie was being set to .foo.com. However, I tested this theory, setting them both to .foo.com, and the issue still remains.

Have a look at the browser size limits here. Safari 5 has a 4 KB limit on the total cookie header per request. So if the combined size of all cookies for the domain (including associated wildcard domains) exceeds 4 KB, you'll begin to lose cookies.
I believe the logic is that the oldest cookie is sacrificed first, until the total cookie header size is below 4 KB.
We're currently having an issue with this problem ourselves, but have also identified another scenario in which cookies are lost, though we've yet to understand why this is the case.
Given the large size of a single cookie of yours, it's likely that a following request adds a new cookie, bumping the total size over the limit.
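If you want a quick way to check how close a page is to that limit, a minimal client-side sketch like the one below can help. It assumes it runs in the browser on the affected domain; note that document.cookie only approximates the Cookie request header, since HttpOnly cookies are not visible to script.

// Rough check of how large the Cookie request header for this page is.
// Note: HttpOnly cookies are not included in document.cookie.
function cookieHeaderSize() {
  // document.cookie returns "name1=value1; name2=value2; ...",
  // which is roughly what the browser sends in the Cookie header.
  return document.cookie.length;
}

var size = cookieHeaderSize();
if (size > 4093) {
  console.warn("Cookie header is " + size + " bytes; Safari may start dropping cookies.");
} else {
  console.log("Cookie header is " + size + " bytes, under the ~4 KB limit.");
}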

Related

Using JWT_EXTENDED python with flask and having issues when other cookies join the party

So when I only have the JWT_EXTENDED cookie set my axios/flask/gunicorn app works swimmingly....
so xis.test.com is great.... << domain of the cookie
The problem starts when other cookies get set under ".test.com"; then ajax calls with a longer return time or more data returned get a BAD DATA LENGTH 200 issue which I've seen here on SO... so I'm just not sure what is going on.
Sorry for the limited data, but there is nothing inherently different between the calls that work and those that don't, other than the fact that there are about 18 extra cookies set by a DUO login in the browser.
If I switch to an incognito window there is no issue with the app at all.
Weird..

Cookie “PHPSESSID” will be soon treated as cross-site cookie against <file> because the scheme does not match

I've just noticed my console is littered with this warning, appearing for every single linked resource. This includes all referenced CSS files, javascript files, SVG images, and even URLs from ajax calls (which respond in JSON). But not images.
The warning, for example in case of a style.css file, will say:
Cookie “PHPSESSID” will be soon treated as cross-site cookie against “http://localhost/style.css” because the scheme does not match.
But, the scheme doesn't match what? The document? Because that does match.
The URL of my site is http://localhost/.
The site and its resources are all on http (no https on localhost)
The domain name is definitely not different because everything is referenced relative to the domain name (meaning the filepaths start with a slash href="/style.css")
The Network inspector just reports a green 200 OK response, showing everything as normal.
It's only Mozilla Firefox that is complaining about this. Chromium seems to not be concerned by anything. I don't have any browser add-ons. The warnings seem to originate from the browser, and each warning links to view the corresponding file source in Debugger.
Why is this appearing?
That was exactly what was happening to me. The issue was that Firefox keeps showing cookies of different websites hosted on the same URL ("localhost:port number") stored in browser memory.
In my case, I have two projects configured to run at http://localhost:62601. When I run the first project, it saves its cookie in browser memory. When I run the second project with the same URL, the cookie is available inside that project's console as well.
What you can do is delete all of the cookies from the browser.
@Paramjot Singh's answer is correct and got me most of the way to where I needed to be. I also wasted a lot of time staring at those warnings.
But to clarify a little, you don't have to delete ALL of your cookies to resolve this. In Firefox, you can delete individual site cookies, which will keep your settings on other sites.
To do so, click the hamburger menu in the top right, then Options->Privacy & Security or Settings->Privacy & Security.
From here, scroll down about half-way and find Cookies and Site Data. Don't click Clear Data. Instead, click Manage Data. Then search for the site you are having the notices on, highlight it, and click Remove Selected.
Simple, I know, but I made the mistake of clearing everything the first time - maybe this will prevent someone from doing the same.
The warning is given because, according to MDN web docs:
Standards related to the Cookie SameSite attribute recently changed such that:
The cookie-sending behaviour if SameSite is not specified is SameSite=Lax. Previously the default was that cookies were sent for all requests.
Cookies with SameSite=None must now also specify the Secure attribute (they require a secure context/HTTPS).
Which indicates that a secure context/HTTPS is required in order to allow cross site cookies by setting SameSite=None Secure for the cookie.
According to Mozilla, you should explicitly communicate the intended SameSite policy for your cookie (rather than relying on browsers to apply SameSite=Lax automatically), otherwise you might get a warning like this:
Cookie “myCookie” has “SameSite” policy set to “Lax” because it is missing a “SameSite” attribute, and “SameSite=Lax” is the default value for this attribute.
The suggestion to simply delete localhost cookies is not actually solving the problem. The solution is to properly set the SameSite attribute of cookies being set by the server and use HTTPS if needed.
Firefox is not the only browser making these changes. Apparently the version of Chrome I am using (84.0.4147.125) has already implemented them, as I got a similar warning in the console.
The previously mentioned MDN article and this article by Mike Conca have great information about changes to SameSite cookie behavior.
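If it helps to see the shape of the fix, here is a minimal sketch using a plain Node.js HTTP server (illustrative only; the cookie names and the server are placeholders, not the asker's PHP setup) that sets cookies with an explicit SameSite attribute:

// Illustrative Node.js server showing explicit SameSite attributes on Set-Cookie.
// "sessionid" and "tracking" are placeholder cookie names.
const http = require('http');

http.createServer(function (req, res) {
  res.setHeader('Set-Cookie', [
    // Same-site usage: state the Lax default explicitly to silence the warning.
    'sessionid=abc123; Path=/; HttpOnly; SameSite=Lax',
    // Cross-site usage: SameSite=None must be paired with Secure (HTTPS only).
    'tracking=xyz789; Path=/; Secure; SameSite=None'
  ]);
  res.end('cookies set with explicit SameSite attributes\n');
}).listen(8080);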
Guess you are using WAMP or LAMP etc. The first thing you need to do is enable SSL on WAMP, as you will find many references saying you need to adjust the cookie settings to SameSite=None; Secure. That entails your local connection being secure. There are instructions on this link https://articlebin.michaelmilette.com/how-to-add-ssl-https-to-wampserver/ as well as some YouTube vids.
The important thing to note is that when creating the SSL certificate you should use sha256 as the signature algorithm, as sha1 is now deprecated and will throw another warning.
There is a good explanation of SameSite cookies on https://web.dev/samesite-cookies-explained/
I was struggling with the same issue and solved it by making sure the Apache 2.4 headers module was enabled and then adding one line of code:
Header always edit Set-Cookie ^(.*)$ $1;HttpOnly;Secure
I wasted lots of time staring at the same sets of warnings in the Inspector until it dawned on me that the cookies were persisting and needed purging.
Apparently Chrome was going to introduce the new rules by now, but Covid-19 meant a lot of websites might have been broken while people worked from home. The major browsers are working together on the SameSite attribute, so it will be in force soon.

Cookie write fails to work on hosted site

I have created a basic but extensive javascript-html page that depends on cookies to keep user information. It runs perfectly on my computer (Mac, Firefox), but when loaded onto my hosted web site (the page is in my domain), the cookies are not being written when the page is opened.
I was hoping that by keeping all the programming in javascript I could get some basic interactivity. Is this assumption wrong? Must the cookies be written using PHP?
My cookie writes are very vanilla.
document.cookie = cookieArray[ja]+expires+"; path=/"; // writes cookie data into browser.
update
Well, cookies are now being written since I added "path=/; domain=.my.org". But now there is one other problem.
It seems that Safari and Firefox write the cookies in reverse order to each other. I create the cookies by altering an array and then simply stepping through the array to write the cookies. I was hoping that I could simply read the cookies one by one and keep the order. Ah well.
Did you add the ";" between cookieArray[ja] and expires?
document.cookie = 'cookie-name=cookie-value; expires=Thu, 01-Jan-70 00:00:01 GMT;';
Also, cookieArray[ja] has to contain the cookie-name.
Do you really need the path? This parameter is optional.
Cookies are, by default, available to all other files in the same directory the cookie was created in.
http://www.comptechdoc.org/independent/web/cgi/javamanual/javacookie.html
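For completeness, a small helper along these lines (setCookie is a hypothetical name, not taken from the question's code) puts the separators in the places this answer describes:

// Hypothetical helper: builds "name=value; expires=...; path=/" with the
// required "; " separators and URL-encodes the value.
function setCookie(name, value, days) {
  var expires = "";
  if (days) {
    var date = new Date();
    date.setTime(date.getTime() + days * 24 * 60 * 60 * 1000);
    expires = "; expires=" + date.toUTCString();
  }
  document.cookie = name + "=" + encodeURIComponent(value) + expires + "; path=/";
}

// Usage, roughly equivalent to one iteration of the asker's loop over cookieArray:
setCookie("user_pref", "dark-mode", 30);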

What is the maximum size of a cookie, and how many can be stored in a browser for each web site?

I am learning about cookies, and I wonder about browser support when writing web applications that rely on cookies to store state.
For each domain/web site, how many cookies may be sent to a browser, and of what size?
If multiple cookies are sent and stored, does that affect performance?
No more than 50 cookies per domain, with a maximum of 4 KB per cookie (or even 4 KB in total, see Iain's answer). On IE 6 it used to be 20 cookies per domain.
Generally it's recommended to preserve state on the server, and use cookies only for session tracking. They're sent along with every request, so they form an unnecessary overhead if the purpose is to keep session state around.
If you do want to keep state on the client, and you can use JavaScript to do it, there are options. Use the assorted storage API's directly or find a wrapper library that abstracts away the details.
Client-side storage options:
localStorage: Firefox 2+, Chrome 4+, Safari 4+, Internet Explorer 8+. 5 MB per domain without user confirmation (but be aware that it is stored as UTF-16 so you may use two bytes per character).
IndexedDB: Firefox 4+, Chrome 11+, Safari 10+, Internet Explorer 10+. 5 MB per domain without user confirmation, much more after confirmation (highly browser specific, check your browser for details).
Deprecated storage options:
Flash 8 persistent storage: any browser with Flash 8+. 100 KB, more with user permission. Deprecated because Flash itself is deprecated.
userData: Internet Explorer 5.5+. 64 KB per domain in the restricted zone, 128 KB per domain in the internet zone. Replaced by localStorage.
Web SQL: Chrome & Safari only, it will never make it to other browsers because it was not possible to standardize it.
So, generally for client-side storage it depends on the use case:
For session id tracking or for a few KB, use cookies.
Up to 2 MB, localStorage delivers a solution across all common browsers (see the sketch after this list).
2 MB and up, use IndexedDB (look for a good wrapper library).
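As referenced in the list above, here is a minimal localStorage sketch (key names are illustrative) showing how bulky state can stay on the client while only a small id travels in a cookie:

// localStorage stores strings, so serialize objects to JSON first.
var prefs = { theme: "dark", lastTab: "settings" };
localStorage.setItem("appPrefs", JSON.stringify(prefs));

// Later, even after a page reload, read it back:
var saved = JSON.parse(localStorage.getItem("appPrefs") || "{}");
console.log(saved.theme); // "dark"

// The session id itself still travels in a small cookie:
document.cookie = "sid=abc123; path=/";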
Cookie Size Limits
If you want to support most browsers, then do not exceed 50 cookies per domain, and 4093 bytes per domain. That is, the size of all cookies should not exceed 4093 bytes.
Performance Thoughts
Cookies are sent on every request for a domain, and this includes images. For argument's sake, let's say you have 30 resources on your website and 4093 bytes of cookies. That means the user is uploading roughly 120 KB of data in cookie headers alone. So on a 1 Mbit upload connection, that will take about 1 second.
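If you want to verify that arithmetic, here is a quick sketch with the same assumed numbers:

// Back-of-the-envelope check of the figures above (assumed, not measured).
var cookieBytes = 4093;                       // cookie header per request
var requests = 30;                            // resources on the page
var totalBytes = cookieBytes * requests;      // 122,790 bytes, roughly 120 KB
var uploadBitsPerSecond = 1000000;            // 1 Mbit/s upload link
var seconds = (totalBytes * 8) / uploadBitsPerSecond;
console.log(totalBytes + " bytes, ~" + seconds.toFixed(2) + " s"); // ~0.98 s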
If you want to see the cookie test page I created, or read more about it, check out Browser Cookie Limits.
Firstly, I suggest you don't worry about this issue. There is AMPLE room to serialize tons of identifiers into a cookie.
Secondly, cookies are not stored per web server but per web domain, e.g., www.google.com, and not the hundreds of different physical servers that serve the Google domain.
Thirdly, if you do have to worry, know that there are two cookie headers involved (Set-Cookie from the server and Cookie from the browser), and the sizes allowed for these headers are determined by the browser software's limits.
Design Discussion
What you don't want to use the cookie header for is sending details about a client's session. E.g., don't try to stuff the email a client is typing into a cookie if you are building an email front-end. Instead, you would send the client a cookie that represents his identity+session, and you store all session data against this identity. You can store tens of identifiers (4–16 bytes each) per cookie header, and no one needs more than, say, 4 of these. The cookie data (such as an integer id) tends to be encoded to base64, which increases the byte count.
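A sketch of that identifier-only pattern (the id generation here is illustrative and not cryptographically strong; a real session id should come from a secure random source):

// Only a short identifier goes into the cookie; the actual session data
// (the draft email, etc.) stays on the server, keyed by this id.
function newSessionId() {
  var id = "";
  for (var i = 0; i < 16; i++) {            // 16 hex chars ≈ 8 bytes
    id += Math.floor(Math.random() * 16).toString(16);
  }
  return id;
}

document.cookie = "sid=" + newSessionId() + "; path=/; SameSite=Lax";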
Performance
Your browser sends a plethora of headers to a web server. The cookie is just another 100-1000 bytes (mostly closer to 100). At both extremes it takes only a fraction of a second to send these to the web server, when placed into context of course. You should keep in mind that the web is built on text-based protocols.
If you are concerned about performance decreases due to large cookies being sent on each server request, a good idea might be to place all your static files (images, CSS, etc.) into a subdomain of your site, like http://static.yourdomain.com.
In this way, whenever your site on www.yourdomain.com asks for a static file, like an image, the browser won't send the cookie along with the HTTP request anymore.
Source: http://developer.yahoo.com/performance/rules.html#cookie_free
Different browsers have different size limits on cookies. Here is the information for IE. Here is a page that lists several browsers.
Cookies are not saved on a server basis but on a domain basis (a server may host many domains or the opposite a server farm may be serving a single domain).
In general, I would avoid saving lots of information in cookies, as the data gets sent to and from the browser on every request. As you suggest in your question, this can have an effect on performance.
Usually one stores small amounts of data in the cookie, mostly used to identify the user/session so more data can be picked up from a database or another resource local to the web server.
If you're programming a web site, it's a good idea not to store too much in a cookie, because that cookie gets sent to the server every time the user requests a page from your site. A far better solution is to just store a unique id in the cookie, and let the server pull up the required information from a database or file store based on that unique id. Unfortunately that solution leads to people worrying about what you're tracking about them, so you might want to have a "cookie policy" expressed somewhere on your site, explaining why you're placing a cookie on their browser and what you do and don't track about them.
4096 bytes
The real problem, however, comes when you try and set cookies with a large size. The standards state that a browser must support a minimum of 4096 bytes per cookie. IE6 doesn't do this. Instead, it seems to have a maximum size of 4096 bytes for all cookies from a domain.
A CDN Comes to the Rescue.
You can offload your static content to a CDN or a file storage service like Amazon S3. Keeping the static file requests cookie-free should be easy as long as you haven't set up a CNAME record on a subdomain that receives cookies from your top-level domain.
This blog post is a good read on serving static content from a cookieless domain and how to adopt this best practice to boost client-side performance.
Here's a really good site on cookie limits and lets you test your browser:
http://browsercookielimits.iain.guru/

Why does Safari let you set cookies over 4kB?

I set up an A/B test which required a fairly large amount of data to be stored in a cookie temporarily. While testing my code, I managed to get the cookie over 4kB. Safari set the cookie. On the subsequent page load, Apache returned an error since the cookie was too large.
I tested this on Firefox as well and it simply ignores the cookie, which seems to be the correct behavior to me.
I've seen this happen before first-hand on GMail. I used to get Bad Request errors and would have to delete my cookies. It was a known issue that's been resolved.
I can find nothing online about Safari allowing cookies over 4kB. Isn't this potentially dangerous? The idea that our users could be blocked from accessing our site and have no idea what's going on is scary. I don't know off the top of my head how it'd be possible to delete those cookies from our side if they got too large.
Why does Safari do this? Do any other browsers?
http://www.nczonline.net/blog/2008/05/17/browser-cookie-restrictions/ says that Firefox and Safari allow cookies up to 4097 characters, IE 4095, and Opera 4096.
There is something here about fixing the issue when the error happens: basically, the error document clears the offending cookie so subsequent requests will (hopefully) work. http://www.webmasterworld.com/forum92/1163.htm
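A rough client-side version of that clean-up idea (a sketch only; HttpOnly cookies are not visible to script, and the 4000-byte threshold is an assumption) might look like this:

// On the error page, expire any visible cookie whose value looks oversized so
// the next request fits under the server's header limit again.
document.cookie.split("; ").forEach(function (pair) {
  var name = pair.split("=")[0];
  var value = pair.slice(name.length + 1);
  if (value.length > 4000) {
    // An expiry date in the past deletes the cookie for this path.
    document.cookie = name + "=; expires=Thu, 01 Jan 1970 00:00:00 GMT; path=/";
  }
});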
The standard specifies a minimum cookie size that browsers must support (4 KB), but it does not specify a maximum. A browser is free to store cookies larger than that.
As a web developer, you try to only create cookies that work in all browsers. It's not up to Safari to hold your hand on this point; it is simply dealing with the condition of a large cookie by accepting it, where others reject it. This is neither correct nor incorrect. It is simply allowed.
I don't follow your point about it being potentially dangerous. If a user is blocked from your site because a cookie that you are setting doesn't work in some browsers, that is your fault, isn't it? Safari is just dealing with it where other browsers don't.