I have a very strange problem: the CSRF cookie is not set in some clients' browsers. What could it potentially be?
All the necessary middleware is enabled and, as I said above, the problem appears on only a very small number of machines, although other Django-powered sites work fine on them.
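For reference, by "all needed middleware" I mean the standard cookie-based setup, roughly like this (a minimal sketch; the view name is made up):

    # settings.py -- the standard cookie-based CSRF setup
    MIDDLEWARE = [
        # ...
        "django.middleware.csrf.CsrfViewMiddleware",
        # ...
    ]

    # views.py -- force Django to send the csrftoken cookie even when the
    # template never renders a {% csrf_token %} tag (view name is made up)
    from django.http import JsonResponse
    from django.views.decorators.csrf import ensure_csrf_cookie

    @ensure_csrf_cookie
    def landing(request):
        return JsonResponse({"ok": True})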
The problem couldn't be solved in the usual way, so I gave up on cookie-based CSRF protection and switched to session-based CSRF instead: https://github.com/mozilla/django-session-csrf.
I've just noticed my console is littered with this warning, appearing for every single linked resource. This includes all referenced CSS files, JavaScript files, SVG images, and even URLs from AJAX calls (which respond with JSON). But not images.
The warning, for example in case of a style.css file, will say:
Cookie “PHPSESSID” will be soon treated as cross-site cookie against “http://localhost/style.css” because the scheme does not match.
But the scheme doesn't match what? The document's scheme? Because that does match.
The URL of my site is http://localhost/.
The site and its resources are all on http (no https on localhost)
The domain name is definitely not different because everything is referenced relative to the domain name (meaning the filepaths start with a slash href="/style.css")
The Network inspector just reports a green 200 OK response, showing everything as normal.
It's only Mozilla Firefox that is complaining about this. Chromium seems to not be concerned by anything. I don't have any browser add-ons. The warnings seem to originate from the browser, and each warning links to view the corresponding file source in Debugger.
Why is this appearing?
That was exactly what was happening to me. The issue was that Firefox kept showing me cookies from different websites hosted on the same URL ("localhost:<port number>") stored in the browser's memory.
In my case, I have two projects configured to run at http://localhost:62601. When I run the first project, it saves its cookie in the browser's memory. When I then run the second project at the same URL, the first project's cookie shows up in that project's console as well.
What you can do is delete all of the cookies for that site from the browser.
@Paramjot Singh's answer is correct and got me most of the way to where I needed to be. I, too, wasted a lot of time staring at those warnings.
But to clarify a little, you don't have to delete ALL of your cookies to resolve this. In Firefox, you can delete individual site cookies, which will keep your settings on other sites.
To do so, click the hamburger menu in the top right, then Options->Privacy & Security (or Settings->Privacy & Security).
From there, scroll down about half-way and find Cookies and Site Data. Don't click Clear Data; instead, click Manage Data. Then search for the site showing the notices, highlight it, and click Remove Selected.
Simple, I know, but I made the mistake of clearing everything the first time. Maybe this will prevent someone else from doing the same.
The warning is given because, according to MDN web docs:
Standards related to the Cookie SameSite attribute recently changed such that:
The cookie-sending behaviour if SameSite is not specified is SameSite=Lax. Previously the default was that cookies were sent for all requests.
Cookies with SameSite=None must now also specify the Secure attribute (they require a secure context/HTTPS).
This indicates that a secure context (HTTPS) is required in order to allow cross-site cookies, by setting SameSite=None; Secure on the cookie.
According to Mozilla, you should explicitly communicate the intended SameSite policy for your cookie (rather than relying on browsers to apply SameSite=Lax automatically), otherwise you might get a warning like this:
Cookie “myCookie” has “SameSite” policy set to “Lax” because it is missing a “SameSite” attribute, and “SameSite=Lax” is the default value for this attribute.
The suggestion to simply delete localhost cookies does not actually solve the problem. The real solution is to set the SameSite attribute properly on the cookies the server sets, and to use HTTPS if needed.
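For example, in Django terms (a sketch; the view name is hypothetical, and PHP exposes the same knobs via the session.cookie_samesite and session.cookie_secure ini settings), an explicitly attributed cookie looks like this:

    # Sketch: send a cookie with an explicit SameSite policy.
    # samesite="None" requires secure=True (and therefore HTTPS);
    # for a plain-HTTP localhost setup, "Lax" is usually what you want.
    from django.http import HttpResponse

    def set_session_cookie(request):  # hypothetical view
        response = HttpResponse("ok")
        response.set_cookie(
            "sessionid",
            "abc123",
            samesite="Lax",  # or "None" together with secure=True
        )
        return response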
Firefox is not the only browser making these changes. Apparently the version of Chrome I am using (84.0.4147.125) has already implemented the changes, as I got a similar message in the console.
The previously mentioned MDN article and this article by Mike Conca have great information about changes to SameSite cookie behavior.
I guess you are using WAMP or LAMP, etc. The first thing you need to do is enable SSL on WAMP: as many references point out, you need to adjust the cookie settings to SameSite=None; Secure, which requires your local connection to be secure. There are instructions at https://articlebin.michaelmilette.com/how-to-add-ssl-https-to-wampserver/ as well as in some YouTube videos.
The important thing to note is that when creating the SSL certificate you should use SHA-256 signing, as SHA-1 is now deprecated and will throw yet another warning.
There is a good explanation of SameSite cookies on https://web.dev/samesite-cookies-explained/
I was struggling with the same issue and solved it by making sure the Apache 2.4 headers module (mod_headers) was enabled and then adding one line of configuration:
Header always edit Set-Cookie ^(.*)$ $1;HttpOnly;Secure
I wasted lots of time staring at the same sets of warnings in the Inspector until it dawned on me that the cookies were persisting and needed purging.
Apparently Chrome was going to enforce the new rules by now, but COVID-19 meant a lot of websites might have broken while people worked from home, so enforcement was postponed. The major browsers are coordinating on the SameSite changes, so they will be in force soon.
I am building a simple web app using React.js for the frontend and Django for the server side, deployed as two separate Heroku apps: frontend.herokuapp.com and backend.herokuapp.com.
When I attempt to make calls to my API through the react app the cookie that was received from the API is not sent with the requests.
I had expected that I would be able to support this configuration without having to do anything special since all server-side requests would (I thought) be made by the JS client app directly to the backend process with their authentication cookies attached.
In an attempt to find a solution that I thought would work I attempted to set
SESSION_COOKIE_DOMAIN = "herokuapp.com"
While this is less than ideal (herokuapp.com is a vast domain), it would seem quite safe in production, since the apps would then live on api.myapp.com and www.myapp.com.
However, with this value set in settings.py I get an AuthStateMissing error when hitting my /oauth/complete/linkedin-oauth2/ endpoint.
Searching Google for AuthStateMissing SESSION_COOKIE_DOMAIN yields one solitary result, which implies that the issue was reported as a bug in Django social auth and has since been closed without further commentary.
Any light anyone could throw would be very much appreciated.
I ran into the exact same problem while using herokuapp.com.
I even posted a question on SO here.
According to Heroku documentation:
In other words, in browsers that support the functionality, applications in the herokuapp.com domain are prevented from setting cookies for *.herokuapp.com
Heroku blocks cookies from frontend.herokuapp.com and backend.herokuapp.com
You need to add a custom domain to frontend.herokuapp.com and backend.herokuapp.com
The full answer is here: https://stackoverflow.com/a/54513216/1501643
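For completeness, once both apps sit behind a custom domain, the Django side might look roughly like this (a sketch using the www.myapp.com / api.myapp.com names from the question; the CORS settings assume the django-cors-headers package):

    # settings.py sketch, assuming www.myapp.com (React) and
    # api.myapp.com (Django) as in the question
    SESSION_COOKIE_DOMAIN = ".myapp.com"  # share the cookie across subdomains
    CSRF_COOKIE_DOMAIN = ".myapp.com"
    SESSION_COOKIE_SECURE = True          # custom domains should serve HTTPS

    # The cross-origin XHR from www to api still needs CORS with
    # credentials, e.g. via django-cors-headers:
    CORS_ALLOWED_ORIGINS = ["https://www.myapp.com"]
    CORS_ALLOW_CREDENTIALS = True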
I have a public Django site which uses CSRF protection.
I have not set the CSRF_COOKIE_DOMAIN. My site uses subdomains.
Sometimes, a user ends up having a csrftoken cookie set on .toplevel.com as well as on sub.toplevel.com. This causes problems, as CSRF checking fails if the wrong cookie is used in the check.
I would like to set a CSRF_COOKIE_DOMAIN to .toplevel.com. However, I would also like to delete any csrftoken cookies for any *.toplevel.com subdomains. How would I do this?
If I do not delete the other cookies, I will just end up in the original situation of having two cookies with the same name on different domains, which causes issues.
I had a similar problem. The way I dealt with it was, together with setting CSRF_COOKIE_DOMAIN, I also changed CSRF_COOKIE_NAME, making the old "csrftoken" cookies obsolete.
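In settings.py that comes down to something like this (the new cookie name is illustrative):

    # Scope the CSRF cookie to the parent domain and rename it so any
    # stale per-subdomain "csrftoken" cookies are simply ignored.
    CSRF_COOKIE_DOMAIN = ".toplevel.com"
    CSRF_COOKIE_NAME = "csrftoken_v2"  # illustrative; any new name works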
I've got two different Django projects, where one sits on domain A and has a bunch of functionality (REST among other things). Site B is simple, and I want to POST AJAX forms to site A but keep CSRF security. Is that possible?
By the way, the sites can share a database if necessary.
I've had a similar problem and managed to solve it in the following way:
issue GET request from site B to site A to fetch a form (with csrf field)
POST the form back to site A.
The main problem for me was to get cross-site ajax requests to work. To achieve that I've had to configure CORS correctly on the server-side (I've slightly edited this middleware: https://gist.github.com/strogonoff/1369619) and set xmlHttp.withCredentials = true (where xmlHttp is my XMLHttpRequest object) in the ajax POST function.
I've tested this solution on two different ports of the same IP address, but I think it should also work cross-domain.
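For illustration, the server-side half could be a Django middleware along these lines (a rough sketch, not the exact gist linked above; the site-B origin is a placeholder):

    # Sketch of a CORS middleware that permits credentialed requests
    # from site B (hypothetical origin; the linked gist differs in details).
    class CorsCredentialsMiddleware:
        def __init__(self, get_response):
            self.get_response = get_response

        def __call__(self, request):
            response = self.get_response(request)
            # With credentials, browsers reject a wildcard "*" origin,
            # so site B's origin must be echoed explicitly.
            response["Access-Control-Allow-Origin"] = "https://site-b.example.com"
            response["Access-Control-Allow-Credentials"] = "true"
            response["Access-Control-Allow-Headers"] = "X-CSRFToken, Content-Type"
            return response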
I set up an A/B test which required a fairly large amount of data to be stored in a cookie temporarily. While testing my code, I managed to get the cookie over 4kB. Safari set the cookie. On the subsequent page load, Apache returned an error since the cookie was too large.
I tested this on Firefox as well and it simply ignores the cookie, which seems to be the correct behavior to me.
I've seen this happen before first-hand on GMail. I used to get Bad Request errors and would have to delete my cookies. It was a known issue that's been resolved.
I can find nothing online about Safari allowing cookies over 4kB. Isn't this potentially dangerous? The idea that our users could be blocked from accessing our site and have no idea what's going on is scary. I don't know off the top of my head how it'd be possible to delete those cookies from our side if they got too large.
Why does Safari do this? Do any other browsers?
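One defensive option on the server side (a sketch in Django terms, since browser limits vary and there is no portable way to query them) is to refuse to set any cookie that approaches the common 4 KB limit:

    # Guard against oversized cookies before setting them.
    # The 4000-byte budget is an assumption, leaving headroom below the
    # common 4096-byte per-cookie limit for attributes like Path/Expires.
    MAX_COOKIE_BYTES = 4000

    def set_cookie_safely(response, name, value):
        if len(name) + len(value) > MAX_COOKIE_BYTES:
            raise ValueError("cookie too large; store the data server-side instead")
        response.set_cookie(name, value)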
According to http://www.nczonline.net/blog/2008/05/17/browser-cookie-restrictions/, Firefox and Safari allow cookies up to 4097 characters, IE up to 4095, and Opera up to 4096.
There is something here about fixing the issue when the error happens: basically, the error document clears the offending cookie so that subsequent requests will (hopefully) work: http://www.webmasterworld.com/forum92/1163.htm
The standard specifies a minimum cookie size that browsers must support (at least 4096 bytes per cookie), but it does not specify a maximum. A browser is free to store cookies larger than that.
As a web developer, you should try to create only cookies that work in all browsers. It's not up to Safari to hold your hand on this point; it simply deals with a large cookie by accepting it, where others reject it. This is neither correct nor incorrect; it is simply allowed.
I don't follow your point about it being potentially dangerous. If a user is blocked from your site because a cookie you set doesn't work in some browsers, that is your fault, isn't it? Safari is just dealing with a situation that other browsers don't.