Safari doesn't save cookies on custom HTTPS port

I have a web application that I'm trying to access over HTTPS on port 444.
In Safari, my initial request logs me in successfully (I can see the authentication success message in custom HTTP response headers).
But the site does not display correctly, because the request for the CSS file associated with the page fails with Access Denied.
Inspecting the headers, I can see that a session is assigned to the first request (for the page, via the Set-Cookie response header), and a new session is assigned for the CSS request (also via a Set-Cookie response header). No cookies show up in the Web Inspector. My assumption is that the cookie from the first request is not being saved, so the server assigns a new session.
When accessing the site on the standard HTTPS port (443), the behavior is as expected, as it is when accessing it on port 444 with Firefox. I'm on OS X Mountain Lion.
Any thoughts much appreciated!

Related

Flask session lost after redirect - seems like browser doesn't set the cookie. What am I missing?

I have a web app that makes a POST request to:
https://localhost:5000/processOneTapCredentials
This endpoint sets some data in flask.session and then returns a redirect to another endpoint (https://localhost:5000/login/success). I can confirm it attempts to set the session: the response headers for the first endpoint (the 302 response) include a Set-Cookie header.
On the second endpoint, though, the session is empty. I can see that when the 302 is followed, no Cookie header is sent with the request.
So the flow is:
Web app makes an XHR request (POST) to https://localhost:5000/processOneTapCredentials
https://localhost:5000/processOneTapCredentials sets some flask.session info and returns a 302 to https://localhost:5000/login/success
https://localhost:5000/login/success gets invoked (I see in dev tools), but there is no cookie, so session is empty.
I have set the Flask secret key correctly, and the session works across redirects in other situations (such as when Flask-Dance redirects to authenticate a user). So I must be doing something wrong.
What am I missing?
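For reference, a minimal reproduction of that flow (a sketch; the endpoint bodies and the stored value are assumed from the description above):

from flask import Flask, session, redirect

app = Flask(__name__)
app.secret_key = 'a-stable-secret'  # placeholder

@app.route('/processOneTapCredentials', methods=['POST'])
def process_one_tap_credentials():
    session['user'] = 'example-user'   # assumed session payload
    return redirect('/login/success')  # the 302; its response carries Set-Cookie

@app.route('/login/success')
def login_success():
    # Empty if the browser dropped the session cookie on the redirect.
    return str(dict(session))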
Make sure your app.config['SECRET_KEY'] is not changing. If you are redirecting to an external website, or redirecting from HTTP to HTTPS, you need to set your SameSite policy properly: with SameSite='Lax', cookies are not sent on cross-site requests. Try setting it to 'None' and see if the problem is related to your SameSite policy.
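A minimal sketch of those settings in Flask (note that browsers only accept SameSite=None cookies that are also marked Secure, so the two cookie settings below go together):

app.config['SECRET_KEY'] = 'use-a-fixed-value'   # must not change between restarts
app.config['SESSION_COOKIE_SAMESITE'] = 'None'   # allow the session cookie on cross-site requests
app.config['SESSION_COOKIE_SECURE'] = True       # required by browsers when SameSite=None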

nginx API cross origin calls not working only from some browsers

TL;DR: The React app's API calls return status code 200 but with no body in the response; this happens only when accessing the web app from some browsers.
I have a React + Django application deployed using nginx and uWSGI on a single CentOS 7 VM.
The React app is served by nginx on the main domain, and when users log in on the JavaScript app, REST API requests are made to the same nginx on a subdomain (i.e. backend.mydomain.com) for things like validating the token and fetching data.
This works on all recent versions of Firefox, Chrome, Safari, and Edge. However, some users have complained that they could not log in from their work network. They can visit the site, so obviously the JavaScript application is served to them, but when they log in, all of the requests come back with status 200, except the response has an empty body (and logging in requires a few pieces of information to be sent back with the login response to work).
For example, when I log in from where I am, I get a response with status=200 and a JSON object with a few parameters in the body.
But when one of the users showed me the same from their browser, they got Status=200 back with an empty response. They are using the same browser versions as I am; they tried both Firefox and Chrome with the same behaviour.
After finally getting hold of one of the users to send me some screenshots, I found the problem. In my browser, where the site works, the API calls to the backend had Referrer Policy set to strict-origin-when-cross-origin in the headers. However, in their browser, the same was showing up as no-referrer-when-downgrade.
I had not explicitly set the referrer policy, so the browsers were using their respective default values, which differ between browser versions (https://developers.google.com/web/updates/2020/07/referrer-policy-new-chrome-default).
To fix this, I added add_header 'Referrer-Policy' 'strict-origin-when-cross-origin'; to the nginx.conf file and restarted the server. More details here: https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Referrer-Policy
The users who had trouble before can now access the site API resources after clearing cache in their browsers.
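For anyone verifying the same fix, a quick check that the header is actually being served (a sketch using Python's requests library; the URL is a placeholder for your own deployment):

import requests

resp = requests.get('https://backend.mydomain.com/')  # placeholder URL
print(resp.headers.get('Referrer-Policy'))  # expect: strict-origin-when-cross-origin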

Why do cookies appear in the request header but not the response header when visiting a new website?

This is an incognito window in Chrome visiting the Oracle website. Notice that the request header already has a cookie in the very first request.
I also tried using GuzzleHttp in PHP, and Postman. I can't get the cookie from anywhere.
Actually, I am trying to crawl some other website, and that website has the same problem: I can't get the cookie, so I get rejected.
Isn't a cookie something that the server returns to the browser? Why, in this case, does the browser seem to know the cookie in the first place?
HTTP cookies are set once in a response by the server (with a Set-Cookie header); then they are included by the browser in each applicable subsequent request.
So it is perfectly normal that cookies your browser has already obtained are present in the request but not in the response.
Cookies may also be set on the browser side by JavaScript, but that also can't happen before the first request (which at minimum retrieves that JavaScript).
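A small sketch of that round trip using Python's requests library and httpbin.org's cookie endpoints:

import requests

s = requests.Session()

# The first response carries Set-Cookie; the session stores the cookie.
s.get('https://httpbin.org/cookies/set?sid=abc123')

# Later requests send the stored cookie back automatically, just as a
# browser does; no Set-Cookie appears in these responses.
r = s.get('https://httpbin.org/cookies')
print(r.json())  # {'cookies': {'sid': 'abc123'}}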

Firefox extension/addon does not store cookies

I'm working on a browser extension that authenticates with a remote server via XMLHttpRequest. In Firefox (59.0.2) I have the problem that the session cookie sent by the server is not stored in the browser. When looking at the network traffic, I get a Set-Cookie response from the server for every request:
Set-Cookie: JSESSIONID=node01abks2u96hf84wt0i1uqwsb9879.node0; Path=/
but it seems that the cookie is never accepted or stored in the extension.
When looking at Chrome (where the extension is working) my extension includes this cookie in the request:
Cookie: io=jCX1X9rlaOhCqE0nAAAB; JSESSIONID=node01abks2u96hf84wt0i1uqwsb9879.node0
However, this is not the case in Firefox. Why is Firefox not including the cookie in the request, and why is it not storing the cookie?
UPDATE: as suggested I filed a bug report:
https://bugzilla.mozilla.org/show_bug.cgi?id=1454806
Furthermore, I created a very minimal example addon that fails:
https://gitlab.com/czeidler/firefox-cookie-problem
Could somebody please let me know whether that addon really should work, or am I doing something wrong? To trigger the problem, open the debug view of the addon and select the network view. Then click the addon popup icon. This will trigger two requests to my server. The first reply contains a Set-Cookie header that is not reused in the second request.
I found the reason why it is not working. Firefox handles a request from the popup as a cross-domain request and for that reason does not set the cookie. I'm not sure whether Chrome and Firefox should behave the same here, or which approach is the better one. Here is how I fixed the issue to make it work in both browsers:
On the server (echoing the caller's Origin, because the wildcard * is rejected by browsers when credentials are involved):
response.addHeader("Access-Control-Allow-Origin", request.getHeader("Origin"));
response.addHeader("Access-Control-Allow-Credentials", "true");
In the popup (so the XMLHttpRequest sends and accepts cookies cross-origin):
connection.withCredentials = true;
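The same two response headers apply regardless of server stack; for comparison, a minimal sketch in Flask (the route name is hypothetical):

from flask import Flask, request, make_response

app = Flask(__name__)

@app.route('/api/data')  # hypothetical endpoint
def data():
    resp = make_response('{"ok": true}')
    # Echo the caller's Origin; the wildcard * is rejected by browsers
    # when Access-Control-Allow-Credentials is true.
    resp.headers['Access-Control-Allow-Origin'] = request.headers.get('Origin', '')
    resp.headers['Access-Control-Allow-Credentials'] = 'true'
    return resp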

How to make a cookie available to all paths in a domain?

I created a cookie in a Java filter and added it back to the response with
response.addCookie()
before returning to the client Node.js application. This web application is accessed using a localhost URL in the browser. After reading about cookie domain issues when using 'localhost', I did not set any domain or path on the cookie when creating it.
Now the Chrome and Firefox browsers don't show the cookie. All my URLs are on http://localhost, but each page has a different path.
Step 1: During a request to http://localhost/app/login cookie is created and set in the response
Step 2: When the page loads after response, no cookies are shown in Chrome
Step 3: During the next request, http://localhost/app/customer, the previously created cookie is not received when calling request.getCookies().
Step 4: Before returning back to client application, a cookie is created
Step 5: Now the cookie created in Step 4 is shown in Chrome
Step 6: The next request is also sent to http://localhost/app/customer; now the cookie created in Step 4 is received on the server as well.
If cookie creation for localhost is an issue, how does it work for Steps 4-6 only ?
How can I make the created cookie available to all paths under the localhost domain? I tried using cookie.setPath("/"), but no change.
Note: Due to admin privilege issues in my development machine, i am not able to set-up a domain name to my localhost IP in etc/hosts file.
In your Java server, you should call cookie.setPath("/") before adding it to the response.
Such cookie will match all request URIs. It's a pity that it is not the default behavior.
I have a more detailed explanation of cookie path here - http://bayou.io/release/0.9/javadoc/bayou/http/Cookie.html#path
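For comparison, the same idea in a Python/Flask handler (a sketch; Flask's set_cookie defaults to path='/', so passing it explicitly just mirrors the cookie.setPath("/") fix on the Java side):

from flask import Flask, make_response

app = Flask(__name__)

@app.route('/app/login')  # mirrors the login path from the question
def login():
    resp = make_response('logged in')
    # path='/' makes the cookie match every request URI under the domain,
    # not just URIs under /app.
    resp.set_cookie('sessionid', 'abc123', path='/')
    return resp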
Not sure path is the issue. Path does not affect whether a cookie is created; it only determines whether it is presented. If cookies aren't showing up in the browser's cookie jar they are being rejected for some reason other than path.
Chrome will not accept cookies for localhost because it does not accept cookies for a top-level domain: the domain in the URL has to have a dot in it somewhere. So you could either add a hosts entry (recommended) or just try using 127.0.0.1 instead of localhost.
Also, none of this will work if the cookie is marked as secure or is being set with a domain attribute. If either of those is the case, you MUST use a hosts entry instead of localhost or 127.0.0.1.