To protect the app from CSRF attacks, we set a cookie named XSRF-TOKEN on the server side. On the client side the cookie is set and sent back to the server with requests, but to validate CSRF on the server side we also need to send a header when firing a 'POST' service call. According to the Angular documentation, $http automatically sets the X-XSRF-TOKEN header by reading that cookie (please refer to the link), but our JavaScript code is unable to read the cookie even though we have deployed the application on the same domain.
The server-side cookie generation code and service deployment details are as follows:
final Cookie newCookie = new Cookie("XSRF-TOKEN", csrfValue);
newCookie.setPath("/");
httpResponse.addCookie(newCookie);
The UI is deployed on port 8080 and the service on port 8084, inside the same VM.
Ports 8080 and 8084 are different origins, so you can't read cookies from one on the other, just as JavaScript running on your site can't access the cookies of any other website.
How does the service authenticate the user? If it's token-based, and the token is sent as a request header, you don't even need further protection from CSRF.
I previously posted here about not being able to send an HttpOnly cookie from Next.js to Django, either from getServerSideProps or from useEffect. I think the reason is that my Django and Next.js servers are running on different origins, so I need the back end and front end to share the same origin. Does that mean requests from Next.js should come from 127.0.0.1:8000 instead of 127.0.0.1:3000?
If yes, do I need to use a proxy within Next.js?
Also, I have set up django-cors-headers; do I still need a proxy?
Yes. You'll need to set up a proxy server that will forward your requests from 127.0.0.1:3000 to 127.0.0.1:8000. That way, your cookies will be shared with your backend server since the browser will assume they are from the same origin.
As for setting up an HTTP proxy in Next.js, you can refer to this GitHub answer.
Setting Django CORS headers will not apply to your cookies. CORS deals with requests from different origins: setting these headers allows your backend to accept requests from the origins you have specified in your allowed-hosts setting. Cookies can be shared across different subdomains, but never across different domains.
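For context, here is a minimal sketch of the django-cors-headers settings involved, assuming the ports from the question; even with these in place, the cookie rules described above still apply.

# settings.py -- minimal django-cors-headers sketch (origin assumed from the question)
INSTALLED_APPS = [
    # ...
    "corsheaders",
]

MIDDLEWARE = [
    "corsheaders.middleware.CorsMiddleware",  # placed before CommonMiddleware
    # ...
]

# Origins allowed to make cross-origin requests to this backend
CORS_ALLOWED_ORIGINS = ["http://127.0.0.1:3000"]

# Let the browser include cookies/credentials on those requests
CORS_ALLOW_CREDENTIALS = True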
I'm currently working on a React project. The development server (Bottle/Python) for the project is hosted remotely, and my React dev server is on localhost. Part of the authentication process for the application involves setting a cookie on login, but because of SameSite and Secure rules that cookie is not being set, meaning that my dev frontend can't access any of the data it needs.
The server engineer and I have added SameSite=None to the cookie as well as Secure, but because my localhost is not HTTPS the cookie is still not being stored properly (I get the error message "This Set-Cookie was blocked because it had the 'Secure' attribute but was not received over a secure connection").
There are no issues when the app is deployed because everything is on the same domain, but for now we're stuck - we've been trying to solve the issue for several hours but can't seem to crack it.
My question is - what is the best development practice if you need to access a non-local development server, but can't actually just have your own version of the server running on your local machine?
Do I:
1. Need to make my localhost HTTPS somehow?
2. Need to make the dev-server domain HTTPS?
3. Need to install the server locally because there's just no way to do this?
Apologies if this is a noob question, it would be great to have some advice.
Many thanks.
The short answer is:
1. No
2. Yes
3. No
You can run your app on http://localhost:port. Assuming the response from your dev server contains a Set-Cookie header for a cookie with the Secure flag, the dev server URL has to be HTTPS in order for the browser to accept the cookie.
I have this setup and it works just fine.
Regarding CORS (as mentioned in the title of the question): your server has to be configured to accept credentials and to have the allowed origins configured. The client app, when making an XHR request, has to set withCredentials: true. Check points 2 and 3 in my post for details.
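Since the dev server in the question is Bottle, a minimal sketch of those two server-side pieces (allowed origin plus credentials) might look like the following; the frontend origin and port are assumptions, and preflight handling is omitted.

# A minimal Bottle sketch of CORS with credentials (origin/port are placeholders)
from bottle import hook, response, route, run

FRONTEND_ORIGIN = "http://localhost:3000"  # assumption: React dev server origin

@hook("after_request")
def add_cors_headers():
    # Credentialed requests require an explicit origin, not "*"
    response.set_header("Access-Control-Allow-Origin", FRONTEND_ORIGIN)
    response.set_header("Access-Control-Allow-Credentials", "true")

@route("/api/ping")
def ping():
    return {"ok": True}

run(host="localhost", port=8081)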
Also note that if you are using Chrome, you can bypass the SameSite=None and Secure requirement for development purposes by disabling the flag "Cookies without SameSite must be secure", as also detailed here.
Can a proxy server intercept my HTTPS request and set cookies before actually sending the request?
I'm doing a GET on a URL from the Chrome browser. In the developer tools, under "Network", I noticed that the first request, the one that I made, has cookies set, but I did not set any cookies.
Any thoughts?
No, it can't. To proxy HTTPS requests, your browser issues an HTTP CONNECT command (https://developer.mozilla.org/en-US/docs/Web/HTTP/Methods/CONNECT). The proxy then creates a tunnel between the browser and the target server.
A conventional proxy can neither view nor manipulate a TLS-encrypted data stream, so a CONNECT request simply asks the proxy to open a pipe between the client and the server. The proxy here is just a facilitator - it blindly forwards data in both directions without knowing anything about the contents. The negotiation of the TLS connection happens over this pipe, and the subsequent flow of requests and responses is completely opaque to the proxy.
The proxy cannot modify or even see what is being transferred, as it is protected by TLS encryption.
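To make the mechanism concrete, here is a minimal Python sketch of a client opening such a tunnel with http.client; the proxy and target hostnames are placeholders. The CONNECT step is the only part of the exchange the proxy gets to see.

import http.client

# Connect to the proxy (hostname and port are placeholders)
conn = http.client.HTTPSConnection("proxy.example.com", 8080)

# The HTTP CONNECT step: ask the proxy to open a raw tunnel to the target.
# This is all the proxy ever sees of the exchange.
conn.set_tunnel("www.example.com", 443)

# The TLS handshake and the actual request travel through the tunnel,
# opaque to the proxy.
conn.request("GET", "/")
print(conn.getresponse().status)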
The only way to modify HTTPS connections on the fly is if you install an external CA certificate on your computer, so the intercepting proxy can re-encrypt the traffic with certificates it issues itself. This is known as a MITM (man-in-the-middle) attack.
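For illustration, this is how tools like mitmproxy work: once its CA certificate is trusted on the machine, an addon can rewrite requests. A minimal sketch, assuming mitmproxy's scripting API, that injects a cookie:

# addon.py -- run with: mitmdump -s addon.py
# This only works because mitmproxy's CA certificate has been installed/trusted.
from mitmproxy import http

def request(flow: http.HTTPFlow) -> None:
    # Add a cookie to every intercepted request before forwarding it
    flow.request.headers["Cookie"] = "injected=1"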
I wish to understand whether an HTTP cookie is the same for every browser that makes the request, or whether the server generates a different cookie for each browser, even though the client machine is the same.
For example, suppose I connect to an Amazon server, first from Firefox and later from Chrome. Will I receive two different cookies, or will I receive the same cookie?
TL;DR: does the server keep track of the client machine or of the browser?
Thanks in advance.
What is the best way to debug a Django app that runs on top of TLS/SSL?
Background:
I have a Django web app that uses X.509 client side certificates for authentication. When running under Apache, my app can only be reached via HTTPS. Clients that connect to the app provide a client side certificate which Apache validates and then forwards to the app in an environment variable. The app parses the certificate and provides access controlled content.
So far, I have only been able to debug the app under regular HTTP, with "./manage.py runserver". I have simulated an HTTPS connection by using a custom view handler middleware that kicks in in debug mode. The view handler adds information to the request, similar to the information that would be parsed out of an actual client-side certificate when running under HTTPS.
It would make debugging much easier for me if I could debug with the actual client side certificates that clients provide when connecting via HTTPS.
We use nginx in front of Django, with client certificate checking. Nginx does the SSL termination, client cert validation, and checking against the revocation list. The client cert fields are passed up to the Django app in header variables.
So our Django app doesn't receive the cert itself; it just looks at the header variables. I think the same mechanism applies to Apache.
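For illustration, a minimal sketch of how a Django view might read those headers, assuming the same header names used in the client examples below (whatever names nginx is configured to set):

# views.py -- minimal sketch; assumes nginx sets the X_SSL_CLIENT_* headers
from django.http import HttpResponse, HttpResponseForbidden

def whoami(request):
    # Incoming headers appear in request.META with an HTTP_ prefix
    subject_dn = request.META.get("HTTP_X_SSL_CLIENT_S_DN")
    issuer_dn = request.META.get("HTTP_X_SSL_CLIENT_I_DN")
    if not subject_dn:
        return HttpResponseForbidden("No client certificate information")
    return HttpResponse("Hello, %s (issued by %s)" % (subject_dn, issuer_dn))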
For clients accessing the development server (e.g. './manage.py runserver'), we simply have a special case in the client. Example of a Python client:

import http.client

if proto == "https":
    # real client: present the client certificate over TLS
    conn = http.client.HTTPSConnection("cert." + webhost + ":" + port,
                                       key_file=certfile, cert_file=certfile)
    headers = {}
else:
    # fake client for local connections: pass cert info in headers, as it would come
    # out of nginx
    conn = http.client.HTTPConnection(webhost + ":" + port)
    headers = {'X_SSL_CLIENT_S_DN': '/C=US/ST=California/O=yyyy/CN=zzzz',
               'X_SSL_CLIENT_I_DN': '/C=US/ST=California/O=xxxx/CN=wwww',
               'X_SSL_CLIENT_SERIAL': hex(serialnum),
               'USER_AGENT': "test client user agent"}
For unit tests, we do the same thing using the Django test client:
from django.test.client import Client

self.client = Client()
response = self.client.get(url, data, **{
    'HTTP_X_SSL_CLIENT_S_DN': '/C=US/ST=California/O=yyyy/CN=zzzz',
    'HTTP_X_SSL_CLIENT_I_DN': '/C=US/ST=California/O=xxxx/CN=wwww',
    'HTTP_X_SSL_CLIENT_SERIAL': hex(serialnum),
    'HTTP_USER_AGENT': "test client user agent",
})
I've come up with a workaround that works fairly well for me. I still debug over HTTP, but I pass the client-side certificate in via an HTTP header. So, when I debug the web app over HTTP, I have the clients copy the client-side certificate into an HTTP header. Before entering the views, the web app copies the certificate from the header into the regular location in which it would be passed by Apache when using HTTPS.
The client side certificates are PEM formatted so, to be able to pass them in HTTP headers, the only thing that needs to be done is to remove the newlines on the client and reinsert them on the server.
If using this approach, note that Apache's default limit for the size of a single HTTP header field is 8190 bytes, configured with the LimitRequestFieldSize directive. For certificates that are larger than that, the configuration must be changed or the certificate must be split up and passed in multiple headers.
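For illustration, a minimal sketch of that newline handling in Python, assuming a hypothetical X-Client-Cert header name; the actual header name and the middleware hook are up to you:

import re
import textwrap

# Client side: drop the newlines so the PEM certificate fits in one header value
with open("client.pem") as f:
    flat_pem = "".join(f.read().splitlines())
headers = {"X-Client-Cert": flat_pem}

# Server side (e.g. in a middleware): reinsert the newlines before handing the
# value to the code that normally reads the certificate Apache would pass
def restore_pem(flat):
    m = re.match(r"-----BEGIN CERTIFICATE-----(.+)-----END CERTIFICATE-----", flat)
    if not m:
        return None
    body = "\n".join(textwrap.wrap(m.group(1), 64))
    return "-----BEGIN CERTIFICATE-----\n" + body + "\n-----END CERTIFICATE-----\n"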