Cannot persist cookies when using RTK Query

I'm creating a pet project with React and Redux on the front end and Node.js with Express on the back end. It's the first time I've used Redux and RTK Query, and I'm confused about cookies not being saved.
I tried both variants with RTK Query, and neither worked for me.
const baseQuery = fetchBaseQuery({
  baseUrl,
  credentials: "include",
});
and
signIn: builder.mutation<User, SignInPayload>({
  query(body) {
    return {
      url: "/auth/jwt",
      method: "POST",
      body,
      credentials: "include",
    };
  },
}),
If I reach the same endpoint using axios with { withCredentials: true }, everything works fine and the cookies are persisted.
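For comparison, a minimal sketch of the equivalent axios call (the payload shape here is an assumption, not the real code):
import axios from "axios";

// Sketch only: same endpoint, but via axios with withCredentials.
// `baseUrl` and the /auth/jwt path are the ones from the RTK Query setup above.
axios
  .post(`${baseUrl}/auth/jwt`, { email: "user@example.com", password: "secret" }, { withCredentials: true })
  .then((res) => console.log(res.data));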
I attach my Request/Response Headers below.

Related

Django - why am I not authenticated after logging in from Axios/AJAX?

I'm building a separate Vue.js/Django app where Django will communicate with the Vue front end using JSON. In order to be able to use the standard session authentication and django-allauth, I will deploy the two apps on the same server and on the same port.
Here is my problem: after I log in from the Vue app using Axios, I don't receive any response, but I notice that a session is created in the db, so I'm assuming that I'm getting logged in. But if I try to reach an endpoint that prints request.user.is_authenticated, I get False, and request.user returns Anonymous, so I'm not logged in anymore. How can I solve this?
Here is my Axios code:
bodyFormData.append('login', 'root');
bodyFormData.append('password', 'test');
axios({
  method: "post",
  url: "http://127.0.0.1:8000/accounts/login/",
  data: bodyFormData,
  withCredentials: true,
  headers: { "Content-Type": "application/json" },
})
  .then(function (response) {
    // handle success
    console.log(response);
  })
  .catch(function (response) {
    // handle error
    console.log(response);
  });
I think django-allauth supports AJAX authentication on its URLs, but I don't understand how to make it return something, and how my Vue app can stay authenticated once I submit the Axios form. Any advice is welcome!
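One thing worth double-checking (a hedged sketch, not a confirmed fix): the session cookie only comes back on later requests if those requests also opt into credentials, so any follow-up Axios call needs withCredentials as well. The endpoint below is hypothetical:
// Hypothetical follow-up request; the URL is made up for illustration.
axios.get("http://127.0.0.1:8000/accounts/whoami/", { withCredentials: true })
  .then(function (response) {
    console.log(response.data); // should reflect the authenticated user
  });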

How can I set a cookie with Relay?

I've got Express.js server code:
...
const server = new GraphQLServer({
  typeDefs: `schema.graphql`,
  resolvers,
  context: context => {
    let cookie = get(context, 'request.headers.cookie');
    return { ...context, cookie, pubsub };
  },
});
so that I can attach the cookie to the resolvers' outgoing requests:
...
method: 'GET',
headers: {
  cookie: context.cookie,
},
Now I want to be able to use Relay (as a GraphQL client) and I want to be able to attach a cookie to Relay's requests as well.
I've found a similar question, but it's not clear to me where I can insert that code:
Relay.injectNetworkLayer(
  new Relay.DefaultNetworkLayer('/graphql', {
    credentials: 'same-origin',
  })
);
since I don't import Relay in Environment.js.
Update: I tried to add
import { Relay, graphql, QueryRenderer } from 'react-relay';

Relay.injectNetworkLayer(
  new Relay.DefaultNetworkLayer('http://example.com/graphql', {
    credentials: 'same-origin',
  })
);
to a file where I send GraphQL queries (e.g., client.js), but it says that Relay is undefined.
Update #2: this repo looks interesting.
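For what it's worth, injectNetworkLayer and DefaultNetworkLayer come from Relay Classic, which is why react-relay (Relay Modern) does not export a Relay object. With Relay Modern, the credentials option usually goes into the fetch function passed to Network.create in Environment.js. A rough sketch, assuming a standard Relay Modern setup (the endpoint URL is just a placeholder):
// Environment.js (sketch): cookies are attached by fetch itself via `credentials`.
import { Environment, Network, RecordSource, Store } from 'relay-runtime';

function fetchQuery(operation, variables) {
  return fetch('http://example.com/graphql', {
    method: 'POST',
    credentials: 'include', // use 'same-origin' if client and server share an origin
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ query: operation.text, variables }),
  }).then(response => response.json());
}

export default new Environment({
  network: Network.create(fetchQuery),
  store: new Store(new RecordSource()),
});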

Apify: Preserve headers in RequestQueue

I'm trying to crawl our local Confluence installation with the PuppeteerCrawler. My strategy is to log in first, then extract the session cookies and use them in the headers of the start URL. The code is as follows:
First, I log in "by foot" to extract the relevant credentials:
const Apify = require("apify");
const browser = await Apify.launchPuppeteer({sloMo: 500});
const page = await browser.newPage();
await page.goto('https://mycompany/confluence/login.action');
await page.focus('input#os_username');
await page.keyboard.type('myusername');
await page.focus('input#os_password');
await page.keyboard.type('mypasswd');
await page.keyboard.press('Enter');
await page.waitForNavigation();
// Get cookies and close the login session
const cookies = await page.cookies();
browser.close();
const cookie_jsession = cookies.filter( cookie => {
return cookie.name === "JSESSIONID"
})[0];
const cookie_crowdtoken = cookies.filter( cookie => {
return cookie.name === "crowd.token_key"
})[0];
Then I'm building up the crawler structure with the prepared request header:
const startURL = {
  url: 'https://mycompany/confluence/index.action',
  method: 'GET',
  headers: {
    Accept: 'text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8',
    'Accept-Encoding': 'gzip, deflate, br',
    'Accept-Language': 'de-DE,de;q=0.9,en-US;q=0.8,en;q=0.7',
    Cookie: `${cookie_jsession.name}=${cookie_jsession.value}; ${cookie_crowdtoken.name}=${cookie_crowdtoken.value}`,
  }
};
const requestQueue = await Apify.openRequestQueue();
await requestQueue.addRequest(new Apify.Request(startURL));
const pseudoUrls = [new Apify.PseudoUrl('https://mycompany/confluence/[.*]')];

const crawler = new Apify.PuppeteerCrawler({
  launchPuppeteerOptions: { headless: false, slowMo: 500 },
  requestQueue,
  handlePageFunction: async ({ request, page }) => {
    const title = await page.title();
    console.log(`Title of ${request.url}: ${title}`);
    console.log(await page.content());
    await Apify.utils.enqueueLinks({
      page,
      selector: 'a:not(.like-button)',
      pseudoUrls,
      requestQueue,
    });
  },
  maxRequestsPerCrawl: 3,
  maxConcurrency: 10,
});
await crawler.run();
The by-foot login and cookie extraction seem to be OK (the "curlified" request works perfectly), but Confluence doesn't accept the login via Puppeteer / headless Chromium. It seems like the headers are getting lost somehow.
What am I doing wrong?
Without first going into the details of why the headers don't work, I would suggest defining a custom gotoFunction in the PuppeteerCrawler options, such as:
{
  // ...
  gotoFunction: async ({ request, page }) => {
    await page.setCookie(...cookies); // From page.cookies() earlier.
    return page.goto(request.url, { timeout: 60000 });
  }
}
This way, you don't need to do the cookie parsing, and the cookies will automatically be injected into the browser before each page load.
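Putting the pieces together, the crawler setup might then look roughly like this (a sketch that keeps the full cookies array from the login step and drops the hand-built Cookie header on the start URL):
const requestQueue = await Apify.openRequestQueue();
await requestQueue.addRequest(new Apify.Request({ url: 'https://mycompany/confluence/index.action' }));

const crawler = new Apify.PuppeteerCrawler({
  requestQueue,
  gotoFunction: async ({ request, page }) => {
    await page.setCookie(...cookies); // the array collected via page.cookies() above
    return page.goto(request.url, { timeout: 60000 });
  },
  handlePageFunction: async ({ request, page }) => {
    console.log(`Title of ${request.url}: ${await page.title()}`);
  },
});
await crawler.run();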
As a note, modifying default request headers when using a headless browser is not a good practice, because it may lead to blocking on some sites that match received headers against a list of known browser fingerprints.
Update:
The below section is no longer relevant, because you can now use the Request class to override headers as expected.
The headers problem is a complex one involving request interception in Puppeteer. Here's the related GitHub issue in Apify SDK. Unfortunately, the method of overriding headers via a Request object currently does not work in PuppeteerCrawler, so that's why you were unsuccessful.

Batch requests with credentials using React Apollo

How do you batch requests with credentials? I'm using an http-only JWT cookie, and HttpLink allows me to pass a credentials: 'include' option which will forward the cookie through to my Graphene server. When I try to switch to BatchHttpLink, it no longer accepts that option for configuration. Looking through the source, it doesn't appear there's an easy way to configure this. Does anyone know how to handle this?
Here's how I was doing it without batching:
window['app-react'].GRAPHQL_URL = window['app-react'].GRAPHQL_URL || 'http://backend.app.local/graphiql'

const httpLink = new HttpLink({
  uri: window['app-react'].GRAPHQL_URL,
  credentials: 'include'
})

const client = new ApolloClient({
  link: httpLink,
  cache: new InMemoryCache(),
})
Here's how I wish it worked:
const batchHttpLink = new BatchHttpLink({
  uri: window['joor-react'].GRAPHQL_URL,
  credentials: 'include'
})

const client = new ApolloClient({
  link: batchHttpLink,
  cache: new InMemoryCache(),
})
When I do it this way though, the JWT cookie isn't passed in the header.
Right now, apollo-link-batch-http is behind apollo-link-http in its API. Here is a note from the docs (as of 2017-12-07):
Note: This package will be updated to remove the dependency on apollo-fetch and use the same options / API as the http-link.
Using BatchHttpLink with apollo-fetch
The current API of apollo-link-batch-http requires a custom apollo-fetch if you want to customize things like credentials. Here are a couple of options:
1. Provide a customFetch to createApolloFetch and define credentials in the fetch options (a sketch of this follows the middleware example below).
2. Use apollo-fetch middleware:
fetch.batchUse(({ options }, next) => {
  options.credentials = 'include';
  next();
});

Fetch API with Cookie

I am trying out the new Fetch API but am having trouble with cookies. Specifically, after a successful login, there is a Cookie header in future requests, but Fetch seems to ignore that header, and all my requests made with Fetch are unauthorized.
Is it because Fetch is still not ready, or does Fetch not work with cookies?
I build my app with Webpack. I also use Fetch in React Native, which does not have the same issue.
Fetch does not send cookies by default. To enable cookies, do this:
fetch(url, {
  credentials: "same-origin"
}).then(...).catch(...);
In addition to @Khanetor's answer, for those who are working with cross-origin requests: use credentials: 'include'.
Sample JSON fetch request:
fetch(url, {
  method: 'GET',
  credentials: 'include'
})
  .then((response) => response.json())
  .then((json) => {
    console.log('Gotcha');
  })
  .catch((err) => {
    console.log(err);
  });
https://developer.mozilla.org/en-US/docs/Web/API/Request/credentials
Just solved it, after two days of brute force.
For me, the secret was the following:
I called POST /api/auth and saw that cookies were successfully received.
Then I called GET /api/users/ with credentials: 'include' and got a 401 Unauthorized, because no cookies were sent with the request.
The KEY is to set credentials: 'include' for the first /api/auth call too.
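In other words, something like this minimal sketch (the request body is an assumption):
// Both calls opt into credentials, so the cookie set by /api/auth is stored and then sent.
fetch('/api/auth', {
  method: 'POST',
  credentials: 'include', // needed here too, otherwise the Set-Cookie may not be stored
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ login: 'user', password: 'secret' }),
})
  .then(() => fetch('/api/users/', { credentials: 'include' }))
  .then((response) => response.json())
  .then((users) => console.log(users));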
If you are reading this in 2019, credentials: "same-origin" is the default value.
fetch(url).then
Programmatically overwriting the Cookie header on the browser side won't work.
The fetch documentation mentions that some header names are forbidden, and Cookie happens to be one of those forbidden header names, which cannot be modified programmatically. Take the following code as an example:
When executed in the Chrome DevTools console on the page https://httpbin.org/, the Cookie: 'xxx=yyy' header will be ignored, and the browser will always send the value of document.cookie as the cookie, if there is one.
If it is executed on a different origin, no cookie is sent.
fetch('https://httpbin.org/cookies', {
  headers: {
    Cookie: 'xxx=yyy'
  }
})
  .then(response => response.json())
  .then(data => console.log(JSON.stringify(data, null, 2)));
P.S. You can create a sample cookie foo=bar by opening https://httpbin.org/cookies/set/foo/bar in the Chrome browser.
See Forbidden header name for details.
Just adding to the correct answers here for .NET Web API 2 users.
If you are using CORS because your client site is served from a different address than your Web API, then you also need to set SupportsCredentials = true in the server-side configuration.
// Access-Control-Allow-Origin
// https://learn.microsoft.com/en-us/aspnet/web-api/overview/security/enabling-cross-origin-requests-in-web-api
var cors = new EnableCorsAttribute(Settings.CORSSites, "*", "*");
cors.SupportsCredentials = true;
config.EnableCors(cors);
This works for me:
import Cookies from 'universal-cookie';

const cookies = new Cookies();

function headers(set_cookie = false) {
  let headers = {
    'Accept': 'application/json',
    'Content-Type': 'application/json',
    'X-CSRF-Token': $('meta[name="csrf-token"]').attr('content')
  };
  if (set_cookie) {
    headers['Authorization'] = "Bearer " + cookies.get('remember_user_token');
  }
  return headers;
}
Then build your call:
export function fetchTests(user_id) {
  return function (dispatch) {
    let data = {
      method: 'POST',
      credentials: 'same-origin',
      mode: 'same-origin',
      body: JSON.stringify({
        user_id: user_id
      }),
      headers: headers(true)
    };
    return fetch('/api/v1/tests/listing/', data)
      .then(response => response.json())
      .then(json => dispatch(receiveTests(json)));
  };
}
My issue was that my cookie was set on a specific URL path (e.g., /auth), but I was fetching from a different path. I needed to set my cookie's path to /.
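On an Express back end (like the one in the question at the top), that fix can look roughly like this hedged sketch; the route and cookie name are hypothetical:
// Hypothetical Express handler: set the cookie for the whole site, not just /auth.
app.post('/auth', (req, res) => {
  // ... verify credentials here ...
  res.cookie('token', 'some-jwt-value', {
    httpOnly: true,
    path: '/', // make the cookie valid for every path on the site
  });
  res.sendStatus(204);
});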
If it still doesn't work for you after fixing the credentials:
I was also using
credentials: "same-origin"
and it used to work, then suddenly it didn't anymore. After a lot of digging, I realized that I had changed my website URL to http://192.168.1.100 to test it on the LAN, and that was the URL being used to send the request, even though I was on http://localhost:3000.
So, in conclusion, be sure that the domain of the page matches the domain of the fetch URL.
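One way to avoid that kind of mismatch (a sketch; the /api/session path is made up) is to build request URLs from the page's own origin, or simply use relative paths:
// Derive the request URL from the page the user is actually on.
const apiUrl = new URL('/api/session', window.location.origin);

fetch(apiUrl, { credentials: 'same-origin' })
  .then((response) => response.json())
  .then((data) => console.log(data));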