I'm trying to remove jQuery from a React/Redux/Django webapp and replace the $.ajax method with the Fetch API. I've more or less got all my GET requests working fine and I seem to be able to hit my POST requests, but I cannot seem to format my request in such a way as to actually get my POST data into the Django request.POST object. Every time I hit my /sign_in view, the request.POST object is empty. My entire app's backend is built around using Django forms (no Django templates, just React controlled components) and I would really like to not have to rewrite all my views to use request.body or request.data.
Here is all the code I can think that would be relevant, please let me know if there's more that would be helpful:
This is the curried function I use to build my full POST data and attach the CSRF token:
const setUpCsrfToken = () => {
  const csrftoken = Cookies.get('csrftoken')
  return function post (url, options) {
    const defaults = {
      'method': 'POST',
      'credentials': 'include',
      'headers': {
        'X-CSRFToken': csrftoken,
        'Content-Type': 'application/x-www-form-urlencoded'
      }
    }
    const merged = merge(options, defaults)
    return fetch(url, merged)
  }
}
export const post = setUpCsrfToken()
This is the API method I use from my React app:
export const signIn = data => {
  return post('/api/account/sign_in/', data)
}
The data when it is originally packaged up in the React app itself is as simple as an object with string values:
{
  email: 'email@email.com',
  password: 'password'
}
I've looked at these questions and found them nominally helpful, but I can't figure out how to synthesize an answer for myself that takes into account what I assume are some of the intricacies of Django:
POST Request with Fetch API?
Change a jquery ajax POST request into a fetch api POST
Convert JavaScript object into URI-encoded string
Is there a better way to convert a JSON packet into a query string?
Thanks!
You have to set the appropriate X-Requested-With header. jQuery does this under the hood.
X-Requested-With: XMLHttpRequest
So, in your example, you would want something like:
const setUpCsrfToken = () => {
  const csrftoken = Cookies.get('csrftoken')
  return function post (url, options) {
    const defaults = {
      'method': 'POST',
      'credentials': 'include',
      'headers': new Headers({
        'X-CSRFToken': csrftoken,
        'Content-Type': 'application/x-www-form-urlencoded; charset=UTF-8',
        'X-Requested-With': 'XMLHttpRequest'
      })
    }
    const merged = merge(options, defaults)
    return fetch(url, merged)
  }
}
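One more thing worth checking: Django only populates request.POST from a form-encoded (or multipart) body, so the data object also has to be serialized into the request body before it reaches fetch. A minimal sketch of the call site, assuming the signIn data object from the question and using URLSearchParams to do the encoding:

export const signIn = data => {
  // URLSearchParams turns { email, password } into
  // "email=...&password=...", which matches the
  // application/x-www-form-urlencoded Content-Type set above.
  return post('/api/account/sign_in/', {
    body: new URLSearchParams(data).toString()
  })
}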
Related
I have the following _app.js for my NextJS app.
I want to change the authorization header on login via a cookie that will be set. I think I can handle the cookie and login functionality, but I am stuck on how to get the cookie into the ApolloClient authorization header. Is there a way to pass the headers, with a token from the cookie, in a mutation? Any thoughts here?
I have the cookie working, so I have a logged-in token, but I need to change the ApolloClient token to the new one from the cookie in _app.js. Not sure how this is done.
import "../styles/globals.css";
import { ApolloClient, ApolloProvider, InMemoryCache } from "#apollo/client";
const client = new ApolloClient({
uri: "https://graphql.fauna.com/graphql",
cache: new InMemoryCache(),
headers: {
authorization: `Bearer ${process.env.NEXT_PUBLIC_FAUNA_SECRET}`,
},
});
console.log(client.link.options.headers);
function MyApp({ Component, pageProps }) {
return (
<ApolloProvider client={client}>
<Component {...pageProps} />
</ApolloProvider>
);
}
export default MyApp;
UPDATE: I've read something in the Apollo docs about setting this to pass the cookie, but I don't quite understand it.
const link = createHttpLink({
  uri: '/graphql',
  credentials: 'same-origin'
});

const client = new ApolloClient({
  cache: new InMemoryCache(),
  link,
});
UPDATE: So I have made good progress with the above; it allows me to pass headers via the context in useQuery, like below. Now the only problem is that the useQuery seems to run before the cookieData has loaded, or something like that, because if I pass in an API key directly it works, but the fetched cookie gives me an invalid db secret, and it's the same key.
const { data: cookieData, error: cookieError } = useSWR(
  "/api/cookie",
  fetcher
);

console.log(cookieData);

// const { loading, error, data } = useQuery(FORMS);
const { loading, error, data } = useQuery(FORMS, {
  context: {
    headers: {
      authorization: "Bearer " + cookieData,
    },
  },
});
Any ideas on this problem would be great.
If you need to run some GraphQL queries after some other data is loaded, then I recommend putting the latter queries in a separate React component with the secret as a prop and only loading it once the former data is available. Or you can use lazy queries.
separate component
const Form = ({ cookieData }) => {
  useQuery(FORMS, {
    context: {
      headers: {
        authorization: "Bearer " + cookieData,
      },
    },
  });
  return null; /* ... whatever ... */
};

const FormWrapper = () => {
  const { data: cookieData, error: cookieError } = useSWR(
    "/api/cookie",
    fetcher
  );
  return cookieData ? <Form cookieData={cookieData} /> : <p>Loading...</p>;
};
I might be missing some nuances with when/how React will mount and unmount the inner component, so I suppose you should be careful with that.
Manual Execution with useLazyQuery
https://www.apollographql.com/docs/react/data/queries/#manual-execution-with-uselazyquery
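For the lazy-query route, a sketch along these lines should work inside the component (assuming the same FORMS query and cookie fetch, with useEffect imported from react and useLazyQuery from @apollo/client):

const { data: cookieData } = useSWR("/api/cookie", fetcher);

// getForms does nothing until it is called, so the query can wait for the cookie.
const [getForms, { loading, error, data }] = useLazyQuery(FORMS);

useEffect(() => {
  if (cookieData) {
    getForms({
      context: {
        headers: {
          authorization: "Bearer " + cookieData,
        },
      },
    });
  }
}, [cookieData]);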
I have been trying to send data from my Angular front end to my Django backend, but the data doesn't seem to be sent and I can't figure out why. This is my code on the frontend:
const post_data = {
  email: form.value.email,
  password: form.value.password
}

const headers = new HttpHeaders({
  'Content-Type': 'application/json',
});

const params = new HttpParams()

const options = {
  headers,
  params,
};

this.httpClient.post('https://my_website/signin/check/', post_data, options).subscribe(
  result => { console.log(result); },
  error => { },
  () => {}
)
In my Django backend, I print request.POST to see if the data was sent correctly, and nothing shows up. This is the result:
<QueryDict: {}>
Any help would be nice. Thanks!
As also mentioned in the comments, request.POST will contain the data only if it arrives as a form-encoded POST. You can verify this by using Postman to send a request with form-encoded data.
However, in your case request.body will contain the post_data that you passed in the HTTP request.
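Alternatively, if you would rather keep reading request.POST on the Django side, one option (a sketch, reusing the form values from the question) is to send the body form-encoded instead of as JSON:

const body = new HttpParams()
  .set('email', form.value.email)
  .set('password', form.value.password);

// HttpParams.toString() yields "email=...&password=...", and the
// form-encoded Content-Type is what Django parses into request.POST.
this.httpClient.post('https://my_website/signin/check/', body.toString(), {
  headers: new HttpHeaders({ 'Content-Type': 'application/x-www-form-urlencoded' }),
}).subscribe(result => { console.log(result); });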
I'm trying to crawl our local Confluence installation with the PuppeteerCrawler. My strategy is to log in first, then extract the session cookies and use them in the header of the start URL. The code is as follows:
First, I log in 'by foot' to extract the relevant credentials:
const Apify = require("apify");

const browser = await Apify.launchPuppeteer({ slowMo: 500 });
const page = await browser.newPage();

await page.goto('https://mycompany/confluence/login.action');
await page.focus('input#os_username');
await page.keyboard.type('myusername');
await page.focus('input#os_password');
await page.keyboard.type('mypasswd');
await page.keyboard.press('Enter');
await page.waitForNavigation();

// Get cookies and close the login session
const cookies = await page.cookies();
await browser.close();

const cookie_jsession = cookies.filter(cookie => {
  return cookie.name === "JSESSIONID"
})[0];

const cookie_crowdtoken = cookies.filter(cookie => {
  return cookie.name === "crowd.token_key"
})[0];
Then I'm building up the crawler structure with the prepared request header:
const startURL = {
  url: 'https://mycompany/confluence/index.action',
  method: 'GET',
  headers: {
    Accept: 'text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8',
    'Accept-Encoding': 'gzip, deflate, br',
    'Accept-Language': 'de-DE,de;q=0.9,en-US;q=0.8,en;q=0.7',
    Cookie: `${cookie_jsession.name}=${cookie_jsession.value}; ${cookie_crowdtoken.name}=${cookie_crowdtoken.value}`,
  }
}

const requestQueue = await Apify.openRequestQueue();
await requestQueue.addRequest(new Apify.Request(startURL));

const pseudoUrls = [new Apify.PseudoUrl('https://mycompany/confluence/[.*]')];

const crawler = new Apify.PuppeteerCrawler({
  launchPuppeteerOptions: { headless: false, slowMo: 500 },
  requestQueue,
  handlePageFunction: async ({ request, page }) => {
    const title = await page.title();
    console.log(`Title of ${request.url}: ${title}`);
    console.log(await page.content());
    await Apify.utils.enqueueLinks({
      page,
      selector: 'a:not(.like-button)',
      pseudoUrls,
      requestQueue
    });
  },
  maxRequestsPerCrawl: 3,
  maxConcurrency: 10,
});
await crawler.run();
The by-foot login and cookie extraction seem to be fine (the "curlified" request works perfectly), but Confluence doesn't accept the login via Puppeteer / headless Chromium. It seems like the headers are getting lost somehow.
What am I doing wrong?
Without first going into the details of why the headers don't work, I would suggest defining a custom gotoFunction in the PuppeteerCrawler options, such as:
{
  // ...
  gotoFunction: async ({ request, page }) => {
    await page.setCookie(...cookies); // From page.cookies() earlier.
    return page.goto(request.url, { timeout: 60000 })
  }
}
This way, you don't need to do the parsing and the cookies will automatically be injected into the browser before each page load.
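Wired into the crawler from the question, that could look something like this (a sketch, reusing the cookies array extracted after the manual login):

const crawler = new Apify.PuppeteerCrawler({
  requestQueue,
  gotoFunction: async ({ request, page }) => {
    // Inject the session cookies into the browser before each navigation,
    // instead of sending them as a raw Cookie header.
    await page.setCookie(...cookies);
    return page.goto(request.url, { timeout: 60000 });
  },
  handlePageFunction: async ({ request, page }) => {
    console.log(`Title of ${request.url}: ${await page.title()}`);
  },
});

await crawler.run();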
As a note, modifying default request headers when using a headless browser is not a good practice, because it may lead to blocking on some sites that match received headers against a list of known browser fingerprints.
Update:
The section below is no longer relevant, because you can now use the Request class to override headers as expected.
The headers problem is a complex one involving request interception in Puppeteer. Here's the related GitHub issue in Apify SDK. Unfortunately, the method of overriding headers via a Request object currently does not work in PuppeteerCrawler, so that's why you were unsuccessful.
I have a server-side rendered Next.js/express app that communicates with a Django API (cross-origin). I login a user like so:
const response = await fetch('http://localhost:8000/sign-in', {
  method: 'POST',
  credentials: 'include',
  body: JSON.stringify({ email, password }),
  headers: { 'Content-Type': 'application/json' },
});

const result = await response.json();

if (response.status === 200) {
  Router.push('/account');
}
Django successfully logs in the user and returns set-cookie headers for the csrftoken and sessionid cookies, however, when I navigate to a different page (like in the above code when I Router.push), the cookies don't persist.
I assume this has something to do with server-side vs. client-side, but when cookies are set in the browser I expect them to persist regardless.
How can I get these cookies, once set, to persist across all pages on the client side?
It turns out that set-cookie is the old way of doing things; it's controlled by the browser, so it's opaque to the application code.
I ended up sending the csrftoken and sessionid back to the client in the JSON body, and saving them to localStorage using localStorage.setItem('sessionid', 'theSessionId') and localStorage.setItem('csrftoken', 'theCsrftoken').
Then when I need to make an authenticated request, I include them in the fetch headers:
const response = await fetch(`${API_HOST}/logout`, {
  method: 'POST',
  headers: {
    'X-CSRFToken': localStorage.getItem('csrftoken'),
    sessionid: localStorage.getItem('sessionid'),
  },
});
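The save side of that happens right after the sign-in response comes back (a sketch; the csrftoken and sessionid field names are assumptions that have to match whatever the Django view actually puts in the JSON body):

const result = await response.json();

// Persist the values from the JSON body so later requests
// can send them back as headers.
localStorage.setItem('csrftoken', result.csrftoken); // assumed field name
localStorage.setItem('sessionid', result.sessionid); // assumed field name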
I'm sending a POST that creates a new user, and that works.
My question is: how do I get back, for example, the pk of the created user in the ajax response?
$.ajax({
  url: 'http://localhost:8080/api/v1/create/user/',
  type: 'POST',
  contentType: 'application/json',
  data: '{"uuid": "12345"}',
  dataType: 'json',
  processData: false,
  success: function (r) {
    console.log(r)
  },
});
def obj_create(self, bundle, request=None, **kwargs):
    try:
        user = User.objects.create_user(bundle.data['uuid'], '1')
        user.save()
    except:
        pass
    return bundle
You can set always_return_data = True within your UserResource's Meta, and on POST and PUT requests it will return the created object back.
From the docs
always_return_data
Specifies all HTTP methods (except DELETE) should return a serialized form of the data. Default is False.
If False, HttpNoContent (204) is returned on POST/PUT with an empty body & a Location header of where to request the full resource.
If True, HttpAccepted (202) is returned on POST/PUT with a body containing all the data in a serialized form.
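With that flag set, the $.ajax call from the question can read the pk straight off the response body (a sketch; id is the default pk field in Tastypie's serialized output):

$.ajax({
  url: 'http://localhost:8080/api/v1/create/user/',
  type: 'POST',
  contentType: 'application/json',
  data: '{"uuid": "12345"}',
  dataType: 'json',
  success: function (r) {
    // With always_return_data = True the serialized user comes back,
    // so the new pk is available directly.
    console.log(r.id);
  },
});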
Each resource has a dehydrate method. You can use it to add any data to the response. Here are the docs: http://django-tastypie.readthedocs.org/en/latest/cookbook.html#adding-custom-values
You could either use the Location header (set by Tastypie by default) or you could try to make Tastypie send the newly created entity back. I believe the first one is simpler. You may also take a look at related SO question: Is it ok by REST to return content after POST?
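For what it's worth, if the client uses the Fetch API rather than $.ajax, the Location header can be read directly, without the XHR workaround shown in the next answer (a sketch; on cross-origin requests the server additionally has to expose the header via Access-Control-Expose-Headers):

const response = await fetch('http://localhost:8080/api/v1/create/user/', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ uuid: '12345' }),
});

// Tastypie's default Location header points at the new resource,
// e.g. /api/v1/user/42/, so the pk can be parsed out of it.
const location = response.headers.get('Location');
const pk = parseInt(location.match(/\/(\d+)\/?$/)[1], 10);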
First you need to slightly modify the jQuery XHR objects:
// Required for reading Location header of POST responses.
var _super = $.ajaxSettings.xhr;
$.ajaxSetup({
  xhr: function () {
    var xhr = _super();
    var getAllResponseHeaders = xhr.getAllResponseHeaders;
    xhr.getAllResponseHeaders = function () {
      var allHeaders = getAllResponseHeaders.call(xhr);
      if (allHeaders) {
        return allHeaders;
      }
      allHeaders = "";
      $(["Cache-Control", "Content-Language", "Content-Type", "Expires", "Last-Modified", "Pragma", "Location"]).each(function (i, header_name) {
        if (xhr.getResponseHeader(header_name)) {
          allHeaders += header_name + ": " + xhr.getResponseHeader(header_name) + "\n";
        }
      });
      return allHeaders;
    };
    return xhr;
  }
});
This is required because of a Firefox bug noted in the jQuery $.ajax docs: .getAllResponseHeaders() returns the empty string even though .getResponseHeader('Content-Type') returns a non-empty string, so automatically decoding JSON CORS responses in Firefox with jQuery is not supported. Overriding jQuery.ajaxSettings.xhr as above is the documented workaround.
Then you can read the header in the successCallback, like so:
successCallback: errorAwareCall(function (data, t, textStatus, XMLHttpRequest) {
  var loc = XMLHttpRequest.getAllResponseHeaders();
  var pk = parseInt(loc.match(/\/(\d+)(\/)*/)[1]);
  // Do something with the PK
})