Sending data with HttpClient POST to a Django backend

I have been trying to send data from my Angular front end to my Django backend, but the data doesn't seem to arrive and I can't figure out why. This is my code on the frontend:
const post_data = {
  email: form.value.email,
  password: form.value.password
};
const headers = new HttpHeaders({
  'Content-Type': 'application/json',
});
const params = new HttpParams();
const options = {
  headers,
  params,
};
this.httpClient.post('https://my_website/signin/check/', post_data, options).subscribe(
  result => { console.log(result); },
  error => { },
  () => { }
);
In my Django backend, I print request.POST to see whether the data arrived, and nothing shows up. This is the result:
<QueryDict: {}>
Any help would be nice. Thanks!

As also mentioned in the comments, request.POST contains data only when the request body is form-encoded. You can verify this by using Postman to send a request with form-encoded data.
However, in your case, reading request.body will give you the post_data that you passed in the HTTP request, since you are sending JSON.
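Alternatively, if you would rather keep reading request.POST on the Django side, you can send the body form-encoded from Angular instead of as JSON. A minimal sketch (not from the original answer), reusing the form values from the question:
// Serialize the credentials as application/x-www-form-urlencoded
// so that Django populates request.POST.
const body = new HttpParams({
  fromObject: {
    email: form.value.email,
    password: form.value.password,
  },
});

this.httpClient.post('https://my_website/signin/check/', body.toString(), {
  headers: new HttpHeaders({
    'Content-Type': 'application/x-www-form-urlencoded',
  }),
}).subscribe(result => console.log(result));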

Related

Request body is not being passed from axios get call to Django Rest Framework

I have an issue when sending a GET request with a body in axios: the body of the request is not passed to the backend.
The axios code looks like this:
const FunctionName = (environment, page_num) => {
  axios.get(API_URL, {
    params: {
      environment,
      page_num
    },
  }).then(res => {
    console.log(res);
  }).catch(err => {
    console.log(err.response.data);
  });
}
I'm using Django as my backend and I'm receiving an empty body, i.e. {}, which causes a bad request to be sent to the backend. I went through several Stack Overflow questions but none of them helped me. Can anyone please help me with this?
Update
My Django code looks like this:
class TestView(APIView):
    def get(self, request):
        environment = request.data['environment']
        page_num = request.data['page_num']
        ...
        ...
Here I'm unable to get the environment or page_num data. When I send the same request from Postman, as a GET call with the content in the body of the request, it is accepted and the response comes back.
Re-Update
I noticed that we have to use request.query_params['some_val'] when we pass the data from axios, but request.query_params['some_val'] will not work if we send a request with the data in the body from Postman. I'm not sure whether this is normal behavior or not!
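For what it's worth, that is expected: axios serializes the params option into the URL query string, which Django REST Framework exposes as request.query_params, while the body of a GET request is generally dropped by browsers. Schematically (the values here are placeholders):
// `params` ends up in the query string, so DRF sees the values in
// request.query_params, e.g. GET {API_URL}?environment=staging&page_num=2
axios.get(API_URL, { params: { environment: 'staging', page_num: 2 } });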
I'm not sure, but try this:
axios({
  method: "get",
  url: API_URL,
  // axios uses `data` (not `body`) for a request payload
  data: {
    environment,
    page_num
  }
}).then(res => console.log(res.data));
In Django, did you try getting the body with request.body?
Anyone facing this issue can find the answer at the link below.
Possible duplicate
How to access get request data in django rest framework

Apify: Preserve headers in RequestQueue

I'm trying to crawl our local Confluence installation with the PuppeteerCrawler. My strategy is to log in first, then extract the session cookies and use them in the headers of the start URL. The code is as follows:
First, I log in 'by foot' to extract the relevant credentials:
const Apify = require("apify");

const browser = await Apify.launchPuppeteer({ slowMo: 500 });
const page = await browser.newPage();
await page.goto('https://mycompany/confluence/login.action');
await page.focus('input#os_username');
await page.keyboard.type('myusername');
await page.focus('input#os_password');
await page.keyboard.type('mypasswd');
await page.keyboard.press('Enter');
await page.waitForNavigation();

// Get cookies and close the login session
const cookies = await page.cookies();
await browser.close();

const cookie_jsession = cookies.filter(cookie => {
    return cookie.name === "JSESSIONID";
})[0];
const cookie_crowdtoken = cookies.filter(cookie => {
    return cookie.name === "crowd.token_key";
})[0];
Then I'm building up the crawler structure with the prepared request header:
const startURL = {
    url: 'https://mycompany/confluence/index.action',
    method: 'GET',
    headers: {
        Accept: 'text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8',
        'Accept-Encoding': 'gzip, deflate, br',
        'Accept-Language': 'de-DE,de;q=0.9,en-US;q=0.8,en;q=0.7',
        Cookie: `${cookie_jsession.name}=${cookie_jsession.value}; ${cookie_crowdtoken.name}=${cookie_crowdtoken.value}`,
    }
};

const requestQueue = await Apify.openRequestQueue();
await requestQueue.addRequest(new Apify.Request(startURL));
const pseudoUrls = [new Apify.PseudoUrl('https://mycompany/confluence/[.*]')];

const crawler = new Apify.PuppeteerCrawler({
    launchPuppeteerOptions: { headless: false, slowMo: 500 },
    requestQueue,
    handlePageFunction: async ({ request, page }) => {
        const title = await page.title();
        console.log(`Title of ${request.url}: ${title}`);
        console.log(await page.content());
        await Apify.utils.enqueueLinks({
            page,
            selector: 'a:not(.like-button)',
            pseudoUrls,
            requestQueue
        });
    },
    maxRequestsPerCrawl: 3,
    maxConcurrency: 10,
});

await crawler.run();
await crawler.run();
The by-foot login and cookie extraction seem to be fine (the "curlified" request works perfectly), but Confluence doesn't accept the login via Puppeteer / headless Chromium. It seems like the headers are getting lost somehow.
What am I doing wrong?
Without first going into the details of why the headers don't work, I would suggest defining a custom gotoFunction in the PuppeteerCrawler options, such as:
{
    // ...
    gotoFunction: async ({ request, page }) => {
        await page.setCookie(...cookies); // From page.cookies() earlier.
        return page.goto(request.url, { timeout: 60000 });
    }
}
This way, you don't need to do the parsing and the cookies will automatically be injected into the browser before each page load.
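Wired into the crawler from the question, that might look like the following sketch (reusing the cookies array captured during the manual login):
const crawler = new Apify.PuppeteerCrawler({
    requestQueue,
    launchPuppeteerOptions: { headless: false, slowMo: 500 },
    gotoFunction: async ({ request, page }) => {
        // Inject the session cookies before every navigation.
        await page.setCookie(...cookies);
        return page.goto(request.url, { timeout: 60000 });
    },
    handlePageFunction: async ({ request, page }) => {
        console.log(`Title of ${request.url}: ${await page.title()}`);
    },
});
await crawler.run();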
As a note, modifying default request headers when using a headless browser is not a good practice, because it may lead to blocking on some sites that match received headers against a list of known browser fingerprints.
Update:
The below section is no longer relevant, because you can now use the Request class to override headers as expected.
The headers problem is a complex one involving request interception in Puppeteer. Here's the related GitHub issue in Apify SDK. Unfortunately, the method of overriding headers via a Request object currently does not work in PuppeteerCrawler, so that's why you were unsuccessful.

Next.js not persisting cookies

I have a server-side rendered Next.js/express app that communicates with a Django API (cross-origin). I login a user like so:
const response = await fetch('http://localhost:8000/sign-in', {
  method: 'POST',
  credentials: 'include',
  body: JSON.stringify({ email, password }),
  headers: { 'Content-Type': 'application/json' },
});
const result = await response.json();
if (response.status === 200) {
  Router.push('/account');
}
Django successfully logs in the user and returns Set-Cookie headers for the csrftoken and sessionid cookies. However, when I navigate to a different page (as in the code above when I Router.push), the cookies don't persist.
I assume this has something to do with server-side vs. client-side, but when cookies are set in the browser I expect them to persist regardless.
How can I get these cookies, once set, to persist across all pages on the client side?
It turns out that Set-Cookie is the old way of doing things: it's controlled by the browser, so it's obfuscated from the application.
I ended up sending the csrftoken and sessionid back to the client in the JSON body, and saving them to localStorage using localStorage.setItem('sessionid', 'theSessionId') and localStorage.setItem('csrftoken', 'theCsrftoken').
Then when I need to make an authenticated request, I include them in the fetch headers:
const response = await fetch(`${API_HOST}/logout`, {
  method: 'POST',
  headers: {
    'X-CSRFToken': localStorage.getItem('csrftoken'),
    sessionid: localStorage.getItem('sessionid'),
  },
});

Unexpected token P in JSON at position 0 when trying to return Excel spreadsheet

I have an Ember.js application where I make a POST AJAX call to a Django backend. A function in Django creates an xlsx file for a bunch of queried items based on IDs coming in the POST request. It goes through the Django view function without any issues, but when the HTTP response is returned to Ember, I get the error
SyntaxError: Unexpected token P in JSON at position 0
    at parse (<anonymous>)
    at ajaxConvert (jquery.js:8787)
    at done (jquery.js:9255)
    at XMLHttpRequest.<anonymous> (jquery.js:9548)
    at XMLHttpRequest.nrWrapper (base-content:20)
I'm setting the response content type to application/vnd.openxmlformats-officedocument.spreadsheetml.sheet, so I'm unsure as to why it's trying to read the response as JSON.
Python Code
file_path = '/User/path_to_spreadsheet/content.xlsx'
fsock = open(file_path, "rb")
response = HttpResponse(fsock, content_type='application/vnd.openxmlformats-officedocument.spreadsheetml.sheet')
response['Content-Disposition'] = 'attachment; filename="content.xlsx"'
return response
EmberJS Code
export default Controller.extend({
  actions: {
    storeProductId(products) {
      let product_ids = [];
      products.forEach(function(product) {
        product_ids.push(product.id);
      });
      let adapter = this.store.adapterFor('product-export');
      adapter.export_products(product_ids).then(function(response) {
        console.log(response);
      }).catch(function(response) {
        console.log('ERROR');
        console.log(response);
      });
    }
  }
});
Product-Export Adapter Code
export default ApplicationAdapter.extend(FormDataAdapterMixin, {
  export_products(products) {
    let url = this.buildURL('unified-product');
    url = `${url}export/`;
    return this.ajax(url, 'POST', { data: { 'products': products } });
  }
});
By default, Ember Data makes some assumptions around how things should be handled (including that you’ll be receiving JSON data back). Is there a reason you are using Ember Data instead of using a direct Ajax call to your backend? Seems like that would greatly simplify things here ...
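For example, a direct call with the Fetch API that reads the response as binary rather than JSON could look like the sketch below; the URL and filename are assumptions based on the adapter code above:
async function exportProducts(productIds) {
  const response = await fetch('/unified-products/export/', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ products: productIds }),
  });
  // Read the body as a Blob instead of parsing it as JSON.
  const blob = await response.blob();
  // Trigger a client-side download of the spreadsheet.
  const link = document.createElement('a');
  link.href = URL.createObjectURL(blob);
  link.download = 'content.xlsx';
  link.click();
  URL.revokeObjectURL(link.href);
}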

Fetch API for Django POST requests

I'm trying to remove jQuery from a React/Redux/Django webapp and replace the $.ajax method with the Fetch API. I've more or less got all my GET requests working fine and I seem to be able to hit my POST requests, but I cannot seem to format my request in such a way as to actually get my POST data into the Django request.POST object. Every time I hit my /sign_in view, the request.POST object is empty. My entire app's backend is built around using Django forms (no Django templates, just React controlled components) and I would really like to not have to rewrite all my views to use request.body or request.data.
Here is all the code I can think that would be relevant, please let me know if there's more that would be helpful:
This is the curried function I use to build my full POST data and attach the CSRF token:
// Assumed imports (not shown in the question): js-cookie for Cookies,
// lodash's merge for combining the options
import Cookies from 'js-cookie'
import merge from 'lodash/merge'

const setUpCsrfToken = () => {
  const csrftoken = Cookies.get('csrftoken')
  return function post (url, options) {
    const defaults = {
      'method': 'POST',
      'credentials': 'include',
      'headers': {
        'X-CSRFToken': csrftoken,
        'Content-Type': 'application/x-www-form-urlencoded'
      }
    }
    const merged = merge(options, defaults)
    return fetch(url, merged)
  }
}
export const post = setUpCsrfToken()
This is the API method I use from my React app:
export const signIn = data => {
  return post('/api/account/sign_in/', data)
}
The data when it is originally packaged up in the React app itself is as simple as an object with string values:
{
  email: 'email#email.com',
  password: 'password'
}
I've looked at these questions and found them nominally helpful, but I can't figure out how to synthesize an answer for myself that takes into account what I assume are some of the intricacies of Django:
POST Request with Fetch API?
Change a jquery ajax POST request into a fetch api POST
Convert JavaScript object into URI-encoded string
Is there a better way to convert a JSON packet into a query string?
Thanks!
You have to set the appropriate X-Requested-With header. jQuery does this under the hood.
X-Requested-With: XMLHttpRequest
So, in your example, you would want something like:
const setUpCsrfToken = () => {
  const csrftoken = Cookies.get('csrftoken')
  return function post (url, options) {
    const defaults = {
      'method': 'POST',
      'credentials': 'include',
      'headers': new Headers({
        'X-CSRFToken': csrftoken,
        'Content-Type': 'application/x-www-form-urlencoded; charset=UTF-8',
        'X-Requested-With': 'XMLHttpRequest'
      })
    }
    const merged = merge(options, defaults)
    return fetch(url, merged)
  }
}
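One more thing worth checking, since the question doesn't show how data is serialized: for request.POST to be populated, the body itself has to be URL-encoded to match the declared Content-Type. A sketch using URLSearchParams with the post helper above (the credentials are placeholders):
// URLSearchParams serializes the object to "email=...&password=..."
const body = new URLSearchParams({
  email: 'email@example.com',  // placeholder credentials
  password: 'password',
})

post('/api/account/sign_in/', { body: body.toString() })
  .then(response => response.json())
  .then(result => console.log(result))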