Django REST framework - Authentication error with PUT requests

I have a very simple Resource like this for my model 'Presentacion':
class PresentacionResource(ModelResource):
    model = Presentacion
    fields = (some fields)
    ignore_fields = (few to ignore)
and I need to implement authentication for this, so, following what I read, I created two wrappers:
class AuthListOrCreateModelView(ListOrCreateModelView):
    permissions = (IsAuthenticated, )

class AuthInstanceModelView(InstanceModelView):
    permissions = (IsAuthenticated, )
And then in my urls.py:
url(r'^presentaciones/$', AuthListOrCreateModelView.as_view(resource=PresentacionResource), name='presentacion-root'),
url(r'^presentaciones/(?P<id>[0-9]+)$', AuthInstanceModelView.as_view(resource=PresentacionResource), name='presentacion'),
This is working fine for GET 'presentaciones/' requests, but when I try to make a PUT request I get a 403 FORBIDDEN.
What's strange to me is that GET works as expected: as long as I'm logged in it responds correctly, but if I log out it responds with 403 FORBIDDEN.

If the issue is the X-CSRFToken header, you can modify Backbone.sync like this to send the token with each POST, PUT, and DELETE request.
/* alias away the sync method */
Backbone._sync = Backbone.sync;

/* define a new sync method */
Backbone.sync = function(method, model, options) {
    /* only need a token for non-GET requests */
    if (method == 'create' || method == 'update' || method == 'delete') {
        // CSRF token value is in an embedded meta tag
        var csrfToken = $("meta[name='csrf_token']").attr('content');
        options.beforeSend = function(xhr) {
            xhr.setRequestHeader('X-CSRFToken', csrfToken);
        };
    }
    /* proxy the call to the old sync method */
    return Backbone._sync(method, model, options);
};

If you are using Django's session based authentication, then you may be tripping over the CSRF protection built into Django (see UserLoggedInAuthentication class[1]).
If this is the case, you will need to ensure that a CSRF cookie gets sent to the client and then you can adapt the jQuery instructions[2] to send the X-CSRFToken header with requests that may change data.
[1] http://django-rest-framework.org/_modules/authentication.html
[2] https://docs.djangoproject.com/en/dev/ref/contrib/csrf/#ajax
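To ensure the CSRF cookie actually gets set, even on pages that never render a form, Django's ensure_csrf_cookie decorator can be applied to any view the client loads first. A minimal sketch (the view name and URL are placeholders, not from the question):
# Hypothetical bootstrap view; ensure_csrf_cookie is Django's public
# decorator for forcing the 'csrftoken' cookie onto the response.
from django.http import HttpResponse
from django.views.decorators.csrf import ensure_csrf_cookie

@ensure_csrf_cookie
def bootstrap(request):
    # Any response works; the decorator guarantees the CSRF cookie is sent.
    return HttpResponse("ok")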

I realize this is an older post, but I was dealing with this problem recently. Expanding on @orangewarp's answer and using the Django documentation (https://docs.djangoproject.com/en/dev/ref/contrib/csrf/#ajax), here's a solution:
This solution uses the csrftoken cookie. Another option would be to create a CSRF token endpoint in your API and grab the token from there; a sketch of such an endpoint follows the code below.
Backbone._sync = Backbone.sync;

Backbone.sync = function(method, model, options) {
    // from django docs
    function getCookie(name) {
        var cookieValue = null;
        if (document.cookie && document.cookie != '') {
            var cookies = document.cookie.split(';');
            for (var i = 0; i < cookies.length; i++) {
                var cookie = jQuery.trim(cookies[i]);
                // Does this cookie string begin with the name we want?
                if (cookie.substring(0, name.length + 1) == (name + '=')) {
                    cookieValue = decodeURIComponent(cookie.substring(name.length + 1));
                    break;
                }
            }
        }
        return cookieValue;
    }
    /* only need a token for non-GET requests */
    if (method == 'create' || method == 'update' || method == 'delete') {
        var csrfToken = getCookie('csrftoken');
        options.beforeSend = function(xhr) {
            xhr.setRequestHeader('X-CSRFToken', csrfToken);
        };
    }
    return Backbone._sync(method, model, options);
};
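For the token-endpoint alternative mentioned above, here is a minimal sketch assuming a Django backend; the view name and URL are hypothetical:
# Hypothetical token endpoint; get_token is Django's documented API for
# retrieving the CSRF token for the current request.
import json
from django.http import HttpResponse
from django.middleware.csrf import get_token

def csrf_token_view(request):
    return HttpResponse(json.dumps({'csrfToken': get_token(request)}),
                        content_type='application/json')
The client would then fetch this endpoint once and use the returned value for the X-CSRFToken header instead of reading the cookie.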

Related

Trying to get a cookie value which I set on ARCGIS online but not getting any value back?

I am trying to set a cookie in ESRI ArcGIS Online using the ESRI runtime SDK for .NET v100:
var cookie = new CookieHeaderValue("customCookie", cred.Token);
var response = Request.CreateResponse(HttpStatusCode.OK, new {
    token = cred.Token,
    expires = cred.ExpirationDate
});
response.Content.Headers.ContentType = new MediaTypeHeaderValue("application/json");
response.Headers.AddCookies(new CookieHeaderValue[] { cookie });
return response;
Now when I try to retrieve that cookie later on in subsequent requests using the code below, I get null.
CookieHeaderValue cookie = context.Request.Headers.GetCookies("customCookie").FirstOrDefault();
I am wondering if there is another way to get the cookie which I set back?
Are you using v100?
If yes, you can try the following code:
ArcGISHttpClientHandler.HttpRequestBegin += (sender, request) =>
{
    var cookieContainer = ((System.Net.Http.HttpClientHandler)sender).CookieContainer;
    var cookies = cookieContainer.GetCookies(request.RequestUri);
    var customCookie = new Cookie("customCookie", "someValue") { Domain = request.RequestUri.Host };
    bool foundCookie = false;
    foreach (Cookie cookie in cookies)
    {
        if (cookie.Name == customCookie.Name)
        {
            foundCookie = true;
            break;
        }
    }
    if (!foundCookie)
        cookieContainer.Add(customCookie);
};
ArcGISHttpClientHandler has an event HttpRequestBegin which is invoked on every request. You can use CookieContainer.GetCookies and Add to retrieve/add cookies.

How to handle expired access token in asp.net core using refresh token with OpenId Connect

I have configured an ASOS OpenIdConnect server and an ASP.NET Core MVC app that uses "Microsoft.AspNetCore.Authentication.OpenIdConnect": "1.0.0" and "Microsoft.AspNetCore.Authentication.Cookies": "1.0.0". I have tested the "Authorization Code" workflow and everything works.
The client web app processes the authentication as expected and creates a cookie storing the id_token, access_token, and refresh_token.
How do I force Microsoft.AspNetCore.Authentication.OpenIdConnect to request a new access_token when it expires?
The ASP.NET Core MVC app currently ignores the expired access_token.
I would like OpenIdConnect to see the expired access_token and then make a call using the refresh token to get a new access_token. It should also update the cookie values. If the refresh-token request fails, I would expect OpenIdConnect to "sign out" the cookie (remove it or something).
app.UseCookieAuthentication(new CookieAuthenticationOptions
{
    AutomaticAuthenticate = true,
    AutomaticChallenge = true,
    AuthenticationScheme = "Cookies"
});

app.UseOpenIdConnectAuthentication(new OpenIdConnectOptions
{
    ClientId = "myClient",
    ClientSecret = "secret_secret_secret",
    PostLogoutRedirectUri = "http://localhost:27933/",
    RequireHttpsMetadata = false,
    GetClaimsFromUserInfoEndpoint = true,
    SaveTokens = true,
    ResponseType = OpenIdConnectResponseType.Code,
    AuthenticationMethod = OpenIdConnectRedirectBehavior.RedirectGet,
    Authority = "http://localhost:27933",
    MetadataAddress = "http://localhost:27933/connect/config",
    Scope = { "email", "roles", "offline_access" },
});
It seems the OpenIdConnect authentication middleware for ASP.NET Core has no built-in logic to manage the access_token on the server after it is received.
I found that I can intercept the cookie validation event and check whether the access token has expired. If so, I make a manual HTTP call to the token endpoint with grant_type=refresh_token.
Calling context.ShouldRenew = true; causes the cookie to be updated and sent back to the client in the response.
I have provided the basis of what I have done and will update this answer once everything is resolved.
app.UseCookieAuthentication(new CookieAuthenticationOptions
{
    AutomaticAuthenticate = true,
    AutomaticChallenge = true,
    AuthenticationScheme = "Cookies",
    ExpireTimeSpan = new TimeSpan(0, 0, 20),
    SlidingExpiration = false,
    CookieName = "WebAuth",
    Events = new CookieAuthenticationEvents()
    {
        OnValidatePrincipal = context =>
        {
            if (context.Properties.Items.ContainsKey(".Token.expires_at"))
            {
                var expire = DateTime.Parse(context.Properties.Items[".Token.expires_at"]);
                if (expire < DateTime.Now) // TODO: change to check whether it expires in the next 5 minutes.
                {
                    logger.Warn($"Access token has expired, user: {context.HttpContext.User.Identity.Name}");
                    // TODO: send refresh token to ASOS. Update tokens in context.Properties.Items
                    // context.Properties.Items[".Token.access_token"] = newToken;
                    context.ShouldRenew = true;
                }
            }
            return Task.FromResult(0);
        }
    }
});
You must enable the generation of refresh tokens in startup.cs:
AuthorizationEndpointPath = "/connect/authorize"; // needed for the refresh token
TokenEndpointPath = "/connect/token"; // standard token endpoint name
In your token provider, before validating the token request at the end of the HandleTokenRequest method, make sure you have set the offline scope:
// Call SetScopes with the list of scopes you want to grant
// (specify offline_access to issue a refresh token).
ticket.SetScopes(
    OpenIdConnectConstants.Scopes.Profile,
    OpenIdConnectConstants.Scopes.OfflineAccess);
If that is set up properly, you should receive a refresh_token back when you log in with a password grant_type.
Then from your client you must issue the following request (I'm using Aurelia):
refreshToken() {
    let baseUrl = yourbaseUrl;
    let data = "client_id=" + this.appState.clientId
        + "&grant_type=refresh_token"
        + "&refresh_token=myRefreshToken";
    return this.http.fetch(baseUrl + 'connect/token', {
        method: 'post',
        body: data,
        headers: {
            'Content-Type': 'application/x-www-form-urlencoded',
            'Accept': 'application/json'
        }
    });
}
and that's it. Make sure your auth provider's HandleTokenRequest is not trying to manipulate requests of type refresh_token:
public override async Task HandleTokenRequest(HandleTokenRequestContext context)
{
    if (context.Request.IsPasswordGrantType())
    {
        // Password-type request processing only;
        // code here must not touch refresh_token requests.
    }
    else if (!context.Request.IsRefreshTokenGrantType())
    {
        context.Reject(
            error: OpenIdConnectConstants.Errors.InvalidGrant,
            description: "Invalid grant type.");
        return;
    }
    return;
}
The refresh_token request simply passes through this method and is handled by another piece of the middleware dedicated to refresh tokens.
If you want more in depth knowledge about what the auth server is doing, you can have a look at the code of the OpenIdConnectServerHandler:
https://github.com/aspnet-contrib/AspNet.Security.OpenIdConnect.Server/blob/master/src/AspNet.Security.OpenIdConnect.Server/OpenIdConnectServerHandler.Exchange.cs
On the client side you must also be able to handle the automatic refresh of the token. Here is an example of an HTTP interceptor for Angular 1.x that handles 401 responses, refreshes the token, then retries the request:
'use strict';
app.factory('authInterceptorService',
    ['$q', '$injector', '$location', 'localStorageService',
    function ($q, $injector, $location, localStorageService) {
        var authInterceptorServiceFactory = {};
        var $http;

        var _request = function (config) {
            config.headers = config.headers || {};
            var authData = localStorageService.get('authorizationData');
            if (authData) {
                config.headers.Authorization = 'Bearer ' + authData.token;
            }
            return config;
        };

        var _responseError = function (rejection) {
            var deferred = $q.defer();
            if (rejection.status === 401) {
                var authService = $injector.get('authService');
                console.log("calling authService.refreshToken()");
                authService.refreshToken().then(function (response) {
                    console.log("token refreshed, retrying to connect");
                    _retryHttpRequest(rejection.config, deferred);
                }, function () {
                    console.log("that didn't work, logging out.");
                    authService.logOut();
                    $location.path('/login');
                    deferred.reject(rejection);
                });
            } else {
                deferred.reject(rejection);
            }
            return deferred.promise;
        };

        var _retryHttpRequest = function (config, deferred) {
            console.log('autorefresh');
            $http = $http || $injector.get('$http');
            $http(config).then(function (response) {
                deferred.resolve(response);
            },
            function (response) {
                deferred.reject(response);
            });
        }

        authInterceptorServiceFactory.request = _request;
        authInterceptorServiceFactory.responseError = _responseError;
        authInterceptorServiceFactory.retryHttpRequest = _retryHttpRequest;

        return authInterceptorServiceFactory;
    }]);
And here is an example I just did for Aurelia. This time I wrapped my HTTP client in an HTTP handler that checks whether the token is expired. If it is expired, it will first refresh the token, then perform the request. It uses a promise to keep the interface with the client-side data services consistent. This handler exposes the same interface as the aurelia-fetch client.
import {inject} from 'aurelia-framework';
import {HttpClient} from 'aurelia-fetch-client';
import {AuthService} from './authService';

@inject(HttpClient, AuthService)
export class HttpHandler {
    constructor(httpClient, authService) {
        this.http = httpClient;
        this.authService = authService;
    }

    fetch(url, options) {
        let _this = this;
        if (this.authService.tokenExpired()) {
            console.log("token expired");
            return new Promise(
                function(resolve, reject) {
                    console.log("refreshing");
                    _this.authService.refreshToken()
                        .then(
                            function (response) {
                                console.log("token refreshed");
                                _this.http.fetch(url, options).then(
                                    function (success) {
                                        console.log("call success", url);
                                        resolve(success);
                                    },
                                    function (error) {
                                        console.log("call failed", url);
                                        reject(error);
                                    });
                            }, function (error) {
                                console.log("token refresh failed");
                                reject(error);
                            });
                }
            );
        }
        else {
            // token is not expired, we return the promise from the fetch client
            return this.http.fetch(url, options);
        }
    }
}
For jQuery, you can look at jquery-oauth:
https://github.com/esbenp/jquery-oauth
Hope this helps.
Following on from @longday's answer, I have had success using this code to force a client refresh without having to manually query an OpenID endpoint:
OnValidatePrincipal = context =>
{
    if (context.Properties.Items.ContainsKey(".Token.expires_at"))
    {
        var expire = DateTime.Parse(context.Properties.Items[".Token.expires_at"]);
        if (expire < DateTime.Now) // TODO: change to check whether it expires in the next 5 minutes.
        {
            context.ShouldRenew = true;
            context.RejectPrincipal();
        }
    }
    return Task.FromResult(0);
}

Using AWS API Gateway, can I access the cookies?

Using an HTTP Proxy Integration, I want to access the cookies and add one to the JSON response. Is that possible?
To access cookies sent by the client in your backend, you'll have to set up a mapping from the method request header to your integration request header.
These instructions assume you've already set up a simple method in API Gateway.
Access cookies in your backend
Under Method Request, create an HTTP Request Header with the name of "Cookie"
Under Integration Request, create an HTTP header with name "Cookie" and "Mapped from" value of method.request.header.Cookie.
You'll also likely need to set up CORS for this method. See: http://docs.aws.amazon.com/apigateway/latest/developerguide/how-to-cors.html
Deploy your API and make a request to your API Gateway endpoint with your browser/client. You should see requests coming in to your HTTP backend with the Cookie header value sent from the browser.
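As a quick sanity check, once the mapping is deployed the cookie arrives at the backend as a plain request header. A minimal sketch of reading it, using Flask purely as an illustration (the route and app are not part of the original setup):
# Hypothetical Flask handler behind the API Gateway HTTP proxy integration.
from flask import Flask, request

app = Flask(__name__)

@app.route('/hello')
def hello():
    # Value mapped from method.request.header.Cookie by API Gateway
    raw_cookie = request.headers.get('Cookie', '')
    return 'Cookie header received: ' + raw_cookie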
Add cookie to response
You can set up a Set-Cookie response header in an analogous fashion on the integration response/method response side of the method configuration.
Under Method Response, create a Response header with name Set-Cookie
Under Integration Response setup a Header Mapping with Response header Set-Cookie and Mapping value integration.response.header.Set-Cookie
Please note that at this time, API Gateway supports setting just a single Set-Cookie response header. If your backend attempts to set multiple Set-Cookie headers, only the last one will be set. See this forum post for more details: https://forums.aws.amazon.com/thread.jspa?messageID=701434
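Complementing the request side above, here is a minimal sketch of a backend actually emitting the Set-Cookie header that the integration response then maps through (again, Flask is only an illustration, not from the original answer):
# Hypothetical Flask handler that sets the cookie mapped via
# integration.response.header.Set-Cookie.
from flask import Flask, make_response

app = Flask(__name__)

@app.route('/hello')
def hello():
    resp = make_response('ok')
    # Only one Set-Cookie header will survive the mapping (see the note above).
    resp.headers['Set-Cookie'] = 'flavor=chocolate; Path=/'
    return resp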
If you check the "Use Lambda Proxy integration" option in your API Gateway method, the request headers will be passed to your Lambda function via the event variable. API Gateway will also expect a different response format from your callback function, which can be used to dictate a Set-Cookie header, e.g.:
callback(null, {
    statusCode: 200,
    headers: {'Set-Cookie': 'key=val'},
    body: 'Some response'
})
This approach has the advantage of not requiring any Method Request or Method Response tweaks.
Here's a sample Lambda function using this logic to rotate a cookie value after each request.
exports.handler = (event, context, callback) => {
    var cookies = getCookiesFromHeader(event.headers);
    var old_cookie = cookies.flavor;
    var new_cookie = pickCookieFlavor(old_cookie);
    return callback(null, {
        statusCode: 200,
        headers: {
            'Set-Cookie': setCookieString('flavor', new_cookie),
            'Content-Type': 'text/plain'
        },
        body: 'Your cookie flavor was ' + old_cookie + '. Your new flavor is ' + new_cookie + '.'
    });
};

/**
 * Rotate the cookie flavor
 */
function pickCookieFlavor(cookie) {
    switch (cookie) {
        case 'peanut':
            return 'chocolate';
        case 'chocolate':
            return 'raisin and oat';
        default:
            return 'peanut';
    }
}

/**
 * Receives the headers object and extracts the value of the cookie header.
 * @param {Object} headers Headers from the Lambda proxy event
 * @return {Object}
 */
function getCookiesFromHeader(headers) {
    if (headers === null || headers === undefined || headers.Cookie === undefined) {
        return {};
    }
    // Split a cookie string into an object (originally found at http://stackoverflow.com/a/3409200/1427439)
    var list = {},
        rc = headers.Cookie;
    rc && rc.split(';').forEach(function (cookie) {
        var parts = cookie.split('=');
        var key = parts.shift().trim();
        var value = decodeURI(parts.join('='));
        if (key != '') {
            list[key] = value;
        }
    });
    return list;
};
/**
 * Build a string appropriate for a `Set-Cookie` header.
 * @param {string} key Key name for the cookie.
 * @param {string} value Value to assign to the cookie.
 * @param {object} options Optional parameter that can be used to define additional options for the cookie.
 * ```
 * {
 *   secure: boolean // Whether to output the Secure flag. Defaults to true.
 *   httpOnly: boolean // Whether to output the HttpOnly flag. Defaults to true.
 *   domain: string // Domain to which to limit the cookie. Defaults to not being output.
 *   path: string // Path to which to limit the cookie. Defaults to '/'.
 *   expires: UTC string or Date // When this cookie should expire. Defaults to not being output.
 *   maxAge: integer // Max age of the cookie in seconds. For compatibility with IE, this will be converted to an
 *     `expires` flag. If both the expires and maxAge flags are set, maxAge will be ignored. Defaults to not being
 *     output.
 * }
 * ```
 * @return string
 */
function setCookieString(key, value, options) {
    var defaults = {
        secure: true,
        httpOnly: true,
        domain: false,
        path: '/',
        expires: false,
        maxAge: false
    };
    if (typeof options == 'object') {
        options = Object.assign({}, defaults, options);
    } else {
        options = defaults;
    }
    var cookie = key + '=' + value;
    if (options.domain) {
        cookie = cookie + '; domain=' + options.domain;
    }
    if (options.path) {
        cookie = cookie + '; path=' + options.path;
    }
    if (!options.expires && options.maxAge) {
        options.expires = new Date(new Date().getTime() + parseInt(options.maxAge) * 1000); // JS operates in milliseconds
    }
    if (typeof options.expires == "object" && typeof options.expires.toUTCString === "function") {
        options.expires = options.expires.toUTCString();
    }
    if (options.expires) {
        cookie = cookie + '; expires=' + options.expires.toString();
    }
    if (options.secure) {
        cookie = cookie + '; Secure';
    }
    if (options.httpOnly) {
        cookie = cookie + '; HttpOnly';
    }
    return cookie;
}

How to set content-length-range for s3 browser upload via boto

The Issue
I'm trying to upload images directly to S3 from the browser and am getting stuck applying the content-length-range permission via boto's S3Connection.generate_url method.
There's plenty of information about signing POST forms, setting policies in general and even a heroku method for doing a similar submission. What I can't figure out for the life of me is how to add the "content-length-range" to the signed url.
With boto's generate_url method (example below), I can specify policy headers and have got it working for normal uploads. What I can't seem to add is a policy restriction on max file size.
Server Signing Code
## django request handler
from boto.s3.connection import S3Connection
from django.conf import settings
from django.http import HttpResponse
import mimetypes
import json

conn = S3Connection(settings.S3_ACCESS_KEY, settings.S3_SECRET_KEY)
object_name = request.GET['objectName']
content_type = mimetypes.guess_type(object_name)[0]
signed_url = conn.generate_url(
    expires_in=300,
    method="PUT",
    bucket=settings.BUCKET_NAME,
    key=object_name,
    headers={'Content-Type': content_type, 'x-amz-acl': 'public-read'})
return HttpResponse(json.dumps({'signedUrl': signed_url}))
On the client, I'm using the ReactS3Uploader, which is based on tadruj's s3upload.js script. It shouldn't affect anything, as it seems to just pass along whatever the signed URL covers, but it's copied below for simplicity.
ReactS3Uploader JS Code (simplified)
uploadFile: function() {
    new S3Upload({
        fileElement: this.getDOMNode(),
        signingUrl: '/api/get_signing_url/',
        onProgress: this.props.onProgress,
        onFinishS3Put: this.props.onFinish,
        onError: this.props.onError
    });
},
render: function() {
    return this.transferPropsTo(
        React.DOM.input({type: 'file', onChange: this.uploadFile})
    );
}
S3upload.js
S3Upload.prototype.signingUrl = '/sign-s3';
S3Upload.prototype.fileElement = null;

S3Upload.prototype.onFinishS3Put = function(signResult) {
    return console.log('base.onFinishS3Put()', signResult.publicUrl);
};

S3Upload.prototype.onProgress = function(percent, status) {
    return console.log('base.onProgress()', percent, status);
};

S3Upload.prototype.onError = function(status) {
    return console.log('base.onError()', status);
};

function S3Upload(options) {
    if (options == null) {
        options = {};
    }
    for (option in options) {
        if (options.hasOwnProperty(option)) {
            this[option] = options[option];
        }
    }
    this.handleFileSelect(this.fileElement);
}

S3Upload.prototype.handleFileSelect = function(fileElement) {
    this.onProgress(0, 'Upload started.');
    var files = fileElement.files;
    var result = [];
    for (var i = 0; i < files.length; i++) {
        var f = files[i];
        result.push(this.uploadFile(f));
    }
    return result;
};

S3Upload.prototype.createCORSRequest = function(method, url) {
    var xhr = new XMLHttpRequest();
    if (xhr.withCredentials != null) {
        xhr.open(method, url, true);
    }
    else if (typeof XDomainRequest !== "undefined") {
        xhr = new XDomainRequest();
        xhr.open(method, url);
    }
    else {
        xhr = null;
    }
    return xhr;
};

S3Upload.prototype.executeOnSignedUrl = function(file, callback) {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', this.signingUrl + '&objectName=' + file.name, true);
    xhr.overrideMimeType && xhr.overrideMimeType('text/plain; charset=x-user-defined');
    xhr.onreadystatechange = function() {
        if (xhr.readyState === 4 && xhr.status === 200) {
            var result;
            try {
                result = JSON.parse(xhr.responseText);
            } catch (error) {
                this.onError('Invalid signing server response JSON: ' + xhr.responseText);
                return false;
            }
            return callback(result);
        } else if (xhr.readyState === 4 && xhr.status !== 200) {
            return this.onError('Could not contact request signing server. Status = ' + xhr.status);
        }
    }.bind(this);
    return xhr.send();
};

S3Upload.prototype.uploadToS3 = function(file, signResult) {
    var xhr = this.createCORSRequest('PUT', signResult.signedUrl);
    if (!xhr) {
        this.onError('CORS not supported');
    } else {
        xhr.onload = function() {
            if (xhr.status === 200) {
                this.onProgress(100, 'Upload completed.');
                return this.onFinishS3Put(signResult);
            } else {
                return this.onError('Upload error: ' + xhr.status);
            }
        }.bind(this);
        xhr.onerror = function() {
            return this.onError('XHR error.');
        }.bind(this);
        xhr.upload.onprogress = function(e) {
            var percentLoaded;
            if (e.lengthComputable) {
                percentLoaded = Math.round((e.loaded / e.total) * 100);
                return this.onProgress(percentLoaded, percentLoaded === 100 ? 'Finalizing.' : 'Uploading.');
            }
        }.bind(this);
    }
    xhr.setRequestHeader('Content-Type', file.type);
    xhr.setRequestHeader('x-amz-acl', 'public-read');
    return xhr.send(file);
};

S3Upload.prototype.uploadFile = function(file) {
    return this.executeOnSignedUrl(file, function(signResult) {
        return this.uploadToS3(file, signResult);
    }.bind(this));
};

module.exports = S3Upload;
Any help would be greatly appreciated here as I've been banging my head against the wall for quite a few hours now.
You can't add it to a signed PUT URL. This only works with the signed policy that goes along with a POST because the two mechanisms are very different.
Signing a URL is a lossy (for lack of a better term) process. You generate the string to sign, then sign it. You send the signature with the request, but you discard and do not send the string to sign. S3 then reconstructs what the string to sign should have been, for the request it receives, and generates the signature you should have sent with that request. There's only one correct answer, and S3 doesn't know what string you actually signed. The signature matches, or doesn't, either because you built the string to sign incorrectly, or your credentials don't match, and it doesn't know which of these possibilities is the case. It only knows, based on the request you sent, the string you should have signed and what the signature should have been.
With that in mind, for content-length-range to work with a signed URL, the client would need to actually send such a header with the request... which doesn't make a lot of sense.
Conversely, with POST uploads, more information is communicated to S3. S3 is not only judging whether your signature is valid; it also has your policy document... so it's possible to include directives -- policies -- with the request. They are protected from alteration by the signature, but they aren't encrypted or hashed -- the entire policy is readable by S3 (so, by contrast, we'll call this the opposite: "lossless").
This difference is why you can't do what you are trying to do with PUT while you can with POST.
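For completeness, boto 2 can build the signed POST policy and form fields for you; its max_content_length argument becomes a content-length-range condition in the policy. A minimal sketch under that assumption, reusing the names from the question's signing code (the 10 MB limit is illustrative):
# Sketch: signed POST form instead of a signed PUT URL (boto 2).
# max_content_length is translated into a content-length-range condition.
from boto.s3.connection import S3Connection

conn = S3Connection(settings.S3_ACCESS_KEY, settings.S3_SECRET_KEY)
form_args = conn.build_post_form_args(
    bucket_name=settings.BUCKET_NAME,
    key=object_name,
    expires_in=300,
    acl='public-read',
    max_content_length=10485760,  # reject uploads larger than 10 MB
    http_method='https')
# form_args['action'] is the POST URL; form_args['fields'] lists the
# form inputs (policy, signature, ...) the browser must submit with the file.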

Django with jquery-tinymce image upload plugin

I have installed jbimages from http://justboil.me/ into the jquery-tinymce folder of my Django project in order to upload local images from my computer.
When I upload an image, it throws the error "This is taking longer than usual. An error may have occurred."
The script output shows the error "CSRF verification failed. Request aborted."
But I already put {% csrf_token %} in the form in dialog.htm.
I get the error after selecting the image.
Can anyone help me figure out how to get rid of this issue?
It seems like the form is being posted using AJAX. If you are using AJAX to post the form, make sure you include the csrf_token in the POST data, which in this case you are missing.
Alternatively, add the following script to your base.html and it will take care of sending the csrf_token with each AJAX request.
CSRF_AJAX_PATCH
$(document).ajaxSend(function(event, xhr, settings) {
    function getCookie(name) {
        var cookieValue = null;
        if (document.cookie && document.cookie != '') {
            var cookies = document.cookie.split(';');
            for (var i = 0; i < cookies.length; i++) {
                var cookie = jQuery.trim(cookies[i]);
                // Does this cookie string begin with the name we want?
                if (cookie.substring(0, name.length + 1) == (name + '=')) {
                    cookieValue = decodeURIComponent(cookie.substring(name.length + 1));
                    break;
                }
            }
        }
        return cookieValue;
    }
    function sameOrigin(url) {
        // url could be relative or scheme relative or absolute
        var host = document.location.host; // host + port
        var protocol = document.location.protocol;
        var sr_origin = '//' + host;
        var origin = protocol + sr_origin;
        // Allow absolute or scheme relative URLs to same origin
        return (url == origin || url.slice(0, origin.length + 1) == origin + '/') ||
            (url == sr_origin || url.slice(0, sr_origin.length + 1) == sr_origin + '/') ||
            // or any other URL that isn't scheme relative or absolute, i.e. relative.
            !(/^(\/\/|http:|https:).*/.test(url));
    }
    function safeMethod(method) {
        return (/^(GET|HEAD|OPTIONS|TRACE)$/.test(method));
    }
    if (!safeMethod(settings.type) && sameOrigin(settings.url)) {
        xhr.setRequestHeader("X-CSRFToken", getCookie('csrftoken'));
    }
});
If you are trying to upload an image through an AJAX request, then you should look at how CSRF validation works with AJAX.
Otherwise, this related question may help you.