JSONAPI strong params with Rails and Ember

I'm using Ember with ember-data and a Rails API. I had a createRecord() and save() for the record that were working fine. The payload in the network tab for the POST request to create the record in Rails looks like: {data: {attributes: {foo: 'bar'}}}.
In the Rails controller, I have strong params like so: params.require(:data).require(:attributes).permit(:foo), which was working fine for a little while. Now when I send the request, Rails says param is missing or the value is empty: data. If I look in the network tab in the browser, the payload for the request still looks the same as stated above. If I puts params, it only shows {"controller": "api/v1/answers", "action": "create"} and isn't showing the data payload at all.
Is there any reason why Rails isn't picking up the right params from Ember now? I did try to add an association to the model that I'm trying to create, which is when it started failing. However, I rolled back to when it was working, and it's still not working.

I fixed this by going into the config/initializers/mime_types.rb file in the Rails API and changing the file to look like:
# config/initializers/mime_types.rb
# Re-register :json so requests with the JSON:API content type
# (application/vnd.api+json) are parsed as JSON params by Rails.
api_mime_type = %W(
  application/vnd.api+json
  text/x-json
  application/json
)
Mime::Type.unregister :json
Mime::Type.register 'application/json', :json, api_mime_type
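For context, the Ember side that produces a {data: {attributes: ...}} payload is just the stock ember-data JSON:API adapter, which sends Content-Type: application/vnd.api+json; Rails won't parse that body into params until the mime type is registered as :json, which is why the change above makes the data show up again. A minimal sketch of such an adapter (the api/v1 namespace is only a guess from the controller path in the params output above):

// app/adapters/application.js -- a minimal sketch, nothing custom
import DS from 'ember-data';

export default DS.JSONAPIAdapter.extend({
  namespace: 'api/v1' // guessed from the "api/v1/answers" controller path
});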

Flask cannot parse JSON in Tabulator AJAX request (using advanced configuration)

I regularly use Tabulator's setData() method. I usually set parameters in the URL args, and have no problems with it. But I now have a complex use case that will be easier to solve if I can put a JSON payload into the request.
I've followed the Tabulator documentation for an advanced configuration.
I've made a series of attempts (putting the JSON in various places, using quotes/double quotes in the JSON, etc) at trying to work out the problem. The Flask server always returns this error:
Failed to decode JSON object: Expecting value: line 1 column 1 (char 0)
What makes me suspect the problem is with Tabulator, not Flask, is that I printed request.__dict__ and couldn't find the JSON in the request. (I.e. that seems to be the reason for the error.)
The below example, which triggers the same error, is taken from the Fetch documentation (Tabulator uses the Fetch API).
Is there anything wrong with the below or should I be looking harder at Flask?
const data = { username: 'example' };
var ajaxURL = "/data/results";
var ajaxConfig = {
    method: "POST",
    headers: {
        'Content-Type': 'application/json',
        'X-CSRFToken': csrf_token,
    },
    body: JSON.stringify(data)
};
ResultsTable.setData(ajaxURL, {}, ajaxConfig);
Notes:
I'm using the latest version of Tabulator (4.9).
ResultsTable is set elsewhere in the code and is successfully loading default data when the page loads. The use case kicks in when the user sets their own parameters for the data.
The CSRF token, which is set elsewhere in the code, is there because Flask requires it.
The reason that is failing is that Tabulator builds its own request body when it makes the request, and that will override your config.
In your use case, you will need to override the built-in ajax request promise and add your own function that makes the ajax request and then resolves the data.
You can do this using the ajaxRequestFunc.
Check out the Ajax Request documentation for full details.
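As a minimal, untested sketch of that approach (the "#results-table" selector and the fetch-based body are illustrative; csrf_token and the /data/results endpoint come from the question, and the ajaxRequestFunc signature is the one described in Tabulator's Ajax docs):

var ResultsTable = new Tabulator("#results-table", {
    // take over the request so the JSON body survives
    ajaxRequestFunc: function(url, config, params) {
        return fetch(url, {
            method: "POST",
            headers: {
                "Content-Type": "application/json",
                "X-CSRFToken": csrf_token,
            },
            body: JSON.stringify(params), // params is whatever you pass to setData
        }).then(function(response) {
            return response.json();       // resolve with the row data for the table
        });
    },
});

// later, pass the payload as the params argument of setData
ResultsTable.setData("/data/results", { username: "example" });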

Axios Authorization not working - VueJS + Django

I am trying to build an application using VueJS and Django. I am also using the Graphene-Django library, as the project uses GraphQL.
Now, the authentication works fine and I get a JWT token back.
But when I use the token for other queries that need authentication, I get this error in Vue:
"Error decoding signature"
and the Django log also returns this:
graphql.error.located_error.GraphQLLocatedError: Error decoding signature
jwt.exceptions.DecodeError: Not enough segments
ValueError: not enough values to unpack (expected 2, got 1)
The bizarre thing is that the same query works just fine when executed in Postman.
As I mentioned in the title, I use Axios for my requests; here's an example of a request:
axios({
    method: "POST",
    headers: { Authorization: "JWT " + localStorage.getItem("token") },
    data: {
        query: `{
            dailyAppoint (today: "${today}") {
                id
                dateTime
            }
        }`
    }
});
Note: It uses 'JWT' not 'Bearer' because somehow 'Bearer' didn't work for me.
OK, a couple of questions: does your API work without Vue.js, from curl? Generate a token and check the API from curl.
If it does, then check the headers sent with the request in the network inspector (Mozilla/Chrome dev tools) and update your post with those raw headers.
This particular error arises when your public key is unable to decode the string (token) signed by your private key, which ultimately means the access token has been tampered with. It could also mean you're sending a value like 'unknown' because of a JS state initialization error.
Check the raw headers of the request. It'll help.
Use a request interceptor to set the Authorization header:
axios.interceptors.request.use(config => {
    if (localStorage.getItem("token") != null) {
        config.headers["Authorization"] = "JWT " + localStorage.getItem("token");
    }
    return config;
});

How can I fix conflicting query string params for S3 uploads?

I'm attempting to upload raw image data to S3 in the context of a react-native app.
I have the raw data correct, and for the most part I think my code inside React Native is working correctly to capture the image data.
On my Rails server, I'm using the Amazon Ruby gem to build the details of the URL and the associated authentication data required to post data to the bucket in question, which I'm then rendering into React Native just like a regular React web front end.
# inside the rails server controller
s3_data = S3_BUCKET.presigned_post(key: "uploads/#{SecureRandom.uuid}/${filename}", success_action_status: '201', acl: 'public-read', url: 'https://jd-foo.s3-us-west-2.amazonaws.com')
render json: {s3Data: {fields: s3_data.fields, url: s3_data.url}}
At the point where I attempt to post to S3, I'm using ES6 fetch like the below to build my HTTP request.
saveImage(data) {
  var url = data.url
  var fields = data.fields
  var headers = {'Content-Type': 'multipart/form-data'}
  var body = `x-amz-algorithm=${encodeURIComponent(fields['x-amz-algorithm'])}&` +
             `x-amz-credential=${encodeURIComponent(fields['x-amz-credential'])}&` +
             `x-amz-date=${encodeURIComponent(fields['x-amz-date'])}&` +
             `x-amz-signature=${encodeURIComponent(fields['x-amz-signature'])}&` +
             `acl=${encodeURIComponent(fields['acl'])}&` +
             `key=${encodeURIComponent(fields['key'])}&` +
             `policy=${encodeURIComponent(fields['policy'])}&` +
             `success_action_status=${encodeURIComponent(fields['success_action_status'])}&` +
             `file=${encodeURIComponent('12foo')}`
  console.log(body);
  return fetch(url, {method: 'POST', body: body, headers: headers})
    .then((res) => { console.log('s3 inside api res', res['_bodyText']); res.json(); });
}
The logging of the body looks like:
x-amz-algorithm=AWS4-HMAC-SHA256&x-amz-credential=AKIAJJ22D4PSUNBB5RAQ%2F20151027%2Fus-west-1%2Fs3%2Faws4_request&x-amz-date=20151027T223159Z&x-amz-signature=42b09d7ae134f803b10ef72d220fe74a630a3f826c7f1f625448277d0a6d93c7&acl=public-read&key=uploads%2F46be8ca3-6d3a-4bb7-a658-f2c8e058bc28%2F%24%7Bfilename%7D&policy=eyJleHBpcmF0aW9uIjoiMjAxNS0xMC0yN1QyMzozMTo1OVoiLCJjb25kaXRpb25zIjpbeyJidWNrZXQiOiJqZC1mb28ifSxbInN0YXJ0cy13aXRoIiwiJGtleSIsInVwbG9hZHMvNDZiZThjYTMtNmQzYS00YmI3LWE2NTgtZjJjOGUwNThiYzI4LyJdLHsic3VjY2Vzc19hY3Rpb25fc3RhdHVzIjoiMjAxIn0seyJhY2wiOiJwdWJsaWMtcmVhZCJ9LHsieC1hbXotY3JlZGVudGlhbCI6IkFLSUFKSjIyRDRQU1VOQkI1UkFRLzIwMTUxMDI3L3VzLXdlc3QtMS9zMy9hd3M0X3JlcXVlc3QifSx7IngtYW16LWFsZ29yaXRobSI6IkFXUzQtSE1BQy1TSEEyNTYifSx7IngtYW16LWRhdGUiOiIyMDE1MTAyN1QyMjMxNTlaIn1dfQ%3D%3D&success_action_status=201&file=12foo
It seems like my problems could be tied to both:
Bad formatting of the POST body, including problems with special characters
Not providing S3 with enough data in the POST body (keys and other information); the documentation feels a bit unclear about what is and is not required.
The error back from the S3 servers looks like:
<?xml version="1.0" encoding="UTF-8"?>
<Error><Code>MalformedPOSTRequest</Code><Message>The body of your POST request is not well-formed multipart/form-data.</Message> <RequestId>DCE88AC349D7B2E8</RequestId><HostId>AKE1xctETuZMAhBFLfyuFlDxikYUlbAC7YufkM7h8Z8eVQdtLA25Z0Od/a4cMUbfW1nWnGjc+vM=</HostId></Error>
I'm pretty unclear on what my actual problems are and where I should be digging in.
Any input would be greatly appreciated.
<Message>The body of your POST request is not well-formed multipart/form-data.</Message>
It may not be that you're missing values from the body. The most significant issue here is that the structure of your body does not resemble multipart/form-data.
See RFC 2388 for how multipart/form-data works. (Or find a library that builds this for you.)
What you are sending looks more like the application/x-www-form-urlencoded format, which is used by some AWS APIs, but not S3.
There is an example in the S3 docs showing what an example POST body might look like. You should see a substantial difference there.
Note also that POST is intended for browser-based uploads. If you are uploading from code, you're doing a lot of extra work. PUT Object is much more straightforward: the request body is the binary file contents. Or, if this will eventually be done by a browser, then test it with a browser and let the browser build your form.
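As a hedged illustration of the multipart route, the sketch below lets FormData build the body (and the boundary) instead of hand-concatenating a urlencoded string. The fileBlob argument is a hypothetical file/blob, and data is the {url, fields} object returned by the Rails presigned_post endpoint above; it assumes FormData is available in your React Native/fetch environment.

saveImage(data, fileBlob) {
  var form = new FormData();
  // append every presigned-post field first; S3 requires the file field to come last
  Object.keys(data.fields).forEach(function(key) {
    form.append(key, data.fields[key]);
  });
  form.append('file', fileBlob);
  // note: no Content-Type header here -- fetch sets multipart/form-data
  // with the correct boundary when the body is a FormData object
  return fetch(data.url, {method: 'POST', body: form})
    .then((res) => res.text());
}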

dojox.form.Uploader 403 error from django server

I have a very simple form that I've added the uploader to. When I invoke the uploader, Django returns
{"detail":"CSRF Failed: CSRF token missing or incorrect."}
This is the uploader:
var ul = new Uploader({
    label: "Programmed uploader",
    multiple: false,
    uploadOnSelect: true,
    url: Environment.apiRoot + "upload/",
    headers: {
        "Accept": "application/json",
        "X-CSRFToken": dojo.cookie("csrftoken")
    }
}).placeAt(form);
I created a simple "test" button that invokes a function that performs the same POST.
new Button({
    name: "Cancel2",
    //id: "Cancel",
    label: "Cancel",
    placement: "secondary",
    onClick: lang.hitch(this, function(event) {
        this._testpost()
    })
}).placeAt(form);
This is the relevant header from the Uploader POST:
Cookie djdt=hide; csrftoken=WwlARc9OUevblKfgNEDU2Ae4eT9z0kos;sessionid=du37rjyam6v69mw0bgctkbw708xlvc5g
This is the _testpost():
_testpost: function() {
    xhr.post({
        url: Environment.apiRoot + "upload/",
        handleAs: "json",
        postData: json.stringify(data),
        headers: {
            "Content-Type": "application/json",
            "Accept": "application/json",
            "X-CSRFToken": dojo.cookie("csrftoken")
        },
        loadingMessage: "Submitting form..."
    }).then(
        lang.hitch(this, function(result) {
            form = t._f_form;
            dojo.destroy(form);
            this._float.destroyRecursive();
            alert(result['result_text']);
            result['message'] = "Update Request Accepted";
        }),
        lang.hitch(this, function(err) {
            form = t._f_form;
            dojo.destroy(form);
            this._float.destroyRecursive();
            topic.publish("/application/message", "An error occurred.");
        })
    );
}
This is the relevant header from invoking the _testpost function:
Cookie djdt=hide; csrftoken=WwlARc9OUevblKfgNEDU2Ae4eT9z0kos;sessionid=du37rjyam6v69mw0bgctkbw708xlvc5g
X-CSRFToken WwlARc9OUevblKfgNEDU2Ae4eT9z0kos
The key difference is that in _testpost the X-CSRFToken is put into the header, but on the Uploader POST I don't have any means to put in an X-CSRFToken (my headers attribute seems to just be ignored; I tried it to see if I could get this to work).
Is there any way to get additional headers into the Uploader?
Unfortunately, dojox.form.Uploader does not allow headers to be added.
There are a couple of options. It sounds like you have access to the CSRF token and could append it to the URL. Another option may be to provide the CSRF token as a cookie, and it should be sent with the XHR and Flash requests.
What I have done (and I'm not sure this is correct) is to disable CSRF checking within the Django view, then pull the CSRF value out of the header and compare it against the CSRF value kept in the session record on the server.
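To illustrate the first option (appending the token to the URL), something like the sketch below might work; the csrfmiddlewaretoken parameter name is only an illustration, and as noted above the Django view would still have to read and validate that value itself:

var ul = new Uploader({
    label: "Programmed uploader",
    multiple: false,
    uploadOnSelect: true,
    // pass the CSRF token as a query string parameter instead of a header
    url: Environment.apiRoot + "upload/?csrfmiddlewaretoken=" + encodeURIComponent(dojo.cookie("csrftoken"))
}).placeAt(form);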
You may use dojo/aspect to add the headers to the dojox.form.Uploader.
In case you are using the HTML5 upload "plugin", which it looks like you are since you have left the default, you may use something like:
aspect.after(ul, "createXhr", function(xhr) {
    xhr.setRequestHeader("Accept", "application/json");
    xhr.setRequestHeader("X-CSRFToken", dojo.cookie("csrftoken"));
    return xhr;
});
Add this just after you create the Uploader. Also remember to require dojo/aspect.
Notice that this is a bit of a hack and prone to breakage if changes happen in the dojox.form.Uploader structure (e.g. they update it to use dojo/promise or other fixes). Also, it's implied that this works only for the HTML5 plugin, but you may extend the code in the same way to cope with other plugins by inspecting ul.uploadType and making the change specific to that plugin.
This solution works up to and including Dojo version 1.12. In 2017 the breakage announced above did happen, and this no longer works with Dojo versions 1.13 and upward.

How to send data as key-value pairs instead of a string via POST using XHR

I'm creating two POST calls: one using a Django form and one using AngularJS via a resource XHR.
The angular setup looks like this:
myModule.factory('gridData', function($resource) {
    // define resource class
    var root = {{ root.pk }};
    var csrf = '{{ csrf_token }}';
    return $resource('{% url getJSON4SlickGrid root.pk %}:wpID/', {wpID: '#id'}, {
        get: {method: 'GET', params: {}, isArray: true},
        update: {method: 'POST', headers: {'X-CSRFToken': csrf}}
    });
});
I create the XHR POST request like so:
item.$update();
This POST request is sent to the server as expected, but when I want to access the QueryDict I cannot access the data passed using:
name = request.POST.get('name', None)
name is always None like this.
The issue behind this is that the QueryDict object is getting parsed quite strangely.
print request.POST
<QueryDict: {u'{"name":"name update","schedule":0"}':[u'']}>
Whereas I would have expected this result, which I got when sending the data via a "normal" POST request:
<QueryDict: {u'name': [u'name update'], u'schedule': [u'0']}>
So it seems that Django receives something in the POST request which instructs it to parse the parameters into one string. Any idea how to circumvent this?
Update:
I found this discussion where they say that the issue is that if you provide any content type other than MULTIPART_CONTENT, the parameters will be parsed into one string. I checked the content type sent with the POST request and it really is set to 'CONTENT_TYPE': 'application/json;charset=UTF-8', so this is likely the issue. Therefore my question is: how can I set the CONTENT_TYPE for an XHR POST request created using angular.js resources to MULTIPART_CONTENT?
You could either:
fiddle with the client to send form data instead of JSON
use json.loads(request.raw_post_data).get('name', None) (Django < 1.4)
use json.loads(request.body).get('name', None) (Django >= 1.4)
The Angular documentation talks about transforming requests and responses
To override these transformation locally, specify transform functions as transformRequest and/or transformResponse properties of the config object. To globally override the default transforms, override the $httpProvider.defaults.transformRequest and $httpProvider.defaults.transformResponse properties of the $httpProvider.
You can find an example here, as was previously pointed out.
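As a rough, untested sketch of that approach (it reuses the factory from the question and assumes your Django view expects ordinary form-encoded params), you could serialize the payload as application/x-www-form-urlencoded in a transformRequest so Django fills request.POST as usual:

myModule.factory('gridData', function($resource) {
    var csrf = '{{ csrf_token }}';

    // turn {name: "name update", schedule: 0} into "name=name%20update&schedule=0"
    function toFormUrlEncoded(data) {
        var pairs = [];
        angular.forEach(data, function(value, key) {
            if (key.charAt(0) !== '$') { // skip angular-internal properties like $promise
                pairs.push(encodeURIComponent(key) + '=' + encodeURIComponent(value));
            }
        });
        return pairs.join('&');
    }

    return $resource('{% url getJSON4SlickGrid root.pk %}:wpID/', {wpID: '#id'}, {
        get: {method: 'GET', params: {}, isArray: true},
        update: {
            method: 'POST',
            transformRequest: toFormUrlEncoded,
            headers: {
                'X-CSRFToken': csrf,
                'Content-Type': 'application/x-www-form-urlencoded'
            }
        }
    });
});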