I use Postman 6.0 to send an HTTP request. Before sending a request, I use a pre-request script to get a token and put it into the environment so that it can be used in subsequent requests.
The script below doesn't work because the body is not sent. Is there anything wrong with it?
const getTaxAccessToken = {
    url: 'http://dev.xxx.com:4001/api/v1/portal/account/tax-login',
    method: "post",
    body: {
        'loginIdentity': 'admic',
        'password': 'abc123'
    },
    header: {
        'Content-Type': 'application/json'
    }
};
pm.sendRequest(getTaxAccessToken, function (err, response) {
    console.log("get accesstoken");
    console.log(response.access_Token);
    pm.environment.set("taxAccessToken", response.access_Token);
});
If the request needs to be of type application/x-www-form-urlencoded:
const options = {
    url: 'http://some/url',
    method: 'POST',
    header: {
        'Accept': '*/*',
        'Content-Type': 'application/x-www-form-urlencoded',
    },
    body: {
        mode: 'urlencoded',
        urlencoded: [
            { key: 'loginIdentity', value: 'admic' },
            { key: 'password', value: 'abc123' },
        ]
    }
};
pm.sendRequest(options, function (err, res) {
    // Use the err and res
    // ...
    pm.environment.set("my-token", res.json().access_token);
});
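Once the token is stored, later requests can pick it up; for example, in a later request's pre-request script (the Bearer scheme here is an assumption about the target API):
// Assumption: the API expects a Bearer token; adjust the scheme to match yours.
pm.request.headers.upsert({
    key: 'Authorization',
    value: 'Bearer ' + pm.environment.get('my-token')
});
Equivalently, you can reference {{my-token}} directly in the request's Authorization header field.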
Postman JavaScript API references:
Request
RequestBody
Try this:
body: {
    mode: 'raw',
    raw: JSON.stringify({ 'loginIdentity': 'admic', 'password': 'abc123' })
}
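For context, a minimal complete sketch of that fix, reusing the endpoint and credentials from the question (the access_Token property name is taken from the question and is an assumption about the API's response):
const getTaxAccessToken = {
    url: 'http://dev.xxx.com:4001/api/v1/portal/account/tax-login',
    method: 'POST',
    header: {
        'Content-Type': 'application/json'
    },
    body: {
        mode: 'raw',
        raw: JSON.stringify({ 'loginIdentity': 'admic', 'password': 'abc123' })
    }
};
pm.sendRequest(getTaxAccessToken, function (err, response) {
    // Parse the JSON body before reading the token property.
    pm.environment.set("taxAccessToken", response.json().access_Token);
});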
For Postman >= v8.3.0
With Postman v8.3.0, the update() method was introduced, which allows you to set the request body directly from the pre-request script.
For your use case you could simply use:
pm.request.body.update({
    mode: 'raw',
    raw: JSON.stringify({ 'loginIdentity': 'admic', 'password': 'abc123' })
});
or even shorter:
pm.request.body.update(JSON.stringify({'loginIdentity': 'admic', 'password': 'abc123'}));
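Note that update() only changes the body; if the request doesn't already declare a JSON Content-Type, a hedged companion step (assuming the header isn't set elsewhere) is:
// Assumption: the request does not already carry a JSON Content-Type header.
pm.request.headers.upsert({ key: 'Content-Type', value: 'application/json' });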
Strings, form-data, urlencoded and other Content-Types
As the title is not specifically tailored to JSON request bodies, I thought I'd add some examples of how to handle this for other data, as many might find this page when searching on Google and run into this issue for other Content-Types.
Raw
raw in Postman expects a string, so you can transmit anything that can be expressed as a string, e.g. plain text, HTML, XML, JSON, etc.
// plain text
pm.request.body.update(`Hello World!`);
// HTML
pm.request.body.update(`<html>...</html>`);
// XML
pm.request.body.update(`<xml>...</xml>`);
// JSON
pm.request.body.update(JSON.stringify({ key: `value` }));
URL-encoded
pm.request.body.update({
    mode: "urlencoded",
    urlencoded: [{
        key: "key",
        value: "value with spaces and special chars ?/ and umlaute öüä"
    }]
});
Form data
pm.request.body.update({
    mode: "formdata",
    formdata: [{
        key: "key",
        value: "value with spaces and special chars ?/ and umlaute öüä"
    }]
});
GraphQL
pm.request.body.update({
    mode: 'graphql',
    graphql: {
        query: `
            query {
                hero {
                    name
                    friends {
                        name
                    }
                }
            }`
    }
});
Example based on GraphQL Tutorial for Fields.
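The graphql mode can also carry variables; in exported collections these appear as a JSON string, so a hedged sketch along those lines (treat the exact shape as an assumption) would be:
pm.request.body.update({
    mode: 'graphql',
    graphql: {
        query: `query Hero($episode: Episode) { hero(episode: $episode) { name } }`,
        // Assumption: variables are a JSON string, matching the exported collection format.
        variables: JSON.stringify({ episode: "JEDI" })
    }
});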
Files from local file system as form-data
pm.request.body.update({
    mode: "formdata",
    formdata: [
        {
            key: "file", // does not need to be "file"
            type: "file", // MUST be "file"
            src: "/C:/Users/MyUser/Documents/myFile.zip"
        }
    ]
});
Please note: this will only work for files in your current working directory. Otherwise you will receive an error like this in the Postman console: Form param 'file', file load error: PPERM: insecure file access outside working directory.
You can see where your working directory is under Settings | General | Working Directory. There is also an option, Allow reading files outside working directory, which you can enable to read files from anywhere, but be aware that this can allow others to steal data from your computer, e.g. when you execute untrusted collections.
Related
I would like to use a Postman pre-request script to refresh my app secret from an API protected by AWS signature. I am able to do basic authentication like this; however, I need AWS signature authentication.
var url = "https://some.endpoint";
var auth = {
    type: 'basic',
    basic: [
        { key: "username", value: "postman" },
        { key: "password", value: "secrets" }
    ]
};
var request = {
    url: url,
    method: "GET",
    auth: auth
};
pm.sendRequest(request, function (err, res) {
    const json = res.json(); // Get JSON value from the response body
    console.log(json);
});
Hi, just create a normal Postman request that works properly, then copy that request to a variable by adding the line below in the Tests script:
pm.environment.set("awsrequest", pm.request)
Now you can use the awsrequest variable in pm.sendRequest:
pm.sendRequest(pm.environment.get("awsrequest"))
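One caveat as a hedged note: environment values are persisted as strings, so depending on your Postman version the stored request may come back serialized and need parsing first:
// Assumption: set() serialized the request object to a JSON string.
const saved = pm.environment.get("awsrequest");
pm.sendRequest(typeof saved === "string" ? JSON.parse(saved) : saved, function (err, res) {
    console.log(err ? err : res.json());
});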
How to pass Cookie headers from gatsby-source-graphql?
I'm using gatsby-source-graphql (https://github.com/gatsbyjs/gatsby/tree/master/packages/gatsby-source-graphql) and recently had to implement AWS CloudFront Signed Cookies to authorise users to access a private staging environment. For this reason, the requests to the GraphQL endpoint, handled by the plugin, need to have the cookie in the request header, which I do by:
{
    resolve: 'gatsby-source-graphql',
    options: {
        cookie: 'var1=val1; var2=val2; '
    }
}
The above fails with:
ServerParseError: Unexpected token < in JSON at position 0
If I disable Signed Cookies and make the endpoint public, it works.
And if I keep it private and test with curl, it works:
curl --cookie 'var1=val1; var2=val2; ' graphql_endpoint.com
I tried to figure out why the Cookie header is not passed, but it seems the problem is in a different plugin that the plugin above uses, called 'apollo-link-http' (https://github.com/gatsbyjs/gatsby/blob/master/packages/gatsby-source-graphql/src/gatsby-node.js).
Meanwhile, looking at apollo-http-link (https://www.apollographql.com/docs/link/links/http/) and an issue reported here (https://github.com/apollographql/apollo-client/issues/4455), I tried:
{
    resolve: 'gatsby-source-graphql',
    options: {
        typeName: 'FOOBAR',
        fieldName: 'foobar',
        createLink: (pluginOptions) => {
            return createHttpLink({
                uri: process.env.GATSBY_GRAPHQL_API_URL,
                credentials: 'include',
                headers: {
                    cookie: "CloudFront-Policy=xxxxx_; CloudFront-Key-Pair-Id=xxxxx; CloudFront-Signature=xxxxxxxxxx; path=/;",
                },
                fetch,
            })
        },
    }
},
Without success; the same error as before.
I also tried to use the fetch options for node-fetch:
{
    resolve: 'gatsby-source-graphql',
    options: {
        typeName: 'FOOBAR',
        fieldName: 'foobar',
        url: process.env.GATSBY_GRAPHQL_API_URL,
        fetchOptions: {
            credentials: 'include',
            headers: {
                cookie: "CloudFront-Policy=xxxxx_; CloudFront-Key-Pair-Id=xxxxx; CloudFront-Signature=xxxxxxxxxx; path=/;",
            },
        },
    }
},
You can see fetchOptions here (https://github.com/gatsbyjs/gatsby/blob/master/packages/gatsby-source-graphql/src/gatsby-node.js).
No success! This is probably a bug.
After spending a lot of time looking at the docs and other reports, I found a solution based on the attempts I originally posted.
I started by looking at the browser version and checking the cookie header property name to avoid any typos. I determined it should be "Cookie", whereas most examples I found mention 'cookie', etc.
With that said, I've checked the documentation for all the related packages and source code:
https://github.com/gatsbyjs/gatsby/tree/master/packages/gatsby-source-graphql
https://github.com/gatsbyjs/gatsby/blob/master/packages/gatsby-source-graphql/src/gatsby-node.js
https://www.apollographql.com/docs/link/links/http/
https://github.com/apollographql/apollo-client/issues/4455
Finally, I declared the cookie in the headers parameter and, in a separate property, the options for the node-fetch package:
https://github.com/bitinn/node-fetch
The result:
{
    resolve: 'gatsby-source-graphql',
    options: {
        typeName: 'FOOBAR',
        fieldName: 'foobar',
        url: process.env.GATSBY_GRAPHQL_API_URL,
        headers: {
            Cookie: 'CloudFront-Policy=xxxxx_; CloudFront-Key-Pair-Id=xxxxx; CloudFront-Signature=xxxxxxxxxx; path=/;'
        },
        credentials: 'include',
    }
},
What happens above is that credentials: 'include' allows cross-origin requests and enables cookies (https://www.apollographql.com/docs/react/networking/authentication/#cookie).
Hope this helps someone else in the future, as it's not trivial.
I've been experiencing some issues with AWS Kinesis, inasmuch as I have a stream set up and I want to use a standard HTTP POST request to invoke a Kinesis PutRecord call on my stream. I'm doing this because the bundle size of my resulting JavaScript application matters, and I'd rather not import the aws-sdk to accomplish something that should (on paper) be possible.
Just so you know, I've looked at this other Stack Overflow question about the same thing, and it was... sort of informational.
Now, I already have a method to sigv4-sign a request using an access key, secret key, and session token, but when I finally get the result of signing the request and send it using the in-browser fetch API, the service tanks with an internal failure (or with a JSON object citing the same thing, depending on my Content-Type header, I guess) as the result.
Here's the code I'm working with:
// There is a global function "sign" that does sigv4 signing
// ...
var payload = {
    Data: { task: "Get something working in kinesis" },
    PartitionKey: "1",
    StreamName: "MyKinesisStream"
}

var credentials = {
    "accessKeyId": "<access.key>",
    "secretAccessKey": "<secret.key>",
    "sessionToken": "<session.token>",
    "expiration": 1528922673000
}

function signer({ url, method, data }) {
    // Wrapping with URL for piecemeal picking of parsed pieces
    const parsed = new URL(url);
    const [ service, region ] = parsed.host.split(".");
    const signed = sign({
        method,
        service,
        region,
        url,
        // Hardcoded
        headers : {
            Host : parsed.host,
            "Content-Type" : "application/json; charset=UTF-8",
            "X-Amz-Target" : "Kinesis_20131202.PutRecord"
        },
        body : JSON.stringify(data),
    }, credentials);
    return signed;
}

// Specify method, url, data body
var signed = signer({
    method: "POST",
    url: "https://kinesis.us-west-2.amazonaws.com",
    data : JSON.stringify(payload)
});

var request = fetch(signed.url, signed);
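(Since fetch resolves even on HTTP error statuses, here is a sketch of how the response body can be inspected; the logging wiring is illustrative only:)
// fetch() only rejects on network failure; HTTP 4xx/5xx still resolve,
// so read the JSON body to see the service's error payload.
request
    .then(function (res) { return res.json(); })
    .then(function (json) { console.log(json); })
    .catch(function (err) { console.error("network error:", err); });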
When I look at the result of request, I get this:
{
    Output: {
        __type: "com.amazon.coral.service#InternalFailure"
    },
    Version: "1.0"
}
Now I'm unsure as to whether Kinesis is actually failing here, or if my input is malformed.
Here's what the signed request looks like:
{
    "method": "POST",
    "service": "kinesis",
    "region": "us-west-2",
    "url": "https://kinesis.us-west-2.amazonaws.com",
    "headers": {
        "Host": "kinesis.us-west-2.amazonaws.com",
        "Content-Type": "application/json; charset=UTF-8",
        "X-Amz-Target": "Kinesis_20131202.PutRecord",
        "X-Amz-Date": "20180613T203123Z",
        "X-Amz-Security-Token": "<session.token>",
        "Authorization": "AWS4-HMAC-SHA256 Credential=<access.key>/20180613/us-west-2/kinesis/aws4_request, SignedHeaders=content-type;host;x-amz-target, Signature=ba20abb21763e5c8e913527c95a0c7efba590cf5ff1df3b770d4d9b945a10481"
    },
    "body": "\"{\\\"Data\\\":{\\\"task\\\":\\\"Get something working in kinesis\\\"},\\\"PartitionKey\\\":\\\"1\\\",\\\"StreamName\\\":\\\"MyKinesisStream\\\"}\"",
    "test": {
        "canonical": "POST\n/\n\ncontent-type:application/json; charset=UTF-8\nhost:kinesis.us-west-2.amazonaws.com\nx-amz-target:Kinesis_20131202.PutRecord\n\ncontent-type;host;x-amz-target\n508d2454044bffc25250f554c7b4c8f2e0c87c2d194676c8787867662633652a",
        "sts": "AWS4-HMAC-SHA256\n20180613T203123Z\n20180613/us-west-2/kinesis/aws4_request\n46a252f4eef52991c4a0903ab63bca86ec1aba09d4275dd8f5eb6fcc8d761211",
        "auth": "AWS4-HMAC-SHA256 Credential=<access.key>/20180613/us-west-2/kinesis/aws4_request, SignedHeaders=content-type;host;x-amz-target, Signature=ba20abb21763e5c8e913527c95a0c7efba590cf5ff1df3b770d4d9b945a10481"
    }
}
(The test key is used by the library that generates the signature, so ignore that.)
(Also, there are probably extra slashes in the body because I pretty-printed the response object using JSON.stringify.)
My question: is there something I'm missing? Does Kinesis require headers a, b, and c while I'm only generating two of them? Or is this internal error an actual failure? I'm lost because the response suggests nothing I can do on my end.
I appreciate any help!
Edit: As a secondary question, am I using the X-Amz-Target header correctly? This is how you reference calling a service function so long as you're hitting that service endpoint, no?
Update: Following Michael's comments, I've gotten somewhere, but I still haven't solved the problem. Here's what I did:
I made sure that in my payload I'm only running JSON.stringify on the Data property.
I also modified the Content-Type header to be "Content-Type" : "application/x-amz-json-1.1", and as such, I'm getting slightly more useful error messages back.
Now, my payload is still mostly the same:
var payload = {
    Data: JSON.stringify({ task: "Get something working in kinesis" }),
    PartitionKey: "1",
    StreamName: "MyKinesisStream"
}
and my signer function body looks like this:
function signer({ url, method, data }) {
    // Wrapping with URL for piecemeal picking of parsed pieces
    const parsed = new URL(url);
    const [ service, region ] = parsed.host.split(".");
    const signed = sign({
        method,
        service,
        region,
        url,
        // Hardcoded; Content-Type updated to the x-amz-json variant as described above
        headers : {
            Host : parsed.host,
            "Content-Type" : "application/x-amz-json-1.1",
            "X-Amz-Target" : "Kinesis_20131202.PutRecord"
        },
        body : data,
    }, credentials);
    return signed;
}
So I'm passing in an object that is partially serialized (at least Data is), and when I send this to the service, I get a response of:
{"__type":"SerializationException"}
which is at least marginally helpful, because it tells me that my input is technically incorrect. However, I've done a few things in an attempt to correct this:
I've run JSON.stringify on the entire payload.
I've changed my Data key to just be a string value to see if it would go through.
I've tried running JSON.stringify on Data and then running btoa, because I read on another post that that worked for someone.
But I'm still getting the same error. I feel like I'm so close. Can you spot anything I might be missing or something I haven't tried? I've gotten sporadic UnknownOperationExceptions, but right now this SerializationException has me stumped.
Edit 2:
As it turns out, Kinesis will only accept a base64-encoded string in Data. This is probably a nicety that the aws-sdk handles for you, but essentially all it took was Data: btoa(JSON.stringify({ task: "data" })) in the payload to get it working.
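Putting the edits together, a minimal sketch of the payload that finally worked (values taken from the question):
// Kinesis wants Data as a base64-encoded string; btoa handles that in the browser.
var payload = {
    Data: btoa(JSON.stringify({ task: "Get something working in kinesis" })),
    PartitionKey: "1",
    StreamName: "MyKinesisStream"
};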
While I'm not certain this is the only issue, it seems like you are sending a request body that contains an incorrectly serialized (double-encoded) payload.
var obj = { foo: 'bar'};
JSON.stringify(obj) returns a string...
'{"foo": "bar"}' // the ' are not part of the string, I'm using them to illustrate that this is a thing of type string.
...and when parsed with a JSON parser, this returns an object.
{ foo: 'bar' }
However, JSON.stringify(JSON.stringify(obj)) returns a different string...
'"{\"foo\": \"bar\"}"'
...but when parsed, this returns a string.
'{"foo": "bar"}'
The service endpoint expects to parse the body and get an object, not a string... so, parsing the request body (from the service's perspective) doesn't return the correct type. The error seems to be a failure of the service to parse your request at a very low level.
In your code, body: JSON.stringify(data) should just be body: data because earlier, you already created a JSON object with data: JSON.stringify(payload).
As written, you are effectively setting body to JSON.stringify(JSON.stringify(payload)).
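In other words, a sketch of the corrected call: stringify exactly once and let signer() do it.
// Pass the plain object; signer() already calls JSON.stringify on it once.
var signed = signer({
    method: "POST",
    url: "https://kinesis.us-west-2.amazonaws.com",
    data: payload
});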
Not sure if you ever figured this out, but this question pops up on Google when searching for how to do this. The one piece I think you are missing is that the record's Data field must be base64-encoded. Here's a chunk of Node.js code that will do this (using PutRecords).
And for anyone asking "why not just use the SDK?": I currently must stream data from a cluster that cannot be updated to a Node.js version the SDK requires, due to other dependencies. Yay.
const https = require('https')
const aws4 = require('aws4')

const request = function (o) {
    https.request(o, function (res) { res.pipe(process.stdout) }).end(o.body || '')
}

const _publish_kinesis = function (logs) {
    const kin_logs = logs.map(function (l) {
        let blob = JSON.stringify(l) + '\n'
        let buff = Buffer.from(blob, 'binary');
        let base64data = buff.toString('base64');
        return {
            Data: base64data,
            PartitionKey: '0000'
        }
    })

    while (kin_logs.length > 0) {
        let data = JSON.stringify({
            Records: kin_logs.splice(0, 250),
            StreamName: 'your-streamname'
        })

        let _request = aws4.sign({
            hostname: 'kinesis.us-west-2.amazonaws.com',
            method: 'POST',
            body: data,
            path: '/?Action=PutRecords',
            headers: {
                'Content-Type': 'application/x-amz-json-1.1',
                'X-Amz-Target': 'Kinesis_20131202.PutRecords'
            },
        }, {
            secretAccessKey: "****",
            accessKeyId: "****"
            // sessionToken: "<your-session-token>"
        })

        request(_request)
    }
}

var logs = [{
    'timeStamp': new Date().toISOString(),
    'value': 'test02',
}, {
    'timeStamp': new Date().toISOString(),
    'value': 'test01',
}]

_publish_kinesis(logs)
Here I am trying to upload a video to a user profile.
I have set up the JavaScript SDK and my authentication works well.
I have the following code:
FB.api(
    `/${user_id}/videos`,
    "POST",
    {
        "file_url": video,
        "description": description,
        "thumb": video_thumbnail,
        "title": title,
    },
    function (response) {
        console.log("fb response")
        console.log(response)
        if (response && !response.error) {
            /* handle the result */
            console.log("video upload response")
            console.log(response)
        }
    });
Here I get the following error:
code: 100
fbtrace_id: "FD5tVyrH9bS"
message: "(#100) Invalid format. It should be an image file data."
type: "OAuthException"
I am using file_url and passing the URL of my video, so I'd expect it to upload the video.
Thank you for the response
I confirm that you must post image file data in the source field when posting to Facebook.
You can test this with Postman.
Here is an example:
var fs = require("fs");
var request = require("request");

var options = {
    method: 'POST',
    url: 'https://graph.facebook.com/v2.11/2011156779127713/thumbnails',
    headers: {
        'Postman-Token': '6c17c103-d8f6-47a5-713b-b3709dde762d',
        'Cache-Control': 'no-cache',
        'content-type': 'multipart/form-data; boundary=----WebKitFormBoundary7MA4YWxkTrZu0gW'
    },
    formData: {
        access_token: 'test',
        is_preferred: 'true',
        source: {
            // Pass a real stream here; Postman's generated snippet wraps this call
            // in quotes, which would send the code as a literal string, not file data.
            value: fs.createReadStream("./Downloads/923249_818835191462845_1528674847924045075_n.jpg"),
            options: {
                filename: './Downloads/923249_818835191462845_1528674847924045075_n.jpg',
                contentType: null
            }
        }
    }
};

request(options, function (error, response, body) {
    if (error) throw new Error(error);
    console.log(body);
});
The problem isn't the video or the URL; it's the thumb parameter.
The thumb parameter needs to be file data, not a URL.
As to what format the image needs to be in... please let me know if you find out! I'm asking the same here.
The Facebook API is terrible...
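For anyone attempting this, here is a hedged sketch of sending thumb as actual file data via multipart form-data from the browser; the graph-video host, the thumbFile File/Blob, and the surrounding variables are assumptions, not a verified recipe:
// Assumes thumbFile is a File/Blob (e.g. from an <input type="file">) and that
// video, title, description, user_id and access_token exist as in the question.
const form = new FormData();
form.append("file_url", video);          // the video itself can still be a URL
form.append("title", title);
form.append("description", description);
form.append("thumb", thumbFile);         // actual image file data, not a URL
form.append("access_token", access_token);
fetch(`https://graph-video.facebook.com/${user_id}/videos`, {
    method: "POST",
    body: form
}).then(res => res.json()).then(console.log);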
I'm having an issue POSTing data to a Node/Express API that leverages Mongoose and MongoDB. When attempting a bulk insert using this schema, data and handler:
// Mongoose schema
var NotificationSchema = new Schema({
    uuid: {
        type: String,
        required: true,
        index: true
    },
    message: {
        type: String,
        required: true
    },
    url: {
        type: String,
        required: true
    }
});
// sample data
[
    {
        'uuid': '34e1ffef49ad4001bb9231c21bdb3be7',
        'url': '/polls/4666386cb92348af93417e9abb9ce880/forecast/',
        'message': '#btaylor has shared a poll with you'
    },
    {
        'uuid': '42d6a9f4b3f5416b952452c26e01789a',
        'url': '/polls/4666386cb92348af93417e9abb9ce880/forecast/',
        'message': '#btaylor has shared a poll with you'
    }
]
// route handler
Notification.prototype.bulkInsert = function (data, callback) {
    NotificationSchema.collection.insert(data, function (error, documents) {
        if (error) { return callback(error, null); }
        if (documents.length == 0) { return callback(null, null); }
        callback(null, documents);
    });
};
I get this back when POSTed as x-www-form-urlencoded via Postman:
{ [MongoError: Client Error: bad object in message: bson length doesn't match what we found]
  name: 'MongoError',
  err: 'Client Error: bad object in message: bson length doesn\'t match what we found',
  code: 10307,
  n: 0,
  connectionId: 125,
  ok: 1 }
My Mocha tests posting the same data work just fine. What am I doing wrong?
[Update]
After further testing, it appears that the body of the request is being improperly parsed when posted from my Django web application using the requests library.
My post is constructed as:
requests.post(url, data=data)
where data is a Python dictionary:
{'data': [{'url': '/polls/4666386cb92348af93417e9abb9ce880/forecast/', 'message': '#btaylor has shared a poll with you', 'uuid': '34e1ffef49ad4001bb9231c21bdb3be7'}, {'url': '/polls/4666386cb92348af93417e9abb9ce880/forecast/', 'message': '#btaylor has shared a poll with you', 'uuid': '42d6a9f4b3f5416b952452c26e01789a'}]}
The data argument that the above route handler receives is populated from req.body.data. In my Express middleware, I am using the following body parsers:
app.use(bodyParser.urlencoded({
    extended: true
}));
app.use(bodyParser.json());
However, logging the request body as posted from Django/requests results in:
[ 'url', 'message', 'uuid', 'url', 'message', 'uuid' ]
Why are the values being stripped out, along with the curly braces defining the objects? The content type of the post is reported correctly as:
application/x-www-form-urlencoded
Anyone have any ideas? This works perfectly from Postman, btw.
Turns out there was one problem on the Python side and one on the Express side.
The list of objects that I was posting from Python needed to be converted to a JSON string before being set in the posted values (presumably, form-encoding a list of dicts reduces each dict to its keys, which would explain the log above):
# views.py
notifications = json.dumps([{"uuid": profile.uuid_str,
                             "message": message,
                             "url": poll_forecast_url}
                            for profile in shared_with])

requests.post(url, data={'data': notifications})
Earlier in my question you'll notice that I indicated the tests from Postman were failing. That's because the value of req.body.data on the Express side was being received as a string when an encoding of x-www-form-urlencoded was set in the Postman options. To fix this, I added this line before the Notification.bulkInsert() function call:
var data = typeof(req.body.data) == 'string' ? JSON.parse(req.body.data) : req.body.data;
to properly convert the JSON string to an object before passing it to .bulkInsert().
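For completeness, a hedged sketch of how that guard sits in the route; the path, the notification instance, and the status codes are assumptions:
// Hypothetical wiring; only the typeof/JSON.parse guard comes from the answer above.
app.post('/api/notifications/bulk', function (req, res) {
    var data = typeof (req.body.data) == 'string' ? JSON.parse(req.body.data) : req.body.data;
    notification.bulkInsert(data, function (error, documents) {
        if (error) { return res.status(500).json({ error: error.message }); }
        res.status(201).json(documents);
    });
});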