How to run a GET and a POST request from the pre-request tab of another request in Postman?

I am in a situation where I need to run a request (let's say request C) based on the responses of requests A and B. Here A is a GET and B is a POST request. I have tried calling pm.sendRequest twice in the pre-request tab of C, but the main problem I am facing is that B keeps running ahead of A: the POST fires before the GET has returned. As a result I am unable to run request C successfully.
Here is a sample of my pre-request script:
const getOTP = {
    method: 'GET',
    url: `${pm.environment.get('base_url')}/${pm.environment.get('common_path')}/otp-login?msisdn=${pm.environment.get('msisdnWhite')}`,
    header: {
        'User-Agent': 'something..',
        'Accept-Language': 'en'
    },
};

setTimeout(15000)

const postOTP = {
    method: 'POST',
    url: `${pm.environment.get('base_url')}/${pm.environment.get('common_path')}/otp-login`,
    header: {
        'Content-Type': 'application/json',
        'Accept-Language': 'en'
    },
    body: {
        mode: 'application/json',
        raw: JSON.stringify({
            "msisdn": "123456789",
            "otp": "0000"
        })
    }
};
pm.sendRequest(getOTP, (err, response) => {
    const jsonResponse = response.json();
    pm.environment.set("GETOTP", jsonResponse.result);
    console.log(jsonResponse);
});

pm.sendRequest(postOTP, (err, response) => {
    const jsonData = response.json();
    pm.environment.set("access_token", jsonData.access_token);
});

It seems you need to run request A, wait for its response, then run request B and wait for its response in turn.
To do this, trigger the sendRequest() for B inside the callback of request A.
For example:
// send GET request A
pm.sendRequest(getOTP, (err, response) => {
    // here you are in the callback of request A;
    // this runs after the response of request A has arrived
    const jsonResponse = response.json();
    pm.environment.set("GETOTP", jsonResponse.result);
    console.log(jsonResponse);

    // send POST request B
    pm.sendRequest(postOTP, (err, response) => {
        // here you are in the callback of request B
        const jsonData = response.json();
        pm.environment.set("access_token", jsonData.access_token);
    });
});
I saw you set a setTimeout(15000); I guess it's meant to wait for the answer of A. You shouldn't use setTimeout for this purpose, so please delete it.
That said, you should know that even if you wanted setTimeout() to be useful, it would have to sit between the two sendRequest() calls, with the second call inside its callback. setTimeout takes a function and a delay, so setTimeout(15000) on its own does nothing, and sitting between the two variable declarations it doesn't delay anything.
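With that in place, the main request C runs only after both callbacks have stored their values, so it can reference them directly. For instance, assuming your API expects the token as a bearer token (a guess on my part, not something your script confirms), request C's Authorization header would simply be:
Authorization: Bearer {{access_token}}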

Related

Postman test script - how to call an api twice to simulate 409 error

I am trying to run a few automated tests using the Postman tool. For regular scenarios, I understand how to write pre-request and test scripts. What I do not know (and am trying to understand) is how to write scripts for checking a 409 error (let us call it a duplicate resource check).
I want to run a create-resource API like the one below, then run it again and ensure that the second invocation really returns a 409 error.
POST /myservice/books
Is there a way to run the same API twice and check the return value of the second invocation? If yes, how do I do that? One crude way of achieving this could be to create a dependency between two tests, where the first one creates a resource and the second one uses the same payload to create the same resource again. But I am looking for a single test that does the end-to-end check.
Postman doesn't really provide a standard way to do this, but it is flexible enough. I realized that we have to write JavaScript code in the pre-request tab to make our own HTTP request (using the sendRequest method) and store the resulting data in environment variables for use by the main API call.
Here is a sample:
var phone = pm.variables.replaceIn("{{$randomPhoneNumber}}");
console.log("phone:", phone);

var baseURL = pm.variables.replaceIn("{{ROG_SERVER}}:{{ROG_PORT}}{{ROG_BASE_URL}}");
var usersURL = pm.variables.replaceIn("{{ROG_SERVICE}}/users");
var otpURL = `${baseURL}/${phone}/_otp_x`;

// Payload for partner creation
const payload = {
    "name": pm.variables.replaceIn("{{username}}"),
    "phone": phone,
    "password": pm.variables.replaceIn("{{$randomPassword}}"),
};
console.log("user payload:", payload);

function getOTP(a, callback) {
    // Get an OTP
    pm.sendRequest(otpURL, function (err, response) {
        if (err) throw err;
        var jsonData = response.json();
        pm.expect(jsonData).to.haveOwnProperty('otp');
        pm.environment.set("otp", jsonData.otp);
        pm.environment.set("phone", phone);
        pm.environment.set("username", pm.variables.replaceIn("{{$randomUserName}}"));
        if (callback) callback(jsonData.otp);
    });
}

// Get an OTP
getOTP("a", otp => {
    console.log("OTP received:", otp);
    payload.partnerRef = pm.variables.replaceIn("{{$randomPassword}}");
    payload.otp = otp;

    // Create a partner user with the OTP.
    let reqOpts = {
        url: usersURL,
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify(payload)
    };
    pm.sendRequest(reqOpts, (err, response) => {
        console.log("response?", response);
        pm.expect(response).to.have.property('code', 201);
    });

    // Get a new OTP for the main request to be executed.
    getOTP();
});
I did it in my test block. Create your normal request as you would send it, then in your tests validate that the original works, and then you can send the second request and validate its response.
You can also use the pre-request and test scripting to do something similar, or have one request run after the other in the collection (they run sequentially) to do the same testing.
For instance, I sent an API call here to create records. As I need the Key_ to delete them, I can make a call to GET /foo at my API:
pm.test("Response should be 200", function () {
pm.response.to.be.ok;
pm.response.to.have.status(200);
});
pm.test("Parse Key_ values and send DELETE from original request response", function () {
var jsonData = JSON.parse(responseBody);
jsonData.forEach(function (TimeEntryRecord) {
console.log(TimeEntryRecord.Key_);
const DeleteURL = pm.variables.get('APIHost') + '/bar/' + TimeEntryRecord.Key_;
pm.sendRequest({
url: DeleteURL,
method: 'DELETE',
header: { 'Content-Type': 'application/json' },
body: { TimeEntryRecord }
}, function (err, res) {
console.log("Sent Delete: " + DeleteURL );
});
});
});
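Tying this back to the original 409 question, a minimal sketch of a single self-contained test (assuming your create endpoint returns 201 on the first insert and 409 on an exact duplicate) could replay the request that just ran from the Tests tab:

// Tests tab of POST /myservice/books
pm.test("First create succeeded", function () {
    pm.response.to.have.status(201);
});

// pm.request holds the URL, method, headers and body of the request
// that just ran, so it can be re-sent as-is to simulate a duplicate.
pm.sendRequest(pm.request, (err, secondResponse) => {
    pm.test("Duplicate create returns 409", function () {
        pm.expect(secondResponse).to.have.property('code', 409);
    });
});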

Nuxt Vuex Helper not sending Client Cookies to API

Okay, I have the bad feeling that I'm missing a key concept in what I'm doing. I hope someone can help me out with a hint.
I'm using Nuxt and Vuex store modules. Every fetch a module action performs is wrapped in a helper function (saveFetch) that I import to reduce repetitive code, like this:
export const actions = {
    async sampleAction(context, data) {
        ...
        await saveFetch(context, 'POST', '/pages', data)
        ...
    }
}
The helper simply checks whether the user's accessToken is still valid, refreshes it if not, and then sends the request:
export const saveFetch = async (context, method = 'POST', path, data) => {
    const accessTokenExpiry = context.rootGetters['auth/getAccessTokenExpiry']
    let accessToken = context.rootGetters['auth/getAccessToken']

    // If the client provides an accessToken and the accessToken is expired,
    // refresh the token before making the "real" fetch.
    if (accessToken && accessTokenExpiry < new Date() && path !== '/auth/refresh-token') {
        if (process.client) {
            // Works fine
            await window.$nuxt.$store.dispatch('auth/refreshToken')
        } else {
            // This is where the trouble starts
            await context.dispatch('auth/refreshToken', null, { root: true })
        }
        accessToken = context.rootGetters['auth/getAccessToken']
    }

    return fetch(path, {
        method,
        headers: { ... },
        body: JSON.stringify(data),
    })
}
If the accessToken is expired, the helper function dispatches a Vuex action to refresh it. This works well on the client side, but not when the process happens on the server side.
The problem coming up on the server side is that the user has to provide a refreshToken to get a refreshed accessToken from the API. This refreshToken is stored as an HttpOnly cookie in the client. When logging the request details on the API side of things, I noticed that Nuxt is not sending that cookie.
My current workaround looks like this:
export const actions = {
    async refreshToken(context) {
        ...
        let refreshToken
        if (process?.server && this?.app?.context?.req?.headers?.cookie) {
            const parsedCookies = cookie.parse(
                this.app.context.req.headers.cookie
            )
            refreshToken = parsedCookies?.refreshToken
        }
        const response = await saveFetch(context, 'POST', '/auth/refresh-token', {
            refreshToken,
        })
        ...
    }
    ...
}
If on the server side, it accesses the req object, reads the cookies from it, and sends the refreshToken cookie's content in the request body.
This clearly looks bad to me, and I would love some feedback on how to do this better. Did I maybe miss some key concept that would have kept me out of this problem in the first place?
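The underlying issue is that a server-side fetch has no browser attached, so nothing adds the client's cookies automatically; something has to copy them from the incoming request onto the outgoing one. A sketch of one way to centralize that in the helper itself, assuming you are willing to change its signature so it can be handed the incoming req object:

export const saveFetch = async (context, method = 'POST', path, data, req) => {
    const headers = { 'Content-Type': 'application/json' }

    // On the server, forward the client's cookie header (including the
    // HttpOnly refreshToken) to the API instead of unpacking it by hand.
    if (process.server && req?.headers?.cookie) {
        headers.cookie = req.headers.cookie
    }

    return fetch(path, {
        method,
        headers,
        body: JSON.stringify(data),
    })
}

This keeps the cookie handling in one place, so actions like refreshToken no longer need to parse this.app.context.req.headers.cookie themselves.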

Apify: Preserve headers in RequestQueue

I'm trying to crawl our local Confluence installation with the PuppeteerCrawler. My strategy is to log in first, then extract the session cookies and use them in the headers of the start URL. The code is as follows:
First, I log in 'by foot' to extract the relevant credentials:
const Apify = require("apify");

const browser = await Apify.launchPuppeteer({ slowMo: 500 });
const page = await browser.newPage();
await page.goto('https://mycompany/confluence/login.action');
await page.focus('input#os_username');
await page.keyboard.type('myusername');
await page.focus('input#os_password');
await page.keyboard.type('mypasswd');
await page.keyboard.press('Enter');
await page.waitForNavigation();

// Get cookies and close the login session
const cookies = await page.cookies();
await browser.close();

const cookie_jsession = cookies.filter(cookie => {
    return cookie.name === "JSESSIONID"
})[0];
const cookie_crowdtoken = cookies.filter(cookie => {
    return cookie.name === "crowd.token_key"
})[0];
Then I'm building up the crawler structure with the prepared request header:
const startURL = {
    url: 'https://mycompany/confluence/index.action',
    method: 'GET',
    headers: {
        Accept: 'text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8',
        'Accept-Encoding': 'gzip, deflate, br',
        'Accept-Language': 'de-DE,de;q=0.9,en-US;q=0.8,en;q=0.7',
        Cookie: `${cookie_jsession.name}=${cookie_jsession.value}; ${cookie_crowdtoken.name}=${cookie_crowdtoken.value}`,
    }
}

const requestQueue = await Apify.openRequestQueue();
await requestQueue.addRequest(new Apify.Request(startURL));
const pseudoUrls = [new Apify.PseudoUrl('https://mycompany/confluence/[.*]')];

const crawler = new Apify.PuppeteerCrawler({
    launchPuppeteerOptions: { headless: false, slowMo: 500 },
    requestQueue,
    handlePageFunction: async ({ request, page }) => {
        const title = await page.title();
        console.log(`Title of ${request.url}: ${title}`);
        console.log(await page.content());
        await Apify.utils.enqueueLinks({
            page,
            selector: 'a:not(.like-button)',
            pseudoUrls,
            requestQueue
        });
    },
    maxRequestsPerCrawl: 3,
    maxConcurrency: 10,
});
await crawler.run();
The by-foot login and cookie extraction seem to be OK (the "curlified" request works perfectly), but Confluence doesn't accept the login via Puppeteer / headless Chromium. It seems like the headers are getting lost somehow.
What am I doing wrong?
Without first going into the details of why the headers don't work, I would suggest defining a custom gotoFunction in the PuppeteerCrawler options, such as:
{
    // ...
    gotoFunction: async ({ request, page }) => {
        await page.setCookie(...cookies); // from page.cookies() earlier
        return page.goto(request.url, { timeout: 60000 });
    }
}
This way, you don't need to do the parsing and the cookies will automatically be injected into the browser before each page load.
As a note, modifying default request headers when using a headless browser is not a good practice, because it may lead to blocking on some sites that match received headers against a list of known browser fingerprints.
Update:
The below section is no longer relevant, because you can now use the Request class to override headers as expected.
The headers problem is a complex one involving request interception in Puppeteer. Here's the related GitHub issue in Apify SDK. Unfortunately, the method of overriding headers via a Request object currently does not work in PuppeteerCrawler, so that's why you were unsuccessful.

How to run an async function and set headers for each request in Apollo?

Let's say I want to get the Firebase auth token and set it on each and every request. To fetch the Firebase auth token I need to make an async call to the Firebase server; only when it completes do I get the token. I tried to set it as shown below, but Apollo sends the request before I get the token from Firebase. How can I fix this? How can I make Apollo wait?
export const client = new ApolloClient({
    uri: 'http://localhost:4000/',
    request: async operation => {
        await firebase.auth().onAuthStateChanged(async user => {
            if (user) {
                const token = await firebase.auth().currentUser.getIdToken(/* forceRefresh */ true);
                operation.setContext({
                    headers: {
                        authorization: token ? `Bearer ${token}` : ''
                    }
                });
            }
        });
    }
});
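Apollo does wait for the promise returned by the request function, but onAuthStateChanged registers a listener and returns an unsubscribe function rather than a promise, so the await resolves immediately and setContext runs too late. A sketch of one fix, assuming the apollo-boost style client shown above, is to wrap the listener in a Promise that resolves with the first emitted user:

// Resolve with the current user once Firebase has restored the auth state.
const getCurrentUser = () =>
    new Promise(resolve => {
        const unsubscribe = firebase.auth().onAuthStateChanged(user => {
            unsubscribe(); // we only need the first emission
            resolve(user);
        });
    });

export const client = new ApolloClient({
    uri: 'http://localhost:4000/',
    request: async operation => {
        const user = await getCurrentUser();
        if (user) {
            const token = await user.getIdToken(/* forceRefresh */ true);
            operation.setContext({
                headers: {
                    authorization: token ? `Bearer ${token}` : ''
                }
            });
        }
    }
});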

CORS error with API Gateway and Lambda **only** when using Proxy Integration

I am trying to add an item to DynamoDB upon a POST request from API Gateway, using Lambda.
This is what my Lambda code looks like:
var AWS = require('aws-sdk');
var dynamoDB = new AWS.DynamoDB();

exports.handler = (event, context, callback) => {
    var temp_id = "1";
    var temp_ts = Date.now().toString();
    var temp_calc = event['params']['calc'];

    var params = {
        TableName: "calc-store",
        Item: {
            Id: {
                S: temp_id
            },
            timestamp: {
                S: temp_ts
            },
            calc: {
                S: temp_calc
            }
        }
    };

    dynamoDB.putItem(params, callback);

    const response = {
        statusCode: 200,
        headers: {
            'content-type': 'application/json',
            'Access-Control-Allow-Origin': '*'
        },
        body: event['params']['calc']
    };
    callback(null, response);
};
This is how I am calling the function from my client:
axios.post(apiURL, { params: { calc: calc } })
    .then((res) => {
        console.log(res);
    })
I have enabled CORS over 30 times on my API Gateway, and I've also double-checked by adding headers to the response. But no matter what I do, I keep getting a CORS error, and in the response I can see that the Access-Control-Allow-Origin header is not being appended:
POST https://egezysuta5.execute-api.us-east-1.amazonaws.com/TEST 502
localhost/:1 Failed to load https://egezysuta5.execute-api.us-east-1.amazonaws.com/TEST:
No 'Access-Control-Allow-Origin' header is present on the requested resource.
Origin 'http://localhost:3000' is therefore not allowed access.
The response had HTTP status code 502.
createError.js:17 Uncaught (in promise) Error: Network Error
    at createError (createError.js:17)
    at XMLHttpRequest.handleError (xhr.js:87)
I tried not using Lambda proxy integration, and it worked then; however, I was then unable to access the params I passed.
EDIT: After spending hours on this, here is what I've boiled the problem down to. My client is making a successful pre-flight OPTIONS request. The OPTIONS response returns the correct CORS headers, but for some reason these are not applied to my POST request!
EDIT 2: (This does not solve the problem.) If I change the response body to a plain string, there is no error! There is something wrong with
event['params']['calc']
Your problem is with the flow of the code: you're not waiting for putItem to complete before callback gets executed, so the Lambda can return (or crash) before the response carrying the CORS headers is ever sent. Note also that with proxy integration the payload arrives as the JSON string event.body, not as event['params']. Try this:
dynamoDB.putItem(params, (err, data) => {
    if (err) {
        return callback(err, null);
    }
    const response = {
        statusCode: 200,
        headers: {
            'content-type': 'application/json',
            'Access-Control-Allow-Origin': '*'
        },
        body: JSON.parse(event.body).calc
    };
    return callback(null, response);
});
There are two issues going on here:
1. Your code is crashing because you are probably trying to access a null property in the event object.
2. Because your code fails before you can return the full response, the proper CORS headers don't get sent back to the browser.
Always try and catch errors in your Lambda code. Always log the error, and return the full response with a status code of 500 in the case of an error. Also, it's important to handle async functions like putItem with promises; really grasp that concept before working with JavaScript! A handler written along those lines is sketched below.
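A minimal sketch of that shape, assuming the same calc-store table and a proxy-integration event where the payload arrives as the JSON string event.body:

var AWS = require('aws-sdk');
var dynamoDB = new AWS.DynamoDB();

// Shared headers so both success and error responses pass the CORS check.
const corsHeaders = {
    'content-type': 'application/json',
    'Access-Control-Allow-Origin': '*'
};

exports.handler = async (event) => {
    try {
        const { calc } = JSON.parse(event.body); // proxy integration: body is a JSON string
        await dynamoDB.putItem({
            TableName: 'calc-store',
            Item: {
                Id: { S: '1' },
                timestamp: { S: Date.now().toString() },
                calc: { S: calc }
            }
        }).promise();
        return { statusCode: 200, headers: corsHeaders, body: JSON.stringify({ calc }) };
    } catch (err) {
        // Returning a full response (instead of letting the Lambda crash)
        // keeps the CORS headers on the error path too.
        console.error(err);
        return { statusCode: 500, headers: corsHeaders, body: JSON.stringify({ message: err.message }) };
    }
};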