Missing request headers in Puppeteer - cookies

I want to read the request cookie during a test written with Puppeteer. But I noticed that most of the requests I inspect only have referer and user-agent headers. If I look at the same requests in Chrome DevTools, they have many more headers, including Cookie. To check it out, copy-paste the code below into https://try-puppeteer.appspot.com/.
const browser = await puppeteer.launch();
const page = await browser.newPage();
page.on('request', function(request) {
  console.log(JSON.stringify(request.headers, null, 2));
});
await page.goto('https://google.com/', {waitUntil: 'networkidle'});
await browser.close();
Is there a restriction on which request headers you can and cannot access? Is this a limitation of Chrome itself, or of Puppeteer?
Thanks for any suggestions!

I also saw this when I was trying to use Puppeteer to test some CORS behaviour - I found the Origin header was missing from some requests.
Having a look around the GitHub issues, I found one mentioning that Puppeteer does not listen to the Network.responseReceivedExtraInfo event of the underlying Chrome DevTools Protocol; this event provides extra response headers that are not available to the Network.responseReceived event. There is also a similar Network.requestWillBeSentExtraInfo event for requests.
Hooking up to these events seemed to get me all the headers I needed. Here is some sample code which captures the data from all these events and merges it into a single object keyed by request ID:
// Setup.
const browser = await puppeteer.launch()
const page = await browser.newPage()
const cdpRequestDataRaw = await setupLoggingOfAllNetworkData(page)
// Make requests.
await page.goto('http://google.com/')
// Log captured request data.
console.log(JSON.stringify(cdpRequestDataRaw, null, 2))
await browser.close()
// Returns map of request ID to raw CDP request data. This will be populated as requests are made.
async function setupLoggingOfAllNetworkData(page) {
  const cdpSession = await page.target().createCDPSession()
  await cdpSession.send('Network.enable')
  const cdpRequestDataRaw = {}
  const addCDPRequestDataListener = (eventName) => {
    cdpSession.on(eventName, request => {
      cdpRequestDataRaw[request.requestId] = cdpRequestDataRaw[request.requestId] || {}
      Object.assign(cdpRequestDataRaw[request.requestId], { [eventName]: request })
    })
  }
  addCDPRequestDataListener('Network.requestWillBeSent')
  addCDPRequestDataListener('Network.requestWillBeSentExtraInfo')
  addCDPRequestDataListener('Network.responseReceived')
  addCDPRequestDataListener('Network.responseReceivedExtraInfo')
  return cdpRequestDataRaw
}
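For the original question (reading the request cookie), the Cookie header usually shows up in the requestWillBeSentExtraInfo data. A minimal sketch of pulling it out of the captured object once the page has loaded (header casing can vary, so both spellings are checked):
// Sketch: look for Cookie request headers in the captured CDP data.
for (const [requestId, data] of Object.entries(cdpRequestDataRaw)) {
  const extraInfo = data['Network.requestWillBeSentExtraInfo']
  const cookieHeader = extraInfo && extraInfo.headers &&
    (extraInfo.headers.cookie || extraInfo.headers.Cookie)
  if (cookieHeader) {
    console.log(requestId, cookieHeader)
  }
}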

That's because your browser sets a bunch of headers depending on its settings and capabilities, and also includes, for example, the cookies it has stored locally for the specific page.
If you want to add additional headers, you can use methods such as:
page.setExtraHTTPHeaders (see the Puppeteer API docs)
page.setUserAgent (see the Puppeteer API docs)
page.setCookie (see the Puppeteer API docs)
With these you can mimic the extra headers that you see your Chrome browser sending.
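For illustration, a small sketch of those calls; the user agent, header, and cookie values below are made-up examples, not anything Puppeteer requires:
const browser = await puppeteer.launch();
const page = await browser.newPage();
// Example values only: mimic the headers/cookies you saw Chrome send.
await page.setUserAgent('Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36');
await page.setExtraHTTPHeaders({'Accept-Language': 'en-US,en;q=0.9'});
await page.setCookie({name: 'session', value: 'abc123', domain: 'example.com'});
await page.goto('https://example.com/');
await browser.close();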

Postman: Set a request header from the output of a program

I need to make requests to an API that accepts authentication tokens, and I want to use a dynamically generated token by running cmd.exe /c GenerateToken.bat, instead of having to run my program and then manually paste the value into Postman every time.
How can I set the value of an HTTP header to contain the stdout output of a program or a batch file?
The short answer is: you can't. This is deliberate; both pre-request and test scripts (the only way, other than the collection runner, to make your environment dynamic) run in the Postman sandbox, which has limited functionality.
More information on what is available is in the postman-sandbox GitHub repository and in the Postman docs (scroll to the bottom to see which libraries you can import).
You do have a few options, as described in the comments - Postman allows sending requests and parsing the response in scripts, so you can automate this way. You do need a server to handle the requests and execute your script (the simplest option is probably a small server supporting CGI - I won't detail it here, as I feel it's too big a scope for this answer; other options are also available, such as a small PHP or Node server).
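As a rough illustration of the Node option, a minimal helper server that runs GenerateToken.bat and returns its stdout; the port and the use of cmd.exe are assumptions based on the question:
// Hypothetical token server: runs the batch file and replies with its stdout.
const http = require('http');
const { execFile } = require('child_process');

http.createServer((req, res) => {
  execFile('cmd.exe', ['/c', 'GenerateToken.bat'], (err, stdout) => {
    if (err) {
      res.writeHead(500);
      res.end('Token generation failed');
      return;
    }
    res.writeHead(200, {'Content-Type': 'text/plain'});
    res.end(stdout.trim());
  });
}).listen(3000);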
Once you do have a server, the pre-request script is very simple:
const requestOptions = {
  url: `your_server_endpoint`,
  method: 'GET'
}
pm.sendRequest(requestOptions, function (err, res) {
  if (err) {
    throw new Error(err);
  } else if (res.code != 200) {
    throw new Error(`Non-200 response when fetching token: ${res.code} ${res.status}`);
  } else {
    var token = res.text();
    pm.environment.set("my_token", token);
  }
});
You can then set the header as {{my_token}} in the "Headers" tab, and it will be updated once the script runs.
You can do something similar to this from Pre-request Scripts at the collection level.
This is available in Postman for 9 different authorization and authentication methods.
Here is a sample, taken from this article, that shows how to do this in a Pre-request Script for OAuth2:
// Refresh the OAuth token if necessary
var tokenDate = new Date(2010, 1, 1);
var tokenTimestamp = pm.environment.get("OAuth_Timestamp");
if (tokenTimestamp) {
  tokenDate = Date.parse(tokenTimestamp);
}
var expiresInTime = pm.environment.get("ExpiresInTime");
if (!expiresInTime) {
  expiresInTime = 300000; // Set default expiration time to 5 minutes
}
if ((new Date() - tokenDate) >= expiresInTime) {
  pm.sendRequest({
    url: pm.variables.get("Auth_Url"),
    method: 'POST',
    header: {
      'Accept': 'application/json',
      'Content-Type': 'application/x-www-form-urlencoded',
      'Authorization': pm.variables.get("Basic_Auth")
    }
  }, function (err, res) {
    pm.environment.set("OAuth_Token", res.json().access_token);
    pm.environment.set("OAuth_Timestamp", new Date());
    // Set the ExpiresInTime variable to the time given in the response if it exists
    if (res.json().expires_in) {
      expiresInTime = res.json().expires_in * 1000;
    }
    pm.environment.set("ExpiresInTime", expiresInTime);
  });
}

Virtual Hosting on Next.js app with Apollo GraphQL

I have a web app built with Next.js and Apollo, as shown in the with-apollo example. I want to serve multiple domains with my web app (name-based virtual hosting). Unfortunately, the HttpLink of ApolloClient requires an absolute server URL including the domain, but this makes the backend unable to recognize which domain the user actually visited. Is there a way to configure HttpLink with a dynamic URL based on the real request, or to use a relative URL, or anything else?
Either use an Apollo Link to intercept the query and set the uri property on the context:
// refreshToken() and getURI() are placeholders from the linked answer's context.
const authMiddleware = setContext((operation, { uri }) => {
  return refreshToken().then(res => ({
    uri: this.getURI()
  }))
})
Or intercept the request with Angular's HttpClient interceptor and change the endpoint.
https://github.com/apollographql/apollo-angular/tree/master/packages/apollo-angular-link-http#options
Source: Updating uri of apollo client instance
The NextPageContext object passed to getInitialProps includes the req object when called on the server side, so you can do something like:
WithApollo.getInitialProps = async ctx => {
  const { AppTree, req } = ctx
  const linkBaseUrl = req ? req.protocol + '://' + req.get('host') : ''
  ...
}
You can then pass this base URL down to createApolloClient along with the initial state and prepend it to your HttpLink's URI. On the client side this prepends an empty string (you only need the full URL on the server).
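As a rough sketch of that last step (the createApolloClient shape and the /api/graphql endpoint here are assumptions, not the exact with-apollo code):
import { ApolloClient, InMemoryCache, HttpLink } from '@apollo/client'

// Sketch: build the client with a base URL that is absolute on the server and empty on the client.
function createApolloClient(initialState = {}, linkBaseUrl = '') {
  return new ApolloClient({
    ssrMode: typeof window === 'undefined',
    cache: new InMemoryCache().restore(initialState),
    link: new HttpLink({
      // Server: 'https://visited-domain.example/api/graphql'; client: '/api/graphql' (relative).
      uri: `${linkBaseUrl}/api/graphql`,
      credentials: 'same-origin',
    }),
  })
}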

Password protect s3 bucket with lambda function in aws

I added website authentication for an S3 bucket using a Lambda function, and then connected the Lambda function to CloudFront using the behavior settings in the distribution settings. It worked fine and added authentication (like htaccess authentication on a simple server). Now I want to change the password for my website authentication. To do that, I updated the password and published a new version of the Lambda function, and then created a new invalidation in the distribution settings to clear the cache. But it didn't work, and the website authentication password didn't change. Below is my Lambda function code that adds the authentication.
'use strict';
exports.handler = (event, context, callback) => {
  // Get request and request headers
  const request = event.Records[0].cf.request;
  const headers = request.headers;

  // Configure authentication
  const authUser = 'user';
  const authPass = 'pass';

  // Construct the Basic Auth string
  const authString = 'Basic ' + new Buffer(authUser + ':' + authPass).toString('base64');

  // Require Basic authentication
  if (typeof headers.authorization == 'undefined' || headers.authorization[0].value != authString) {
    const body = 'Unauthorized';
    const response = {
      status: '401',
      statusDescription: 'Unauthorized',
      body: body,
      headers: {
        'www-authenticate': [{key: 'WWW-Authenticate', value: 'Basic'}]
      },
    };
    callback(null, response);
    // Stop here so the original request is not also forwarded.
    return;
  }

  // Continue request processing if authentication passed
  callback(null, request);
};
Can anyone please help me solve this problem? Thanks in advance.
In the Lambda function view, after you save your changes (using Firefox may be safer, see below for why), you will see a menu item under Configuration -> Designer -> CloudFront.
After you deploy, you can publish your change to the CloudFront distribution. Once you publish it, the CloudFront distribution automatically starts redeploying, which you can watch in the CloudFront console.
Also, I would prefer using "Viewer Request" as the CloudFront trigger event (I'm not sure which one you are using), as this avoids CloudFront caching. On top of this, Chrome sometimes fails to save changes to Lambda functions; there seems to be a bug in the AWS console, so try Firefox just to be safe when editing Lambda functions.
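If you prefer scripting the rollout, a rough sketch with the AWS SDK for JavaScript (the function name is a placeholder; Lambda@Edge functions live in us-east-1):
// Sketch: publish a new version of the edge function after changing the password.
// The CloudFront behavior must then be pointed at the new qualified ARN and the
// distribution redeployed; invalidating the cache alone is not enough.
const AWS = require('aws-sdk');
const lambda = new AWS.Lambda({ region: 'us-east-1' });

lambda.publishVersion({ FunctionName: 'basic-auth-at-edge' })
  .promise()
  .then(data => console.log('New version ARN:', data.FunctionArn))
  .catch(console.error);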

Dart BrowserClient POST not including my cookies

I'm doing a BrowserClient POST across domains and don't see my cookies being included.
The response to my first request sets cookies (screenshots omitted here), but when I send another POST request, I don't see those cookies being included. Going straight to the test page in the browser, I can see the cookies being included.
The Dart code I use to make the POST:
var client = new BrowserClient();
client.post(url, body: request, headers: {"Content-Type": "application/json", "Access-Control-Allow-Credentials": "true"}).then((res) {
  if (res.statusCode == 200) {
    var response = JSON.decode(res.body);
    callback(response);
  } else {
    print(res.body);
    print(res.reasonPhrase);
  }
}).whenComplete(() {
  client.close();
});
I'm not sure about the Access-Control-Allow-Credentials header I'm including; with or without it, nothing changes.
Am I missing headers on the server side that need to be set on the response, or is Dartium blocking cross-domain cookies?
More details on Information Security and the reasoning behind setting cookies via the server.
Update: Enhancement request logged: https://code.google.com/p/dart/issues/detail?id=23088
Update: The enhancement has been implemented; one should now be able to do var client = new BrowserClient()..withCredentials = true; based on
https://github.com/dart-lang/http/commit/9d76e5e3c08e526b12d545517860c092e089a313
For cookies to be sent on CORS requests, you need to set withCredentials = true. The browser client in the http package doesn't support this argument, but you can use HttpRequest from dart:html instead.
See How to use dart-protobuf for an example.

Google Apps Script and cookies

I am trying to POST and get a cookie. I am a newbie and this is a learning project for me. My impression is that if you use 'set-cookie', you should be able to see an additional 'set-cookie' in the .toSource() output. (I am trying to accomplish this on a Google Apps site, if that makes a difference.) Am I missing something? Here is my code:
function setGetCookies() {
  var payload = {'set-cookie': 'test'};
  var opt2 = {'headers': payload, "method": "post"};
  UrlFetchApp.fetch("https://sites.google.com/a/example.com/blacksmith", opt2);
  var response = UrlFetchApp.fetch("https://sites.google.com/a/example.com/blacksmith");
  var openId = response.getAllHeaders().toSource();
  Logger.log(openId);
  var AllHeaders = response.getAllHeaders();
  for (var prop in AllHeaders) {
    if (prop.toLowerCase() == "set-cookie") {
      // if there's only one cookie, convert it into an array:
      var myArray = [];
      if (Array.isArray(AllHeaders[prop])) {
        myArray = AllHeaders[prop];
      } else {
        myArray[0] = AllHeaders[prop];
      }
      // now process the cookies
      myArray.forEach(function(cookie) {
        Logger.log(cookie);
      });
      break;
    }
  }
}
Thanks in advance! I referenced this to develop the code: Cookie handling in Google Apps Script - How to send cookies in header?
Open to any advice.
When you aren't logged in, Google Sites won't set any cookies in the response. UrlFetchApp doesn't pass along your Google cookies, so it will behave as if you are logged out.
First, the cookie you want to send, whose name is 'test', does not have a value. You should send 'test=somevalue'.
Second, I am wondering if you are trying to send the cookie to the Google Sites server and ask it to reply with the same cookie you previously sent...?
It sounds like you are trying to act as an HTTP server, when you are actually an HTTP client.
As an HTTP client, your role is only to send back any cookies that the HTTP server has previously sent to you (respecting the domain, expiration, and other params).
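As an illustration of that client role, here is a minimal Apps Script sketch that captures a Set-Cookie header from one response and sends it back as a Cookie request header on the next request (the URL is a placeholder, and this only does something if the server actually sets a cookie):
function resendServerCookie() {
  // First request: see whether the server sets any cookies.
  var first = UrlFetchApp.fetch('https://example.com/');
  var setCookie = first.getAllHeaders()['Set-Cookie'];
  if (!setCookie) {
    Logger.log('No Set-Cookie header in the response');
    return;
  }
  // Set-Cookie may be a string or an array; keep only the name=value part of each cookie.
  var cookies = (Array.isArray(setCookie) ? setCookie : [setCookie])
      .map(function(c) { return c.split(';')[0]; })
      .join('; ');
  // Second request: send the cookies back in a Cookie request header
  // ('Set-Cookie' is a response header, 'Cookie' is the request header).
  var second = UrlFetchApp.fetch('https://example.com/', {headers: {'Cookie': cookies}});
  Logger.log(second.getResponseCode());
}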