How to set setTimeout/Thread.sleep in newman - postman

How do I set setTimeout/Thread.sleep in newman (Postman's Node module)?
I am using the function below:
setTimeout(function () {
    console.log('sleep for ten min');
}, 600000);
The function above works perfectly in Postman's collection runner.
But when I tried newman, it throws the error:
'setTimeout is not available inside sandbox and has no side-effect.'
I have found a similar thread:
https://github.com/postmanlabs/newman/issues/304
but it doesn't provide a solution either.
Is there any way to delay a single API call for a given time period?
I am already using the newman parameter --delay-request 60000, but that only adds a delay between APIs, so it doesn't solve this.
Any solution would be helpful.

Update newman to 3.8.3 or later.
Older versions of newman do not support setTimeout.

Opening the link you've given us, they say:
'So to be clear, you are going to use newman a way it's not designed for.'
That being said, you can try to implement a custom sleep:
function sleep(milliseconds) {
    const start = Date.now();
    // Busy-wait until the requested time has elapsed (this blocks the thread)
    while (Date.now() - start < milliseconds);
}
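As a quick sanity check outside Postman, the busy-wait behaves like this (the 200 ms value is just for demonstration; the question's ten-minute pause would be 600000):

```javascript
// Busy-wait sleep: spins the CPU until the requested time has elapsed.
// Note that this blocks the sandbox thread entirely while it runs.
function sleep(milliseconds) {
    const start = Date.now();
    while (Date.now() - start < milliseconds);
}

const before = Date.now();
sleep(200); // short delay for demonstration
const elapsed = Date.now() - before;
console.log('slept for about ' + elapsed + ' ms');
```

Because the loop never yields, nothing else in the sandbox runs during the wait, which is why this is a workaround rather than a supported feature.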


AWS Amplify federated Google login works properly in the browser but doesn't work on Android

The issue: when I run federated authentication via the Amplify auth method in the browser, it works fine, but not when I run it on my mobile.
It throws the error 'No user found' when I use Auth.currentSession(), while the same call works in the browser.
I tried to search for this type of issue but only found results related to ionic-cordova-google-plugin, not to the AWS Amplify federated login issue.
Updating the question after it was closed for too little debugging information, without anyone asking for more.
These are issues raised on GitHub matching my problem:
Issue 5351 on amplify-js, which is still open:
https://github.com/aws-amplify/amplify-js/issues/5351
Another issue, 3537, which is also still open.
These two issues have the same scenario as mine. I hope this is enough debugging information; if more is required, please ask in a comment instead of closing without notification, which is discouraging for a beginner rather than helpful.
I fixed the above problem by referring to a comment describing a workaround.
Link that takes you directly to that comment: link to comment.
First read that comment, as it gives you an overall idea of what exactly the issue is, instead of jumping directly to the solution.
Once you have read it, you may still be a little unclear about the implementation, since the author uses Capacitor and not everyone does.
In my implementation I ignored this part, as I am not using Capacitor:
App.addListener('appUrlOpen')
Now let's get to the main step, where we fix the issue. I am using deep links to redirect to my application:
this.platform.ready().then(() => {
    this.deeplinks
        .route({
            "/success.html": "success",
            "/logout.html": "logout",
        })
        .subscribe(
            (match: any) => {
                const fragment = JSON.stringify(match).split('"fragment":"')[1];
                // This link can be any link, based on your requirement.
                // I pass along all the data I receive in the fragment:
                // it consists of id_token, stage, code and response type,
                // which need to be passed to Ionic for Amplify to run its magic.
                document.location.href = `http://192.168.1.162:8100/#${fragment}`;
            },
            (nomatch) => {
                console.log("Got a deeplink that didn't match", nomatch);
            }
        );
});
I got this idea from the issue in which the developer mentioned sending code and state along with the application's deep-link URL.
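As an aside, the string-splitting above leaves any JSON after the fragment value (such as a trailing "}) attached to the result. A less fragile sketch, assuming the match object carries the fragment as a plain property (the sample object below is hypothetical):

```javascript
// Extract the "fragment" value from a deeplink match object by reading
// the parsed object rather than splitting its JSON serialization.
function getFragment(match) {
    // Round-trip through JSON to drop any non-serializable internals,
    // then read the fragment property directly.
    const parsed = JSON.parse(JSON.stringify(match));
    return parsed && typeof parsed.fragment === 'string' ? parsed.fragment : undefined;
}

const sample = { $route: 'success', fragment: 'id_token=abc&state=xyz' };
console.log(getFragment(sample)); // → id_token=abc&state=xyz
```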

How to deploy large nodejs package to AWS Lambda?

I am trying to deploy a simple script to AWS Lambda that would generate critical CSS for a website. Running this serverless seems to make sense (but I cannot find any working examples).
The problem is the package size. I am trying to use https://github.com/pocketjoso/penthouse. When I simply npm install penthouse, the package size is suddenly over 300 MB. The size limit on Lambda is only 250 MB, so it will not upload.
Is there any way to solve this? Perhaps downloading penthouse on the fly? If so, is there any example?
Performance is not critical in this case, as it would be called only a few times a day by an automated process.
Looking at the bundle size of the package (https://bundlephobia.com/result?p=penthouse), your issue doesn't appear to lie primarily with the penthouse package. Although I cannot say for certain, I think it's mainly down to the size of your other dependencies.
Nevertheless, since this isn't a critical system and will only be accessed a few times a day by an automated process, you can reduce the size of your node_modules folder by loading dependencies from a CDN.
There are a number of services that allow you to do this; I have primarily used UNPKG and jsDelivr in the past, as they appear to be reliable with minimal-to-no downtime.
Your question lacks detail about which technology you're specifically using and how far you can go to achieve your desired result, but there are a few options you can choose from:
Utilise webpack's externals configuration:
https://webpack.js.org/configuration/externals/
Use a CDN library loader such as: https://www.npmjs.com/package/import-cdn-js
Or https://www.npmjs.com/package/from-cdn
loadjs is another option: https://github.com/muicss/loadjs
scriptjs https://www.npmjs.com/package/scriptjs
I don't know much about penthouse, but with scriptjs I assume you can achieve something like this:
var fs = require("fs");
var penthouseScript = require("scriptjs");
penthouseScript("https://cdn.jsdelivr.net/npm/penthouse@2.2.2/lib/index.min.js", () => {
    // penthouse-related code
    penthouse({
        url: 'http://google.com',
        cssString: 'body { color: red }'
    })
    .then(criticalCss => {
        // use the critical CSS
        fs.writeFileSync('outfile.css', criticalCss);
    });
});

Running a request in Postman multiple times with different data only runs once

I am new to Postman and running into a recurring issue that I can't figure out.
I am trying to run the same request multiple times using an array of data established in the Pre-request Script; however, when I go to the runner, the request only runs once rather than three times.
Pre-request script:
var uuids = pm.environment.get("uuids");
if (!uuids) {
    uuids = ["1eb253c6-8784", "d3fb3ab3-4c57", "d3fb3ab3-4c78"];
}
var currentuuid = uuids.shift();
pm.environment.set("uuid", currentuuid);
pm.environment.set("uuids", uuids);
Tests:
var uuids = pm.environment.get("uuids");
if (uuids && uuids.length > 0) {
    postman.setNextRequest('myurl/?userid={{uuid}}');
} else {
    postman.setNextRequest();
}
I have looked over the relevant documentation and cannot find what is wrong with my code.
Thanks!
A Pre-request script is not a good way to test an API with different data. Better to use the Postman runner for this.
First, prepare a request in Postman with a data variable, e.g.
Then click the Runner tab.
Prepare a CSV file with the data:
uuids
1eb253c6-8784
d3fb3ab3-4c57
d3fb3ab3-4c78
Provide it as the data file, and run the sample.
This lets you run the same API multiple times with different data and check your test cases.
You are so close! The issue is that you are not unsetting your environment variable uuids, so it is an empty list at the start of each run. Simply add pm.environment.unset("uuids") to your exit branch and it should run all three times. Also specify that your next request should stop the execution by setting it to null.
So your new "Tests" will become:
var uuids = pm.environment.get("uuids");
if (uuids && uuids.length > 0) {
    postman.setNextRequest('myurl/?userid={{uuid}}');
} else {
    postman.setNextRequest(null);
    pm.environment.unset("uuids");
}
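To see why the unset matters, here is a plain-Node sketch of the mechanism; the Map-based env below is a hypothetical stand-in for pm.environment, for illustration only:

```javascript
// Simulate three iterations of the request: the Pre-request script shifts
// one uuid per iteration, and the Tests script unsets the list at the end
// so the next collection run starts with a fresh list.
const env = new Map();

const requested = [];
let running = true;
while (running) {
    // Pre-request script
    let uuids = env.get('uuids');
    if (!uuids) {
        uuids = ['1eb253c6-8784', 'd3fb3ab3-4c57', 'd3fb3ab3-4c78'];
    }
    const currentuuid = uuids.shift();
    env.set('uuid', currentuuid);
    env.set('uuids', uuids);
    requested.push(currentuuid);
    // Tests script
    if (uuids.length > 0) {
        // postman.setNextRequest(...) would queue the same request again
    } else {
        env.delete('uuids'); // the pm.environment.unset("uuids") step
        running = false;     // postman.setNextRequest(null) stops the run
    }
}
console.log(requested); // → [ '1eb253c6-8784', 'd3fb3ab3-4c57', 'd3fb3ab3-4c78' ]
```

Without the delete, the second collection run would start with a leftover empty array, the falsy check would never refill it, and the run would stop after one request.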
It seems as though the Runner tab has been removed now?
For generating 'real' data, I found this video a great help: Creating A Runner in Postman - API Testing.
Sending 1000 responses to the db to simulate real usage has saved a lot of time!

Postman scripts : "pm is not defined"

I am trying to write a Pre-request Script in Postman. I want to make a request, so I try to use pm.sendRequest. For example:
pm.sendRequest('http://example.com', function (err, res) {
    // ...
});
But I get this error :
There was an error in evaluating the Pre-request Script: pm is not defined
I'm on Windows 10. I just updated the extension.
How do I access pm?
A member posted an answer but, for some reason, I think he got banned. His answer didn't have a lot of detail, but it worked:
You have to use the standalone version of Postman for pm to be accessible. I was using it as a Chrome extension; after switching to the standalone version, it worked. I don't know why, though.
If you are running an old version of Postman, you may run into this issue as I did. From https://www.getpostman.com/docs/v6/postman/scripts/test_examples#older-style-of-writing-postman-tests:
The older style of writing Postman tests relies on setting values for the special tests object.
If you are running an older version of Postman, you will not have the pm object available for use.
For the extension version, you can use the following pattern:
tests["Status code is 200"] = responseCode.code === 200;
If you are using the desktop version of Postman (in my case the MacBook desktop version), then:
use responseBody instead of pm.response.text()
to set a global variable, use postman.setGlobalVariable() instead of pm.environment.set()
Sample:
var jsonData = JSON.parse(responseBody);
postman.setGlobalVariable("studentId", jsonData.studentId);
You can use SNIPPETS to auto-generate basic things like the above and avoid these errors.
My Postman version detail:
Postman for Chrome
Version 5.5.6
OS X / arm64
Chrome 108.0.0.0
Replace pm with postman and it should work fine.

Azure Queue Storage Triggered-Webjob - Dequeue Multiple Messages at a time

I have an Azure Queue Storage-triggered WebJob. My webjob indexes the data into Azure Search. Best practice for Azure Search is to index multiple items together instead of one at a time, for performance reasons (indexing can take some time to complete).
For this reason, I would like my webjob to dequeue multiple messages together so I can loop through them, process them, and then index them all together into Azure Search.
However, I can't figure out how to get my webjob to dequeue more than one message at a time. How can this be accomplished?
For this reason, I would like for my webjob to dequeue multiple messages together so I can loop through, process them, and then index them all together into Azure Search.
According to your description, I suggest you try Microsoft.Azure.WebJobs.Extensions.GroupQueueTrigger to achieve your requirement.
This extension enables you to trigger functions and receive a group of messages instead of a single message as with [QueueTrigger].
For more details, refer to the code sample and article below.
Install:
Install-Package Microsoft.Azure.WebJobs.Extensions.GroupQueueTrigger
Program.cs:
static void Main()
{
    var config = new JobHostConfiguration
    {
        StorageConnectionString = "...",
        DashboardConnectionString = "...."
    };
    config.UseGroupQueueTriggers();
    var host = new JobHost(config);
    host.RunAndBlock();
}
Function.cs:
// Receive 10 messages at a time
public static void MyFunction([GroupQueueTrigger("queue3", 10)] List<string> messages)
{
    foreach (var item in messages)
    {
        Console.WriteLine(item);
    }
}
How would I get that changed to a GroupQueueTrigger? Is it an easy change?
In my opinion, it is an easy change.
You could follow the steps below:
1. Install the package Microsoft.Azure.WebJobs.Extensions.GroupQueueTrigger from the NuGet package manager.
2. Change the Program.cs file to enable UseGroupQueueTriggers.
3. Change the webjob functions according to your old triggered function.
Note: the group queue message trigger must use a List.
As my code sample shows:
public static void MyFunction([GroupQueueTrigger("queue3", 10)] List<string> messages)
This function will get 10 messages from "queue3" at a time, so in it you can loop over the list of messages, process them, and then index them all together into Azure Search.
4. Publish your webjobs to Azure Web Apps.