Postman allows you to generate random dummy data by using pre-defined variables; for example, this one would be replaced by a random company name:
{{$randomCompanyName}}
Using a pre-defined variable multiple times returns a different value per request.
The question is how to save the generated value to a variable for further use, for example in tests, something like this (which doesn't work):
pm.variables.set("company", {{$randomCompanyName}});
Thanks.
You can use the .replaceIn() function with that {{...}} syntax in the sandbox.
pm.globals.set("company", pm.variables.replaceIn('{{$randomCompanyName}}'));
I've used a global variable to store the value as you would want to use it again. You could also use either the environment or collectionVariables scope to do the same thing.
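As a minimal sketch of how the stored value could then be reused (the "company" variable name is just the example from above; the test itself is only illustrative):
// Pre-request script: resolve the dynamic variable once and store the result
const company = pm.variables.replaceIn('{{$randomCompanyName}}');
pm.globals.set("company", company);

// Tests script of the same (or a later) request: read the stored value back
pm.test("company name was stored and reused", function () {
    const saved = pm.globals.get("company");
    pm.expect(saved).to.be.a("string").and.to.not.be.empty;
});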
I have a scenario where I have some JSON ("lldp" below), and I need to find a particular key and pull all of its values. The key I need to pull is dynamic and is identified by the 'thisPort' variable.
The lldp data basically looks like this. Note how the ports are not within a list. Any given instance of the lldp data may contain anywhere between 1 and 48 ports.
lldp = {
    "port1": {"stuff": "things"},
    "port2": {"stuff": "things"},
    "port40": {"stuff": "things"}
}
I assumed I could do something like "lldp.thisPort" to access the keys and values within, however this produces unhelpful errors and doesn't work. In this case I passed it three different 'thisPort' variables from a list, so presumably it's the same problem three times, and not three different problems.
'thisPort' does correctly come across to the Evaluate function as a string that should lead to a valid JSON path. E.g., 'lldp.thisPort' does seem to translate to a valid path like 'lldp.port1', but Evaluate doesn't seem to agree and I get an error.
Using variables (or any other 'dynamic' way of working), how can you access the keys/values within some JSON as part of a Postman Flow, when the path to the thing you're trying to pull is dynamic?
You can use $lookup(lldp, thisPort) inside the Evaluate block to get the values inside the thisPort object.
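As an illustration, assuming thisPort resolves to the string "port1": with the sample data above, $lookup(lldp, thisPort) evaluates to {"stuff": "things"}. FQL, the language used in the Evaluate block, is based on JSONata, where $lookup(object, key) returns the value stored under that key; a dotted path like lldp.thisPort, by contrast, looks for a key literally named "thisPort", which is why it fails here.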
I know that we can set a variable in different scopes via a pre-request script, but can we set one for a single "execution" or "run" of a collection?
I have a folder that contains two requests to validate a scenario where the first one will create a resource with a unique id and the second one will fail by trying to create a resource with the same unique id.
I would like to generate that unique value each time the collection is run. At the moment I use a collection variable that I test and set when not present, but that variable is kept between each "retry".
Can I create a variable that will be the same only for one execution of a collection?
Thanks
I have similar cases, where I store the values in Environment variables and then unset them in the Pre-request script of the first request:
pm.environment.unset("myVariable");
So, my solution is the same as the one suggested by @so cal cheesehead.
I create a variable in either the folder's pre-request script or the first request's script, and unset it after the last test in the last request.
The sad part is that the initialization and destruction of this variable are spread across different scripts.
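A minimal sketch of that pattern, assuming a collection-scoped variable named "runId" (the name is just an example):
// Pre-request script of the first request (or the folder): create the per-run value
// only if it isn't there yet, so every request in the run sees the same id
if (!pm.collectionVariables.get("runId")) {
    pm.collectionVariables.set("runId", pm.variables.replaceIn('{{$randomUUID}}'));
}

// Tests script of the last request: remove it so the next run generates a fresh one
pm.collectionVariables.unset("runId");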
I'm using Postman and wondering if I can use a stored JSON object to create variables for additional calls. For example, I saved an array which includes names and IDs:
[{"id":28,"name":"Action"},{"id":12,"name":"Adventure"},
{"id":16,"name":"Animation"},{"id":35,"name":"Comedy"},
{"id":80,"name":"Crime"},{"id":99,"name":"Documentary"},
{"id":18,"name":"Drama"},{"id":10751,"name":"Family"},
{"id":14,"name":"Fantasy"},{"id":36,"name":"History"},
{"id":27,"name":"Horror"},{"id":10402,"name":"Music"},
{"id":9648,"name":"Mystery"},{"id":10749,"name":"Romance"},
{"id":878,"name":"Science Fiction"},{"id":10770,"name":"TV Movie"},
{"id":53,"name":"Thriller"},{"id":10752,"name":"War"},
{"id":37,"name":"Western"}]
I'm triggering another API (second call) that returns only IDs, so the response looks like this: "genre_ids": [35, 10402]
Is there a way to create an environment variable that takes the IDs from the second call, looks up the relevant names in the stored array, and builds a name-oriented value, so in the case above 35=comedy and 10402=music and the variable will be: comedy,music?
To save an environment variable you can do the following (see the snippets on the right side of Postman):
postman.setEnvironmentVariable("variable_key", "variable_value");
If you want to save a global variable, just do:
postman.setGlobalVariable("variable_key", "variable_value");
and then use them as you want.
Alexandre
It's definitely possible with a "Tests" (post-response) script.
But since you have two requests (and Postman forces these to be two separate requests), you should save the response from the first request into a variable with setEnvironmentVariable or setGlobalVariable, and then, after getting the response for the second request, parse the stored first response and iterate over it, looking up each given id.
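A sketch of what that could look like, assuming the array from the first call is stored in an environment variable named "genres" (the variable names here are assumptions; pm.environment.set/get is the current equivalent of setEnvironmentVariable):
// Tests script of the first request: store the genre array as a string
pm.environment.set("genres", JSON.stringify(pm.response.json()));

// Tests script of the second request: map the returned ids to names
const genres = JSON.parse(pm.environment.get("genres"));
const genreIds = pm.response.json().genre_ids;              // e.g. [35, 10402]

const names = genreIds
    .map(id => (genres.find(g => g.id === id) || {}).name)  // 35 -> "Comedy", 10402 -> "Music"
    .filter(Boolean)
    .join(",");

pm.environment.set("genre_names", names);                   // "Comedy,Music"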
I have two tables in different databases. Table A holds the data; table B holds information for the incremental load of the data from table A. I want to read from table B and store the date of the last successful load of table A in a mapping variable $$LOAD_DATE. To achieve this, I read a date from table B and use the SETVARIABLE() function in an expression to set the $$LOAD_DATE variable. The port in which I do this is marked as output and writes into a dummy flat file. I read only one row of this source!
Then I use this $$LOAD_DATE variable in the Source Filter of the Source Qualifier of table A to only load new records which are younger than the date stored in the $$LOAD_DATE variable.
My problem is that I am not able to set the $$LOAD_DATE variable correctly. It is always the date 1753-1-1-00.00.00, which is the default value for mapping variables of the type date/time.
How do I solve this? How can I store a date in that variable and use it later in a Source Qualifiers source filter? Is it even possible?
EDIT: Table A has too many records to read them all and filter them later. This would be too expensive, so they have to be filtered at source-filter level.
Yes, it's possible.
In the first mapping you have to initialize the variable.
In the first session's configuration you have to define the Post-session on success variable assignment.
The second mapping (with your table A) will then receive the value through the Pre-session variable assignment configured on its session.
It will work.
It is not possible to set a mapping variable and use its value somewhere else in the same run, because the variable is actually set only when the session completes.
If you really want to implement it using mapping variables you have to create two mappings: one for setting the mapping variable and another for the actual incremental load. You can pass a mapping variable value from one session to another in a workflow using a workflow variable. https://stackoverflow.com/a/26849639/2626813
Other solutions could be to use a lookup on B and a filter after that.
You can also write a script to query table B and modify the parameter file with the latest $$LOAD_DATE value prior to executing the mapping.
Since there are two different DBs, use two sessions: get the value in the first one and pass it as a parameter to the second one.
My workflow generates three files (header, detail, trailer) which I combine via a post-session command. There are two variables set in my mapping which I want to use in the post-session command, like so:
cat header1.out detail1.out trailer1.out > OUTPUT_$(date +%Y%m%d)_$$VAR1_$$VAR2.dat
But this doesn't work and the values are empty, so I get OUTPUT_20151117__.dat.
I've tried creating workflow variables and assigning them via pre-session variable assignment, but this doesn't work either.
What am I missing? Or was this never going to work?
Can you see the values assigned to those variables in the session log, or do they appear empty as well?
Creating workflow variables is what I'd try, but you need to assign the values with the post-session variable assignment.
Basically, you store values in a variable in your mapping and pass the values up to the workflow after the session succeeded. Here is how you can achieve that:
1. Define workflow variables $$VAR1 and $$VAR2.
2. Define the variables in your mapping, but choose different names, e.g. $$M_VAR1 and $$M_VAR2.
3. In your mapping, assign the values to your mapping variables through the SETVARIABLE(var, value) function.
4. In your session, select Post-session on success variable assignment.
In step 4, the current value from $$M_VAR1 (mapping variable) is stored in your workflow variable $$VAR1 and can then be used in the workflow in command tasks like you asked.
A few notes:
I'm not 100% sure whether the variable assignment is executed before the post-session command. If the command runs first, you could run your commands in a separate Command task after the session instead.
Pre-Session variable assignment is used if you pass a value from a workflow variable down to a mapping variable. You can use this if your variables $$VAR1 or $$VAR2 are used inside another mapping and need to be initialized at the beginning.