How to access a Camunda variable (containing a dot in the name) in user task documentation - camunda

I want to show the value of a variable in the user task description/documentation. I am able to access simple variables with the ${simpleVarName} expression, but not variables containing a dot in the name via ${notSimple.varName}. Is there a way to access such process variables in the user task documentation field?

I was able to solve this by using the execution object:
${execution.getVariable("notSimple.varName")}
The expression language parses the dot as property access (bean notSimple, property varName), which is why the plain expression fails. Also, putting the variable name in quotes did not work: the expression ${"notSimple.varName"} returns the variable name itself, not its value.
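For completeness, the same getVariable() API works from script contexts too; a minimal sketch as a JavaScript execution listener (the target variable name is illustrative, not part of the original question):
// JavaScript execution listener (Camunda 7): variables whose names
// contain a dot must be fetched through getVariable(), because the
// EL parser would otherwise treat the dot as property access.
var value = execution.getVariable('notSimple.varName');
execution.setVariable('resolvedValue', value); // 'resolvedValue' is a hypothetical name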

Related

Nested JSON path using variables in Postman Flow

I have a scenario where I have some JSON ("lldp" in the image below), and I need to find a particular key and pull all of its values from within it. The particular key I need to pull is dynamic and is identified by the 'thisPort' variable. All of this is shown in the screenshot below.
The lldp data basically looks like this. Note how the ports are not within a list. Any given instance of the lldp data may contain anywhere between 1 and 48 ports.
lldp = {
    "port1": {"stuff": "things"},
    "port2": {"stuff": "things"},
    "port40": {"stuff": "things"}
}
I assumed I could do something like "lldp.thisPort" to access the keys and values within, however this produces unhelpful errors and doesn't work. In this case I passed it three different 'thisPort' variables from a list, so presumably it's the same problem three times, and not three different problems.
'thisPort' does correctly come across to the Evaluate function as a string that should lead to a valid JSON path. E.g., 'lldp.thisPort' does seem to translate to a valid path like 'lldp.port1', but Evaluate doesn't seem to agree and I get an error.
Using variables (or any other 'dynamic' way of working), how can you access the keys/values within some JSON as part of a Postman Flow, when the path to the thing you're trying to pull is dynamic?
You can use $lookup(lldp, thisPort) inside the Evaluate block to get the values inside the thisPort object.
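The literal path fails because the key is only known at run time. For comparison, the same dynamic access in plain JavaScript (e.g. a Postman test script, not the Flows Evaluate block) uses bracket notation; a minimal sketch with illustrative data:
// 'thisPort' holds the key name at run time, e.g. "port1"
var lldp = {
    port1: { stuff: "things" },
    port2: { stuff: "things" }
};
var thisPort = "port1";
console.log(lldp.thisPort);   // undefined: looks for a literal key named "thisPort"
console.log(lldp[thisPort]);  // { stuff: "things" }: resolves the key dynamically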

How can I save the value of a dynamic variable in Postman?

Postman allows you to generate random dummy data by using pre-defined variables; for example, this one would be replaced by a random company name:
{{$randomCompanyName}}
Using pre-defined variables multiple times returns a different value per request.
The question is how to save a once-generated value to a variable for further usage, for example in tests. Something like this (it doesn't work):
pm.variables.set("company", {{$randomCompanyName}});
Thanks.
You can use the .replaceIn() function with that {{...}} syntax in the sandbox.
pm.globals.set("company", pm.variables.replaceIn('{{$randomCompanyName}}'));
I've used a global variable to store the value, as you would want to use it again. You could also use either the environment or collectionVariables scope to do the same thing.
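A minimal sketch of the full round trip, assuming the value should survive into the test script (the variable name "company" is carried over from the question):
// Pre-request script: resolve the dynamic variable once and persist it
pm.globals.set("company", pm.variables.replaceIn('{{$randomCompanyName}}'));

// Test script (a separate sandbox run): read the stored value back
pm.test("stored company name is available", function () {
    pm.expect(pm.globals.get("company")).to.be.a("string");
});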

Set Mapping variable in Expression and use it in Source Filter

I have two tables in different databases. Table A holds the data; table B holds the information for the incremental load of the data from table A. I want to read from table B and store the date of the last successful load of table A in a mapping variable $$LOAD_DATE. To achieve this, I read a date from table B and use the SETVARIABLE() function in an expression to set the $$LOAD_DATE variable. The port in which I do this is marked as output and writes into a dummy flat file. I only read one row from this source!
Then I use this $$LOAD_DATE variable in the Source Filter of the Source Qualifier of table A to only load new records which are younger than the date stored in the $$LOAD_DATE variable.
My problem is that I am not able to set the $$LOAD_DATE variable correctly. It is always the date 1753-1-1-00.00.00, which is the default value for mapping variables of type date/time.
How do I solve this? How can I store a date in that variable and use it later in a Source Qualifiers source filter? Is it even possible?
EDIT: Table A has too many records to read them all and filter them later. This would be too expensive, so they have to be filtered at the source filter level.
Yes, it's possible.
In the first mapping you have to initialize the variable, e.g. with SETVARIABLE() in an Expression transformation.
In the first session's configuration you have to define the Post-session on success variable assignment, which copies the mapping variable into a workflow variable.
The second mapping (with your table A) will get the variable after you configure that session's Pre-session variable assignment.
It will work.
It is not possible to set a mapping variable and use its value somewhere else in the same run, because the variable is actually set only when the session completes.
If you really want to implement it using mapping variables, you have to create two mappings: one for setting the mapping variable and another for the actual incremental load. You can pass a mapping variable value from one session to another in a workflow using a workflow variable. https://stackoverflow.com/a/26849639/2626813
Another solution could be to use a lookup on table B and a filter after that.
You can also write a script that queries table B and modifies the parameter file with the latest $$LOAD_DATE value prior to executing the mapping.
Since we have two different DBs, use two sessions: get the values in the first one and pass the parameters to the second one.

Using mapping variables in a post-session command

My workflow generates three files (header, detail, trailer), which I combine via a post-session command. There are two variables which are set in my mapping and which I want to use in the post-session command like so:
cat header1.out detail1.out trailer1.out > OUTPUT_$(date +%Y%m%d)_$$VAR1_$$VAR2.dat
But this doesn't work and the values are empty, so I get OUTPUT_20151117__.dat.
I've tried creating workflow variables and assigning them via pre-session variable assignment, but this doesn't work either.
What am I missing? Or was this never going to work?
Can you see the values assigned to those variables in the session log, or do they appear empty as well?
Creating workflow variables is what I'd try, but you need to assign the values with the post-session variable assignment.
Basically, you store values in a variable in your mapping and pass the values up to the workflow after the session succeeds. Here is how you can achieve that:
1. Define the workflow variables $$VAR1 and $$VAR2.
2. Define the variables in your mapping, but choose different names, i.e. $$M_VAR1 and $$M_VAR2.
3. In your mapping, assign the values to your mapping variables through the function SETVARIABLE(var, value).
4. In your session, select Post-session on success variable assignment.
In step 4, the current value from $$M_VAR1 (the mapping variable) is stored in your workflow variable $$VAR1 and can then be used in the workflow, in command tasks, as you asked.
A few notes:
I'm not 100% sure if the variable assignment is executed before the post-session command. If the command is executed first, you could run your command in a separate Command task after your session.
Pre-session variable assignment is used if you pass a value from a workflow variable down to a mapping variable. You can use this if your variables $$VAR1 or $$VAR2 are used inside another mapping and need to be initialized at the beginning.

Tracking traffic source from a cookie

I want to set up a lookup table macro that depends on the content of the utm_z cookie.
For example, if the user has a specific value (for example abc) in the utm_z cookie, my lookup table macro needs to return a specific value.
I made a 1st-party cookie variable returning the value from the utm_z cookie. This variable returns, for example, '125233995.1441192396.4.2.utmcsr=aa|utmccn=cc|utmcmd=aa|utmctr=aaa|utmcct=bb'.
I can not use regular expressions in the lookup table macro.
It's hard to answer without knowing which value you want to get, but you can use a custom JavaScript variable to extract the value via regex (e.g. something like .*utmcsr(=.*?)\|.* to get the utmcsr value) from your cookie variable and then pass the variable with the extracted value to the lookup table.
So the first-party cookie variable feeds into the custom JavaScript variable, which extracts the value, which in turn feeds into the lookup table to return the value you want.
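A minimal sketch of such a custom JavaScript variable, assuming the 1st-party cookie variable is named {{utm_z cookie}} (both that name and the utmcsr field are illustrative; adjust to your setup):
function() {
  // Raw value from the 1st-party cookie variable, e.g.
  // '125233995.1441192396.4.2.utmcsr=aa|utmccn=cc|...'
  var raw = {{utm_z cookie}} || '';
  // Pull out the utmcsr value ('aa' in the example above)
  var match = raw.match(/utmcsr=(.*?)\|/);
  return match ? match[1] : undefined;
}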