Including a parameter expression - Informatica

I'm new to Informatica PowerCenter and I'm trying to use a newly declared parameter expression in a workflow session parameter file.
Objective: the expression is fed automatically from the database. The reason behind all this is to be able to change the logic in the database; the parameter file will be regenerated on the next run with the new logic, which ultimately means nothing has to change in PowerCenter, you only record the expression in the database. Of course, for new fields you would still need to change the Informatica objects.
Let me summarize and try to be as detailed as possible.
The parameter file has no issues; when the workflow executes it is read without problems. Here's an example of the parameter expression I'm trying to use (I removed the rest of the entries: DB connections, times, etc.):
[Example.WF:Example.WT:.ST:Example]
$$Param_checker=IIF(1=1,'A','B')
Mapping
Created the parameter in the mapping as:
Name: $$Param_checker
Type: Parameter
Data Type: String (as per the documentation)
Precision: 10000 <-- making sure nothing gets truncated
IsExprVar: true
Created a port in an Expression transformation (the violet box) as:
Name: out_param_checker
Data Type: String
Precision: 10
O: X (output port)
Expression: $$Param_checker
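Just to illustrate what I expect at run time (a hypothetical expansion, based on my understanding of IsExprVar): with IsExprVar set to true, the Integration Service should expand the parameter before evaluating the port, i.e.
out_param_checker = $$Param_checker
-- should expand to:
out_param_checker = IIF(1=1,'A','B')
-- and evaluate to 'A'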
Workflow
In it, the session reads the parameter file correctly and the mapping replaces all variables, but when it comes to $$Param_checker it is not expanded. This is a sample of the log output when inserting into the database (the cast and all functions are generated automatically by Informatica):
CASE WHEN CAST('$$Param_checker' AS VARCHAR(4000))
And of course, when the process finishes I also see the literal string '$$Param_checker' in the database.
Just in case, and sorry for being verbose: I already have a parameter with an expression working, but in a Source Qualifier, using it in a SQL statement.
The issue is why I can't use it in an output field. I forgot to mention that we have Redshift under the hood and all sessions are set to "Full" pushdown optimization; I suspect that's the issue.
If anyone has any hint, I would appreciate it.
Thanks


Why doesn't the parameter index work?

In the documentation, https://docs.spring.io/spring-data/neo4j/docs/current/reference/html/
{0} is used to reference the parameter 'movieTitle':
@Query("MATCH (movie:Movie {title={0}}) RETURN movie")
Movie getMovieFromTitle(String movieTitle);
However, in my own code, if I use "{title={0}}", IntelliJ always reports a syntax error. I can resolve the issue by changing it to
{title:{movieTitle}}
Here I have to use the actual argument name, plus a colon and braces.
Is there any trick for this? I don't think the documentation is wrong.
Question 2:
If I want the node label "Movie" to be a parameter, it also shows an error message:
@Query("MATCH (movie:{label} {title={0}}) RETURN movie")
Movie getMovieFromTitle(String movieTitle, String label);
I do not know what version of IntelliJ you are using but the first query is right. There is also a test case for this in the spring-data-neo4j project.
It is not possible to use the second query syntax because there is no support for it at the database level, where the query gets executed. If SDN supported it, the query would have to be parsed (and the pattern replaced) before every call to the DB, and SDN would lose the ability to parse the query once and then just bind the parameter values in subsequent calls. This would lower the performance of executing annotated query functions.
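For reference, here is a minimal sketch of both placeholder styles side by side in a repository interface (the repository name is illustrative and the imports assume SDN 4.x; @Param comes from Spring Data commons):

import org.springframework.data.neo4j.annotation.Query;
import org.springframework.data.neo4j.repository.GraphRepository;
import org.springframework.data.repository.query.Param;

public interface MovieRepository extends GraphRepository<Movie> {

    // Index-based placeholder, as shown in the reference documentation
    @Query("MATCH (movie:Movie {title={0}}) RETURN movie")
    Movie getMovieFromTitle(String movieTitle);

    // Named placeholder, bound explicitly via @Param
    @Query("MATCH (movie:Movie {title={title}}) RETURN movie")
    Movie getMovieByTitle(@Param("title") String movieTitle);
}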

Updating ElasticSearch mappings field type with existing data

I'm storing a few fields, and for the sake of simplicity let's call the field in question 'age'. Initially, ES created the index for me and ended up choosing the wrong field type for 'age': it's a string type right now instead of a numeric type. I'm aware that I should have defined the mappings myself to begin with and forced the data values being sent to be consistently all strings or all numeric values.
What I have right now is an index with a ton of data that uses a 'string' type for age, with values such as: 1, 10, 'na', etc.
Now my question is: if I were to change the mapping from string to integer, would indexing have any issues with existing data values such as 'na' when they are updated?
I just wanted to ask first before I start creating a playground environment to test with a sample data set.
What you can update according to the doc:
new properties can be added to Object datatype fields.
new multi-fields can be added to existing fields.
doc_values can be disabled, but not enabled.
the ignore_above parameter can be updated.
Otherwise, I'm afraid you will have to create a new mapping and reindex your data; see this post for an example.
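A rough sketch of that route, assuming Elasticsearch 7+ (index names are placeholders; ignore_malformed lets documents with values like 'na' be stored without that field value being indexed, instead of being rejected outright):

PUT my_index_v2
{
  "mappings": {
    "properties": {
      "age": { "type": "integer", "ignore_malformed": true }
    }
  }
}

POST _reindex
{
  "source": { "index": "my_index" },
  "dest": { "index": "my_index_v2" }
}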

Using a Parameter in Expression Transformation

I have a workflow in which I've set up an Expression transformation to select $$Param for a particular field, and then within the target properties I've set a delete value. I've tested this by substituting a hardcoded value for $$Param and it works fine; however, for some reason when I put $$Param in, it doesn't actually do the delete. Is there a reason? Am I doing something wrong?
Just for clarification: the workflow executes successfully, no error is thrown, but it's not doing what it's supposed to.
Thanks in advance,
$$Param needs to be passed through a parameter file, and you have the option to set an initial value when you declare the parameter in the mapping under Mappings > Parameters and Variables.
Have you looked at the session log to see what override value of $$Param is being used? If it's a SQL delete, try to find in the session log the query being executed against the database.
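As a minimal sketch, the parameter file entry would look like this (folder, workflow, and session names are placeholders, as is the value):

[MyFolder.WF:wf_my_workflow.ST:s_my_session]
$$Param=D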

Set Mapping variable in Expression and use it in Source Filter

I have two tables in different databases. Table A holds the data; the other table, B, holds the information for the incremental load of the data from the first table. I want to read from table B and store the date of the last successful load of table A in a mapping variable $$LOAD_DATE. To achieve this, I read a date from table B and use the SETVARIABLE() function in an expression to set the $$LOAD_DATE variable. The port in which I do this is marked as output and writes into a dummy flat file. I only read one row from this source!
Then I use this $$LOAD_DATE variable in the source filter of the Source Qualifier of table A to load only the new records that are younger than the date stored in the $$LOAD_DATE variable.
My problem is that I am not able to set the $$LOAD_DATE variable correctly. It is always the date 1753-1-1-00.00.00, which is the default value for mapping variables of the type date/time.
How do I solve this? How can I store a date in that variable and use it later in a Source Qualifier's source filter? Is it even possible?
EDIT: Table A has too many records to read them all and filter them later. That would be too expensive, so they have to be filtered at the source filter level.
Yes, it's possible.
In the first mapping you have to initialize the variable (with SETVARIABLE, as you describe).
In the first session's configuration you have to define a Post-session on success variable assignment that copies the mapping variable into a workflow variable.
The second mapping (with your table A) will then receive the value through the second session's Pre-session variable assignment.
It will work.
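A minimal sketch of the whole chain (port, variable, and object names are illustrative; $$WF_LOAD_DATE must be declared as a workflow variable):

Mapping 1, Expression output port (feeding the dummy flat file):
o_set_load_date = SETVARIABLE($$LOAD_DATE, LAST_LOAD_DATE)

Session 1, Components tab > Post-session on success variable assignment:
$$WF_LOAD_DATE = $$LOAD_DATE

Session 2, Components tab > Pre-session variable assignment:
$$LOAD_DATE = $$WF_LOAD_DATE

Mapping 2, Source Qualifier > Source Filter (a conversion such as TO_DATE may be needed depending on the database):
LOAD_DATE > '$$LOAD_DATE'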
It is not possible to set a mapping variable and use its value somewhere else in the same run, because the variable is actually persisted only when the session completes.
If you really want to implement it using mapping variables you have to create two mappings: one for setting the mapping variable and another for the actual incremental load. You can pass a mapping variable value from one session to another in a workflow using a workflow variable. https://stackoverflow.com/a/26849639/2626813
Another solution could be to use a Lookup on B and a Filter after that.
You can also write a script to query table B and update the parameter file with the latest $$LOAD_DATE value prior to executing the mapping.
Since there are two different DBs, use two sessions: get the value in the first one and pass the parameter to the second one.

BIRT - using multiple webservices to get the data

I am trying to generate a report using Eclipse BIRT report designer.
The scenario is this:
There are 2 web service data sources and 2 datasets, for web services 'WS1' and 'WS2' respectively.
The output element 'COUNTRYID' of one web service, 'WS1', goes as input to the other web service, 'WS2'.
What I did:
Created a parameter COUNTRYID.
Created a dummy Computed Column in the dataset of the web service 'WS1' with the expression:
params["COUNTRYID"].value=row["COUNTRYID"]
Now the input parameter for the 'WS2' dataset is bound to the global parameter 'COUNTRYID'.
When I run the report, I see that the global parameter contains the value from the 'WS1' output.
But the report does not display the values from the response of the web service 'WS2'.
My questions:
How can I see, if the webservice got fired or not?
How can I see, if the webservice got fired with correct values ?
WS1 is not fired unless it is explicitly bound to a report element. Typically, to achieve this we apply the following steps:
insert a data element at the beginning of the report body
set the visibility property of this new element to false (or leave it visible during testing)
bind it to the first dataset WS1
This forces a silent execution of WS1, which populates your parameter COUNTRYID before WS2 runs.
However this approach would not work if:
WS2 dataset has to be used to populate selection items of a report parameter (which does not seem to be the case here)
the COUNTRYID parameter is used at render time. This point is much more annoying if you need the parameter in chart expressions, for example. If so, I would recommend storing the WS1 output in a report variable instead of (or in addition to) a report parameter. See this topic for how to create a report variable.
You can initialize it at the same place you did for the report parameter with:
vars["COUNTRYID"]=row["COUNTRYID"];
and use it anywhere with
vars["COUNTRYID"];
Report variables are available from the palette of the expressions editor.
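If you go the report variable route, a minimal sketch of the computed column expression in the WS1 dataset (populating the report parameter and the report variable at once; multi-statement expressions are allowed, and the last expression is the column value):

// Computed column expression in the WS1 dataset
params["COUNTRYID"].value = row["COUNTRYID"]; // report parameter, resolved at generation time
vars["COUNTRYID"] = row["COUNTRYID"];         // report variable, also usable at render time
row["COUNTRYID"];                             // value of the computed column itself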