Looping a dynamic mapping to load different tables in Informatica BDM

I have created a dynamic mapping in Informatica BDM as follows:
I created a parameterized source and target
I also created a workflow for the map
What I am having difficulty with is changing the source and target parameters at run time, so that this one dynamic mapping can load different tables to different targets.
It is like running a mapping in a loop, but on each run the source and target parameters change, so it reads from a different source and loads to a different target.
Any help will be appreciated.
Thanks in advance.

You can re-import the same mapping into the same workflow n times. Then either create workflow parameters and assign them to the mapping parameters on the Input tab in the workflow, or create parameter sets, import the mapping parameters into them, and assign a parameter set to each mapping task in the workflow.
e.g.
mapping 1 -> parameter set 1
mapping 1 -> parameter set 2
Change the parameter values in each parameter set and reuse the same mapping. For more details, refer to the docs (https://docs.informatica.com/data-quality-and-governance/data-quality/10-4-0/developer-workflow-guide/mapping-task/mapping-task-overview/multiple-mapping-tasks-that-run-the-same-mapping.html)
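For illustration, here is roughly what each parameter set boils down to if you instead drive the loop from outside the workflow with one parameter file per table pair. This is a minimal sketch: the project, application, mapping, and parameter names (Src_Table, Tgt_Table) are made up, and the file shape follows the Developer tool's parameter file samples.

<root version="2.0" xmlns="http://www.informatica.com/Parameterization/1.0">
  <project name="MyProject">
    <mapping name="m_dynamic_load">
      <!-- swap these two values per run to read a different source and load a different target -->
      <parameter name="Src_Table">CUSTOMERS</parameter>
      <parameter name="Tgt_Table">CUSTOMERS_TGT</parameter>
    </mapping>
  </project>
</root>

infacmd.sh ms runMapping -dn MyDomain -sn MyDIS -a MyApp -m m_dynamic_load -pf params_customers.xml

A scheduler or shell loop can then call infacmd once per parameter file to get the run-in-a-loop effect.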

Related

How to get flat files with no file structure as source and insert same data into table as target in IICS

I want to read flat files as a source without specifying any file structure, using IICS (Informatica Intelligent Cloud Services). The flat file names can be anything, and the structure can change as well. I also need to create the target table dynamically based on the flat file and insert the data into it.
There are a number of options here. You can use a fully parameterized mapping inside a taskflow that starts on a file listener, then prepares the parameters and the statements to be executed as part of the pre-SQL statement on your Target.
Inside the mapping you define Source and Target as parameterized - and that's briefly it!
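As a rough sketch of that pre-SQL idea: assuming a hypothetical in-out parameter TGT_TABLE holding the table name derived from the file, and a single wide column since the file structure is unknown, the Target pre-SQL could be something like

CREATE TABLE $$TGT_TABLE (raw_record VARCHAR(4000));

The exact parameter syntax and DDL depend on your target database and on how the taskflow assigns the parameter.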

Tray.IO - Creating a NEW list

I am working with Tray.IO for the first time and ran into an issue: after using the Object Helper to format the data into the object model I need, I want to add it to a NEW list that contains only the new object information. As you can see below, I have tried manipulating the object in several ways.
Any help would be very appreciated.
Tray.IO Items
Based on the information you've provided, below is how I would approach this situation. I've also included a video link if you'd like to see it done step-by-step.
Using the Data Storage connector, create the new list with the 'Append to List' method, setting the Key to an appropriate name, with the Value coming from object-helpers-2. In addition, you'll want to enable 'Create if missing'.
Once the new list is created, retrieve the data with another Data Storage connector using the 'Get Value' method, setting the Key to the same key you used in the prior Data Storage step.
Finally, update the Key of your List Helpers step to the result of the Data Storage connector used to retrieve the data.
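To illustrate with hypothetical data: if the Key is new_items, each pass through 'Append to List' grows the stored value, so the later 'Get Value' step returns the accumulated list, e.g.

[
  { "id": "001", "name": "first object" },
  { "id": "002", "name": "second object" }
]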
Link to Video
Happy Traygramming!
Grant

Set Mapping variable in Expression and use it in Source Filter

I have two tables in different databases. Table A holds the data; table B holds information for the incremental load of the data from table A. I want to read from table B and store the date of the last successful load of table A in a mapping variable $$LOAD_DATE. To achieve this, I read a date from table B and use the SETVARIABLE() function in an expression to set the $$LOAD_DATE variable. The port in which I do this is marked as output and writes into a dummy flat file. I only read one row of this source!
Then I use this $$LOAD_DATE variable in the Source Filter of the Source Qualifier of table A to load only records that are newer than the date stored in the variable.
My problem is that I am not able to set the $$LOAD_DATE variable correctly. It always contains 1753-01-01 00:00:00, the default value for mapping variables of type date/time.
How do I solve this? How can I store a date in that variable and use it later in a Source Qualifier's source filter? Is it even possible?
EDIT: Table A has too many records to read them all and filter them later. That would be too expensive, so they have to be filtered at the source filter level.
Yes, it's possible.
In the first mapping you have to initialize the variable with SETVARIABLE(), as you already do.
In the first session's configuration you have to define a Post-session on success variable assignment that copies the mapping variable into a workflow variable.
The second session (the one with your table A) will then receive the value through a Pre-session variable assignment that copies the workflow variable back into its mapping variable.
It will work.
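As a minimal sketch of the two ends of that hand-over (the port, column, and date format below are made up and must match your data): the output port in the first mapping's Expression transformation would be something like

SETVARIABLE($$LOAD_DATE, LAST_LOAD_DATE)

and the source filter on table A in the second mapping something like

A.UPDATED_AT > TO_DATE('$$LOAD_DATE', 'MM/DD/YYYY HH24:MI:SS')

since $$LOAD_DATE is substituted as text into the query before it runs.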
It is not possible to set a mapping variable and use its value somewhere else in the same run, because the variable is actually set only when the session completes.
If you really want to implement it using mapping variables, you have to create two mappings: one for setting the mapping variable and another for the actual incremental load. You can pass a mapping variable value from one session to another in a workflow using a workflow variable: https://stackoverflow.com/a/26849639/2626813
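The hand-over itself is just two session settings; a sketch with a made-up workflow variable $$WF_LOAD_DATE:

Session 1, Post-session on success variable assignment:  $$WF_LOAD_DATE = $$LOAD_DATE
Session 2, Pre-session variable assignment:              $$LOAD_DATE = $$WF_LOAD_DATE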
Other solutions could be to use a lookup on B and a filter after that.
You can also write a script that queries table B and updates the parameter file with the latest $$LOAD_DATE value prior to executing the mapping.
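A sketch of the parameter file entry such a script would maintain (the folder, workflow, and session names are made up):

[MyFolder.WF:wf_load_table_a.ST:s_m_load_table_a]
$$LOAD_DATE=06/01/2015 00:00:00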
Since there are two different DBs, use two sessions: get the value in the first one and pass it as a parameter to the second.

BIRT - using multiple webservices to get the data

I am trying to generate a report using Eclipse BIRT report designer.
The scenario is this:
There are 2 web service data sources, with 2 datasets for web services 'WS1' and 'WS2' respectively.
The output element 'COUNTRYID' of one webservice 'WS1' would go as input for another webservice 'WS2'.
What I did:
Created a parameter COUNTRYID.
Created a dummy Computed Column in the dataset of the web service 'WS1' with the expression:
params["COUNTRYID"].value=row["COUNTRYID"]
Now the input parameter for the 'WS2' dataset is bound to the global parameter 'COUNTRYID'.
When I run the report, I see that the global parameter contains the value from the 'WS1' output.
But the report does not display the values from the response of the web service 'WS2'.
My questions:
How can I see if the web service got fired or not?
How can I see if the web service got fired with the correct values?
WS1 is not fired unless it is explicitly bound to a report element. Typically, to achieve this we apply the following steps:
insert a data element at the beginning of the report body
turn the visibility property of this new element to false (or leave it visible during testing)
bind it to the first dataset WS1
This forces a silent execution of WS1, and therefore populates your parameter COUNTRYID before WS2 runs.
However, this approach would not work if:
the WS2 dataset has to be used to populate selection items of a report parameter (which does not seem to be the case here)
the COUNTRYID parameter is used at render time. This point is much more annoying if you need this parameter in chart expressions, for example. If so, I would recommend storing the WS1 result in a report variable instead of (or in addition to) a report parameter. See this topic to see how to create a report variable.
You can initialize it at the same place you did for the report parameter with:
vars["COUNTRYID"]=row["COUNTRYID"];
and use it anywhere with
vars["COUNTRYID"];
Report variables are available from the palette of the expressions editor.

PowerCenter - concurrent target instances

We have a situation where instances of the same target, loaded from a single source qualifier, execute in a different order.
The problem appears when we promote a mapping from DEV to TEST: after promoting, execution in TEST behaves differently.
For instance, we have a router with 3 groups for Insert, Update and Delete, followed by the appropriate update strategies to set the row type accordingly, followed by three target instances.
RTR ----> UPD_Insert ----> TGT_Insert
   \----> UPD_Update ----> TGT_Update
    \---> UPD_Delete ----> TGT_Delete
When we test this with data that does an insert, followed by an update, followed by a delete, all based on the same primary key, we get a different execution order in TEST compared to the same data in our DEV environment.
Anyone have any thoughts - I would post an image but I don't have enough cred yet.
Cheers,
Gil.
You cannot control the load order as long as you have a single source. If you could separate the loads to use separate sources, the target load order setting in the mapping could be used, or you could even create separate mappings for them.
As it is now, you should use a single target and utilize the Update Strategy transformation to determine the wanted operation for each record passing through. It is then possible to use a sort to define the order in which the different operations are applied to the physical table.
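A minimal sketch of that single Update Strategy expression, assuming a hypothetical upstream port FLAG that marks each row as 'I', 'U' or 'D':

IIF(FLAG = 'I', DD_INSERT, IIF(FLAG = 'U', DD_UPDATE, DD_DELETE))

The sort mentioned above then orders rows by that flag so the operations hit the table in a deterministic sequence.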
You can use a Sorter transformation just before the Update Strategy. Based on the operation flag you can sort the incoming rows, so the rows first go through the insert, then the update, and last through the delete strategy.
A simple solution is to try renaming the target definitions in alphabetical order, like INSERT_A, UPDATE_B, DELETE_C, then start loading.
This will load in A, B, C order. Try it and let me know.