I have a WCF web service that takes a start and end date as input and returns a record set. What I'd like to do is set up an Informatica mapping that creates variables for the date one week ago and for today's date. These would be used as input for the Web Service Consumer transformation or a web service source (whichever works), but I'm not sure how to go about this. I can't create an Expression transformation with no inputs, and I don't see how to set a mapping parameter as input.
The only two ways I can think of would be either to build an app that creates a flat file with both dates, or to build a database object that supplies the dates as a source. I'd rather not have a separate outside source provide these values, but I can't think of another way.
If you need those variables set before the mapping runs, use an Assignment task in the workflow and use `Pre-session variable assignment` to set the values for the mapping before it runs.
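A sketch of that setup, with made-up variable and port names: compute the dates in the Assignment task, then map them to mapping variables on the session's Components tab:

```
-- Assignment task expressions (user-defined workflow variables):
$$WF_START_DATE = TO_CHAR(ADD_TO_DATE(TRUNC(SYSDATE, 'DD'), 'DD', -7), 'YYYY-MM-DD')
$$WF_END_DATE   = TO_CHAR(TRUNC(SYSDATE, 'DD'), 'YYYY-MM-DD')

-- Pre-session variable assignment (session Components tab):
$$MAP_START_DATE = $$WF_START_DATE
$$MAP_END_DATE   = $$WF_END_DATE
```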
There is no way to do this in Informatica v9.6.1. A source has to be created in order to feed the web service. I ended up creating a dummy record with one field, using it as input, then disregarding the input and setting up the variable output using an Expression transformation.
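For reference, the Expression transformation's output ports can be as simple as this (port names are illustrative; each expression belongs to one output port):

```
-- O_START_DATE: one week ago
TO_CHAR(ADD_TO_DATE(TRUNC(SYSDATE, 'DD'), 'DD', -7), 'YYYY-MM-DD')

-- O_END_DATE: today
TO_CHAR(TRUNC(SYSDATE, 'DD'), 'YYYY-MM-DD')
```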
I have a requirement to capture session-level details like session start time, end time, source success rows, failed rows, etc. in an audit table. As all those details are available in pre-built session variables, I need to store them in a table. What I am doing now is taking an Assignment task in the workflow, assigning the values of these pre-built session variables for a particular session to workflow variables, and passing those workflow variables to mapping variables in another non-reusable session (the mapping which loads the table) using the pre-session variable assignment option. It works fine for a workflow with one session. But if I have to implement this for a workflow with many sessions, the process becomes tedious, as I have to create an Assignment task for each of these sessions and a non-reusable session that calls a mapping to load the audit table.
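For reference, each per-session Assignment task currently looks roughly like this (session and variable names are just examples):

```
-- Assignment task: copy the built-in session stats into workflow variables
$$WF_SESS_START_TIME = $s_m_load_orders.StartTime
$$WF_SESS_END_TIME   = $s_m_load_orders.EndTime
$$WF_SRC_SUCCESS     = $s_m_load_orders.SrcSuccessRows
$$WF_SRC_FAILED      = $s_m_load_orders.SrcFailedRows
$$WF_TGT_SUCCESS     = $s_m_load_orders.TgtSuccessRows
$$WF_TGT_FAILED      = $s_m_load_orders.TgtFailedRows
```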
So I am wondering: is there any alternative solution to get this job done? I am thinking of a solution where we capture the audit details of all sessions in a file and pass that file as input to a mapping that loads the data into the table at once. Is this possible? Any solution?
Check this out: ETL Operational framework
It covers an end-to-end solution that should fit your needs and be quite easy to extend if you have multiple sessions - all you'd need to do is apply similar post-session commands before running the final session that loads the stats to the database.
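As a rough illustration of such a post-session command (the file path is an assumption; the framework linked above defines its own conventions):

```
# Post-session success command: append one line of stats per session
echo "$PMWorkflowName,$PMSessionName,`date '+%Y-%m-%d %H:%M:%S'`" >> /data/audit/session_stats.csv
```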
How does the Informatica PowerCenter platform work? For example, when we create a new user, what processing happens in the background? How does the source data get extracted?
Are there any documents describing the background processes of Informatica PowerCenter?
This is an ETL tool, so it extracts data from the source, brings it into the Informatica server, performs transformations, then pushes the data into the target box. All of this is done using multiple threads; you may not see them in the documentation, but you can see them in the session logs.
For a simple SRC > EXP > AGG > TGT mapping I can explain the back-end process.
Step 1 - Infa gets the mapping info from the repository.
Step 2 - The Infa service creates three threads:
a. Read - reads from the source and loads the data into Informatica memory.
b. Transform - aggregates the data when #1 is done and puts it into memory.
c. Load - loads the data to the target when #2 is done.
This can get complex when you have many transformations. Everything is logged in the session logs and you will get tons of info from them.
Informatica PowerCenter offers the capability to connect to and fetch data from different heterogeneous sources and to process that data. For example, you can connect to a SQL Server database or an Oracle database and integrate the data into a third system. Informatica has its own transformation language that you can use in the Expression transformation, Filter transformation, Source Qualifier transformation, etc. It is quite versatile and not at all difficult to learn if you're familiar with any of the most popular programming languages of today.
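A couple of typical expressions in that language, just to give a flavor (column names are invented):

```
-- Bucket a salary into a label, treating NULL explicitly
IIF(ISNULL(SALARY), 'UNKNOWN', IIF(SALARY > 50000, 'HIGH', 'LOW'))

-- Normalize a name column
LTRIM(RTRIM(UPPER(FIRST_NAME)))
```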
I have a scenario where I have to pull data from a web service using the REST Web Service Consumer transformation. For example, the endpoint URL is http://example/2015/Q1. Here I have to parameterize 2015/Q1 as $$DATES, but I cannot change the parameter value manually. I have to design my mapping so that it dynamically keeps advancing the dates across all runs, past and future, without manual intervention. Please suggest a way to do this.
You can have a parent workflow that dynamically creates a script with `pmcmd startworkflow` calls for all the quarters you need. The parent workflow calls the script to invoke the child workflow n times. You also need a table or file with all the quarters and a flag that says whether a quarter has been processed. In the child workflow (the actual one you already have) you need to update the flag and mark the quarter as processed. Each run of the child workflow picks up the first unprocessed quarter and processes it.
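A minimal sketch of the generated script, assuming a file of unprocessed quarters and placeholder service, domain, folder, and credential names:

```
#!/bin/sh
# One pmcmd call per quarter; each iteration writes $$DATES into a
# parameter file so the child workflow hits the right endpoint URL.
while read QUARTER; do
    printf '[MY_FOLDER.WF:wf_child_load]\n$$DATES=%s\n' "$QUARTER" > /tmp/quarter.prm
    pmcmd startworkflow -sv INT_SVC -d DOMAIN -u admin -p secret \
        -f MY_FOLDER -paramfile /tmp/quarter.prm -wait wf_child_load
done < /tmp/unprocessed_quarters.txt
```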
Hope that helps.
I am facing a challenge creating multiple PDF files in Informatica 10.2.0. Please find the details below:
Requirement: We need to split a single XML file into multiple PDF files based on a condition.
Tools used: Informatica PowerCenter, Informatica Developer.
Challenge: I have created a Data Processor in Informatica Developer, used it as a service in Informatica PowerCenter, and created a single PDF file,
but I am not able to create multiple PDF files with this service. I used a Sorter to sort the file based on the condition used to split it, then a Transaction Control to commit records based on that condition, passing these records to the UDT transformation (which calls the service) and then passing the output Buffer port to the target.
In the UDT transformation I have set the Input type and Output type to 'file' on the UDT Settings tab in the mapping.
Could anyone please suggest a solution to this technical challenge?
Create a variable port to identify when a change occurs or a new XML document is processed, and use it to issue a commit/continue in the Transaction Control transformation.
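A sketch of that pattern, with illustrative port names: an Expression transformation flags the first row of each document, and the Transaction Control condition commits on that flag:

```
-- Expression transformation, variable ports (evaluated top to bottom,
-- so v_PREV_KEY still holds the previous row's key when v_NEW_DOC runs):
v_NEW_DOC  = IIF(SPLIT_KEY != v_PREV_KEY, 1, 0)
v_PREV_KEY = SPLIT_KEY
O_NEW_DOC  = v_NEW_DOC   -- output port

-- Transaction Control transformation, transaction control condition:
IIF(O_NEW_DOC = 1, TC_COMMIT_BEFORE, TC_CONTINUE_TRANSACTION)
```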
I think the Transaction Control is probably the wrong route to use here - try this: https://kb.informatica.com/howto/6/Pages/1/154788.aspx
When there is any change in the DDL of any table, we have to re-import the source and target definitions and change the mapping. Is there a way to dynamically fetch the DDL of the table and do the data copy using an Informatica mapping?
The ETL tool uses an abstraction layer, separated from any physical database. It uses Source and Target definitions that describe what the job should expect to find in the databases it connects to. Keep in mind that the same mapping can be applied to many different source and/or target systems. It's not bound to any of them; it just defines what data to fetch and what to do with it.
In Informatica this is reflected in the separation between Mappings, which define the data flow, and Sessions, which indicate where the logic should be applied.
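For instance, the same session can be pointed at a different physical database purely through a parameter file, while the mapping stays untouched (folder, workflow, session, and connection names below are made up):

```
[MY_FOLDER.WF:wf_load_orders.ST:s_m_load_orders]
$DBConnection_SRC=ORA_DEV_SOURCE
$DBConnection_TGT=SQL_DEV_TARGET
```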
Imagine you're transferring data from multiple servers. A change applied on one of them should not break the whole data integration. If changes were dynamically reflected, a column added on one server would make it impossible to read data from the others.
Of course it's perfectly fine to have a requirement such as the one you've mentioned. It's just not something Informatica supports with its approach.
The only workaround is to create your own application that fetches the table definitions, generates the workflows, and imports them into Informatica prior to execution.
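If you build such an application, the import step itself can be scripted with pmrep; a rough sketch with placeholder repository, domain, and file names:

```
#!/bin/sh
# After regenerating the workflow XML from the current table DDL,
# load it into the repository before the run.
pmrep connect -r MY_REPO -d DOMAIN -n admin -x secret
pmrep objectimport -i generated_workflow.xml -c import_control.xml
```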