I am facing a challenge creating multiple PDF files in Informatica 10.2.0. Please find the details below:
Requirement: We need to split a single XML file into multiple PDF files based on a condition.
Tools used: Informatica PowerCenter, Informatica Developer.
Challenge: I have created a data processor in Informatica Developer, used it as a service in Informatica PowerCenter, and created a single PDF file.
However, I am not able to create multiple PDF files with this service. I used a Sorter to sort the records on the condition used to split the file, then a Transaction Control transformation to commit records based on that condition, and passed these records to the UDT transformation (which calls the service) before passing the output Buffer port to the target.
In the UDT transformation, I set both the Input type and the Output type to 'file' on the UDT Settings tab of the mapping.
Could anyone please suggest how to solve this?
Create a variable port to identify when a change or a new XML document is processed, and use it to issue commit/continue in the Transaction Control transformation, as sketched below.
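A minimal sketch of that idea, assuming the split condition is keyed by a port named GROUP_KEY (a hypothetical name); ports are shown as name = expression. In an Expression transformation, variable ports can compare each row's key with the previous row's key, and the resulting flag drives the Transaction Control condition:

    -- Expression transformation (variable ports keep their value across rows;
    -- v_NEW_GROUP must be listed above v_PREV_KEY so it sees the previous row's key)
    v_NEW_GROUP = IIF(GROUP_KEY != v_PREV_KEY, 1, 0)
    v_PREV_KEY  = GROUP_KEY
    o_NEW_GROUP = v_NEW_GROUP

    -- Transaction Control transformation: Transaction Control Condition property
    IIF(o_NEW_GROUP = 1, TC_COMMIT_BEFORE, TC_CONTINUE_TRANSACTION)

With the rows sorted on GROUP_KEY, each commit closes one transaction, which is the hook for producing one output file per group downstream.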
I think Transaction Control is probably the wrong route to use here - try this: https://kb.informatica.com/howto/6/Pages/1/154788.aspx
How does the Informatica PowerCenter platform work? For example, when we create a new user, what background processing takes place? How does the source data get extracted?
Are there any documents that describe the background processes of Informatica PowerCenter?
This is an ETL tool, so it extracts data from the source, gets it onto the Informatica server, performs the transformations, and then pushes the data into the target system. All of this is done using multiple threads; you may not find them in the documentation, but you can see them in the session logs.
For a simple SRC > EXP > AGG > TGT mapping, I can explain the back-end process.
Step 1 - Informatica gets the mapping info from the repository.
Step 2 - The Integration Service creates three threads:
a. Reader - reads from the source and loads the rows into Informatica's memory buffers.
b. Transformation - aggregates the data coming from (a) and puts the results into memory.
c. Writer - loads the data into the target once (b) is done.
This can become complex when you have many transformations. Everything is logged in the session log, and you will get plenty of information from it.
Informatica PowerCenter offers the capability to connect to and fetch data from different heterogeneous sources and to process that data. For example, you can connect to an SQL Server database or an Oracle database and integrate the data into a third system. Informatica has its own transformation language that you can use in your Expression transformation, Filter transformation, Source Qualifier transformation, etc. It is quite versatile and not at all difficult to learn if you're familiar with any of today's popular programming languages.
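For a taste of that transformation language, here is a minimal hedged example of expressions you might put on output ports of an Expression transformation (the port names ORDER_DATE and CUST_NAME are hypothetical; IIF, DATE_DIFF, SYSDATE, INITCAP, LTRIM and RTRIM are built-in functions):

    -- flag orders older than 30 days
    IIF(DATE_DIFF(SYSDATE, ORDER_DATE, 'DD') > 30, 'OLD', 'CURRENT')
    -- trim and title-case a customer name
    INITCAP(LTRIM(RTRIM(CUST_NAME)))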
I have never used Informatica PowerCenter before and just don't know where to begin. To summarize my goal: I need to run a simple count query against a Teradata database using Informatica PowerCenter. This query needs to be run on a specific day, but I don't need to store or manipulate the data returned. Informatica PowerCenter Designer looks a bit daunting, as I'm not sure what to look for.
Any help is greatly appreciated in understanding how to set up (if needed):
Sources
Targets
Transformations
Mappings
Is a transformation the only way to query data using PowerCenter? I've looked at a lot of tutorials, but most seem to be aimed at users who are already familiar with the tool.
You can run a query against a database using Informatica only if you create a mapping, a session, and a workflow to run it. But you cannot see the result unless you store it somewhere, either in a flat file or in a table.
Here are the steps to create it anyway.
Import your source table from Teradata in the Source Analyzer.
Create a flat-file target or import a relational target in the Target Designer.
Create a mapping m_xyz and drag and drop your source into the mapping.
You will see your source and its Source Qualifier in the mapping. Write your custom query in the Source Qualifier (see the sketch after these steps), say select count(*) as cnt from table.
Remove all the ports from the Source Qualifier except one numeric port from the source, and name it cnt; the count from your SELECT will be assigned to this port.
Now drag and drop this port to an Expression transformation.
Drag and drop your target into the mapping.
Propagate the column from the Expression transformation to the flat-file/relational target.
Create a workflow and a session for this mapping.
In the workflow, you can schedule it to run on a specific date.
When you execute this, the count will be loaded into the column of that flat file or table.
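As a sketch, the SQL override in the Source Qualifier could look like the following (the table name is a hypothetical placeholder; adjust it to your schema):

    -- Source Qualifier > Properties > SQL Query
    SELECT COUNT(*) AS cnt
    FROM   my_teradata_table

The single numeric cnt port left on the Source Qualifier then receives the one row this query returns.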
I am new to WSO2 DSS 6.4.0. I have to retrieve data from multiple sheets of a single Excel file and insert that data into multiple tables. Please guide me on how to do this.
It looks like you need to implement some sophisticated logic. Excel files may be a source of data, but first of all, how does WSO2 DSS know the moment at which it must start reading the Excel file? This sounds like a job for WSO2 ESB, which supports a virtual file system and can track a directory and generate an event when there are any changes.
Why don't you use WSO2 ESB to read the file sheet by sheet and insert the data?
It provides the necessary tools (mediators) to do this.
Anyway, it does look like an ETL job.
I have a WCF web service that takes a start and end date as input and returns a record set. What I'd like to do is set up an Informatica mapping that creates variables for the date one week ago and today's date. These would be used as input for the Web Service Consumer transformation or a web service as a source (whichever works), but I'm not sure how to go about this. I can't create an Expression transformation with no inputs, and I don't see how to set a mapping parameter as input.
The only two ways I can think of would be either to build an app that creates a flat file with both dates, or to build a database object that supplies the dates as a source. I'd rather not have a separate outside source provide these values, but I can't think of another way.
If you need those variables set before the mapping runs, use an Assignment Task in the workflow and use Pre-session variable assignment to set the values for the mapping before it runs.
There is no way to do this with Informatica v9.6.1. A source has to be created in order to feed the web service. I ended up creating a dummy record with one field, using it as input, then disregarding the input and setting up variable output using an Expression transformation, roughly as sketched below.
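A minimal sketch of that Expression transformation, with hypothetical output port names; ADD_TO_DATE, TRUNC and SYSDATE are built-in PowerCenter functions:

    -- the dummy input port is connected but ignored
    o_START_DATE = TRUNC(ADD_TO_DATE(SYSDATE, 'DD', -7))  -- one week ago
    o_END_DATE   = TRUNC(SYSDATE)                         -- today

The two output ports are then wired to the start/end date inputs of the Web Service Consumer transformation.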
I am a newbie in ETL and will be using Informatica soon for one of the requirements we have.
The requirement is that Informatica needs to monitor a table in Oracle for certain "trigger data" and as soon as that data is available in that table, Informatica should start executing steps in its workflow.
Is it possible to do this? If yes, could someone please point me to a link/document where this is explained.
Many thanks.
No, it is not possible (checked in PowerCenter 9.5.1).
The Event-Wait task supports only two types of events:
predefined events (the task instructs the Integration Service to wait for the specified indicator file to appear before continuing),
user-defined events (the event is triggered by an Event-Raise task somewhere in the workflow).
Yes, it is possible, and you will need a script, which can be created with the following steps (a hedged sketch follows the list).
--Create a shell script that checks whether data is present in the table or not; you can do this just by taking a count of the table.
--If the count is greater than zero, create an empty file, say DUMMY.txt (by using the touch command), at a specified path.
--In your Informatica scheduling (either by a scheduler or by a script), check every 5 minutes whether the file is present.
--If the file is present, call your Informatica workflow and delete the DUMMY file.
--Once the workflow is completed, start the process again.
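A minimal shell sketch of those steps, assuming an Oracle client (sqlplus) and the PowerCenter pmcmd utility are on the path; the table name, credentials, paths, and service/workflow names are hypothetical placeholders:

    #!/bin/sh
    # Steps 1-2: count rows in the trigger table and drop a flag file if any exist
    CNT=$(printf 'set heading off feedback off\nselect count(*) from trigger_table;\n' \
          | sqlplus -s scott/tiger@orcl | tr -d ' \n')
    if [ "$CNT" -gt 0 ]; then
        touch /data/flags/DUMMY.txt
    fi

    # Steps 3-5: run every 5 minutes (e.g. from cron); if the flag exists,
    # start the workflow and remove the flag so the cycle can repeat
    if [ -f /data/flags/DUMMY.txt ]; then
        pmcmd startworkflow -sv IntSvc -d Domain -u Administrator -p secret -f MyFolder wf_load
        rm /data/flags/DUMMY.txt
    fi

pmcmd startworkflow is the standard PowerCenter command-line way to kick off a workflow.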