Automatically generate web service parameters - Informatica

I have a scenario where I have to pull data from a web service using the REST Web Service Consumer transformation. For example, the endpoint URL is http://example/2015/Q1. Here I have to parameterise 2015/Q1 as $$DATES, but I cannot change the parameter values manually. I have to design my mapping so that it keeps advancing the dates dynamically across all runs, past to future, without manual intervention. Please suggest a way to do this.

You can have a parent workflow which dynamically creates a script with "pmcmd startworkflow" calls for all the quarters you need. The parent workflow calls the script to invoke the child workflow n times. You also need a table or file holding all the quarters and a flag that says whether each quarter has been processed. In the child workflow (the actual one that you already have) you update the flag and mark the quarter as processed. Each run of the child workflow picks up the first unprocessed quarter and processes it.
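A rough sketch of what the generated script could look like, assuming a quarters file with a processed flag and hypothetical domain, service, folder and workflow names (none of these come from the question):

#!/bin/sh
# Sketch of the generated script: start the child workflow once for every quarter
# that is still flagged as unprocessed. File layout, connection details and the
# workflow name are assumptions.
QUARTER_FILE=/data/params/quarters.txt    # lines like "2015/Q1,N"  (N = not yet processed)
RUNS=$(grep -c ',N$' "$QUARTER_FILE")     # number of unprocessed quarters
i=0
while [ "$i" -lt "$RUNS" ]; do
  # The child workflow picks the first unprocessed quarter itself and flips its flag.
  pmcmd startworkflow -sv INT_SVC -d DOMAIN -u pmuser -p pmpass \
        -f MY_FOLDER -wait wf_child_load_quarter
  i=$((i + 1))
done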
Hope that helps.

Related

How to capture session level details in a table

I have a requirement to capture session-level details like session start time, end time, source success rows, failed rows, etc. in an audit table. As all these details are available in the pre-built session variables, I need to store them in a table. What I am doing now is taking an Assignment task in the workflow, assigning all these pre-built session variable values for a particular session to workflow variables, and passing those workflow variables to mapping variables in another non-reusable session (the mapping which loads the table) using the pre-session variable assignment option. It works fine for a workflow having one session. But if I have to implement this for a workflow having a larger number of sessions, the process becomes tedious, as I have to create an Assignment task for each session and a non-reusable session which calls a mapping to load the audit table.
So I am wondering: is there any alternative solution to get this job done? I am thinking of a solution in which we capture the audit details of all sessions in a file and pass this file as input to a mapping to load the data into the table at once. Is this possible? Any solution?
Check this out: ETL Operational framework
It covers an end-to-end solution that should fit your needs and be quite easy to extend if you have multiple sessions - all you'd need to do is apply similar post-session commands before running the final session that loads the stats into the database.
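A minimal sketch of such a post-session success command (not the exact framework from the article; the stats file path, the field list and the assumption that the built-in $PMWorkflowName / $PMSessionName variables are expanded in the command are all mine):

# Post-session success command: append one line per finished session to a shared
# file; the final session of the workflow then reads this file as a flat-file
# source and loads it into the audit table.
echo "$PMWorkflowName,$PMSessionName,$(date '+%Y-%m-%d %H:%M:%S')" >> /data/audit/session_stats.csv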

How to create a User Task only once a day

We want to collect data during the day and create a User Task once a day. How can that be done with Camunda? Is there a possibility to use process variables, or do we need to access our own database and mark the corresponding items as processed (as soon as we create the daily user task)?
Do we need to create these user tasks programmatically? (We are using an embedded Spring Boot Camunda instance.)
One very good option would be to use a Timer Start Event per the documentation here: https://docs.camunda.org/manual/7.10/reference/bpmn20/events/timer-events/#timer-start-event.
It seems that you may want to use that in conjunction with a Timer Intermediate Catching Event (https://docs.camunda.org/manual/7.10/reference/bpmn20/events/timer-events/#timer-intermediate-catching-event) in something like the following manner:
Start a process instance at a specific time in the morning with the Timer Start Event. Perhaps 6:30AM in your local time zone?
Execute certain steps to gather data, perhaps through external service invocations, etc.
At a specific time (in the afternoon?), create the User Task and present the data. The User Task could follow the Timer Intermediate Catching Event noted above.
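For illustration, the two timer definitions could look roughly like the following BPMN XML fragment (the element ids, the 06:30 cron cycle and the 8-hour wait are assumptions, not requirements from the question):

<startEvent id="morningStart">
  <timerEventDefinition>
    <timeCycle>0 30 6 * * ?</timeCycle>            <!-- fires every day at 06:30 -->
  </timerEventDefinition>
</startEvent>
<intermediateCatchEvent id="waitForAfternoon">
  <timerEventDefinition>
    <timeDuration>PT8H</timeDuration>              <!-- wait until ~14:30 before the User Task -->
  </timerEventDefinition>
</intermediateCatchEvent>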
I hope this helps!

Use parameter as web service input for Informatica mapping

I have a WCF web service that takes a start and end date as input, and returns a record set. What I'd like to do is set up an Informatica mapping that creates variables for the date from one week ago and today's date. These are used as input for the Web Service Consumer transformation or a web service source (whichever will work), but I'm not sure how to go about this. I can't create an Expression transformation with no inputs, and I don't see how to set a mapping parameter as input.
The only two ways I can think about doing this would be to either build an app that creates a flat file with both dates, or to build a database object that supplies the dates as a source. I'd rather not have a separate outside source to provide these values, but I can't think of another way.
If you need those variables set before the mapping runs, use an Assignment task in the workflow and the Pre-session variable assignment option to set the values for the mapping before it runs.
There is no way to do this with Informatica v9.6.1. A source has to be created in order to feed the web service. I ended up creating a dummy record with one field, using it as input, then disregarding the input and setting up the variable output using an Expression transformation.
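If the flat-file route mentioned in the question is acceptable instead, a small pre-session script can also produce the two dates; a minimal sketch, assuming a hypothetical file location and date format:

#!/bin/sh
# Write last week's date and today's date to a one-line file that the mapping
# reads as a flat-file source. Path, file name and format are assumptions.
START_DATE=$(date -d '7 days ago' '+%Y-%m-%d')   # GNU date; adjust for other platforms
END_DATE=$(date '+%Y-%m-%d')
echo "${START_DATE},${END_DATE}" > /data/params/ws_dates.txt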

API Gateway generating 11 sql queries per second on REG_LOG

We have sysdig running on our WSO2 API Gateway machine and we notice that it fires a large number of SQL queries to the database for a minute, then waits a minute and repeats.
Every minute it goes wild, waits for a minute, and goes wild again with a query of the following format:
SELECT REG_PATH, REG_USER_ID, REG_LOGGED_TIME, REG_ACTION, REG_ACTION_DATA
FROM REG_LOG
WHERE REG_LOGGED_TIME>'2016-02-29 09:57:54'
AND REG_LOGGED_TIME<'2016-03-02 11:43:59.959' AND REG_TENANT_ID=-1234
There is no load on the server. What is causing this? What can we do to avoid this?
(screenshot: sysdig output for the API Gateway process)
This particular query is the result of the registry indexing task that runs in the background. The REG_LOG table is queried periodically to retrieve the latest registry actions. The indexing task cannot be stopped, but you can configure its frequency through the following parameter in registry.xml. See [1] for more information.
indexingFrequencyInSeconds
If this table fills up, you can clean the data using a simple SQL query. However, when deleting the records, be careful not to delete all the data: the latest record of each resource path should be left in the REG_LOG table, since reindexing requires at least one reference to each resource path.
Also, if required, you can take a dump of the REG_LOG table before clearing it up, in case you do not want to lose the old records. Hope this answer provides the information you require.
[1] - https://docs.wso2.com/display/Governance510/Configuration+for+Indexing
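A rough sketch of the backup and cleanup described above, assuming a MySQL registry database and hypothetical credentials; adapt the syntax to your actual RDBMS and test on a copy first:

# 1. Keep a dump of REG_LOG before cleaning it up.
mysqldump -u wso2user -p wso2reg_db REG_LOG > reg_log_backup.sql

# 2. Delete older entries but keep the latest record per resource path, so the
#    indexing task still has at least one reference to every resource.
mysql -u wso2user -p wso2reg_db <<'SQL'
DELETE rl FROM REG_LOG rl
JOIN (
  SELECT REG_PATH, REG_TENANT_ID, MAX(REG_LOGGED_TIME) AS MAX_TIME
  FROM REG_LOG
  GROUP BY REG_PATH, REG_TENANT_ID
) latest
  ON  rl.REG_PATH        = latest.REG_PATH
  AND rl.REG_TENANT_ID   = latest.REG_TENANT_ID
  AND rl.REG_LOGGED_TIME < latest.MAX_TIME;
SQL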

Is it possible to trigger an Informatica workflow using data in a database table?

I am a newbie in ETL and will be using Informatica soon for one of the requirements we have.
The requirement is that Informatica needs to monitor a table in Oracle for certain "trigger data" and as soon as that data is available in that table, Informatica should start executing steps in its workflow.
Is it possible to do this? If yes, could someone please point me to a link/document where this is explained.
Many thanks.
No, it is not possible (checked in PowerCenter 9.5.1).
The Event-Wait task supports only two types of events:
predefined events (the task instructs the Integration Service to wait for the specified indicator file to appear before continuing),
user-defined events (the event is triggered by an Event-Raise task somewhere in the workflow).
Yes, it is possible, and you will need a script that can be created with the following steps (a sketch follows the list):
--Create a shell script that checks whether data is present in the table or not; you can do this just by taking a count of the table.
--If the count is greater than zero, create an empty file, say DUMMY.txt (using the touch command), at a specified path.
--In your Informatica scheduling (either via the scheduler or via a script), check every 5 minutes whether the file is present.
--If the file is present, call your Informatica workflow and delete the DUMMY file.
--Once the workflow is completed, start the process again.
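A minimal sketch of that monitoring script, assuming an Oracle table named trigger_table queried via sqlplus, plus hypothetical connection, folder and workflow names:

#!/bin/sh
# Poll the Oracle table; when data is found, create the indicator file, start the
# Informatica workflow, then clean up and wait for the next cycle. Table name,
# credentials and repository/folder/workflow names are all assumptions.
FLAG_FILE=/data/flags/DUMMY.txt
while true; do
  CNT=$(sqlplus -s user/password@ORCL <<'SQL'
set heading off feedback off pagesize 0
select count(*) from trigger_table;
exit
SQL
)
  CNT=$(echo "$CNT" | tr -d '[:space:]')   # strip whitespace from the sqlplus output
  if [ "${CNT:-0}" -gt 0 ]; then
    touch "$FLAG_FILE"                     # the DUMMY indicator file described above
    pmcmd startworkflow -sv INT_SVC -d DOMAIN -u pmuser -p pmpass \
          -f MY_FOLDER -wait wf_process_trigger_data
    rm -f "$FLAG_FILE"                     # delete DUMMY once the workflow has run
  fi
  sleep 300                                # check again in 5 minutes
done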