How can we use an advanced date filter (SYSDATE) in the Source while reading data in Informatica IICS?

I am unable to use advanced filters in the Source of Informatica before reading the data. I want to compare a field with SYSDATE, so I went to the advanced filters in the Source. SYSDATE is a predefined system variable in Informatica, so I tried tablename.field=SYSDATE and tablename.field=$$SYSDATE, but neither of them works. Here is the screenshot of the Source filter definition.
Please help: how can I compare a field with SYSDATE?

You need to define a parameter, e.g. SYSDATE, and then refer to the parameter value by putting $$SYSDATE in the filter - almost like you do, but there has to be a parameter declared and a value defined for that parameter. Otherwise this is just a comparison to the string SYSDATE, not the desired date value.
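As a rough sketch (table, field, and parameter names here are placeholders, not from the original question): declare a mapping parameter named SYSDATE, assign it a value in the mapping task or parameter file, and then reference it in the advanced filter:

```
-- parameter declaration (mapping Parameters panel); names are examples
Name: SYSDATE
Type: string

-- advanced filter expression on the Source
MYTABLE.MY_DATE_FIELD = $$SYSDATE
```

At runtime, $$SYSDATE is replaced by the parameter's value before the filter is applied, so the value must resolve to something the source database can compare against the date field.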

Related

Dataprep change str yyyymmdd date to datetime column

I have a column with dates (in a string format) in Dataprep: yyyymmdd. I would like it to become a datetime object. Which function/transformation should I apply to achieve this result automatically?
In this case, you actually don't need to apply a transformation at all—you can just change column type to Date/Time and select the appropriate format options.
Note: This is one of the least intuitive parts of Dataprep as you have to select an incorrect format (in this case yy-mm-dd) before you can drill-down to the correct format (yyyymmdd).
Here's a screenshot of the Date / Time type window to illustrate this:
While it's unintuitive, this will correctly treat the column as a date in future operations, including assigning the correct type in export operations (e.g. BigQuery).
Through the UI, this will generate the following Wrangle Script:
settype col: YourDateCol customType: 'Datetime','yy-mm-dd','yyyymmdd' type: custom
According to the documentation, this should also work (and is more succinct):
settype col: YourDateCol type: 'Datetime','yy-mm-dd','yyyymmdd'
Note that if you absolutely needed to do this in a function context, you could extract the date parts using SUBSTRING/LEFT/RIGHT and pass them to the DATE or DATETIME function to construct a datetime object. As you've probably already found, DATEFORMAT will return NULL if the source column isn't already of type Datetime.
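For illustration only (an untested sketch - the exact Wrangle function signatures and argument order may differ between Dataprep versions), the function-based approach could look something like:

```
derive value: DATE(LEFT(YourDateCol, 4), SUBSTRING(YourDateCol, 4, 2), RIGHT(YourDateCol, 2)) as: 'ParsedDate'
```

where the three inner calls pull the year, month, and day parts out of the yyyymmdd string and DATE assembles them into a date value.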
(From a performance standpoint, though, for a large dataset it would probably be far more efficient to just change the column type or create a new column with the correct type, rather than performing those extra operations on so many rows.)

Set Mapping variable in Expression and use it in Source Filter

I have two tables in different databases. Table A holds the data; table B holds the information needed for an incremental load of the data from table A. I want to read from table B and store the date of the last successful load from table A in a mapping variable $$LOAD_DATE. To achieve this, I read a date from table B and use the SETVARIABLE() function in an expression to set the $$LOAD_DATE variable. The port in which I do this is marked as output and writes into a dummy flat file. I only read one row from this source!
Then I use this $$LOAD_DATE variable in the Source Filter of the Source Qualifier of table A to load only records which are newer than the date stored in the $$LOAD_DATE variable.
My problem is that I am not able to set the $$LOAD_DATE variable correctly. It is always the date 1753-1-1-00.00.00, which is the default value for mapping variables of the type date/time.
How do I solve this? How can I store a date in that variable and use it later in a Source Qualifiers source filter? Is it even possible?
EDIT: Table A has too many records to read them all and filter them later. That would be too expensive, so they have to be filtered at the source filter level.
Yes, it's possible.
In the first map you have to initialize the variable, like this:
In first session configuration you have to define the Post-session on success variable assignment:
The second map (with your table A) will get the variable after this configuration of the session in Pre-session variable assignment:
It will work.
It is not possible to set a mapping variable and use its value somewhere else in the same run, because the variable is only actually set when the session completes.
If you really want to implement it using mapping variables, you have to create two mappings: one for setting the mapping variable and another for the actual incremental load. You can pass a mapping variable value from one session to another in a workflow using a workflow variable: https://stackoverflow.com/a/26849639/2626813
Other solutions could be to use a lookup on B and a filter after that.
You can also write a script that queries table B and updates the parameter file with the latest $$LOAD_DATE value prior to executing the mapping.
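For reference, a PowerCenter parameter file entry for a session-level mapping variable looks roughly like this (folder, workflow, and session names below are placeholders, and the date must match the session's date/time format):

```
[MyFolder.WF:wf_incremental_load.ST:s_m_load_table_a]
$$LOAD_DATE=01/15/2024 00:00:00
```

A pre-session script could query table B and rewrite this file before each run, so the Source Filter sees the latest load date.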
Since there are two different DBs, use two sessions: get the value in the first one and pass the parameter to the second one.

libpq: get data type

I am writing a C++ project that uses the PostgreSQL database.
I created a table in my database with a column of type character varying(40).
Now I need to SELECT this data FROM the table in my C++ project. I know that I should use the libpq library, the C/C++ interface to PostgreSQL.
I have succeeded in selecting data from the table. Now I am wondering whether it's possible to get the data type of a column of this table. For example, here I want to get character varying(40).
You need to use PQftype.
As described here: http://www.idiap.ch/~formaz/doc/postgreSQL/libpq-chapter17861.htm
And just take a look here about decoding return values: http://www.postgresql.org/message-id/da7021e0608040738l3b0880a1q5a76b838937f8c78#mail.gmail.com
You can also use PQfsize to get the field size (note that it returns -1 for variable-length types such as varchar).

How do I rename a SharePoint file to include a date using Nintex

I'm trying to use two workflows to archive any files when they are created or updated. The first simply moves a copy to a separate document library. No issues there.
The second should rename the file once it arrives, appending a date (and possibly a timestamp) to the end of the file name so that it is a unique record.
I am setting a variable called Archive_Name and then setting the field value to Archive_Name before committing the change.
I am using this formula to set the variable:
Name-fn-FormatDate(Current Date,yyyy-MM-dd)
Both Name and Current Date are recognized variables.
When I run this the Name stays the same and does not append a date. If I run it as
fn-FormatDate(Current Date,yyyy-MM-dd)
the Name changes to my desired date, proving that the formula is working, the text is being assigned to the Archive_Name variable, and the variable is being applied to the field value.
What am I doing wrong?
I believe you need to concatenate the two variables. The & operator can be used in place of the CONCATENATE function, thus: Name&fn-FormatDate(Current Date,yyyy-MM-dd)
Hope this helps

In RapidMiner, once I import a data set, how do I change the type of a column?

I've imported a dataset into RapidMiner 5 and one of the columns that was supposed to be nominal or polynominal was set as numeric. My data set has over 500 attributes, so I don't really want to reimport my data every time I realize I've made a mistake like this. Is there some way to either automate the import process so that it saves the column types I set each time, or can I go back and edit the attribute types of my already imported data set?
Add this operator to your process, after you load the data:
Data Transformation > Type Conversion > Numerical to Polynominal
on the operator, select
attribute filter type = single
attribute = [name of your attribute]
here you go: http://i.stack.imgur.com/ov5yn.png
Select "Numerical to Polynominal".
Then change "attribute filter type" to 'subset' and select the attributes you want to change.
One more suggestion: you'd better store this output in your local repository so you don't need to redo the conversion every time you need the data. That way you will have both the original and the converted data set in your repository. :)
Happy Data Mining...
Apply the 'Set Role' operator:
It's listed under Operators -> Data Transformation -> Name and Role Modification -> Set Role