How to extract values from a struct data type in an Informatica BDM object?

I have an object created in Informatica BDM which has a struct data type port. Here is how the struct column is made up:
filter_column struct<name:string, value:string, data_type:string>
I want to extract the name element from the filter_column port, so I created an Expression transformation with an output port for it. In the expression I am trying to extract the name element, but it fails with a validation error.
Does anyone know how to extract the elements from a struct?

Hi Alex, you need to extract the struct column first. Right-click the source data object and choose Hierarchical Conversions -> Extract from Complex Ports.
Try this and let me know if it helps!
There is also a video for reference:
https://network.informatica.com/videos/2829

Related

ColumnsofType Record not returning any columns

I've got a table full of different data types, including records, and I want to extract the column names of all record columns to then use in an expand function. However, when I use = Table.ColumnsOfType(#"Expanded fields", {type record}), it returns an empty list.
I've looked through the entire column to see if there was anything different, but it's all record types. Any help please.
EDIT:
Error using Table.TransformColumnTypes
Record is not a valid type to search for. And judging by your image, your type is Type.Any, as denoted by the ABC123 icon.
Your best bet is to unpivot all the columns (perhaps those starting with a certain prefix), then expand the new Value column like so:
#"PriorStepNameHere" = .... ,
ExpandList = List.Distinct(List.Combine(List.Transform(Table.Column(#"PriorStepNameHere", "Value"), each if _ is record then Record.FieldNames(_) else {}))),
Expand = Table.ExpandRecordColumn(#"PriorStepNameHere", "Value", ExpandList, ExpandList)
It sounds like Table.ColumnsOfType is not identifying the columns in your table that contain records. One possible reason is that the column's data type is not actually set to record. Another possible reason is that the data in the column is not structured properly, so it is not being recognized as a record. You can try using Table.TransformColumnTypes to convert the column's data type to record and see if that resolves the issue.
If the issue still persists, please share the sample data and the code you are using.

IF expression in Azure Data Factory

I have a JSON file which is being used as a data source.
My issue is that when mapping the data into a CSV file, I noticed the column documentConfigId has some values that are wrapped in an object and some that are not (example below).
Without
"documentConfigId": "12345678"
With
"documentConfigId": {
"$ref": "12345678"
}
Is there any way to create dynamic content that uses the first mapping solution for rows which do not have the inner reference node, but switches to the second solution when the node is present?
This is what it currently shows in Excel when I just use the first mapping solution.
I tested this, and in my experience I'm afraid we can't do that in Data Factory.
HTH.

Wireshark Dissector VoidString type

I am working on a Wireshark dissector generator for a senior project. I have done some reading but had a question about the VoidString argument of the ProtoField object. The documentation wasn't too clear on this particular value or what it's used for.
Our generator uses C++ so that our client can modify it after the project is complete. I read in another thread here that it can be passed a table of key-value pairs. Are there other structures or kinds of information this parameter is used for? We're trying to design a data structure to hold the parse of a file passed by the user, and we're trying to determine how best to model this object. Would it be better to allow a template object to be passed here instead, or is the table sufficient?
I'm not sure I understand your needs, but according to the Wireshark source code (wslua_proto_fields.c), the definition of the VoidString parameter is:
#define WSLUA_OPTARG_ProtoField_new_VALUESTRING 4 /* A table containing the text that
corresponds to the values, or a table containing unit name for the values if base is
`base.UNIT_STRING`, or one of `frametype.NONE`, `frametype.REQUEST`, `frametype.RESPONSE`,
`frametype.ACK` or `frametype.DUP_ACK` if field type is ftypes.FRAMENUM. */
So the table will be "cast" according to the field type and printed in the base representation.
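In practice, the most common use of that argument is a plain valuestring table mapping raw field values to display text. A minimal sketch (protocol and field names here are made up for illustration):

```lua
-- Hypothetical dissector snippet: the 4th argument to ProtoField.new
-- (the VoidString parameter) is a Lua table mapping raw values to the
-- text Wireshark displays next to the decoded value.
local status_names = { [0] = "OK", [1] = "Error", [2] = "Retry" }

local myproto  = Proto("myproto", "Example Protocol")
local f_status = ProtoField.new("Status", "myproto.status",
                                ftypes.UINT8, status_names, base.DEC)
myproto.fields = { f_status }
```

For your generator, a simple map of integer keys to strings is sufficient for this case; the alternative forms (unit-name tables for base.UNIT_STRING, or the frametype constants for ftypes.FRAMENUM) are distinct enough that a tagged variant type may model the parameter better than one template object.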

Set Mapping variable in Expression and use it in Source Filter

I have two tables in different databases. Table A holds the data; table B holds information for the incremental load of the data from table A. I want to read from table B and store the date of the last successful load of table A in a mapping variable $$LOAD_DATE. To achieve this, I read a date from table B and use the SETVARIABLE() function in an Expression to set the $$LOAD_DATE variable. The port in which I do this is marked as output and writes into a dummy flat file. I only read one row of this source!
Then I use this $$LOAD_DATE variable in the source filter of the Source Qualifier of table A to only load new records which are younger than the date stored in $$LOAD_DATE.
My problem is that I am not able to set the $$LOAD_DATE variable correctly. It always holds 1753-1-1-00.00.00, which is the default value for mapping variables of type date/time.
How do I solve this? How can I store a date in that variable and use it later in a Source Qualifier's source filter? Is it even possible?
EDIT: Table A has too many records to read them all and filter them later. That would be too expensive, so they have to be filtered at the source filter level.
Yes, it's possible.
In the first mapping you have to initialize the variable.
In the first session's configuration you have to define a post-session on success variable assignment.
The second mapping (with your table A) will then receive the variable through the pre-session variable assignment in its session configuration.
It will work.
It is not possible to set a mapping variable and use its value somewhere else in the same run, because the variable is actually set when the session completes.
If you really want to implement it using mapping variables, you have to create two mappings: one for setting the mapping variable and another for the actual incremental load. You can pass a mapping variable value from one session to another in a workflow using a workflow variable. https://stackoverflow.com/a/26849639/2626813
Other solutions could be to use a lookup on B and a filter after that.
You can also write a script that queries table B and updates the parameter file with the latest $$LOAD_DATE value prior to executing the mapping.
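For the script-based approach, the parameter file entry would look something like this (the folder, workflow, and session names here are placeholders for your own):

```
[MyFolder.WF:wf_incremental_load.ST:s_m_load_table_a]
$$LOAD_DATE=01/01/2024 00:00:00
```

The script rewrites the $$LOAD_DATE line with the latest date from table B before each run, so the Source Qualifier filter on table A always sees the current watermark.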
Since there are two different DBs, use two sessions: get the value in the first one and pass it as a parameter to the second one.

libpq: get data type

I am working on a C++ project with a PostgreSQL database.
I created a table in my database with a column of type character varying(40).
Now I need to SELECT this data FROM the table in my C++ project. I know that I should use the libpq library, the C/C++ interface to PostgreSQL.
I have succeeded in selecting data from the table. Now I am wondering whether it's possible to get the data type of this column, i.e. to recover character varying(40).
You need to use PQftype.
As described here: http://www.idiap.ch/~formaz/doc/postgreSQL/libpq-chapter17861.htm
And take a look here about decoding return values: http://www.postgresql.org/message-id/da7021e0608040738l3b0880a1q5a76b838937f8c78@mail.gmail.com
Note that to recover the declared length you need PQfmod rather than PQfsize: PQfsize returns the internal storage size, which is -1 for variable-length types like varchar.