Inheritance of Pentaho kettle.properties logging in sub-transformations - Kettle

I have set up logging of Pentaho jobs and transformations to a database.
This works fine provided I define the settings in every job's and every transformation's individual log settings dialogue.
I see that I can configure the kettle.properties file to hold these values.
However, I can't get a transformation to inherit them automatically when it is called by a job. I assume that if they are defined in the properties file they should just be inherited and work.
Any ideas on what I am missing?
Thanks
(MS Windows environment with MS SQL Server; we don't have Pentaho Enterprise.)

You can do it by adding the entries below to the "kettle.properties" file.
# kettle logging properties
KETTLE_TRANS_LOG_DB=
KETTLE_TRANS_LOG_SCHEMA=
KETTLE_TRANS_LOG_TABLE=etl_trans_log
KETTLE_JOB_LOG_DB=
KETTLE_JOB_LOG_SCHEMA=
KETTLE_JOB_LOG_TABLE=etl_job_log
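For example, assuming a database connection named "logging_db" that every job and transformation can resolve (e.g. defined as a shared connection in shared.xml) and a SQL Server "dbo" schema (both names are illustrative, not from the original post), the filled-in entries might look like:

# hypothetical values for illustration only
KETTLE_TRANS_LOG_DB=logging_db
KETTLE_TRANS_LOG_SCHEMA=dbo
KETTLE_TRANS_LOG_TABLE=etl_trans_log
KETTLE_JOB_LOG_DB=logging_db
KETTLE_JOB_LOG_SCHEMA=dbo
KETTLE_JOB_LOG_TABLE=etl_job_log

The *_LOG_DB values must match the name of a connection the job or transformation can actually see, which is why a shared connection is assumed here.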

OK, so I have found that provided I set the properties file on the machine and then set each transformation (by right-clicking and giving each log the connection), when I call the job it all logs correctly.
So you need the database connection in all transformations, and you need to set this as the default in the logging tab.
I think this is right anyway, unless someone else has a shortcut.


TEIID: importing DDL into VDB DDL

Currently my VDB DDL file is getting quite big. I want to split it into different files using the following:
IMPORT FROM REPOSITORY "DDL-FILE"
INTO test OPTIONS ("ddl-file" '/path/to/schema1.ddl')
However, this does not seem to work.
Can the DDL file path be relative? How?
Can the schema test be VIRTUAL?
Does "DDL-FILE" refer to "ddl-file"?
What should I put in my main VDB DDL and what should I put in my extra DDLs? Should the extra DDLs contain server configuration details, or should they be defined as a VDB?
I would like to see a working example of how to use this.
This will be used in a Teiid Spring Boot project where you can only load one main VDB file. It is not workable to have one very large DDL file.
I tried multiple approaches, but it does not seem to work, either giving me a null pointer with no error codes or error codes that tell me nothing.
Also the syntax in Teiid 9.3 seems different:
IMPORT FOREIGN SCHEMA public
FROM REPOSITORY DDL-FILE
INTO test OPTIONS ("ddl-file" '/path/to/schema.ddl')
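For reference, a minimal sketch of how a main VDB DDL importing from the DDL-FILE repository is typically laid out (the VDB name my_vdb and the file path are illustrative assumptions, not verified against a running Teiid instance):

-- hypothetical main VDB DDL; names are placeholders
CREATE DATABASE my_vdb VERSION '1';
USE DATABASE my_vdb VERSION '1';
-- a virtual schema to receive the imported definitions
CREATE VIRTUAL SCHEMA test;
IMPORT FROM REPOSITORY "DDL-FILE"
INTO test OPTIONS ("ddl-file" '/path/to/schema1.ddl');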
This feature is currently not implemented in Teiid Spring Boot. This issue is captured in https://issues.redhat.com/browse/TEIIDSB-219
Update: I added the needed code to master; it should be available with the 1.7 release. Meanwhile, you can build the master branch and test it out.

DB2 on Cloud. Change the configuration STRING_UNITS to CODEUNITS32

I need some help with my DB2 instance on Cloud. I need to execute a command to change a system property; for example, I need to change the configuration STRING_UNITS to CODEUNITS32, but I can't do it using IBM Data Server Driver (IBM console). I would be thankful for some help. Thank you very much.
string_units is configurable online:
https://www.ibm.com/support/knowledgecenter/SSEPGG_11.5.0/com.ibm.db2.luw.admin.config.doc/doc/r0060936.html
so if you have SYSADM, SYSCTRL, or SYSMAINT authority on the instance, you could run:
call admin_cmd('UPDATE DB CFG USING STRING_UNITS CODEUNITS32 IMMEDIATE')
https://www.ibm.com/support/knowledgecenter/SSEPGG_11.5.0/com.ibm.db2.luw.sql.rtn.doc/doc/r0023593.html
The change will only take effect on new connections; existing connections would need to re-connect to pick up the new default.
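To check that the change took effect, one option (an addition of mine, not from the original answer) is to query the SYSIBMADM.DBCFG administrative view:

-- verify the current and deferred values of string_units
SELECT NAME, VALUE, DEFERRED_VALUE FROM SYSIBMADM.DBCFG WHERE NAME = 'string_units';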
If you don't have one of those authorities, then you can change the NLS_STRING_UNITS global variable at a session level https://www.ibm.com/support/knowledgecenter/SSEPGG_11.5.0/com.ibm.db2.luw.sql.ref.doc/doc/r0060917.html
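A session-level override would then look something like this (a sketch based on the linked documentation; the variable lives in the SYSIBMADM schema):

-- affects only the current connection
SET SYSIBMADM.NLS_STRING_UNITS = 'CODEUNITS32';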

WSO2 Siddhi RDBMS Store Extension - how to set batchEnable to false

I'm using Siddhi to create an app which also interacts with a PostgreSQL DB. Although I'm not sure, I believe there is a bug when making multiple updates on the same PG table within a single event (i.e. upon receiving an event, update a record in the table, and then create another one in the same table); it seems the batch updates are causing some problems. So, I just want to give it a try after disabling batchUpdate (it is enabled by default). I just don't know how to configure it using siddhi-sdk (via the IntelliJ plugin). There are two related tickets:
https://github.com/wso2-extensions/siddhi-store-rdbms/issues/43
https://github.com/wso2/product-sp/issues/472
Until these are documented, I'd like to get a quick response on how to set these fields.
Best regards...
When batchEnable is set to true, the extension performs insert/update operations on batches of events instead of performing those operations on each and every single event. Simply put, this has been introduced to improve performance.
The default value of this parameter is currently set to "true".
However, the batchEnable configuration is done through a system parameter called "{{RDBMS-Name}}.batchEnable", which has to be configured in the WSO2 Stream Processor's deployment.yaml.
If you want to override this property in Product-SP, please find the steps below.
Open the deployment.yaml file located in {Product-SP-Home}/conf/editor/
Insert the following lines in the file.
siddhi:
  extensions:
    extension:
      name: store
      namespace: rdbms
      properties:
        PostgreSQL.batchEnable: true
But currently there is no way to overwrite those system configurations from the Siddhi app level. Since you are using the SDK, what you can do is change the default value of the above parameter to "false".
Please find the steps below to do it.
Find the siddhi-store-rdbms-4.x.xx.jar file in the Siddhi SDK. This is located in {siddhi-sdk-home}/lib/.
Open the jar file using an archive manager and open the rdbms-table-config.xml file located inside it with a text editor.
Set false in the <batchEnable>true</batchEnable> element under the <database name="PostgreSQL"> tag and save it.
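For orientation, the relevant part of rdbms-table-config.xml should look roughly like this after the edit (a sketch; surrounding elements are omitted and not verified against your exact jar version):

<!-- inside rdbms-table-config.xml: PostgreSQL section with batching disabled -->
<database name="PostgreSQL">
    <!-- other settings unchanged -->
    <batchEnable>false</batchEnable>
</database>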
Thanks Raveen. With a simple dash (-) before "extension" I was able to set the config:
siddhi:
  extensions:
    - extension:
        name: store
        namespace: rdbms
        properties:
          PostgreSQL.batchEnable: false

Writing a flat file, permission denied

I have got this issue:
WRT_8004
Writer initialization failed [Error opening session output file [/*/diff_zipcode1.out] [error=Permission denied]].
Writer terminating.
The user for Informatica has the right to write in this specific folder (I tried a touch directly and it worked), but I still get this error.
The only way for this workflow to work is to set write permission for everyone...
So I was wondering: does Informatica use another user than the one who launches the Informatica server, such as my user in Informatica? And if this is the case, how can I set the right permissions to write to my folder?
Answer to my situation: I changed the settings of the Informatica user after I had launched the Informatica server, so the modification hadn't really taken effect from Informatica's point of view. To fix this problem, I only had to reboot the Informatica server.
Informatica will use whichever user has logged in to PowerCenter to create the file.
If you do not want to set full permissions on your folder, it would be best to add that user to a group and grant write permission to the group only.
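On Linux/Unix, the group-based setup could look like the following sketch (the group name, user name, and directory are hypothetical, since the original error path is masked; run as root):

# create a group, add the Informatica service user (hypothetical name) to it,
# and give the group write access to the output directory
groupadd infa_writers
usermod -aG infa_writers infa_user
chgrp infa_writers /data/informatica/output
chmod 775 /data/informatica/output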

How to programmatically dump Launch Services database?

How can I programmatically dump/query the Launch Services database in macOS (i.e. the analog of the command lsregister -dump)?
EDIT: I want to get the set of associations UTI -> Bundle_IDs. Using LSCopyAllRoleHandlersForContentType does not always work (there is a similar trouble reported elsewhere), therefore I concluded that the best working method is parsing the output of "lsregister -dump", but the location of lsregister changes from version to version.
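One workaround sketch for the moving location (an assumption on my part, not a supported API: lsregister is a private tool, and this just searches for it inside CoreServices.framework at runtime before parsing the dump):

# locate lsregister wherever the current macOS version keeps it, then dump
LSREGISTER=$(find /System/Library/Frameworks/CoreServices.framework \
    -type f -name lsregister 2>/dev/null | head -n 1)
"$LSREGISTER" -dump > /tmp/lsregister_dump.txt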