Two questions on the WSO2 BAM 2.5 output event adaptor: 1) Why is there no "email" option among the output event adaptor types? According to the documentation, it should be there. Even if I create my own XML file for the email event adaptor and drop it into the required folder, the "email" type is not recognized and BAM shows the adaptor as "inactive". 2) Which directory and file does the default logger output event adaptor write its logs to? I have configured it, and I can see that the messages are generated by the Hive scripts and written to the BAMNotifications column family, but I cannot see them in the log files under the repository/logs directory. Please help.
1) This issue is caused by an OSGi loading problem in the SOAP output adapter (it causes some other output adapters to fail as well). We have fixed this in the next BAM version. For the moment, to work around the issue, please remove the SOAP output adapter jar from the plugins directory, restart the server, and continue.
2) It should go to the wso2carbon.log file. Can you please verify the log4j properties?
When I evaluate a simple expression like wso2:vault-lookup('my-token'), it returns empty, even though my-token is present in the vault.
Has anyone encountered this problem before? Is there any workaround you can suggest?
EI version: 6.2.0
Can you try the following approach?
1. Run ciphertool.sh with the following command:
bin/ciphertool.sh -Dorg.wso2.CipherTransformation=RSA/ECB/OAEPwithSHA1andMGF1Padding
2. Enter the plain-text value you need to encrypt and copy the encrypted value.
3. Navigate to the Carbon console and expand the registry Browse section.
4. Go to the following path, which is where the registry holds the secure vault properties and their values:
/_system/config/repository/components/secure-vault
5. Inside the secure vault, create a new property with a name and paste the encrypted value you obtained in the first step.
6. Try to read the property from the mediation sequence:
wso2:vault-lookup('prop-name')
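Once the property exists, the lookup can be exercised from a minimal sequence. A rough sketch, assuming a vault property named prop-name (the sequence name and log settings here are illustrative, not part of the original answer):

```xml
<!-- Minimal test sequence: logs the value resolved from the secure vault.
     "prop-name" is a placeholder; replace it with your property name. -->
<sequence xmlns="http://ws.apache.org/ns/synapse" name="VaultLookupTest">
    <log level="custom">
        <!-- vault-lookup resolves the registry-stored encrypted value at runtime -->
        <property name="secretValue" expression="wso2:vault-lookup('prop-name')"/>
    </log>
</sequence>
```

If the log prints an empty value here too, the problem is in the vault configuration rather than in the calling sequence.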
Since you have confirmed that this works as expected in a vanilla EI server, can you compare the configuration in the secret-conf.properties file, located in the [EI_HOME]/conf/security directory, between your existing server and the vanilla pack (where this works)?
Unable to view the table contents in Tibco Spotfire Analytics Explorer.
I have copied the Hive EMR jar files to the library folder on the Tibco server. Once the server was up, I tried to configure the data source in Information Designer. I am able to set up the data source and I can see the schema, but when I try to expand a table, I get no results.
This is the error message I am getting:
Error message: An issue occurred while creating the default model. It may be partially constructed.
The data source reported a failure.
InformationModelException at Spotfire.Dxp.Data:
Error retrieving metadata: java.lang.NullPointerException (HRESULT: 80131500)
The following is the URL template in Information Designer:
jdbc:hive2://<host>:<port10000>/<database>
which I changed to:
jdbc:hive2://sitename:10000/db_name
Please let me know what needs to be changed in the driver configuration, or anywhere else, so that I can see the contents of the table.
My technical requirements are the following:
Poll CSV files
Read data line by line
Transform data to the desired format
Convert to JSON/XML
Publish data through REST/JMS
Deploy on WSO2 EI 6.1.1
How is this possible with WSO2 Developer Studio (DS) Tooling 3.8.0?
I know inbound endpoints, mediators, sequences, proxy services, etc. can be used, but I can't find a single document or article that shows how to do this end to end.
Where do I start? How do I execute these steps sequentially? The artifacts are created independently, but I don't know how to wire them together into an integration flow.
I'd appreciate it if someone could shed some light.
Solution:
Create an empty ESB Solution Project.
Create a proxy service.
Use a Smooks config for transformation of the flat CSV data to XML format.
Create an endpoint for producing JMS messages to a JMS queue in ActiveMQ.
Use the Data Mapper mediator if further transformation is required.
Use the Log mediator for logging.
Use the Property mediator for setting endpoint-related properties.
Configure axis2.xml and axis2Client.xml to enable the transport settings on EI 6.1.1.
Export to a CAR file and deploy it via the EI 6.1.1 management console.
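The steps above can be sketched as a single VFS-polling proxy. This is a rough outline only: the proxy name, file paths, registry key, and endpoint name are assumptions, and the Smooks configuration itself would live as a separate registry resource.

```xml
<!-- Hypothetical proxy: polls CSV files via the VFS transport, converts
     them to XML with Smooks, logs the result, and sends it to a JMS endpoint. -->
<proxy xmlns="http://ws.apache.org/ns/synapse" name="CsvToJmsProxy" transports="vfs">
    <target>
        <inSequence>
            <!-- "smooks-csv-config" is an assumed registry key for the Smooks config -->
            <smooks config-key="smooks-csv-config">
                <input type="text"/>
                <output type="xml"/>
            </smooks>
            <log level="full"/>
            <!-- "JmsQueueEndpoint" is an assumed endpoint artifact pointing at ActiveMQ -->
            <send>
                <endpoint key="JmsQueueEndpoint"/>
            </send>
        </inSequence>
    </target>
    <parameter name="transport.vfs.FileURI">file:///data/csv-in</parameter>
    <parameter name="transport.vfs.ContentType">text/plain</parameter>
    <parameter name="transport.PollInterval">15</parameter>
</proxy>
```

Packaging this proxy, the endpoint, and the Smooks config into one Composite Application (CAR) project is what ties the independently created artifacts into a single deployable flow.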
Happy Testing!!
Is there a way to flush the log in Sitecore so that entries are written immediately? It's for production debugging.
Also, scrolling through the log files, there are a number of them, e.g. log.date.txt and log.date.time.txt. Which one is the latest, i.e. the one with or without the time?
You can use the following module on a production server if you have remote access there:
https://marketplace.sitecore.net/Modules/S/Sitecore_Log_Analyzer.aspx
Another option is to use this module:
https://marketplace.sitecore.net/Modules/S/Sitecore_ScriptLogger.aspx
The log with no timestamp in the file name is the first one for that day.
A new log file is created each time the application pool restarts.
If you haven't changed any of the default log4net settings, the initial log file will be named in the format log.yyyyMMdd.txt, and each subsequent restart will generate a new file in the format log.yyyyMMdd.HHmmss.txt.
The latest log file for the day will be the file with the latest timestamp.
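On the flushing part of the question: log writing is controlled by the log4net appender section in web.config. A hedged sketch of roughly what that section looks like in a stock install (verify the element names against your own web.config; immediateFlush is a standard log4net FileAppender option shown here as a possible tweak, not a Sitecore-documented setting):

```xml
<!-- Approximation of Sitecore's default log4net appender in web.config -->
<appender name="LogFileAppender"
          type="log4net.Appender.SitecoreLogFileAppender, Sitecore.Logging">
    <!-- {date} expands to yyyyMMdd; app-pool restarts on the same day
         produce the additional HHmmss-suffixed files described above -->
    <file value="$(dataFolder)/logs/log.{date}.txt"/>
    <!-- assumed tweak: flush each entry to disk as soon as it is written -->
    <immediateFlush value="true"/>
    <layout type="log4net.Layout.PatternLayout">
        <conversionPattern value="%4t %d{ABSOLUTE} %-5p %m%n"/>
    </layout>
</appender>
```

Flushing every entry trades some I/O throughput for up-to-date files, which is usually acceptable while debugging production issues.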
Hi all,
I have a job scheduled via Tivoli for an Informatica workflow.
I have checked the property to save workflow logs for 5 runs.
The job runs fine through Informatica, but if I try to run it from Tivoli using pmcmd, it fails to rename the workflow log file.
Please help; I am getting this error:
Cannot rename workflow log file [E:\Informatica\etl_d\WorkflowLogs\wf_T.log.bin] to [E:\Informatica\etl_d\WorkflowLogs\wf_T.log.4.bin]. Please check the Integration Service log for more information.
Disconnecting from Integration Service
Check the workflow log file name in the workflow's Edit options. Possibly you have the same workflow log file name configured for multiple workflows.
HTH
Irfan