WSO2 EI Analytics Profile Database configuration

I'm configuring the WSO2 EI Analytics profile to use PostgreSQL instead of the H2 database.
I have changed the following files in \wso2\analytics\conf\datasources:
analytics-datasources.xml,
master-datasources.xml,
metrics-datasources.xml
I have also executed the database creation scripts in dbscripts. The scripts generate tables only for metrics and master, but they do not create tables for analytics.
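The datasource entries I changed follow this general shape; a minimal sketch, assuming the standard WSO2 datasource format, with host, database name, and credentials as placeholders:

<datasource>
    <name>WSO2_ANALYTICS_EVENT_STORE_DB</name>
    <description>The datasource used for the analytics record store</description>
    <definition type="RDBMS">
        <configuration>
            <!-- placeholder connection details: point these at your PostgreSQL instance -->
            <url>jdbc:postgresql://localhost:5432/analytics_event_store</url>
            <username>wso2carbon</username>
            <password>wso2carbon</password>
            <driverClassName>org.postgresql.Driver</driverClassName>
            <maxActive>50</maxActive>
            <maxWait>60000</maxWait>
            <testOnBorrow>true</testOnBorrow>
            <validationQuery>SELECT 1</validationQuery>
        </configuration>
    </definition>
</datasource>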
Anyway, when I run the analytics server I get the errors shown below:
Failed to perform Category Drilldown on table: org_wso2_esb_analytics_stream_MediatorStatPerMinute: Error while connecting to the remote service. Connection refused (Connection refused) {JAGGERY.controllers.apis.eianalytics:jag}
TID: [-1234] [] [2017-11-06 16:43:00,262] ERROR {org.wso2.carbon.databridge.core.internal.queue.QueueWorker} - Dropping wrongly formatted event sent for -1234 {org.wso2.carbon.databridge.core.internal.queue.QueueWorker}
org.wso2.carbon.databridge.core.exception.EventConversionException: Error when converting org.wso2.esb.analytics.stream.FlowEntry:1.0.0 of event bundle with events 1
at org.wso2.carbon.databridge.receiver.thrift.converter.ThriftEventConverter.createEventList(ThriftEventConverter.java:181)
at org.wso2.carbon.databridge.receiver.thrift.converter.ThriftEventConverter.toEventList(ThriftEventConverter.java:90)
at org.wso2.carbon.databridge.core.internal.queue.QueueWorker.run(QueueWorker.java:73)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.wso2.carbon.databridge.core.exception.EventConversionException: No StreamDefinition for streamId org.wso2.esb.analytics.stream.FlowEntry:1.0.0 present in cache
at org.wso2.carbon.databridge.receiver.thrift.converter.ThriftEventConverter.createEventList(ThriftEventConverter.java:166)
... 7 more
It seems some database tables are missing, but I don't know how to create them.
These errors do not occur when I use the H2 database with the default configuration.
Can anyone help me?

I solved the problem.
It was a JDBC driver problem: with JDK 1.8 it is necessary to use the PostgreSQL JDBC driver 42.1.4.
I hope it will be useful for someone.
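For completeness: the fix amounts to making a compatible driver jar visible to the server and referencing the PostgreSQL driver class from each datasource. I placed postgresql-42.1.4.jar in the server's external library directory (<EI_HOME>/lib in EI 6.x, if I remember correctly; older Carbon layouts use repository/components/lib instead) and kept this line in each PostgreSQL datasource:

<!-- driver class for the PostgreSQL JDBC driver; the jar itself must be in the server's lib directory -->
<driverClassName>org.postgresql.Driver</driverClassName>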

Related

Distributed WSO2 APIM: Problems with KeyManager

Now I am testing the API Manager in a distributed installation of the product.
When I start the Analytics and Publisher nodes (on separate, distributed hosts), the Analytics log does not stop showing these error messages:
[2018-04-12 15:00:18,770] ERROR {org.wso2.carbon.databridge.core.internal.queue.QueueWorker} - Dropping wrongly formatted event sent for -1234
org.wso2.carbon.databridge.core.exception.EventConversionException: Error when converting loganalyzer:1.0.0 of event bundle with events 1
at org.wso2.carbon.databridge.receiver.thrift.converter.ThriftEventConverter.createEventList(ThriftEventConverter.java:181)
at org.wso2.carbon.databridge.receiver.thrift.converter.ThriftEventConverter.toEventList(ThriftEventConverter.java:90)
at org.wso2.carbon.databridge.core.internal.queue.QueueWorker.run(QueueWorker.java:73)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.wso2.carbon.databridge.core.exception.EventConversionException: No StreamDefinition for streamId loganalyzer:1.0.0 present in cache
at org.wso2.carbon.databridge.receiver.thrift.converter.ThriftEventConverter.createEventList(ThriftEventConverter.java:166)
... 7 more
This means the APIM (or another product) is sending events with streamId loganalyzer:1.0.0, but the analytics server has no such stream definition.
The analytics server is effectively a WSO2 DAS with preconfigured streams and analytics related to some other product. The log message indicates that the analytics application (org_wso2_carbon_analytics_apim-1.0.0.car) is not (yet) deployed.
This commonly happens when you start up the analytics server and it receives the product (APIM) events before the analytics app is deployed. Once the analytics app is deployed, the DAS should stop logging these messages.
So in your case I'd look at the beginning of the analytics server's log file to see why the analytics application is not being deployed properly.

WSO2 API Manager 2.1 Analytics cannot be restarted when using Oracle DB

When restarting APIM Analytics, the following error appears in the log (the first run is fine). We need to truncate the ANX___8GEKYOMM_ table to be able to start it again.
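For clarity, the workaround is simply clearing that auto-generated analytics record table before restarting (the table name below is the one from our installation, as quoted above):

-- workaround only: empty the analytics record table so the server can start again
TRUNCATE TABLE ANX___8GEKYOMM_;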
TID: [-1234] [] ERROR {org.wso2.carbon.analytics.dataservice.core.AnalyticsDataServiceComponent} - Error in activating analytics data service: null {org.wso2.carbon.analytics.dataservice.core.AnalyticsDataServiceComponent}
java.lang.RuntimeException
at org.wso2.carbon.analytics.datasource.rdbms.RDBMSAnalyticsRecordStore$RDBMSResultSetIterator.next(RDBMSAnalyticsRecordStore.java:881)
at org.wso2.carbon.analytics.datasource.rdbms.RDBMSAnalyticsRecordStore$RDBMSResultSetIterator.hasNext(RDBMSAnalyticsRecordStore.java:843)
at org.apache.commons.collections.IteratorUtils.toList(IteratorUtils.java:848)
at org.apache.commons.collections.IteratorUtils.toList(IteratorUtils.java:825)
at org.wso2.carbon.analytics.datasource.core.util.GenericUtils.listRecords(GenericUtils.java:284)
at org.wso2.carbon.analytics.dataservice.core.AnalyticsDataServiceImpl.readTenantIds(AnalyticsDataServiceImpl.java:468)
Caused by: java.lang.NullPointerException
at java.io.OutputStream.write(OutputStream.java:75)
at org.wso2.carbon.analytics.datasource.rdbms.RDBMSAnalyticsRecordStore$RDBMSResultSetIterator.extractDataFromRS(RDBMSAnalyticsRecordStore.java:890)
at org.wso2.carbon.analytics.datasource.rdbms.RDBMSAnalyticsRecordStore$RDBMSResultSetIterator.next(RDBMSAnalyticsRecordStore.java:863)
... 124 more
I solved it by upgrading from org.wso2.carbon.analytics.datasource.rdbms-1.3.0.jar to org.wso2.carbon.analytics.datasource.rdbms-1.3.6.jar.

Option 'schema' not specified when setting up WSO2 AM 1.10.x with DAS 3.1.0

I am trying to set up WSO2 API Manager 1.10.0 with DAS 3.1.0. DAS will use MySQL 5.7.18. I ran mysql5.7.sql from the DAS package to create the DB schema in MySQL. I also downloaded mysql-connector-java-5.1.35-bin.jar and copied it into the repository\components\lib directory.
I turned on Configure Analytics in API Manager and saved the configuration successfully. I can see that API Manager can talk to DAS without a problem.
But in the carbon log of DAS, I see exceptions like this:
TID: [-1234] [] [2017-05-26 15:30:00,368] ERROR {org.wso2.carbon.analytics.spark.core.AnalyticsTask} - Error while executing the scheduled task for the script: APIM_STAT_SCRIPT {org.wso2.carbon.analytics.spark.core.AnalyticsTask}
org.wso2.carbon.analytics.spark.core.exception.AnalyticsExecutionException: Exception in executing query create temporary table APIRequestSummaryData using CarbonJDBC options (dataSource "WSO2AM_STATS_DB", tableName "API_REQUEST_SUMMARY")
at org.wso2.carbon.analytics.spark.core.internal.SparkAnalyticsExecutor.executeQueryLocal(SparkAnalyticsExecutor.java:764)
at org.wso2.carbon.analytics.spark.core.internal.SparkAnalyticsExecutor.executeQuery(SparkAnalyticsExecutor.java:721)
at org.wso2.carbon.analytics.spark.core.CarbonAnalyticsProcessorService.executeQuery(CarbonAnalyticsProcessorService.java:201)
at org.wso2.carbon.analytics.spark.core.CarbonAnalyticsProcessorService.executeScript(CarbonAnalyticsProcessorService.java:151)
at org.wso2.carbon.analytics.spark.core.AnalyticsTask.execute(AnalyticsTask.java:60)
at org.wso2.carbon.ntask.core.impl.TaskQuartzJobAdapter.execute(TaskQuartzJobAdapter.java:67)
at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: Option 'schema' not specified
at scala.sys.package$.error(package.scala:27)
at org.apache.spark.sql.jdbc.carbon.AnalyticsJDBCRelationProvider$$anonfun$3.apply(JDBCRelation.scala:113)
at org.apache.spark.sql.jdbc.carbon.AnalyticsJDBCRelationProvider$$anonfun$3.apply(JDBCRelation.scala:113)
at scala.collection.MapLike$class.getOrElse(MapLike.scala:128)
at org.apache.spark.sql.execution.datasources.CaseInsensitiveMap.getOrElse(ddl.scala:150)
at org.apache.spark.sql.jdbc.carbon.AnalyticsJDBCRelationProvider.createRelation(JDBCRelation.scala:113)
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:158)
at org.apache.spark.sql.execution.datasources.CreateTempTableUsing.run(ddl.scala:92)
at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:58)
at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:56)
at org.apache.spark.sql.execution.ExecutedCommand.doExecute(commands.scala:70)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:55)
at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:55)
at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:145)
at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:130)
at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:52)
at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:817)
at org.wso2.carbon.analytics.spark.core.internal.SparkAnalyticsExecutor.executeQueryLocal(SparkAnalyticsExecutor.java:760)
How can I resolve this?
Thanks.
API Manager 1.10 and DAS 3.1.0 are not compatible with each other. Unless you customize the databases and CApps, it won't work.
You can use API Manager 2.1 with DAS 3.1.0, or API Manager 1.10 with DAS 3.0.x.
It turns out that I needed to import the correct schema declaration script from the /dbscript/stat/sql folder into the DAS database I am setting up here.
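For context, the error above comes from the CarbonJDBC relation provider in DAS 3.1.0, which expects an explicit schema option when a Spark script creates a temporary table over an RDBMS datasource. A minimal sketch of a compatible definition, assuming the documented CarbonJDBC options; the column list here is illustrative, not the exact APIM stats schema:

-- sketch: the DAS 3.1.0 CarbonJDBC provider needs the schema option spelled out
create temporary table APIRequestSummaryData using CarbonJDBC
options (dataSource "WSO2AM_STATS_DB",
         tableName "API_REQUEST_SUMMARY",
         schema "api STRING, api_version STRING, apiPublisher STRING, userId STRING,
                 context STRING, total_request_count INTEGER, hostName STRING, time STRING");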

WSO2 DAS server configuration issue - Dropping wrongly formatted event sent for -1234

I have configured DAS with the API Manager server using the REST client, but I am not able to push data to the DAS server. Please see the error logs from the DAS server below. Could you please help me understand what is wrong in the configuration?
TID: [-1234] [] [2016-05-20 18:07:05,566] ERROR {org.wso2.carbon.databridge.core.internal.queue.QueueWorker} - Dropping wrongly formatted event sent for -1234 {org.wso2.carbon.databridge.core.internal.queue.QueueWorker}
org.wso2.carbon.databridge.core.exception.EventConversionException: Error when converting org.wso2.apimgt.statistics.throttle:1.0.0 of event bundle with events 1
at org.wso2.carbon.databridge.receiver.thrift.converter.ThriftEventConverter.createEventList(ThriftEventConverter.java:181)
at org.wso2.carbon.databridge.receiver.thrift.converter.ThriftEventConverter.toEventList(ThriftEventConverter.java:90)
at org.wso2.carbon.databridge.core.internal.queue.QueueWorker.run(QueueWorker.java:73)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:744)
Caused by: org.wso2.carbon.databridge.core.exception.EventConversionException: No StreamDefinition for streamId org.wso2.apimgt.statistics.throttle:1.0.0 present in cache
at org.wso2.carbon.databridge.receiver.thrift.converter.ThriftEventConverter.createEventList(ThriftEventConverter.java:166)
... 7 more
Can you try redeploying the CApp (.car) again? To do that, first do the following:
Delete the .car application from <DAS_HOME>/repository/deployment/server/carbonapps.
Delete any existing stream definitions (related to APIM stats) by logging in to the DAS management console and going to Manage > Event > Streams.
Redeploy the car app by placing it in <DAS_HOME>/repository/deployment/server/carbonapps.
If everything goes well, you should see two scripts in the Manage > Batch Analytics > Scripts section. Try executing each script and see if there is any error.

WSO2 - Error while invoking APIUsageStatisticsClient for ProviderAPIUsage

I have set up WSO2 API Manager (version 1.9.1) and configured it with BAM (version 2.5.0), following the instructions provided here to the letter:
https://docs.wso2.com/display/AM191/Publishing+API+Runtime+Statistics
I am using MySQL server and can see the api_* tables being created and updated as I make API calls. However, when I go to look at the statistics, I see an exception on the console, while the UI tells me that I need to configure BAM to see the dashboards.
Here is a snippet of the ERRORs seen in wso2carbon.log:
TID: [0] [AM] [2015-11-19 11:27:39,110] ERROR {org.wso2.carbon.apimgt.hostobjects.APIProviderHostObject} - Error while invoking APIUsageStatisticsClient for ProviderAPIUsage {org.wso2.carbon.apimgt.hostobjects.APIProviderHostObject}
org.wso2.carbon.apimgt.usage.client.exception.APIMgtUsageQueryServiceClientException: Error occurred while querying from JDBC databasecom.mysql.jdbc.driver
at org.wso2.carbon.apimgt.usage.client.APIUsageStatisticsClient.queryFirstAccess(APIUsageStatisticsClient.java:2762)
at org.wso2.carbon.apimgt.usage.client.APIUsageStatisticsClient.getFirstAccessTime(APIUsageStatisticsClient.java:2706)
... 65 more
Caused by: java.sql.SQLException: com.mysql.jdbc.driver
at org.apache.tomcat.jdbc.pool.PooledConnection.connectUsingDriver(PooledConnection.java:254)
at org.apache.tomcat.jdbc.pool.PooledConnection.connect(PooledConnection.java:182)
at org.apache.tomcat.jdbc.pool.ConnectionPool.createConnection(ConnectionPool.java:701)
at org.apache.tomcat.jdbc.pool.ConnectionPool.borrowConnection(ConnectionPool.java:635)
at org.apache.tomcat.jdbc.pool.ConnectionPool.getConnection(ConnectionPool.java:188)
at org.apache.tomcat.jdbc.pool.DataSourceProxy.getConnection(DataSourceProxy.java:128)
at org.wso2.carbon.apimgt.usage.client.APIUsageStatisticsClient.queryFirstAccess(APIUsageStatisticsClient.java:2730)
... 65 more
Caused by: java.lang.ClassNotFoundException: com.mysql.jdbc.driver
at org.eclipse.osgi.internal.loader.BundleLoader.findClassInternal(BundleLoader.java:475)
at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:421)
at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:412)
at org.eclipse.osgi.internal.baseadaptor.DefaultClassLoader.loadClass(DefaultClassLoader.java:107)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:274)
at org.apache.tomcat.jdbc.pool.PooledConnection.connectUsingDriver(PooledConnection.java:246)
... 71 more
TID: [0] [AM] [2015-11-19 11:27:39,138] ERROR {JAGGERY.modules.statistics.usage:jag} - java.lang.NullPointerException: null {JAGGERY.modules.statistics.usage:jag}
I have looked at other similar posts, but nothing seems to be exactly relevant. The closest similar post had a SQL Server specific issue, whereas I am using MySQL.
Additional information: I am running BAM and API Manager on the same machine. I have set the port offset on BAM to 3, as recommended in the guide.
Apparently, you have not added the MySQL connector to the <APIM_HOME>/repository/components/lib and <BAM_HOME>/repository/components/lib folders. This is mentioned in step 5 of https://docs.wso2.com/display/AM191/Publishing+API+Runtime+Statistics.
Make sure that the MySQL connector is correctly placed in both locations.
If you are trying the above scenario in a Windows environment, please use the mysql-connector-java-5.1.30 connector and place it in the same folders. This could be due to a connector version problem.
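One more thing worth checking, since the quoted ClassNotFoundException names com.mysql.jdbc.driver in lowercase: the actual class inside the connector is com.mysql.jdbc.Driver, and the name is case-sensitive. So also verify the driverClassName entries in your datasource configuration; a sketch of the relevant fragment (the surrounding datasource definition is assumed):

<!-- the MySQL driver class name is case-sensitive: "Driver", not "driver" -->
<driverClassName>com.mysql.jdbc.Driver</driverClassName>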