I have a Home Assistant logfile and I need to extract specific information from it with a bash script.
The log looks like this:
2017-09-26 20:54:09 INFO (MainThread) [homeassistant.core] Bus:Handling <Event state_changed[L]: new_state=<state media_player.player1=playing; media_duration=6054, is_volume_muted=False, media_season=-1, media_episode=-1, media_series_title=, media_content_type=movie, media_album_name=, friendly_name=player1, media_title=My movie, supported_features=55231, entity_picture=/api/media_player_proxy/media_player.player1?token=5ce13d0e4d8ed86b3073a795a4bd82627b387cf2b6c5dfdc8c64ab9b62ec7d61&cache=f4a48, media_content_id=unknown=tt0266543, volume_level=0.91 # 2017-09-26T20:44:37.080005+02:00>, old_state=<state media_player.player1=playing; media_duration=6054, is_volume_muted=False, media_season=-1, media_episode=-1, media_series_title=, media_content_type=movie, media_album_name=, friendly_name=player1, media_title=My movie, supported_features=55231, entity_picture=/api/media_player_proxy/media_player.player1?token=5ce13d0e4d8ed86b3073a795a4bd82627b387cf2b6c5dfdc8c64ab9b62ec7d61&cache=23108, media_content_id=unknown=tt0266543, volume_level=0.91 # 2017-09-26T20:44:37.080005+02:00>, entity_id=media_player.player1>
2017-09-26 21:30:22 INFO (MainThread) [homeassistant.core] Bus:Handling <Event state_changed[L]: new_state=<state media_player.tv_player2=playing; is_volume_muted=False, app_id=233637DE, app_name=YouTube, volume_level=1.0, friendly_name=player2, media_position=0, supported_features=21437, media_position_updated_at=2017-09-26T21:30:22.660153+02:00, entity_picture=/api/media_player_proxy/media_player.tv_player2?token=f03173d412f3c7396762605a4f347b83f651d4c0f23935c8b1fccd9305698ffe&cache=88a6b, media_content_id=AcADAFAD, media_title=YT test movie # 2017-09-26T21:30:21.848088+02:00>, old_state=<state media_player.tv_player2=playing; supported_features=21437, is_volume_muted=False, media_position_updated_at=2017-09-26T21:30:22.557344+02:00, app_id=233637DE, friendly_name=player2, media_content_id=AcADAFAD, app_name=YouTube, media_position=0, volume_level=1.0 # 2017-09-26T21:30:21.848088+02:00>, entity_id=media_player.tv_player2>
2017-09-26 21:30:22 INFO (MainThread) [homeassistant.core] Bus:Handling <Event state_changed[L]: new_state=<state media_player.tv_player2=playing; is_volume_muted=False, app_id=233637DE, app_name=YouTube, volume_level=1.0, friendly_name=player2, media_position=0, supported_features=21437, media_position_updated_at=2017-09-26T21:30:22.695325+02:00, entity_picture=/api/media_player_proxy/media_player.tv_player2?token=f03173d412f3c7396762605a4f347b83f651d4c0f23935c8b1fccd9305698ffe&cache=88a6b, media_content_id=AcADAFAD, media_title=YT test movie # 2017-09-26T21:30:21.848088+02:00>, old_state=<state media_player.tv_player2=playing; is_volume_muted=False, app_id=233637DE, app_name=YouTube, volume_level=1.0, friendly_name=player2, media_position=0, supported_features=21437, media_position_updated_at=2017-09-26T21:30:22.660153+02:00, entity_picture=/api/media_player_proxy/media_player.tv_player2?token=f03173d412f3c7396762605a4f347b83f651d4c0f23935c8b1fccd9305698ffe&cache=88a6b, media_content_id=AcADAFAD, media_title=YT test movie # 2017-09-26T21:30:21.848088+02:00>, entity_id=media_player.tv_player2>
2017-09-26 21:30:29 INFO (MainThread) [homeassistant.core] Bus:Handling <Event state_changed[L]: new_state=<state media_player.tv_player2=playing; media_duration=1789.213605442177, is_volume_muted=False, app_id=233637DE, app_name=YouTube, volume_level=1.0, friendly_name=player2, media_position=0, supported_features=21437, media_position_updated_at=2017-09-26T21:30:29.115053+02:00, entity_picture=/api/media_player_proxy/media_player.tv_player2?token=f03173d412f3c7396762605a4f347b83f651d4c0f23935c8b1fccd9305698ffe&cache=88a6b, media_content_id=AcADAFAD, media_title=YT test movie # 2017-09-26T21:30:21.848088+02:00>, old_state=<state media_player.tv_player2=playing; is_volume_muted=False, app_id=233637DE, app_name=YouTube, volume_level=1.0, friendly_name=player2, media_position=0, supported_features=21437, media_position_updated_at=2017-09-26T21:30:22.695325+02:00, entity_picture=/api/media_player_proxy/media_player.tv_player2?token=f03173d412f3c7396762605a4f347b83f651d4c0f23935c8b1fccd9305698ffe&cache=88a6b, media_content_id=AcADAFAD, media_title=YT test movie # 2017-09-26T21:30:21.848088+02:00>, entity_id=media_player.tv_player2>
2017-09-26 21:30:29 INFO (MainThread) [homeassistant.core] Bus:Handling <Event state_changed[L]: new_state=<state media_player.tv_player2=playing; media_duration=1789.213605442177, is_volume_muted=False, app_id=233637DE, app_name=YouTube, volume_level=1.0, friendly_name=player2, media_position=0.733, supported_features=21437, media_position_updated_at=2017-09-26T21:30:29.843295+02:00, entity_picture=/api/media_player_proxy/media_player.tv_player2?token=f03173d412f3c7396762605a4f347b83f651d4c0f23935c8b1fccd9305698ffe&cache=88a6b, media_content_id=AcADAFAD, media_title=YT test movie # 2017-09-26T21:30:21.848088+02:00>, old_state=<state media_player.tv_player2=playing; media_duration=1789.213605442177, is_volume_muted=False, app_id=233637DE, app_name=YouTube, volume_level=1.0, friendly_name=player2, media_position=0, supported_features=21437, media_position_updated_at=2017-09-26T21:30:29.115053+02:00, entity_picture=/api/media_player_proxy/media_player.tv_player2?token=f03173d412f3c7396762605a4f347b83f651d4c0f23935c8b1fccd9305698ffe&cache=88a6b, media_content_id=AcADAFAD, media_title=YT test movie # 2017-09-26T21:30:21.848088+02:00>, entity_id=media_player.tv_player2>
2017-09-26 21:31:34 INFO (MainThread) [homeassistant.core] Bus:Handling <Event state_changed[L]: new_state=<state media_player.tv_player2=playing; media_duration=1789.213605442177, is_volume_muted=False, app_id=233637DE, app_name=YouTube, volume_level=1.0, friendly_name=player2, media_position=65.681, supported_features=21437, media_position_updated_at=2017-09-26T21:31:34.823855+02:00, entity_picture=/api/media_player_proxy/media_player.tv_player2?token=f03173d412f3c7396762605a4f347b83f651d4c0f23935c8b1fccd9305698ffe&cache=88a6b, media_content_id=AcADAFAD, media_title=YT test movie # 2017-09-26T21:30:21.848088+02:00>, old_state=<state media_player.tv_player2=playing; media_duration=1789.213605442177, is_volume_muted=False, app_id=233637DE, app_name=YouTube, volume_level=1.0, friendly_name=player2, media_position=0.733, supported_features=21437, media_position_updated_at=2017-09-26T21:30:29.843295+02:00, entity_picture=/api/media_player_proxy/media_player.tv_player2?token=f03173d412f3c7396762605a4f347b83f651d4c0f23935c8b1fccd9305698ffe&cache=88a6b, media_content_id=AcADAFAD, media_title=YT test movie # 2017-09-26T21:30:21.848088+02:00>, entity_id=media_player.tv_player2>
2017-09-26 22:45:58 INFO (MainThread) [homeassistant.core] Bus:Handling <Event state_changed[L]: new_state=<state media_player.player1=playing; media_duration=0, is_volume_muted=False, media_album_name=, media_content_type=music, friendly_name=player1, volume_level=0.75, supported_features=55231, media_title=My radio R1 # 2017-09-26T22:44:57.441924+02:00>, old_state=<state media_player.player1=playing; media_duration=0, is_volume_muted=False, media_album_name=, media_content_type=music, friendly_name=player1, volume_level=0.75, supported_features=55231, media_title= # 2017-09-26T22:44:57.441924+02:00>, entity_id=media_player.player1>
2017-09-26 22:46:09 INFO (MainThread) [homeassistant.core] Bus:Handling <Event state_changed[L]: new_state=<state media_player.player1=playing; media_duration=0, is_volume_muted=False, media_album_name=, media_content_type=music, friendly_name=player1, volume_level=0.75, supported_features=55231, media_title= # 2017-09-26T22:44:57.441924+02:00>, old_state=<state media_player.player1=playing; media_duration=0, is_volume_muted=False, media_album_name=, media_content_type=music, friendly_name=player1, volume_level=0.75, supported_features=55231, media_title=My radio R1 # 2017-09-26T22:44:57.441924+02:00>, entity_id=media_player.player1>
And I need the script to extract the time (of the first occurrence), friendly_name and media_title, without duplicates in media_title.
Something like this in the output file:
2017-09-26 20:54:09, player1, My movie
2017-09-26 21:30:22, player2, YT test movie
2017-09-26 22:45:58, player1, My radio R1
I tried grep and sed without proper results.
Thank you
GNU awk solution:
awk '{
  d = $1" "$2;                               # timestamp: first two fields (date and time)
  # gawk 3-argument match: a[1] = friendly_name, a[2] = media_title
  match($0,/friendly_name=([^,]+).*media_title=([^[:space:]][^#,]+)[[:space:]]*[#,]/,a);
  # print only the first occurrence of each friendly_name/media_title pair
  if (a[1]!="" && a[2]!="" && !(a[1]"#"a[2] in res)) {
    res[a[1]"#"a[2]];
    print d,a[1],a[2]
  }
}' OFS=', ' logfile
The output:
2017-09-26 20:54:09, player1, My movie
2017-09-26 21:30:22, player2, YT test movie
2017-09-26 22:45:58, player1, My radio R1
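Since the question asks for an output file, the same program can be saved to a script and its output redirected. A minimal sketch, assuming GNU awk (gawk) is available; the file names are placeholders:

# extract.awk holds the awk program shown above; OFS is still passed on the command line
gawk -f extract.awk OFS=', ' homeassistant.log > media_titles.txt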
GNU awk solution. Since we don't know in what order friendly_name and media_title occur, we need to split up the matches:
$ awk 'match($0,/friendly_name=([^,]+)/,a) &&
match($0,/media_title=([^,#]+?)(\s#)?/,b) &&
a[1] && b[1] && !(b[1] in arr){
arr[b[1]];
printf "%s, %s, %s\n", $1" "$2, a[1], b[1]}' file
which gives this result:
2017-09-26 20:54:09, player1, My movie
2017-09-26 21:30:22, player2, YT test movie
2017-09-26 22:45:58, player1, My radio R1
I am not able to figure out the root cause of the problem, i.e. why the following forEach loops are not working:
<bpel:forEach name="eachMarket" parallel="no" counterName="marketCounter">
<bpel:startCounterValue>1</bpel:startCounterValue>
<bpel:finalCounterValue>count($input.payload/tns:DCTResponse/tns:DCTIDs/tns:DCTID)</bpel:finalCounterValue>
<bpel:scope>
<bpel:assign>
<bpel:copy ignoreMissingFromData="yes" insertMissingToData="yes">
<bpel:from>$input.payload/tns:DCTResponse/tns:DCTIDs/tns:DCTID[round($marketCounter)]/tns:DEFTYPE</bpel:from>
<bpel:to>$OrderParameterPLRequest.parameters/ns:DCTResponse/ns:DCTIDs/ns:DCTID[round($marketCounter)]/ns:DEFTYPE</bpel:to>
</bpel:copy>
<bpel:copy ignoreMissingFromData="yes" insertMissingToData="yes">
<bpel:from>$input.payload/tns:DCTResponse/tns:DCTIDs/tns:DCTID[round($marketCounter)]/tns:MarketName</bpel:from>
<bpel:to>$OrderParameterPLRequest.parameters/ns:DCTResponse/ns:DCTIDs/ns:DCTID[round($marketCounter)]/ns:MarketName</bpel:to>
</bpel:copy>
<bpel:copy ignoreMissingFromData="yes" insertMissingToData="yes">
<bpel:from>$input.payload/tns:DCTResponse/tns:DCTIDs/tns:DCTID[round($marketCounter)]/tns:DCTID</bpel:from>
<bpel:to>$OrderParameterPLRequest.parameters/ns:DCTResponse/ns:DCTIDs/ns:DCTID[round($marketCounter)]/ns:DCTID</bpel:to>
</bpel:copy>
</bpel:assign>
</bpel:scope>
</bpel:forEach>
<bpel:forEach name="eachParameter" parallel="no" counterName="parameterCounter">
<bpel:startCounterValue>1</bpel:startCounterValue>
<bpel:finalCounterValue>count($input.payload/tns:DCTResponse/tns:AdditionalParamters/tns:Parameter)</bpel:finalCounterValue>
<bpel:scope name="parameterScope">
<bpel:assign>
<bpel:copy ignoreMissingFromData="yes" insertMissingToData="yes">
<bpel:from>$input.payload/tns:DCTResponse/tns:AdditionalParamters/tns:Parameter[$parameterCounter]/tns:Name</bpel:from>
<bpel:to>$OrderParameterPLRequest.parameters/ns:DCTResponse/ns:AdditionalParamters/ns:Parameter[$parameterCounter]/ns:Name</bpel:to>
</bpel:copy>
<bpel:copy ignoreMissingFromData="yes" insertMissingToData="yes">
<bpel:from>$input.payload/tns:DCTResponse/tns:AdditionalParamters/tns:Parameter[$parameterCounter]/tns:Value</bpel:from>
<bpel:to>$OrderParameterPLRequest.parameters/ns:DCTResponse/ns:AdditionalParamters/ns:Parameter[$parameterCounter]/ns:Value</bpel:to>
</bpel:copy>
</bpel:assign>
</bpel:scope>
</bpel:forEach>
The input will contain multiple IDs:
<p:DCTIDs>
<p:DCTID>
<p:DEFTYPE>acvinclis</p:DEFTYPE>
<p:MarketName>pectoreflammas</p:MarketName>
<p:DCTID>3</p:DCTID>
</p:DCTID>
<p:DCTID>
<p:DEFTYPE>acvinclis</p:DEFTYPE>
<p:MarketName>pectoreflammas</p:MarketName>
<p:DCTID>3</p:DCTID>
</p:DCTID>
<p:DCTID>
<p:DEFTYPE>acvinclis</p:DEFTYPE>
<p:MarketName>pectoreflammas</p:MarketName>
<p:DCTID>3</p:DCTID>
</p:DCTID>
</p:DCTIDs>
Error message:
faultExplanation={http://docs.oasis-open.org/wsbpel/2.0/process/executable}selectionFailure: No results for expression: '$OrderParameterPLRequest.parameters/ns:DCTResponse/ns:DCTIDs/ns:DCTID[round($marketCounter)]/ns:DEFTYPE' against '
This question is probably not relevant for the original poster anymore, but I wanted to include an answer in case anybody else has the same problem.
If you want to get the DEFTYPE of the DCTID at position $Counter, use the following XPath:
($input.payload/tns:DCTResponse/tns:DCTIDs/tns:DCTID)[position() = $Counter]/tns:DEFTYPE
Two things are important here:
Put parentheses around the entire expression up to DCTID. For the reason why, see https://stackoverflow.com/a/8336922/5986352
While [1], [2], etc. will work, [$Counter] will not work. Instead, use [position() = $Counter]. You can also use position() to perform some more complex queries, e.g., [position() < 3] in order to select the first two nodes from a certain set.
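Putting both points together, the first copy inside eachMarket could look roughly like this. This is only a sketch reusing the variable names and namespace prefixes from the question; the same rewrite applies to the other copies and to their to-specs:

<bpel:copy ignoreMissingFromData="yes" insertMissingToData="yes">
  <!-- parenthesize the node set and select by position() instead of [round($marketCounter)] -->
  <bpel:from>($input.payload/tns:DCTResponse/tns:DCTIDs/tns:DCTID)[position() = $marketCounter]/tns:DEFTYPE</bpel:from>
  <bpel:to>($OrderParameterPLRequest.parameters/ns:DCTResponse/ns:DCTIDs/ns:DCTID)[position() = $marketCounter]/ns:DEFTYPE</bpel:to>
</bpel:copy>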
Please note that this question has been updated, as I have recently had logs to work with. Below the log is the original posting with my configs and the behavior I was seeing.
Today, WSO2 APIM continues to send to the BAM receivers, which update the stats database. However, when I click on any of the statistics links in the Publisher I get:
TID: [0] [AM] [2014-03-04 13:43:18,815] ERROR {org.wso2.carbon.apimgt.hostobjects.APIProviderHostObject} - Error while invoking APIUsageStatisticsClient for ProviderAPIUsage {org.wso2.carbon.apimgt.hostobjects.APIProviderHostObject}
org.wso2.carbon.apimgt.usage.client.exception.APIMgtUsageQueryServiceClientException: Error occurred while querying from JDBC database
at org.wso2.carbon.apimgt.usage.client.APIUsageStatisticsClient.queryFirstAccess(APIUsageStatisticsClient.java:1747)
at org.wso2.carbon.apimgt.usage.client.APIUsageStatisticsClient.getFirstAccessTime(APIUsageStatisticsClient.java:1675)
at org.wso2.carbon.apimgt.hostobjects.APIProviderHostObject.jsFunction_getFirstAccessTime(APIProviderHostObject.java:2911)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.mozilla.javascript.MemberBox.invoke(MemberBox.java:126)
at org.mozilla.javascript.FunctionObject.call(FunctionObject.java:386)
at org.mozilla.javascript.optimizer.OptRuntime.call2(OptRuntime.java:42)
at org.jaggeryjs.rhino.publisher.modules.statistics.c1._c_getFirstAccessTime_13(/publisher/modules/statistics/usage.jag:351)
at org.jaggeryjs.rhino.publisher.modules.statistics.c1.call(/publisher/modules/statistics/usage.jag)
at org.mozilla.javascript.ScriptRuntime.applyOrCall(ScriptRuntime.java:2430)
at org.mozilla.javascript.BaseFunction.execIdCall(BaseFunction.java:269)
at org.mozilla.javascript.IdFunctionObject.call(IdFunctionObject.java:97)
at org.mozilla.javascript.optimizer.OptRuntime.call2(OptRuntime.java:42)
at org.jaggeryjs.rhino.publisher.modules.statistics.c0._c_anonymous_13(/publisher/modules/statistics/module.jag:29)
at org.jaggeryjs.rhino.publisher.modules.statistics.c0.call(/publisher/modules/statistics/module.jag)
at org.mozilla.javascript.optimizer.OptRuntime.call1(OptRuntime.java:32)
at org.jaggeryjs.rhino.publisher.site.blocks.stats.ajax.c0._c_anonymous_1(/publisher/site/blocks/stats/ajax/stats.jag:220)
at org.jaggeryjs.rhino.publisher.site.blocks.stats.ajax.c0.call(/publisher/site/blocks/stats/ajax/stats.jag)
at org.mozilla.javascript.optimizer.OptRuntime.call0(OptRuntime.java:23)
at org.jaggeryjs.rhino.publisher.site.blocks.stats.ajax.c0._c_script_0(/publisher/site/blocks/stats/ajax/stats.jag:4)
at org.jaggeryjs.rhino.publisher.site.blocks.stats.ajax.c0.call(/publisher/site/blocks/stats/ajax/stats.jag)
at org.mozilla.javascript.ContextFactory.doTopCall(ContextFactory.java:394)
at org.mozilla.javascript.ScriptRuntime.doTopCall(ScriptRuntime.java:3091)
at org.jaggeryjs.rhino.publisher.site.blocks.stats.ajax.c0.call(/publisher/site/blocks/stats/ajax/stats.jag)
at org.jaggeryjs.rhino.publisher.site.blocks.stats.ajax.c0.exec(/publisher/site/blocks/stats/ajax/stats.jag)
at org.jaggeryjs.scriptengine.engine.RhinoEngine.execScript(RhinoEngine.java:570)
at org.jaggeryjs.scriptengine.engine.RhinoEngine.exec(RhinoEngine.java:273)
at org.jaggeryjs.jaggery.core.manager.WebAppManager.execute(WebAppManager.java:432)
at org.jaggeryjs.jaggery.core.JaggeryServlet.doPost(JaggeryServlet.java:29)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:755)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:848)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:305)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210)
at org.apache.catalina.core.ApplicationDispatcher.invoke(ApplicationDispatcher.java:749)
at org.apache.catalina.core.ApplicationDispatcher.processRequest(ApplicationDispatcher.java:487)
at org.apache.catalina.core.ApplicationDispatcher.doForward(ApplicationDispatcher.java:379)
at org.apache.catalina.core.ApplicationDispatcher.forward(ApplicationDispatcher.java:339)
at org.jaggeryjs.jaggery.core.JaggeryFilter.doFilter(JaggeryFilter.java:21)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:243)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:222)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:123)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:472)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:99)
at org.wso2.carbon.tomcat.ext.valves.CompositeValve.continueInvocation(CompositeValve.java:178)
at org.wso2.carbon.tomcat.ext.valves.CarbonTomcatValve$1.invoke(CarbonTomcatValve.java:47)
at org.wso2.carbon.webapp.mgt.TenantLazyLoaderValve.invoke(TenantLazyLoaderValve.java:56)
at org.wso2.carbon.tomcat.ext.valves.TomcatValveContainer.invokeValves(TomcatValveContainer.java:47)
at org.wso2.carbon.tomcat.ext.valves.CompositeValve.invoke(CompositeValve.java:141)
at org.wso2.carbon.tomcat.ext.valves.CarbonStuckThreadDetectionValve.invoke(CarbonStuckThreadDetectionValve.java:156)
at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:936)
at org.wso2.carbon.tomcat.ext.valves.CarbonContextCreatorValve.invoke(CarbonContextCreatorValve.java:52)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:118)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:407)
at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1004)
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:589)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.run(NioEndpoint.java:1653)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:744)
Caused by: java.sql.SQLException: Incorrect syntax near 'limit'.
at net.sourceforge.jtds.jdbc.SQLDiagnostic.addDiagnostic(SQLDiagnostic.java:372)
at net.sourceforge.jtds.jdbc.TdsCore.tdsErrorToken(TdsCore.java:2988)
at net.sourceforge.jtds.jdbc.TdsCore.nextToken(TdsCore.java:2421)
at net.sourceforge.jtds.jdbc.TdsCore.getMoreResults(TdsCore.java:671)
at net.sourceforge.jtds.jdbc.JtdsStatement.executeSQLQuery(JtdsStatement.java:505)
at net.sourceforge.jtds.jdbc.JtdsStatement.executeQuery(JtdsStatement.java:1427)
at org.wso2.carbon.apimgt.usage.client.APIUsageStatisticsClient.queryFirstAccess(APIUsageStatisticsClient.java:1729)
... 63 more
TID: [0] [AM] [2014-03-04 13:43:18,836] ERROR {JAGGERY.modules.statistics.usage:jag} - java.lang.NullPointerException: null {JAGGERY.modules.statistics.usage:jag}
I have configured WSO2 API Manager 1.6.0 and BAM 2.4.0 to both use the same datasource configuration for WSO2AM_STATS_DB.
For the store it is a single WSO2AM_STATS_DB entry that matches the entry on the BAM server (below), and api-manager.xml is updated to:
<APIUsageTracking>
<Enabled>true</Enabled>
<PublisherClass>org.wso2.carbon.apimgt.usage.publisher.APIMgtUsageDataBridgeDataPublisher</PublisherClass>
<ThriftPort>7612</ThriftPort>
<BAMServerURL>tcp://myBAMserver:7612/</BAMServerURL>
<BAMUsername>user</BAMUsername>
<BAMPassword>pwd</BAMPassword>
<DataSourceName>jdbc/WSO2AM_STATS_DB</DataSourceName>
<GoogleAnalyticsTracking>
<Enabled>false</Enabled>
<TrackingID>UA-XXXXXXXX-X</TrackingID>
</GoogleAnalyticsTracking>
</APIUsageTracking>
and now the datasource for APIM:
<datasource>
<name>WSO2AM_STATS_DB</name>
<description>The datasource used for getting statistics to API Manager</description>
<jndiConfig>
<name>jdbc/WSO2AM_STATS_DB</name>
</jndiConfig>
<definition type="RDBMS">
<configuration>
<defaultAutoCommit>false</defaultAutoCommit>
<url>jdbc:jtds:sqlserver://mydbserver:1433/wso2_apiStatsdb</url>
<username>wso2storeuser</username>
<password>storepwd</password>
<driverClassName>net.sourceforge.jtds.jdbc.Driver</driverClassName>
<maxActive>50</maxActive>
<maxWait>60000</maxWait>
<testOnBorrow>true</testOnBorrow>
<validationQuery>SELECT 1</validationQuery>
<validationInterval>30000</validationInterval>
</configuration>
</definition>
</datasource>
For the BAM server here are my datasources:
<datasource>
<name>WSO2_CARBON_DB</name>
<description>The datasource used for API Manager database</description>
<jndiConfig>
<name>jdbc/WSO2CarbonDB</name>
</jndiConfig>
<definition type="RDBMS">
<configuration>
<defaultAutoCommit>false</defaultAutoCommit>
<url>jdbc:jtds:sqlserver://mydbserver:1433/wso2_carbondb</url>
<username>user</username>
<password>pwd</password>
<driverClassName>net.sourceforge.jtds.jdbc.Driver</driverClassName>
<maxActive>50</maxActive>
<maxWait>60000</maxWait>
<testOnBorrow>true</testOnBorrow>
<validationQuery>SELECT 1</validationQuery>
<validationInterval>30000</validationInterval>
</configuration>
</definition>
</datasource>
<datasource>
<name>WSO2AM_STATS_DB</name>
<description>The datasource used for getting statistics to API Manager</description>
<jndiConfig>
<name>jdbc/WSO2AM_STATS_DB</name>
</jndiConfig>
<definition type="RDBMS">
<configuration>
<defaultAutoCommit>false</defaultAutoCommit>
<url>jdbc:jtds:sqlserver://mydbserver:1433/wso2_apiStatsdb</url>
<username>user</username>
<password>pwd</password>
<driverClassName>net.sourceforge.jtds.jdbc.Driver</driverClassName>
<maxActive>50</maxActive>
<maxWait>60000</maxWait>
<testOnBorrow>true</testOnBorrow>
<validationQuery>SELECT 1</validationQuery>
<validationInterval>30000</validationInterval>
</configuration>
</definition>
</datasource>
<datasource>
<name>WSO2BAM_DATASOURCE</name>
<description>The datasource used for analyzer data</description>
<definition type="RDBMS">
<configuration>
<defaultAutoCommit>false</defaultAutoCommit>
<url>jdbc:jtds:sqlserver://mydbserver:1433/wso2_apiStatsdb</url>
<username>user</username>
<password>pwd</password>
<driverClassName>net.sourceforge.jtds.jdbc.Driver</driverClassName>
<maxActive>50</maxActive>
<maxWait>60000</maxWait>
<testOnBorrow>true</testOnBorrow>
<validationQuery>SELECT 1</validationQuery>
<validationInterval>30000</validationInterval>
</configuration>
</definition>
</datasource>
<datasource>
<name>WSO2BAM_CASSANDRA_DATASOURCE</name>
<description>The datasource used for Cassandra data</description>
<definition type="RDBMS">
<configuration>
<url>jdbc:cassandra://localhost:9161/EVENT_KS</url>
<username>admin</username>
<password>admin</password>
</configuration>
</definition>
</datasource>
<datasource>
<name>WSO2BAM_UTIL_DATASOURCE</name>
<description>The datasource used for BAM utilities, such as message store etc..</description>
<definition type="RDBMS">
<configuration>
<url>jdbc:cassandra://localhost:9161/BAM_UTIL_KS</url>
<username>admin</username>
<password>admin</password>
</configuration>
</definition>
</datasource>
<!-- The URL configs are loaded from cassandra-component.xml -->
<datasource>
<name>WSO2BAM_HIVE_INCREMENTAL_DATASOURCE</name>
<definition type="RDBMS">
<configuration>
<username>admin</username>
<password>admin</password>
<dataSourceProps>
<property name="replicationFactor">1</property>
<property name="strategyClass">org.apache.cassandra.locator.SimpleStrategy</property>
<property name="readConsistencyLevel">QUORUM</property>
<property name="writeConsistencyLevel">QUORUM</property>
<property name="keyspaceName">HIVE_INCREMENTAL_KS</property>
</dataSourceProps>
</configuration>
</definition>
</datasource>
When I look in mydbserver/wso2_apiStatsdb I see that the following tables were created and have been updated with data:
API_REQUEST_SUMMARY, API_Resource_USAGE_SUMMARY
Update: some additional tables have been added and updated:
API_FAULT_SUMMARY, API_RESPONSE_SUMMARY, API_VERSION_USAGE_SUMMARY
However, when I go to the publisher page I get the error shown above.
As far as I can tell, I've updated the publisher/store and gateway/key manager to send data to the BAM server. The BAM server appears to process that data and put the summarized data in the SQL Server database. My SQL Server database entries match between the publisher/store config and the BAM server config. Does anyone have any ideas what I'm missing that will enable me to see the stats in the publisher?
Can you please check whether EnableBillingAndUsage is set to false? This is available in conf/api-manager.xml:
<EnableBillingAndUsage>false</EnableBillingAndUsage>
Gammonster, this has been figured out. The devil is indeed in the details: the runtime was trying to execute a SQL query using the 'limit' keyword against a SQL Server database, and SQL Server does not support 'limit'; it has 'top' instead.
I would have appreciated it if the product documentation expressly identified SQL Server as not being a preferred DB for BAM.
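For anyone hitting the same exception, the incompatibility looks roughly like this. The queries below are only illustrative, with a made-up column name, not the exact statement the usage client runs:

-- MySQL/H2-style paging; fails on SQL Server with "Incorrect syntax near 'limit'"
SELECT requestTime FROM API_REQUEST_SUMMARY ORDER BY requestTime ASC LIMIT 1;

-- SQL Server equivalent using TOP
SELECT TOP 1 requestTime FROM API_REQUEST_SUMMARY ORDER BY requestTime ASC;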
Are you running BAM on port offset 1? As far as I can see you have BAM running on a different server. Try running BAM with port offset 1. The default config provided on the site is for a use case where API Manager and BAM run on the same server with port offsets (API Manager 0 and BAM 1), hence all the relevant Thrift ports are also incremented by 1 in the default config. So I suggest you run BAM on offset 1.
In the meantime, please check whether the SQL DB that holds the summarized stats contains any data.
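If you do run BAM with an offset, it is set on the BAM node. A minimal sketch, assuming the default carbon.xml layout; with offset 1 the Thrift receiver port becomes 7612, matching the ThriftPort/BAMServerURL values above:

<!-- repository/conf/carbon.xml on the BAM server -->
<Ports>
    <Offset>1</Offset>
</Ports>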
This can also happen if you haven't configured the summarized database correctly in API Manager. From what you have mentioned, API Manager seems to publish data correctly, but it's not fetching the summarized data. Please check whether you have defined WSO2AM_STATS_DB in master-datasources.xml inside the APIM directory.