WSO2 BAM - Custom events in Dashboard

I am working with the WSO2 BAM dashboard and want custom events to be shown on it. Is it possible to display custom events this way?

Yes, you can do this. WSO2 BAM is designed to receive events and store them in Apache Cassandra; it can summarise the collected data with Apache Hive/Hadoop and write the results to an RDBMS. By default, H2 is used as that database.
You can write a Jaggery application or use the Gadget Generation Tool. Any other third-party dashboard you want to integrate can also be used, as long as it can read from the RDBMS to which you write back the summarised data.
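For instance, a Jaggery page can query the summarised data directly. This is only a minimal sketch; the JDBC URL, credentials, and the CUSTOM_EVENT_SUMMARY table name are assumptions you would replace with your own Hive script's output table:

<%
// Assumed H2 summary database and table; adjust to match your Hive script's output.
var db = new Database("jdbc:h2:repository/database/samples/BAM_STATS_DB", "wso2carbon", "wso2carbon");
try {
    var rows = db.query("SELECT * FROM CUSTOM_EVENT_SUMMARY");
    print(rows); // emits the result set as JSON for a gadget or page to render
} finally {
    db.close();
}
%>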

Related

How to externalize API Manager Analytics

Is there a way to externalize API Manager Analytics without using Choreo Cloud?
In a situation where we don't have the ELK stack, can we use custom log files, CSV, or a database to store Analytics information so we can run custom reports against them?
Can we write a custom handler to write Analytics information into a different external source?
By default, WSO2 API Manager uses log files to record API Analytics information. It logs every successful, faulty, and throttled event into the log file. With the on-premises Analytics option, we can use the ELK stack, in which Filebeat and Logstash read those logs and filter out the Analytics-related information.
There is no out-of-the-box option to plug in a custom log file, CSV, or a database as the destination for Analytics information. However, we can write a custom handler/event publisher (as demonstrated in [1]) to read those log files, collect the Analytics-related information, and put it into a different format such as CSV or a database. Instead of publishing the already available analytics event data, it is also possible to publish custom analytics data with the existing event schema, as demonstrated in [2].
But this requires a lot of effort and amounts to implementing a new Analytics option from scratch, because the custom event publisher is responsible only for publishing the events; we would still need some way of visualising them.
As of now, there are only two options for API Manager Analytics: Choreo Cloud (cloud) or the ELK stack (on-premises).
[1] https://apim.docs.wso2.com/en/latest/api-analytics/samples/publishing-analytics-events-to-external-systems/
[2] https://apim.docs.wso2.com/en/latest/api-analytics/samples/publishing-custom-analytics-data/
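As a rough illustration of the log-to-CSV idea described above, here is a hypothetical Node.js sketch. The log file name, the embedded-JSON line format, and the field names (apiName, proxyResponseCode, responseLatency) are all assumptions; match them against what your gateway actually writes before relying on anything like this:

const fs = require('fs');
const readline = require('readline');

async function logToCsv(logPath, csvPath) {
  const out = fs.createWriteStream(csvPath);
  out.write('timestamp,apiName,status,latencyMs\n');
  const rl = readline.createInterface({ input: fs.createReadStream(logPath) });
  for await (const line of rl) {
    // Assumption: each analytics line carries a JSON payload after a log prefix.
    const idx = line.indexOf('{');
    if (idx < 0) continue;
    try {
      const e = JSON.parse(line.slice(idx));
      out.write(e.requestTimestamp + ',' + e.apiName + ',' + e.proxyResponseCode + ',' + e.responseLatency + '\n');
    } catch (err) { /* skip lines that are not analytics JSON */ }
  }
  out.end();
}

logToCsv('wso2-apim-analytics.log', 'analytics.csv');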

When to use Firestore vs Pub/Sub

Can you elaborate on the differences between Pub/Sub and Firestore and provide some scenarios or use cases for choosing one over the other?
I'm not sure which one to use for building an app for a food delivery service, such as UberEats, that serves real-time updates reflected as soon as they are added or changed in the database, ensuring that customers and drivers know when food is ready for pickup and when it is in transit to its destination.
The difference is quite simple:
Firestore (RealtimeDB) is for backend-to-frontend (customers/users) communication and realtime updates
Pub/Sub is a backend-to-backend message bus for async processing.
In your use case, you won't use Pub/Sub to send notifications to your users! Use the realtime database to deliver these updates.
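To make that concrete, here is a minimal sketch with the Firebase v9 modular JavaScript SDK; the orders collection, document ID, and status field are assumptions for the delivery scenario:

import { initializeApp } from 'firebase/app';
import { getFirestore, doc, onSnapshot } from 'firebase/firestore';

const app = initializeApp({ /* your Firebase project config */ });
const db = getFirestore(app);

// Each customer/driver client listens to its order document; Firestore pushes
// every change (e.g. status moving to "ready" or "in_transit") immediately.
onSnapshot(doc(db, 'orders', 'order-123'), (snapshot) => {
  const order = snapshot.data();
  console.log('Order status: ' + order.status);
});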
Pub/Sub is like a notification system wherein you receive updates when something is added, changed or removed.
Firestore, on the other hand, is a NoSQL database for mobile (Android, iOS) and web apps that can be accessed directly via native SDKs. It supports many data types, from simple strings to complex objects, and whatever data structure works best for your app.
It is best to use Firestore for your app as it provides realtime updates.
You can check the detailed documentation of Pub/Sub and Firestore, which covers their benefits and key features. For Firestore, you can use either a mobile/web client library or a server client library.
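For contrast, a Pub/Sub interaction is backend-to-backend. The sketch below uses the @google-cloud/pubsub Node.js client; the topic and subscription names are placeholders and are assumed to already exist:

const { PubSub } = require('@google-cloud/pubsub');
const pubsub = new PubSub();

async function main() {
  // One backend service publishes an event...
  await pubsub.topic('order-events').publishMessage({
    json: { orderId: 'order-123', status: 'ready' },
  });

  // ...and another backend service consumes it asynchronously. No end-user
  // device is involved anywhere in this flow.
  pubsub.subscription('dispatch-worker').on('message', (message) => {
    console.log('Received:', message.data.toString());
    message.ack();
  });
}

main();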

WSO2 - Best way to implement ETL jobs

What is the best way to implement ETL jobs using WSO2?
We've been trying to leverage data services within WSO2 EI 6.4.
Our objective is to fetch data from web services as well as RDBMS and to store it to an RDBMS.
Any suggestions / ideas will be much appreciated.
In my experience with WSO2 middleware, data services may not be the best fit for ETL jobs.
We had a similar case where we wanted to copy data from one database to another application.
For that, we wrote an integration (Java) web service to fetch data from the database and send it to the application through the application's interface (a web service exposed by the application), and configured the integration web service in the EI scheduler to run periodically and sync the data; the sketch below shows the shape of that loop.
Yes, Stream Processor is better, but you really need to check whether it fits ETL.
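The periodic sync pattern, independent of WSO2, looks roughly like this hypothetical Node.js sketch. The table, column, and endpoint names are placeholders, and in practice the WSO2 EI scheduler would trigger the job rather than a timer in the code:

const mysql = require('mysql2/promise');

async function syncOnce() {
  const db = await mysql.createConnection({ host: 'localhost', user: 'etl', database: 'source_db' });
  try {
    // Extract: pick up the rows that have not been pushed yet.
    const [rows] = await db.execute('SELECT id, payload FROM staging WHERE synced = 0');
    for (const row of rows) {
      // Load: hand each record to the target application's exposed web service
      // (global fetch requires Node 18+).
      await fetch('https://target-app.example/api/records', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify(row),
      });
      await db.execute('UPDATE staging SET synced = 1 WHERE id = ?', [row.id]);
    }
  } finally {
    await db.end();
  }
}

setInterval(syncOnce, 15 * 60 * 1000); // every 15 minutes, mimicking the EI scheduler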

How to configure Sitecore processing server?

I just installed Sitecore Experience Platform and configured it according to the Sitecore scaling recommendations for processing servers.
But I want to know the following things:
1. How can I use the Sitecore processing server?
2. How can I check whether the processing server is working correctly?
3. How is collection DB data processed and sent to the reporting server?
The processing server is a piece of the whole analytics (xDB) part of the Sitecore solution. More info can be found here.
Snippet:
"The processing and aggregation component extracts information from
captured, raw analytics data and transforms it into a form suitable
for use in reporting applications. It also performs specific tasks on
the collection database that involve mass updates.
You implement processing and aggregation on a Sitecore application
server connected to both the collection and reporting databases. A
processing server can run independently on a dedicated server, or on
the same server together with other Sitecore components. By
implementing multiple processing or aggregation servers, it is
possible to achieve higher performance on high-traffic solutions."
In short: the processing server aggregates the data in Mongo and processes it into the reporting database. This can be put on a separate server in order to spare resources on your other servers. I'm not quite sure what it all does behind the scenes or how to check exactly and only that part of the process, but you could check the reporting tools in the Sitecore backend, like Experience Analytics. If those are working, you are probably fine. Also, check the logs on the processing server - they will give you an indication of what it is doing and whether any errors occur.

How to persist a runtime parameter of a service call, then use it as a parameter for the next service call (WSO2 ESB)

I am seeking advice on the most appropriate method for the following use case.
I have created a number of services using the WSO2 Data Services Server, which I want to run periodically, passing in parameters derived from the last run date; i.e. each data service takes start and end date parameters to run its SQL against.
I plan to create a service within WSO2 ESB to mediate the execution of these services and combine the results to pass on to another web service. I think I can manage this ;-) I will use a scheduled task to start it at a predefined interval.
Where I am seeking advice is how to keep track of the last successful run time, as I need to use it as a parameter for the data services.
My options as I see them:
create a config table in my database and create another data services web service to retrieve and persist these values
use the VFS transport and somehow persist these values to a text file as XML, CSV, or JSON
use some other means, like property values in the ESB sequence, and somehow persist them
any other??
With my current knowledge, option 1 seems easiest, but it doesn't feel right, as I would need write access to the database - something I possibly wouldn't have when architecting a solution like this in the future. Option 2 looks workable given my limited knowledge of WSO2 ESB to date, but is option 3 the best choice? As you can see from the detail above, this is where I start to flounder.
Any suggestions would be most welcome
I do not have much experience with ESB. However, I also feel that your first option would be the easiest to implement.
A related topic was also discussed recently on the WSO2 architecture mailing list under the subject "[Architecture] Allow ESB to put and update registry properties".
Introducing a registry mediator was discussed there, but I'm not sure it will be implemented soon.
I hope this helps.
As of now, there is no direct method to save content to the registry through the ESB. But you can always write a custom mediator to do that, or use the script mediator to achieve it.
The following is a code snippet for the script mediator:
<script language="js"><![CDATA[
    importPackage(Packages.org.apache.synapse.config);
    /* create a new resource */
    mc.getConfiguration().getRegistry().newResource("conf:/store/myStore", false);
    /* update the resource with the current property value */
    mc.getConfiguration().getRegistry().updateResource(
        "conf:/store/myStore", mc.getProperty("myProperty").toString());
]]></script>
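In a later sequence you can read the stored value back. This is a hedged sketch, assuming the message context's getEntry() resolves the registry key; the lastRunDate property name is a placeholder for whatever your data services call expects:

<script language="js"><![CDATA[
    /* look up the previously stored resource and expose it as a property
       for the next data services call */
    var lastRun = mc.getEntry("conf:/store/myStore");
    mc.setProperty("lastRunDate", "" + lastRun);
]]></script>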
I've written a blog post on how to do this in ESB 4.8.1. You can find it here.