Can someone explain to me, step by step, how to configure location tracking with WSO2 IoT 3.1.0 Analytics for 1000 Android mobile devices?
We have a lot of technicians working in the field and we need to know whether they are off their itineraries or not. We have several groups of technicians across the country.
My hope is to have several custom location-based reports showing devices and technicians' itineraries on an Analytics dashboard. When a technician is out of the zone, I should receive an alert message.
I have a platform with wso2iot-3.1.0-update1, running on one server (OS: CentOS 6.7).
Geofencing is activated in devicemgt (In zone, Out of zone).
I'm still using the default OpenStreetMap.
Devices and user accounts are already in device management, but when we click on a device's location, the itinerary is not well adjusted to the map.
Is it possible to do this with OpenStreetMap, or must I use Google Maps, and how?
Note that this is for real-time tracking.
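To make the requirement concrete, here is a minimal sketch of the out-of-itinerary check I have in mind (plain Java, independent of the WSO2 APIs; the route, threshold, and class names are hypothetical): flag a device as out of zone when its reported position is farther than a threshold from every waypoint of the planned route.

```java
import java.util.List;

// Minimal sketch of an "out of itinerary" check (hypothetical names, not WSO2 API calls):
// a device is out of zone when its reported position is farther than a threshold
// from every waypoint of the technician's planned route.
public class GeofenceCheck {

    record Point(double lat, double lon) {}

    // Haversine distance between two GPS points, in meters.
    static double distanceMeters(Point a, Point b) {
        double r = 6_371_000;                            // Earth radius in meters
        double dLat = Math.toRadians(b.lat() - a.lat());
        double dLon = Math.toRadians(b.lon() - a.lon());
        double h = Math.pow(Math.sin(dLat / 2), 2)
                 + Math.cos(Math.toRadians(a.lat())) * Math.cos(Math.toRadians(b.lat()))
                 * Math.pow(Math.sin(dLon / 2), 2);
        return 2 * r * Math.asin(Math.sqrt(h));
    }

    // True if the device position is farther than thresholdMeters from every waypoint.
    static boolean isOutOfZone(Point device, List<Point> itinerary, double thresholdMeters) {
        return itinerary.stream()
                        .allMatch(p -> distanceMeters(device, p) > thresholdMeters);
    }

    public static void main(String[] args) {
        List<Point> route = List.of(new Point(48.8566, 2.3522), new Point(48.8600, 2.3600));
        Point device = new Point(48.9000, 2.4500);
        if (isOutOfZone(device, route, 500)) {
            System.out.println("ALERT: technician is out of zone"); // where the alert message would be sent
        }
    }
}
```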
Documentation used:
https://docs.wso2.com/display/IoTS310/Monitoring+Devices+Using+Location+Based+Services
https://docs.wso2.com/display/IoTS310/Understanding+the+WSO2+IoT+Server+Analytics+Framework
Thanks in advance
I have a Books API project, and the GCP console shows "No data is available for the selected time frame" for the last 30 days. This message appears on both the "Metrics" and "Quotas" pages. See screenshots below.
Clearly there is data, which I can see via my app analytics reports.
Any suggestions on how to fix it?
UPDATE 1:
Following are some points that were missing from the original post:
The Google Books API is used by an iOS app, which is available on the App Store and widely used across many iOS devices (iPhones and iPads) in many countries.
There are thousands of iOS devices running my app so the Google Books API calls are invoked from thousands of endpoints with different locations and different IPs. All endpoints are using the same API_KEY.
The Google Books API calls are performed successfully from the iOS devices and there is no API issue (I can clearly see that using analytics tool).
The only issue I have is with the GCP console not showing the number of API calls (and other metrics) associated with my API_KEY. As you can see in the previous screenshots, I get "No data is available for the selected time frame" everywhere.
This is a regression issue, since until recently I could successfully view the actual API usage data. I didn't change anything in this period.
When going to GCP > IAM & Admin > Quotas, you can clearly see that the app indeed consumes API calls (see screenshot below).
Any suggestion why would the GCP console tell that no data is available, while data is indeed available?
As stated in the documentation [1], Google Books respects copyright, contract, and other legal restrictions associated with the end user's location. As a result, some users might not be able to access book content from certain countries. For example, certain books are "previewable" only in the United States; such preview links are omitted for users in other countries. Therefore, the API results are restricted based on your server or client application's IP address.
On the other hand, link [2], which seems similar to the issue you are facing, could be helpful. Documentation [3] [4] also provides more information about using the Books API on Google Cloud Platform.
[1] https://developers.google.com/books/docs/v1/using#UserLocation
[2] Google books api always returns nothing
[3] https://developers.google.com/books/docs/v1/using
[4] https://developers.google.com/books/docs/v1/getting_started
I am looking at this Uber architecture picture: https://imgur.com/a/c1Nkuvf and I am wondering about the box in the center with the DISCO, Supply, and Demand services. The idea is that the Demand service calls the Supply service, which calls one of the servers (Region 1 to Region 5), gets the information, and sends it back to the Demand service, which then sends it to the client. My question is: where do these services reside, and is this box with the three of them some kind of module, message bus, or something else?
The box with the words Disco, Supply, and Demand represents the three major services that work together to match riders with drivers. The Demand service receives demand from riders and keeps track of their GPS locations, while the Supply service keeps track of drivers and their vehicle locations. The Disco service performs the calculations so that riders are matched with drivers optimally in terms of distance, time, and other factors. There would be a set of sub-services within each major service to perform several low-level tasks. These services run inside numerous geo-distributed app servers, as depicted by Regions 1 to 5 in the diagram. So to answer your question, the box is just a representation of these geo-distributed app servers. Hope this helps!
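If it helps to picture what the matching calculation looks like, here is a deliberately simplified, hypothetical sketch (not Uber's actual code or APIs): pick the available driver closest to the rider. The real Disco service also weighs ETA, supply/demand balance, and other factors.

```java
import java.util.Comparator;
import java.util.List;
import java.util.Optional;

// Hypothetical sketch of the kind of matching a dispatch ("Disco") service performs:
// choose the available driver closest to the rider. A real system would also
// consider ETA, supply/demand balance and other factors.
public class DispatchSketch {

    record Location(double lat, double lon) {}
    record Driver(String id, Location location, boolean available) {}

    // Squared distance in coordinate space is enough for comparing nearby candidates.
    static double distance2(Location a, Location b) {
        double dLat = a.lat() - b.lat();
        double dLon = a.lon() - b.lon();
        return dLat * dLat + dLon * dLon;
    }

    static Optional<Driver> matchNearestDriver(Location rider, List<Driver> drivers) {
        return drivers.stream()
                      .filter(Driver::available)
                      .min(Comparator.comparingDouble(d -> distance2(rider, d.location())));
    }

    public static void main(String[] args) {
        List<Driver> drivers = List.of(
            new Driver("d1", new Location(37.77, -122.42), true),
            new Driver("d2", new Location(37.80, -122.40), false),
            new Driver("d3", new Location(37.78, -122.41), true));
        matchNearestDriver(new Location(37.775, -122.415), drivers)
            .ifPresent(d -> System.out.println("Matched driver: " + d.id()));
    }
}
```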
I am having some trouble with Monitoring System Statistics in WSO2 Identity Server 5.3.3. There is no activity being reported in the Service Summary. We are about to go live with the WSO2 Identity Server in a couple of weeks, so I really want to keep an eye on the response times and counts.
Our current production system is not heavily used, but currently it shows Total Response Count: 0, when in fact I have tested several logins earlier today.
I know the monitoring was updating a few weeks ago, but something happened. Do I need to enable this via a setting or is it possible it was turned off?
This is the official documentation entry point on setting up and using statistics/analytics for the latest WSO2 Identity Server version. Please make sure you have followed all the steps accurately. You will have to enable it separately and configure it in the files below. After enabling it, you should be able to view the stats.
<IS_HOME>/repository/conf/identity/identity.xml
<IS_HOME>/repository/deployment/server/eventpublishers/IsAnalytics-Publisher-wso2event-*
I have a real time web analytics problem to address, and I'm wondering if some of the WSO2 products might be an appropriate solution.
An ecommerce web site shows pages of products to a browser user, and the web site vendor wants to collect details of which products were viewed in a list, which products were selected from the list for more info, which products were put into the basket, and which products were actually purchased, all in real time. I can use web page tagging to generate logging events for the four states (i.e. in list, view detail, in basket, purchased). The web site vendor wants to see results summarized by product and by rolling time band (e.g. last hour, last 6 hours, last 24 hours, last 72 hours) for the four product states.
As a complete WSO2 newbie, I'm hoping somebody can help with some pointers on how to address this. I've been reading about the BAM module to capture events. Is that a good place to start? Also, can anybody suggest a good in-memory data store to hold the event data aggregated by event type and rolling time period?
TIA
Yes. BAM is more of a batch-processing, monitoring, and complex event processing engine; using it you can capture data, process it, and then present it. From an architectural point of view, the product-state changes made by the browser user will be captured by the web server and published to the BAM server.
A good place to start is learning about data publishing. Once you define the data to be published (in BAM this is known as a stream definition), you can write a Hive script to process it and present it. You can pump all the data to BAM, then use a Hive script to process it and store it in the manner you want. Later you can retrieve and present it.
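Independent of BAM itself, here is a minimal in-memory sketch (plain Java; the class, product IDs, and storage choice are hypothetical) of the rolling-time-band summary the question asks for: counting events per product and state over the last 1, 6, 24, and 72 hours. A production setup would push the events to BAM/CEP or a dedicated store rather than an unbounded in-process queue.

```java
import java.time.Duration;
import java.time.Instant;
import java.util.List;
import java.util.concurrent.ConcurrentLinkedQueue;

// Hypothetical in-memory sketch of the rolling-time-band summary the site vendor wants:
// counts per product and state over the last N hours. A production setup would use
// BAM/CEP or an external store, not an unbounded in-process queue.
public class ProductEventStats {

    enum State { IN_LIST, VIEW_DETAIL, IN_BASKET, PURCHASED }

    record Event(String productId, State state, Instant at) {}

    private final ConcurrentLinkedQueue<Event> events = new ConcurrentLinkedQueue<>();

    // Called from the page-tagging pipeline for every logged event.
    public void record(String productId, State state) {
        events.add(new Event(productId, state, Instant.now()));
    }

    // Count of events for one product and state within the rolling window.
    public long count(String productId, State state, Duration window) {
        Instant cutoff = Instant.now().minus(window);
        return events.stream()
                     .filter(e -> e.at().isAfter(cutoff))
                     .filter(e -> e.productId().equals(productId) && e.state() == state)
                     .count();
    }

    public static void main(String[] args) {
        ProductEventStats stats = new ProductEventStats();
        stats.record("sku-42", State.IN_LIST);
        stats.record("sku-42", State.PURCHASED);
        for (Duration band : List.of(Duration.ofHours(1), Duration.ofHours(6),
                                     Duration.ofHours(24), Duration.ofHours(72))) {
            System.out.printf("sku-42 purchased in last %d h: %d%n",
                              band.toHours(), stats.count("sku-42", State.PURCHASED, band));
        }
    }
}
```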
I was looking into whether it is possible to insert/update a large quantity of rows from an AS/400 system.
I have a website hosted on another online server, and that website must be updated with the new stock levels for each article. However, this data only exists in the AS/400 system.
For security reasons, I would like the AS/400 system to connect to the web server rather than the web server connecting to the AS/400.
Ideally, the update/insert would happen every time a change is made on the AS/400, but if this is not possible, an update every 3 hours would maintain consistency between the two servers.
Thanks
Yes it can be done.
You can attach a database trigger to your stock table that pushes the key fields to a data queue anytime an insert, update or delete is performed. You can then process the data queue to send the updates to the web site using an HTTP POST or other means.
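As a rough illustration only, the queue-processing side could look something like the sketch below. It assumes the IBM Toolbox for Java (JTOpen) DataQueue classes; the system name, credentials, library, queue path, target URL, and payload format are all hypothetical placeholders.

```java
import com.ibm.as400.access.AS400;
import com.ibm.as400.access.DataQueue;
import com.ibm.as400.access.DataQueueEntry;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Rough sketch: read the key fields the trigger pushed onto a data queue and
// forward each change to the web site with an HTTP POST. System name, credentials,
// library/queue names, URL and payload format are hypothetical.
public class StockQueueForwarder {

    public static void main(String[] args) throws Exception {
        AS400 system = new AS400("MYAS400", "MYUSER", "MYPASS");
        DataQueue queue = new DataQueue(system, "/QSYS.LIB/MYLIB.LIB/STOCKDQ.DTAQ");
        HttpClient http = HttpClient.newHttpClient();

        while (true) {
            DataQueueEntry entry = queue.read(-1);         // -1 = wait indefinitely for an entry
            String articleKey = entry.getString().trim();  // key fields written by the trigger

            // Forward the change; the web server looks up/receives the new stock for this article.
            HttpRequest request = HttpRequest.newBuilder(URI.create("https://example.com/api/stock"))
                    .header("Content-Type", "application/json")
                    .POST(HttpRequest.BodyPublishers.ofString("{\"article\":\"" + articleKey + "\"}"))
                    .build();
            HttpResponse<String> response = http.send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println("Posted " + articleKey + " -> HTTP " + response.statusCode());
        }
    }
}
```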
IBM Redbook: Stored Procedures, Triggers, and User-Defined Functions on DB2 Universal Database for iSeries
IBM i 7.1 Information Center: Data Queue APIs
Scott Klement's RPG IV Sockets Tutorial