WSO2 APIM Analytics: How to capture seconds in WSO2AM_STATS_DB

I have a requirement where the 'Time' column in 'WSO2AM_STATS_DB' should also capture seconds.
By default it captures only values like 2018-11-18 15:04. How can the Time column capture seconds as well?
Thanks in advance.

Related

Dynamically setting the timestamp fields in Superset dashboards

I'm building a few dashboards in Apache Superset. All my available timestamp fields are in the UTC timezone (for example, class_start_time and class_end_time).
I want all timestamp fields to be converted automatically to the timezone in which the dashboard is opened.
For example, if I open the dashboard in Norway, the UTC data should be converted to Norway's CET timezone.
I have tried adding a value in the Hours offset setting, but it's not working.
Can you please guide me on how to achieve this?
Just for reference:
Kibana dashboards (ELK stack) have a feature that automatically converts timestamps to the timezone in which the dashboard is opened. I need the same thing in Superset.
Normally you would be able to set this with environment variables when you start the program or container. In Apache Superset, this is not possible. There is an ongoing discussion on GitHub about this issue. One GitHub user describes the problem and a workaround, which is far from workable:
Daylight savings causes issues where users have to update datasource
timezone offset for each datasource twice per year.
So the only thing you can do is update the Hours offset twice a year. To make matters even worse, if you use PostgreSQL, this may not even be possible due to a bug as described here.
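To illustrate why a single fixed Hours offset cannot track daylight saving time, here is a small Python sketch (independent of Superset) showing that the UTC offset of Europe/Oslo itself changes during the year; the dates are arbitrary examples:

# Illustration only: a DST-observing zone such as Europe/Oslo (the Norway
# example above) has a different UTC offset in winter and in summer.
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

oslo = ZoneInfo("Europe/Oslo")

winter = datetime(2022, 1, 15, 12, 0, tzinfo=timezone.utc)
summer = datetime(2022, 7, 15, 12, 0, tzinfo=timezone.utc)

print(winter.astimezone(oslo).utcoffset())  # 1:00:00 (CET)
print(summer.astimezone(oslo).utcoffset())  # 2:00:00 (CEST)

Any fixed offset is therefore wrong for roughly half the year, which is exactly why the workaround requires editing the datasource twice per year.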

How to get the entire period of AWS CloudWatch Logs data in a Logs Insights query

I have an AWS CloudWatch log group called "data_analysis" with a retention period of 30 days, and yesterday I changed the retention period to "Never expire". When querying the data with Logs Insights on the "data_analysis" log group, I still only get 30 days of data. Could you please suggest a solution to get the entire data set (at least 365 days)?
Note: the query returns no more than 1000 results, and I am adding a limit condition to the Logs Insights query.
Thanks in advance
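For context, a query like the one described above might be issued with boto3 roughly as follows; the log group name, the 365-day range, and the query string are assumptions for illustration only:

# Sketch of a Logs Insights query over an explicit time range with a limit.
import time
import boto3

logs = boto3.client("logs")
now = int(time.time())

query = logs.start_query(
    logGroupName="data_analysis",
    startTime=now - 365 * 24 * 3600,  # epoch seconds, 365 days back
    endTime=now,
    queryString="fields @timestamp, @message | sort @timestamp desc | limit 10000",
)

# Poll until the query finishes, then inspect the results.
while True:
    results = logs.get_query_results(queryId=query["queryId"])
    if results["status"] in ("Complete", "Failed", "Cancelled"):
        break
    time.sleep(1)
print(len(results.get("results", [])))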

Error in Google Play transfer frequency - Google BigQuery

I want to set up a weekly Google Play transfer, but it cannot be saved.
At first, I set up a daily Play transfer job. It worked. I then tried to change the transfer frequency to weekly - every Monday 7:30 - and got an error:
"This transfer config could not be saved. Please try again.
Invalid schedule [every mon 7:30]. Schedule has to be consistent with CustomScheduleGranularity [daily: true ].
I think this document shows that the transfer frequency can be changed:
https://cloud.google.com/bigquery-transfer/docs/play-transfer
Can a Google Play transfer be set to weekly?
By default the transfer is created as daily. From the same docs:
Daily, at the time the transfer is first created (default)
Try creating a brand new weekly transfer. If that works, I would think it is a web UI bug. Here are two other options for changing your existing transfer:
BigQuery command-line tool: bq update --transfer_config
A very limited number of options are available, and schedule is not available for update.
BigQuery Data Transfer API: transferConfigs.patch
Most transfer options are updatable. An easy way to try it is with the API Explorer. See the details of the transferConfig object. The schedule field needs to be defined:
Data transfer schedule. If the data source does not support a custom
schedule, this should be empty. If it is empty, the default value for
the data source will be used. The specified times are in UTC. Examples
of valid format: 1st,3rd monday of month 15:30, every wed,fri of
jan,jun 13:15, and first sunday of quarter 00:00. See more explanation
about the format here:
https://cloud.google.com/appengine/docs/flexible/python/scheduling-jobs-with-cron-yaml#the_schedule_format
NOTE: the granularity should be at least 8 hours, or less frequent.
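As a rough sketch of the transferConfigs.patch route, assuming the google-cloud-bigquery-datatransfer Python client, a placeholder config name, and a weekly schedule string in the format quoted above, updating only the schedule field might look like this:

# Hedged sketch: patch only the "schedule" field of an existing transfer config.
# The config name and schedule string are placeholders, not verified values.
from google.cloud import bigquery_datatransfer
from google.protobuf import field_mask_pb2

client = bigquery_datatransfer.DataTransferServiceClient()

transfer_config = bigquery_datatransfer.TransferConfig(
    name="projects/my-project/locations/us/transferConfigs/1234abcd",  # placeholder
    schedule="every monday 07:30",  # weekly, times are in UTC
)

updated = client.update_transfer_config(
    transfer_config=transfer_config,
    update_mask=field_mask_pb2.FieldMask(paths=["schedule"]),
)
print(updated.schedule)

Whether the Play data source accepts a non-daily schedule is governed by the CustomScheduleGranularity mentioned in the error message, so this may still be rejected server-side.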

WSO2 CEP : Siddhi QL: Creating a unique stream with similar event records

I am pretty new to WSO2 CEP Siddhi QL. I have a requirement to analyze the events coming into a stream.
For example, I have data coming in like this [id, value]:
InputStream=[1001,90]
InputStream=[1001,85]
InputStream=[1002,70]
InputStream=[1001,85]
InputStream=[1003,70]
InputStream=[1003,85]
InputStream=[1002,70]
InputStream=[1003,70]
InputStream=[1003,85]
InputStream=[1002,70]
InputStream=[1001,95]
InputStream=[1001,65]
From this, I want to segregate the records and group them by id (1001, 1002, and 1003), create a new temp stream for each grouped id, check the highest value in it, and alert on it. I have tried different patterns and joins, but have not been able to zero in on an exact solution.
Any help / guidance towards the solution would be greatly appreciated. Thank you.
In Siddhi it is not possible to create new streams based on event values. However, since your requirement is to group by id and alert on the highest value of each group, we can achieve that with a single output stream. What we need to do is maintain a window (time or length window) for a duration, group by id, select the maximum value, and insert it into an alert stream. Please refer to the following sample Siddhi query.
define stream InputStream (id string, value int);

from InputStream#window.time(2 min)
select id, max(value) as highestValue
group by id
insert into alertStream;

Google Admin Reports API: Users Usage Report stats accuracy

I am trying to use the Google Admin Reports API Users Usage Report to pull the number of emails received/sent per user per day in our org's Google Apps domain.
When I use the Google APIs Explorer to pull my own stats for a particular day and compare them with the real situation, the numbers seem far off.
For example, on Sunday, 7 Dec 2014, I only sent out one email, but the stats show that 4 emails were sent by me on that day.
Any assistance would be appreciated
Cheers,
You should get the same results as searching in Gmail:
in:sent from:me after:2014/12/07 before:2014/12/08
The missing bit is the time zone the server is using, which in my research is always Pacific Standard Time.
Did you:
Send out any calendar invitations that day? (1 email per attendee)
Share any Google Drive files/folders that day? (1 email per file shared)
Send mail from a Google Group?
There are likely other actions you may have performed in other Google Apps which caused emails to go out in your name and count against your quota, but not necessarily show in your Sent folder.
If you'd like these messages to appear in your Sent folder, turn on Comprehensive Storage.
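For reference, here is a minimal sketch of the Users Usage Report call the question describes, using the google-api-python-client library; the service-account file, admin address, and user address are placeholders, and the date and parameters come from the question:

# Sketch of pulling Gmail sent/received counts from the Reports API.
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Placeholder credentials: a service account with domain-wide delegation,
# impersonating an admin user. Adjust to however you normally authorize.
creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/admin.reports.usage.readonly"],
    subject="admin@example.com",
)

service = build("admin", "reports_v1", credentials=creds)

# Counts for the day discussed in the question (dates are server-side PST).
report = service.userUsageReport().get(
    userKey="user@example.com",
    date="2014-12-07",
    parameters="gmail:num_emails_sent,gmail:num_emails_received",
).execute()

for entry in report.get("usageReports", []):
    for param in entry.get("parameters", []):
        print(param["name"], param.get("intValue"))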