I want to push a report PDF to Slack on a daily basis, but I did not find a direct approach to do this. There is a reporting schedule I created in Kibana from a dashboard, but it doesn't have a Slack integration available.
I found the Kibana endpoint /_plugin/kibana/api/reporting/generateReport?timezone=Asia/Calcutta to download the dashboard's raw report data using curl, but the result doesn't open as a PDF. The data is in this format:
{"data":"<some big raw data>","filename":"XXX.pdf"}
How can I convert this data into an openable format? Or, if an alternative is available, how can I get the Kibana report PDF into Slack?
Steps to get the curl command to fetch dashboard PDF data:
Open the browser developer tools -> Network tab
Go to Kibana -> Dashboards -> select a dashboard -> Reporting (top right) -> Download PDF
Find the network call, right-click it, and select "Copy as cURL"
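A minimal sketch of one possible approach in Python, assuming the "data" field is base64-encoded PDF bytes (verify this against your own payload). The Kibana host, Slack bot token, and channel are placeholders; in practice, reuse the exact method, headers, and cookies from the curl command you copied.

import base64

import requests

# All of these values are placeholders -- use your own Kibana host, Slack bot
# token, and channel, and reuse the method/headers/cookies from the copied curl.
KIBANA_URL = "https://<kibana-host>/_plugin/kibana/api/reporting/generateReport?timezone=Asia/Calcutta"
SLACK_TOKEN = "xoxb-your-bot-token"
SLACK_CHANNEL = "#daily-reports"

# Fetch the report payload: {"data": "<some big raw data>", "filename": "XXX.pdf"}
resp = requests.post(KIBANA_URL, headers={"kbn-xsrf": "true"})
resp.raise_for_status()
payload = resp.json()

# Assumption: the "data" field is base64-encoded PDF bytes.
pdf_bytes = base64.b64decode(payload["data"])

# Upload the decoded PDF to Slack via the files.upload Web API method.
upload = requests.post(
    "https://slack.com/api/files.upload",
    headers={"Authorization": "Bearer " + SLACK_TOKEN},
    data={"channels": SLACK_CHANNEL, "title": payload["filename"]},
    files={"file": (payload["filename"], pdf_bytes, "application/pdf")},
)
upload.raise_for_status()

Run from a daily cron job (or any scheduler), this covers the "on a daily basis" requirement without Kibana-side Slack integration.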
We have a requirement to fetch data from a REST API into Power BI and schedule a refresh every night. The REST API supports JWT authentication, so it needs headers with an x-api-key and an access token.
I have managed to write a function in Power Query that gets an access token from our auth endpoint and injects it into the REST API call, and it works fine in Power BI Desktop. I have published the report to the Power BI service.
The auth endpoint requires a username and password. We would rather not store these details in the .pbix file and publish it to the cloud; instead, we would like to use Azure Key Vault and have Power BI fetch the details at runtime.
Please advise.
Power Automate has a great Azure Key Vault connector.
You could make a simple three-action flow (for example: an HTTP request trigger, a Key Vault "Get secret" action, and a Response action that returns the secret).
A POST to the flow's trigger URL will then return the secret/credentials as JSON.
Now, here is the goofy part: hide that URL in a permissioned location (OneDrive, SharePoint, etc.). Have your Power BI report pick it up from that location using privileged credentials. Now the URL and the credentials get picked up at runtime, and neither is persisted in the PBIX.
I am assuming that there is an available premium Power Automate environment in which to spin up that flow, of course. But given that you already have an Azure Key Vault, that seems like a standard part of the Power BI toolkit at that point.
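To make the flow's contract concrete, here is a minimal sketch of the call the report ends up making at runtime, shown in Python for brevity. The flow URL, request body, and response shape are all hypothetical and depend on how you build the trigger and Response actions; in Power Query you would make the equivalent call with Web.Contents against the same URL.

import requests

# Hypothetical HTTP-trigger URL of the Power Automate flow; in practice this
# string is read from the permissioned location (OneDrive/SharePoint) at runtime.
FLOW_URL = "https://prod-00.westus.logic.azure.com/workflows/<id>/triggers/manual/paths/invoke"

# A POST to the trigger URL returns the secret/credentials as JSON.
resp = requests.post(FLOW_URL, json={"secretName": "rest-api-credentials"})
resp.raise_for_status()
credentials = resp.json()  # e.g. {"username": "...", "password": "..."} -- shape is up to your Response action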
I am trying to send an HTTPS POST request from GCP to Segment.io.
I want to create a service that will read data from a BigQuery table and then send calls directly to the Segment.io API (link), from where I'll route the data to other destinations. On the GCP side, I'm struggling to find the optimal way to do it. Cloud Run seems like a good option, but I'm wondering if there might be an easier way?
The recommended products for this task are Cloud Run or Cloud Functions.
You can use either the client libraries or the REST API to extract the data from the BigQuery table, and then use any HTTP library in your favorite programming language to issue the POST request to the Segment.io API.
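As a rough sketch of what such a service could look like, assuming a hypothetical table schema and using Segment's HTTP Tracking API (which authenticates with the write key as the Basic-auth username and an empty password):

import requests
from google.cloud import bigquery

# Placeholders -- substitute your own write key, project, dataset, and columns.
SEGMENT_WRITE_KEY = "your-segment-write-key"
QUERY = "SELECT user_id, event_name, amount FROM `my-project.my_dataset.events`"

client = bigquery.Client()

for row in client.query(QUERY).result():
    resp = requests.post(
        "https://api.segment.io/v1/track",
        auth=(SEGMENT_WRITE_KEY, ""),  # write key as user, empty password
        json={
            "userId": row["user_id"],
            "event": row["event_name"],
            "properties": {"amount": float(row["amount"])},
        },
    )
    resp.raise_for_status()

Wrapped in an HTTP handler, the same code runs on Cloud Run or Cloud Functions; for larger tables, Segment's /v1/batch endpoint reduces the number of requests.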
I am using wso2is-km-5.9.0 to serve our application authentication and SSO needs, and we have coupled wso2is-analytics-5.8.0 with our Identity Server. I have followed the steps mentioned in the official documentation, i.e. setting Analytics to true in api-manager.xml and enabling the event publishers, but I'm unable to see any data in the Analytics dashboard. Please refer to the attached error logs.
Why is the data not being populated in the WSO2 Analytics dashboard?
Analytics error log
Identity Server error log
With the introduction of the new configuration model, you should not edit the api-manager.xml file; the configuration should be done via the deployment.toml file.
Moreover, you have to enable the event listeners and publishers in the deployment.toml file; this enables publishing data to Analytics. You should refer to the documentation at https://is.docs.wso2.com/en/5.9.0/learn/configuring-identity-analytics/, because IS and IS-KM are very similar; the difference is that IS-KM also has an api-manager.xml file. For example:
# Route authentication events through the data publisher proxy.
[[event_listener]]
id = "authn_data_publisher_proxy"
type = "org.wso2.carbon.identity.core.handler.AbstractIdentityMessageHandler"
name = "org.wso2.carbon.identity.data.publisher.application.authentication.AuthnDataPublisherProxy"
order = 11

# Publish login and session events to Analytics.
[identity_mgt.analytics_login_data_publisher]
enable = true

[identity_mgt.analytics_session_data_publisher]
enable = true
I have an issue with the scheduled refresh function in Power BI. I have published a PBIX file to the Power BI web service. As with other PBIX files, I set up the scheduled refresh via the on-premises gateway. My PBIX file has data from several sources (MySQL, OData, other web connectors).
Setting up and connecting the MySQL source to the scheduled refresh (via the gateway) works fine. However, connecting the OData source to the gateway fails: the message says the credentials are invalid ("AccessUnauthorized"). Via Power BI Desktop, however, there is no need for me to supply credentials, as access is Anonymous, with an API key ("Bearer .........").
The following settings are used (in the gateway setup tab):
Type of source: OData
URL: https://tcodata.azurewebsites.net/estimates
Authentication method: Anonymous
Privacy-settings: None
The following code is used in PBI Desktop:
let
    apiUrl = "https://tcodata.azurewebsites.net/estimates",
    Source = OData.Feed(apiUrl, null, [Implementation = "2.0", Headers = [#"Authorization" = Text.From(ApiKey)]])
in
    Source
Here, ApiKey is defined as "Bearer ........(key here)".
No real authentication is needed, because it is accessed as Anonymous. However, when setting the scheduled refresh, this does not work (as credentials are said to be invalid).
Help is much appreciated, thanks!
The question was answered on the PowerBI forum:
When refreshing an OData source in the Power BI service with Power Query code like yours, you don't need to add it under the on-premises gateway. Just go to "data settings" -> "schedule refresh", edit the credentials for that OData source, and select "anonymous".
Source
WSO2 maps the requested URL to a URL on another server. How can I see the mapped URL that WSO2 effectively called?
For debugging, another option is to enable wire logs.
1) Uncomment the line below in <APIM_HOME>/repository/conf/log4j.properties:
log4j.logger.org.apache.synapse.transport.http.wire=DEBUG
2) Restart the server.
3) Send a request; the wire logs can then be found in the console and in the <APIM_HOME>/repository/logs/wso2carbon.log file.
Just found an answer on the WSO2 blog: [Trace API calls and responses](https://wso2.com/blogs/cloud/trace-api-calls-and-responses/).
Open the API that you want to trace for editing,
Go to step 2 (Implement),
Click the Enable Message Mediation checkbox and then select the debug_ sequences from the dropdowns for all 3 flows below it
Click the Next: Manage button at the bottom of the screen,
Click Save & Publish at the bottom of the last step of the editing wizard.
Open the live log by clicking the Configure / Admin Dashboard menu, and then clicking Log Analyzer / Live Log Viewer in Admin Dashboard’s left-hand menu pane.
Now invoke the API (for example, in the API Store‘s API Console for that API).
You will see detailed information on the API request and response in the log
When you are done troubleshooting, disable the message mediation that you enabled in step 3.
This solution has a big disadvantage: you must have Analytics running (I don't have it at the moment). But you can turn it on just for your own API. If you don't have Analytics running, you can fall back to @Bee's wire-log solution above and some tail -f | grep.