Is there any way to read multiple data alerts in Power BI, using Flow or some other way?

Is there a way to read data alerts in Power BI using some sort of Python code or something else? I want to be able to gather multiple data alerts for a specified account, then integrate them into an Adaptive Card.
Flow doesn't seem to be able to do this for me: using Flow, I would need to create multiple flows to read one alert at a time and then somehow write the data somewhere that I can read later. This creates an availability problem for me, since I wouldn't want to be creating a new flow every time I have a new Power BI alert.
Thanks for any suggestions.

You can read multiple data alerts in one Logic App/Power Automate flow if you use "When a HTTP request is received" as the trigger of the flow. You can specify the required data for multiple data alerts in the request body of the request.
For example, set the "When a HTTP request is received" trigger to the POST method, and then define the request body JSON schema for the data you want to pass in.
Your Python code can then gather the multiple data alerts and send them to the flow in that request body, so a single flow handles all of them.
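As a rough illustration, here is a minimal Python sketch (using the requests library) that posts several data alerts to the flow's HTTP trigger in a single request. The trigger URL, the account field and the shape of the alerts array are placeholders; they should match the JSON schema you define on the trigger.

import requests

# URL generated by the "When a HTTP request is received" trigger
# (placeholder -- copy the real URL from your flow after saving it).
FLOW_URL = "https://prod-00.westus.logic.azure.com/workflows/<id>/triggers/manual/paths/invoke"

# Example payload carrying several data alerts at once; the structure is an
# assumption and should match the JSON schema defined on the trigger.
payload = {
    "account": "user@contoso.com",
    "alerts": [
        {"alertTitle": "Sales below target", "value": 4200, "threshold": 5000},
        {"alertTitle": "Churn above limit", "value": 0.08, "threshold": 0.05},
    ],
}

response = requests.post(FLOW_URL, json=payload, timeout=30)
response.raise_for_status()
print("Flow accepted the alerts:", response.status_code)

Inside the flow you can then loop over the alerts array with an Apply to each action and build the Adaptive Card from its items.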

Related

Cloud Data Fusion - Input HTTP Post Body from BQ rows

I am a new Cloud Data Fusion user and have run into a problem I can't find a solution for.
I have a table in BQ with ~150 rows of latitude and longitude points. For each row, I want to pass the lat and lng into an HTTP POST request to get a result from the TravelTime API. Ultimately I want to have a table with all my original rows plus a column with the response for each one.
Where I am stuck is that so far I have only been able to hard-code the body of the POST request into the HTTP Source plugin and successfully write the response to a file in GCS. However, I expect the rows will change over time, so I would like to dynamically generate and pass in the POST request body from my BQ data.
Is this possible with data fusion? Is this an advisable approach? Or is there a better way?
As @Albert Shau and @user3750486 agreed in the comments:
There is no out-of-the-box way to pass data from BQ rows dynamically in a POST HTTP request.
A possible workaround is to have an HTTP transform plugin that sits in the middle of the pipeline and can be configured to make calls based on the input data. Then you would have a BQ source followed by that plugin followed by the GCS sink. I think your best bet would be to write a custom transform.
This can be done by following the link that @Albert Shau provided, or by writing custom code using GCP's Cloud Functions, as the OP did.
Posting the answer as community wiki for the benefit of the community that might encounter this use case in the future.
Feel free to edit this answer for additional information.
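For reference, here is a minimal Python sketch of the Cloud Function style approach; the BigQuery table, the TravelTime endpoint, the headers and the request body shape are all assumptions to be adapted to your project and the TravelTime API documentation.

import requests
from google.cloud import bigquery

# Assumed table and endpoint -- replace with your own values and check the API docs.
BQ_TABLE = "my-project.my_dataset.locations"
TRAVELTIME_URL = "https://api.traveltimeapp.com/v4/time-map"
HEADERS = {
    "X-Application-Id": "your-app-id",   # placeholder credentials
    "X-Api-Key": "your-api-key",
    "Content-Type": "application/json",
}

def fetch_travel_times(request=None):
    """Read lat/lng rows from BigQuery and POST each one to the TravelTime API."""
    client = bigquery.Client()
    rows = client.query(f"SELECT lat, lng FROM `{BQ_TABLE}`").result()

    results = []
    for row in rows:
        # The body shape here is illustrative only -- build it per the TravelTime docs.
        body = {"lat": row.lat, "lng": row.lng}
        resp = requests.post(TRAVELTIME_URL, json=body, headers=HEADERS, timeout=30)
        resp.raise_for_status()
        results.append({"lat": row.lat, "lng": row.lng, "response": resp.json()})

    return results

The collected results can then be written back to BigQuery or to GCS, depending on where the joined table should live.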

How to take conditional actions on base64 encoded trigger output in Power Automate Flow?

My trigger is an Event Hub stream from Azure IoT Central telemetry data.
I need to conditionally run an action based on the keys in the Content from the Event Hub trigger.
The Content output from Event Hub trigger is base64 encoded.
If I use Content in an Action like Post Message to Teams, it is decoded and looks fine.
The content is NOT decoded when I try to use it in a Conditional Action, or if I instantiate a variable first. There is no base64ToString() in the expressions for Power Automate. Power Automate docs say they will automatically decode it for you (and no, they don't).
Anyone have a working base64 decode method with Power Automate or another workaround?
This just does not work in Power Automate.
It works fine in Logic Apps.
I just rebuilt everything in Logic Apps and it works great.

AWS IoT Analytics

I am trying to fetch data from IoT Analytics (AWS) from my Java SDK. I have created channels and a pipeline, and the data is in the datasets.
Does anyone have an idea about the AWS IoT Analytics data fetch mechanism?
AWS IoT Analytics distinguishes between raw data stored in channels, processed data stored in datastores and queried data stored in data sets.
As part of creating the dataset with CreateDatasetContent [1], you'll write your SQL query which runs against your datastore and produces the result set stored in your dataset. The query can either be run ad hoc or periodically every x hours. After you have created the dataset successfully, you can get the query result via the GetDatasetContent API [2].
Please note that the CreateDatasetContent API is async, meaning you'll need to wait until the query has run successfully. By default, GetDatasetContent will always return the latest successful result, which might be empty directly after creating the dataset since the query hasn't finished yet. In order to get the current state of your query, you can pass the optional versionId=$LATEST parameter to the GetDatasetContent call. This will give you more information about the currently running query or whether it failed to execute.
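To make the sequence concrete, here is a minimal Python sketch using boto3 (the corresponding operations also exist in the Java SDK); the dataset name is a placeholder.

import time
import boto3

client = boto3.client("iotanalytics")
DATASET = "my_dataset"  # placeholder -- your dataset name

# Kick off the (asynchronous) query execution.
client.create_dataset_content(datasetName=DATASET)

# Poll the state of the latest content version until the query finishes.
while True:
    content = client.get_dataset_content(datasetName=DATASET, versionId="$LATEST")
    state = content["status"]["state"]
    if state == "SUCCEEDED":
        break
    if state == "FAILED":
        raise RuntimeError(content["status"].get("reason", "query failed"))
    time.sleep(5)

# Each entry exposes a pre-signed dataURI from which the result can be downloaded.
for entry in content["entries"]:
    print(entry["dataURI"])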
Hope this helps
[1] https://docs.aws.amazon.com/iotanalytics/latest/APIReference/API_CreateDatasetContent.html
[2] https://docs.aws.amazon.com/iotanalytics/latest/APIReference/API_GetDatasetContent.html

Unable to capture inputs from EventHub into Stream Analytics

I am able to send messages from my application to eventhub.
However, I am unable to capture those messages as input in Stream Analytics.
You should use the Operation Logs to determine what is going on with your stream. If there are problems with incoming messages, it can happen that you will not see any errors in the Dashboard.
One of the most common issues is the format of the data, so check whether your SQL query uses the same schema as the incoming message, and the same format.
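As a sanity check, here is a minimal Python sketch (azure-eventhub) that sends a JSON-serialized event; the connection string, hub name and field names are placeholders, and the field names should match what your Stream Analytics query selects.

import json
from azure.eventhub import EventHubProducerClient, EventData

# Placeholders -- use your own Event Hub connection string and hub name.
CONNECTION_STR = "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=..."
EVENTHUB_NAME = "telemetry"

# The event body must be valid JSON (matching the input's serialization setting)
# and the field names must match those referenced in the Stream Analytics query.
event = {"deviceId": "device-01", "temperature": 21.5}

producer = EventHubProducerClient.from_connection_string(
    conn_str=CONNECTION_STR, eventhub_name=EVENTHUB_NAME
)
with producer:
    batch = producer.create_batch()
    batch.add(EventData(json.dumps(event)))
    producer.send_batch(batch)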
I had the exact same problem. I deleted the input and re-created it just like the first time (except I gave it a new name). I am sure I had done this before without success. But this time it is working.

Preventing Ember Data Caching & loading model data on demand

We are considering moving from Backbone to Ember. There are a few issues, though, that I can't get answers to from the docs.
1) Ember Data caches its data. Our application is multi-user, so other users need to be able to see new records created by everyone. Is there a way around this? I read in another post that when a query string is passed, Ember Data does not cache the data; is this true? If it is, can I then just always send a query string so that nothing will be cached?
2) Ember Data has a single model in the router that appears to be instantiated at route load time. I can see that you can request data from multiple sources by returning an object with many this.store.find calls. Say I have a select element, and when you select an option, another select gets populated with items based on the first select (which requires a call back to the server). How would that work? How can I get model data on demand (not at route load time)?
I'm not sure if it answers your question, but you can always call
model.reload()
to refetch data from the server so you can work with up-to-date data.
You may want to consider Faye (http://faye.jcoglan.com/), which would let you have a pub/sub setup that could update your store by listening to topics of interest. This uses WebSocket for the streaming interface. You could then put new objects into the store, remove or update existing objects which the server could publish to the client.