Unable to capture inputs from EventHub into Stream Analytics - azure-eventhub

I am able to send messages from my application to Event Hub.
However, I am unable to get those messages into Stream Analytics as input.

You should use the Operation Logs to see what is going on with your stream. If there are problems with incoming messages, you may not see any errors in the Dashboard.
One of the most common issues is the format of the data, so check whether your SQL query uses the same schema and the same format as the incoming messages.
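As a rough illustration of the schema point: a hypothetical query like `SELECT deviceId, temperature FROM input` only works if the events carry exactly those property names. A minimal sending sketch with the azure-eventhub Python SDK (the connection string, hub name, and field names are placeholders, and the Stream Analytics input is assumed to be configured for JSON serialization):

    # Minimal sketch, assuming the Stream Analytics input uses JSON serialization.
    # The connection string, hub name, and field names are placeholders.
    import json
    from azure.eventhub import EventHubProducerClient, EventData

    CONN_STR = "<event-hubs-namespace-connection-string>"
    EVENTHUB_NAME = "<event-hub-name>"

    producer = EventHubProducerClient.from_connection_string(
        CONN_STR, eventhub_name=EVENTHUB_NAME
    )

    # If the query is `SELECT deviceId, temperature FROM input`, the payload
    # must expose exactly those property names.
    payload = {"deviceId": "sensor-01", "temperature": 21.5}

    with producer:
        batch = producer.create_batch()
        batch.add(EventData(json.dumps(payload)))
        producer.send_batch(batch)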

I had the exact same problem. I deleted the input and re-created it just like the first time (except that I gave it a new name). I am sure I had done this before without success, but this time it is working.

Related

Can you get data from a BigQuery table to an outside stream?

Because of company policies, a lot of the information we need as input is inserted into a BigQuery table that we then have to SELECT from.
My problem is that selecting directly from this table and running a process on it (on a virtual machine, etc.) is prone to errors and rework: if my process stops, I need to run the query again and reprocess everything.
Is there a way to export data from BigQuery to a Kinesis-like stream (I'm more familiar with AWS)?
DataFlow + PubSub seems to be the way to go for this kind of issue.
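If it helps, here is a rough sketch of the Pub/Sub side of that idea: read the rows from BigQuery and publish them to a topic that Dataflow (or any other consumer) can subscribe to. The project, table, and topic names are placeholders.

    # Rough sketch: read rows from BigQuery and publish them to a Pub/Sub topic.
    # Project, dataset, table, and topic names are placeholders.
    import json
    from google.cloud import bigquery, pubsub_v1

    PROJECT = "my-project"
    TOPIC = "bq-export"

    bq_client = bigquery.Client(project=PROJECT)
    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path(PROJECT, TOPIC)

    rows = bq_client.query("SELECT * FROM `my-project.my_dataset.my_table`").result()

    futures = []
    for row in rows:
        # Row values can include dates, so fall back to str() when serializing.
        message = json.dumps(dict(row), default=str).encode("utf-8")
        futures.append(publisher.publish(topic_path, data=message))

    # Block until Pub/Sub has accepted every message.
    for future in futures:
        future.result()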
Thank you jamiet!

Is there any way to read multiple data alerts in Power BI, using Flow or some other way?

Is there a way to read data alerts in Power BI using some sort of Python code or something else? I want to be able to gather multiple data alerts for a specified account, then integrate them into an adaptive card.
Flow doesn't seem to be able to do this for me: using Flow, I would need to create multiple flows to read the alerts one at a time and then somehow write the data somewhere I can read it later. This creates an availability problem for me, since I wouldn't want to be creating a new flow every time I have a new Power BI alert.
Thanks for any suggestions.
You can read multiple data alerts in one Logic App / Power Automate flow if you use "When a HTTP request is received" as the trigger of the flow. You can specify the required data for multiple data alerts as the body of the request.
For example, set the "When a HTTP request is received" trigger to the POST method, then define the request body JSON schema for the data you want to pass in.
You can then use the data supplied in the request body from your Python code to gather multiple data alerts.
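For example, the Python side could simply POST the data for several alerts to the URL generated by the trigger. The URL and payload shape below are placeholders and must match the JSON schema you defined on the trigger:

    # Sketch: post data for several alerts to the flow's HTTP trigger.
    # The URL and payload shape are placeholders; they must match the
    # JSON schema defined on the "When a HTTP request is received" trigger.
    import requests

    FLOW_URL = "https://prod-00.westus.logic.azure.com/workflows/<id>/triggers/manual/paths/invoke?..."

    payload = {
        "alerts": [
            {"alertId": 1, "title": "Sales below target", "value": 8500},
            {"alertId": 2, "title": "Churn above threshold", "value": 0.07},
        ]
    }

    response = requests.post(FLOW_URL, json=payload, timeout=30)
    response.raise_for_status()  # Logic Apps HTTP triggers typically return 202 Accepted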

Stop a BigQuery script using a condition

I am running a BigQuery script to generate a table. The script assumes the existence of another table, performs some transformations, and places the transformed data into an output table. However, I want the script to terminate its execution (and possibly post a message) if the input table does not comply with some conditions. What is the best way of terminating a BigQuery script using a condition?
One way to achieve this without any external app that calls the BigQuery API and performs the requirement checks (which is the nicer way, and easier to maintain and evolve) is to create a scheduled query. This works well for a recurring request; if not, code the checks in your preferred language.
With BigQuery scheduled queries, you can run your query, define the destination table, and define a notification channel.
Set the Pub/Sub topic that you want. However, the message isn't customizable: you only get the status and the reason of the latest execution. You will then need to dig in to understand exactly what happened during the query, and write more complex code to read the logs and find the root cause.
If your check only needs an OK/KO status, this solution is suitable; if not, prefer your own code, which gives you better granularity in error management.
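As a sketch of the "own code" route (table names and the check itself are placeholders): validate the input table first, terminate with a custom message if the check fails, and only then run the transformation.

    # Sketch of the "own code" route. Table names and the check are placeholders.
    from google.cloud import bigquery

    client = bigquery.Client()

    check_sql = "SELECT COUNT(*) AS n FROM `my-project.my_dataset.input_table`"
    row_count = next(iter(client.query(check_sql).result())).n

    if row_count == 0:
        # Terminate with a custom message instead of running the transformation.
        raise RuntimeError("Input table is empty; aborting the script.")

    transform_sql = """
    CREATE OR REPLACE TABLE `my-project.my_dataset.output_table` AS
    SELECT *  -- your transformations here
    FROM `my-project.my_dataset.input_table`
    """
    client.query(transform_sql).result()  # waits for the job and surfaces any errors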

BigQuery python client dropping some rows using Streaming API

I have around a million rows being inserted into BigQuery using the streaming API (the BigQuery Python client's insert_row function), but there is some data loss: ~10,000 rows are lost while inserting. Is there a chance BigQuery might be dropping some of the data? There aren't any insertion errors (or any errors whatsoever, for that matter).
I would recommend filing a private Issue Tracker issue so the BigQuery engineers can look into this. Make sure to provide the affected project, the source of the data, and the code you're using to stream into BigQuery, along with the client library version.
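One thing worth checking before filing: the streaming insert methods report per-row failures through their return value rather than by raising, so rejected rows can go unnoticed if you don't inspect it. A small sketch (table name and rows are placeholders):

    # Sketch: inspect the return value of the streaming insert call.
    # Table name and row contents are placeholders.
    from google.cloud import bigquery

    client = bigquery.Client()
    table = client.get_table("my-project.my_dataset.my_table")

    rows = [
        {"id": 1, "payload": "first"},
        {"id": 2, "payload": "second"},
    ]

    errors = client.insert_rows_json(table, rows)  # [] when every row is accepted
    if errors:
        # Each entry identifies the failing row index and the rejection reason.
        for entry in errors:
            print("Failed row:", entry)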

Output table in slack slash command

I want a slash command to output data in a table format.
I know that I will have to set up a custom integration for this. I did that using the GET method.
I can set up my own web service on an EC2 machine, but how do I make sure the data comes back in a table format?
Maybe something like this.
My problem is: how should I present the available data in a tabular format?
It's unfortunately not possible to format Slack messages as a table in this way. You would need to resort to generating an image and referencing it in a message attachment. There is limited support for displaying simple fields and labels in attachments, but that may not quite meet your needs.
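If a rough fixed-width layout is good enough, one common workaround is to pad the columns yourself and post the result as a monospaced block, e.g. through an incoming webhook. The webhook URL and the data below are placeholders:

    # Sketch: approximate a table with a fixed-width block posted via an
    # incoming webhook. The webhook URL and data are placeholders.
    import requests

    WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"

    rows = [
        ("Region", "Sales", "Target"),
        ("East", "1200", "1000"),
        ("West", "950", "1100"),
    ]

    # Pad every column to its widest value so the columns line up in monospace.
    widths = [max(len(r[i]) for r in rows) for i in range(len(rows[0]))]
    lines = ["  ".join(cell.ljust(widths[i]) for i, cell in enumerate(row)) for row in rows]
    text = "```" + "\n".join(lines) + "```"

    requests.post(WEBHOOK_URL, json={"text": text}, timeout=30).raise_for_status()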
We had the same problem, so we made a Slack app and made it free for the public. Please feel free to check it out: https://rendreit.digital
After installing the app to your Slack, you can do /tableit and paste in CSV data or anything you copied from a spreadsheet (Excel or Google Sheets).
It also lets you preview the rendered table before you send it to the chat.