We would like to create a data analytics dashboard (using a tool such as Power BI, Kibana, or similar).
Within a visualization, we would like to give the user an additional option in the front end to attach information to a visualized data point: e.g. right-click a data point, check a box indicating that the value has been confirmed by an expert, and store that information back to the database.
Is this use case already covered by an existing tool or plug-in?
Related
I am working on a reporting project that uses Power BI as the data visualization tool.
I need to create an approval workflow in Power BI. After viewing the dashboard, the approver can approve certain exception cases, and the workflow should connect directly to email or a ticketing system.
There are 2 cases:
Approval for the whole dashboard, which is supposed to be easy. I don't have any problem here.
Approval for a single object/row in a table chart. Here I must generate a number of buttons matching the number of rows, and this is where I need help: I don't know how to generate a dynamic number of buttons attached to rows, or how to program an action on each button so it becomes an approval step.
[Screenshot: Power BI table visual with a button on each row]
In this screenshot, my plan is to create a button in each row; each button calls the same function with the username or IP as a parameter. After that I can send an email to the user to notify them that their case is approved as an exception.
I found this: https://community.powerbi.com/t5/Community-Blog/A-simple-and-fun-guide-to-Microsoft-Flow-and-Power-BI/ba-p/151530, but it doesn't seem helpful. Has anyone here dealt with an approval case like this?
Is Power BI able to handle the approval process I have in mind?
Thank you so much.
First: This kind of goes against the spirit of BI in general. BI is for data visualization, exploration, etc.; it's not really a UI for inserting data and executing tasks. Maybe what you want instead is a front end that lets you act on the data and only needs to handle a very limited dataset? Power Apps is good for that. If the dataset is under 1,000 rows, this could work.
Second: I'm pretty sure it's not possible to create a button (like one you'd see on an HTML page) that does what you want inside a Power BI table visual.
Third: There is a "drill through" button capability, but this just lets you navigate from one area of the report to another; it can't send an email or execute a Power Automate flow or anything like that. You may have seen a button on a table visual, but it's misleading: it's not really programmable like an HTML/JS button on a website.
https://www.c-sharpcorner.com/article/create-a-drill-through-button-in-power-bi/
That said, within the last year and a half or so, the Power Apps add-in has become available. You could create an app that uses your streaming dataset, build a gallery that reads from that dataset and renders a kind of table with a button on each row, and set each button to execute the flow you've created in Power Automate.
[Screenshot: Power Apps add-in chiclet in the Power BI visualizations pane]
All of this is fairly straightforward, but beware: the Power App will start to cost you extra money depending on where your data is housed. If it's a SQL Server, you'll need both a premium Power Apps license and a Power Automate license.
Sorry for the not-so-great news, but this is a limitation of Power BI.
Please allow me to ask a rather newbie question. So far, I have been using local tools like ImageMagick or GOCR to do the job, but that is rather old-fashioned, and I am urged to "move to Google Cloud AI".
The setup
I have a (training) data set of various documents (as JPG and PDF) of different kinds, and by certain features (like prevailing color or repetitive layout) I intend to classify them, e.g. as invoice type 1, invoice type 2, or not an invoice. In a second step, I would like to OCR certain predefined areas of each document and extract, e.g., the address of the company sending the invoice and the date.
The architecture I am envisioning
On a modern platform as a service (PaaS), I have already set up a UI where I can upload new files. These are stored locally in a directory (or in MongoDB), and metadata like upload timestamp, user, and original file name is stored in a DB.
The newly uploaded file should then be submitted to Google Cloud, which should perform the classification step and deliver back the label to be saved in the database.
The document pages should be auto-cropped, i.e. black or white margins removed, most probably with Google Cloud as well. The parameters of the crop should be persisted in the DB.
In case it is, e.g., an invoice, OCR should be performed (again by Google Cloud) on certain regions of the document, e.g. a bounding box spanning from the middle of the page to the right margin in the upper 10% of the cropped page. The results of the OCR should again be persisted locally.
The problem
I seem to be missing the correct search term to figure out how to do this with Google Cloud. Is there a Google API (e.g. REST) I can use for the upload and which gives me back the results of steps 2 to 4?
I think that your best option here is to use Document AI (REST API and client libraries).
Using Document AI, you can:
Convert images to text
Classify documents
Analyze and extract entities
Additionally, for your use case there is a new Document AI feature, the Invoice parser, which is still in preview and has limited access.
The Invoice parser is similar to the Form parser, but for invoices instead of forms. Check out the Invoice parser page and you will see what I mean by preview and limited access.
AFAIK, there isn't any GCP tool for image editing.
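To give you a starting point, here is a minimal sketch of sending an uploaded file to a Document AI processor from Python. The project, location, and processor ID are placeholders for values you'd get after creating a processor (a classifier or the Invoice parser) in the Cloud Console:

# pip install google-cloud-documentai
from google.api_core.client_options import ClientOptions
from google.cloud import documentai_v1 as documentai

PROJECT_ID = "my-project"   # placeholder
LOCATION = "eu"             # or "us"; must match the processor's region
PROCESSOR_ID = "abc123"     # placeholder from the Cloud Console

def process_document(file_path: str, mime_type: str = "application/pdf"):
    # The client must point at the regional endpoint for the processor.
    client = documentai.DocumentProcessorServiceClient(
        client_options=ClientOptions(
            api_endpoint=f"{LOCATION}-documentai.googleapis.com")
    )
    name = client.processor_path(PROJECT_ID, LOCATION, PROCESSOR_ID)
    with open(file_path, "rb") as f:
        raw = documentai.RawDocument(content=f.read(), mime_type=mime_type)
    result = client.process_document(
        request=documentai.ProcessRequest(name=name, raw_document=raw))
    # result.document.text holds the full OCR text; for specialized
    # parsers, result.document.entities holds extracted fields such
    # as the supplier address or the invoice date.
    return result.document

You would call this from your upload handler and persist the returned label/entities next to the metadata you already store.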
For context, we would like to visualize our data in Google Data Studio; the dataset receives more entries each week. I have tried hosting our data sets in Google Drive, but it seems they're too large, and this slows down Data Studio (the file is only 50 MB, am I doing something wrong?).
I have loaded our data into Google Cloud Storage --> BigQuery and connected Data Studio to my BigQuery table. This has made the Data Studio dashboard much quicker!
I'm not sure what the best way is to update our data weekly in Google Cloud/BigQuery. I have found a slow way to do this by uploading the new weekly data to Google Cloud Storage and then appending it to my table manually in BigQuery, but I'm wondering if there's a better (or at least more automated) way to do this.
I'm open to any suggestions, and if you think that BigQuery/Google Cloud Storage is not the answer for me, please let me know!
If I understand your question correctly, you want to automate the query that populates your table, which is connected to Data Studio.
If this is the case, then you can use scheduled queries in BigQuery. A scheduled query lets you define a query whose results are written to a destination table. In particular, you can specify the repetition rules (minimum every 15 minutes) and execution, as well as the destination writing options (destination table; write mode: append or truncate).
In order to use scheduled queries, your account must have the right permissions. Have a look at the following documentation to better understand how to use them [1]; there is also a small sketch below the references.
Also note that on the front end, updated data in the BigQuery table will only show up in Data Studio after a refresh (click the refresh button in Data Studio). To refresh the front-end visualization automatically, you can use the following plugin [2] or automate the click on the refresh button through browser console commands.
[1] https://cloud.google.com/bigquery/docs/scheduling-queries
[2] https://chrome.google.com/webstore/detail/data-studio-auto-refresh/inkgahcdacjcejipadnndepfllmbgoag?hl=en
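As a rough sketch of setting this up programmatically (the project, dataset, and query below are placeholders), the BigQuery Data Transfer client can create a scheduled query that appends on a weekly cadence:

# pip install google-cloud-bigquery-datatransfer
from google.cloud import bigquery_datatransfer

PROJECT_ID = "my-project"  # placeholder

client = bigquery_datatransfer.DataTransferServiceClient()

config = bigquery_datatransfer.TransferConfig(
    destination_dataset_id="reporting",       # placeholder dataset
    display_name="weekly append",
    data_source_id="scheduled_query",
    params={
        # Placeholder query: select the new week's rows from a staging table.
        "query": "SELECT * FROM `my-project.staging.weekly_upload`",
        "destination_table_name_template": "main_table",
        "write_disposition": "WRITE_APPEND",  # append, don't overwrite
    },
    schedule="every mon 09:00",               # weekly run
)

config = client.create_transfer_config(
    parent=client.common_project_path(PROJECT_ID),
    transfer_config=config,
)
print(f"Created scheduled query: {config.name}")

If the weekly data lands as a file in Cloud Storage first, another option worth considering is a BigQuery load job with write disposition WRITE_APPEND, which skips the manual append step entirely.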
I have a range of devices that push data to an Azure IoT Hub. I then have a Stream Analytics job which ingests the data from IoT Hub and outputs it to Power BI. In Power BI I have created a report from the dataset, which is working. I now want that report to be 'live', using the streaming data from Stream Analytics, but I can't see how to make it so.
Looking at this guide, under the section 'Create and publish a Power BI report to visualize the data', point 4 says 'Click Streaming Datasets', which sounds like exactly what I need, but I don't see that option anywhere.
Have I missed a step or set something up wrong, or is there another way to make my report 'live' without needing manual refresh and publishing?
Here's what I see when creating a report:
Thanks
Go to the dashboard where you want to add the streaming data tile.
Click on '+ Add tile' in the top navigation.
You should see an option for Real time - Custom streaming data on the resulting blade.
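For reference, a streaming dataset in Power BI is just a push endpoint, so you can also verify it end to end by posting rows directly. A minimal sketch, where the push URL (including the dataset ID and API key) is a placeholder copied from the dataset's "API info" page:

# pip install requests
import requests
from datetime import datetime, timezone

# Placeholder push URL: copy the real one from the streaming
# dataset's "API info" page in Power BI.
PUSH_URL = ("https://api.powerbi.com/beta/<workspace>/datasets/"
            "<dataset-id>/rows?key=<api-key>")

rows = [{
    "deviceId": "sensor-1",   # example payload fields; match your schema
    "temperature": 21.7,
    "ts": datetime.now(timezone.utc).isoformat(),
}]

resp = requests.post(PUSH_URL, json=rows)
resp.raise_for_status()  # on 200, the dashboard tile updates within seconds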
I have a URL that returns a JSON object with everything I need for my Power BI Embedded report. I get the data for the report by adding a new web data source and pasting the URL in. A few transformations later and, tada, a sexy report. The report shows lots of charts and graphs, etc. However, I need to be able to change the data source URL depending on who is looking at the report.
The report shows data for a single organization, and you can only look at it if you're in that organization. How can I pass an organization's ID when embedding the report so that the data source shows different data?
For example, if my data source is defined in the originating .pbix as
Json.Document(Web.Contents("http://www.testdata.com/api/json?orgId=1"))
how can I change it to
Json.Document(Web.Contents("http://www.testdata.com/api/json?orgId=2"))
when I pull the report to embed on a page?
I know you can filter data, but that means I have to make the data source URL pull ALL the data, which would be huge and intensive just to have Power BI filter most of it out.
In short, I'm embedding a report on a website, and that report's only way to get data is via a JSON endpoint. That endpoint requires the user's org ID, so how do I pass it to Power BI, which in turn uses it in the data source URL?
Your only option for this scenario is to pull all the required data into your dataset. Then you can use either Row-Level Security (RLS) or the new JS API to filter the data for each user.
You should probably look at an Azure SQL data source as a more efficient, flexible and scalable back-end for PBI Embedded.
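If you go the RLS route, the per-user filtering happens when you generate the embed token: you pass an effective identity (a username plus RLS roles), and the dataset applies that role's filter for the user. A rough sketch against the Power BI REST API, where the IDs, the role name, and the token acquisition are all placeholders:

# pip install requests
import requests

ACCESS_TOKEN = "<aad-access-token>"   # placeholder: acquire via MSAL
GROUP_ID = "<workspace-id>"           # placeholder
REPORT_ID = "<report-id>"             # placeholder
DATASET_ID = "<dataset-id>"           # placeholder

def embed_token_for_org(org_id: str) -> str:
    """Generate an embed token whose RLS identity carries the org ID."""
    url = (f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
           f"/reports/{REPORT_ID}/GenerateToken")
    body = {
        "accessLevel": "View",
        "identities": [{
            "username": org_id,        # surfaces as USERNAME() in DAX
            "roles": ["OrgViewer"],    # hypothetical RLS role in the .pbix
            "datasets": [DATASET_ID],
        }],
    }
    resp = requests.post(
        url, json=body,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
    resp.raise_for_status()
    return resp.json()["token"]

Inside the .pbix, the hypothetical OrgViewer role would then filter with a DAX rule such as [OrgId] = USERNAME(), so each embed token only ever sees its own organization's rows.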