I have set up a build pipeline in Azure DevOps, which builds the project, runs the MSTest tests, and generates a code coverage report as well as code analysis metrics results.
How do I get these results into a dashboard such as Power BI or similar? What are the different visualization options from Azure DevOps?
I know how to add a widget and get the visualization in an Azure DevOps dashboard, but I'm looking for an option where I can publish the results, see the historic code metrics, and drill down to class-level results.
You can check the sample reports in the following link:
https://learn.microsoft.com/en-us/azure/devops/report/powerbi/sample-odata-overview?view=azure-devops
For example, you can paste the Power BI query listed below directly into the Get Data->Blank Query window.
let
    Source = OData.Feed (
        // replace {organization} and {project} with your own values
        "https://analytics.dev.azure.com/{organization}/{project}/_odata/v3.0-preview/TestResultsDaily? "
            &"$apply=filter( "
                &"Pipeline/PipelineName eq '{pipelineName}' "
                &"And Date/Date ge {startdate} "
                &"And Workflow eq 'Build' "
            &") "
            &"/aggregate( "
                &"ResultCount with sum as ResultCount, "
                &"ResultPassCount with sum as ResultPassCount, "
                &"ResultFailCount with sum as ResultFailCount, "
                &"ResultNotExecutedCount with sum as ResultNotExecutedCount, "
                &"ResultNotImpactedCount with sum as ResultNotImpactedCount "
            &") "
        ,null, [Implementation="2.0",OmitValues = ODataOmitValues.Nulls,ODataVersion = 4])
in
    Source
I got the metrics XML files into blob storage from the pipeline and then added the blob storage as a data source in Power BI. I did the necessary transformations in Power BI and published the report to a Power BI dashboard, which gave me the required report and made it accessible to people in the organization.
I have Power BI Desktop connected to a SQL database and have some reports in it.
Now I want to achieve the following:
1. Every time a user enters the name of a database, my reports automatically refresh and are exported to PDF or Excel.
2. If possible, a browser where the user can enter the SQL database name, a UI pops up, and the end user can see the Power BI dashboard with all reports automatically refreshed per the database name the user entered.
Thanks in advance!
For ask 1:
Refer to Create and use R visuals in Power BI. Once R is installed, open Run R Script, paste in a script appropriate to your destination, and save it.
Example, exporting to an Excel file: write.xlsx(dataset, file = "C:/dataset.xlsx") (this uses the openxlsx package).
If you are interested in PDF and know Microsoft Power Platform, use the On Power BI button clicked trigger followed by the Export To File for Power BI Reports action to export a PDF to your local directory.
For ask 2:
You can check Embed a report in a secure portal or website to easily embed reports in internal web portals and utilize Automatic page refresh in the Power BI service to set automatic page refresh for reports as long as the data source is DirectQuery.
Power BI Desktop has no restrictions for refresh intervals and can be as frequent as every second. But when reports are published to the Power BI service, certain restrictions apply.
Alternatively, you can use the Datasets Power BI REST APIs to trigger a refresh for the specified dataset from "My Workspace".
POST https://api.powerbi.com/v1.0/myorg/datasets/{datasetId}/refreshes
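As a sketch (not from the official docs), the call above can be scripted; this assumes you have already obtained an Azure AD access token with dataset write permissions, which is out of scope here:

```python
import urllib.request

def build_refresh_request(dataset_id: str, access_token: str) -> urllib.request.Request:
    # A POST with no body queues an asynchronous refresh; the service
    # responds 202 Accepted once the refresh request is enqueued.
    url = f"https://api.powerbi.com/v1.0/myorg/datasets/{dataset_id}/refreshes"
    return urllib.request.Request(
        url,
        method="POST",
        headers={"Authorization": f"Bearer {access_token}"},
    )

# To actually fire it:
# urllib.request.urlopen(build_refresh_request("<datasetId>", "<token>"))
```

The builder is separated from the network call so the request can be inspected before sending.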
You can also leverage Q&A for Power BI business users, the fastest way to get an answer from your data by asking a question in natural language.
We are currently using Snowflake and Power BI for dashboarding. These two together have been working well for us, but we lack the ability to create automated reports for larger file exports.
I need to schedule automated reports that save a CSV/Excel file of ~500K rows (Power BI limits exports to 150K) to a shared location (preferably OneDrive).
Every solution I look into is trying to sell you on their BI solution or other features that we do not need. I just need a low-cost solution to export data from Snowflake. I looked into SSRS by creating a linked server but ran into issues with UTF-8 and thought there has to be an easier solution.
Any ideas/recommendations?
Could you export the data to Azure Blob Storage and have Power BI read the export file from there?
Assuming that is possible, you can create a task in Snowflake that exports data every n minutes/hours/days, etc., and writes the result set you are looking for to Azure Blob Storage.
create task export_to_blob
  warehouse = task_wh
  schedule = '60 minute'
as
  -- @azure_blob is an external stage pointing at the blob container
  copy into @azure_blob from sales.public.nation file_format = (type = csv);
https://docs.snowflake.com/en/sql-reference/sql/create-task.html
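If the export ultimately needs to land in OneDrive rather than blob storage, one low-cost option (not part of the answer above, and assuming you can obtain a Microsoft Graph access token) is to copy the exported file over with Graph's simple-upload endpoint. Note that simple upload only accepts files up to 4 MB, which a ~500K-row CSV will likely exceed; larger files need the createUploadSession flow instead.

```python
import urllib.request

def build_onedrive_upload(drive_path: str, data: bytes, access_token: str) -> urllib.request.Request:
    # PUT the file contents to /me/drive/root:/<path>:/content
    # (Microsoft Graph "simple upload", limited to files under 4 MB).
    url = f"https://graph.microsoft.com/v1.0/me/drive/root:/{drive_path}:/content"
    return urllib.request.Request(
        url,
        data=data,
        method="PUT",
        headers={
            "Authorization": f"Bearer {access_token}",
            "Content-Type": "text/csv",
        },
    )

# To actually send it:
# urllib.request.urlopen(build_onedrive_upload("exports/nation.csv", csv_bytes, "<token>"))
```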
Good evening!
We are currently having an issue with the error below being thrown when trying to refresh a dataset via the on-premises gateway. Refreshing directly through Power BI Desktop does not produce any errors and seems to refresh fine.
Data source error: {"error":{"code":"DM_GWPipeline_Gateway_MashupDataAccessError",
"pbi.error":{"code":"DM_GWPipeline_Gateway_MashupDataAccessError",
"parameters":{},"details":[{"code":"DM_ErrorDetailNameCode_UnderlyingErrorCode",
"detail":{"type":1,"value":"-2147467259"}},{"code":"DM_ErrorDetailNameCode_UnderlyingErrorMessage",
"detail":{"type":1,"value":"The key didn't match any rows in the table."}},{"code":"DM_ErrorDetailNameCode_UnderlyingHResult",
"detail":{"type":1,"value":"-2147467259"}},{"code":"Microsoft.Data.Mashup.ValueError.Key",
"detail":{"type":1,"value":"[entity = \"Forecast\"]"}},{"code":"Microsoft.Data.Mashup.ValueError.Reason",
"detail":{"type":1,"value":"Expression.Error"}}],"exceptionCulprit":1}}} Table: FACT - Cost Forecast Tool.
Cluster URI: WABI-EUROPE-NORTH-B-redirect.analysis.windows.net
Activity ID: f4b629a6-a9bc-4966-954c-ae37139737a4
Request ID: db16cb6d-a765-1e0e-f9f5-b8803c8baa6e
Time: 2020-11-25 17:20:30Z
From reading previous posts, I'm not sure any of the normal responses apply in this scenario, as they tend to relate to either pulling data from SQL Server, or Excel files (where the sheet name causes an issue).
The table throwing the error (FACT - Cost Forecast Tool) is made by appending two other tables - one comes from a Power BI dataflow (Forecast), and one from a folder import of CSV files (Forecast Tool Adjustments).
Publishing a model with just the two individual tables (Forecast and Forecast Tool Adjustments) works and refreshes fine.
The above error only occurs where a third table is added combining these two tables via a one liner in Power Query:
= Table.Combine({#"Forecast Tool Adjustments", Forecast})
Oddly, the value flagged in the error, "[entity = "Forecast"]", is one of the first steps in Power Query for the 'Forecast' table. So while that table refreshes fine on its own, it seems to throw an error when refreshed through the combined table.
Any thoughts would be greatly appreciated.
So, I created a dataset that can be used to generate multiple reports and dashboards.
I would like to publish it to the web portal and allow report designers to create their own reports and dashboards. This works great until I need to update the model. If I update the model and push the dataset via the Publish function, it deletes the reports.
So is the only option for everyone to use Power BI Desktop? What is the web-based report designer for? Is it only for datasets that will never change?
If your Power BI file has reports in it, publishing will overwrite them. What I think is happening is that your users are updating the report in your dataset, for example adding a new tab, so when you load changes it overwrites them.
You need to decouple the dataset from the reports. You can load your dataset without any reports, then build reports using two methods:
1) Connect to the dataset from Power BI Desktop, using the Power BI Dataset connector. That links to the dataset via a Live Connection, so you can build your reports from it and then publish the report. Any changes to the dataset will not overwrite it.
2) For the web builder you can do the same: create a new report in the portal and select a dataset from that workspace to build your report on. The user then needs to 'Save as' and give the report a new name.
Both methods just link to your dataset rather than building a report directly in its file.
Hi, I am currently trying to develop a reporting tool. There is a SharePoint Online list which contains various SharePoint sites. My objective is to retrieve all SharePoint sites from that list and connect to the Documents folder of each retrieved site. I am able to connect to all documents in Power BI Desktop, but the refresh fails in the Power BI service saying: Unable to refresh the model because it references an unsupported data source.
Here is the logic that I am using to connect to the Documents folder of all sites.
Main Query -
let
    Source = SharePoint.Tables("https://xxxxx.sharepoint.com/sites/Projects/", [ApiVersion = 15]),
    #"xxxxxxxxxxxxxx" = Source{[Id="xxxxxxxxxxxxxx"]}[Items],
    #"Renamed Columns" = Table.RenameColumns(#"xxxxxxxxxxxxxx",{{"ID", "ID.1"}}),
    #"Expanded SiteUrl" = Table.ExpandRecordColumn(#"Renamed Columns", "SiteUrl", {"Description", "Url"}, {"SiteUrl.Description", "SiteUrl.Url"}),
    #"Removed Other Columns" = Table.SelectColumns(#"Expanded SiteUrl",{"Title", "Id", "SiteStatus", "ProjectCode", "SiteUrl.Url"}),
    #"Documents" = Table.AddColumn(#"Removed Other Columns", "Documents", each GetList([SiteUrl.Url], "Documents"))
in
    #"Documents"
Below is the code for GetList function -
= (siteURL,listname) =>
let
Source = SharePoint.Tables(siteURL,[ApiVersion = 15]),
#"MyListData" = Source{[Title=listname]}[Items]
in
#"MyListData"
I have taken help from this article, which is very well written:
https://marque360.com/aggregating-sharepoint-list-data-in-power-bi/
I am not sure why this works in Power BI Desktop but reports an unsupported data source in the Power BI service.
Could anyone please guide me on how to get this refresh working in the Power BI service?
Try using the "Auto" value for ApiVersion in the query, e.g. SharePoint.Tables(siteURL, [ApiVersion = "Auto"]). See connection support for SharePoint.
When entering the URL for the SharePoint lists, enter the root site collection URL and then provide the correct credentials, e.g. the LDAP login credentials.
Enter the URL with the full path (http:///app/_api/web/Lists/GetByTitle('')/Items?$select=).
Ref: Syntax
SharePoint.Tables(url as text, optional options as nullable record) as table
About
Returns a table containing a row for each List item found at the specified SharePoint list, url. Each row contains properties of the List. options may be specified to control the following options:
ApiVersion : A number (14 or 15) or the text "Auto" that specifies the SharePoint API version to use for this site. When not specified, API version 14 is used. When Auto is specified, the server version will be automatically discovered if possible, otherwise version defaults to 14. Non-English SharePoint sites require at least version 15.
If you created your datasets and reports based on a Power BI Desktop file on SharePoint Online, Power BI performs another type of refresh, known as OneDrive refresh. For more information, see Get data from files for Power BI
Unlike a dataset refresh during which Power BI imports data from a data source into a dataset, OneDrive refresh synchronizes datasets and reports with their source files. By default, Power BI checks about every hour if a dataset connected to a file on OneDrive or SharePoint Online requires synchronization.
Note: It can take Power BI up to 60 minutes to refresh a dataset, even once the sync has completed on your local machine and after you've used Refresh now in the Power BI service.
The dataset settings page only shows the OneDrive Credentials and OneDrive refresh sections if the dataset is connected to a file in SharePoint Online. Datasets that are not connected to source files in SharePoint Online don't show these sections.
In most cases, Power BI datasets that use dynamic data sources cannot be refreshed in the Power BI service. There are a few exceptions in which dynamic data sources can be refreshed in the Power BI service, such as when using the RelativePath and Query options with the Web.Contents M function. Queries that reference Power Query parameters can also be refreshed.
For refresh issues related to dynamic data sources, including data sources that include hand-authored queries, see refresh and dynamic data sources