Hello, I am working on a dashboard that needs to combine two tables: a Sales table and a Sales Order table. The two tables come from different sources: the Sales table from SAP BW and the Sales Order table through a dataflow. Both tables have a number of applied steps that delete, reformat and rename columns. I then want to append the Sales Order table to the Sales table. In the Desktop version everything works, even if I hit "Refresh All" in the Power Query Editor. But when I publish the report to the service and refresh, I get an error saying: The key didn't match any rows in the table
After troubleshooting for the last couple of days, I'm pretty sure the issue has to do with my column names being renamed. However, it's strange that the append would work in Desktop but not in the service.
My main questions are:
Can you append tables after multiple applied transformation steps?
Can the tables both refresh daily or does one or the other have to be static?
Does the order of the table queries in the Query Editor affect the order in which tables are loaded, or does everything load all at once? Can this order affect my append query?
Any suggestions or help would be greatly appreciated. I am 99% of the way to launching this new report but this refresh issue is driving me crazy.
You can append tables after multiple applied transformation steps.
Both tables can refresh daily.
The order of the table queries in the Query Editor doesn't matter; Power BI determines the order in which they are loaded. For example, if you want to append a table to another table, that other table is loaded first.
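For illustration, here is a minimal M sketch of that pattern (the query names "Sales" and "Sales Orders" are hypothetical): each source query keeps its own applied steps, and the append lives in a third query.

let
    // "Sales" and "Sales Orders" are the two fully transformed queries;
    // Table.Combine appends their rows and matches columns by name.
    Appended = Table.Combine({Sales, #"Sales Orders"})
in
    Appended

The one thing to watch is that the final column names of both queries match exactly (matching is case-sensitive); a column that exists in only one of them comes through filled with nulls rather than raising an error.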
Thanks for answering those questions, Peter. I concluded that the issue had something to do with appending a dataflow. I switched to a new data source from SAP BW and it's working now. What's interesting is that the dataflow is still being imported and refreshed daily, so the issue had to do with just the append.
I have created a dataset in Power BI web, and then imported it on Power BI desktop (using File -> Get data -> Power BI datasets).
I don't see a way of adding Calculated Columns. Does the dataset need some special configuration to enable this?
Based on my experience you won't be able to see the columns because it's using DirectQuery. You can still add some columns and measures, but you're basically flying blind: you can select the table in the report (visual) view and add a column, but you'll have to play a guessing game. If you know your dataset, you should be fine.
However, a dataflow wouldn't cause the issue you're having.
Good evening!
We are currently having an issue with the below error being thrown when trying to refresh a dataset via the on-premises gateway. Refreshing directly through Power BI Desktop does not produce any errors and seems to refresh fine.
Data source error:
{"error":{"code":"DM_GWPipeline_Gateway_MashupDataAccessError","pbi.error":{
  "code":"DM_GWPipeline_Gateway_MashupDataAccessError","parameters":{},
  "details":[
    {"code":"DM_ErrorDetailNameCode_UnderlyingErrorCode","detail":{"type":1,"value":"-2147467259"}},
    {"code":"DM_ErrorDetailNameCode_UnderlyingErrorMessage","detail":{"type":1,"value":"The key didn't match any rows in the table."}},
    {"code":"DM_ErrorDetailNameCode_UnderlyingHResult","detail":{"type":1,"value":"-2147467259"}},
    {"code":"Microsoft.Data.Mashup.ValueError.Key","detail":{"type":1,"value":"[entity = \"Forecast\"]"}},
    {"code":"Microsoft.Data.Mashup.ValueError.Reason","detail":{"type":1,"value":"Expression.Error"}}
  ],"exceptionCulprit":1}}}
Table: FACT - Cost Forecast Tool.
Cluster URI: WABI-EUROPE-NORTH-B-redirect.analysis.windows.net
Activity ID: f4b629a6-a9bc-4966-954c-ae37139737a4
Request ID: db16cb6d-a765-1e0e-f9f5-b8803c8baa6e
Time: 2020-11-25 17:20:30Z
From reading previous posts, I'm not sure any of the normal responses apply in this scenario, as they tend to relate to either pulling data from SQL Server, or Excel files (where the sheet name causes an issue).
The table throwing the error (FACT - Cost Forecast Tool) is made by appending two other tables - one comes from a Power BI dataflow (Forecast), and one from a folder import of CSV files (Forecast Tool Adjustments).
Publishing a model with just the two individual tables (Forecast and Forecast Tool Adjustments) works and refreshes fine.
The above error only occurs when a third table is added combining these two tables via a one-liner in Power Query:
= Table.Combine({#"Forecast Tool Adjustments", Forecast})
Oddly, the value flagged in the error, "[entity = "Forecast"]", is one of the first steps in Power Query for the 'Forecast' table. So while that table refreshes fine on its own, it seems to throw an error when refreshed through the combined table?
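For context, that flagged key is the entity-navigation step the dataflow connector generates. The start of the 'Forecast' query typically looks roughly like this (the GUIDs below are placeholders):

let
    Source = PowerBI.Dataflows(null),
    // Navigate to the workspace and dataflow (IDs are placeholders)
    Workspace = Source{[workspaceId = "00000000-0000-0000-0000-000000000000"]}[Data],
    Dataflow = Workspace{[dataflowId = "00000000-0000-0000-0000-000000000000"]}[Data],
    // This lookup is what the error points at: "The key didn't match any rows"
    // means no row with entity = "Forecast" was found when the service ran it.
    Forecast = Dataflow{[entity = "Forecast"]}[Data]
in
    Forecast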
Any thoughts would be greatly appreciated.
I'm using PowerBI Desktop Version: 2.87.762.0 64-bit (November 2020).
Reading data from ADLS Gen 2.
I have been trying to refresh the data in PowerBI to bring in new columns but can’t seem to do it.
I have data on ADLS which I imported into PowerBI.
Now in the source ADLS table I have additional columns.
I would like to refresh the data in PowerBI for it to read the latest tables and hence bring in the new columns.
I have tried refreshing but it doesn’t seem to work.
Tried clicking on "Refresh" in the "Report" view (which, from reading online, will not work).
Tried refreshing in the Power Query Editor (which I think should have worked).
But the new columns don’t show up in PowerBI.
I can download the data from ADLS and confirm that new columns are present.
What am I missing?
There are a few things you can check:
Can you see the new columns in the Power Query Editor? Try going a few steps back in the transform history.
E.g. sometimes if you "expand" a column it will not add new columns, and you will need to add those afterwards, i.e. do the expand transform again (see the sketch after this list).
By the way, "Refresh All" in the Power Query Editor only refreshes the preview data.
If you see the data in the Power Query Editor, try hitting refresh in the Power BI Desktop window.
If there is no data in the Power Query Editor, use another tool to look at the source to confirm that the data is actually present in the stream that PowerBI has access to.
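To illustrate the expand point above, here is a hedged sketch (the step and column names are made up, and it assumes a prior step "Source" whose "Data" column holds nested tables). A recorded expand step pins the column list as of the moment it was created, so columns added to the source later are silently dropped:

let
    // A recorded step would hard-code the list, e.g.:
    //   Table.ExpandTableColumn(Source, "Data", {"Col1", "Col2"})
    // and never show a column added to the source afterwards.
    // Deriving the list at refresh time picks up new columns as well:
    CurrentColumns = Table.ColumnNames(Source{0}[Data]),
    Expanded = Table.ExpandTableColumn(Source, "Data", CurrentColumns)
in
    Expanded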
I am in the process of creating a dashboard in Power BI with multiple people. Currently I have 4 entities in a dataflow that feed a dataset, which is then visualized in reports. I recently added a column to one of my entities that I would like to show up in a report that is already created. However, despite the column being added to the entity (it shows up when I try to create a new report), it isn't displayed in the older report. How can I get my new column to display in an already created report?
You need to open the old report, go to the Query Editor and refresh the preview for it to pick up the new column.
You may also have to go through the applied steps to make sure the column is not being removed, for example by a step that reduces the columns down via a selection (see the sketch below). When you create a new report you can see the column because you are getting the dataflow table structure without any history in the query. Note this is not just for dataflows, but for most types of connection where the structure changes, for example CSV, Excel, etc.
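As an illustration of how an applied step can silently drop a new column (the column names below are made up): a "Choose Columns" step generated in the editor pins an explicit list, so anything added to the dataflow entity afterwards is filtered out at that step until the list is updated.

    // A "Choose Columns" step pins an explicit list; any column added to the
    // dataflow entity later is removed here until the list is edited.
    #"Removed Other Columns" = Table.SelectColumns(Source, {"Project", "Month", "Cost"})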
Check if the source dataset is set to private by the person who published the report. Changing this might grant you access to the source dataset.
I'm building a model in Azure Analysis Services. The model should contain only data for the last 3 months and is processed every day.
I have a separate date dimension that has a relationship with a fact table using a date key. I'm using a Power Query to load only the last 3 months into the date dimension. In the Power Query that loads the fact table I used Table.NestedJoin to load only the rows that have a value in the date table.
When I do this, processing the model takes forever. After some troubleshooting I saw that the query Analysis Services uses to retrieve data from the SQL database retrieves all rows. So, am I correct in saying that AS loads all the data before it merges the rows? Is there a way to change this? Or is there a better way to achieve my solution?
Kind regards,
Joins are super slow in Power Query. You should avoid them if you can do the join in the data source, or use normal relationships in the data model instead.
Also, you can set up the date dimension in DAX and dynamically populate it to contain only the dates present in the fact table.
As for the load of all the data: the join isn't folded back to the source, so the data is fetched as-is and only then does Power Query apply the transformations (the join).
You can modify the query in the Power Query Editor / Advanced Editor to add a WHERE clause directly in the query (see the sketch below).
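For example, a hedged sketch of that approach (the server, database, table and column names are placeholders, and it assumes the fact table lives in SQL Server): passing a native query filters the rows at the source instead of joining them away in Power Query.

let
    // The WHERE clause runs on SQL Server, so only the last 3 months of rows
    // cross the wire instead of the whole fact table being fetched and joined.
    Source = Sql.Database("myserver", "mydb", [
        Query = "SELECT * FROM dbo.FactSales WHERE OrderDate >= DATEADD(month, -3, CAST(GETDATE() AS date))"
    ])
in
    Source

This replaces the Table.NestedJoin entirely, so Analysis Services only ever processes the filtered rows.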