I have a report using a calculation group for currency conversion, which works fine in Power BI Desktop. When published to the Service, none of the visuals render; they all show the following error:
The query referenced Calculation Group Table 'CurrencyConversion' which does not hold any data because it needs to be recalculated or refreshed.
I have refreshed the dataset a couple of times, but there's no change. How do I fix this?
EDIT:
This seems to have healed itself overnight. The dataset is currently not scheduled for a refresh, so I would still like to know how it sorted itself out. I can't have an outage when we deploy this to production.
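In case it helps anyone else: the error text says the calculation group table "needs to be recalculated", which points at the model needing a recalculation after deployment. If the dataset is reachable over the XMLA endpoint (Premium/PPU workspaces), one way to trigger that recalculation deliberately is a TMSL refresh of type "calculate"; this is a sketch, and the dataset name is a placeholder:

    {
      "refresh": {
        "type": "calculate",
        "objects": [
          { "database": "YourDatasetName" }
        ]
      }
    }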
When I refresh the Power BI report in the Power BI Service, I end up getting an empty table.
When I refresh it in Power BI Desktop, I don't have any issues. In the online environment (Power BI Service), the report displays fine on first upload, but when I refresh the dataset and the report I end up with the empty table shown, or a summarized version of it.
At first I thought this was a datatype issue, so I changed the week of year to the start date of that week, but that didn't work either. I tried both a date table of my own and an external one; both gave me the same issue.
Any suggestions on what I can try to make this work? Perhaps a different way to display weeks in Power BI? Thanks in advance.
[Screenshots: failed refresh 1, failed refresh 2, how it should be]
I have found the solution.
When I loaded the data into Power BI, I transformed the date column from date/time to date in Power Query. This seems to have been what broke the refresh.
Once I deleted that transformation and changed my separate date table to date/time as well, I re-uploaded the report and it no longer showed a bad refresh.
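For reference, the fix boils down to a Power Query (M) step like the one below; this is a sketch, with an inline stand-in source and a placeholder column name OrderDate rather than anything from the original report:

    let
        // Stand-in for the real source query: a tiny inline table with a datetime column.
        Source = #table(
            type table [OrderDate = datetime],
            {{#datetime(2023, 1, 2, 0, 0, 0)}}
        ),
        // Keep the column typed as datetime (do not convert it to date),
        // so it matches the date/time type of the separate date table.
        KeepDateTime = Table.TransformColumnTypes(Source, {{"OrderDate", type datetime}})
    in
        KeepDateTime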
I need help with improving refresh times on a Power BI dashboard with about 20M rows of data and 80 columns, pulling from SQL Server. I cannot use the Power BI Service in any capacity; this has to load into Power BI Desktop.
My refresh times on the raw data (virtually no transformations in Power Query) are taking about 3-4 hours.
Microsoft recommends incremental refresh to archive my historical data and only refresh the latest changes, but that requires the Service, and I 100% cannot use it.
Is there any other way to significantly improve my refresh times beyond the Service's incremental refresh? If it were under an hour, I'd be happy.
What I've tried:
Native query, to leverage the server (see the M sketch after this list)
Reducing column selections
Removing all transformations
Splitting tables in Power Query and selectively turning off refresh on the historical tables; as soon as they get stacked/appended, Power Query triggers a refresh on all stacked tables, regardless of which ones have refresh turned off
Looking into Power Query (PQFL/M) code to control refresh of tables; I can't find any method/property for this in M
Optimizing the SQL; I haven't gotten any significant improvements
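For the native query attempt, this is roughly the shape it takes in M; the server, database, and query text are placeholders, and the EnableFolding option asks Power Query to treat the native query as foldable so later steps can still fold to the server:

    let
        Source = Sql.Database("myserver", "mydb"),
        // Push column selection and filtering down to SQL Server
        // instead of doing it in Power Query.
        Result = Value.NativeQuery(
            Source,
            "SELECT col1, col2, col3 FROM dbo.FactSales WHERE LoadDate >= '20240101'",
            null,
            [EnableFolding = true]
        )
    in
        Result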
20 million rows should not take that long, especially with no transformations. Something else is going wrong, but without access to your data and hardware it is impossible to say.
One possibility is to do an initial data load and then turn off refresh on that query. Add a new query for just the new data (which should be quick), but load it into a completely new table. In Power BI you will then have two tables. Create a calculated table in DAX that is a union of your old, non-refreshed data and your new data (see the sketch below). Refreshes should be very quick after the first load, but you obviously need to think about how this scales as your data grows.
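A minimal sketch of that calculated table, assuming the two query tables are named SalesHistory and SalesNew (placeholder names); note that DAX's UNION matches columns by position, so both queries must output the same columns in the same order:

    SalesCombined =
    UNION (
        SalesHistory, -- the large one-time load with refresh disabled
        SalesNew      -- the small query that picks up only recent rows
    )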
Please help me resolve the following issue.
There are 6 reports in my workspace connecting to the same database, but only one report has a problem when its dataset is refreshed. Its scheduled refresh fails with this error:
Column '....' in table '....' contains a duplicate value '.....' and this is not allowed for columns on the one side of a many-to-one relationship or for columns that are used as the primary key of a table. Table: ......
But when the dataset is refreshed manually an hour or so later, it succeeds.
There was no issue initially, but the report has been failing daily for the past month.
My analysis:
1. There are no duplicates in the column mentioned in the error message, as we use "distinct" in Transform Data (see the sketch at the end of this post). I have checked the data as well and could see no duplicates.
2. I am not using incremental refresh.
3. The gateway connection is working fine, as the rest of the reports refresh successfully and the connection status also shows as successful.
4. There are no changes made to the database during the interval between the scheduled and the manual refresh.
I want to know what would cause this issue.
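For context, the "distinct" step mentioned in point 1 is typically something like the M below (the column name is a placeholder). One known gotcha worth ruling out: Table.Distinct compares text case-sensitively, while the Power BI storage engine treats values differing only in case as duplicates, so a case-insensitive comparer is the usual fix when "distinct" data still trips this error:

    let
        // Stand-in source: two keys that differ only in letter case.
        Source = #table(type table [CustomerKey = text], {{"abc"}, {"ABC"}}),
        // Plain Table.Distinct keeps both rows ("abc" <> "ABC" in M),
        // but the engine would still flag them as duplicates.
        CaseSensitive = Table.Distinct(Source, {"CustomerKey"}),
        // Deduplicating case-insensitively avoids that mismatch.
        CaseInsensitive = Table.Distinct(Source, {{"CustomerKey", Comparer.OrdinalIgnoreCase}})
    in
        CaseInsensitive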
I am using Azure Logic App workflows to refresh a Power BI dataset every 5 minutes. The workflow was working for several weeks without any issues, but for the last 2 days the dataset refresh has been failing (message: "Another refresh running at same time"). I have attached a picture of the refresh status. Though it shows every alternate refresh status as Success, the Power BI dashboard shows the same numbers all the time.
Is there a way to fix this? The steps I tried: disabled and enabled the flow, and deleted the flow and created it again, but it's the same problem.
Also, there are other reports that do not have this issue. I am using a service account for the API connection. Is there a way to check if there is another refresh running at the same time? Thanks for any inputs.
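On the last question: the Power BI REST API exposes per-dataset refresh history, and a refresh that is still in flight is reported with status "Unknown" until it completes. A minimal Python sketch, assuming you already have an Azure AD access token for the service account; the workspace and dataset IDs are placeholders:

    import requests

    # Placeholders: substitute real values.
    GROUP_ID = "<workspace-id>"
    DATASET_ID = "<dataset-id>"
    TOKEN = "<access-token>"  # AAD token for the service account

    url = (
        f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
        f"/datasets/{DATASET_ID}/refreshes?$top=10"
    )
    resp = requests.get(url, headers={"Authorization": f"Bearer {TOKEN}"})
    resp.raise_for_status()

    for refresh in resp.json()["value"]:
        # status is "Completed", "Failed", "Disabled" or "Unknown";
        # "Unknown" means the refresh is still in progress.
        print(refresh["startTime"], refresh.get("endTime"), refresh["status"])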
I have a Power BI dashboard (deployed to a report server) where I have imported the data (query folding is in place), but every time I click on the report link it shows a 'Load data' message twice before it displays the report.
To test this on something simpler, I created a Power BI dashboard that imports one view with only 3 columns and deployed it to the server, but I still get the 'Load data' message twice every time I click on the report link.
Why is Power BI showing this message twice, and is there a way to disable it? It is causing delays when loading other reports.
Is it because of one of the following?
Data eviction
Is it due to how Import uses the VertiPaq engine to store data in memory?
Power BI wants to know the schema of the table before the query actually runs, so it asks Power Query to return the top 0 rows. Unfortunately, in this case query folding can't take place and the top 0 filter can't be pushed back to the database, so the entire query gets run once to get the schema and once to get the data (see the sketch below).
Or is there another reason for this, and is it possible to disable it?
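For what it's worth, the top-0 schema check described above corresponds to an M pattern like the following (a sketch; server, database, and table names are placeholders). When folding works, the zero-row step reaches the database as roughly SELECT TOP 0 ...; when it cannot fold, the whole query has to run just to return zero rows:

    let
        Source = Sql.Database("myserver", "mydb"),
        FactSales = Source{[Schema = "dbo", Item = "FactSales"]}[Data],
        // Schema probe: ask for zero rows. If this folds, SQL Server receives
        // something like "SELECT TOP 0 col1, col2, ... FROM dbo.FactSales".
        SchemaOnly = Table.FirstN(FactSales, 0)
    in
        SchemaOnly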