Visual Studio Analysis Services Extension not deploying tabular Measures - visual-studio-2017

I have a fairly simple tabular sales model (Customers, Products, Sales) deployed to an SSAS 2016 server in tabular mode, with a few measures set up on the Sales table: things like total value, total kgs, total units. After deploying the model, the measures do not show up when connecting with Power BI. Is there a setting I might be missing somewhere that could stop the measures from showing up?
Update
One of the things I have tried is running the Create Script task in SSMS to see whether the measures are included in the SSAS database create script, and it appears they are missing from that angle as well.

I have figured it out. The issue was the separator between the measure name and the formula: I had just the equals sign (=) instead of the colon-equals (:=) that DAX expects. It wasn't until I tried removing the name and leaving the bare formula, much like an Excel formula, that the formula editor flagged my mistake and fixed up the measure name. At that point I spotted the missing colon in my naming.
The measures are now calculating as expected in VS and they now show up in the Tabular Model Explorer.
The odd thing is that there were no errors at all to indicate a problem; the deployment just succeeded. So that is one to watch when your measures don't show up.
Update
I have deployed to my on-premises 2016 server and the measures now show up.
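For reference, the difference comes down to a single character; a measure named "Total Value" over an assumed Sales[Value] column would look like this:

```dax
-- Wrong: a plain equals sign is Excel-style formula syntax, not a
-- DAX measure definition, so the name silently doesn't stick
Total Value = SUM ( Sales[Value] )

-- Right: a tabular measure definition separates name and formula
-- with colon-equals
Total Value := SUM ( Sales[Value] )
```

The measure name and column here are illustrative, not taken from the model in the question.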

Related

Calculation Groups not refreshed in Power BI Service

I have a report using a calculation group for currency conversion, which works fine in PBI desktop. When published to the service, none of the visuals will render and they show the following error:
The query referenced Calculation Group Table 'CurrencyConversion'
which does not hold any data because it needs to be recalculated or
refreshed.
I have refreshed the dataset a couple of times, but there's no change. How do I fix this?
EDIT:
This seems to have healed itself overnight. The dataset is currently not scheduled for a refresh, so I would still like to know how it sorted itself out. I can't have an outage when we deploy this to production.
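One thing worth trying when this error appears (a sketch, not a confirmed fix for this case): the error text says the calculation group table "needs to be recalculated", and a TMSL refresh of type `calculate`, run from SSMS or over the XMLA endpoint, performs exactly that recalculation without reloading data. The database name below is hypothetical:

```json
{
  "refresh": {
    "type": "calculate",
    "objects": [
      { "database": "MyDataset" }
    ]
  }
}
```

A full `type: "full"` refresh also recalculates, which may explain why an ordinary refresh eventually cleared the error.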

About in-memory mode working logic in Power BI

When going through the DAX overview documentation at https://learn.microsoft.com/en-us/dax/dax-overview,
under the Calculated Column heading, I couldn't understand the highlighted part of the following statement:
"(calculated) Column values are only recalculated if the table or any related table is processed (refresh) or the model is unloaded from memory and then reloaded, like when closing and reopening a Power BI Desktop file."
Does it mean that whenever we close the Power BI Desktop file, the model is unloaded from memory? Is the memory used like a cache?
Would anyone please help me understand what actually happens in practice? Thank you for your valuable time and support!
The main point of this remark is to distinguish the difference between measures and calculated columns since the distinction is often not clear to report builders.
A measure is dynamically calculated on-demand when the end-user interacts with the report via slicers or any other sort of filtering and interaction.
A calculated column is only calculated when the model is refreshed. As a result, calculated columns cannot be dynamically responsive to the end-user since they have already been computed before the user can interact with the report.
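To make the distinction concrete, here is a sketch of both constructs in DAX (the table and column names are illustrative):

```dax
-- Calculated column: evaluated row by row when the table is
-- processed, then STORED in the model; it cannot react to slicers
Sales[Margin] = Sales[Amount] - Sales[Cost]

-- Measure: evaluated at query time, in whatever filter context the
-- user's slicers and visual selections create
Total Margin := SUM ( Sales[Amount] ) - SUM ( Sales[Cost] )
```

Filtering the report to one product changes what Total Margin returns, while each row's Margin value stays whatever was computed at the last refresh.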
Power BI Desktop is Excel on steroids. It is an application, and as with every application, when you close it, it is unloaded from memory. It's great and you can do a lot with it.
It has a refresh button; when you press it, your calculated columns are recalculated.
If you want dashboards and reports served so they can be shown in other (web) applications, you want the Power BI service. It runs in the cloud, and you can refresh the data daily, hourly, or on demand; you can schedule all of this in the admin tools.
It is logical that your calculated columns are recalculated only on a refresh: that is the only moment your raw data changes, which in turn affects your calculated columns.

Is there a way to query for how many times a user made changes to a TFS ticket?

I'm looking at setting up a query in TFS that tells me how many times a user changed a particular TFS ticket. Is this a possibility in TFS2017 or even in Power BI Desktop?
For example, creating a query in TFS with the following added:
And - Changed By - Was Ever - Smith.John
gives all tickets "Smith.John" ever changed but doesn't tell me how many times he's made changes to them.
Example Image: https://imgur.com/AIUIwPG
From the image above, I'm looking for an answer of 4 (Smith.John changed the ticket 4 times), but the ticket only appears once in the query results. I know this is likely not possible in TFS itself, so I'm hoping for a solution in Power BI Desktop.
Is there a way to query for how many times a user made changes to a TFS ticket?
Indeed, it is not possible in TFS itself; query results are a list of work items, not the details of each work item.
There is an extension that uses the TFS API to pull the full history for one specific work item:
TFS SDK: Work Item History Visualizer using TFS API
If this extension does not meet your needs, you could use the API to output all changes in a work item's history:
TFS API: TFS Work Item All Changes History
Then export that grid in whatever format you want and save it locally.
If none of the above is what you want and you would rather solve this in Power BI Desktop, the following blog post should give you some help:
Querying the VSTS Work Items API directly from Power BI
Hope this helps.
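Since the TFS 2017 REST API exposes a work item's revision history through its "updates" endpoint, another option is to pull that history and count the revisions yourself. A rough sketch in Python: the collection URL, work item ID, and authentication are placeholders, and the `revisedBy`/`uniqueName` field names are assumptions about the shape of the API's update records, so verify them against your server's responses.

```python
import json
import urllib.request

def count_changes_by(updates, user):
    """Count update records made by `user` (case-insensitive match)."""
    return sum(
        1
        for u in updates
        if u.get("revisedBy", {}).get("uniqueName", "").lower() == user.lower()
    )

def fetch_updates(collection_url, work_item_id):
    # GET .../_apis/wit/workItems/{id}/updates returns one record per
    # revision, including who made it; add whatever auth your server
    # requires (e.g. a personal access token header) before calling.
    url = f"{collection_url}/_apis/wit/workItems/{work_item_id}/updates?api-version=1.0"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)["value"]

# Hypothetical usage:
# updates = fetch_updates("http://tfs:8080/tfs/DefaultCollection", 1234)
# print(count_changes_by(updates, "Smith.John"))
```

The resulting per-user counts could then be loaded into Power BI Desktop as an ordinary table instead of counting revisions by hand.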

Power BI "Apply query changes" taking unfeasible time to run

I'm using Power BI Desktop to get data from a local SQL Server instance. I have 60+ queries in the Queries pane, most of them with 1 million+ rows. For reasons I don't fully understand, I'm frequently prompted to "apply query changes". Each time Power BI decides to do this, it starts updating all of the queries, which takes an unfeasible amount of time; I don't know how many hours it would take, because it's so slow I have to cancel the process. I've tried changing several Data Load settings, but without results. Is there a workaround for this problem, for example an option to apply changes incrementally, one query at a time? Applying query changes to all queries at once, as Power BI does, seems unworkable.

Strange Power BI behaviour: cannot explore dataset

I am sending data from Stream Analytics to Power BI.
Most of the time it works fine, but from time to time I upload large amounts of data for test purposes. I notice that my SU utilization hits 100% and I get error messages in Stream Analytics warning that I might encounter strange behaviour because of it.
In most of these cases, when I go to Power BI I can see my datasets, but when I try to explore them I get nothing; the table page does not open.
Best to connect with the Stream Analytics folks on this. If you've maxed out your SA jobs, you might try scaling up your SA implementation. You should also double-check how much data you're sending to Power BI, and how often. You want a tumbling window of 1 second or greater for most purposes (Power BI updates the dashboards every second), and choose an appropriate amount of data so you don't get throttled on the Power BI side (see restrictions: https://powerbi.microsoft.com/pricing).
HTH,
-Lukasz
Power BI is currently in preview, and there are still some silent bugs here and there.
Two silent bugs that come to mind are:
Power BI currently does not support embedded JSON objects. If a single entry containing an embedded object goes into Power BI, it will break the entire table.
Table schemas are set in stone. If you later decide you want to change your schema (by changing the SELECT statement in Stream Analytics), you have to output to an entirely different table. If two different schemas go into the same table, the table will break and fail to load, as you have mentioned.
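As an illustration of that second point (the input, output, and field names below are hypothetical), changing the job's SELECT should go hand in hand with pointing it at a fresh Power BI output:

```sql
-- Original job query: two fields going to the output [pbi-sales]
SELECT DeviceId, Temperature
INTO [pbi-sales]
FROM [iot-input]

-- After adding a field, target a NEW output/table rather than
-- reusing [pbi-sales], whose schema is already fixed
SELECT DeviceId, Temperature, Humidity
INTO [pbi-sales-v2]
FROM [iot-input]
```

Alternatively, deleting the old dataset in Power BI lets the job recreate it with the new schema.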