A table with some sales data has an associated running total measure. When viewed in the Data view of Power BI Desktop, the data does reflect an aggregated total.
However, when applied to a line chart, the running total shows only the monthly totals. The expectation is that a running total never decreases (assuming only positive sales) and that the line chart reflects the values in the measure, so month by month it should actually be 500, 1500, 3000.
Update 1: As per Foxan's suggestion - same result:
Update 2: Works when using an index instead of a date (dd/MM/yyyy):
Your ISONORAFTER filter should be based on Date instead of Spend for a running total (or, in the absence of a date column, on the column that identifies the order you're trying to summarize, e.g. an incremental index), i.e.
Spend running total in Date =
CALCULATE(
    SUM(Spend[Spend]),
    FILTER(
        ALLSELECTED(Spend),
        ISONORAFTER(Spend[Date], MAX(Spend[Date]), DESC)
    )
)
It's causing some confusion here because your sample data in the Spend column happens to be in ascending order of values (100 -> 200 -> 300). If you replace it with some random data, you'll notice the original measure doesn't work in the first place.
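For comparison, a minimal alternative sketch that avoids ISONORAFTER and filters on the date directly (same Spend table and Date column as above):
Spend running total (alt) =
VAR CurrentDate = MAX(Spend[Date])
RETURN
    CALCULATE(
        SUM(Spend[Spend]),
        FILTER(
            ALLSELECTED(Spend),
            Spend[Date] <= CurrentDate    -- accumulate everything up to the date on the axis
        )
    )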
I have a table (we'll just call it MyTable) in Power BI that includes a Date column. I have created a DateDimension table using CALENDARAUTO() with columns for the Date, Month#, MonthName, etc. and created the relationship between that Date and the Date in MyTable.
I want to calculate the average year based on the records in MyTable. For example, if I have records with 1/1/2005, 1/1/2014, and 1/1/2015, the average year should be 2011.
How would I go about doing that? I've tried doing a measure of AVERAGE(DateDimension[Year]) and adding that to a Card visual, but it just shows 2.01K. If I do FORMAT(AVERAGE(DateDimension[Year]), "####"), the average is completely wrong.
Since you already have this DateDimension table, use
Average =
AVERAGEX(
    MyTable,
    RELATED(DateDimension[Year])
)
You can control the formatting by clicking on the measure (Average) and then using the Measure tools ribbon:
Set the format to Whole number with no thousands separator.
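If you would rather handle it in DAX, a small sketch using FORMAT instead of the ribbon settings (note that FORMAT returns text, so the result can no longer be aggregated):
Average Year Text =
FORMAT(
    AVERAGEX(MyTable, RELATED(DateDimension[Year])),
    "0"    -- whole number, no thousands separator
)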
I am trying to return the percentage of the grand total from the number of clients, using the second and fourth columns. The client values are text and are collected using Count('Table'[Column]), which is where I have run into issues. When I try COUNTROWS() or ALLSELECTED() to work around it, it returns all the rows and doesn't keep the filters I have set.
My current measures:
Client Total = COUNTA('Table'[Client_Name])
% with at least 1 document = DIVIDE(SUM('Table'[At Least 1 Document Sum]), [Client Total])
Right now, it only calculates the percentage based on the filtered number of clients in the same row rather than against the grand total. I am hoping to have it use the dynamic grand total that is shown at the bottom of the table.
[Current Table](https://i.stack.imgur.com/IG254.png)
Here is my solution:
Client Grand Total = CALCULATE(COUNTROWS('Table'), ALLSELECTED('Table'))
Then:
% with at least 1 document = SUM('Table'[At Least 1 Document Sum]) / [Client Grand Total]
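If you prefer to keep it in one measure, a hedged sketch of the same logic, using DIVIDE so a blank or zero denominator doesn't raise an error (names follow the question and are otherwise assumptions):
% with at least 1 document (single measure) =
DIVIDE(
    SUM('Table'[At Least 1 Document Sum]),
    CALCULATE(COUNTROWS('Table'), ALLSELECTED('Table'))    -- grand total of clients for the current selection
)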
I want to show an hourly score for today that refreshes every hour. I built it and it works, but now I want to compare it against the hourly average for last week: if today is higher it should show a red arrow up, and if lower a green arrow down. Adding the arrows isn't a problem as it's very easy, but I previously added a column to the query that flags whether day = today and used it as a filter inside the visual to show today's data, so when I try to compare the results the filter also affects the calculation I created:
Measure = CALCULATE(AVERAGE(rawdata[contacts]), rawdata[Week to Average] = 1)
Week to Average is the column that flags whether the row falls in the previous week; it is simply if(weekcolumn = weeknum(today()) - 1, 1, 0).
Do you know any way I can compare last week's average to today's data?
Also, the visual that I used is a matrix.
This is just an idea, so if it is enough to solve the case, great. If not, please add more info about your data table - a screenshot or sample data, how you calculate your average, and what you mean by average. At the very least, what kind of data are you dealing with: is it a sum of some values per day, do you have a different number of values for each day, etc.?
measure 1:
averContacts = AVERAGE(rawdata[contacts]) -- no CALCULATE()
measure 2:
avrToday =
CALCULATE(
    [averContacts],
    TREATAS({TODAY()}, rawdata[DatesColumn])    -- DatesColumn = your date column
)
measure 3:
aveLastWeek =
VAR prevWeekEnd = TODAY() - WEEKDAY(TODAY(), 2)    -- 2 -> Mon-Sun week format; this is last Sunday
VAR prevWeekStart = prevWeekEnd - 6                -- last Monday (CALENDAR includes both endpoints)
VAR DatesLastWeek = CALENDAR(prevWeekStart, prevWeekEnd)
RETURN
    CALCULATE(
        [averContacts],
        TREATAS(DatesLastWeek, rawdata[DatesColumn])
    )
measure for your visual:
lastWeek_vs_Today = DIVIDE([avrToday], [aveLastWeek])
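Since the question also mentions the arrows, here is a rough indicator sketch on top of those measures (the UNICHAR code points and the comparison direction are assumptions; the red/green colouring would still come from conditional formatting):
Trend Arrow =
IF(
    [avrToday] > [aveLastWeek],
    UNICHAR(9650),    -- up arrow when today is above last week's average
    UNICHAR(9660)     -- down arrow otherwise
)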
I have:
Dim table: Accounts, at customer granularity.
Fact table: PhoneCalls, at call-to-customer granularity.
I need to calculate and see the number of calls made to a customer up until the customer made their first deposit.
I was able to do this at the customer level, but at the Total level I get a weird result:
my measure:
ACC_Calls_2_FDP =
CALCULATE(
    COUNTROWS(PhoneCalls),
    PhoneCalls[disposition] = "ANSWERED",                        -- only calls that were answered
    PhoneCalls[calldate] <= MAX(Accounts[FDP_Date]),             -- up until the FDP date per customer
    USERELATIONSHIP(Accounts[AccountNo], PhoneCalls[AccountNo])  -- make the inactive relationship active
)
The results are:
At the Total level I was expecting to see 14, not 536.
What is going on? What am I missing?
The data model:
Filter direction: Accounts filters PhoneCalls
Cardinality 1:*
Power BI doesn't sum up the individual row values the way Excel does. Instead, the Grand Total is calculated as a row of its own, with no filter on AccountNo applied.
In this case, the measure takes the max value of FDP_Date across the whole table. So, for example, if ACC 1 has a max date of 3 and ACC 3 has a max of 10, the grand total calculation will use 10 as the max and count whichever rows have a calldate on or before 10.
Fixing this is tricky, as I don't have any more information than this, but I hope this explains the "bug" well.
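That said, one common workaround (only a sketch based on the column names in the question, not tested against your model) is to force the total to be the sum of the per-account results by iterating over the accounts:
ACC_Calls_2_FDP Total Fix =
SUMX(
    VALUES(Accounts[AccountNo]),
    VAR AccFDP = CALCULATE(MAX(Accounts[FDP_Date]))    -- FDP date of the current account
    RETURN
        CALCULATE(
            COUNTROWS(PhoneCalls),
            PhoneCalls[disposition] = "ANSWERED",
            PhoneCalls[calldate] <= AccFDP,
            USERELATIONSHIP(Accounts[AccountNo], PhoneCalls[AccountNo])
        )
)
At the account level this should return the same numbers as your measure, while the total becomes the sum of those per-account counts.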
I have a dataset of patients visiting several categories (SPECIALISM) of a hospital. A visit lasts a couple of hours each day. Each row in my dataset represents an hour that a patient is present for a certain hospital specialism.
Input
I want to calculate, for each hour of the day, the number of patients that are present on average, per specialism. I used the following measure:
daggem = AVERAGEX(VALUES('Date'[Date]), [distinctpat])
with distinctpat being a distinct count of patient IDs
This gives me almost the desired result, but the tails of the graph are too heavy: it shows an average of 1 patient during the night, although that 1 patient was there on only 1 specific day and normally it is zero. The average as I calculated it does not include all the other nights when there were zero patients, so I would like to obtain an average that is much lower (and more correct).
Output
You probably need an extra table with all the days and hours. Based on that, you will be able to find the hours without visits in your reports.
Here is an example of how to create a Date table with hours; add a relationship and use it in the calculation.
DatesWithHours =
VAR Dates = CALENDAR("2021-01-01 00:00:00", "2021-01-03 00:00:00")    -- adjust the range to cover your data
VAR DatesHour =
    GENERATE(
        Dates,
        ADDCOLUMNS(
            GENERATESERIES(1, 12, 1),    -- use (0, 23, 1) to cover every hour of the day
            "Hours", [Date] + TIME([Value], 0, 0)
        )
    )
RETURN
    DatesHour
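Once DatesWithHours is related to the fact table, a rough sketch of an average that counts the empty hours as zero (assuming your existing [distinctpat] measure; untested against your model) might look like:
daggemAllHours =
AVERAGEX(
    VALUES(DatesWithHours[Hours]),    -- every date/hour from the calendar, visited or not
    COALESCE([distinctpat], 0)        -- treat hours with no patients as 0 instead of blank
)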
That's the problem with AVERAGEX: it ignores blanks. Try a simple division instead.
daggem =
DIVIDE(
    COUNTROWS(TargetTable),
    COUNTROWS(Dates)
)