I have this PBIX folder.
In this folder, you will find the PBIX File and the source data.
EDIT:
Here is the sample data, as requested:
TxDate | Reference | Code | ItemGroup | Quantity | Sales
__________________________________________________________________________
24/02/2021 | AJI237677 | 2490/1008/999 | | 1 | 342144.5
28/02/2021 | AJI238993 | 1500/9999/999 | | 1 | 140000
13/04/2021 | AJI239912 | ATGS - Cut Pull Down | fabrication | 4 | 100
13/04/2021 | AJI239912 | AC 760 200 15060 A | Alu-Ext-Std-Mil | 8 | 2512
13/04/2021 | AJI239912 | AC 760 200 15060 A | Alu-Ext-Std-Mil | 6 | 1884
13/04/2021 | AJI239916 | ATGS - Cut Guilotine | fabrication | 2 | 250
13/04/2021 | AJI239917 | ATC252 SQR 60 A | Alu-Ext-Spe | 1 | 307
13/04/2021 | AJI239917 | ATGH - 25MM3TA | Hardware | 8 | 256
13/04/2021 | AJI239927 | ATGS - Cut Pull Down | fabrication | 1 | 0
13/04/2021 | AJI239927 | AAE 127 127 16060 A | Alu-Ext-Std | 4 | 324
13/04/2021 | AJI239929 | AHS 200 200 15060 A | Alu-Ext-Spe | 2 | 430
13/04/2021 | AJI239929 | ATGS - Cut Pull Down | fabrication | 1 | 0
13/04/2021 | AJI239933 | ATGH - 19MMSQCPC | Hardware | 4 | 56
13/04/2021 | AJI239933 | AHS 200 200 15060 A | Alu-Ext-Spe | 1 | 215
13/04/2021 | AJI239933 | AAU 500 250 16060 A | Alu-Ext-Std-Mil | 1 | 255
13/04/2021 | AJI239947 | AXSTAIRNOSING | Alu-Ext-Spe | 3 | 915
13/04/2021 | AJI239947 | ATGS - Cut Pull Down | fabrication | 1 | 0
13/04/2021 | AJI239947 | ATGH - SEIBLACK | Hardware | 30 | 240
13/04/2021 | AJI239950 | AS 202500125050 | Alu-Rol--She-Mil | 1 | 1240
13/04/2021 | AJI239957 | ATGS - Cut Guilotine | fabrication | 7 | 175
13/04/2021 | AJI239957 | AS 092500125050 P | Alu-Rol--She-Pre | 1 | 596
13/04/2021 | AJI239966 | AC 444 190 16060 A | Alu-Ext-Std-Mil | 1 | 252
Using this Sample Data, I'm sure you'll be able to replicate it.
I need to be able to calculate the Fabrication Sales.
To explain: each Product Item is sold to a Customer, but sometimes fabrication of the item needs to take place, e.g. welding, bending, etc.
So for some invoices (Reference), the product is sold with fabrication.
The customer needs to see the Total Fabrication Sales per Item and the average percentage of Fabrication Sales, i.e. the percentage of the total invoice that Fabrication takes up.
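For example, on reference AJI239912 in the sample above, the Fabrication line is 100 and the invoice total is 100 + 2512 + 1884 = 4496, so Fabrication makes up roughly 2.2% of that invoice.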
Using the following script in SQL, I'm able to replicate the required results:
with source as (
select
Code
, (select sum(ActualSalesValue) from _bvSTTransactionsFull t join _bvStockFull s on t.AccountLink = s.StockLink and ItemGroup = 'Fabrication' and tx.Reference = t.Reference and TxDate between '2020-03-01' and '2021-02-28') FabricationSales
, (select sum(ActualSalesValue) from _bvSTTransactionsFull t join _bvStockFull s on t.AccountLink = s.StockLink and ItemGroup <> 'Fabrication' and tx.Reference = t.Reference and TxDate between '2020-03-01' and '2021-02-28') OtherSales
, (select sum(ActualSalesValue) from _bvSTTransactionsFull t join _bvStockFull s on t.AccountLink = s.StockLink and tx.Reference = t.Reference and TxDate between '2020-03-01' and '2021-02-28') TotalSales
from _bvSTTransactionsFull tx join _bvStockFull st on tx.AccountLink = st.StockLink
)
, results as (
select
Code
, isnull(round(sum(FabricationSales),2),0) FabricationSales
, isnull(round(sum(OtherSales),2),0) OtherSales
, isnull(round(sum(TotalSales),2),0) TotalSales
from source
group by
Code
)
select
*
, isnull(iif(isnull(TotalSales,0)=0,0,FabricationSales/TotalSales),0) [Fabrication%]
from results
where FabricationSales>0
The results look like this:
I need to replicate this using a DAX Formula.
I'm calculating the Sales using this measure : Sales = SUM( Sales[Sales] )
Then I'm filtering the sales by Item Group using this measure:
Fabrication Sales =
CALCULATE( [Sales],
FILTER( ProductGroups, ProductGroups[StGroup] = "Fabrication" )
)
I've tried the following measure to get my required results, but I just can't seem to get it right:
Actual Fabrication Sales =
VAR InvoiceSales = SUMMARIZE( Sales, Sales[Reference], Products[Code], "InvSales", [Fabrication Sales] )
VAR TotalInvSales = SUMX( InvoiceSales, [InvSales] )
VAR ProductSales = SUMMARIZE( InvoiceSales, Products[Code], "ProductSales", TotalInvSales )
VAR Results = SUMX( ProductSales, [ProductSales] )
RETURN
Results
Could someone please help me with the correct DAX formula to get the required result?
If I could just get the correct DAX Formula to calculate the Fabrication Sales, I will be able to calculate the Quantity & Percentage.
EDIT:
Expected results as per #msta42a answer:
OK, maybe I'm missing something, but here we go. I split this into three measures: first, I get the sum of sales for the Fabrication ItemGroup within the scope of a Reference (in this sample, AJI239912). Second, I get the sum for every other ItemGroup in that scope. And at last, I divide to get a percentage.
Fabrication Sales =
CALCULATE( SUM(Sales[Sales]),
FILTER( ALL(Sales[ItemGroup], Sales[Reference], Sales[Code]), Sales[ItemGroup] = "Fabrication" && Sales[Reference] = SELECTEDVALUE(Sales[Reference]))
)
Other Sales =
CALCULATE( SUM(Sales[Sales]),
FILTER( ALL(Sales[ItemGroup], Sales[Reference], Sales[Code]), Sales[ItemGroup] <> "Fabrication" && Sales[Reference] = SELECTEDVALUE(Sales[Reference]))
)
Fabrication% = DIVIDE([Fabrication Sales],[Other Sales],0)
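For completeness, here is a minimal single-measure sketch of the same idea, written against a single flat Sales table with Reference, Code, ItemGroup and Sales columns as in the sample data (the measure name and flat layout are assumptions, so adjust them to the actual model). For each invoice the current Code appears on, it picks up the Fabrication sales of the whole invoice:
Actual Fabrication Sales (sketch) =
SUMX(
    VALUES( Sales[Reference] ),                 -- invoices the current Code appears on
    CALCULATE(
        SUM( Sales[Sales] ),
        Sales[ItemGroup] = "Fabrication",       -- only the fabrication lines
        ALLEXCEPT( Sales, Sales[Reference] )    -- look at the whole invoice, not just the current Code
    )
)
A matching measure without the ItemGroup filter gives the invoice totals per Code, and Fabrication% can then be computed with DIVIDE over the two.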
Related
I have a measure which displays the number of employees in relation to the date.
Each day the FactEmployee table is updated to reflect who is working. This means that my measure (obviously) can't display how many employees there are tomorrow.
I would like to persist the latest value (ie. todays value) into the future.
Data model
My (not perfect) measure
Count, employee :=
VAR today = TODAY()
VAR res =
IF (
MAX ( DimDate[fulldate] ) > today,
CALCULATE (
COUNT ( DimEmployee[emp_key] ),
FILTER ( ALL ( FactEmployee ), RELATED ( DimDate[fulldate] ) = today)
),
CALCULATE ( COUNT ( DimEmployee[emp_key] ), FactEmployee )
)
RETURN
res
Output
year-month count, emp
---------------------------
2020-01 182
2020-02 180
2020-03 174
2020-04 171
2020-05 171
2020-06 173
2020-07 172
2020-08 175
2020-09 172
Expected Output
year-month count, emp
--------------------------
2020-01 182
2020-02 180
2020-03 174
2020-04 171
2020-05 171
2020-06 173
2020-07 172
2020-08 175
2020-09 172
2020-10 172 <----repeated value from 2020-09
2020-11 172 <----repeated value from 2020-09
2020-12 172 <----repeated value from 2020-09
How can I fix my measure to get the missing values (October to December)?
I have replicated your model using a simplified version; I don't think you need DimEmployee in this case.
Assuming your model is like this
And your tables look like these:
FactEmployee
+----------+---------+
| date_key | emp_key |
+----------+---------+
| 20200101 | 1 |
+----------+---------+
| 20200102 | 1 |
+----------+---------+
| 20200103 | 1 |
+----------+---------+
| 20200104 | 1 |
+----------+---------+
| 20200105 | 1 |
+----------+---------+
| 20200101 | 2 |
+----------+---------+
| 20200102 | 2 |
+----------+---------+
| 20200104 | 2 |
+----------+---------+
| 20200101 | 3 |
+----------+---------+
| 20200102 | 3 |
+----------+---------+
| 20200103 | 3 |
+----------+---------+
| 20200104 | 3 |
+----------+---------+
| 20200105 | 4 |
+----------+---------+
DimDate
+------------+----------+
| Date | Date_key |
+------------+----------+
| 01/01/2020 | 20200101 |
+------------+----------+
| 02/01/2020 | 20200102 |
+------------+----------+
| 03/01/2020 | 20200103 |
+------------+----------+
| 04/01/2020 | 20200104 |
+------------+----------+
| 05/01/2020 | 20200105 |
+------------+----------+
| 06/01/2020 | 20200106 |
+------------+----------+
| 07/01/2020 | 20200107 |
+------------+----------+
I have created a calculation that follows these steps:
Compute the maximum date with valid (non-blank) values for the distinct count of emp_key, stored in the variable MaxDateKey.
An IF statement checks whether date_key is greater than MaxDateKey (in this case 20200106 and 20200107). For those dates, the calculation retrieves the distinct count of emp_key at MaxDateKey.
When the IF statement is false, the distinct count is calculated as usual.
Count =
VAR MaxDateKey =
CALCULATE (
LASTNONBLANK ( FactEmployee[date_key], DISTINCTCOUNT ( FactEmployee[emp_key] ) ),
REMOVEFILTERS ( DimDate[Date] )
)
VAR Result =
IF (
MAX ( DimDate[Date_key] ) > MaxDateKey,
CALCULATE (
DISTINCTCOUNT ( FactEmployee[emp_key] ),
ALL ( DimDate[Date] ),
DimDate[Date_key] = MaxDateKey
),
DISTINCTCOUNT ( FactEmployee[emp_key] )
)
RETURN
Result
The output is below. The value from the last valid date (5th of Jan) is applied to the subsequent dates (6th and 7th of Jan).
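In this sample, MaxDateKey resolves to 20200105; employees 1 and 4 have rows on that date, so the 6th and 7th of Jan both show a count of 2.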
For a line chart, you can enable the Forecast option in the Analytics pane, as shown below.
The output will look something like the below:
Consider the following tables - one of printers, the other of page counts from meter readings:
Printers
+------------+---------+--------+
| Printer ID | Make | Model |
+------------+---------+--------+
| 1 | Xerox | ABC123 |
| 2 | Brother | DEF456 |
| 3 | Xerox | ABC123 |
+------------+---------+--------+
Meter Read
+-------+------------+-----------+------------+
| Index | Printer ID | Poll Date | Mono Pages |
+-------+------------+-----------+------------+
| 1 | 1 | 1/1/2019 | 1000 |
| 2 | 2 | 1/1/2019 | 800 |
| 3 | 3 | 1/1/2019 | 33000 |
| 4 | 1 | 1/2/2019 | 1100 |
| 5 | 2 | 1/2/2019 | 850 |
| 6 | 3 | 1/2/2019 | 34000 |
| 7 | 1 | 1/3/2019 | 1200 |
| 8 | 2 | 1/3/2019 | 900 |
| 9 | 3 | 1/3/2019 | 35000 |
| 10 | 1 | 1/4/2019 | 1400 |
| 11 | 2 | 1/4/2019 | 950 |
| 12 | 3 | 1/4/2019 | 36000 |
| 13 | 1 | 1/5/2019 | 1800 |
| 14 | 2 | 1/5/2019 | 1000 |
| 15 | 3 | 1/5/2019 | 36500 |
| 16 | 1 | 1/6/2019 | 2000 |
| 17 | 2 | 1/6/2019 | 1050 |
| 18 | 3 | 1/6/2019 | 37500 |
| 19 | 1 | 1/7/2019 | 2100 |
| 20 | 2 | 1/7/2019 | 1100 |
| 21 | 3 | 1/7/2019 | 39000 |
| 22 | 1 | 1/8/2019 | 2200 |
| 23 | 2 | 1/8/2019 | 1150 |
| 24 | 3 | 1/8/2019 | 40000 |
+-------+------------+-----------+------------+
In my Power BI report, I have a Dates table:
Dates = CALENDAR(DATE(2019, 1, 1), DATE(2019, 1, 31))
that I am using as a slicer. The goal is to end up with a delta of Mono Pages during the date range from the slicer. I'm able to grab the difference between each meter read with a fairly complicated calculated column on the Meter Read table:
PagesSinceLastPoll =
IF(
ISBLANK(
LOOKUPVALUE(
'Meter Read'[Mono Pages],
'Meter Read'[Index], CALCULATE(
MAX(
'Meter Read'[Index]
), FILTER(
'Meter Read',
'Meter Read'[Index] < EARLIER('Meter Read'[Index])
&& 'Meter Read'[Printer ID] = EARLIER('Meter Read'[Printer ID] )
)
)
)
),
BLANK(),
'Meter Read'[Mono Pages] -
LOOKUPVALUE(
'Meter Read'[Mono Pages],
'Meter Read'[Index], CALCULATE(
MAX(
'Meter Read'[Index]
), FILTER(
'Meter Read',
'Meter Read'[Index] < EARLIER('Meter Read'[Index])
&& 'Meter Read'[Printer ID] = EARLIER('Meter Read'[Printer ID] )
)
)
)
)
But the performance over 10,000+ rows is pretty bad. I'd like to grab the max and min values for a device in the filtered date range and just subtract instead, but I'm having a hard time getting the right value. My DAX so far keeps getting me the max value from the ENTIRE table, not the table filtered on the dates in my slicer. Everything I've tried so far is some variation on:
MaxInRange =
CALCULATE (
MAX ( 'Meter Read'[Mono Pages] ),
FILTER ( 'Meter Read', 'Meter Read'[Printer ID] = Printers[Printer ID] )
)
To summarize: If I have a slicer starting 1/2/2019 and ending 1/5/2019, the max value for Printer ID 1 should read 1800, not 2200.
Thoughts?
The calculated column can be done more efficiently like this:
PagesSinceLastPoll =
VAR PrevRow =
TOPN(1,
FILTER('Meter Read',
'Meter Read'[PrinterID] = EARLIER('Meter Read'[PrinterID]) &&
'Meter Read'[PollDate] < EARLIER('Meter Read'[PollDate])
),
'Meter Read'[PollDate]
)
RETURN 'Meter Read'[MonoPages] - SELECTCOLUMNS(PrevRow, "Pages", 'Meter Read'[MonoPages])
Using that, the number of pages between two dates can just sum this column on those dates.
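For instance, a plain measure over that column (the measure name here is just an illustration) would be:
PagesInRange = SUM ( 'Meter Read'[PagesSinceLastPoll] )
Sliced by the Dates table, it adds up only the deltas whose poll dates fall inside the selected range.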
If you want to skip that and go straight to a measure, try something like this:
PagesInPeriod =
VAR StartDate = FIRSTDATE(Dates[Date])
VAR EndDate = LASTDATE(Dates[Date])
RETURN
SUMX(
VALUES('Meter Read'[PrinterID]),
CALCULATE(
MAX('Meter Read'[MonoPages]),
Dates[Date] = EndDate
)
-
CALCULATE(
MAX('Meter Read'[MonoPages]),
Dates[Date] < StartDate
)
)
Note that if you use Dates[Date] = StartDate, then you'll be off. You want to calculate the max pages before your first included date.
Both of these methods should give the same result:
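For example, with a slicer from 1/2/2019 to 1/5/2019, Printer ID 1 gives 1800 (the max up to the end date) minus 1000 (the max before 1/2/2019, i.e. the 1/1/2019 reading) = 800, which is the same as summing PagesSinceLastPoll for 1/2 through 1/5 (100 + 100 + 200 + 400 = 800).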
Alexis' measure is the correct way to handle this (my thanks!), but I made a very small edit. Since it is possible that a reading was not taken on the end date, we need to look on or before that date; otherwise it treats the max on the end date as zero. The final code then becomes:
PagesInPeriod =
VAR StartDate = FIRSTDATE(Dates[Date])
VAR EndDate = LASTDATE(Dates[Date])
RETURN
SUMX(
VALUES('Meter Read'[PrinterID]),
CALCULATE(
MAX('Meter Read'[MonoPages]),
Dates[Date] <= EndDate
)
-
CALCULATE(
MAX('Meter Read'[MonoPages]),
Dates[Date] < StartDate
)
)
I would like to calculate the average yield between two related tables as of a given date.
Table1 Table2
+-------------------------------+ +-------------------------------+
| ID TradeDate Amount | | ID TradeDate Yield |
+-------------------------------+ +-------------------------------+
| 1 2018/11/30 100 | | 1 2018/11/8 2.2% |
| 1 2018/11/8 101 | | 1 2018/8/8 2.1% |
| 1 2018/10/31 102 | | 1 2018/5/8 2.0% |
| 1 2018/9/30 103 | | 2 2018/9/8 1.7% |
| 2 2018/11/30 200 | | 2 2018/6/8 1.6% |
| 2 2018/10/31 203 | | 2 2018/3/8 1.5% |
| 2 2018/9/30 205 | | 3 2018/10/20 1.7% |
| 3 2018/11/30 300 | | 3 2018/7/20 1.6% |
| 3 2018/10/31 300 | | 3 2018/4/20 1.6% |
| 3 2018/9/30 300 | +-------------------------------+
+-------------------------------+
I create a table named 'DateList' and use a slicer to select a specified date.
Screen Shot DateList.
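The question doesn't show how DateList is built; a minimal sketch would be a plain calendar covering the data range, for example:
DateList = CALENDAR ( DATE ( 2018, 1, 1 ), DATE ( 2018, 12, 31 ) )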
I want to achieve the following result:
as of *11/9/2018*
+-----------------------------------------------------------------+
| ID LastDate Value LatestYieldDate LastYield |
+-----------------------------------------------------------------+
| 1 2018/11/8 101 2018/11/8 2.2% |
| 2 2018/10/31 203 2018/9/8 1.7% |
| 3 2018/10/31 300 2018/10/20 1.7% |
+-----------------------------------------------------------------+
| Total 604 1.7836% |
+-----------------------------------------------------------------+
Currently, I use the following formulas to achieve a partial result.
I create two measures in Table1:
LastDate =
VAR SlicerDate = MIN(DateList[Date])
VAR MinDiff =
MINX(FILTER(ALL(Table1),Table1[ID] IN VALUES(Table1[ID])),
ABS(SlicerDate - Table1[TradeDate]))
RETURN
MINX(FILTER(ALL(Table1),Table1[ID] IN VALUES(Table1[ID])
&& ABS(SlicerDate - Table1[TradeDate]) = MinDiff),
Table1[TradeDate])
Value = CALCULATE(SUM(Table1[Amount]), FILTER(Table1, Table1[TradeDate] = [LastDate]))
And two measures in Table2:
LastYieldDate =
VAR SlicerDate = MIN(DateList[Date])
VAR MinDiff =
MINX(FILTER(ALL(Table2),Table2[ID] IN VALUES(Table2[ID])),
ABS(SlicerDate - Table2[TradeDate]))
RETURN
MINX(FILTER(ALL(Table2),Table2[ID] IN VALUES(Table2[ID])
&& ABS(SlicerDate - Table2[TradeDate]) = MinDiff),
Table2[TradeDate])
LastYield = CALCULATE(SUM(Table2[Yield]), FILTER(Table2,
Table2[TradeDate] = [LastYieldDate]))
I have no idea how to calculate the right average yield between the two tables.
Here is my current result.
Screen Shot Current Result.
You'll first need to create a bridge table for the ID values so you can work with both tables more easily.
IDList = VALUES(Table1[ID])
Now we'll use IDList[ID] on our visual instead of the ID from one of the other tables.
The measure we use for the average last yield is a basic sum-product average:
LastYieldAvg =
DIVIDE(
SUMX(IDList, [Value] * [LastYield]),
SUMX(IDList, [Value])
)
Note that when there is only a single ID value, it simplifies to
[Value] * [LastYield] / [Value] = [LastYield]
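As a check against the expected output above: (101 * 2.2% + 203 * 1.7% + 300 * 1.7%) / 604 = 10.773 / 604 ≈ 1.7836%, which matches the Total row.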
Objective: Sum up the nearest date's value as of a given date
Here is my data
Table: MyData
+-------------------------------+
| ID TradeDate Value |
+-------------------------------+
| 1 2018/11/30 105 |
| 1 2018/11/8 101 |
| 1 2018/10/31 100 |
| 1 2018/9/30 100 |
| 2 2018/11/30 200 |
| 2 2018/10/31 201 |
| 2 2018/9/30 205 |
| 3 2018/11/30 300 |
| 3 2018/10/31 305 |
| 3 2018/9/30 301 |
+-------------------------------+
I create a table named 'DateList' and use a slicer to select a specified date.
DateList Slicer
I want to achieve the result as follows:
as of *11/9/2018*
+-----------------------------------+
| ID TradeDate Value |
+-----------------------------------+
| 1 2018/11/8 101 |
| 2 2018/10/31 201 |
| 3 2018/10/31 305 |
+-----------------------------------+
| Total 607 |
+-----------------------------------+
Currently, I try the following steps to achieve the above result.
First, I find the nearest date from the 'MyData' table using a new measure:
MyMaxDate = CALCULATE(MAX(MyData[TradeDate]),Filter(MyData, MyData[TradeDate] <= FIRSTDATE(DateList[Date]) ))
Second, I create a new measure "MySum" to sum up the values where [TradeDate] equals [MyMaxDate]:
MySum = CALCULATE(SUM(MyData[Value]),FILTER(MyData, MyData[TradeDate] = [MyMaxDate]))
Third, I create a matrix to show the result (see Result).
Unfortunately, the result of 1313 is different from my goal of 607.
So, how can I fix my DAX formula to achieve the right result?
Many thanks.
You can calculate the closest date by taking a min over the difference in dates and then taking the minimal date with that minimal difference.
MyDate =
VAR SlicerDate = MIN(DateList[Date])
VAR MinDiff =
MINX(
FILTER(ALL(MyData),
MyData[ID] IN VALUES(MyData[ID])
),
ABS(SlicerDate - MyData[TradeDate]))
RETURN
MINX(
FILTER(ALL(MyData),
MyData[ID] IN VALUES(MyData[ID])
&& ABS(SlicerDate - MyData[TradeDate]) = MinDiff
),
MyData[TradeDate])
From there you can create the summing measure fairly easily:
MySum = CALCULATE(SUM(MyData[Value]), FILTER(MyData, MyData[TradeDate] = [MyDate]))
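As a sanity check with the 2018/11/9 slicer date: ID 1's closest TradeDate is 2018/11/8 (value 101), while IDs 2 and 3 are closest to 2018/10/31 (201 and 305), so the total is 101 + 201 + 305 = 607, matching the expected result.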
I have a table with the following headers:
Dates | Category | Value
1/1/00 | A | 100
1/1/00 | B | 200
1/2/00 | A | 300
1/2/00 | B | 100
What I would like to do is to be able to add a custom column with the daily rank as such:
Dates | Category | Value | Rank
1/1/00 | A | 100 | 1
1/1/00 | B | 200 | 2
1/2/00 | A | 300 | 2
1/2/00 | B | 100 | 1
My goal is to run calcs over the top for average rank, etc. How would I write the DAX code for this column?
Cheers
Try this as a calculated column:
Column =
VAR rankValue = 'table'[Value]
RETURN
CALCULATE (
RANK.EQ ( rankValue, 'table'[Value], ASC ),
ALLEXCEPT ( 'table', 'table'[Dates] )
)
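The ALLEXCEPT keeps only the Dates filter, so RANK.EQ ranks each row's Value against the other rows of the same date; on 1/1/00 that gives A = 1 and B = 2, and on 1/2/00 it gives B = 1 and A = 2, matching the expected output. If the column is named Rank as in the question, an average-rank measure over it could then simply be (the measure name is illustrative):
Average Rank = AVERAGE ( 'table'[Rank] )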