Calculated columns based on the past 3 months in Power Query - Power BI

First time trying to use M in Power Query... what I have is this table.
I need to create two columns that, for each row (combination of CD_LOJA x CD_PRODUTO), return the sum of QT_VENDA for that combination divided by the number of days in the past 3 months. The other column is much the same, but with the sum of VL_VENDA_LIQ instead.
I.e.: for the first row I want to sum up all QT_VENDA that matches CD_PRODUTO = 1001930 AND CD_LOJA = 151 in the past 3 months (the DATE column has daily data) and divide it by the number of days in those 3 months.
Is there a way to do so? And how do I go about this?
Thanks in advance.

In Power Query (M), something along these lines:
let
    Source = Excel.CurrentWorkbook(){[Name="Table1"]}[Content],
    #"Changed Type" = Table.Buffer(Table.TransformColumnTypes(Source, {{"DATA", type date}, {"CD_PRODUTO", type text}, {"CD_LOJA", type text}, {"QT_VENDA", Int64.Type}, {"VL_VENDA_LIQ", Int64.Type}})),
    #"Added Custom" = Table.AddColumn(#"Changed Type", "QT_VENDA_90", (i) => List.Average(Table.SelectRows(#"Changed Type", each [CD_PRODUTO] = i[CD_PRODUTO] and [CD_LOJA] = i[CD_LOJA] and [DATA] <= i[DATA] and [DATA] >= Date.AddDays(i[DATA], -90))[QT_VENDA]), type number),
    #"Added Custom2" = Table.AddColumn(#"Added Custom", "VL_VENDA_LIQ_90", (i) => List.Average(Table.SelectRows(#"Changed Type", each [CD_PRODUTO] = i[CD_PRODUTO] and [CD_LOJA] = i[CD_LOJA] and [DATA] <= i[DATA] and [DATA] >= Date.AddDays(i[DATA], -90))[VL_VENDA_LIQ]), type number)
in
    #"Added Custom2"
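As a cross-check of the windowing logic, here is a rough Python sketch of what the question literally asks for: the trailing-90-day sum per store/product divided by 90 (the rows and values below are made up). Note that the M code above uses List.Average over the matching rows instead, which only equals sum ÷ 90 when every day in the window has exactly one row.

```python
from datetime import date, timedelta

# Made-up sample rows: (DATA, CD_LOJA, CD_PRODUTO, QT_VENDA)
rows = [
    (date(2023, 1, 1), "151", "1001930", 10),
    (date(2023, 1, 2), "151", "1001930", 20),
    (date(2023, 4, 1), "151", "1001930", 30),
]

def qt_venda_90(target):
    """Trailing-90-day sum of QT_VENDA for the row's store/product, divided by 90."""
    d, loja, produto, _ = target
    total = sum(q for (dd, l, p, q) in rows
                if l == loja and p == produto
                and d - timedelta(days=90) <= dd <= d)
    return total / 90
```

For the last sample row (2023-04-01) all three rows fall inside the 90-day window, so the result is 60 / 90.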

You can try a measure like the one below:
your_expected_value =
VAR current_row_data = MIN ( your_table_name[DATA] )
-- I guess the column name should be Date instead
VAR current_row_data_minus_90_day = MIN ( your_table_name[DATA] ) - 90
VAR current_row_cd_produto = MIN ( your_table_name[CD_PRODUTO] )
VAR current_row_cd_loja = MIN ( your_table_name[CD_LOJA] )
RETURN
    CALCULATE (
        SUM ( your_table_name[QT_VENDA] ),
        FILTER (
            ALL ( your_table_name ),
            your_table_name[DATA] >= current_row_data_minus_90_day
                && your_table_name[DATA] <= current_row_data
                && your_table_name[CD_PRODUTO] = current_row_cd_produto
                && your_table_name[CD_LOJA] = current_row_cd_loja
        )
    ) / 90
-- Here 90 is used as a static 3-month window

In DAX, how can I count values in one column that equal the value of another column?

I have two columns. I want a table that shows the number of "Assign Date" values in each "Week Start" week, so for "Week Start" 1/1/2022 it should be 0, for 1/8/2022 it should be 2, and it should be 1 for 1/15/2022 and 1/22/2022.
I have two date columns:
Week Start    Assign Date
1/1/2022      1/8/2022
1/8/2022      1/8/2022
1/15/2022     1/15/2022
1/22/2022     1/22/2022
I want one date column and one count column
Week Start    Assign Count
1/1/2022      0
1/8/2022      2
1/15/2022     1
1/22/2022     1
I am very new to DAX and I assume that I am overcomplicating the solution, but I can't figure out where to start. Because I am learning DAX, I would like to get this in a DAX measure.
Or this measure:
Assign Count :=
VAR ThisWeekStart =
    MIN ( Table1[Week Start] )
RETURN
    0
        + COUNTROWS (
            FILTER (
                ALL ( Table1 ),
                Table1[Assign Date] = ThisWeekStart
            )
        )
which you can place in a visual together with the Week Start field.
There may be more efficient M code, but what I did here was to use List.Accumulate to count the number of entries that fall in the correct range: >= Week Start and < Week Start + 7 days.
M Code
let
    Source = Excel.CurrentWorkbook(){[Name="Table3"]}[Content],
    #"Changed Type" = Table.TransformColumnTypes(Source, {{"Week Start", type date}, {"Assign Date", type date}}),
    #"Added Custom" = Table.AddColumn(#"Changed Type", "Assign Count",
        each List.Accumulate(
            #"Changed Type"[Assign Date],
            0,
            (state, current) =>
                if current >= [Week Start] and current < Date.AddDays([Week Start], 7) then state + 1 else state)),
    #"Removed Columns" = Table.RemoveColumns(#"Added Custom", {"Assign Date"})
in
    #"Removed Columns"
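Both the DAX measure and the M code above implement the same per-week count; restated as a small Python sketch on the question's sample dates (the measure matches Assign Date to the week start exactly, while the M code counts the full 7-day range; on this data both agree):

```python
from datetime import date, timedelta

# Sample data from the question
week_starts = [date(2022, 1, 1), date(2022, 1, 8), date(2022, 1, 15), date(2022, 1, 22)]
assign_dates = [date(2022, 1, 8), date(2022, 1, 8), date(2022, 1, 15), date(2022, 1, 22)]

# Count the assign dates that fall inside each 7-day week bucket,
# mirroring the List.Accumulate step
counts = {ws: sum(ws <= d < ws + timedelta(days=7) for d in assign_dates)
          for ws in week_starts}
# counts per week: 0, 2, 1, 1 -- matching the expected output table
```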

Group by with a SUM function in M (Power BI)

In Power Query I am trying to sum quantité livrée by BL 2 (I have several rows with the same BL 2 number, but I can't delete the duplicates given the level of detail in the data).
The data looks like:
I tried:
=Table.Group(#"Somme quantité livrée", {"BL 2"},{{"quantité livrée", each List.Sum([#"quantitée livrée"]), type number}}
but the function doesn't work; I keep getting the same error message, "Token RightParen expected", but I don't see what I should do here (or even whether it's the right function for what I expect).
Basically I want to obtain the sum of quantité livrée, quantité retournée, and quantité facturée per distinct BL 2.
Any idea?
I tried the Group By table proposed in the answers, but using it I lost the other columns:
before:
And after:
Why not use the group interface to create the code for you?
#"Grouped Rows" = Table.Group(#"previous_step_name", {"BL 2"}, {
    {"quantité livrée", each List.Sum([quantité livrée]), type number},
    {"quantité retournée", each List.Sum([quantité retournée]), type number},
    {"quantité facturée", each List.Sum([quantité facturée]), type number}})
== == ==
If you want to retain the other columns, then in the group use an All Rows operation.
Afterwards, expand the desired columns back into the table using the arrows at the top right of the new column.
== == ==
A totally different way to do this is just adding three custom columns that sum ql, qr and qf based on the BL 2 of each row. It does NOT do any grouping, so for each BL 2 you'd see the same total on each row.
let
    Source = Excel.CurrentWorkbook(){[Name="Table1"]}[Content],
    #"Added Custom" = Table.AddColumn(Source, "sum_ql", (i) => List.Sum(Table.SelectRows(Source, each [bl 2] = i[bl 2])[ql]), type number),
    #"Added Custom2" = Table.AddColumn(#"Added Custom", "sum_qr", (i) => List.Sum(Table.SelectRows(Source, each [bl 2] = i[bl 2])[qr]), type number),
    #"Added Custom3" = Table.AddColumn(#"Added Custom2", "sum_qf", (i) => List.Sum(Table.SelectRows(Source, each [bl 2] = i[bl 2])[qf]), type number)
in
    #"Added Custom3"
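To contrast the two approaches, here is a Python sketch with made-up rows: Table.Group collapses to one total per BL 2, while the three custom columns keep every detail row and repeat its BL 2 total alongside it.

```python
from collections import defaultdict

# Made-up rows: (BL 2, quantité livrée)
rows = [("BL001", 5), ("BL001", 3), ("BL002", 7)]

# Grouped: one total per BL 2, like Table.Group
totals = defaultdict(int)
for bl, ql in rows:
    totals[bl] += ql

# Ungrouped: each row keeps its detail plus the total for its BL 2,
# like the added custom columns
per_row = [(bl, ql, totals[bl]) for bl, ql in rows]
# per_row -> [("BL001", 5, 8), ("BL001", 3, 8), ("BL002", 7, 7)]
```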

Power Query - COUNTIFS copycat with performance issue. Which is a better approach?

The scenario and data structure are very simple.
I have a list with the product code and the month in which that product was retailed. An example of such data can be seen in the first two columns (in green) in the image below.
Then I need to check, for each product, whether it was also retailed in the last month, the last 3 months, or the last 12 months. The result would be the next three columns (in yellow) in the image.
These calculations (the yellow columns) are easy to compute in Excel with some IF and COUNTIFS formulas, but when migrating them to Power BI I'm struggling with the performance of my code in Power Query. As there are thousands of products for each month, the Power Query calculation is taking too long.
Check the code I've designed below. The snapshot is for the second yellow column, which flags whether that product was retailed in the last 3 months.
In essence, what I'm doing is adding a calculated column that counts the rows of a table filtered by the product code and the relevant date range.
What would be a better approach in terms of performance to get the information I need?
Thank you.
Code:
// Add a calculated column.
AdicionarHits03Meses = Table.AddColumn(
    AdicionarHits01Mes,
    "Hit nos últimos 3 meses?",
    (r) =>
        // Check if...
        if
            // Returns the row count of a table.
            Table.RowCount(
                // Returns a table filtered by the condition.
                Table.SelectRows(
                    ChangeType,
                    // Condition:
                    (q) =>
                        // Same Product Code.
                        q[#"Product Code"] = r[#"Product Code"]
                        and
                        // Check the retail month.
                        q[#"Retail month"] <= Date.AddMonths(r[#"Retail month"], -1) and
                        q[#"Retail month"] >= Date.AddMonths(r[#"Retail month"], -3)
                )
            ) = 0
        then
            // No retail found.
            0
        else
            // Retail found.
            1,
    Int64.Type
)
You can likely improve performance with some clever self-merges.
See if this makes sense to you:
let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMlTSUfJKzFMwMjAyVIrViVYyQhcwRhcAaXFLTUIV8E0sQjXDN7ESzdDSHKhALAA=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [#"Product Code" = _t, #"Retail Month" = _t]),
    Original = Table.TransformColumnTypes(Source, {{"Product Code", Int64.Type}, {"Retail Month", type date}}),
    #"Added Offset Lists" = Table.AddColumn(Original, "Offset", each {1..12}),
    #"Expanded Offset Column" = Table.ExpandListColumn(#"Added Offset Lists", "Offset"),
    #"Added Prev Column" = Table.AddColumn(#"Expanded Offset Column", "Prev", each Date.AddMonths([Retail Month], -[Offset]), type date),
    #"Inner Join Prev to Original" = Table.NestedJoin(#"Added Prev Column", {"Product Code", "Prev"}, Original, {"Product Code", "Retail Month"}, "Retail", JoinKind.Inner),
    #"Merge Original to Prev" = Table.NestedJoin(Original, {"Product Code", "Retail Month"}, #"Inner Join Prev to Original", {"Product Code", "Retail Month"}, "Min Offset", JoinKind.LeftOuter),
    #"Expanded Min Offset" = Table.TransformColumns(#"Merge Original to Prev", {{"Min Offset", each List.Min([Offset]), Int64.Type}}),
    #"Added Last Month" = Table.AddColumn(#"Expanded Min Offset", "Retail last month", each if [Min Offset] = 1 then "Yes" else "No", type text),
    #"Added Last 3 Months" = Table.AddColumn(#"Added Last Month", "Retailed since last 3 months", each if [Min Offset] <> null and [Min Offset] <= 3 then "Yes" else "No", type text),
    #"Added Last 12 Months" = Table.AddColumn(#"Added Last 3 Months", "Retailed since 12 months", each if [Min Offset] <> null and [Min Offset] <= 12 then "Yes" else "No", type text)
in
    #"Added Last 12 Months"
I don't have time to fully explain it, but the outline is roughly as follows:
1. Expand each row into 12 rows of prior months.
2. Match up those prior months with rows from the original table.
3. Join the original rows with all of the matches found for that row.
4. Find the most recent match (minimal offset) for each matching set.
5. Define the 1, 3, and 12-month lookback columns using this offset.
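The outline above can be sketched in Python (with made-up product/month data) to show how the minimal offset drives the three lookback flags:

```python
from datetime import date

# Made-up (product, retail month) pairs
retail = {("A", date(2022, 3, 1)), ("A", date(2022, 5, 1)), ("B", date(2022, 4, 1))}

def add_months(d, n):
    """Shift a first-of-month date by n months (n may be negative)."""
    m = d.month - 1 + n
    return date(d.year + m // 12, m % 12 + 1, d.day)

def min_offset(product, month):
    """Smallest k in 1..12 such that the product was also retailed k months earlier."""
    offsets = [k for k in range(1, 13) if (product, add_months(month, -k)) in retail]
    return min(offsets) if offsets else None

off = min_offset("A", date(2022, 5, 1))  # nearest prior retail was March -> 2
flags = {
    "last month": off == 1,
    "last 3 months": off is not None and off <= 3,
    "last 12 months": off is not None and off <= 12,
}
```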

How to calculate the frequency of received data from a time column in Power BI?

I have a table with received times like the one below:
As you see, the intervals between received rows differ: they are 1000 ms, 1001 ms, 998 ms.
How can I calculate the average frequency of the received times in ms?
I suggest using Power Query to:
add a column that is the original date_time column offset by one row,
then add another column showing the difference between the current row and the previous row.
This method, at least in M code, is faster than using an Index column to refer to the previous row.
Then you can do your mathematical analyses.
let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("dc27DcAgDAXAVRA1An9iO2EVxP5rgEQZXn3FjZGjsTUh4cRdqctTwy3P8heD4lv8KgHl3RJX+dCjBIXRo3JkLg==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [date_time = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source, {{"date_time", type datetime}}),
    // add a column containing the previous row's data
    // usually faster than using an Index column
    prevList = {null} & List.RemoveLastN(#"Changed Type"[date_time]),
    tbl1 = Table.FromColumns(
        {#"Changed Type"[date_time], prevList},
        {"date_time", "Prev Row"}
    ),
    #"Added Custom" = Table.AddColumn(tbl1, "Difference", each /*if [Prev Row] = null then 0
        else*/ Duration.TotalSeconds([date_time] - [Prev Row])),
    #"Changed Type1" = Table.TransformColumnTypes(#"Added Custom", {
        {"date_time", type datetime}, {"Prev Row", type datetime}, {"Difference", type number}})
in
    #"Changed Type1"
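The shift-by-one trick ({null} & List.RemoveLastN(...)) can be illustrated in Python with made-up receive times; averaging the row-to-row differences then gives the mean interval the question asks for.

```python
# Made-up receive times in milliseconds
timestamps_ms = [0, 1000, 2001, 2999]

# Shift the list down one row, like {null} & List.RemoveLastN(...)
prev = [None] + timestamps_ms[:-1]

# Difference between each row and the previous one
diffs = [t - p for t, p in zip(timestamps_ms, prev) if p is not None]
avg_interval = sum(diffs) / len(diffs)  # (1000 + 1001 + 998) / 3
```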
In Power Query, add an Index column to your data so that you can address single rows. Then, with DAX, add a calculated column to the table:
Milliseconds =
VAR NextIndex = 'Table'[Index] + 1
VAR MaxIndex = MAX ( 'Table'[Index] )
VAR Difference =
    IF (
        'Table'[Index] = MaxIndex,
        BLANK (),
        'Table'[date_time]
            - CALCULATE (
                VALUES ( 'Table'[date_time] ),
                FILTER (
                    ALL ( 'Table' ),
                    'Table'[Index] = NextIndex
                )
            )
    )
RETURN
    CONVERT ( ABS ( Difference * 3600 * 24 * 1000 ), INTEGER )
Now you are looking for AVERAGE( Milliseconds ).
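The CONVERT line works because DAX datetime subtraction yields a value in days, so multiplying by 3600 * 24 * 1000 turns it into milliseconds; a quick Python check of the factor:

```python
# A 1-second difference expressed in days, as DAX datetime subtraction returns it
diff_days = 1 / (24 * 3600)

# Mirrors ABS(Difference * 3600 * 24 * 1000): days -> milliseconds
ms = abs(diff_days * 3600 * 24 * 1000)  # approximately 1000 ms
```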
Note: Things would have been easier if you had provided copyable data instead of a screenshot.

Power BI time-series duration

In Power BI, I have a simple table with 3 columns:
let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("TYzBCcAwDAN38dugyk5bdxaT/deICzXN77hDyhSKCkHYwafwbJyaYiUc9I4XaH/1MgHeXQO+bcf7ux0XW3x5Lg==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type text) meta [Serialized.Text = true]) in type table [id = _t, start = _t, end = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source, {{"id", Int64.Type}, {"start", type date}, {"end", type date}})
in
    #"Changed Type"
From this I can create the following visual.
My challenge is to calculate the total duration in days of the time series, based on the filter selected above. Any help would be appreciated.
I have tried the following DAX formula, but it gives me crazy results, as shown above.
YTDDuration =
VAR start_Date = FIRSTDATE ( ALLSELECTED ( 'CALENDAR'[Date] ) )
VAR end_Date = LASTDATE ( ALLSELECTED ( 'CALENDAR'[Date] ) )
VAR current_Start = MAX ( Table2[start] )
VAR current_end = MAX ( Table2[end] )
VAR bigif =
    IF (
        current_end > start_Date && current_Start < end_Date,
        DATEDIFF ( MAX ( start_Date, current_Start ), MIN ( end_Date, current_end ), DAY ),
        0
    )
RETURN
    CALCULATE ( SUMX ( Table2, bigif ), FILTER ( ALL ( Table2 ), Table2[start] <= MAX ( Table2[end] ) ) )
Expected output would be:
The key here is to account for gaps in dates and consolidate overlapping dates.
You can iterate through your calendar table and count the number of days where that day falls into one of the id time periods.
YTDDuration =
VAR current_Start = CALCULATE ( MIN ( Table2[start] ), ALL ( Table2 ) )
VAR current_end = MAX ( Table2[end] )
RETURN
    SUMX (
        FILTER ( 'CALENDAR', 'CALENDAR'[Date] > current_Start && 'CALENDAR'[Date] <= current_end ),
        MAXX ( ALL ( Table2 ), IF ( [Date] > [start] && [Date] <= [end], 1, 0 ) )
    )
This starts at the minimal start date and adds 1 for each day for which that date is between the start and end of some row in Table2. If no row contains that date, the max is over just zeros and returns zero. If one or more rows match, you get a max of one and we count that day.
YTD Duration by end:
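The day-counting idea (each calendar day is counted once if it falls inside at least one id's period, which consolidates overlaps and skips gaps automatically) can be sketched in Python with made-up periods, using the same Date > start and Date <= end convention as the measure:

```python
from datetime import date, timedelta

# Made-up, overlapping id periods: (start, end)
periods = [(date(2022, 1, 1), date(2022, 1, 10)),
           (date(2022, 1, 5), date(2022, 1, 12))]

start = min(s for s, _ in periods)
end = max(e for _, e in periods)

days = 0
d = start
while d <= end:
    # count the day once if at least one period covers it
    if any(s < d <= e for s, e in periods):
        days += 1
    d += timedelta(days=1)
# days == 11: Jan 2 .. Jan 12 counted once despite the overlap
```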