Rolling Avg for Last 21 days - powerbi

How can I create a measure to calculate the average spend for the last 21 days? My formula below returns a null:
Average of Spend rolling average =
VAR __LAST_DATE = LASTDATE('Raw_Data'[Date_1].[Day])
RETURN
    AVERAGEX(
        DATESBETWEEN(
            'Raw_Data'[Date_1].[Day],
            DATEADD(__LAST_DATE, -21, DAY),
            DATEADD(__LAST_DATE, 0, DAY)
        ),
        CALCULATE(AVERAGE('Raw_Data'[Spend]))
    )
I can't generate a value; it returns a 0.

Other than the use of [Date_1].[Day] (the auto date/time hierarchy rather than the date column itself) and an off-by-one error with DATESBETWEEN (-21 to 0 spans 22 days), that looks fine. E.g. this:
Average of Spend rolling average =
VAR __LAST_DATE = LASTDATE('Raw_Data'[Date_1])
RETURN
    AVERAGEX(
        DATESBETWEEN(
            'Raw_Data'[Date_1],
            DATEADD(__LAST_DATE, -20, DAY),
            DATEADD(__LAST_DATE, 0, DAY)
        ),
        CALCULATE(AVERAGE('Raw_Data'[Spend]))
    )
Appears to work correctly for me with this sample data:
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMjIwMtI1MAQiJR0lU6VYHSQhI6CQoQGqmDFQzBhNzASkFU3MFKwXKBgLAA==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Date_1 = _t, Spend = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Date_1", type date}, {"Spend", Int64.Type}})
in
#"Changed Type"

Related

Calculating seconds from a text column which represents a time in hours/minutes/seconds

In Power Query I have a column which, for example, looks like this:
9h8m4s
This means 9 hours, 8 minutes and 4 seconds. The challenge is that I want to convert this value to the sum of the hours, minutes and seconds expressed in seconds only, which equals 32884 seconds.
Any ideas about how to convert it in Power Query for Power BI?
let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WssywyDUpVoqNBQA=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Column1 = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Column1", type text}}),
    #"Added Custom" = Table.AddColumn(#"Changed Type", "Custom",
        each
            let
                h = Number.FromText(Text.BeforeDelimiter([Column1], "h")),
                m = Number.FromText(Text.BetweenDelimiters([Column1], "h", "m")),
                s = Number.FromText(Text.BetweenDelimiters([Column1], "m", "s"))
            in
                (h * 60 * 60) + (m * 60) + s)
in
    #"Added Custom"
Split the column by the separators "h", "m" and "s" and combine them into a new column, multiplying hours by 3600 and minutes by 60.
let
Source = Table.FromValue("9h8m4s"),
#"Added Custom1" = Table.AddColumn(
Source, "Seconds", each
Number.FromText(Text.BeforeDelimiter([Value],"h")) * 3600
+ Number.FromText(Text.BetweenDelimiters([Value], "h", "m")) * 60
+ Number.FromText(Text.BetweenDelimiters([Value], "m", "s"))
),
#"Changed Type" = Table.TransformColumnTypes(
#"Added Custom1",{{"Seconds", type number}})
in
#"Changed Type"

How to handle data over time without a date column?

I have a dataset which has many columns listing multiple years' worth of values, for example:
Country | 2020 Rank X | 2020 Rank Y | 2021 Rank X
EU      | 1           | 2           | 3
USA     | 2           | 3           | 4
Etc. Each year has about 6 values for each country and there are 4 years of data, approx 160 rows.
My problem is that when attempting to display data over time, there is no functioning "year" column or any data Power BI recognises as a date. How do I convert the year info in the column names into usable/filterable year information?
You should unpivot and pivot again on the Power Query side; then you can use DATE(<year>, <month>, <day>) on the Power BI side.
Try:
let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45Wcg1VUNJRMgQRRiDCWClWJ1opNNgRIQIiTJRiYwE=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Country = _t, #"2020 Rank X" = _t, #"2020 Rank Y" = _t, #"2021 Rank X" = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Country", type text}, {"2020 Rank X", Int64.Type}, {"2020 Rank Y", Int64.Type}, {"2021 Rank X", Int64.Type}}),
    #"Unpivoted Other Columns" = Table.UnpivotOtherColumns(#"Changed Type", {"Country"}, "Attribute", "Value"),
    // pull the year out of the former column header, e.g. "2020 Rank X" -> 2020
    #"Added Custom" = Table.AddColumn(#"Unpivoted Other Columns", "Year", each Number.FromText(Text.Start([Attribute], 4)), Int64.Type),
    // drop the year prefix so each year's values pivot back into shared columns, e.g. "Rank X"
    #"Trimmed Attribute" = Table.TransformColumns(#"Added Custom", {{"Attribute", each Text.Trim(Text.Mid(_, 4)), type text}}),
    #"Pivoted Column" = Table.Pivot(#"Trimmed Attribute", List.Distinct(#"Trimmed Attribute"[Attribute]), "Attribute", "Value")
in
    #"Pivoted Column"

Power BI: How to remove duplicate rows under specific conditions, keeping the latest entry?

Hello, I need help removing duplicate rows under certain conditions.
Raw File
ANI          | Date      | Time
111-111-1111 | 8/7/2022  | 10:34:00 AM
111-111-1111 | 8/7/2022  | 12:00:00 PM
111-111-1111 | 8/7/2022  | 12:03:00 PM
222-222-2222 | 8/8/2022  | 10:50:00 AM
222-222-2222 | 8/8/2022  | 10:52:10 AM
333-333-3333 | 8/9/2022  | 12:29:00 PM
333-333-3333 | 8/9/2022  | 12:32:00 PM
333-333-3333 | 8/9/2022  | 12:33:00 PM
444-444-4444 | 8/10/2022 | 1:50:00 PM
444-444-4444 | 8/10/2022 | 1:51:00 PM
The Raw File contains an ANI column showing the different phone numbers that called into my system, and Date and Time columns matching the times the calls came in.
I want to remove the earlier entries of back-to-back calls from the same number on the same date, but only if the later call came in within 3 minutes of the initial call.
This is the end result that I wish my Power BI would see and count:
Result
ANI          | Date      | Time
111-111-1111 | 8/7/2022  | 10:34:00 AM
111-111-1111 | 8/7/2022  | 12:03:00 PM
222-222-2222 | 8/8/2022  | 10:52:10 AM
333-333-3333 | 8/9/2022  | 12:33:00 PM
444-444-4444 | 8/10/2022 | 1:51:00 PM
In the end, I want it to count back-to-back calls just once when they came in within the 3-minute window, and leave singular calls made outside that condition alone.
Please help.
Here is another way of doing this using the Query Editor. See the comments to help understand the algorithm:
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("jc8xCoAwDIXhq0hni8lLRO3mAQT30vtfw1qiZCodfsjwwSM5B2aOFoc57Mu2gIB6MiXRRDSdVyhzV6KyV94jUpwEEC00ubv1ldx6XyLxL0UkWtLk4dZxuPWuFAxL/5GqRkubZPqpfTQk+ZPlAQ==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [ANI = _t, Date = _t, Time = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"ANI", type text}, {"Date", type date}, {"Time", type time}}),
//create a column with the combine datetime
#"Added Custom" = Table.AddColumn(#"Changed Type", "DateTime", each [Date]&[Time], type datetime),
//group by ANI
// for each ANI group
// Sort by datetime
// Add a shifted column to compare one row to the next
// Retain only those rows where the difference between the original and the next column is greater than 3 minutes
// or the last row which will have a null in the shifted column
#"Grouped Rows" = Table.Group(#"Added Custom", {"ANI"}, {
{"Filtered", (t)=>
let
sorted = Table.Sort(t,{"DateTime", Order.Ascending}),
shifted = Table.FromColumns(
Table.ToColumns(t) & {List.RemoveFirstN(t[DateTime],1) & {null}},
type table[ANI=text, Date=date, Time=time, DateTime=datetime, Shifted=datetime]),
deleteRows = Table.SelectRows(shifted, each [Shifted] = null or Duration.TotalMinutes([Shifted] - [DateTime]) > 3)
in
deleteRows, type table[ANI=text, Date=date, Time=time]}
}),
//re-expand the groups
#"Expanded Filtered" = Table.ExpandTableColumn(#"Grouped Rows", "Filtered", {"Date", "Time"})
in
#"Expanded Filtered"
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("lc8xCoAwDIXhq0hnS5OXiNrNAwju4v2voaVRs1WHHzJ8w8u+B2aOFnehD1MaEwgoN1MWzUTdsoajb1hcsNjtmxVnAUQL1U5+w0BuQ8si82NFJFpS7ew3YHYbGlbww/rfVDVaWi3Ti+23j5Zve5w=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [ANI = _t, Date = _t, Time = _t]),
#"Merged Columns" = Table.CombineColumns(Source,{"Date", "Time"},Combiner.CombineTextByDelimiter(" ", QuoteStyle.None),"Merged"),
#"Changed Type with Locale" = Table.TransformColumnTypes(#"Merged Columns", {{"Merged", type datetime}}, "en-US"),
#"Sorted Rows" = Table.Sort(#"Changed Type with Locale",{{"ANI", Order.Ascending}, {"Merged", Order.Descending}}),
#"Added Index" = Table.AddIndexColumn(#"Sorted Rows", "Index", 0, 1, Int64.Type),
#"Added Custom" = Table.AddColumn(#"Added Index", "Custom", each
try
if [ANI] = #"Sorted Rows"[ANI]{[Index] - 1} and #"Sorted Rows"[Merged]{[Index] - 1} -[Merged] <= #duration(0,0,3,0) then true else false
otherwise false
),
#"Filtered Rows" = Table.SelectRows(#"Added Custom", each ([Custom] = false)),
#"Removed Columns" = Table.RemoveColumns(#"Filtered Rows",{"Index", "Custom"}),
#"Inserted Date" = Table.AddColumn(#"Removed Columns", "Date", each DateTime.Date([Merged]), type date),
#"Inserted Time" = Table.AddColumn(#"Inserted Date", "Time", each DateTime.Time([Merged]), type time),
#"Removed Columns1" = Table.RemoveColumns(#"Inserted Time",{"Merged"})
in
#"Removed Columns1"

Is there a solution to get a list of columns where rows are equal to 0?

I am building a Power BI Q&A dashboard that shows pass or fail within specific criteria, 1 meaning Pass and 0 meaning Fail. If any of the Categories fail, the entire row fails.
Example:
Rep Name    | Categories 1 | Categories 2 | Categories 3 | Pass/Fail
Bob Smith   | 1            | 1            | 1            | 1
Tyler Jones | 1            | 0            | 0            | 0
What I am looking for is a way to say: if Pass/Fail = 0, then list all columns that have a value = 0.
In this example, I should get a result of "Tyler Jones failed in Categories 2 & 3".
What is the best way to do this in either DAX or M code?
You can unpivot your Category columns and use CONCATENATEX. Example below.
M transformation:
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WcspPUgjOzSzJUNJRMkTBsTrRSiGVOalFCl75eanFUHEDOI6NBQA=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [#"Rep Name" = _t, #"Categories 1" = _t, #"Categories 2" = _t, #"Categories 3" = _t, #"Pass/Fail" = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Rep Name", type text}, {"Categories 1", Int64.Type}, {"Categories 2", Int64.Type}, {"Categories 3", Int64.Type}, {"Pass/Fail", Int64.Type}}),
#"Unpivoted Columns" = Table.UnpivotOtherColumns(#"Changed Type", {"Rep Name", "Pass/Fail"}, "Attribute", "Value"),
#"Removed Columns" = Table.RemoveColumns(#"Unpivoted Columns",{"Pass/Fail"})
in
#"Removed Columns"
DAX measure:
FailAts =
CONCATENATEX (
    CALCULATETABLE ( VALUES ( 'Table (2)'[Attribute] ), 'Table (2)'[Value] = 0 ),
    'Table (2)'[Attribute],
    " "
)
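If you only want output when the rep actually failed, a hedged variant of the same measure (same assumed 'Table (2)' names) can wrap it in an emptiness check:
FailAtsOnFail =
VAR Fails =
    CALCULATETABLE ( VALUES ( 'Table (2)'[Attribute] ), 'Table (2)'[Value] = 0 )
RETURN
    IF ( NOT ISEMPTY ( Fails ), CONCATENATEX ( Fails, 'Table (2)'[Attribute], ", " ) )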

How to add a new column from a calculation on two rows in Power BI?

I have the following table:
I want to add a new column "Total" using the Query Editor (Power Query) such that when "GL" is 'Gross Margin', "Total" should be the 'Gross Margin' value of "Total India Market" multiplied by the 'Total Net Sales' value of "Total India Market", i.e. 0.11 * 65687; and if "GL" is not 'Gross Margin', then "Total India Market" + "Export".
The desired output should look like below:
I do not want a calculated column; it should be done in the query editor, i.e. Power Query.
You can achieve this using Power Query.
First add an index column, then fetch the previous row's Total India Market, and then add a custom column to calculate the total as per the requirement:
let
Source = Excel.Workbook(File.Contents("C:\Users\anumua2\Downloads\im_25_oct_21.xlsx"), null, true),
Data_Sheet = Source{[Item="Data",Kind="Sheet"]}[Data],
#"Promoted Headers" = Table.PromoteHeaders(Data_Sheet, [PromoteAllScalars=true]),
#"Changed Type" = Table.TransformColumnTypes(#"Promoted Headers",{{"Date", type date}, {"GL", type text}, {"Domestic Product", type number}, {"Total India Market", type number}, {"Export", type number}}),
#"Added Index" = Table.AddIndexColumn(#"Changed Type", "Index", 1, 1, Int64.Type),
#"Prev" = Table.AddColumn(#"Added Index", "Custom", each #"Added Index"{[Index]-2}[#"Total India Market"]),
#"Renamed Columns" = Table.RenameColumns(Prev,{{"Custom", "Prev_Total India Market"}}),
#"Replaced Errors" = Table.ReplaceErrorValues(#"Renamed Columns", {{"Prev_Total India Market", 0}}),
#"Added Custom" = Table.AddColumn(#"Replaced Errors", "Custom", each if[GL] = "Gross Margin" then [Total India Market]*[Prev_Total India Market] else [Total India Market] + [Export]),
#"Renamed Columns1" = Table.RenameColumns(#"Added Custom",{{"Custom", "Total"}})
in
#"Renamed Columns1"
It seems that Total Net Sales is always on the row before Gross Margin.
If that is the case, then in order to multiply a value by the value in the preceding row of the same column, you'll need to either add an Index column or, my preference because it calculates faster, add a "shifted column" where the value from the previous row sits on the same row.
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("bZA/C4MwEMW/ijhLzWlyf0anTu2idBEHBykFacHY799LAqXELnnc8Xv3HhnHEkwNNVBZlZ33y14M7+2pw8kIR0FKE5VT9UsPr31ei6s6+nldvG6IBVXQIQegbZx1mem8vbwvLvN2f8QQaDCKiSEAeUh/68JhxxQ4JomX9eUv2fwrDwYCL2hMiuEMP7YPcMhCS6JqWxaTmQ7tQZLY+EWMGZ/a62EJHECb6oNzjrTQ9AE=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Date = _t, GL = _t, #"Domestic Product" = _t, #"Total India Market" = _t, Export = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Date", type date}, {"GL", type text},
{"Domestic Product", type number}, {"Total India Market", type number}, {"Export", type number}}),
//add a column = offset column from Total India Market
//This will calculate faster than using an INDEX column
offsetTotalIndiaMarket =
let
tbl = #"Changed Type",
ShiftedList = {null} & List.RemoveLastN(tbl[Total India Market]),
Custom1 = Table.ToColumns(tbl) & {ShiftedList}
in
Table.FromColumns(Custom1, Table.ColumnNames(tbl) & {"Shifted Tot India Market"}),
//add custom column to execute the described calculation
#"Added Custom" = Table.AddColumn(offsetTotalIndiaMarket, "Total", each
if [GL] = "Gross Margin"
then [Total India Market]*[Shifted Tot India Market]
else [Total India Market]+[Export]),
//Remove shifted column
#"Removed Columns" = Table.RemoveColumns(#"Added Custom",{"Shifted Tot India Market"})
in
#"Removed Columns"
Results
If it is NOT the case that Total Net Sales always sits on the line preceding Gross Margin, then we can use a different algorithm. But we'd also have to know whether Gross Margin will always have a matching Total Net Sales, or whether we have to account for an inability to calculate as well.
If that is the case, respond with a comment to this answer.
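For that non-adjacent case, one possible approach is to look the value up by Date instead of by position; a hedged sketch, assuming the step and column names from the query above and that each Date carries exactly one 'Total Net Sales' row:
let
    base = #"Changed Type",  // assumed: the typed table from the query above
    // keep only the Total Net Sales rows, then self-merge on Date
    netSales = Table.SelectRows(base, each [GL] = "Total Net Sales"),
    merged = Table.NestedJoin(base, {"Date"}, netSales, {"Date"}, "NS", JoinKind.LeftOuter),
    expanded = Table.ExpandTableColumn(merged, "NS", {"Total India Market"}, {"NS Total India Market"}),
    withTotal = Table.AddColumn(expanded, "Total", each
        if [GL] = "Gross Margin" and [NS Total India Market] <> null
        then [Total India Market] * [NS Total India Market]
        else [Total India Market] + [Export]),
    result = Table.RemoveColumns(withTotal, {"NS Total India Market"})
in
    result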