Opening & closing balances in Power BI

I have a table for GL transactions for a whole period over the years. Now I need to develop a table in which I can see the opening balance and closing balance for each month.
Ex: [screenshot of the source GL transactions omitted]
The result I need: [screenshot of the expected monthly opening/closing balances omitted]
I tried the CLOSINGBALANCEMONTH function, but it gives the total of transactions made on the last day of the month as the closing balance.

This is not exactly what you want, but at least it gives correct results.
Closing Balance
Closing Balance =
CALCULATE ( SUM ( Transactions[GL Transactions] ),
    FILTER ( ALL ( Transactions ),
        -- cumulative total up to and including the last day of the current month
        EOMONTH ( MIN ( Transactions[date] ), 0 ) >= Transactions[date] ) )
Opening Balance
Opening Balance =
CALCULATE ( SUM ( Transactions[GL Transactions] ),
    FILTER ( ALL ( Transactions ),
        -- cumulative total up to and including the last day of the previous month
        EOMONTH ( MIN ( Transactions[date] ), -1 ) >= Transactions[date] ) )
There are also two more calculated columns: one for the report and one for sorting.
For the report:
Year & Month = FORMAT(Transactions[date],"MMM - YY")
For sorting (built as a number so that months 10-12 sort after months 1-9):
YearMonth = YEAR(Transactions[date]) * 100 + MONTH(Transactions[date])
See also the attached sample file.

You can also try this approach, which solves it on the Power Query side:
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("bY3BCcAwDAN38TsRioObdJbg/dcoFAwq9GfuJPkcG7RmAww4/b1p2YqHiB4iNniVcG3cIvpna4Hr78kU7pJ3cBeflvkA", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [#"GL account" = _t, date = _t, #"GL Transactions" = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"date", type date}, {"GL Transactions", type number}}),
#"Sorted Rows" = Table.Sort(#"Changed Type",{{"date", Order.Ascending}}),
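// tag each transaction with its month-end date (Eomonth)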
#"Added Custom1" = Table.AddColumn(#"Sorted Rows", "Eomonth", each Date.EndOfMonth([date])),
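// total the transactions per GL account and month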
#"Grouped Rows" = Table.Group(#"Added Custom1", {"GL account", "Eomonth"}, {{"Sum GL Transactions", each List.Sum([GL Transactions]), type nullable number}}),
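// closing balance: running total of the monthly sums from the first month up to and including the current month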
#"Added Custom" = Table.AddColumn(#"Grouped Rows", "Closing Balance", each Function.Invoke((current as date,tb as table)=>
List.Sum(Table.SelectRows(tb,each [Eomonth]>= List.Min(#"Grouped Rows"[Eomonth]) and [Eomonth]<=current)[Sum GL Transactions])
,{[Eomonth],#"Grouped Rows"})),
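// opening balance: running total of the monthly sums for all months before the current one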
#"Added Custom2" = Table.AddColumn(#"Added Custom", "Opening Balance", each Function.Invoke((current as date,tb as table)=>
List.Sum(Table.SelectRows(tb,each [Eomonth]<current)[Sum GL Transactions])
,{[Eomonth],#"Grouped Rows"})),
#"Changed Type1" = Table.TransformColumnTypes(#"Added Custom2",{{"Closing Balance", type number}, {"Opening Balance", type number}, {"Eomonth", type date}}),
#"Removed Columns" = Table.RemoveColumns(#"Changed Type1",{"Sum GL Transactions"}),
#"Unpivoted Columns" = Table.UnpivotOtherColumns(#"Removed Columns", {"GL account", "Eomonth"}, "Attribute", "Value")
in
#"Unpivoted Columns"

Related

Power Query: Split Table Every n Columns

I have many columns in the same table.
Table A:
Monthly Plan   Weekly plan   Actual   Monthly_Plan1   Weekly_plan 2   Actual_3
A              B             C        D               E               F
I want them as:
Monthly Plan   Weekly plan   Actual
A              B             C
D              E             F
I cannot create a separate table and append it, because there are so many columns and I can't create that many tables.
Here you go.
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WclRQ0lFyAhHOIMIFRLiCCDel2FgA", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [#"Monthly Plan " = _t, #"Weekly plan " = _t, #"Actual " = _t, #"Monthly_Plan1 " = _t, #"Weekly_plan 2 " = _t, Actual_3 = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Monthly Plan ", type text}, {"Weekly plan ", type text}, {"Actual ", type text}, {"Monthly_Plan1 ", type text}, {"Weekly_plan 2 ", type text}, {"Actual_3", type text}}),
#"Added Custom" = Table.AddColumn(#"Changed Type", "Custom", each List.Split( Record.ToList( _), 3)),
#"Removed Other Columns" = Table.SelectColumns(#"Added Custom",{"Custom"}),
#"Expanded Custom" = Table.ExpandListColumn(#"Removed Other Columns", "Custom"),
#"Extracted Values" = Table.TransformColumns(#"Expanded Custom", {"Custom", each Text.Combine(List.Transform(_, Text.From), "|"), type text}),
#"Split Column by Delimiter" = Table.SplitColumn(#"Extracted Values", "Custom", Splitter.SplitTextByDelimiter("|", QuoteStyle.Csv), {"Monthly", "Weekly", "Actual"}),
#"Changed Type1" = Table.TransformColumnTypes(#"Split Column by Delimiter",{{"Monthly", type text}, {"Weekly", type text}, {"Actual", type text}})
in
#"Changed Type1"
Add a custom column
Remove other columns
Expand
Extract values concatenating with a |
Split the column and rename
If you don't care about row order, this also works:
let Source = Excel.CurrentWorkbook(){[Name="Table1"]}[Content],
Combo = List.Split(Table.ColumnNames(Source),3),
#"Added Custom" =List.Accumulate(Combo, #table({"Column1"}, {}),(state,current)=> state & Table.Skip(Table.DemoteHeaders(Table.SelectColumns(Source, current)),1))
in #"Added Custom"
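For reference, List.Split is what drives both queries above: it chops a list into fixed-size chunks. A tiny standalone sketch, using the column names from the question:
let
    // the six headers from the question, chunked into groups of three
    names  = {"Monthly Plan", "Weekly plan", "Actual", "Monthly_Plan1", "Weekly_plan 2", "Actual_3"},
    chunks = List.Split(names, 3)
    // = { {"Monthly Plan", "Weekly plan", "Actual"}, {"Monthly_Plan1", "Weekly_plan 2", "Actual_3"} }
in
    chunks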

Adding random rows within a group in Power BI

I would like to know how to create the last column, Audit. To decide whether a row is "Included" or "Not Included", the sum of some set of rows in "Average Meeting/Day" within the same Employee/Day should match the "Actual Meeting" the employee attended.
I've taken this as an example dataset (screenshot omitted), then used Group By and a Conditional Column to add the Audit flag. The Group By settings, the Conditional Column, and the result are in the omitted screenshots; the code:
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45W8s3PS0msVNJRMgRiAz1TMGkEJBOVYnVwSBtjShuBJQzhZBJYOqQ0tRhZuwlcHqg9=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Day = _t, Emp_ID = _t, #"Actual Meeting" = _t, #"Average Meeting/Day" = _t, Emp_Name = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Day", type text}, {"Emp_ID", Int64.Type}, {"Actual Meeting", type number}, {"Average Meeting/Day", type number}, {"Emp_Name", type text}}),
#"Grouped Rows" = Table.Group(#"Changed Type", {"Day", "Emp_ID"}, {{"Sum of avg", each List.Sum([#"Average Meeting/Day"]), type nullable number}, {"Actual Meeting", each List.Min([Actual Meeting]), type nullable number}}),
#"Added Conditional Column" = Table.AddColumn(#"Grouped Rows", "Audit", each if [Actual Meeting] = [Sum of avg] then "Included" else "Not Included")
in
#"Added Conditional Column"
Not sure this is going to help, but here goes. Note: it is not set up to work with duplicate values among the potential items to analyze. I'm also not sure it will cope with many decimal places, since the sum of the decimals might be 0.0000001 off from the total and thus not be recognized; it works for me with whole numbers.
let
Process=(x as table) as table =>
// Bill Szysz 2017, all combinations of items in list, blows up with too many items to process due to large number of combinations
let ComboList=x[AMD],
find=x[AM]{0},
Source=Table.FromList(List.Transform(ComboList, each Text.From(_))),
AddIndex = Table.AddIndexColumn(Source, "Index", 0, 1),
ReverseIndeks = Table.AddIndexColumn(AddIndex, "RevIdx", Table.RowCount(AddIndex), -1),
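// for each item, build a list of 2^n entries that alternates blocks of the item's value and null;
// the block size halves from one item to the next, so reading the lists as columns enumerates every possible subset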
Lists = Table.AddColumn(ReverseIndeks, "lists", each
List.Repeat(
List.Combine({
List.Repeat({[Column1]}, Number.Power(2,[RevIdx]-1)),
List.Repeat( {null}, Number.Power(2,[RevIdx]-1))
})
, Number.Power(2, [Index]))
),
ResultTable = Table.FromColumns(Lists[lists]),
#"Replaced Value" = Table.ReplaceValue(ResultTable,null,"0",Replacer.ReplaceValue,Table.ColumnNames(ResultTable )),
#"Added Index" = Table.AddIndexColumn(#"Replaced Value", "Index", 0, 1, Int64.Type),
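// sum the values picked in each candidate subset (the "0" placeholders contribute nothing)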
totals = Table.AddColumn(#"Added Index", "Custom", each List.Sum(List.Transform(Record.ToList( Table.SelectColumns(#"Added Index",Table.ColumnNames(ResultTable )){[Index]}) , each Number.From(_)))),
#"Filtered Rows" = Table.SelectRows(totals, each ([Custom] = find)),
#"Kept First Rows" = Table.FirstN(#"Filtered Rows",1),
#"Unpivoted Other Columns" = Table.UnpivotOtherColumns(#"Kept First Rows", {"Custom", "Index"}, "Attribute", "Value"),
#"Filtered Rows1" = Table.SelectRows(#"Unpivoted Other Columns", each ([Value] <> "0")),
#"Removed Other Columns" = Table.SelectColumns(#"Filtered Rows1",{"Value"}),
FoundList = Table.TransformColumnTypes(#"Removed Other Columns",{{"Value", type number}}),
#"Merged Queries" = Table.NestedJoin(x, {"AMD"}, FoundList, {"Value"}, "output1", JoinKind.LeftOuter),
#"Expanded output1" = Table.ExpandTableColumn(#"Merged Queries", "output1", {"Value"}, {"Value"}),
#"Calculated Absolute Value" = Table.TransformColumns(#"Expanded output1",{{"Value", each if _=null then "Not Included" else "Included", type text}})
in #"Calculated Absolute Value",
Source = Excel.CurrentWorkbook(){[Name="Table1"]}[Content],
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Day", type text}, {"Employee_ID", Int64.Type}, {"AM", Int64.Type}, {"AMD", Int64.Type}}),
#"Grouped Rows" = Table.Group(#"Changed Type", {"Day", "Employee_ID"}, {{"data", each _, type table [Day=nullable text, Employee_ID=nullable number, AM=nullable number, AMD=nullable number]}}),
#"Added Custom" = Table.AddColumn(#"Grouped Rows", "Custom", each Process([data])),
#"Removed Other Columns" = Table.SelectColumns(#"Added Custom",{"Custom"}),
#"Expanded Custom" = Table.ExpandTableColumn(#"Removed Other Columns", "Custom", {"Day", "Employee_ID", "AM", "AMD", "Value"}, {"Day", "Employee_ID", "AM", "AMD", "Value"})
in
#"Expanded Custom"
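If the decimal caveat above turns out to bite, the exact-match test in the #"Filtered Rows" step ( [Custom] = find ) could be swapped for a small-tolerance comparison. A minimal standalone sketch of the idea; the 0.0001 threshold and the sample numbers are assumptions, not from the original data:
let
    // target value and candidate subset sums; the decimals are deliberately a hair off
    find  = 7.5,
    sums  = {2.4999999, 7.4999999, 7.5000001},
    exact = List.Select(sums, each _ = find),                       // finds nothing
    fuzzy = List.Select(sums, each Number.Abs(_ - find) < 0.0001)   // finds both near-misses
in
    [Exact = exact, Fuzzy = fuzzy]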

Power BI: How to remove duplicate rows under specific conditions, keeping the latest entry?

Hello, I need help removing duplicate rows under certain conditions.
Raw File:
ANI            Date        Time
111-111-1111   8/7/2022    10:34:00 AM
111-111-1111   8/7/2022    12:00:00 PM
111-111-1111   8/7/2022    12:03:00 PM
222-222-2222   8/8/2022    10:50:00 AM
222-222-2222   8/8/2022    10:52:10 AM
333-333-3333   8/9/2022    12:29:00 PM
333-333-3333   8/9/2022    12:32:00 PM
333-333-3333   8/9/2022    12:33:00 PM
444-444-4444   8/10/2022   1:50:00 PM
444-444-4444   8/10/2022   1:51:00 PM
The Raw File contains an ANI column showing the different phone numbers that called into my system, and Date and Time columns matching when each call came in.
I want to remove the earlier entries of back-to-back calls from the same number on the same date, but only if the follow-up call came in within 3 minutes of the previous one.
This is the end result that I wish my Power BI report would see and count:
Result:
ANI            Date        Time
111-111-1111   8/7/2022    10:34:00 AM
111-111-1111   8/7/2022    12:03:00 PM
222-222-2222   8/8/2022    10:52:10 AM
333-333-3333   8/9/2022    12:33:00 PM
444-444-4444   8/10/2022   1:51:00 PM
In the end, I want it to count back-to-back calls just once when they came in within the 3-minute time frame, and to leave singular calls outside that condition alone.
Please help.
Here is another way of doing this using the Query Editor. See the comments to help understand the algorithm.
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("jc8xCoAwDIXhq0hni8lLRO3mAQT30vtfw1qiZCodfsjwwSM5B2aOFoc57Mu2gIB6MiXRRDSdVyhzV6KyV94jUpwEEC00ubv1ldx6XyLxL0UkWtLk4dZxuPWuFAxL/5GqRkubZPqpfTQk+ZPlAQ==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [ANI = _t, Date = _t, Time = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"ANI", type text}, {"Date", type date}, {"Time", type time}}),
//create a column with the combine datetime
#"Added Custom" = Table.AddColumn(#"Changed Type", "DateTime", each [Date]&[Time], type datetime),
//group by ANI
// for each ANI group
// Sort by datetime
// Add a shifted column to compare one row to the next
// Retain only those rows where the difference between the original and the next column is greater than 3 minutes
// or the last row which will have a null in the shifted column
#"Grouped Rows" = Table.Group(#"Added Custom", {"ANI"}, {
{"Filtered", (t)=>
let
sorted = Table.Sort(t,{{"DateTime", Order.Ascending}}),
shifted = Table.FromColumns(
Table.ToColumns(sorted) & {List.RemoveFirstN(sorted[DateTime],1) & {null}},
type table[ANI=text, Date=date, Time=time, DateTime=datetime, Shifted=datetime]),
deleteRows = Table.SelectRows(shifted, each [Shifted] = null or Duration.TotalMinutes([Shifted] - [DateTime]) > 3)
in
deleteRows, type table[ANI=text, Date=date, Time=time]}
}),
//re-expand the groups
#"Expanded Filtered" = Table.ExpandTableColumn(#"Grouped Rows", "Filtered", {"Date", "Time"})
in
#"Expanded Filtered"
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("lc8xCoAwDIXhq0hnS5OXiNrNAwju4v2voaVRs1WHHzJ8w8u+B2aOFnehD1MaEwgoN1MWzUTdsoajb1hcsNjtmxVnAUQL1U5+w0BuQ8si82NFJFpS7ew3YHYbGlbww/rfVDVaWi3Ti+23j5Zve5w=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [ANI = _t, Date = _t, Time = _t]),
#"Merged Columns" = Table.CombineColumns(Source,{"Date", "Time"},Combiner.CombineTextByDelimiter(" ", QuoteStyle.None),"Merged"),
#"Changed Type with Locale" = Table.TransformColumnTypes(#"Merged Columns", {{"Merged", type datetime}}, "en-US"),
#"Sorted Rows" = Table.Sort(#"Changed Type with Locale",{{"ANI", Order.Ascending}, {"Merged", Order.Descending}}),
#"Added Index" = Table.AddIndexColumn(#"Sorted Rows", "Index", 0, 1, Int64.Type),
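// rows are sorted latest-first within each ANI, so the row above ([Index] - 1) is the later call;
// a row is flagged when that later call (same ANI) happened within 3 minutes after it, and flagged earlier calls are filtered out.
// The try/otherwise handles the first row, which has no previous row.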
#"Added Custom" = Table.AddColumn(#"Added Index", "Custom", each
try
if [ANI] = #"Sorted Rows"[ANI]{[Index] - 1} and #"Sorted Rows"[Merged]{[Index] - 1} -[Merged] <= #duration(0,0,3,0) then true else false
otherwise false
),
#"Filtered Rows" = Table.SelectRows(#"Added Custom", each ([Custom] = false)),
#"Removed Columns" = Table.RemoveColumns(#"Filtered Rows",{"Index", "Custom"}),
#"Inserted Date" = Table.AddColumn(#"Removed Columns", "Date", each DateTime.Date([Merged]), type date),
#"Inserted Time" = Table.AddColumn(#"Inserted Date", "Time", each DateTime.Time([Merged]), type time),
#"Removed Columns1" = Table.RemoveColumns(#"Inserted Time",{"Merged"})
in
#"Removed Columns1"

Converting column type to time when over 24 hours

I am trying to change the format of a column from text to time in Power BI (screenshot omitted).
When I change the format to time, anything that is over 24 hours shows as an error.
Does anyone know how to resolve this?
You can try the Advanced Editor code below for this purpose.
Values like 73:30 require a : delimiter for this code to work; you can adjust the code if a different delimiter is in use.
The output will be a duration (days, hours, minutes, seconds).
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMjSwMjBQitWJVjKGs8yNrYyBrFgA", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Time = _t]),
#"Duplicated Column" = Table.DuplicateColumn(Source, "Time", "Time - Copy"),
#"Split Column by Delimiter" = Table.SplitColumn(#"Duplicated Column", "Time - Copy", Splitter.SplitTextByDelimiter(":", QuoteStyle.Csv), {"Time - Copy.1", "Time - Copy.2"}),
#"Changed Type" = Table.TransformColumnTypes(#"Split Column by Delimiter",{{"Time", type text}, {"Time - Copy.1", Int64.Type}, {"Time - Copy.2", Int64.Type}}),
#"Renamed Columns" = Table.RenameColumns(#"Changed Type",{{"Time - Copy.1", "hour"}, {"Time - Copy.2", "minutes"}}),
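// convert the parsed hours and minutes into a total number of seconds, then into a duration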
#"Added Custom" = Table.AddColumn(#"Renamed Columns", "Second", each ([hour]*60*60) + ([minutes]*60)),
#"Added Custom1" = Table.AddColumn(#"Added Custom", "Custom", each #duration(0,0,0,[Second])),
#"Changed Type1" = Table.TransformColumnTypes(#"Added Custom1",{{"Custom", type duration}})
in
#"Changed Type1"
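For a single value, the same conversion can be written a bit more directly, since #duration happily takes hour counts above 24. A minimal sketch, assuming the text is always in an hours:minutes shape:
let
    raw   = "73:30",                         // hypothetical input value
    parts = Text.Split(raw, ":"),
    dur   = #duration(0, Number.From(parts{0}), Number.From(parts{1}), 0)
in
    dur   // 3.01:30:00, i.e. 3 days 1 hour 30 minutes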

How to remove duplicate values in a list in a Power BI report dataset?

Hope you are staying safe. I have a column in my Power BI report dataset which holds comma-separated list values.
Id Name
1 kevin,yona,rachel,kevin
2 bruce,miller,kim
3 adam,rita,adam,adam
As you can see, there are duplicate values in the list in the Name column. I want to write a query which removes those duplicates and keeps one occurrence of each. The result set I want is like this:
Id Name
1 yona,rachel,kevin
2 bruce,miller,kim
3 adam,rita
Any ideas? Thanks.
You could split, remove duplicates and group by again.
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("Hck7CoBADIThu6SeRj1O2CKuAcO+IKjg7V3TDD/zMdNCoKKPdbyjC1zyqRXxUALTOn33Oyua1aqOYi1gmyCHNLhdgqh/KKUP", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type text) meta [Serialized.Text = true]) in type table [Column1 = _t, Column2 = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Column1", Int64.Type}, {"Column2", type text}}),
#"Split Column by Delimiter" = Table.ExpandListColumn(Table.TransformColumns(#"Changed Type", {{"Column2", Splitter.SplitTextByDelimiter(",", QuoteStyle.None), let itemType = (type nullable text) meta [Serialized.Text = true] in type {itemType}}}), "Column2"),
#"Changed Type1" = Table.TransformColumnTypes(#"Split Column by Delimiter",{{"Column2", type text}}),
#"Removed Duplicates" = Table.Distinct(#"Changed Type1"),
#"Grouped Rows" = Table.Group(#"Removed Duplicates", {"Column1"}, {{"Rows", each _, type table [Column1=number, Column2=text]}}),
#"Added Custom" = Table.AddColumn(#"Grouped Rows", "Custom", each [Rows][Column2]),
#"Extracted Values" = Table.TransformColumns(#"Added Custom", {"Custom", each Text.Combine(List.Transform(_, Text.From), ","), type text}),
#"Removed Columns" = Table.RemoveColumns(#"Extracted Values",{"Rows"})
in
#"Removed Columns"
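If you'd rather skip the split/expand/regroup round trip, the same cleanup can be done by transforming the column in place (it keeps the first occurrence of each name). A sketch assuming a table named Source with Id and Name columns, recreated inline here from the question:
let
    // inline copy of the question's data
    Source  = Table.FromRecords({
        [Id = 1, Name = "kevin,yona,rachel,kevin"],
        [Id = 2, Name = "bruce,miller,kim"],
        [Id = 3, Name = "adam,rita,adam,adam"] }),
    // split each Name on commas, drop duplicates, and glue the pieces back together
    Deduped = Table.TransformColumns(Source,
        {{"Name", each Text.Combine(List.Distinct(Text.Split(_, ",")), ","), type text}})
in
    Deduped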