I am new to Power BI and need your help.
I have 3 columns:
date1
reviseDate
Shipped_Date
I need to compare the reviseDate column to the Shipped_Date column and check whether they are equal or not (= or <>).
How do I do it so that, if the reviseDate column is blank, the comparison uses the date1 column against Shipped_Date instead?
Try this as a calculated column:
Status =
VAR DueDate =
    IF(ISBLANK(Table1[ReviseDate]), Table1[Date1], Table1[ReviseDate])
RETURN
    SWITCH(
        TRUE(),
        ISBLANK(DueDate), BLANK(),
        DueDate = Table1[Shipped_Date], "On Time",
        DueDate < Table1[Shipped_Date], "Late",
        DueDate > Table1[Shipped_Date], "Early"
    )
This defines the date you want to compare against as a variable, which is then used to check the different conditions.
SWITCH(TRUE(),...) is a useful construction that returns the result paired with the first condition in the list that evaluates to TRUE().
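For example, here is a minimal sketch with a hypothetical Table1[Qty] column, just to show that only the first matching branch is returned, so the order of the conditions matters:
Band =
SWITCH(
    TRUE(),
    ISBLANK(Table1[Qty]), BLANK(),
    Table1[Qty] >= 100, "Large",   // checked before the next branch, so a value of 250 returns "Large"
    Table1[Qty] >= 10, "Medium",
    "Small"                        // fallback when no condition is TRUE()
)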
You can define a new calculated column like this:
Status =
IF (AND(ISBLANK(Table1[Date1]); ISBLANK(Table1[ReviseDate])); BLANK();
    IF (Table1[Shipped_Date] = IF(ISBLANK(Table1[ReviseDate]); Table1[Date1]; Table1[ReviseDate]); "On Time";
        IF (Table1[Shipped_Date] > IF(ISBLANK(Table1[ReviseDate]); Table1[Date1]; Table1[ReviseDate]); "Late";
            IF (Table1[Shipped_Date] < IF(ISBLANK(Table1[ReviseDate]); Table1[Date1]; Table1[ReviseDate]); "Early"; BLANK())
        )
    )
)
The expression IF(ISBLANK(Table1[ReviseDate]); Table1[Date1]; Table1[ReviseDate]) returns the Date1 value when ReviseDate is empty. Then it is a matter of comparing this reference date with Shipped_Date and returning the appropriate status. (The semicolons are the argument separator used in some regional settings; replace them with commas if your locale uses commas.)
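As an aside (not part of the original answers): in current versions of Power BI, the fallback can also be written with COALESCE, which returns its first non-blank argument, so the variable in the first answer could read:
VAR DueDate =
    COALESCE(Table1[ReviseDate], Table1[Date1])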
If you want to build the status column in the Power Query Editor instead, you can use Add Column -> Conditional Column. First make a helper column, named ReferenceDate or similar, which computes the date to use for the comparison (the formula is sketched below):
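In a custom column, the equivalent formula is roughly the following; it mirrors the #"Added Conditional Column" step in the full code further down:
= if [ReviseDate] = null then [Date1] else [ReviseDate]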
Then add the actual Status column on top of the ReferenceDate column.
The full M code will look like this (the sample data is embedded in the query):
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("bY7RCQAhDEN36bcUUxTrLOL+aygenCkI/WjymtAxxKp2tQyXJHuunGnDFmFj+HldAQqUGzC89nPlsdYZuiITrL98PRRqQUkYCyqZCw==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type text) meta [Serialized.Text = true]) in type table [Date1 = _t, ReviseDate = _t, Shipped_Date = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Date1", type date}, {"ReviseDate", type date}, {"Shipped_Date", type date}}),
#"Added Conditional Column" = Table.AddColumn(#"Changed Type", "ReferenceDate", each if [ReviseDate] = null then [Date1] else [ReviseDate]),
#"Added Conditional Column1" = Table.AddColumn(#"Added Conditional Column", "Status", each if [ReferenceDate] = null then "" else if [ReferenceDate] = [Shipped_Date] then "On Time" else if [ReferenceDate] < [Shipped_Date] then "Late" else if [ReferenceDate] > [Shipped_Date] then "Early" else null),
Status = #"Added Conditional Column1"{2}[Status]
in
Status
Depending on your data source, you can even modify the query and compute this status in the database itself (e.g. with a Transact-SQL query if your data source is SQL Server).
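That idea is not spelled out in the answer, so here is only a rough sketch, assuming a SQL Server source and hypothetical server, database, and table names; Value.NativeQuery sends the statement to the server, so the CASE logic runs in the database:
let
    // hypothetical server, database, and table names
    Source = Sql.Database("myServer", "myDatabase"),
    Result = Value.NativeQuery(
        Source,
        "SELECT Date1, ReviseDate, Shipped_Date, CASE WHEN COALESCE(ReviseDate, Date1) IS NULL THEN NULL WHEN COALESCE(ReviseDate, Date1) = Shipped_Date THEN 'On Time' WHEN COALESCE(ReviseDate, Date1) < Shipped_Date THEN 'Late' ELSE 'Early' END AS Status FROM dbo.Shipments"
    )
in
    Result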
Related
I have a table with power plant capacities in different years. There are only entries for the years in which a capacity changes; in the years not listed, the last value still applies.
+------------+------+----------+
| Plant      | Year | Capacity |
+------------+------+----------+
| Cottam     | 2003 | 800      |
| Cottam     | 2009 | 600      |
| Cottam     | 2015 | 800      |
| Drax       | 2000 | 600      |
| Drax       | 2005 | 1200     |
| Drax       | 2010 | 1800     |
| Drax       | 2013 | 1200     |
| Drax       | 2020 | 0        |
| Ironbridge | 2007 | 500      |
| Ironbridge | 2015 | 0        |
+------------+------+----------+
Now I would like to transform the initial table so that I also have values for all years in between and can display them in, for example, a stacked column chart. The result should look like the table below (the numbers taken from the initial table are marked in yellow).
You can do this easily in the Query Editor in M code.
To reproduce, paste the code below into a blank query:
let
//change next line to reflect your actual data source
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45Wcs4vKUnMVdJRMjIwMAZSFgYGSrE6qOKWQMoMU9zQFEm9S1FiBUS1AZJqhChIraERurAhSLEhhhmGxlhVG4FUQ8Q8i/LzkooyU9JTIcabAylTA6xyYGcCZWIB", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Plant = _t, Year = _t, Capacity = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Plant", type text}, {"Year", Int64.Type}, {"Capacity", Int64.Type}}),
//generate Table of all years
#"All Years" = Table.FromColumns(
{List.Numbers(List.Min(#"Changed Type"[Year]), List.Max(#"Changed Type"[Year])- List.Min(#"Changed Type"[Year]) + 1 )}),
//Group by Plant
// Aggregate by joining with the All Years table and "Fill Down" to replace blanks with previous year.
// then expand the grouped column
#"Group by Plant" = Table.Group(#"Changed Type","Plant",{
{"Joined", each Table.FillDown(Table.Join(#"All Years","Column1",_,"Year",JoinKind.FullOuter),{"Capacity"})}
}),
#"Expanded Joined" = Table.ExpandTableColumn(#"Group by Plant", "Joined", {"Column1", "Capacity"}, {"Column1", "Capacity"}),
//Replace nulls with zeros
#"Replaced Value" = Table.ReplaceValue(#"Expanded Joined",null,0,Replacer.ReplaceValue,{"Capacity"}),
//Pivot on year
// then set the data types
#"Pivoted Column" = Table.Pivot(Table.TransformColumnTypes(#"Replaced Value", {{"Column1", type text}}, "en-US"),
List.Distinct(Table.TransformColumnTypes(#"Replaced Value", {{"Column1", type text}}, "en-US")[Column1]), "Column1", "Capacity"),
//set data type
#"Changed Type1" = Table.TransformColumnTypes(#"Pivoted Column",
List.Transform(List.Sort(List.RemoveFirstN(Table.ColumnNames(#"Pivoted Column"),1), Order.Ascending), each {_, Int64.Type}))
in
#"Changed Type1"
Edit Note:
Actually, to create the graph in Power BI, you do NOT want to pivot the data, so you can use the shorter code below:
let
//change next line to reflect your actual data source
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45Wcs4vKUnMVdJRMjIwMAZSFgYGSrE6qOKWQMoMU9zQFEm9S1FiBUS1AZJqhChIraERurAhSLEhhhmGxlhVG4FUQ8Q8i/LzkooyU9JTIcabAylTA6xyYGcCZWIB", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Plant = _t, Year = _t, Capacity = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Plant", type text}, {"Year", Int64.Type}, {"Capacity", Int64.Type}}),
//generate Table of all years
#"All Years" = Table.FromColumns(
{List.Numbers(List.Min(#"Changed Type"[Year]), List.Max(#"Changed Type"[Year])- List.Min(#"Changed Type"[Year]) + 1 )}),
//Group by Plant
// Aggregate by joining with the All Years table and "Fill Down" to replace blanks with previous year.
// then expand the grouped column
#"Group by Plant" = Table.Group(#"Changed Type","Plant",{
{"Joined", each Table.FillDown(Table.Join(#"All Years","Column1",_,"Year",JoinKind.FullOuter),{"Capacity"})}
}),
#"Expanded Joined" = Table.ExpandTableColumn(#"Group by Plant", "Joined", {"Column1", "Capacity"}, {"Year", "Capacity"}),
//Replace nulls with zeros
#"Replaced Value" = Table.ReplaceValue(#"Expanded Joined",null,0,Replacer.ReplaceValue,{"Capacity"}),
#"Changed Type1" = Table.TransformColumnTypes(#"Replaced Value",{{"Year", Int64.Type}, {"Capacity", Int64.Type}})
in
#"Changed Type1"
Then, in Power BI, you can generate this:
Note:
The code below presents the Table.FillDown / Table.Join sequence from the first listing using variables and more comments. It should be easier to understand (though it might be slightly less efficient).
...
{"Joined", each
let
//join the subtable with the All Years table
#"Joined Table" = Table.Join(#"All Years", "Column1", _, "Year", JoinKind.FullOuter),
//Fill down the Capacity column so as to fill with the "last year" data
// since that column will contain a null after the Table.Join for years with no data
#"Fill Down" = Table.FillDown(#"Joined Table",{"Capacity"})
in
#"Fill Down"
}
...
Here's how to solve this (more easily) in DAX:
A prerequisite is a separate Calendar table with a 1:many relationship on the year:
Calendar =
SELECTCOLUMNS(
GENERATESERIES(
MIN(Plants[Year]),
MAX(Plants[Year])
),
"Year", [Value]
)
Next, calculate the Last Given Capacity per year:
Last Given Capacity =
VAR current_year =
MAX(Calendar[Year])
VAR last_capacity_year =
CALCULATE(
MAX(Plants[Year]),
'Calendar'[Year] <= current_year
)
RETURN
CALCULATE(
MAX(Plants[Capacity]),
Calendar[Year] = last_capacity_year
)
Finally put it all together in a Stacked Column Chart with
X-axis: 'Calendar'[Year]
Y-axis: [Last Given Capacity]
Legend: 'Plants'[Plant]
I have a table with received times like the one below:
As you can see, the interval between received rows varies; for example 1000 ms, 1001 ms and 998 ms.
How can I calculate the average frequency (interval) of the received times in ms?
I suggest using Power Query to:
add a column that is the original date_time column offset by one row,
then add another column showing the difference between the current row and the previous row.
This method, at least in M code, is usually faster than using an Index column to refer to the previous row.
Then you can do your mathematical analyses, as in the sketch after the code below.
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("dc27DcAgDAXAVRA1An9iO2EVxP5rgEQZXn3FjZGjsTUh4cRdqctTwy3P8heD4lv8KgHl3RJX+dCjBIXRo3JkLg==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [date_time = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"date_time", type datetime}}),
//add column containing previous rows data
//usually faster than using an Index column
prevList = {null} & List.RemoveLastN(#"Changed Type"[date_time]),
tbl1 = Table.FromColumns(
{#"Changed Type"[date_time],prevList},
{"date_time","Prev Row"}
),
#"Added Custom" = Table.AddColumn(tbl1, "Difference", each /*if [Prev Row] = null then 0
else*/ Duration.TotalSeconds([date_time] - [Prev Row])),
#"Changed Type1" = Table.TransformColumnTypes(#"Added Custom",{
{"date_time", type datetime}, {"Prev Row", type datetime}, {"Difference", type number}})
in
#"Changed Type1"
In Power Query, add an Index column to your data so that you can address single rows. Then, with DAX, add a calculated column to the table:
Milliseconds =
VAR NextIndex = 'Table'[Index] + 1
VAR MaxIndex = MAX('Table'[Index])
VAR Difference =
IF(
'Table'[Index] = MaxIndex,
BLANK(),
'Table'[date_time] -
CALCULATE(
VALUES('Table'[date_time]),
FILTER(
ALL('Table'),
'Table'[Index] = NextIndex
)
)
)
RETURN
CONVERT(ABS(Difference * 3600 * 24 * 1000), INTEGER)
Now you are looking for AVERAGE( Milliseconds ).
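For example, as a simple measure (a sketch, assuming the table is named 'Table' as above):
Average Interval (ms) = AVERAGE('Table'[Milliseconds])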
Note: Things would have been easier if you provided copyable data instead of a screenshot.
I am having some issues with how to approach my query so any help would be greatly appreciated.
I have a date column that I need to increase based on two other columns values.
e.g. Date Reported column - 17/12/2018
If my Impact column = "Urgent" and my Department = "Stores", I would need to increase my Date Reported column to 18/12/2018.
However, if my Impact column = "Standard" and my Department = "Floor", I would need to increase my Date Reported column to 20/12/2018.
I would ideally like not to touch the original Date Reported column, but put this new value in another column.
So far I have created a custom column, and this is my code; however, it doesn't work.
AmendedDateReported = if(And(SurveyCorrectiveAction[Impact] = "Urgent", SurveyCorrectiveAction[LookUp] = "Stores"), Date.AddDays([DateReported],1),Blank ())
Thanks
Paula
Updated code. The formula seems to be pulling OK, but the date part won't update:
#"Sorted Rows" = Table.Sort(Source,{{"DateReported", Order.Ascending}}),
#"Changed Type" = Table.TransformColumnTypes(#"Sorted Rows",{{"DateReported", type date}}),
#"Sorted Rows1" = Table.Sort(#"Changed Type",{{"DateReported", Order.Descending}}),
#"Added Custom" = Table.AddColumn(#"Sorted Rows1", "Date Repaired", each ""),
#"Changed Type1" = Table.TransformColumnTypes(#"Added Custom",{{"Date Repaired", type text}}),
#"Duplicated Column" = Table.DuplicateColumn(#"Changed Type1", "DateReported", "DateReported - Copy"),
#"Renamed Columns" = Table.RenameColumns(#"Duplicated Column",{{"DateReported - Copy", "AmendedDateReported"}}),
#"Merged Amendments" = Table.NestedJoin(#"Renamed Columns",{"Impact", "Department"},TLU_FaultTimeScales,{"Impact", "Department"},"TLU_FaultTimeScales",JoinKind.LeftOuter),
#"Expanded Amendments" = Table.ExpandTableColumn(#"Merged Amendments", "TLU_FaultTimeScales", {"Amendment Day"}, {"Amendment Day"}),
AmendedDateReported = Table.AddColumn(#"Expanded Amendments", "AmendedDateReported", each try Date.AddDays([DateReported],[Amendment Day]) otherwise [DateReported], type date)
in
#"Renamed Columns"
You could try:
AmendedDateReported =
Table.AddColumn(
#"Previous Step",
"Amended Date Reported",
each Date.AddDays(
[Date Reported],
if [Impact] = "Urgent" and [Department] = "Stores" then 1
else if [Impact] = "Standard" and [Department] = "Floor" then 3
else 0
),
type date
)
If you have several combinations of Impact / Department, each with a different effect on the amended date, it would make more sense to put those in a separate table:
+----------+------------+----------------+
| Impact | Department | Amendment Days |
+----------+------------+----------------+
| Urgent | Stores | 1 |
| Standard | Floor | 3 |
+----------+------------+----------------+
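If such a lookup table does not already exist in your source, one option (a sketch, not part of the original answer) is to define it as an inline table in a blank query named tblAmendments:
let
    // hard-coded lookup of amendment days per Impact/Department combination
    tblAmendments = #table(
        type table [Impact = text, Department = text, #"Amendment Days" = Int64.Type],
        {
            {"Urgent", "Stores", 1},
            {"Standard", "Floor", 3}
        }
    )
in
    tblAmendments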
You can then join this table to retrieve the amendment days:
#"Merged Amendments" = Table.NestedJoin(#"Previous Step",{"Impact", "Department"},tblAmendments,{"Impact", "Department"},"tblAmendments",JoinKind.LeftOuter),
#"Expanded Amendments" = Table.ExpandTableColumn(#"Merged Amendments", "tblAmendments", {"Amendment Days"}, {"Amendment Days"}),
AmendedDateReported = Table.AddColumn(#"Expanded Amendments", "Amended Date Reported", each try Date.AddDays([Date Reported],[Amendment Days]) otherwise [Date Reported], type date)
in
AmendedDateReported
Remember to update the final variable name after the in clause (in your updated code it should end with in AmendedDateReported rather than in #"Renamed Columns").
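That is, based on the step names in your updated code, the query tail should end up looking roughly like this:
    AmendedDateReported = Table.AddColumn(#"Expanded Amendments", "AmendedDateReported", each try Date.AddDays([DateReported],[Amendment Day]) otherwise [DateReported], type date)
in
    AmendedDateReported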
In Power BI, I have a simple table with 3 columns:
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("TYzBCcAwDAN38dugyk5bdxaT/deICzXN77hDyhSKCkHYwafwbJyaYiUc9I4XaH/1MgHeXQO+bcf7ux0XW3x5Lg==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type text) meta [Serialized.Text = true]) in type table [id = _t, start = _t, end = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"id", Int64.Type}, {"start", type date}, {"end", type date}})
in
#"Changed Type"
From this I can create the following visual:
My challenge is to calculate the total duration in days of the time series, based on the filter selected above. Any help would be appreciated.
I have tried the following DAX formula but it gives me crazy results as shown above.
YTDDuration =
var start_Date=FIRSTDATE(ALLSELECTED('CALENDAR'[Date]))
var end_Date=LASTDATE(ALLSELECTED('CALENDAR'[Date]))
var current_Start=MAX(Table2[start])
var current_end=MAX(Table2[end])
var bigif=IF(current_end>start_Date && current_Start<end_Date,DATEDIFF(MAX(start_Date,current_Start),MIN(end_Date,current_end),DAY),0)
RETURN
CALCULATE(SUMX(Table2, bigif),FILTER(ALL(Table2), Table2[start] <= max(Table2[end])))
Expected output would be:
The key here is to account for gaps in dates and consolidate overlapping dates.
You can iterate through your calendar table and count the number of days where that day falls into one of the id time periods.
YTDDuration =
var current_Start = CALCULATE(MIN(Table2[start]), ALL(Table2))
var current_end = MAX(Table2[end])
RETURN
SUMX(
FILTER('CALENDAR', 'CALENDAR'[Date] <= current_end),
MAXX(ALL(Table2), IF([Date] > [start] && [Date] <= [end], 1, 0))
)
This starts at the minimal start date and adds 1 for each day where that Date falls between start and end for some row in Table2. If none of the rows contain that Date, the MAXX runs over just zeros and returns zero; if one or more rows match, the maximum is one and that day is counted.
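To make that reading explicit, here is an equivalent formulation (a sketch, not the original measure) that counts the calendar days covered by at least one row of Table2:
YTDDuration (alt) =
VAR current_end = MAX(Table2[end])
RETURN
    COUNTROWS(
        FILTER(
            'CALENDAR',
            'CALENDAR'[Date] <= current_end
                && COUNTROWS(
                       FILTER(
                           ALL(Table2),
                           'CALENDAR'[Date] > Table2[start]
                               && 'CALENDAR'[Date] <= Table2[end]
                       )
                   ) > 0
        )
    )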
YTD Duration by end:
I have a query which I'd like to filter by the new date column I created.
Basically, anything that is more than 42 days earlier than that date is accepted.
I've tried doing a filter by date and then subtracting 42 from it, but it does not work.
let
#"SQL-JM" = let
Source = Sql.Databases("xxx.xxx.xxx.xxx"),
MNH = Source{[Name="DBT"]}[Data],
#"DBO-JM" = DBT{[Schema="dbo",Item="DBO-JM"]}[Data]
in
#"DBO-JM",
#"Added Custom1" = Table.AddColumn(#"DBO-JM", "Start_of_QTR", each Date.StartOfQuarter(DateTime.LocalNow())),
in
#"Filtered Rows"
If I understand your task correctly, you don't need a custom column at all.
I'd do it like this:
let
    #"SQL-JM" = let
        Source = Sql.Databases("xxx.xxx.xxx.xxx"),
        MNH = Source{[Name="DBT"]}[Data],
        #"DBO-JM" = MNH{[Schema="dbo",Item="DBO-JM"]}[Data],
        GetFilterDate = Date.From(Date.StartOfQuarter(DateTime.LocalNow())), //You can use any logic to get that date
        FilterRows = Table.SelectRows(#"DBO-JM", each [DateStamp] < GetFilterDate) //You can also modify GetFilterDate using each row's values, if you need to
    in
        FilterRows
in
    #"SQL-JM"
Assuming that you want to filter on the DateStamp column, where anything more than 42 days earlier than Start_of_QTR is accepted, you can add the following line after the #"Added Custom1" line:
#"Filtered Rows" = Table.SelectRows(#"Added Custom1", each [DateStamp] < Date.AddDays([Start_of_QTR], -42))