I am looking at the cost of each position in terms of total benefits and earnings to arrive at a benefit ratio for each employee at our company. I've determined the benefit rate by year for each position. This was formatted as:
I then created a new query and pivoted the fiscal year and benefit rate data into columns.
However, I still have duplicate values of the position #. Ideally, I want distinct values in my position # field, formatted as:
| Position# | Benefit Rate 2017 | Benefit Rate 2018 |
| 00001581 | 20.17% | 21.58% |
| 00001852 | 35.00% | 40.50% |
What is the best solution? Is this something I can do in Power Query, or do I need DAX? Complicating things, there are definitely null values in some years because positions aren't always filled.
I created a new table using DISTINCT to isolate my Position #.
Then I tried using LOOKUPVALUE:
2018 =
LOOKUPVALUE(
    'PositionNumberBenefitRateByYear'[2018],
    'PositionNumberBenefitRateByYear'[Employee Position.Position Number],
    'Table'[Employee Position.Position Number]
)
Where 'PositionNumberBenefitRateByYear' holds the values that I want to pull over and 'Table'[Employee Position.Position Number] is the related value between each table.
But I get this error: A table of multiple values was supplied where a single value was expected.
Please check my code:
That error means LOOKUPVALUE found more than one row per position (one per fiscal year), so it cannot return a single value. You need to do the actual pivot work in the Power Query Editor. Here is the full M code:
let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("bdBBDoUwCATQu3TdxQxtKZzFeP9r6Lf8qBE2LF5mEti2gnMISqlFIL+F2gQse/3j6BfSL2TT+UXBQuUjOYiVtKi1DGegtxuN/YU+9EaHPlGGe4IWCEtwnSLa8MU4RVQzZNTOnqAE2vmh/QA=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [#"Position Number" = _t, #"Fiscal Year" = _t, #"Benefit Ratio" = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Fiscal Year", Int64.Type}, {"Benefit Ratio", type number}}),
    #"Pivoted Column" = Table.Pivot(Table.TransformColumnTypes(#"Changed Type", {{"Fiscal Year", type text}}, "tr-TR"), List.Distinct(Table.TransformColumnTypes(#"Changed Type", {{"Fiscal Year", type text}}, "tr-TR")[#"Fiscal Year"]), "Fiscal Year", "Benefit Ratio", List.Sum),
    #"Renamed Columns" = Table.RenameColumns(#"Pivoted Column",{{"2022", "Benefit Rate 2022"}, {"2019", "Benefit Rate 2019"}, {"2020", "Benefit Rate 2020"}, {"2018", "Benefit Rate 2018"}, {"2017", "Benefit Rate 2017"}, {"2021", "Benefit Rate 2021"}})
in
    #"Renamed Columns"
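One fragile point: the #"Renamed Columns" step hardcodes the year list, so a new fiscal year arriving in the data would be missed. A minimal sketch of a dynamic alternative (the step name #"Renamed Dynamic" is mine; it assumes the #"Pivoted Column" step above):

```m
// Sketch: prefix every pivoted year column with "Benefit Rate " without
// hardcoding the years. Assumes the #"Pivoted Column" step above; the
// step name #"Renamed Dynamic" is illustrative.
#"Renamed Dynamic" = Table.TransformColumnNames(
    #"Pivoted Column",
    each if _ <> "Position Number" then "Benefit Rate " & _ else _
)
```

This way the query survives new fiscal years without editing.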
After the transformation steps above, your data should look like this:
Then go to the main screen, create a Table visual, and put the columns into the appropriate places. (Do not forget to format the values as percentages.) The resulting screen will look like:
I hope this is what you are looking for!
Related
I have a table which contains three columns. Column 1 - Employees (A & B), Column 2 - List of dates (different for each employee), Column 3 - Change in price for each list of dates.
I have to calculate the standard deviation for each employee based on the change in price for the two different lists of dates.
When I use the standard deviation formula in Power Query or in Power BI, it calculates over all the dates and not for each specific list of dates. I.e., if the total dates run from 1st January to 31st January, and for employee A the list of dates is the 1st to the 10th, and for employee B the list of dates is the 20th to the 31st, the formula calculates the standard deviation for the 1st to the 31st and not for the specific dates for each employee.
Is it possible for me to do this in Power Query? Any help would be appreciated.
Power Query is made for ETL; DAX is made for analysis.
If you need a result column (one value per row, scoped to that row's employee), a plain STDEV.S would repeat the whole-table result on every row, so restrict the filter context:
Stdev of change =
CALCULATE(
    STDEV.S('Table'[Change in price]),
    ALLEXCEPT('Table', 'Table'[Employee])
)
If you want to have single results:
Stdev A =
CALCULATE(
    STDEV.S('Table'[Change in price]),
    'Table'[Employee] = "A"
)
Stdev B =
CALCULATE(
    STDEV.S('Table'[Change in price]),
    'Table'[Employee] = "B"
)
Sample data:
In Power Query use:
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("ddAxDoAgDIXhuzBrgi1FGPUahPtfwz4dBFsTl/cNpP6thSMsgSLxGjf97hH68nHSsTnOOtjxhCHWBe84nuHF+o5B1svPnVWHfV4v1zMfPwcnuETrQ4bJkaE4zm+eyZGhOo4M4tyDDMnx/c08OTIkx5EhW+efDoz/zer9Ag==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Employee = _t, Date = _t, #"Change in price" = _t]),
#"Changed Type" = Table.TransformColumnTypes(
Source,
{
{"Employee", type text},
{"Date", type date},
{"Change in price", Int64.Type}
}
),
#"Grouped Rows" = Table.Group(
#"Changed Type", {"Employee"},
{
{"Stdev", each List.StandardDeviation([Change in price]), type nullable number}
}
)
in
#"Grouped Rows"
Basically you start with the "Group By" GUI
and then go into the M-Code and replace List.Average with List.StandardDeviation, since this operation is not directly available in the GUI.
I'm working with Dynamics CRM project operations. I am currently putting together a report based on task effort that is being assigned to resources.
The data being pulled contains a 'planned work' column. These are strings with multiple dates and hours included in the payload.
A single cell example is:
[{""End"":""\/Date(1660838400000)\/"",""Hours"":8,""Start"":""\/Date(1660809600000)\/""},{""End"":""\/Date(1660924800000)\/"",""Hours"":9,""Start"":""\/Date(1660892400000)\/""},{""End"":""\/Date(1661184000000)\/"",""Hours"":9,""Start"":""\/Date(1661151600000)\/""}]
What I need to do is pull the dates and hours for each entry and add them to a new table so they each have their own row. Example desired output for this cell:
| Start Date | End Date | Hours |
| 1660809600000 | 1660838400000 | 8 |
| 1660892400000 | 1660924800000 | 9 |
| 1661151600000 | 1661184000000 | 9 |
The cells can vary in length with multiple entries so it needs to take that into account.
Is there anyone who can point me in the right direction on how this can be done in Power BI?
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45Wiq6OUYpRcs1LAVFWIELfJbEkVcPQzMzAwtjCxAAENPVBEjogwiO/tKgYrNYCzA8uSSwqwabXwNIMSW+tDh57LI1MLHDbY4nfHqBmIu0xNIR6hwx7DA1NDVH8E6sUGwsA", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Column1 = _t]),
#"Replaced Value" = Table.ReplaceValue(Source,"""""","""",Replacer.ReplaceText,{"Column1"}),
#"Parsed JSON" = Table.TransformColumns(#"Replaced Value",{},Json.Document),
#"Expanded Column1" = Table.ExpandListColumn(#"Parsed JSON", "Column1"),
#"Expanded Column2" = Table.ExpandRecordColumn(#"Expanded Column1", "Column1", {"End", "Hours", "Start"}, {"End", "Hours", "Start"}),
#"Replaced Value1" = Table.ReplaceValue(#"Expanded Column2","/Date(","",Replacer.ReplaceText,{"End", "Start"}),
#"Replaced Value2" = Table.ReplaceValue(#"Replaced Value1",")/","",Replacer.ReplaceText,{"End", "Start"})
in
#"Replaced Value2"
Adding to @David's answer above, you can also do the two replace-value transforms in a single step. Note the custom replacer receives the arguments in the order (value, old, new); the two "blah" placeholders are ignored, since the replacer function does all the work:
#"Replaced Value1" = Table.ReplaceValue(
    #"Expanded Column2",
    "blah",  // placeholder old value, unused
    "blah",  // placeholder new value, unused
    (column_value, old, new) => Text.Select(column_value, {"0".."9"}),
    {"End", "Start"}
)
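Either way, the Start and End columns still hold Unix-epoch milliseconds as text. A minimal sketch to convert them to datetimes (assuming the step names above; the #"To DateTime" name is mine):

```m
// Sketch: convert epoch milliseconds (text) to datetime.
// Assumes the query above ends in the step #"Replaced Value2".
#"To DateTime" = Table.TransformColumns(
    #"Replaced Value2",
    {
        {"Start", each #datetime(1970, 1, 1, 0, 0, 0) + #duration(0, 0, 0, Number.From(_) / 1000), type datetime},
        {"End",   each #datetime(1970, 1, 1, 0, 0, 0) + #duration(0, 0, 0, Number.From(_) / 1000), type datetime}
    }
)
```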
I am trying to create a new table that contains a column with start of week and the hours estimated to be spent for that week on a per project basis.
| Start of Week | Project | Hours |
| 6/20/2022 | ABC_XXX | 10 |
| 6/27/2022 | ABC_XXX | 10 |
| 6/20/2022 | ABC_YYY | 40 |
| 6/27/2022 | ABC_YYY | 40 |
I have a table of dates representing the start of week for every project in the project table.
week start date = [date]-weekday([date],2)+1
| Start of Week |
| 6/20/2022 |
| 6/27/2022 |
| 7/4/2022 |
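For reference, the same Monday-based week start can be sketched in Power Query rather than DAX (the table, column, and step names here are illustrative):

```m
// Sketch: Monday-based start of week for a [date] column.
#"Week Start" = Table.AddColumn(Source, "week start date",
    each Date.StartOfWeek([date], Day.Monday), type date)
```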
The project table contains (among other things) the project name, estimated start date, duration, and hours per week.
| Project Name | Estimated Start Date | Duration in weeks | Hours Per Week |
| ABC_XXX | 6/13/2022 | 8 | 10 |
| ABC_YYY | 6/04/2022 | 27 | 40 |
I am having trouble getting off the starting line. I know I need to evaluate on a per-project basis and loop through all of the dates in my date table, but I can't find a good method to start with. I have done plenty of simpler things creating new tables and calculations, but this one is a little more complicated for me to get started on. Any advice would be greatly appreciated.
The ultimate goal for this data is to present a trend showing estimated project demand over time that can be filtered by project or summed across all projects, as well as filtered by timeline and displayed in a calendar view, but it all starts with getting the data into this format, I believe.
Here's a Power Query solution. The steps in the code below are:
- Use Date.AddWeeks to calculate the end date
- List the dates between the two dates
- Expand the list of dates and convert to date format
- Use Date.DayOfWeek to create a day-of-week column
- Filter the table for day of week = 1 to include only weekly values (starting on Monday)
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WcnRyjo+IiFDSUTLTNzTWNzIwMgKyLYDY0EApVgeiIDIyEqzABCZvZA4kTBAKoqKigALmCAVAOaAqpdhYAA==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [#"Project Name" = _t, #"Estimated Start Date" = _t, #"Duration in weeks" = _t, #"Hours Per Week" = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Project Name", type text}, {"Estimated Start Date", type date}, {"Duration in weeks", Int64.Type}, {"Hours Per Week", Int64.Type}}),
#"Added Custom1" = Table.AddColumn(#"Changed Type", "Estimated End Date", each Date.AddWeeks([Estimated Start Date],[Duration in weeks])),
#"Added Custom" = Table.AddColumn(#"Added Custom1", "dates", each {Number.From([Estimated Start Date])..Number.From([Estimated End Date])}),
#"Expanded Custom" = Table.ExpandListColumn(#"Added Custom", "dates"),
#"Changed Type1" = Table.TransformColumnTypes(#"Expanded Custom",{{"dates", type date}}),
#"Added Custom4" = Table.AddColumn(#"Changed Type1", "day of week", each Date.DayOfWeek([dates])),
#"Filtered Rows" = Table.SelectRows(#"Added Custom4", each ([day of week] = 1))
in
#"Filtered Rows"
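A final cleanup can trim the result down to the target layout; a sketch, assuming the query above (the #"Target Layout" step name is mine):

```m
// Sketch: keep only the needed columns and rename to match the desired output.
// Assumes the #"Filtered Rows" step above.
#"Target Layout" = Table.RenameColumns(
    Table.SelectColumns(#"Filtered Rows", {"dates", "Project Name", "Hours Per Week"}),
    {{"dates", "Start of Week"}, {"Project Name", "Project"}, {"Hours Per Week", "Hours"}}
)
```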
I have a table containing sales data, but in different currencies. I would like my sales data to show sales in GBP only and also total sales in GBP.
Input table of sales
Currency exchange rate table
Final Output table
I hope to find a solution to this; the sales are per date, so the conversion has to use the rate on that date.
Load the second table into Power Query (Data ... From Table/Range ... [x] headers), name the query Currency Conversion Rate in the upper right, and File ... Close and Load To ... Only Connection
Load the first table into Power Query with the same methodology. Use any name
Home ... Merge Queries, choose the Currency Conversion Rate table from the drop-down on the bottom
Click Currency on top and Currency Name on bottom to match them. Hold down the CTRL key and repeat for Invoice date and Exchange rate date. Leave the join kind as Left Outer. Hit OK to accept the other options
Use the arrows atop the new column to expand the [x] Exchange rate field
Add Column ... Custom Column ... and insert a formula to multiply the relevant fields, such as
= [price per unit]*[Exchange rate]
Add Column ... Custom Column ... and insert a formula to multiply the relevant fields
= [unit sold]*[price per unit]*[Exchange rate]
File ... Close and Load
I get that yours was a quick example, but note that Power Query is case sensitive, and your two table examples use euro in one table and EURO in the other, so normally they will not match on the merge. It is also sensitive to spelling, so using "curreny name" as a column header and then coding for "currency name" would not work.
Sample full code that could be pasted into Home ... Advanced Editor:
let
    Source = Excel.CurrentWorkbook(){[Name="Table2"]}[Content],
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Book ID", Int64.Type}, {"Invoice date", type datetime}, {"Currency", type text}, {"unit sold", Int64.Type}, {"price per unit", Int64.Type}}),
    #"Merged Queries" = Table.NestedJoin(#"Changed Type", {"Invoice date", "Currency"}, #"Currency Conversion Rate", {"Exchange rate date", "Currency Name"}, "Currency Conversion Rate", JoinKind.LeftOuter),
    #"Expanded Currency Conversion Rate" = Table.ExpandTableColumn(#"Merged Queries", "Currency Conversion Rate", {"Exchange rate"}, {"Exchange rate"}),
    #"Added Custom" = Table.AddColumn(#"Expanded Currency Conversion Rate", "Unit price in GBP", each [price per unit]*[Exchange rate]),
    #"Added Custom1" = Table.AddColumn(#"Added Custom", "Total in GBP", each [unit sold]*[price per unit]*[Exchange rate])
in
    #"Added Custom1"
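Given the case-sensitivity caveat above, a defensive sketch is to uppercase the key columns on both sides before the merge (the step names here are mine; they assume the queries named above):

```m
// Sketch: normalize case so "euro" still matches "EURO" in the merge.
// Assumes the #"Changed Type" step and the Currency Conversion Rate query above.
#"Upper Sales" = Table.TransformColumns(#"Changed Type",
    {{"Currency", Text.Upper, type text}}),
#"Upper Rates" = Table.TransformColumns(#"Currency Conversion Rate",
    {{"Currency Name", Text.Upper, type text}})
```

The merge step would then join #"Upper Sales" to #"Upper Rates" instead of the raw tables.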
I do Power BI for a logistics company. We want to show performance by stop location. The data is currently a table of all orders by Order ID, so -- ID, Rev $, Pickup Stop, Delivery Stop. Everything is a 2-stop load, fortunately.
What I am struggling with is building a calculated table that looks at the Pickup Stop AND the Delivery Stop at the same time while ALSO respecting filters set on the page. I would like the stops table to say something like: Stop Location, X Pickups, $X Pickup Revenue, X Deliveries, $X Delivery Revenue.
How would I go about this? I've tried a number of approaches but every time it either misses filters or can only handle one stop at a time.
Thanks!
Current data (call it Orders):
The calculated table I'm trying to make (call it Stops):
One method of creating your Stops, given your Orders is by using Power Query, accessed via Queries=>Transform Data on the Power BI Home Tab.
The Table.Group function is where the magic happens. Unfortunately, it needs to be done by coding in the Advanced Editor, as the UI does not provide for these custom aggregations.
When the PQ Editor opens: Home => Advanced Editor
The first three lines should be replaced by however you are reading in your own Orders table.
Paste the rest of the M code below in place of what follows your setup lines in your own query.
Read the comments and explore the Applied Steps to understand the algorithm
M Code
let
//Input data and set datatypes
//These lines should be replaced with whatever you need to
//set up your data table
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("bYzBCoMwEER/Zck5BxPRu1RaLJaW6qEQPIS4tEExoonQv+/a0oLQyw5vZnaUYuepxQmKnHFWO697uOKCQ0DiizVdGKHybiTKsbcLTs8PN1wxIZMooiR938z3evCawyFbKczeDhzq268qyBZpsg23f9+qJF+Skuwe1ui741CU/2djsmO53lJ3SFsth/3aPWrTzY7Kp4o1zQs=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Column1 = _t, Column2 = _t, Column3 = _t, Column4 = _t]),
#"Promoted Headers" = Table.PromoteHeaders(Source, [PromoteAllScalars=true]),
dataSource = Table.TransformColumnTypes(#"Promoted Headers",{
{"Order ID", Int64.Type}, {"Total Revenue", Int64.Type},
{"Pickup Stop", type text}, {"Delivery Stop", type text}}),
//Unpivot to get single column of Stops
#"Unpivoted Columns" = Table.UnpivotOtherColumns(dataSource, {"Order ID", "Total Revenue"}, "Attribute", "Stop"),
//Group by stop and do the aggregations
#"Grouped Rows" = Table.Group(#"Unpivoted Columns", {"Stop"}, {
{"Orders Picked Up", (t)=> List.Count(List.Select(t[Attribute], each _ = "Pickup Stop" )), Int64.Type},
{"Total Revenue Picked Up", (t)=> List.Sum(Table.SelectRows(t, each [Attribute]="Pickup Stop")[Total Revenue]), type number},
{"Orders Delivered", (t)=> List.Count(List.Select(t[Attribute], each _ = "Delivery Stop" )), Int64.Type},
{"Total Revenue Delivered", (t)=> List.Sum(Table.SelectRows(t, each [Attribute]="Delivery Stop")[Total Revenue]), type number}
})
in
#"Grouped Rows"
Orders
Stops