I have a table containing sales data, but in different currencies. I would like my sales data to show sales in GBP only, as well as total sales in GBP.
Input table of sales
Currency exchange rate table
Final Output table
I hope to find a solution to this; the sales are per date, so each conversion has to use the rate for that date.
Load the second table into Power Query (Data ... From Table/Range ... [x] headers), name the query Currency Conversion Rate in the upper right, and then File ... Close and Load To ... Only Create Connection
Load the first table into Power Query using the same method. Use any name.
Home ... Merge Queries, and choose the Currency Conversion Rate table from the drop-down at the bottom
Click Currency on top and Currency Name on the bottom to match them. Hold down the CTRL key and repeat for Invoice date and Exchange rate date. Leave the join kind as Left Outer. Hit OK to accept the other options
Use the arrows atop the new column to expand the [x] Exchange rate field
Add Column ... Custom Column ... and insert a formula to multiply the relevant fields, such as
= [price per unit]*[Exchange rate]
Add Column ... Custom Column ... and insert a formula to multiply the relevant fields
= [unit sold]*[price per unit]*[Exchange rate]
File ... Close and Load
I get that yours was a quick example, but note that Power Query is case sensitive, and your two table examples use euro in one table and EURO in the other, so normally they will not match on the merge. It is also sensitive to spelling, so using "curreny name" as a column header and then coding for "currency name" would not work.
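If your real data does mix cases like that, one hedged way to guard against it is to upper-case the currency column on both sides before the merge. A minimal sketch on made-up mini tables, reusing the column names from the example:

let
    // Minimal sketch: upper-case the currency columns on both sides so that
    // "euro" and "EURO" still match when the queries are merged.
    Sales = Table.FromRecords({[Currency = "euro", #"price per unit" = 10]}),
    Rates = Table.FromRecords({[#"Currency Name" = "EURO", #"Exchange rate" = 0.85]}),
    SalesUpper = Table.TransformColumns(Sales, {{"Currency", Text.Upper, type text}}),
    RatesUpper = Table.TransformColumns(Rates, {{"Currency Name", Text.Upper, type text}}),
    Merged = Table.NestedJoin(SalesUpper, {"Currency"}, RatesUpper, {"Currency Name"}, "Rates", JoinKind.LeftOuter)
in
    Merged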
Sample full code that could be dumped into Home ... Advanced Editor:
let Source = Excel.CurrentWorkbook(){[Name="Table2"]}[Content],
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Book ID", Int64.Type}, {"Invoice date", type datetime}, {"Currency", type text}, {"unit sold", Int64.Type}, {"price per unit", Int64.Type}}),
#"Merged Queries" = Table.NestedJoin(#"Changed Type", {"Invoice date", "Currency"}, #"Currency Conversion Rate", {"Exchange rate date", "Currency Name"}, "Currency Conversion Rate", JoinKind.LeftOuter),
#"Expanded Currency Conversion Rate" = Table.ExpandTableColumn(#"Merged Queries", "Currency Conversion Rate", {"Exchange rate"}, {"Exchange rate"}),
#"Added Custom" = Table.AddColumn(#"Expanded Currency Conversion Rate", "Unit price in GBP", each [price per unit]*[Exchange rate]),
#"Added Custom1" = Table.AddColumn(#"Added Custom", "Total in GBP", each [unit sold]*[price per unit]*[Exchange rate])
in #"Added Custom1"
I am looking at the cost of each position from the perspective of total benefits and earnings amount to arrive at the benefit ratio for each employee at our company. I've determined the benefit rate by year for each position. This was formatted as:
I then created a new query and pivoted my fiscal year and benefit rate data into columns.
However I still have duplicate values of the position #. Ideally, I want distinct values for my position # field and for it to format as:
| Position# | Benefit Rate 2017 | Benefit Rate 2018 |
| 00001581 | 20.17% | 21.58% |
| 00001852 | 35.00% | 40.50% |
What is the best solution? Is this something I can do in Power Query, or do I need to use DAX? Complicating things is that there are definitely null values in some years because positions aren't always filled.
I created a new table using Distinct to isolate my Position #
Then I tried using LookupValue:
2018 = LookupValue('PositionNumberBenefitRateByYear'[2018],PositionNumberBenefitRateByYear[Employee Position.Position Number], 'Table'[Employee Position.Position Number])
Where PositionNumberBenefitRateByYear holds the values that I want to pull over, and 'Table'[Employee Position.Position Number] is the related value between each table.
But I get this error: A table of multiple values was supplied where a single value was expected.
Please check my code:
You need to do the actual work in the PQ Editor (the LOOKUPVALUE error means more than one row in the lookup table matched the same position number). Here is the full M code:
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("bdBBDoUwCATQu3TdxQxtKZzFeP9r6Lf8qBE2LF5mEti2gnMISqlFIL+F2gQse/3j6BfSL2TT+UXBQuUjOYiVtKi1DGegtxuN/YU+9EaHPlGGe4IWCEtwnSLa8MU4RVQzZNTOnqAE2vmh/QA=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [#"Position Number" = _t, #"Fiscal Year" = _t, #"Benefit Ratio" = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Fiscal Year", Int64.Type}, {"Benefit Ratio", type number}}),
#"Pivoted Column" = Table.Pivot(Table.TransformColumnTypes(#"Changed Type", {{"Fiscal Year", type text}}, "tr-TR"), List.Distinct(Table.TransformColumnTypes(#"Changed Type", {{"Fiscal Year", type text}}, "tr-TR")[#"Fiscal Year"]), "Fiscal Year", "Benefit Ratio", List.Sum),
#"Renamed Columns" = Table.RenameColumns(#"Pivoted Column",{{"2022", "Benefit Rate 2022"}, {"2019", "Benefit Rate 2019"}, {"2020", "Benefit Rate 2020"}, {"2018", "Benefit Rate 2018"}, {"2017", "Benefit Rate 2017"}, {"2021", "Benefit Rate 2021"}})
in
#"Renamed Columns"
After the transformation steps listed above, your data should look like this:
Then go to the main screen, create a Table visual, and put the columns into the appropriate places. (Do not forget to format the values as percentages.) If you do, the resulting screen will look like this:
I hope this is what you are looking for!
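As for the null values you were worried about: Table.Pivot simply leaves missing position/year combinations as null, as this minimal sketch on made-up values shows:

let
    // Minimal sketch with made-up values: a position that is missing a year
    // simply pivots to null for that year's column.
    Source = Table.FromRecords({
        [#"Position Number" = "00001581", #"Fiscal Year" = "2017", #"Benefit Ratio" = 0.2017],
        [#"Position Number" = "00001581", #"Fiscal Year" = "2018", #"Benefit Ratio" = 0.2158],
        [#"Position Number" = "00001852", #"Fiscal Year" = "2018", #"Benefit Ratio" = 0.4050]  // no 2017 row for this position
    }),
    Pivoted = Table.Pivot(Source, List.Distinct(Source[#"Fiscal Year"]), "Fiscal Year", "Benefit Ratio", List.Sum)
in
    Pivoted  // 00001852 ends up with null under 2017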
I am trying to create a new table that contains a column with start of week and the hours estimated to be spent for that week on a per project basis.
| Start of Week | Project | Hours |
| 6/20/2022 | ABC_XXX | 10 |
| 6/27/2022 | ABC_XXX | 10 |
| 6/20/2022 | ABC_YYY | 40 |
| 6/27/2022 | ABC_YYY | 40 |
I have a table of dates representing the start of week for every project in the project table.
week start date = [date]-weekday([date],2)+1
| Start of Week |
| 6/20/2022 |
| 6/27/2022 |
| 7/4/2022 |
The project table contains (among other things) the project name, estimated start date, duration, and hours per week.
| Project Name | Estimated Start Date | Duration in weeks | Hours Per Week |
| ABC_XXX | 6/13/2022 | 8 | 10 |
| ABC_YYY | 6/04/2022 | 27 | 40 |
I am having trouble getting off the starting line. I know I need to evaluate on a per-project basis and loop through all of the dates in my date table, but I can't find a good method to start with. I have done plenty of simpler things creating new tables and calculations, but this one is a little more complicated for me to get started on. Any advice would be greatly appreciated.
The ultimate goal for this data is to present a trend showing estimated project demand over time that can be filtered by project or summed across all projects, as well as filtered by timeline and displayed in a calendar view, but it all starts with getting the data into this format, I believe.
Here's a Power Query solution. The steps in the code below are:
Use Date.AddWeeks to calculate the end date
List the dates between the two dates
Expand the list of dates and convert to date format
Use Date.DayOfWeek to create a day-of-week column
Filter the table for day of week = 1 to include only weekly values (starting on Monday)
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WcnRyjo+IiFDSUTLTNzTWNzIwMgKyLYDY0EApVgeiIDIyEqzABCZvZA4kTBAKoqKigALmCAVAOaAqpdhYAA==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [#"Project Name" = _t, #"Estimated Start Date" = _t, #"Duration in weeks" = _t, #"Hours Per Week" = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Project Name", type text}, {"Estimated Start Date", type date}, {"Duration in weeks", Int64.Type}, {"Hours Per Week", Int64.Type}}),
#"Added Custom1" = Table.AddColumn(#"Changed Type", "Estimated End Date", each Date.AddWeeks([Estimated Start Date],[Duration in weeks])),
#"Added Custom" = Table.AddColumn(#"Added Custom1", "dates", each {Number.From([Estimated Start Date])..Number.From([Estimated End Date])}),
#"Expanded Custom" = Table.ExpandListColumn(#"Added Custom", "dates"),
#"Changed Type1" = Table.TransformColumnTypes(#"Expanded Custom",{{"dates", type date}}),
#"Added Custom4" = Table.AddColumn(#"Changed Type1", "day of week", each Date.DayOfWeek([dates])),
#"Filtered Rows" = Table.SelectRows(#"Added Custom4", each ([day of week] = 1))
in
#"Filtered Rows"
I am having a problem in Power BI, where I want to get results for Actual vs Plan. I have 3 tables: a Book Sales table, a Plan table, and a Titles table. I would like the result as in the Desired Output table.
Book Sales
Titles
Plan
Desired Output
You can do this in Power Query M Code
Join the Sales and Titles tables to extract the publishing unit for each book type
Add a column representing the start of each month for each sales number
Group by BOM and Unit and aggregate with Sum in case you have multiple sales per unit in a month
Join this last table with the Plan table based on Unit and Date
M Code
let
//Join the Sales and Titles to get the Publishing units
Source = Table.NestedJoin(#"Book Sales", {"Book ID"}, Titles, {"Book Id"}, "Titles", JoinKind.LeftOuter),
#"Expanded Titles" = Table.ExpandTableColumn(Source, "Titles", {"Publishing Unit"}, {"Publishing Unit"}),
//add start of month column to merge with Plan
bom = Table.AddColumn(#"Expanded Titles", "BOM", each Date.StartOfMonth([Invoice Date]),type date),
//Group by unit and bom columns in case you happen to have multiple sales in the same month
#"Grouped Rows" = Table.Group(bom, {"Publishing Unit", "BOM"}, {{"Sales amount", each List.Sum([Sales amount]), type number}}),
//Join with Plan using bom and date as the key
salesVplan = Table.NestedJoin(Plan,{"Business Unit","Date"}, #"Grouped Rows", {"Publishing Unit","BOM"},"joined",JoinKind.FullOuter),
#"Expanded joined" = Table.ExpandTableColumn(salesVplan, "joined", {"Sales amount"}, {"Sales amount"})
in
#"Expanded joined"
This could be done in several different ways, but personally I would recommend doing it in SQL. This means that you can:
Select from all the tables you want to show into the same table. Let's call it Products.
Then join in the values that you want on that.
This way, there is no DAX calculation that takes time at query time; the time is spent only while you refresh the dataset itself.
I have a source table that has projects with a start date and a duration in months. I'm looking to write a Power Query for Power BI that will create a row for each month of the project, counting up the months. For example:
Source:
Project(string) | Date (ms timestamp) | Duration (integer)
A | Jan-2022 | 3
B | Sep-2022 | 2
Result:
Project | Date
A | Jan-2022
A | Feb-2022
A | Mar-2022
B | Sep-2022
B | Oct-2022
Not sure where to start or what this query should look like. Any ideas?
Edit: Changed sample tables to make them readable
Edit: Dates in the source table are provided in millisecond timestamp format (eg 1641024000000). My intent in the result table is to have them in a human-readable date format.
Here is one way to do this in Power Query.
Paste the code into a blank query.
Then change the Source line so that it loads your actual data table.
I used an Excel table for the source, but you may use whatever you like.
I also have the unix time stamp in the Source table, converting it to a PQ date in the M code.
If all of your time stamps do not equate to the start of the month, some additional logic may be required (see the Date.StartOfMonth sketch after the code).
Read the code comments and explore the Applied Steps to understand the algorithm.
let
//Read in the Source data
Source = Excel.CurrentWorkbook(){[Name="Table27"]}[Content],
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Project", type text}, {" Date", Int64.Type}, {" Duration", Int64.Type}}),
//convert date from unixTime in milliseconds to a PQ date
unixTime = Table.TransformColumns(#"Changed Type",{" Date", each #duration(0,0,0,_/1000)+#date(1970,1,1)}),
//add custom column with a List of the desired dates
#"Added Custom" = Table.AddColumn(unixTime, "Months", each
List.Accumulate(
{0..[#" Duration"]-1},
{},
(state,current)=> state & {Date.AddMonths([#" Date"],current)})),
//Remove unneeded columns
//Expand the list and set the data type
#"Removed Columns" = Table.RemoveColumns(#"Added Custom",{" Date", " Duration"}),
#"Expanded Months" = Table.ExpandListColumn(#"Removed Columns", "Months"),
#"Changed Type1" = Table.TransformColumnTypes(#"Expanded Months",{{"Months", type date}})
in
#"Changed Type1"
For some reason SQL Fiddle was down for me, so I made an example in db-fiddle using Postgres instead of MS SQL.
What you're looking to accomplish can be done with a recursive CTE; the syntax in MS SQL is slightly different, but this should get you most of the way there.
WITH RECURSIVE project_dates AS(
SELECT
start_date as starting_date,
CAST(start_date + duration*INTERVAL '1 month' as date) as end_date,
project
FROM projects
UNION
SELECT
CAST(starting_date + INTERVAL '1 month' as date),
pd.end_date,
p.project
FROM projects p
JOIN project_dates pd ON pd.project = p.project
WHERE CAST(starting_date + INTERVAL '1 month' as date) < pd.end_date
)
SELECT starting_date, project FROM project_dates
ORDER BY project, starting_date
My results using your data look like this.
You can check out my answer on db-fiddle with this link: https://www.db-fiddle.com/f/iS7uWFGwiMbEmFtNmhsiWt/0
Try the below:
Divide your milliseconds by 86400000 and add that to 1/1/1970 to get the date
Create an array based on Duration, expand it to rows, and add it to the start date
Remove the extra columns
let Source = Excel.CurrentWorkbook(){[Name="Table1"]}[Content],
ConvertToDays = Table.TransformColumns(Source,{{"Date", each Number.RoundDown(Number.From(_) / 86400000)}}),
#"Added Custom" = Table.AddColumn(ConvertToDays, "Custom", each Date.AddDays(#date(1970,1,1),18993)),
#"Added Custom1" = Table.AddColumn(#"Added Custom", "Custom.1", each List.Numbers(0,[Duration])),
#"Expanded Custom.1" = Table.ExpandListColumn(#"Added Custom1", "Custom.1"),
#"Added Custom2" = Table.AddColumn(#"Expanded Custom.1", "Custom.2", each Date.AddMonths([Custom],[Custom.1]), type date),
#"Removed Columns" = Table.RemoveColumns(#"Added Custom2",{"Date", "Duration", "Custom", "Custom.1"}),
#"Renamed Columns" = Table.RenameColumns(#"Removed Columns",{{"Custom.2", "Date"}}),
TextDate = Table.AddColumn(#"Renamed Columns", "TextDate", each Date.ToText([Date],"MMM-yy"))
in TextDate
I do Power BI for a logistics company. We want to show performance by stop location. The data is currently a table of all orders by Order ID, so -- ID, Rev $, Pickup Stop, Delivery Stop. Everything is a 2-stop load, fortunately.
What I am struggling with is building a calculated table that looks at the Pickup Stop AND the Delivery Stop at the same time while ALSO respecting filters set on the page. I would like the stops table to say something like: Stop Location, X Pickups, $X Pickup Revenue, X Deliveries, $X Delivery Revenue.
How would I go about this? I've tried a number of approaches but every time it either misses filters or can only handle one stop at a time.
Thanks!
Current Data (call it Orders)
The calculated table I'm trying to make (call it Stops)
One method of creating your Stops table, given your Orders table, is to use Power Query, accessed via Queries => Transform Data on the Power BI Home tab.
The Table.Group function is where the magic happens. Unfortunately, it needs to be done by coding in the Advanced Editor, as the UI does not provide for these custom aggregations.
When the PQ Editor opens: Home => Advanced Editor
The first three lines should be replaced by however you read in your own Orders table.
Paste the rest of the M code below in place of everything after your setup lines in your own query.
Read the comments and explore the Applied Steps to understand the algorithm
M Code
let
//Input data and set datatypes
//These lines should be replaced with whatever you need to
//set up your data table
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("bYzBCoMwEER/Zck5BxPRu1RaLJaW6qEQPIS4tEExoonQv+/a0oLQyw5vZnaUYuepxQmKnHFWO697uOKCQ0DiizVdGKHybiTKsbcLTs8PN1wxIZMooiR938z3evCawyFbKczeDhzq268qyBZpsg23f9+qJF+Skuwe1ui741CU/2djsmO53lJ3SFsth/3aPWrTzY7Kp4o1zQs=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Column1 = _t, Column2 = _t, Column3 = _t, Column4 = _t]),
#"Promoted Headers" = Table.PromoteHeaders(Source, [PromoteAllScalars=true]),
dataSource = Table.TransformColumnTypes(#"Promoted Headers",{
{"Order ID", Int64.Type}, {"Total Revenue", Int64.Type},
{"Pickup Stop", type text}, {"Delivery Stop", type text}}),
//Unpivot to get single column of Stops
#"Unpivoted Columns" = Table.UnpivotOtherColumns(dataSource, {"Order ID", "Total Revenue"}, "Attribute", "Stop"),
//Group by stop and do the aggregations
#"Grouped Rows" = Table.Group(#"Unpivoted Columns", {"Stop"}, {
{"Orders Picked Up", (t)=> List.Count(List.Select(t[Attribute], each _ = "Pickup Stop" )), Int64.Type},
{"Total Revenue Picked Up", (t)=> List.Sum(Table.SelectRows(t, each [Attribute]="Pickup Stop")[Total Revenue]), type number},
{"Orders Delivered", (t)=> List.Count(List.Select(t[Attribute], each _ = "Delivery Stop" )), Int64.Type},
{"Total Revenue Delivered", (t)=> List.Sum(Table.SelectRows(t, each [Attribute]="Delivery Stop")[Total Revenue]), type number}
})
in
#"Grouped Rows"
Orders
Stops