I was having a candid conversation with Phil Seamark from DAX.tips about Aggregation tables. During that conversation, I asked about patterns for using Aggregation tables. Within that 10-minute conversation I was blown away by all the possible patterns. Because of this, we pleaded with Phil to present these patterns to the Milwaukee Brew City User Group.
While these patterns are described in detail in the linked articles, here is a summary of the various patterns that can be used for Aggregation tables. Phil also includes a great introduction, found here. In each of these articles Phil describes proper usage for the pattern.
The Filtered Aggs pattern would contain multiple Aggregation tables of the same data, but each Agg table could contain a different grain of data. For example, data aggregated by Week, Month, or Year.
Finally, the Incremental Aggs. This type of aggregation would be used when aggregating transactions per day by store and/or product.
Thanks Phil
A special thanks to Phil for presenting. We know you are a busy guy doing tons of great work, so thank you for taking time out of your day to present this wonderful topic. We hope you enjoy this exploration into Agg tables.
If you like the content from PowerBI.Tips please follow us on all the social outlets. Stay up to date on all the latest features and free tutorials. Subscribe to our YouTube Channel. Or follow us on the social channels, Twitter and LinkedIn where we will post all the announcements for new tutorials and content.
Introducing our PowerBI.tips SWAG store. Check out all the fun PowerBI.tips clothing and products:
Understand more about your model than you thought possible! This is part two in our series on DAX Studio. If you missed part one, be sure to check out the Introduction to DAX Studio. In this session Marco Russo shares the fundamentals of model performance in DAX Studio and dives into key areas that you absolutely need to understand. Marco is an accomplished trainer and you won't find a better resource to guide you on this journey.
Using DAX Studio in conjunction with VertiPaq Analyzer, Marco shows us how to discover and explore key statistics about our models. Use VertiPaq Analyzer against your open Power BI model or any tabular model. This tool exports only the tabular metadata, not the data! That means we can do offline analysis on a model we don't have access to by having someone with permissions export these metrics. Better yet, we can download this into Excel and everything gets broken down for you, all the way to your relationships and measures… AMAZING!
If you like the content from PowerBI.Tips, please follow us on all the social outlets to stay up to date on all the latest features and free tutorials. Subscribe to our YouTube Channel, and follow us on Twitter where we will post all the announcements for new tutorials and content. Alternatively, you can catch us on LinkedIn (Seth) LinkedIn (Mike) where we will post all the announcements for new tutorials and content.
As always, you’ll find the coolest PowerBI.tips SWAG in our store. Check out all the fun PowerBI.tips clothing and products:
Darren Gosbell & Marco Russo join PowerBI.Tips in a 4-part series on how and why to use DAX Studio! They show us why DAX Studio is the ultimate tool for working with DAX queries!
In this first session we are joined by Darren who gives us a little history on DAX Studio and how we can get started with downloading the tool and getting it up and running. He covers many different areas of the tool including connecting to our various models, the basic functions of the tool, how you can output the results, and everything you need to know to get started. Be sure to check out this video and all the rest in the series.
If you like the content from PowerBI.Tips please follow us on all the social outlets. Stay up to date on all the latest features and free tutorials. Subscribe to our YouTube Channel. Or follow us on the social channels, Twitter and LinkedIn where we will post all the announcements for new tutorials and content.
Introducing our PowerBI.tips SWAG store. Check out all the fun PowerBI.tips clothing and products:
This article examines using the advanced editor in Power Query to better handle when things go wrong. It will also allow custom actions to be triggered using Power Automate, allowing the rest of the data to be refreshed while alerting the owner by email of data source errors.
Note that this article requires editing Power Query (M) code. If you need a refresher, consider reading this article: Query Editor – Editing M Code
Data Sources
Data can be messy. Thankfully, Power Query gives us an excellent and efficient way to extract, transform and load data so we can shape it the way we want. It is extremely versatile and can connect to a huge host of data sources, including third-party sources as well as internal data.
While beneficial, we need to ask what happens if the third-party source suddenly shuts down or changes its access policies. What if there is bad data recorded in the on-premises Excel sheets or databases? An error in any Power Query step can stop the entire report refreshing. We can opt to receive alerts on the service, but these can be unspecific and require us to dig deep into the report.
The technique laid out here allows us to receive a specific alert identifying the exact step where the error occurred. What's more, we can ensure it won't break our queries, and the rest of the data will continue to refresh.
Step 1 – Base Query
First we need to set up the query that we want to error handle. For this example, I’m going to send a web request to get some information about the Microsoft stock price. For more information on this API or to pull other stock data, check out this article.
Open Power Query and Select the New Source > Web on the Home ribbon.
This will automatically return a JSON object and parse it for us.
Note: This link returns 5 years of daily historical stock prices.
For simplicity, I will just return the metadata to ensure the API call is working. The automatic parsing will return a table with clickable elements. To drill into the JSON, we click through the following steps:
chart:Record > result:list > Record > Meta:Record
Note: See sample of nested structure below for chart:Record
Once we have expanded all the way down to the Meta level, Press the Convert Into Table Button found on the newly appeared Convert ribbon.
Here is the final code, which you can see by clicking the Advanced Editor button on the Home ribbon:
let
Source = Json.Document(Web.Contents("https://query1.finance.yahoo.com/v8/finance/chart/MSFT?range=5y&interval=1d")),
chart = Source[chart],
result = chart[result],
result1 = result{0},
#"meta" = result1[meta],
#"Converted to Table" = Record.ToTable(#"meta")
in
#"Converted to Table"
Rename this “Stock Query” by editing the name in the properties pane on the right.
Step 2 – Create the flow
Next we create the Power Automate flow that will alert us something is wrong. Navigate to the Power Automate website. Once logged in, Click on the new Instant flow button.
Give the flow an appropriate name. For the trigger, select the option When an HTTP request is received. Next press the Create button to make the flow.
Once we save, the flow will supply us with a URL. This URL will trigger the flow any time it is visited. You can use any action you want, but I will have the flow send me an email to let me know the refresh failed.
I’ll search for the item Send an email (V2). Make sure you fill in the email address you want to send it to, and write an appropriate message and subject.
That's our flow! Hit Save. After you have saved, click on the When an HTTP request is received step to expand it. You'll see that a URL has been generated. Copy the link using the Copy button on the right. You'll need it later in Power BI.
Step 3 – Make the Flow Trigger Query
Next, set up the query in Power BI to call this flow. In Power Query, make a new query by selecting New Source > Web. Paste in the Web URL you copied in the previous step. Open the advanced editor. Inside you see the code uses the Web.Contents() function. You’ll need to copy this code in a later step.
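Assuming it was created through the Web connector as described, the generated query is a single Web.Contents call; a sketch (with "RestofURL" standing in as a placeholder for the rest of your generated flow URL) looks like this:

```m
let
    // "RestofURL" is a placeholder; use the full URL copied from your flow
    Source = Web.Contents("https://prod-68.westus.logic.azure.com:443/workflows/RestofURL")
in
    Source
```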
This will send an email as it runs the query. For testing, if you press the Refresh Preview icon, you can trigger the action to run again. If you don't want to wait for the email or chose a different action, you can check whether the flow ran on the Power Automate site. Click My Flows on the left, open the flow and scroll down to Runs. Press the refresh button in the Runs section to check when the flow last ran.
Step 4 – Set up the Error Handling Code
Now we need to add the logic to have the Power Automate flow run on error, which is going to take a little coding.
Back in Power Query, start by adding a blank query by clicking New Source > Blank Query on the Home Ribbon. Next, open the Advanced Editor and the code should look like this:
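A brand new blank query contains just an empty Source step:

```m
let
    Source = ""
in
    Source
```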
Now we will employ the try … otherwise statement. This is Power Query’s error handing statement. We can add a placeholder for now. Replace the step:
Source = ""
with the following code:
Source = try 1+1 otherwise "error"
How this works
Any code between the try and otherwise keywords will be attempted. If it succeeds, the result is returned and the next step is evaluated, in this case the number 2. If it returns an error, however, the result is discarded and the query after the word otherwise is returned instead, in this case the word "error".
We can add this statement as the Source step. I'll also wrap both expressions in parentheses, which will come in handy as we add more steps. It's important to keep good syntax to make it readable, so here is my code:
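The placeholder query, with each branch wrapped in parentheses, looks like this:

```m
let
    Source =
        try
        (
            1+1
        )
        otherwise
        (
            "error"
        )
in
    Source
```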
As 1+1 is valid, this will return the value 2. If you change the 1+1 to 1+"A", this is no longer valid, so it will return the word error.
Now we can see it's working, delete everything between the parentheses in both the try and the otherwise statements to set up for the next step. Do not worry if you get an error, as we will be adding code.
Step 5 – Update the Error Handling Code
Now we've got the placeholder set up, we can copy our previous code into each branch. Open the Stock Query we made in steps 1 and 2. Click Advanced Editor and copy the entire code. Now come back to the try...otherwise query, open the Advanced Editor, and make sure you delete anything between the parentheses in the try statement if you did not before. Paste the entire code you just copied.
Next, go to the Flow Error Query, open the Advanced Editor and copy all the text. Go back to the try...otherwise query and paste everything between the two parentheses of the otherwise statement.
Step 6 – Tidying the Code Up
The URLs are prone to change, so it is good practice to separate them out as variables. To do this, you can simply declare them at the beginning of the query as steps. Here, add the URLs to variables called webURL and flowURL. To add webURL, at the beginning of the query after the let keyword, add the line:
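The line to add (using the working URL from the base query in Step 1):

```m
webURL = "https://query1.finance.yahoo.com/v8/finance/chart/MSFT?range=5y&interval=1d",
```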
Don't forget the comma at the end. Do the same with flowURL. Then, replace the URLs in the code with the variable names.
Additionally, it will help to change the word Source outside the try..otherwise to Output. This makes it easier to follow, as we have several steps called source. Also update it after the keyword in.
Output:
Final code to copy (you will need to update it with your correct URLs):
let
flowURL = "https://prod-68.westus.logic.azure.com:443/workflows/RestofURL",
webURL = "https://query1.finance.yahoo.com/v8/finance/chart/MSFaaT?range=5y&interval=1d",
Output =
try
(
let
Source = Json.Document(Web.Contents(webURL)),
chart = Source[chart],
result = chart[result],
result1 = result{0},
#"meta" = result1[meta],
#"Converted to Table" = Record.ToTable(#"meta")
in
#"Converted to Table"
)
otherwise
(
let
Source = Web.Contents(flowURL)
in
Source
)
in
Output
Optional Step 7.1 – Adding POST Parameter – Flow
Now we have set up our workflow, it may be useful to reuse this on multiple datasets. Instead of setting up a different flow for each dataset, we can allow the flow to accept inputs, such as the dataset name.
Navigate back to the Power Automate site and, on the flow we set up previously, click Edit.
Open the When an HTTP request is received step and paste the following code into the Request Body JSON Schema box.
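The schema was shown as an image in the original post; a minimal Request Body JSON Schema that declares the datasetName property (an assumption based on how the variable is used below) would be:

```json
{
    "type": "object",
    "properties": {
        "datasetName": {
            "type": "string"
        }
    }
}
```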
Next, expand the advanced options and change the method to POST.
This will create a variable called datasetName. This is the name of the dataset we will pass from Power BI. We can use datasetName in the email to inform us which dataset had an error. The variable will appear in a list when you click to edit the email message – click on the name to add it.
In addition, I added the expression utcNow(). You can find this by searching in the Expression tab of the pop up. This just displays the time it is sent. Thus we can see in the email when it failed. Notice the variable from the first step is green, while the expression is pink. Personalize this message as you wish.
Optional Step 7.2 – Adding POST Parameter – Power BI
The final step is to pass this in from Power BI. In our query, open the advanced editor and add three more parameters before webURL and flowURL:
datasetName is the name of the dataset we pass into the flow, which will ultimately appear in the email. Right now it's set to Stock Query, but this is what we edit for each dataset that uses this query, so each can pass a different name.
content and headers build the correct format to pass into the API call. This will create a POST request and provide the correct format.
Now we can edit the Web.Contents(flowURL) step to include our parameters:
Final code to copy (you will need to update it with your correct URLs):
let
datasetName = "Stock Query",
content = " { ""datasetName"": """ & datasetName & """}",
headers = [ #"Content-Type"="application/json"],
flowURL ="https://prod-68.westus.logic.azure.com:443/workflows/RestofURL",
webURL = "https://query1.finance.yahoo.com/v8/finance/chart/MSFaaT?range=5y&interval=1d",
Output =
try
(
let
Source = Json.Document(Web.Contents(webURL)),
chart = Source[chart],
result = chart[result],
result1 = result{0},
#"meta" = result1[meta],
#"Converted to Table" = Record.ToTable(#"meta")
in
#"Converted to Table"
)
otherwise
(
let
Source = Web.Contents(flowURL, [Headers = headers, Content=Text.ToBinary(content)])
in
Source
)
in
Output
Limitations and Considerations
This technique uses premium Power Automate features and a valid license is required. However, only one service account license is needed.
This query has been designed to return blank data if the query fails. This could break your model if this data is required.
This blog does not examine securing the Power Automate endpoint, so be aware that if the URL were discovered, people could execute the end action (in this case, sending an email).
If you like the content from PowerBI.Tips please follow us on all the social outlets to stay up to date on all the latest features and free tutorials. Subscribe to our YouTube Channel. Or follow us on the social channels, Twitter and LinkedIn where we will post all the announcements for new tutorials and content.
Introducing our PowerBI.tips SWAG store. Check out all the fun PowerBI.tips clothing and products:
This post will walk through how to pull an estimated household income from a US address. It will be completed all in the Power Query Editor. We will convert this to a function to reuse on any address we want.
This is the second part in a series on free API calls. The first part demonstrated how to pull daily stock price from Yahoo! Finance, which is available here.
Note: The content in this blog was first presented at the Power Platform Summit North America on October 18th, 2019.
Introduction to the Data
We cannot get the exact income of a US address, but the US census releases data aggregated over different groups of households (called Geography Hierarchies). There are different levels of Geography Hierarchies, explained on the census website. The lowest level of granularity available for this data is the Block Group. This is a small group of blocks, usually consisting of a few hundred to over a thousand people. A block group is expressed as a code.
It is unlikely we will have the actual block group code; rather, we'll have a street address. In order to retrieve information, we need to find which block group the address falls into. The first step is to convert the address into geographic coordinates, a process called Geocoding. You are probably familiar with this feature, for example when you enter an address into a map app in order to get directions.
The census.gov website offers a free service to do this (US Census TIGER). However, in my experience the match rate (the percentage of addresses successfully geocoded) is not as good as some other services, meaning some of the addresses you enter will not be found. Many other companies offer Geocoding services, such as Bing, and often provide better match rates; however, these tend to come at a cost. They can be used instead, but for this example we will focus on the free US Census TIGER service.
Create an API Key
The first step is to sign up for an API key on the census website. API keys let organizations monitor usage and stop requests from overloading their databases. Some are used to charge for paid API calls; however, the census API is free and only requires an email address. If you do not work for an organization, you can write "none".
Back in Power BI, open Power Query and, on the Home ribbon, click New Source > Blank Query.
In the formula bar, paste in the API key value you received. Rename the query P_APIKEY. This stores the API key somewhere it can easily be changed, without making it a parameter the user would have to enter every time they use the function.
Setting up Parameters
In the power query window, under the Home ribbon, Click the bottom half of the Manage Parameters button. From the drop down Select the option New Parameter.
Name the parameter P_Address. Change the Type field to Text. Enter 15010 NE 36th St in the Current Value input box.
Repeat this step 3 more times, so you have 4 parameters in total. Use the table below for the names and default values:
Name        Default Value
P_Address   15010 NE 36th St
P_City      Redmond
P_State     WA
P_ZIP       98052
Function Part 1: Geocoding
On the Home tab, click New Source > Web. Switch to the Advanced tab. In the dialogue box, first click the Add part button; this adds a new box. Repeat this until you have 9 web parts. Then, locate the first window and enter part 1 of the URL. In the second box, change the abc symbol to Parameter. Fill in the boxes like below:
Note: Do not type the rows that begin with P_ directly (rows 2, 4, 6 and 8). Switch the type to Parameter and pick each from the list.
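For reference, the nine parts assemble into the following URL expression (taken from the final function code at the end of this post):

```m
"https://geocoding.geo.census.gov/geocoder/geographies/address?street="
    & P_Address
    & "&city=" & P_City
    & "&state=" & P_State
    & "&zip=" & P_ZIP
    & "&benchmark=Public_AR_Census2010&vintage=Census2010_Census2010&layers=13&format=json"
```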
This will return a result: Record. Click on the Record value to drill into it. In the convert tab, click To Table to transform to a table. We have extra information here, but we are only interested in the blockgroup data. Filter the column Name to include the following rows only: “BLKGRP” , “COUNTY”, “STATE” , “TRACT” (make sure you keep the correct columns with these exact names).
Now we have the correct columns, but for our function to work we want them in one row. Highlight the Name column, navigate to the Transform tab and click Pivot Column. The symbol is:
Expand the Advanced Options, and change the Aggregate Value Function to Don’t Aggregate.
If you are following with the Microsoft head office, your data should look like this:
Now that we have the address geocoded, we can find out census information.
Function Part 2: Returning Household Income
To add the second API call, we can take advantage of the custom column. It is actually possible to use Power Query (M) code in the custom column. Click Add Column then Custom Column.
The code needed for the column is below. You can copy and paste this directly into the “Custom column formula” :
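The formula, as it appears in the final function at the end of this post:

```m
Json.Document(
    Web.Contents(
        "https://api.census.gov/data/2018/pdb/blockgroup?get=State_name,County_name,Tot_Population_CEN_2010,avg_Agg_HH_INC_ACS_12_16"
        & "&for=block%20group:" & [BLKGRP]
        & "&in=state:" & [STATE]
        & "%20county:" & [COUNTY]
        & "%20tract:" & [TRACT]
        & "&key=" & P_APIKEY
    )
)
```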
Breaking this code down:
- Json.Document tells Power BI the value being returned is in JSON format, and to decode this to a list
- Web.Contents tells Power BI we are going to be sending a web (API) query
- https://api.census.gov/data/2018/pdb/blockgroup?get=Tot_Population_CEN_2010,avg_Agg_HH_INC_ACS_12_16 is our base URL with some parameters. The parameters we're requesting are population and average household income
- The second half of the URL takes in the location from the current row. Anything in [Square Brackets] is a column
- P_APIKEY is the parameter we set up earlier, which holds our API key
This will add a new column holding a list of lists. Click on the word list to drill into it.
This will bring us to the second level: two lists. Transform this to a table by clicking the To Table button in the ribbon, under List Tools > Transform. Once it is in table format, we can expand the values out of the lists. The data has two lists, the first holding the column headers and the second the values, so we need a little transformation to see the data in a nice format. First, expand the values by clicking the expand button and selecting Extract Values.
This will expand to one column, separated by a character of our choice. I'm going to use the caret symbol (^), as I'm confident this will not be a character already in the data. To do this, change the first box to –Custom– then type in the caret symbol.
After extracting the data from the lists, we can split it into columns. We will split on the delimiter we added, the caret symbol (^). Click Home > Split Column > By Delimiter.
Same as before, change the first box to –Custom– then type in the caret symbol. Leave the default setting, "At each occurrence of the delimiter".
Now it is split, promote the first row to headers by going to the Transform tab and clicking "Use first row as headers".
Finalizing the Function
As a result, this returns lots of columns. Highlight the Tot_Population_CEN_2010 and avg_Agg_HH_INC_ACS_12_16 columns, right click and select "Remove other columns". Rename the two columns to "Population" and "Household Income" by double clicking each column header and typing the new name. Finally, highlight both columns, go to the Transform tab and click "Detect Data Type" to convert the data types.
Optional: If you want to add the original address in, Click Add Column then Custom Column. Enter the below code, which will concatenate the address to one value. You can name the column Address.
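The concatenation code was shown as an image in the original post; a minimal sketch (the exact separators are my assumption) would be:

```m
P_Address & ", " & P_City & ", " & P_State & " " & P_ZIP
```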
This query uses parameters, which enables us to convert it to a function. To do this, right click on the query in the Queries pane on the left and select Create Function.
Now we have a function where we can input any address and return the estimated household income, as well as the population the average is taken from (the n value). To check multiple addresses, you can apply the function to any list of addresses: on the Add Column ribbon, click Invoke Custom Function. This will return a table for each row. Before expanding, it is important to handle errors, otherwise they could break the query. One option is to right click the column header, select Replace Errors, and type the text null.
Final Function
For those who like M code, here is the final function. You can copy and paste this directly into the advanced editor (See this article on how to do this).
let
Source = (P_Address as any, P_City as any, P_State as any, P_ZIP as text) => let
Source = Json.Document(Web.Contents("https://geocoding.geo.census.gov/geocoder/geographies/address?street=" & P_Address & "&city=" & P_City & "&state=" & P_State & "&zip=" & P_ZIP & "&benchmark=Public_AR_Census2010&vintage=Census2010_Census2010&layers=13&format=json")),
result = Source[result],
addressMatches = result[addressMatches],
addressMatches1 = addressMatches{0},
geographies = addressMatches1[geographies],
#"Census Blocks" = geographies[Census Blocks],
#"Census Blocks1" = #"Census Blocks"{0},
#"Converted to Table" = Record.ToTable(#"Census Blocks1"),
#"Filtered Rows1" = Table.SelectRows(#"Converted to Table", each ([Name] = "BLKGRP" or [Name] = "COUNTY" or [Name] = "STATE" or [Name] = "TRACT")),
#"Filtered Rows" = Table.SelectRows(#"Filtered Rows1", each ([Name] = "BLKGRP" or [Name] = "COUNTY" or [Name] = "STATE" or [Name] = "TRACT")),
#"Pivoted Column" = Table.Pivot(#"Filtered Rows", List.Distinct(#"Filtered Rows"[Name]), "Name", "Value"),
#"Added Custom" = Table.AddColumn(#"Pivoted Column", "Custom", each Json.Document(Web.Contents("https://api.census.gov/data/2018/pdb/blockgroup?get=State_name,County_name,Tot_Population_CEN_2010,avg_Agg_HH_INC_ACS_12_16&for=block%20group:" & [BLKGRP] & "&in=state:" & [STATE] & "%20county:" & [COUNTY] & "%20tract:" & [TRACT] & "&key=" & P_APIKEY))),
Custom = #"Added Custom"{0}[Custom],
#"Converted to Table1" = Table.FromList(Custom, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
#"Extracted Values" = Table.TransformColumns(#"Converted to Table1", {"Column1", each Text.Combine(List.Transform(_, Text.From), "^"), type text}),
#"Split Column by Delimiter" = Table.SplitColumn(#"Extracted Values", "Column1", Splitter.SplitTextByDelimiter("^", QuoteStyle.Csv), {"Column1.1", "Column1.2", "Column1.3", "Column1.4", "Column1.5", "Column1.6", "Column1.7", "Column1.8"}),
#"Changed Type" = Table.TransformColumnTypes(#"Split Column by Delimiter",{{"Column1.1", type text}, {"Column1.2", type text}, {"Column1.3", type text}, {"Column1.4", type text}, {"Column1.5", type text}, {"Column1.6", type text}, {"Column1.7", type text}, {"Column1.8", type text}}),
#"Promoted Headers" = Table.PromoteHeaders(#"Changed Type", [PromoteAllScalars=true]),
#"Changed Type1" = Table.TransformColumnTypes(#"Promoted Headers",{{"State_name", type text}, {"County_name", type text}, {"Tot_Population_CEN_2010", Int64.Type}, {"avg_Agg_HH_INC_ACS_12_16", Currency.Type}, {"state", Int64.Type}, {"county", Int64.Type}, {"tract", Int64.Type}, {"block group", Int64.Type}}),
#"Removed Other Columns" = Table.SelectColumns(#"Changed Type1",{"Tot_Population_CEN_2010", "avg_Agg_HH_INC_ACS_12_16"}),
#"Renamed Columns" = Table.RenameColumns(#"Removed Other Columns",{{"Tot_Population_CEN_2010", "Population"}, {"avg_Agg_HH_INC_ACS_12_16", "Household Income"}})
in
#"Renamed Columns"
in
Source
If you like the content from PowerBI.Tips please follow us on all the social outlets to stay up to date on all the latest features and free tutorials. Subscribe to our YouTube Channel. Or follow us on the social channels, Twitter and LinkedIn where we will post all the announcements for new tutorials and content.
Introducing our PowerBI.tips SWAG store. Check out all the fun PowerBI.tips clothing and products:
This post will walk through how to pull daily stock price from Yahoo! Finance, then transform the data using a technique called a query branch. It will be completed all in the Power Query Editor. We will convert this to a function to reuse on any stock we want.
There are many APIs that pull historical stock prices. Many come with a cost to get this information in a decent format. The technique described here is free but requires some data transformations to get the data into a usable format. The purpose is to explore parameters, web URLs and query branches to design a usable function. If you're just interested in pulling stock information, skip to the end to grab the M code, but make sure you read the performance considerations.
Note: The content in this blog was first presented at the Power Platform Summit North America on October 18th, 2019.
Getting Started
This blog will use parameters to create functions in Power Query. Some experience using the Power Query editor may be helpful, specifically:
- Knowledge of tools such as merge and append queries
- Familiarity with query steps and the formula bar
For a detailed look at parameters or if you need to brush up, check out this post on parameters.
Before starting, you will need to ensure the formula bar in the query editor is open.
Open the Power Query Editor by Clicking the Edit Queries on the Home ribbon of Power BI desktop. Then, go to the View ribbon in the Query Editor and make sure the check box for Formula Bar is turned on.
Create the Parameter
First, Create a Parameter. This is a value that we can change and feed into our query, in this case the stock symbol.
In the power query window, under the Home ribbon, Click the bottom half of the Manage Parameters button. From the drop down Select the option New Parameter.
In the Name section, enter the text StockSymbol (without spaces – this makes it much easier to reference later). Give it a description if you like. If you share this report other people can read the description to understand what the parameter is being used for. Change the Type field to Text. Enter MSFT in the Current Value input box. By the way, MSFT is the stock symbol for Microsoft.
Making the Query
Now we have set up a parameter, we can use it to pull in some data. The data source is going to be a web URL. In Power Query editor window, Click the Home ribbon and the button Get Data. Select the item Web in the drop down. In the popup dialogue, Click on the button labeled Advanced.
You’ll notice this brings up a dialog called URL Parts. This allows us to break down the URL into parts. We can easily change individual parts of the URL using this dialog. It will then concatenate it all back together in the order it is entered. Our URL to connect to Yahoo! for a single stock will be split into three parts.
The base URL, which points us to Yahoo! Finance website and the correct data
The stock symbol, in this case will be our parameter
Our other settings (range and interval). These could also be changed in Power BI with a parameter, but we do not want to for this example
In the open dialogue box, first click the Add part button to add a new box. In the first box, enter part 1 of the URL. In the second box, change the abc symbol to Parameter and make sure StockSymbol is selected. In the third box, enter part 3 of the URL. We're setting the range to 5y (5 years of data) and the interval to 1d (daily). You can change these at a later time if you want.
Note: It is important to remember that StockSymbol is a parameter – change the symbol to Parameter and select it from the drop down. Do not type StockSymbol into the box.
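The three parts assemble into the following URL expression (part 2 is the StockSymbol parameter):

```m
"https://query1.finance.yahoo.com/v8/finance/chart/"
    & StockSymbol
    & "?range=5y&interval=1d"
```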
Now Hit the button labeled OK. The request will be sent and returned to us in a JSON format.
Rename the query Stock Value. You can edit the name above the Applied Steps section on the right.
Making the Query Branch
The data returned is a little messy and not in the best format. We need to drill down and pull out the appropriate bits of information. Start by drilling down to the correct information. To drill down, click the underlined result parts in the following order: chart: Record > result: List > 1: Record
Your screen should look like the image below. If it doesn’t, simply delete the navigation step and start again.
Here, we are presented with three options of paths to drill down further:
- meta: holds some info about the stock, as well as the timeframe and granularity we chose
- timestamp: a list of the dates in the range we selected
- indicators: holds the price information of the stock
Right now, the dates and the prices are in two different lists. The price information is another layer down from the dates, which makes this more complicated. Extracting these individually would result in a list of prices and a big list of dates, which is not helpful if these two pieces of information are not kept together.
To solve, we will create a Query Branch. The branch will split our query at this step into two paths. One will retrieve the dates, the other the prices. Then we will merge these branches back together to get the dates and prices in the same table.
To start this branch, right click on the Navigation step, then select Insert Step After from the drop-down menu. This will reference the previous step and show the same data. This newly created step is the start of the branch. Rename it StartBranch.
Note: the reason for this reference is that the “Navigation” step is not really a step at all. It is actually a collection of steps that the Power Query editor groups together. You cannot reference “Navigation”, which will be needed later. You’ll see you cannot rename the Navigation step, and if you open the Advanced Editor you can see the breakdown of individual steps. Another option is to perform any action after the Source step, before you drill down. This will cause Power Query to list each drill-down step individually.
Branch 1: Dates
In our first branch we will pull the dates.
Click on timestamp: List. This will drill down to a list of dates, but they are stored in a UNIX format. UNIX date format is the number of seconds past January 1, 1970 (midnight UTC/GMT), not counting leap seconds. Converting this is quite easy but will take a couple of steps.
First convert the list to a table so we can perform transformations. Click on the Transform ribbon and Select the button To Table. Next, under the Add Column ribbon, Click the button Custom Column. Change the name to Date and use the following formula in the formula window:
25569 + ( [Column1]/60/60/24 )
Then Select the Date column. Click the Transform ribbon. Under the Data section, Select the Date format. Note: do not select the Date/Time.
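The two steps above can be sketched in M as follows; the step names here are illustrative, not the exact names Power Query generates:

```
// 25569 is the number of days between 1899-12-30 (the serial-date zero
// used by Power Query) and 1970-01-01 (the UNIX epoch); dividing the
// UNIX value by 60/60/24 converts seconds to days.
#"Added Date" = Table.AddColumn(#"Converted to Table", "Date",
    each 25569 + ( [Column1] / 60 / 60 / 24 )),
#"Typed Date" = Table.TransformColumnTypes(#"Added Date", {{"Date", type date}})
```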
Now we have the date but need to preserve its order. This can be solved by adding an index. Go to the Add Column ribbon, Click the little Drop down on the right half of the Index Column button. Select the option From 0 from the drop down menu. Remove the column labeled Column1, as it is not needed anymore. To do this, Right Click on Column1 and select the option Remove from the drop down menu.
This finishes the branch for the dates. Rename this step EndBranchDate by Right Clicking the step in the APPLIED STEPS and Clicking rename.
Branch 2: Prices
Now we need to get the information for the stock prices. Right Click EndBranchDate and Click the option Insert Step After to add the start of the branch. By checking the formula, we can see it says
=EndBranchDate
This step references the step before it, EndBranchDate, and duplicates its output. We need to get back to the StartBranch step in order to start the second branch. Change the value in the formula bar from = EndBranchDate to = StartBranch.
This loads us back at that step, ready to drill down to the stock prices. We will use the adjusted close – the stock price at the end of the day after adjusting for dividends. Drill down to this information in the following order:
Indicators: Record
adjclose: List
1: Record
adjclose: List
Next, Convert our list to a Table (see above for this step). Here we have the list of prices and again need to preserve the order with an index column. Go to the Add Column ribbon, Click the Index Column button and select From 0 in the drop down.
This is the end of this step, so Rename it EndBranchPrice.
To summarize the query so far:
Pulled the information for a MSFT stock for 5 years on a daily basis.
Drilled down to the dates, converted them to a better format and added an index to preserve order.
Reverted to an earlier step (StartBranch).
Drilled down to the daily prices and added an index column.
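Putting that summary into code, here is a simplified sketch of how the branched query might look in the Advanced Editor. The step names and drill-down paths are approximations of what the editor generates, not the exact auto-generated code:

```
let
    // Approximate skeleton of the branched query
    Source      = Json.Document(Web.Contents("https://query1.finance.yahoo.com/v8/finance/chart/MSFT?range=5y&interval=1d")),
    StartBranch = Source[chart][result]{0},

    // Branch 1: dates
    Timestamps    = StartBranch[timestamp],
    DateTable     = Table.FromList(Timestamps, Splitter.SplitByNothing()),
    DatesAsDates  = Table.AddColumn(DateTable, "Date", each 25569 + ([Column1] / 60 / 60 / 24)),
    EndBranchDate = Table.RemoveColumns(Table.AddIndexColumn(DatesAsDates, "Index", 0), {"Column1"}),

    // Branch 2: prices (adjusted close)
    Prices         = StartBranch[indicators][adjclose]{0}[adjclose],
    PriceTable     = Table.FromList(Prices, Splitter.SplitByNothing()),
    EndBranchPrice = Table.AddIndexColumn(PriceTable, "Index", 0),

    // Merge the two branches on the Index column
    Merged = Table.NestedJoin(EndBranchPrice, {"Index"}, EndBranchDate, {"Index"}, "Dates")
in
    Merged
```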
Merging the Branches
This leaves two separate tables, but it is only possible to output one of these results. We will need to add a final step to merge these two branches into one table.
On the Home Ribbon, Click the drop down arrow on the Merge Queries button, then Select the option Merge Queries. This brings up the merge screen. We will merge the query with itself: on the bottom half of the merge, Select Stock Value (current). Click on the Index column for both the top and bottom tables.
Clicking OK will merge the data to itself, and the resulting formula appears in the formula bar.
This step uses the Table.NestedJoin function, which takes 6 arguments:
Table.NestedJoin(table1 as table, key1 as any, table2 as any, key2 as any, newColumnName as text, optional joinKind )
In our example, table1 and table2 are the same (EndBranchPrice). This makes sense, as we joined the query to itself. Notice that when joining a query to itself, the table arguments reference steps in that query (EndBranchPrice). We really want to join EndBranchPrice to EndBranchDate, so we simply change the second table in the formula bar to EndBranchDate.
Now, we are joining the EndBranchPrice to the step EndBranchDate. These both have a column named index that we added, which will join our data in the correct order.
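In other words, the edit amounts to swapping the third argument of Table.NestedJoin. A hedged sketch, assuming the new column was named Merged:

```
// Before – the query joined to itself:
= Table.NestedJoin(EndBranchPrice, {"Index"}, EndBranchPrice, {"Index"}, "Merged")
// After – joined to the date branch instead:
= Table.NestedJoin(EndBranchPrice, {"Index"}, EndBranchDate, {"Index"}, "Merged")
```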
Expand the merged table by Clicking the Expand button on the column name. In the settings, Select just the Date column and Deselect the option Use original column name as prefix.
Remove the Index column, as it is not needed anymore. That completes our query with the branch.
Enabling Publish to the Service
If we want to publish this to the service (app.powerbi.com), we will need to make a small edit to our URL. The service will not allow parameters in the base URL. To get around this, we can split our URL using an option in Web.Contents called RelativePath. After Clicking on the Source step in the APPLIED STEPS window, Edit the URL so that the static base URL is passed as the first argument and the stock-specific part moves into RelativePath.
Make sure the brackets are correct when you edit the code in the formula bar.
Now we have changed the URL, we need to make a change to the settings. This is because Power BI will try and check the base URL (https://query1.finance.yahoo.com/v8/finance/chart) before it runs the query and adds the second part in RelativePath. This isn’t a valid URL on its own, so it will fail.
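A sketch of what the edited Source step might look like with RelativePath (and, optionally, the query string moved into the Query option). The exact argument layout here is an assumption:

```
// Base URL stays static; the parameterized part moves into RelativePath.
= Json.Document(
    Web.Contents(
        "https://query1.finance.yahoo.com/v8/finance/chart",
        [
            RelativePath = "/" & #"Stock Symbol",
            Query = [ range = "5y", interval = "1d" ]
        ]
    )
)
```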
To do this, publish the report to the service, and Navigate to the dataset settings. This is found in the service by Clicking the ellipsis in the top right, then the option called Settings in the drop down.
You should be in the tab called Datasets. Then Navigate to the published dataset. Under the option titled Data source credentials, next to Web, Click the option to Edit Credentials. Make sure to check the option to Skip connection test.
This query uses a parameter, which enables us to convert it to a function. To do this, Right Click on the query in the Queries pane on the left and select Create Function.
Now we have a function where we can input any stock symbol and return a list of daily prices. To check multiple stocks, you can invoke your function against any list of stock symbols: on the Add Column ribbon, Click the button Invoke Custom Function. This will return a table for each row. Before expanding, it is important to handle errors, otherwise one bad symbol could break the query. One option is to Right Click the column header, select the Replace Errors option, and Type the text null.
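Conceptually, the invoke-and-guard pattern looks like this in M. The function name GetStockValue and the symbol list are hypothetical stand-ins:

```
// A small table of symbols to look up (illustrative values).
Symbols = Table.FromColumns({{"MSFT", "AAPL", "NOTREAL"}}, {"Stock Symbol"}),
// Invoke the function per row; wrapping in try ... otherwise makes a bad
// symbol yield null instead of erroring the whole query.
Invoked = Table.AddColumn(Symbols, "Prices",
    each try GetStockValue([Stock Symbol]) otherwise null)
```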
Performance Considerations
While this query will quickly return single stocks, adding multiple stocks will send a separate query for each stock. Make sure you design the correct solution for what you are trying to achieve, and check out this article on API considerations.
Final Result
For those who like M code, here is the final function. You can copy and paste this directly into the advanced editor (See this article on how to do this).
Splitting out the steps, the query can be visualized like this:
If you like the content from PowerBI.Tips please follow us on all the social outlets to stay up to date on all the latest features and free tutorials. Subscribe to our YouTube Channel. Or follow us on the social channels, Twitter and LinkedIn where we will post all the announcements for new tutorials and content.
Introducing our PowerBI.tips SWAG store. Check out all the fun PowerBI.tips clothing and products:
If you have spent any time working in Power BI, your very first step is to, wait for it… Get Data. Using Get Data will start loading your data into the Query Editor for Extracting, Transforming and Loading (ETL). When you start out in Power BI it is likely that you don’t spend much time in the Query Editor. However, the longer you use Power BI desktop, and the more you learn, you find that the Query Editor is highly under-rated. There are so many amazing transformations that you can perform on your data. After some time getting comfortable you’ll be building larger queries with many, many, steps. Eventually, it may look something like this:
Perhaps your queries are already this long, or maybe even longer. Wouldn’t it be nice to reduce the number of steps? It would make them easier to read. In this tutorial we are going to walk through how to combine several steps when you create a new column. This is achieved by slightly modifying the M script generated when a new column is created.
While doing this won’t cut down every query in half, but it will remove a couple of additional steps per query. This makes your queries easier to read and maintain. Also, using this best practice, will save you headaches in the future. At some point you will run into a data type error. This is seen when you try to join multiple tables on columns with different data types, or when you need a measure to create a SUM but the column data type is still text.
Let’s get to the tutorial.
Open up your Power BI Desktop program and on the Home ribbon click Enter Data. Using the dialog box for entering data enter the following table of data:
Sales
100
120
94
20
80
Once you’ve entered your data the Create Table screen should look like the following. Be sure to name your table, in this case I named my data table Data…. yea, feeling a lack of creativity today. Next, click Edit to modify the query before loading the data into Power BI.
This will open the query editor. Click on the Add Column ribbon, then select Custom Column. The Custom Column dialog box will open.
Note: When you bring in the created table, notice that the Sales column already has the data transformed into a whole number. Also note that on the right, under APPLIED STEPS, we have two steps: one for the Source and one for Changed Type. This is because not every M function (M is the language used to perform the ETL in the query editor) can handle data types.
In the Custom Column dialog box enter the column name Increase Sales and the equation [Sales] * 1.1, which increases Sales by 10%. Click OK to add the column.
Note: It is good practice to name the new column something meaningful. This helps when you are working in the query editor. When you come back to your query months later and wondered what you were creating, the column names will help! Trust me I learned this lesson the hard way…
Great, now we have our new column. Notice the ABC123 icon in front of our column named Increase Sales. This means Power BI thinks that the data type of this column could be Text or a Number. Let’s change it: Click on the ABC123 icon and select Whole Number. Now the column data type has changed to numbers only.
If we glance at the Query Settings under the APPLIED STEPS, we now have 4 steps. Two were added: one for the added column and a second for the data type of the column. This is not what we want. Instead, we would like the column to be added with the appropriate data type right from the beginning.
Let’s remove the very last step labeled Changed Type1. To do this we will click on the little X next to the step. This will remove the step. While highlighting the Added Custom step click in the formula bar and modify the equation to include the following statement in RED. Press the Enter to execute the new formula.
= Table.AddColumn(#"Changed Type", "Increase Sales", each [Sales] * 1.1, Int64.Type)
Note: if you don’t see the formula bar it can be toggled on or off in the View ribbon in the check box titled Formula Bar.
The query editor should now look like the following:
Without adding an extra step on the Query Settings, we have changed the data type. I know this might seem trivial, but when you are creating large queries, they can get difficult to read. For me, I find this technique quite useful, and it doesn’t only support whole numbers. This technique also supports the following data types:
Data Type – Syntax
Whole Number – Int64.Type
Decimal Number – Number.Type
Dates – Date.Type
Text – Text.Type
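Applying the same pattern with the other types looks like the following. The column names and expressions here are illustrative, not from the tutorial’s data:

```
// Decimal number
= Table.AddColumn(#"Changed Type", "Sales Decimal", each [Sales] * 1.1, Number.Type)
// Date
= Table.AddColumn(#"Changed Type", "Sale Date", each #date(2016, 1, 5), Date.Type)
// Text
= Table.AddColumn(#"Changed Type", "Sales Label", each Text.From([Sales]), Text.Type)
```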
Thanks for following along. If you liked this tutorial, please share it with someone else who might find it valuable. Be sure to follow me on LinkedIn and Twitter for posts about new tutorials and great content from PowerBI.Tips
This week I had a number of team members tell me how difficult it was to share a PBIX file and the corresponding data between teammates. The department hasn’t committed 100% to the idea of using PowerBI.com, and thus wanted to share the actual report. I pointed my teammates to my earlier tutorial about using variables for file locations. After reading the tutorial, the team came back with a bunch of griping about how they didn’t want to go into the query editor, people name variables different things so it’s hard to understand what another team member was doing, blah, blah, blah…
So, in order to make peace I took another look at trying to make the sharing of a PBIX file easier. This time I decided to use the Parameters feature within PowerBI. This feature within the Query Editor was released all the way back in April of 2016. I read about it, thought it was interesting but couldn’t find a really good use case. In my current predicament, the parameters feature is the perfect solution.
By using a parameter to store a folder and/or file location, you enhance the user experience by making it easier for a user to change the query.
To modify a parameter after it has already been built click on the bottom half of the Edit Queries button, found on the Home ribbon. A menu will pop up and you can select Edit Parameters.
The Enter Parameters screen will pop up. This is where you would change the parameters for a query. In our example this is where we will enter a new folder location for our data.
Let’s begin the tutorial. To start we will need to work with some data. In my earlier tutorial, I used the Olympic Medals Count. We will reuse this dataset. Start by downloading the zipped folder Olympic Metal Count files from the following location. Once you have downloaded the sample data, extract the folder and subsequent files from the zipped file. In windows 10, which I am using, you can right click on the downloaded file and select Extract All..
A pop-up window will appear to complete the extraction process. By leaving the default settings the folder will be extracted into the same location as the downloaded file. Click Extract to complete the extraction.
We now have a folder with all our data.
With the data prepared we can now make the parameters within Power BI. Open Power BI and click the top half of the Edit Queries button. The Query Editor will open. Click Manage Parameters, found on the Home ribbon. The Parameters screen will open. Click the blue New button at the top left. Enter the name Folder, and include a brief description of this parameter. Since the Folder parameter is essential to the query, we must leave the Required box checked. Change the Type to Text, and leave the Suggested Values as Any value. Finally, enter the folder location where our Medal Count files are located into the Current Value. Click OK to complete the parameter.
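Behind the scenes, a parameter is just a single-value query with metadata attached. The generated M looks roughly like the sketch below; the path is a placeholder, and the exact metadata record may vary by Power BI version:

```
// Approximate M generated for a required Text parameter named Folder.
"C:\Data\Olympic Medals\" meta
[
    IsParameterQuery = true,
    Type = "Text",
    IsParameterQueryRequired = true
]
```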
Next, we will create the actual query to load the folder containing our Medal Counts. Click on the New Source button. This will open the Get Data window. Select the Folder option from the list on the right. Click Connect to complete the Get Data.
The Get Data window will close, then a new window asking for the folder location will open. This is where the parameter implementation is slick. Click the ABC button to the far right of the Folder Path bar. Change the selection to Parameter and select our newly created parameter labeled Folder. Click OK to load the Folder.
Next, the folder prompt opens, allowing us to accept or edit the query. Click Combine & Edit to proceed. The Combine Files window will open, where we can inspect what is in each file. Clicking the drop-down menu next to Example File: shows the list of all files. Select the 2004 Medals.xlsx file.
Click the Sheet1 item on the left side of the window. This exposes the data that is contained within this 2004 particular file. Click OK to proceed.
Note: For users that have been working with PowerBI for a while, this Combine Files screen flow is a newer feature. I feel that this greatly simplifies the process and makes it easier to detect issues when loading data.
Power BI now loads the files into the Query Editor. The query that we want to use is labeled Query2; change the Query2 name to Medal Count.
What you’ll also notice is there is a lot more queries on the left side of the screen under Queries. These queries were auto generated when we navigated through the menu screens. Click Close & Apply to complete the data load.
Build the following Stacked Column Chart:
Click the Ellipsis in the upper right hand corner of the chart and select Sort By Total to sort all the items in the bar chart according to how many medals each country won.
Great, we have made a parameter, loaded multiple Excel files, and then made a visual of our data. Let’s test it out. Save your work and close Power BI Desktop. (don’t forget to save your PBIX)
Next, we rename the folder where all the Excel files are contained. Change the name to Olympic Medals.
Reopen your PBIX file, and on the Home ribbon click the Refresh button. Oops – since we renamed the folder, we now have an error.
To fix this, click the bottom half of the Edit Queries button on the Home ribbon and select Edit Parameters. Then modify the Folder field to reflect the new folder name, Olympic Medals. Click OK to proceed.
A message bar will appear under the Home ribbon. Click Apply Changes to complete the data load. PowerBI will reload all your data using the context of the new folder.
Now this is an elegant solution for sharing PBIX files and their corresponding data files with coworkers. This is just a simple example of how you can use parameters; there are a multitude of other really good use cases for parameters within your queries. Thanks for reading along, and be sure to share if you liked this tutorial.
This week I encountered an issue when working with multiple queries in my data model. Here are the source files in case you want to follow along.
Here’s what happened. I had a PBIX file that had four queries in it, one file for the summer the Olympic metal count for the following years, 2000, 2004, 2008, and 2012.
After a bit of working, I figured that my desktop was going to get too cluttered if I continued to collect Olympic medal data. Thus, I moved the Excel files that were my source data into a folder called Olympic Medals.
By doing this I broke all the links for all four files. This was discovered when I tried to refresh my queries and noticed that all the queries failed. Power BI gave me a nice little message notifying me that there was a data source error.
DataSource.Error: Could not find the file:
To fix this I had to open the query editor and change each file’s location to the new folder that I just made. Seeing that this is not an efficient use of my time, I decided to figure out a way to make a variable that would hold the file location for all my queries.
Let’s begin by making a new blank query. Click the bottom half of the New Source button on the Home ribbon, then click the item labeled Blank Query.
With the new query open, type in the file location where you will obtain all your working files. For me, the files were on my desktop, so my file location is listed below. Rename the new query Folder.
Note: Since we are building a file structure for Power BI to load the Excel files, you will want to be extra careful to add a “\” backslash at the end of the file location.
Next, on the query for Medals 2000, click the Source step under the APPLIED STEPS window on the right. This will expose the code in the formula bar at the top of the window.
Note: If you don’t see the formula bar as I have illustrated in the image above, you can turn this feature on by click the View ribbon and checking the box next to the words Formula Bar. This will expose the formula bar so you can edit the source step.
This is where the magic happens. We can now insert our new Folder query into this step. Our current Source step looks like the following:
Not only does this shorten our equation, it now uses the folder location we identified earlier and just appends the file name, 2000 Medals.xlsx. This makes it very easy to add additional queries with the same steps. Also, if you move your files to a new folder location, you only have to change the Folder query to reflect the new file location. To test this, make a new folder on your desktop called New Folder. Move all the Olympic medal files to the new folder. Now in Power BI Desktop, press Refresh on the Home ribbon. This should result in the DataSource.Error that we saw earlier. To fix it, click Edit Queries on the Home ribbon, select the Folder query and change the file directory to the new folder that you made on your desktop. It should look similar to the following:
Once you’ve modified the Folder query, click Close & Apply on the Home ribbon and all your queries will now reload. Success!!
Hope this tutorial helps and solves some of the problems when moving data files and storing information for Power BI desktop. Please Share if you like the tutorials. Thanks.
In this tutorial we’ll learn how to copy and paste queries to and from the Query Editor. When your working in Power BI Desktop often you will need to share and model the data before it can be applied to the visual. In my experience you’ll need to add a calculated column or break out a date such as 1/5/2016 into the the Year 2016 or Month 01, components to properly display a visual.
We will start off from a prior example where we built a shaded region map. The tutorial to create this Power BI Desktop file is located here.
If you want to cheat and download the final PBIX file you can download and open the zipped file here: Regional Filled Map Example
This file was made in Power BI Desktop April 2016 version, 2.34.4372.322, download the latest version from Microsoft Here.
Open the zip file that you downloaded and extract the file inside, labeled Regional Filled Map Example. Open the file. Once you’ve opened the file, on page 1 of the report you will see a map of the United States that looks similar to the following.
Now we will enter the query editor. Click Edit Queries on the Home ribbon to open the Query Editor. In this window we shape and model the data so we can properly visualize it on the pages. A couple of things to notice here: every time you press a button on the ribbon, the query editor generates an Applied Step. Each step writes a line of M code which transforms the data as it is loaded into the computer’s memory. In this case we have seven (7) steps, starting at Source and ending with Changed Type1.
We want to expose the code that is being generated behind the scenes at every step. Click on the View ribbon and then click the button called Advanced Editor.
Opening this window reveals the M language code that is generating each Applied Step we saw earlier.
Note: some of the steps we saw earlier, such as Filtered Rows, have a space in the name. In the query editor, any applied step that has a space in its name gets #”” added around the step name. Thus, in the Advanced Editor, Filtered Rows appears as #”Filtered Rows”. The hashtag and the quotations define the complete variable. If you changed the name of the applied step to FilteredRows, with no space, in the Advanced Editor you’d see the step simply as FilteredRows – no hashtag or quotations needed.
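For example, the two naming styles look like this in the Advanced Editor (the filter condition here is hypothetical):

```
// A step name with a space must be wrapped in #"…":
#"Filtered Rows" = Table.SelectRows(Source, each [State] <> null),
// Rename the step without the space and the quoting disappears:
FilteredRows = Table.SelectRows(Source, each [State] <> null)
```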
Now that the M language is revealed, you can make modifications to the code. In cases where you want to make a function, you would do so in the Advanced Editor. For our example today, select all the code and copy it to the clipboard using the keyboard shortcut CTRL+C. Click Done to close the window.
Now let’s copy that code into a brand new query. Click the Home ribbon, then click New Source, scroll all the way to the bottom of the list and select Blank Query. Click Connect to start a blank query.
A new Query will now open up. Click the View ribbon, then click Advanced Editor. A blank editor window will open.
Paste the code we copied earlier into this window. Now the new Query1 should look like the following:
Click Done and the new query will now load. It is that simple, now we have two identical queries.