Tag: M Language

  • Handle Errors in Data Refreshes with Power Automate

    Handle Errors in Data Refreshes with Power Automate

    This article examines using the Advanced Editor in Power Query to better handle things going wrong. It also shows how to trigger custom actions with Power Automate, so the rest of the data can keep refreshing while the owner is alerted to data source errors by email.

    Note that this article requires editing Power Query (M) code. If you want to review how to do this, consider reading this article:
    Query Editor – Editing M Code

    Data Sources

    Data can be messy. Thankfully, Power Query gives us an excellent and efficient way to extract, transform and load data, shaping it the way we want. It is extremely versatile and can connect to a huge range of data sources, including third-party sources as well as internal data.

    While beneficial, we need to ask what happens if a third-party source suddenly shuts down or changes its access policies. What if there is bad data recorded in an on-premises Excel sheet or database? An error in any Power Query step can stop the entire report from refreshing. We can opt to receive alerts on the service, but these can be unspecific and require us to dig deep into the report.

    The technique laid out here lets us receive a specific alert identifying the exact step where the error occurred. What’s more, we can ensure it won’t break our queries, and the rest of the data will continue to refresh.

    Step 1 – Base Query

    First we need to set up the query that we want to error handle. For this example, I’m going to send a web request to get some information about the Microsoft stock price. For more information on this API or to pull other stock data, check out this article.

    Open Power Query and select New Source > Web on the Home ribbon.

    Paste the following link:

    https://query1.finance.yahoo.com/v8/finance/chart/MSFT?range=5y&interval=1d

    This will automatically return a JSON object and parse it for us.

    Note: This link returns 5 years of daily historical stock prices.

    For simplicity, I will just return the metadata to ensure the API call is working. The automatic parsing will return a table with clickable elements. To drill into the JSON, click through the following steps:

    chart:Record > result:list > Record > Meta:Record

    Note: See sample of nested structure below for chart:Record
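
    As a rough sketch of that nested structure (showing only the fields used in this article), the clicks above are equivalent to the following drill-down in M:

    let
        Source = Json.Document(Web.Contents("https://query1.finance.yahoo.com/v8/finance/chart/MSFT?range=5y&interval=1d")),
        // chart (record) > result (list) > first item > meta (record)
        meta = Source[chart][result]{0}[meta]
    in
        meta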

    Once we have expanded all the way down to the meta level, press the Into Table button found on the newly appeared Convert ribbon.

    Here is the final code, which you can see by clicking the Advanced Editor button on the Home ribbon:

    let 
        Source = Json.Document(Web.Contents("https://query1.finance.yahoo.com/v8/finance/chart/MSFT?range=5y&interval=1d")),
        chart = Source[chart],
        result = chart[result],
        result1 = result{0},
        #"meta" = result1[meta],
        #"Converted to Table" = Record.ToTable(#"meta")
    in
        #"Converted to Table
    

    Rename this “Stock Query” by editing the name in the properties pane on the right.

    Step 2 – Create the flow

    Next, we create the Power Automate flow that will alert us that something is wrong. Navigate to the Power Automate website. Once logged in, click on the new Instant flow button.

    Give the flow an appropriate name. For the trigger, select the option When an HTTP request is received. Then press the Create button to make the flow.

    Once we save, the flow will supply us with a URL. This URL will trigger the flow any time it is visited. You can use any action you want, but I will have the flow send me an email to let me know the refresh failed.

    I’ll search for the item Send an email (V2). Make sure you fill in the email address you want to send it to, and write an appropriate message and subject.

    That’s our flow! Hit Save. After you have saved, click on the When an HTTP request is received step to expand it. You’ll see that a URL has been generated. Copy the link using the Copy button on the right. You’ll need to enter this into Power BI.

    Step 3 – Make the Flow Trigger Query

    Next, set up the query in Power BI to call this flow. In Power Query, make a new query by selecting New Source > Web. Paste in the URL you copied in the previous step. Open the Advanced Editor. Inside, you’ll see the code uses the Web.Contents() function. You’ll need to copy this code in a later step.
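
    Your generated code will differ by URL, but it should be a minimal sketch along these lines (RestofURL stands in for the remainder of your flow’s address):

    let
        // Requesting the flow URL is what triggers the flow (and the email action)
        Source = Web.Contents("https://prod-68.westus.logic.azure.com:443/workflows/RestofURL")
    in
        Source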

    This will send an email each time the query runs. For testing, pressing the Refresh Preview icon will trigger the actions to run again. If you don’t want to wait for the email, or you chose a different action, you can check whether the flow ran on the Power Automate site. Click My Flows on the left, open the flow and scroll down to Runs.
    Press the refresh button on the Runs section to check when the flow ran.

    Step 4 – Set up the Error Handling Code

    Now we need to add the logic to have the Power Automate run on error, which is going to take a little coding.

    Back in Power Query, start by adding a blank query by clicking New Source > Blank Query on the Home Ribbon.
    Next, open the Advanced Editor. A brand-new blank query contains just an empty Source step, so the code should look something like this:
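
    let
        Source = ""
    in
        Source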

    Now we will employ the try … otherwise statement. This is Power Query’s error handling statement. We can add a placeholder for now.
    Replace the step:

    Source = "" 

    with the following code:

    Source = try 1+1 otherwise "error"

    How this works

    Power Query will attempt to execute any code between the try and otherwise keywords. If this succeeds, the result is returned and the next step is evaluated; in this case, the result is the number 2. If it returns an error, however, the result is discarded and the expression after the word otherwise is returned instead; in this case, the word "error".

    We can add this statement as the Source step. I’ll also wrap both expressions in parentheses, as this will come in handy as we add more steps. It’s important to keep good syntax to make it readable.
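
    With each branch wrapped in parentheses, the query looks roughly like this (a sketch; your exact spacing may differ):

    let
        Source = 
        try 
        (
            1+1
        )
        otherwise
        (
            "error"
        )
    in
        Source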

    As 1+1 is valid, this will return the value 2. If you change the 1+1 to 1+"A", this is not valid, so it will return the word "error".

    Now that we can see it’s working, delete everything between the parentheses in both the try and otherwise statements to set up for the next step. Do not worry if you get an error, as we will be adding code to it.


    Step 5 – Update the Error Handling Code

    Now that we’ve got the placeholder set up, we can copy our previous code into each step.
    Open the Stock Query we made in Step 1.
    Click Advanced Editor and copy the entire code.
    Now, come back to the try..otherwise query.
    Open the Advanced Editor for this query and make sure you delete anything between the parentheses of the try statement if you did not before.
    Paste the entire code you just copied.

    Next, go to the flow query we made in Step 3, open the Advanced Editor and copy all the text.
    Go back to the try..otherwise query and paste everything between the parentheses of the otherwise statement.

    Step 6 – Tidying the Code Up

    The URLs are prone to change, so it is good practice to separate them out as variables. To do this, you can simply state them at the beginning of the query as steps.
    Here, add the URLs to variables called webURL and flowURL.
    To add webURL, at the beginning of the query after the let keyword, add the line:

    webURL = "https://query1.finance.yahoo.com/v8/finance/chart/MSFT?range=5y&interval=1d",

    Don’t forget the comma at the end. Do the same with flowURL. Then, replace the hard-coded URLs with the variable names.

    Additionally, it helps to rename the Source step outside the try..otherwise to Output. This makes it easier to follow, as we have several steps called Source. Also update the name after the in keyword.

    Output:

    Final code to copy (you will need to update it with your correct URLs):

    let
        flowURL ="https://prod-68.westus.logic.azure.com:443/workflows/ RestofURL",
        webURL = "https://query1.finance.yahoo.com/v8/finance/chart/MSFaaT?range=5y&interval=1d",
        
        Output =   
        try 
        (
            let 
                Source = Json.Document(Web.Contents(webURL)),
                chart = Source[chart],
                result = chart[result],
                result1 = result{0},
                #"meta" = result1[meta],
                #"Converted to Table" = Record.ToTable(#"meta")
            in
                #"Converted to Table"
        )
        otherwise    
        (
            let
                Source =  Web.Contents(flowURL)         
            in
                Source
        ) 
    
    in
        Output
    

    Optional Step 7.1 – Adding POST Parameter – Flow

    Now that we have set up our flow, it may be useful to reuse it on multiple datasets. Instead of setting up a different flow for each dataset, we can allow the flow to accept inputs, such as the dataset name.

    Navigate back to the Power Automate site and, on the flow we set up previously, click Edit.

    Open the When an HTTP request is received step and paste the following code into the Request Body JSON Schema box.

    {
        "type": "object",
        "properties": {
            "datasetName": {
                "type": "string"
            }
        }
    }

    Next, expand the advanced options and change the method to POST.

    This will create a variable called datasetName. This is the name of the dataset we will pass from Power BI. We can use datasetName in the email to inform us which dataset had an error. The variable will appear in a list when you click to edit the email message – click on the name to add it.

    In addition, I added the expression utcNow(). You can find this by searching in the Expression tab of the pop-up. This displays the time the email is sent, so we can see in the email when the refresh failed. Notice the variable from the first step is green, while the expression is pink.
    Personalize this message as you wish.

    Optional Step 7.2 – Adding POST Parameter – Power BI

    The final step is to pass this in from Power BI. In our query, open the Advanced Editor and add three more variables before webURL and flowURL:

        datasetName = "Stock Query",
        content = " { ""datasetName"": """ & datasetName & """}",
        headers = [ #"Content-Type"="application/json"],
    

    datasetName is the name of the dataset we pass into the flow, and it will ultimately appear in the email. Right now it is set to Stock Query, but this is what we edit for each dataset that uses this query, so each one passes a different name.

    content and headers build the correct format to pass into the API call. This will create a POST request and provide the correct format.
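
    As a quick sanity check (a sketch, separate from the query itself), you can evaluate the content string on its own to confirm that the doubled quotes escape correctly:

    let
        datasetName = "Stock Query",
        // The doubled quotes escape to single quotes, so this evaluates to:  { "datasetName": "Stock Query"}
        content = " { ""datasetName"": """ & datasetName & """}"
    in
        content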

    Now we can edit the Web.Contents(flowURL) step in the otherwise branch to include our parameters:

    Source =  Web.Contents(flowURL, [Headers = headers, Content=Text.ToBinary(content)])  

    Final output:

    Final code to copy (you will need to update it with your correct URLs):

    let
        datasetName = "Stock Query",
        content = " { ""datasetName"": """ & datasetName & """}",
        headers = [ #"Content-Type"="application/json"],
        flowURL ="https://prod-68.westus.logic.azure.com:443/workflows/RestofURL",
        webURL = "https://query1.finance.yahoo.com/v8/finance/chart/MSFaaT?range=5y&interval=1d",
        
        Output =   
        try 
        (
            let 
                Source = Json.Document(Web.Contents(webURL)),
                chart = Source[chart],
                result = chart[result],
                result1 = result{0},
                #"meta" = result1[meta],
                #"Converted to Table" = Record.ToTable(#"meta")
            in
                #"Converted to Table"
        )
        otherwise    
        (
            let
                Source =  Web.Contents(flowURL, [Headers = headers, Content=Text.ToBinary(content)])         
            in
                Source
        ) 
    
    in
        Output
    

    Limitations and Considerations

    This technique uses premium Power Automate features and a valid license is required. However, only one service account license is needed.

    This query has been designed to return blank data if the query fails. This could break your model if this data is required.

    This blog does not examine securing the Power Automate endpoint, so be aware that if the URL were discovered, anyone could trigger the end action (in this case, sending an email).

    If you like the content from PowerBI.Tips please follow us on all the social outlets to stay up to date on all the latest features and free tutorials.  Subscribe to our YouTube Channel.  Or follow us on the social channels, Twitter and LinkedIn where we will post all the announcements for new tutorials and content.

    Introducing our PowerBI.tips SWAG store. Check out all the fun PowerBI.tips clothing and products:
    Store Merchandise

  • Crazy Table Gymnastics – Part 2 – Build Support Materials

    Crazy Table Gymnastics – Part 2 – Build Support Materials

    This is Part 2 in the 3-part series on developing super cool tables using some fancy measures.  In Part 1 we walked through how to build a table that uses an un-pivoted data source in the Query Editor.  This technique allows you to change the types of categorical values in a table.  If you missed Part 1 and want to get caught up, follow this link.  Now, continuing with the series, in Part 2 we will build the supporting materials (Selector table, What If slicers, and measures) for the report.

    Once we are done the final product will look like the following:

    Part 2… Go.

    To make sure we are starting off on the correct step: we left off Part 1 having completed a Pivoted Data Table and included an Attribute slicer that allows us to toggle between the Manager and Region categories.  Your table should look like the following diagram.  (If you don’t have this, you might want to start with Part 1, found here.)

    Pivoted Data Table

    Note: I have also included a Slicer which is used with the Attribute field.

    Next, we will need to add a table that will allow us to use the SalesReps, PercentChange, and Margin column headers in our report.  On the Home ribbon click Edit Queries, then select New Source.  In the Get Data window select Blank Query and click Connect to proceed.  Open the Advanced Editor by clicking the Advanced Editor button found on the Home ribbon.  Enter the following M code into the Advanced Editor:

    let
       Source = #"Pivoted Data",
       #"Kept First Rows" = Table.FirstN(Source,1),
       #"Removed Columns" = Table.RemoveColumns(#"Kept First Rows",{"Attribute", "Value", "UniqueID"}),
       #"Unpivoted Columns" = Table.UnpivotOtherColumns(#"Removed Columns", {}, "Attribute", "Value"),
       #"Removed Columns1" = Table.RemoveColumns(#"Unpivoted Columns",{"Value"})
    in
       #"Removed Columns1"

    Click Done to close the Advanced Editor.  Rename the table to Selector.  When you are finished your table should look like the following:

    Create Selector Table

    Click Close & Apply on the Home ribbon to close the Query Editor.  Add a slicer with the following selections:

    Add Selector Attribute Slicer

    Now, we want to detect which of the Attributes has been selected from this table.  We can accomplish this by creating a measure using the DAX function SELECTEDVALUE.  Right-click on the table named Selector and from the drop-down select New Measure.  Enter the following DAX equation:

    rankBy = SELECTEDVALUE(Selector[Attribute])

    In addition to knowing which column was selected from the Selector table, we also want to detect whether at least one categorical value has been selected.  The categorical values we are talking about were generated earlier; they are the Manager or Region values of the Attribute column in the Pivoted Data table.  The ISFILTERED DAX function enables this check.  Add the following measure to the Pivoted Data table:

    Attribute Filtered = ISFILTERED('Pivoted Data'[Attribute])

    Next, we need to gather some user input in the form of a number from 1 to 10.  To input this information we need to produce a What If Parameter.  On the Modeling ribbon click New Parameter in the What If section of the ribbon.  Enter the following information into the What-if parameter dialog box:

    What If Parameter

    Note: Don’t forget to change the Name of the parameter.

    Click OK to proceed.  Power BI will automatically produce a measure table, a measure and a slicer on the report page.

    Slicer Produced by What-if

    Note: By default there is nothing selected in the box.  However, you can adjust the slicer and a number will appear within the value box.  You can also type in a number between 1 and 10 to the box to adjust the value.

    This is where we go crazy with DAX.  This portion of DAX is where all the magic happens.

    We start off by building our totals measures.  Place all these measures in the Pivoted Data table.

    Total % Change = MAX( 'Pivoted Data'[PercentChange] )
    Total Margin = SUM( 'Pivoted Data'[Margin] )
    Total SalesReps = SUM( 'Pivoted Data'[SalesReps] )

    These will be used repeatedly in our next group of DAX formulas.

    The following measures will produce a calculated ranking for each numerical column.  OK, pause: this part really excites me, because the next few measures are where the magic happens.  Pay close attention to what is happening here.  Un-pause: by using the DAX SWITCH function we can dynamically tell Power BI to adjust which column the top items are ranked by.  For example, if we select SalesReps in our attribute slicer, the following measures will automatically rank all the items in the table by the column named SalesReps.  Thus, the items with the highest counts of SalesReps will be listed first.  When you select Margin, the table will automatically adjust and re-rank the items by the Margin column.  This is being done in the SWITCH statement.  For each column we calculate custom rankings and then hide or replace values with the BLANK() DAX function so that items we don’t want are not shown.

    Enter the following three measures into the Pivoted Data Table:

    Rank Margin = if( [Attribute Filtered],
        SWITCH( [rankBy]
            ,"Margin", CALCULATE( [Total Margin], FILTER('Pivoted Data', RANKX( ALLEXCEPT('Pivoted Data','Pivoted Data'[Value]), [Total Margin]) <= [Top # of Items Value]) )
            ,"PercentChange", CALCULATE( [Total Margin], FILTER('Pivoted Data', RANKX( ALLEXCEPT('Pivoted Data','Pivoted Data'[Value]), [Total % Change]) <= [Top # of Items Value]))
            ,"SalesReps", CALCULATE( [Total Margin], FILTER('Pivoted Data', RANKX( ALLEXCEPT('Pivoted Data','Pivoted Data'[Value]),[Total SalesReps]) <= [Top # of Items Value]))
        )
        , BLANK() )
    
    Rank PercentChange = if( [Attribute Filtered],
         SWITCH( [rankBy],
           "PercentChange", CALCULATE( [Total % Change], FILTER('Pivoted Data', RANKX( ALLEXCEPT('Pivoted Data','Pivoted Data'[Value]), [Total % Change]) <= [Top # of Items Value]))
           ,"Margin", CALCULATE( [Total % Change], FILTER('Pivoted Data', RANKX( ALLEXCEPT('Pivoted Data','Pivoted Data'[Value]), [Total Margin]) <= [Top # of Items Value]))
           ,"SalesReps", CALCULATE( [Total % Change], FILTER('Pivoted Data', RANKX( ALLEXCEPT('Pivoted Data','Pivoted Data'[Value]), [Total SalesReps]) <= [Top # of Items Value]))
        )
        , BLANK() )
    
    Rank SalesReps = if( [Attribute Filtered],
        SWITCH( [rankBy]
            ,"SalesReps", CALCULATE( [Total SalesReps], FILTER('Pivoted Data', RANKX( ALLEXCEPT('Pivoted Data','Pivoted Data'[Value]), [Total SalesReps]) <= [Top # of Items Value]))
            ,"Margin", CALCULATE( [Total SalesReps], FILTER('Pivoted Data', RANKX( ALLEXCEPT('Pivoted Data','Pivoted Data'[Value]), [Total Margin]) <= [Top # of Items Value]))
            ,"PercentChange", CALCULATE( [Total SalesReps], FILTER('Pivoted Data', RANKX( ALLEXCEPT('Pivoted Data','Pivoted Data'[Value]), [Total % Change]) <= [Top # of Items Value]))
        )
        , BLANK() )

    Whew, that was a ton of measures.  All the key components are complete now.  In Part 3 we will clean up our report page and make it shine.  I hope you enjoyed this tutorial.  Also, follow me on Twitter and LinkedIn where I will post all the announcements for new tutorials and content.

  • Adding Data Types Within Query Editor

    Adding Data Types Within Query Editor

    If you have spent any time working in Power BI, your very first step is to, wait for it… Get Data.  Using Get Data will start loading your data into the Query Editor for Extracting, Transforming and Loading (ETL).  When you start out in Power BI, it is likely that you don’t spend much time in the Query Editor.  However, the longer you use Power BI Desktop, and the more you learn, the more you find that the Query Editor is highly underrated.  There are so many amazing transformations that you can perform on your data.  After some time getting comfortable you’ll be building larger queries with many, many steps.  Eventually, it may look something like this:

    Multiple Query Transformations

    Perhaps your queries are already this long, or maybe even longer.  Wouldn’t it be nice to reduce the number of steps?  It would make them easier to read.  In this tutorial we are going to talk through how to combine several steps when you create a new column.  This is achieved by slightly modifying the M script when new columns are created.

    While this won’t cut every query in half, it will remove a couple of steps per query.  This makes your queries easier to read and maintain.  Also, using this best practice will save you headaches in the future.  At some point you will run into a data type error.  This happens when you try to join multiple tables on columns with different data types, or when you need a measure to SUM a column whose data type is still text.

    Let’s get to the tutorial.

    Open up your Power BI Desktop program and on the Home ribbon click Enter Data.  Using the dialog box for entering data enter the following table of data:

    Sales
    100
    120
    94
    20
    80

    Once you’ve entered your data, the Create Table screen should look like the following.  Be sure to name your table; in this case I named my data table Data… yea, feeling a lack of creativity today.  Next, click Edit to modify the query before loading the data into Power BI.

     

    Create Table

    This will open the query editor.  Click on the Add Column ribbon, then select Custom Column.  The Custom Column dialog box will open.

    Note: When you bring in the created table, notice that the Sales column has already been transformed into a whole number.  Also note that on the right, under Applied Steps, we have two steps: one for the Source and one for Changed Type.  This is because not every M function (M is the language used to perform the ETL in the Query Editor) can handle data types.
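
    For reference, the automatically generated type step looks roughly like this (the Source step produced by Enter Data is simplified here):

    let
        // Enter Data actually generates a compressed Source step; a simplified equivalent is shown
        Source = Table.FromRecords({[Sales = 100], [Sales = 120], [Sales = 94], [Sales = 20], [Sales = 80]}),
        // The automatic step that sets the Sales column to a whole number
        #"Changed Type" = Table.TransformColumnTypes(Source,{{"Sales", Int64.Type}})
    in
        #"Changed Type"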

    Add Custom Column

    In the Custom Column dialog box, enter the column name and the equation shown below.  Click OK to add the column.

    Insert Custom Column
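
    Going by the formula that appears later in the Added Custom step, the dialog entries are:

    // New column name: Increase Sales
    // Custom column formula:
    = [Sales] * 1.1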

    Note: It is good practice to name the new column something meaningful.  This helps when you are working in the Query Editor.  When you come back to your query months later and wonder what you were creating, the column names will help!  Trust me, I learned this lesson the hard way…

    Great, now we have our new column.  Notice the icon in front of our column named Increase Sales.  This means Power BI thinks that the data type of this column could be Text or a Number.  Let’s change it.  Click on the ABC123 icon and select Whole Number.  Now the column data type has changed to numbers only.

    Change Column Type to Whole Number

    If we glance at the Query Settings pane under Applied Steps, we now have 4 steps.  Two were added: one for the added column and a second for the data type of the column.  This is not what we want.  Instead, we would like the column to be added with the appropriate data type right from the beginning.

    Let’s remove the very last step, labeled Changed Type1.  To do this, click the little X next to the step.  This will remove the step.  Then, while highlighting the Added Custom step, click in the formula bar and modify the equation to include the additional Int64.Type argument shown below.  Press Enter to execute the new formula.

    = Table.AddColumn(#"Changed Type", "Increase Sales", each [Sales] * 1.1, Int64.Type)

    Note: if you don’t see the formula bar, it can be toggled on or off on the View ribbon via the check box titled Formula Bar.

    The query editor should now look like the following:

    Desired Data Type

    Without adding an extra step in the Query Settings, we have changed the data type.  I know this might seem trivial, but when you are creating large queries, they can get difficult to read.  I find this technique quite useful, and it doesn’t only support whole numbers.  It also supports the following data types:

    Data Type      | Syntax
    Whole Number   | Int64.Type
    Decimal Number | Number.Type
    Dates          | Date.Type
    Text           | Text.Type
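
    For example (a quick sketch using a hypothetical column that is not part of this tutorial), the same pattern can declare a text type in the same step:

    // Adds a text column and declares its type in one step
    = Table.AddColumn(#"Changed Type", "Sales Label", each "Sales: " & Text.From([Sales]), Text.Type)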

    Thanks for following along.  If you liked this tutorial, please share it with someone else who might find it valuable.  Be sure to follow me on LinkedIn and Twitter for posts about new tutorials and great content from PowerBI.Tips.
