Category: Sharing & Administration

  • Milwaukee Brew City PUG – July 2020

    Welcome to July! This month we have Tommy Puglia from the Chicago Power BI User Group joining us to discuss driving organizational success with Power BI.

    Tommy is an expert who has spent years deploying Power BI within various organizations. Check out the video below from the July 2020 Milwaukee Brew City Power BI User Group.

    If you like the content from PowerBI.Tips, please follow us on all the social outlets to stay up to date on the latest features and free tutorials. Subscribe to our YouTube Channel, or follow us on Twitter and LinkedIn, where we post all the announcements for new tutorials and content.

    Introducing our PowerBI.tips SWAG store. Check out all the fun PowerBI.tips clothing and products:

    Check out the new Merch!

    Hasta La Vista Data
    Go Ahead Make My Data
    PBIX Hat


  • Buy and Apply Power BI Premium P license

    I am working on a project that uses Power BI Embedded to display reports to external users via an application. I’ve used the progression of A skus (embedded licensing via Azure) to support the various reports. I love using the A sku for several reasons: it has a low point of entry with the A1, it is easy to scale up to the higher tiers as I need to, I can pause a capacity at any time, and I enjoy the flexibility of the pay-by-the-hour licensing. However, I just got to the point where one of our capacities is about to exceed the 10GB of RAM I get on the A3. As a result, I started to compare the A4 sku to the P1 sku. They are the same in terms of cores and RAM (8 v-cores / 25GB), but the P1 can be cheaper.

    After researching how to buy and apply the Premium P license, I realized there wasn’t an end-to-end explanation of what to expect and how to apply the P sku to my specific region. This is hugely important for some of the Service features to work correctly. When committing to large sums of money, I find it’s always nice to have these answers up front, so I hope the following walkthrough helps the decision makers out there.

    Analyze the A sku

    Before we jump into the P sku, let’s take a quick moment to see how an A sku is purchased in Azure. There is documentation that explains how to sign up for Premium, including the A sku in Azure; it can be found here ( https://docs.microsoft.com/en-us/power-bi/admin/service-admin-premium-purchase ). However, I want to highlight the two areas that most interest me and that I couldn’t find answers to when trying to commit to buying the P sku. The first is the location of the capacity (region). The second is who gets assigned as a capacity administrator. When you purchase the A sku, those are front and center. As a result, the license purchase is an easy process because I select them before committing any money.

    Purchase the P sku

    Unlike the A sku, you purchase the P sku in the Office 365 admin center. The glaring difference from the A sku experience is that you purchase the P1 license without any configuration… This can cause a bit of heartburn if you need to ensure that the capacity is applied to the right region upon purchase. For the moment, assume things will come out smelling like roses and move on to the steps to purchase the P1 license. In the O365 Admin Center, under Billing, you will Select Purchase Services and then Search for Power BI. This pulls up the list of licenses you can choose from, and you are going to Select the Power BI Premium P license.

    Selecting the license presents you with the options for payment type.

    Here is where we see the much cheaper price of $4,995.00, but it comes with the yearly commitment. (As a side note, I really wish we had the yearly commitment option with the A sku; with that option available, I wouldn’t even have to muck around with the P sku for my implementation.)
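    To make that comparison concrete, here is a rough back-of-the-envelope sketch in Python. The P1 figure is the monthly price shown above; the A4 hourly rate is a hypothetical placeholder, not a quoted Azure price, so plug in the current rate before deciding:

    ```python
    A4_HOURLY_RATE = 8.00        # hypothetical placeholder $/hour, not a quoted price
    P1_MONTHLY_PRICE = 4995.00   # monthly price with the yearly commitment, as shown above

    def cheaper_option(hours_running_per_month: float) -> str:
        """Return which sku is cheaper for the number of hours the A4 would run."""
        a4_cost = A4_HOURLY_RATE * hours_running_per_month
        return "A4" if a4_cost < P1_MONTHLY_PRICE else "P1"

    # An always-on month is roughly 730 hours; a capacity paused overnight
    # and on weekends runs far fewer, which is where the A sku shines.
    ```

    With placeholder numbers like these, a capacity you can pause for most of the month favors the A sku, while an always-on capacity favors the P1.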

    After you complete the purchase process you can navigate to Purchase Services again and see that the Power BI Premium P license is now active.

    Assign and Configure on Setup

    Now what?

    Well, all you savvy Power BI Admins, we head over to the Power BI Service of course!

    Log in to your Service (app.powerbi.com) and, because you’re all Global Administrators or Power BI Administrators, you have access to the Admin Portal. For you first-timers, that would be under

    Settings > Admin Portal

    (If you do not see the Admin Portal, you will need to ask your IT or security team to grant you the Power BI Administrator role.)

    Now normally when you go into the portal you would see a page that looks like this under Capacity settings.

    But after you purchase the license, in O365, and come back to the Power BI Service you will see this the first time you log in.

    Click on the Set up new capacity button and you get to the screen that I, and all of you, wanted to see from the start: where you add any additional capacity administrators and choose which region you want to use for your capacity.

    As the gray box outlines for you, the initial region is the home region of the Power BI tenant, but Clicking on the Region pops up all the other regions you can choose from.

    Make your selections. All that remains to be done is Clicking on the Set up button. Now your capacity is provisioned based on your configuration settings.

    Your new view when you log in to work with your capacity looks like this. You’ll be back often to monitor and alter any capacity configurations.

    Apply Capacity to Workspace

    I’ll close this out by showing you the final step of how you apply that new capacity to your workspaces.

    Jump back out into the home screen by Clicking Power BI in the upper left hand corner of the browser.

    Click on Workspaces and hover over the workspace that you want to add to the capacity.

    Click on the ellipses (3 dots) that appear to the far right and Select Workspace settings.

    The Settings dialog will appear on the right-hand side and you will Click on Premium in the header. Give the application a moment and you will be able to toggle Dedicated capacity to On.

    In the dialog, Select the newly provisioned capacity and Click on Save.

    You will now see a little diamond next to your workspace name.

    After all that, you now have a Premium P license capacity supporting the datasets and reports in that workspace.

    Wrap Up

    I had too many unresolved questions during this process. I was looking for something like this blog to assure me I was headed in the right direction. Since I didn’t find it, I decided to write up my experience. I want to make sure others with those same questions can see what it looks like to go through the process from end to end. Hopefully this perspective helps when making the leap into Power BI Premium P1 licensing.


  • Split an existing Power BI file into a model and report

    Shared datasets are a great way to share data models across the organization. This enables users to maintain one source of the truth and increases efficiency. However, models should be built separately from reports, and it can be difficult to transform an existing report into a shared dataset.

    This post describes how to split a pbix file containing both a report and a model into two separate files: one for the model and one for the report. This is accomplished using a PowerShell script. Using this technique, any report can quickly be split into a data model file and a live connection report file.

    Note: this script is not officially supported by Microsoft. The code is provided as is, without any guarantees. It will alter the internal files of your pbix, so please keep a backup if you are unsure of anything.

    Think like the Business, Act like I.T

    Being a great Power BI developer can often mean more than just building visually impressive reports. Focus should be given to efficiency, reusing design and data modelling where possible. In addition, there should be “one source of the truth” – different reports should not have different methods to calculate the same KPI.

    This is accomplished by creating Shared Datasets. Users can publish a report that contains no visuals, only a data model. Multiple reports can then be built off this model by using a live connection, so they all use the same data model. The model contains global measures: a global measure is written and stored once in the model file and re-used by other reports via the live connection. This ensures all reports share the same data model, logic and refresh status.

    There are many reasons you should consider this approach, which are out of scope for this article. If you are less familiar with shared datasets, I encourage you to visit the following resources:

    Think like the Business, Act like I.T user group – global models

    Power BI datasets: A Method to the Madness article

    Planning A Power BI Enterprise Deployment whitepaper DOWNLOAD
    ( Chris Webb and Melissa Coates) – section 9

    Use Cases

    Shared datasets are great, but what if you already have a file with the model and report in one? Currently, splitting a file is a manual process: either copy all the visuals over to a new report (you would also need to re-create any bookmarks, filters, layouts, etc.) or copy all the Power Query queries (you would then need to set up relationships and rewrite all measures). This can be time consuming, especially for a large report or model.

    Luckily for you, this code will do all the hard work for you. Simply run the code and select a PBIX file. It will create two new files, a report and a model.
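    For the curious, the core idea can also be sketched cross-platform with Python’s standard zipfile module: a pbix file is a zip archive, and each output file simply drops certain internal entries. This is an illustrative sketch only (the entry names mirror the PowerShell script in this post), not a supported tool, so keep backups:

    ```python
    import zipfile
    from pathlib import Path

    # Top-level entries each output drops (mirroring the PowerShell script):
    MODEL_DROPS = {"SecurityBindings", "Report"}
    REPORT_DROPS = {"SecurityBindings", "Connections", "DataModel"}

    def copy_without(src: str, dst: str, drops: set) -> None:
        """Copy a pbix (a zip archive) to dst, skipping dropped top-level entries."""
        with zipfile.ZipFile(src) as zin, zipfile.ZipFile(dst, "w") as zout:
            for item in zin.infolist():
                if item.filename.split("/")[0] not in drops:
                    zout.writestr(item, zin.read(item.filename))

    def split_pbix(path: str) -> None:
        """Create <name>_model.pbix and <name>_report.pbix next to the source file."""
        p = Path(path)
        copy_without(path, str(p.with_name(p.stem + "_model.pbix")), MODEL_DROPS)
        copy_without(path, str(p.with_name(p.stem + "_report.pbix")), REPORT_DROPS)
    ```

    Note that rewriting a pbix this way may not preserve every packaging detail Power BI Desktop expects; the PowerShell script in this post is the tested path.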

    Running the code

    Right click and select the option Run with PowerShell in the menu.

    A menu will open up. Select the Power BI file that you wish to split.

    Click the button OPEN to allow the script to modify your file.

    The script will then create two copies of the file and add the suffix _model and _report. Feel free to rename these, if you desire.

    Publishing to the Service

    Model File

    Open the _model file. It’s a good idea to add some text to the report explaining this is only a model file, so others understand the purpose. Example:

    Publish this file to the desired workspace.

    Report File

    Open the _report file. The script will leave all visualizations and report features intact, but all connections will be removed. When you open the report in Power BI Desktop, all visuals will appear broken:

    If prompted, make sure you discard changes – this will completely detach the report from the source.


    Click Get Data and Select Power BI Datasets.

    In the pop-up window, select the model file you published in the previous step. This will restore all visuals so they display again.

    Publish the report to the desired location.

    That’s it! You can now share your new model file and continue to build reports off it.

    Download the Script

    You can download the script described below.

    Download the PowerShell file and save it to your local machine.

    Script

    #This script was designed by Steve Campbell and provided by PowerBI.tips
    #BE WARNED this will alter Power BI files so please make sure you know what you are doing, and always back up your files!
    #This is not supported by Microsoft and changes to future file structures could cause this code to break
    
    #--------------- Released 6/2/2020 ---------------
    #--- By Steve Campbell provided by PowerBI.tips ---
    
    
    #Choose pbix function
    Function Get-FileName($initialDirectory)
    {
        [System.Reflection.Assembly]::LoadWithPartialName("System.windows.forms") | Out-Null
        
        $OpenFileDialog = New-Object System.Windows.Forms.OpenFileDialog
        $OpenFileDialog.initialDirectory = $initialDirectory
        $OpenFileDialog.filter = "PBIX (*.pbix)| *.pbix"
        $OpenFileDialog.ShowDialog() | Out-Null
        $OpenFileDialog.filename
    }
    
    
    #Error check function
    function IsFileLocked([string]$filePath){
        Rename-Item $filePath $filePath -ErrorVariable errs -ErrorAction SilentlyContinue
        return ($errs.Count -ne 0)
    }
    
    
    #Function to Modify files
    Function Modify-PBIX([string]$inputpath, [string[]]$filestoremove){
    
        #Make temp folder
        $temppth = $env:TEMP  + "\PBI TEMP"
        If(!(test-path $temppth))
        {New-Item -ItemType Directory -Force -Path $temppth}
    
        #Unpackage pbix
        $zipfile = ($inputpath).Substring(0,($inputpath).Length-4) + "zip"
        Rename-Item -Path $inputpath -NewName  $zipfile
                  
        #Initialise object
        $ShellApp = New-Object -COM 'Shell.Application'
        $InputZipFile = $ShellApp.NameSpace( $zipfile )
    
        #Move files to temp
        foreach ($fn in $filestoremove){ 
           $InputZipFile.Items() | ? {  ($_.Name -eq $fn) }  | % {
           $ShellApp.NameSpace($temppth).MoveHere($_)   }  
        }
        
        #Delete temp
        Remove-Item ($temppth) -Recurse
        
        #Repackage 
        Rename-Item -Path $zipfile -NewName $inputpath  
    }
    
    
    
    
    #Choose file
    try {$pathn = Get-FileName}
    catch { "Incompatible File" }
    
    
    #Check for errors
    If([string]::IsNullOrEmpty($pathn )){            
        exit } 
    
    elseif ( IsFileLocked($pathn) ){
        exit } 
    
    #Run Script
    else{    
    
        #set variables
        $modelfiles   = @( 'SecurityBindings', 'Report')
        $reportfiles   = @('Connections','DataModel',  'SecurityBindings')
        
        #Copy files
        $pathf = Get-ChildItem $pathn
        $reportname = [io.path]::GetFileNameWithoutExtension($pathn)
        $model = ($pathf).toString().Replace('.pbix', '_model.pbix')
        $report = ($pathf).toString().Replace('.pbix', '_report.pbix')    
        Copy-Item $pathn -Destination $model
        Copy-Item $pathn -Destination $report
    
        #modify files
        Modify-PBIX $model $modelfiles
        Modify-PBIX $report $reportfiles
        
    }
    
    
    

    Script Usage License

    Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the “Software”), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

    The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

    THE SOFTWARE IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.



  • Hot Swap Power BI Report Connections

    Let’s say you have a Power BI file connected to an Analysis Services machine. Then you want to change the data source to PowerBI.com using a Live Connection. Well, you are out of luck. Until NOW!!

    This post describes how to remove a connection from a report using a PowerShell script. For any source, you can simply delete the data source and then re-point your report.

    Note: this script is not officially supported by Microsoft. The code is provided as is, without any guarantees. It will alter the internal files of your pbix, so please keep a backup if you are unsure of anything.

    Use Cases

    Imagine the following diagram. Here we have a Power BI report connected to an Analysis Services model. Now, you wish to re-point the report to a Power BI dataset.

    Note: we assume the Analysis Services model and the Power BI dataset have the same definition, meaning all the columns and measures are the same.

    The script below removes the connections from a Power BI report.

    Download Script

    Download the PowerShell script here.

    Note: Running it on a file without a live connection will not have an effect on the file.

    Running the code

    Download the PowerShell file and save it to your local machine.

    Right click and select the option Run with PowerShell in the menu.

    A menu will open up. Select the Power BI file that you wish to remove the connection from.

    Click the button OPEN to allow the script to modify your file.

    The script will leave all visualizations and report features intact, but all connections will be removed. When you open the report again in Power BI Desktop, all visuals will appear broken:

    This is because you have removed all data from the report. Select a new data source to connect the report to. If the new source matches the names of the columns and measures used in the visuals, they will all repopulate.

    How the Code Works

    Internally, pbix files are zip packages containing a selection of metadata and media files.

    The PowerShell script will alter these internal files, removing the Connections file which holds the information to the live connection.

    It then removes a security file, SecurityBindings, which must be removed to avoid corrupting the file. SecurityBindings will be repopulated the next time you edit and save the file.
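    The same unzip, delete, rezip steps can be sketched in Python with the standard zipfile module (zip entries cannot be deleted in place there, so the archive is rewritten without them). This is an illustrative sketch of the technique, not a supported tool; keep a backup of your file:

    ```python
    import os
    import zipfile

    # Internal entries that hold the live connection information
    DROPS = {"Connections", "SecurityBindings"}

    def remove_live_connection(pbix_path: str) -> None:
        """Rewrite the pbix (a zip archive) without the Connections
        and SecurityBindings entries."""
        tmp = pbix_path + ".tmp"
        with zipfile.ZipFile(pbix_path) as zin, zipfile.ZipFile(tmp, "w") as zout:
            for item in zin.infolist():
                if item.filename not in DROPS:
                    # copy every other entry through unchanged
                    zout.writestr(item, zin.read(item.filename))
        os.replace(tmp, pbix_path)
    ```

    As with the PowerShell version, opening and saving the file in Power BI Desktop afterwards repopulates SecurityBindings.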

    Script

    #This script was designed by Steve Campbell and provided by PowerBI.tips
    #BE WARNED this will alter Power BI files so please make sure you know what you are doing, and always back up your files!
    #This is not supported by Microsoft and changes to future file structures could cause this code to break
    
    #--------------- Released 5/28/2020 ---------------
    #--- By Steve Campbell provided by PowerBI.tips ---
    
    
    #Choose pbix function
    Function Get-FileName($initialDirectory)
    {
        [System.Reflection.Assembly]::LoadWithPartialName("System.windows.forms") | Out-Null
        
        $OpenFileDialog = New-Object System.Windows.Forms.OpenFileDialog
        $OpenFileDialog.initialDirectory = $initialDirectory
        $OpenFileDialog.filter = "PBIX (*.pbix)| *.pbix"
        $OpenFileDialog.ShowDialog() | Out-Null
        $OpenFileDialog.filename
    }
    #Error check function
    function IsFileLocked([string]$filePath){
        Rename-Item $filePath $filePath -ErrorVariable errs -ErrorAction SilentlyContinue
        return ($errs.Count -ne 0)
    }
    
    
    #Choose file
    try {$pathn = Get-FileName}
    catch { "Incompatible File" }
    
    
    #Check for errors
    If([string]::IsNullOrEmpty($pathn )){            
        exit } 
    
    elseif ( IsFileLocked($pathn) ){
        exit } 
    
    #Run Script
    else{    
       
        #Unzip pbix
        [Reflection.Assembly]::LoadWithPartialName('System.IO.Compression')
        $zipfile = $pathn.Substring(0,$pathn.Length-4) + "zip"
        Rename-Item -Path $pathn -NewName  $zipfile
    
        #Delete files
        $files   = 'Connections', 'SecurityBindings'
        $stream = New-Object IO.FileStream($zipfile, [IO.FileMode]::Open)
        $mode   = [IO.Compression.ZipArchiveMode]::Update
        $zip    = New-Object IO.Compression.ZipArchive($stream, $mode)
        ($zip.Entries | ? { $files -contains $_.Name }) | % { $_.Delete() }
    
        #Close zip
        $zip.Dispose()
        $stream.Close()
        $stream.Dispose()
    
        #Repackage and open
        Rename-Item -Path $zipfile -NewName $pathn 
        Invoke-Item $pathn 
    }

    Editing the Code

    The code runs on PowerShell. PowerShell is a cross-platform task automation and configuration management framework, consisting of a command-line shell and scripting language.

    If you want to view or edit the code, first open PowerShell ISE. We recommend running it as administrator. Afterwards, you can paste the code below into the editor.

    Other Considerations

    The script can be useful when using two Power BI files for development and deployment to the PowerBI.com service. In this use case you have one Power BI file that is the bifurcated model and one report file that has just report pages. To speed up development, you can open both the model file and the report file, and point the report file to a localhost instance of the running Analysis Services model.

    Here is an example of what this architecture would look like.

    Script Usage License

    Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the “Software”), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

    The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

    THE SOFTWARE IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.



  • Milwaukee Brew City PUG – April 2020

    The Milwaukee Brew City PUG for April kicks off with some quick updates and highlights of upcoming events. We spend a quick minute on why we’re so excited about the fantastic lineup of webinars we were able to do recently, which highlight the top 3rd-party tools to get the most out of your Power BI experience. We were excited to welcome a brand new Microsoft MVP, Chris Wagner, to speak at this April PUG. He walks through How to Build a World Class Center of Excellence for Power BI. We also had some great conversation around a myriad of topics in the latter half of the meeting, so be sure to stick around to catch some of that at the end.

    Enjoy the Meeting!


  • Power BI Field Finder

    If you’re like me, building a data model in Power BI is an iterative process.  Sometimes you try out different ways of writing measures before you hit on the one that’s right, and you end up with temporary measures that never get used in visuals.  You may also pull in more columns than you end up needing, just in case.  The final result is your report masterpiece, but among its measures and columns there are probably quite a few that you don’t need.  Two problems with extraneous columns and measures are that they (1) can slow down your model and (2) can make it more difficult to maintain.  You may also want to know where on your report a change to a measure will have an impact.

    The Solution

    This led me to a problem: I had no easy way of finding out where columns and measures were used in the visuals of my reports.  I could click on each separate visual to see what elements it used, but I couldn’t go the other way and click on a data element to see where it was used in visuals.  In researching the existing Power BI utilities, I learned a lot about how to improve my data models, but I couldn’t find anything that showed where exactly data elements were used in reports.  So I decided to build one myself.  Based on the incredibly informative blog posts of Gilbert Quevauvilliers, Reza Rad, Jese Navaranjan, Imke Feldmann, and David Eldersveld, among others, I was able to create a tool that met some of my needs.  Hopefully it will help you, too!

    You can download the Field Finder here: 

    If you are interested in contributing to enhanced versions of the Field Finder feel free to check it out on Github here: https://github.com/stephbruno/Power-BI-Field-Finder

    The download contains both the pbix file and the template (pbit) file for you to use.  Use the template if you want to just get started and use the tool as-is. To see how it works, or make some tweaks of your own, use the pbix file.

    Setting up the Field Finder

    1. Double-click the Power BI Field Finder.pbit file wherever you downloaded it.

    2. After the file opens, you’ll be prompted to enter a value for the input parameter. This is the complete file path and name of the pbix file you’d like to analyze.  (Thanks to Marco Russo, you no longer need to have your source file unzipped!)

    3. Click the Load button.  The file will start importing the layout information for your file.
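    Behind the scenes, the layout information lives in the Report/Layout entry inside the pbix archive. As a rough illustration of the kind of information being imported, here is a Python sketch that lists the visual types per page. The entry name, encoding, and field names reflect the undocumented pbix layout format described in the community blog posts the tool credits, so treat them as assumptions rather than a stable API:

    ```python
    import json
    import zipfile

    def list_visual_types(pbix_path: str) -> dict:
        """Read the Report/Layout entry (JSON, UTF-16-LE encoded in pbix files)
        and return the visual types used on each report page."""
        with zipfile.ZipFile(pbix_path) as z:
            layout = json.loads(z.read("Report/Layout").decode("utf-16-le"))
        pages = {}
        for section in layout.get("sections", []):
            types = []
            for visual in section.get("visualContainers", []):
                # each container's config is itself a JSON string
                cfg = json.loads(visual["config"])
                types.append(cfg.get("singleVisual", {}).get("visualType", "?"))
            pages[section.get("displayName", "?")] = types
        return pages
    ```

    The Field Finder template does all of this for you with Power Query; the sketch just shows where the data comes from.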

    Field Finder Contents

    4. After loading the information, you’ll see four tabs. 

    • The first tab (Instructions) provides basic information and instructions.
    • The second tab (Columns and Measures) provides an overview of all pages of your report.
    • The third tab (Page Details) provides more detail on each page of your report.
    • The fourth tab (Helpful Resources) includes links to the blog posts I used to make the tool.

    Columns and Measures

    5. Let’s look at the Columns and Measures tab

    On this tab, you get a thumbnail along the top of each of the pages in your report. The types of visuals are color coded and the name of the page is above the thumbnail.  The table on the bottom left gives you the number of times each column or measure is used on each page of the report.  On the right you will see the page level and report level filters using columns and measures.
    Clicking on the thumbnail of a page will filter the tables below to show you exactly which elements are used on that page.  Similarly, clicking on a column or measure name in the tables will cross-filter the thumbnails above, letting you quickly see which pages it’s used on.  For example, clicking Units from the Sales table in the bottom left shows that it’s used in the Pre-Formatted Visuals, P2, and Tooltip pages; the thumbnails cross-filter to just those pages.

    Or, by clicking on just the thumbnail for P2, you can see which columns and measures are used on the entire page.

    Page Details

    6. For more detail on a specific page, click on the “Page Details” tab.

    The Page Details tab provides a bigger image of the visuals on the page, and includes more details on the types of visuals and what they’re using.  You can select one of your pages with the slicer on the left. The Visuals Legend provides a color code of the type of visual used and the name of the visual.  If you haven’t updated the name of your visual in the Selection Pane of your pbix file, you’ll get one of these long strings for the visual name that isn’t very helpful.  In addition, you will also see the list of all the columns and measures used in the visuals as well as any page level filters.
    For example, by clicking on Class in the table in the bottom right, you can see at a glance where it’s used on the page.  The image of the page also makes it easy to go back to the pbix file and find exactly which visuals it’s used on.

    7. If you make changes to your source pbix file, just save it and then refresh the Field Finder to get your updates.

    8. If you want to examine a different pbix file, all you need to do is change the parameter for the file name and path.  To do that, go to Transform data – Edit parameters and enter the path and name of the next file you want to look at.

    I hope you find the Field Finder useful.  Please let me know if you have any suggestions for future versions.

    Many thanks to Seth and Mike of powerbi.tips for creating the improved visual layout of the Field Finder!

  • Introducing Scrims

    PowerBI.tips is excited to announce our new tool to help you build the best looking reports, Scrims.

    We’ve built out a fast and easy solution for snapping visualizations into place while giving your reports the extra-special look and feel that impresses your audience without losing focus on the important stuff. We developed Scrims to give you this shortcut to amazing-looking reports.

    16:9 (1280 x 720) examples

    What are Scrims?

    A Scrim is a background image that you would use on your Power BI report pages. Why the name Scrim? The term comes from theater productions: a scrim is a backdrop placed on the stage behind the actors, adding context and engaging the audience with the production. We thought this idea crossed over well to Power BI.

    A scrim can change the mood of a theater just like a well designed background image in your report. Scrims were developed to be easily adaptable to different color themes and in each set of offerings we release we’ll be providing you with as many color options as possible.

    Why Do I need One?

    Scrims solve several problems.

    1. The most important problem they solve is time. As BI practitioners ourselves, we know the demands that are placed on you. More often than not, the visual look and feel gets the least attention due to deadlines. We want all the time you invest in the data and visuals to impress your audience by being presented in a beautiful way.
    2. Scrims are designed the same way we developed Layouts: by emphasizing proper design focused on the Gestalt design principles, so end users experience enjoyable, non-distracting reports.
    3. Using a Scrim reduces the number of objects on the page, and fewer elements means faster-rendering reports. Here is a blog that tests this by adding more visuals to a page: “More Visuals Mo Problems”. Scrims add the illusion of a very large number of objects without the performance impact.

    Scrims come in a Bundle

A Scrim bundle contains a series of images that you can use in any way you want in your Power BI file. Each Scrim will have different page sizes to best suit your report needs. Every Scrim will contain a default 16:9 (1280 x 720) aspect ratio. Most Scrim bundles will have additional ratios such as 8:9 (1280 x 1440) or 4:3 (960 x 720). You will see the sizes prominently displayed for each bundle with the red tags.
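Those pixel dimensions reduce to the advertised ratios; here is a quick sketch that verifies them (the sizes are from this post, the helper name is ours):

```python
from math import gcd

def aspect_ratio(width, height):
    """Reduce pixel dimensions to their simplest width:height ratio."""
    d = gcd(width, height)
    return f"{width // d}:{height // d}"

print(aspect_ratio(1280, 720))   # 16:9
print(aspect_ratio(1280, 1440))  # 8:9
print(aspect_ratio(960, 720))    # 4:3
```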

    8:9 (1280 x 1440) examples

Each bundle contains at least 6 pages for each size, which means you will typically receive 12 or more pages per bundle. Each Scrim bundle also includes the corresponding color theme in JSON format, ready to upload into your Power BI Report.

    Theme Color example
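For reference, a Power BI report theme JSON follows this general shape; the palette below is an illustrative placeholder, not an actual Scrims theme:

```json
{
  "name": "Example Scrim Theme",
  "dataColors": ["#1F4E79", "#2E75B6", "#9DC3E6", "#F2A104", "#70AD47"],
  "background": "#FFFFFF",
  "foreground": "#252423",
  "tableAccent": "#1F4E79"
}
```

In Power BI Desktop, a theme file like this is applied from the View ribbon under Themes > Browse for themes.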
  • Building a Gateway Cluster

    Building a Gateway Cluster

Power BI requires a gateway for refreshing on-premises data sources. There are a myriad of data sources you can connect to, and two different ways you can set up the gateway. The first is to install the gateway in “personal mode”. The second method uses the “On-premises data gateway (recommended)” (OPDG), which used to be called the “Enterprise Gateway”. The OPDG is what you need to set up and configure appropriately in order to manage permissions, and it is the recommended gateway when deploying reports to a wider audience.

Over the course of time, enhancements to the gateway have increased its usability and functionality: connecting to cloud and on-premises data in the same report, allowing custom connectors, and distributing workloads across multiple gateways that you have clustered together. You can toggle all of these settings for the gateway in the Power BI Service.

By setting up a gateway cluster you ensure that your reports are going to refresh even if there is some maintenance activity or if a server goes offline unexpectedly. This article will explain how to configure this setup and outline a few gotchas related to managing it that aren’t so intuitive.

    Setup

    For DR (Disaster Recovery)

The name of the game here is to choose servers that are not near each other. While this may not be ideal from a refresh-performance perspective, it provides a better recovery option if something catastrophic happens to a server.

    For Load Balancing

The goal in this gateway setup is to choose several servers nearest the data sources, distributing the refresh load across them for improved performance.

In a perfect world, we could combine DR with load balancing. At this point in time, you can either distribute the load across all gateways, or choose not to distribute the load, in which case it defaults to DR. There is no combination of both methods.

    Install the Gateway

    The install of the gateway should be pretty straight forward provided you follow the guidelines here https://docs.microsoft.com/en-us/data-integration/gateway/service-gateway-install

What I’ve found with installing the gateway is that if it isn’t easy, it’s hard. When it is hard, you will likely need to do a bunch of troubleshooting to resolve the issues. A great helper guide for troubleshooting common and not-so-common problems can be found here – https://docs.microsoft.com/en-us/power-bi/service-gateway-onprem-tshoot . Install the gateway on the initial server; once you get past the first couple of screens, you will hit this one:

You want to register a new gateway, and be sure to set and SAVE your recovery key. The key is the most important part of how you both recover a gateway and create a cluster.

    (Note: You can always add a new gateway to an existing gateway, this doesn’t have to be a net new process)

    Adding the Clustered Gateway

    After you have the first gateway installed, login to the second server, where you will follow the exact same process. Except this time when you hit the next screen you will want to toggle the “Add to an existing gateway cluster” and enter in the same recovery key.

By entering the recovery key and checking the box, you have declared to Power BI that this gateway should be linked to the other gateway, creating a cluster. Like the first gateway, it now shows up in the Service, and we can set up the data sources to match our other gateway. At this point we can configure the gateway to distribute load; by default it will just be used for DR.

    Once you have your gateways deployed you can manage them via the Power Platform Admin Center – https://admin.powerplatform.microsoft.com

Be aware that it does require appropriate permissions to see and manage the gateways, so check out the documentation here – https://docs.microsoft.com/en-us/power-platform/admin/onpremises-data-gateway-management . Since gateways can be used for other products, this is where you can see what is installed where, and figure out who has access.

I’m hoping that, now that we have an admin center, we’ll eventually be able to manage upgrades, clusters, and all “admin” related activities from this interface. As of this writing, we are still relegated to using APIs and cmdlets to manage these objects. Just recently, the Power BI team released a whole slew of PowerShell cmdlets to help administrators manage gateways; be sure to check out the preview release notes here: https://powerbi.microsoft.com/en-us/blog/on-premises-data-gateway-management-via-powershell-public-preview/

    Removal

If you are like me and have installed a gateway and then needed to remove it, you receive all kinds of errors after the fact in the Service related to the defunct gateway.  Truth be told, you will need to manually remove the gateway from the cluster, which you can do with a PowerShell script.  This does not mean uninstalling the gateway; it means running the Remove-OnPremisesDataGateway cmdlet to detach the gateway from the Power BI Service. If you don’t, it will remain there. I recently uncovered this while upgrading a gateway: I was still getting error messages saying that gateways I had previously deleted after testing out clustered gateways were out of date… which of course they were, they were gone! To remove these errors, follow the instructions here to fully remove the gateway: https://docs.microsoft.com/en-us/data-integration/gateway/service-gateway-powershell-support

    Summary

All in all, it is fantastic that we have the ability to provide more stability to our data refreshes. There is a ton more information that could be outlined for data gateways, but I hope this post gave you some good tips and tricks around gateway clustering and how to clean up after yourself when deploying clustered gateways.

    If you like the content from PowerBI.Tips, please follow us on all the social outlets to stay up to date on all the latest features and free tutorials.  Subscribe to our YouTube Channel, and follow us on Twitter where we will post all the announcements for new tutorials and content. Alternatively, you can catch us on LinkedIn (Seth) LinkedIn (Mike) where we will post all the announcements for new tutorials and content.

    As always, you’ll find the coolest PowerBI.tips SWAG in our store. Check out all the fun PowerBI.tips clothing and products:
    Store Merchandise

  • Updating PBIX Files in SharePoint: Do’s & Don’ts

    Updating PBIX Files in SharePoint: Do’s & Don’ts

I recently encountered a really frustrating experience with a set of reports that seemed not to update after some data source changes. I’d done this change before, and had a co-worker take on the task of updating some of the other reports in the same fashion. After a bit he reached out and explained that a few of the reports wouldn’t update while the others had… This was odd, since we were making the same type of change. Cue hours of testing, changing, fixing one report without any idea how I did it, and then hitting a brick wall with the remaining two.

    The Setup: The data sources for the original reports were Excel spreadsheets stored in a SharePoint Online folder. The PBIX file connected to Excel files in SharePoint online using a Web connection. Once the reports were created, we stored the PBIX files in a separate SharePoint Online folder. Next, we opened the Power BI Service and connected to the reports directly from Power BI to a SharePoint team site, if you aren’t familiar with that process you can read about it in depth in an article that Mike Carlo wrote -> here. (The TLDR -> Get data -> Files -> SharePoint Folder -> navigate to folder -> Click on PBIX -> Connect)

By connecting to the PBIX files from the PBI Service, the dataset, report and dashboard for the report are loaded and synced. Where this comes in very handy is that, after linking the SharePoint folder to your Windows Explorer, you can access the file, make changes, and save the PBIX in that “local” location; you no longer need to re-publish the report from the PBIX, and the changes make their way to the Service without any further work. I had made several changes to the reports from an aesthetics perspective, and maybe added a new measure here and there, but hadn’t done anything drastic until just recently, when we updated the sources. Due to some changes in the backend processes that were generating the Excel files, it became apparent that it would be easier for the Dev team if we used CSV files instead of xlsx. The team went through and changed the reports, and 3 of the 5 reports we were working on broke in the Service, giving this error:

    “Failed to update data source credentials: Web.Contents failed to get contents from” after updating data source

At first, I thought it was the use of parameters in the report. Determining that wasn’t the issue actually helped me figure out that the report in the Service wasn’t updating because the parameters were still showing the “xlsx” file type from the files that I had already removed from the SharePoint location. After repeated attempts to change the data sources manually in the original file, delete and replace, and every other combination of things to try a refresh, I was stumped. What was more confusing is that connecting to the file again from the PBI Service created another dataset & report of the exact same name!

    In my mind, that wasn’t supposed to happen, and I was getting more frustrated because things were not behaving as I would expect. My initial assumption here was that the PBI Service would look at the new PBIX of the same name and recognize the changes and replace the existing dataset and report, the same way that it does if I manually push a PBIX via the “Publish” action. But using this direct connect method, refreshing the dataset did nothing…(cue the “grrrr”).

    The solution:

Thanks to the fantastic Power BI team, they were able to determine my issue and share the root cause of some of the datasets not updating. When you connect to PBIX files in this manner, the PBI Service looks at the root id of the PBIX file in the OneDrive location, and that file id cannot be deleted and replaced with another one. If you do, you get the above error on the dataset source because the Service cannot find the file id. If you remove the file from SharePoint, then when you insert it back into the folder the id will be different. This is what breaks the dataset in Power BI, and also why it allows what appears to be a dataset and report of the same name. However, there is good news: because you are in SharePoint, you can go back through the folder’s version history, pull the previous versions of the reports, and resolve the issue in the datasets. I did end up having to replace one of the reports, but lucky for us it was a single-user customer report, and re-sending a shared link was really low impact compared to what it could have been.

Here are the guidelines I can offer up after going through this.

    Do:

    • Sync the SharePoint folder to your local storage
    • Open the report directly from the synced folder and update it in location
• Leave the original file in the folder and overwrite it with the new one
• Try to resurrect the previous version of the report PBIX from the folder’s version history (if you are hitting this issue now)

    Don’t:

    • Move the file to your local machine, then back to the SharePoint folder
• Remove the file from the folder. (You can move it to another SharePoint folder, update it, and move it back, but if the dataset refreshes during that time in the Service, it will break and you will need to fix the credentials once you have put the file back.)

I hope you found this post worthwhile. Error messages are hard to track down to their root causes, and since this was a tricky one for me to solve, I thought I would share the experience and hopefully help others avoid it, or resolve the matching errors above.

    When testing out this issue again, I did uncover this error message which is much more specific. It came up after I made the breaking change, but prior to an automated refresh.

    “We are unable to locate the file on OneDrive for Business. Please check that the file exists in this location.”


  • Power BI Refresh Overview

    Power BI Refresh Overview

    There are different ways you can connect to a multitude of different data sources. I’ve written about the different connection types before and you can find those articles here if you are unfamiliar with what I’m talking about.

When you import data and publish a Power BI report to the Power BI Service, you need to schedule a dataset refresh in order for the report data to be updated. This can be done manually by refreshing the PBIX and re-publishing the file, but for any production report you will want a schedule, so you need to ensure things are set up to update automatically without further intervention.
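Scheduled refresh is configured in the Service, but for completeness, a refresh can also be kicked off on demand through the Power BI REST API. Here is a minimal sketch in Python, assuming you already have an Azure AD access token in hand (the helper names are ours, not part of the API):

```python
import urllib.request

API = "https://api.powerbi.com/v1.0/myorg"

def refresh_url(dataset_id, workspace_id=None):
    """Build the REST endpoint that triggers an on-demand dataset refresh."""
    if workspace_id:
        return f"{API}/groups/{workspace_id}/datasets/{dataset_id}/refreshes"
    return f"{API}/datasets/{dataset_id}/refreshes"

def trigger_refresh(dataset_id, token, workspace_id=None):
    """POST with an empty body starts a refresh; HTTP 202 means it was accepted."""
    req = urllib.request.Request(
        refresh_url(dataset_id, workspace_id),
        data=b"{}",
        method="POST",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    return urllib.request.urlopen(req).status
```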

The first thing you need to check is whether or not you need a gateway. Any data source that is located on your company’s servers, on a local machine, or within certain types of cloud infrastructure is considered “on-premises”. When a data source is on-premises, you require a gateway in order to refresh the data. If your data is in a cloud source such as OneDrive or SharePoint Online, you will not need a gateway to refresh.

Once you determine whether or not you require a gateway, you need to understand that there are two different types. The first is called a personal gateway; it allows you to use the majority of Power BI data sources with no configuration. After you install this gateway and publish a report using a data source it supports, you can set up a dataset refresh in the Power BI Service without any additional setup. The second type used to be called the “Enterprise Gateway”, but now it is simply the gateway that is not in personal mode.

    Install a gateway from here: https://powerbi.microsoft.com/en-us/gateway/

    gateway installer

This gateway is designed for use in your enterprise-wide reports. It supports many more data sources for refresh (that list can be found here), and it requires some additional setup, configuration and management from an administrator. Let’s walk through how to set one of these up so that you understand how to do it, or how you can request to have it done for your organization. It is highly recommended that you install the “Enterprise Gateway” on a server that will never shut down and is not on a scheduled maintenance cycle. You can download this gateway to your local computer, but if the computer is off, so is your gateway, and the reports will not refresh.

    Installing the gateway:

Installation is very similar to the personal gateway, except you have options to add the gateway to a cluster or incorporate a recovery key. ALWAYS set a recovery key and SAVE IT! This key will allow you to move the gateway at a later date without impacting any of your reports. This key is GOLD; treat it like it and you will save yourself a bunch of headaches in the future. Outside of that, the install of the gateway should be really straightforward; if you do run into issues, there is a comprehensive guide to troubleshooting the problems that you can find here.

I recommend using an administrative account to set up the gateway, because the ownership of the gateway in the Service will begin with this user. If you have one account that manages all the Enterprise Gateways, you’ll save yourself a ton of pain down the road from an administration and discoverability standpoint. If you aren’t the admin, be sure to either have an admin set up an account for you, or let them know that you have an Enterprise Gateway set up using your account and have them securely store the gateway recovery key. Alternatively, if an admin account is not used, be sure to add an IT admin as a gateway administrator in the next step as you configure the gateway.

    Configuring the gateway:

After the installation of a gateway, log in to the Power BI Service (app.powerbi.com) with the user that configured the gateway. Go to Settings > Manage gateways, and you will see several configuration options that affect how end users can see and interact with data sources on this gateway. The main call-out I have in these options is the first one: more often than not, you will want to make sure you allow users to “mix” cloud and on-premises data sources.

    gateway options

Unlike the personal gateway, every on-premises data source must be set up and configured by a Gateway Administrator before other people can access it and create reports against it. This means the data source can use a single user/permission set, and the report and its end users will inherit the read permissions of that user. (Caveats to that would be the application of row-level security, or a live connection, but those are beyond this overview.)

    data source

After the data source has been configured, the administrator needs to add users to the data source so that they can deploy reports using the gateway. This is an ideal use case for Active Directory security groups: administrators can apply a single group and add and remove users from that group, versus managing all the data sources in Power BI. If a user is not added to the data source on the gateway, the report publisher will not be able to set up a refresh using that gateway.

    data source users

    Scheduling a Report dataset to refresh:

Now that you have completed the setup of the gateway and added your users (including yourself) to the gateway, you can publish a Power BI Desktop report and schedule a refresh.

    First, publish the report from the Power BI Desktop to the workspace that you want the report to live in.

    publish

Next, navigate to the workspace and select datasets, follow the dataset ribbon to the right, and click on the ellipsis:

    dataset location
    dataset settings

Navigate to Datasets and expand the Gateway connection section. Enable the use of the gateway if it is not enabled already. All your sources will be listed here, and if any data source connection string does not match exactly what is in your PBIX file, it will become apparent very quickly that you have a broken dataset. All data sources in the gateway need to match exactly what is in your PBIX file.

    enable gateway
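To illustrate how unforgiving the matching is, even near-misses like letter casing or a trailing slash count as different sources. Here is a toy comparison (our own helper and example URLs, not how the gateway actually normalizes anything):

```python
def looks_like_same_source(a, b):
    """Loose comparison that forgives case and trailing slashes.
    The gateway does NOT forgive these; if this returns True but the
    strings are not byte-for-byte identical, expect a broken dataset."""
    normalize = lambda s: s.strip().rstrip("/").lower()
    return normalize(a) == normalize(b)

# These point at the same folder, yet the gateway treats them as different:
print(looks_like_same_source(
    "https://contoso.sharepoint.com/sites/BI/",
    "https://contoso.sharepoint.com/sites/bi"))  # True
```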

    Once the gateway is enabled and all your data sources are working, you can schedule a refresh.

    schedule refresh

One thing in particular to note here: when you deploy your report from your local machine to the Power BI Service, all times are stored in UTC. So, if you are doing time-specific calculations, you may need to take time zones into consideration. Reza Rad has a good article outlining some methods to sort this problem out; you can find it here.
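As a small sketch of what that conversion looks like with Python's standard library (the zone name is just an example):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

def utc_to_local(utc_naive, tz_name="America/Chicago"):
    """Treat a naive timestamp as UTC and convert it to a local zone."""
    return utc_naive.replace(tzinfo=timezone.utc).astimezone(ZoneInfo(tz_name))

# A refresh logged at 14:00 UTC on July 1st is 9:00 AM in Chicago (CDT, UTC-5).
local = utc_to_local(datetime(2020, 7, 1, 14, 0))
print(local.strftime("%H:%M"))  # 09:00
```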

A recent update to the dataset refresh settings also allows us to add email-enabled groups or other specific users to the refresh failure notification section. This is extremely helpful in ensuring a wider audience of users can be notified when a refresh fails.

I hope you found this overview helpful. Stay tuned for more blogs that walk through specific data source connection and refresh setups, to help ease the process of connecting to, and refreshing, all your various data sources in Power BI.
