Author: Tommy Puglia

  • Welcome to Community Jam

    Welcome to Community Jam

    PowerBI.Tips LOVES community. And we are out to prove it.

    We are so proud to announce Community Jam by PowerBI.Tips, the all-inclusive, resource-driven, and COMMUNITY-led learning tool for members of the Power BI Community. Today is our first step – there is so much more to come.

    What is Community Jam

    Community Jam is the one-stop shop for what is important in the Power BI community and for keeping up with the latest news. As of today's launch, there are three main areas of Community Jam: Release Plan, Power BI Ideas, and Bookmarks.

    Release Plan

    We have incorporated Alex Power’s Power BI Release Plan app into Community Jam, allowing you to see what is on Microsoft's roadmap for Power BI releases.

    Power BI Ideas

    Mike Carlo has done an incredible job feeding in all of the ideas on ideas.powerbi.com, allowing you to easily search, find, and vote for new and hot ideas that should be part of the next wave of Power BI features.

    Ideas on Community Jam

    Bookmarks

    This is a cool one. Working with Tommy Puglia, the PowerBI.Tips team now has over 6,000 articles from across the Power BI community of authors and bloggers that you can use as your own knowledge center. Using the bookmarking service Raindrop.io, any new articles, videos, and resources that touch Power BI are saved here.

    Bookmarks in Community Jam – Your Go-To Resource Toolbox

    I want to talk more about the bookmarking feature here. I have been using Raindrop.io for over 3 years to save any relevant article posted around Power BI. Between RSS feeds, Twitter links, and email subscriptions, a ridiculous number of articles have been written around the how-tos, whats, and did-you-knows of Power BI.

    Tips & Tricks

    You can either browse through the latest articles on the Community Jam website or go to the dedicated Raindrop page. On the Embed page, if you have a Raindrop account, you can save articles to your own Raindrop service, or simply read the new articles within Community Jam.

    Use the “more” button to go to the Power BI page, which really opens up the amazing features of Community Jam Bookmarks.

    Find articles based on nested collections

    Nested Collections

    Community Jam Bookmarks have nested collections that allow you to further refine the articles. Simply by choosing a collection at the top of the page, you can find focused articles (new and old) around Community, Power Query, DAX, Admin, or Data Visualization with just one click.

    DAX nested collection

    The top right has a search bar that allows you to search by tags (more to come here) or by your own keywords. Not only will it search the title and description, but Raindrop allows you to search the content of the article as well!

    Search in Community Jam Bookmarks

    Want to learn about variables in DAX? Use nested collections and search, and you have 235 articles to read up on.

    Need to read up on Power Query? How about 917 articles?

    Admin & Governance? 500+.

    What about Tabular Editor 3? 50 focused articles.

    How about Charticulator? 28 dedicated resources.

    Power BI Goals? We’ve got 56 for you.

    The search feature is INCREDIBLE. We have more plans here to include a tagging system (using the # which you can see on the search page) that will further help you find the right resources.

    And More Resources

    Get your favorite PowerBI.tips resources using the app pane in the top left of Community Jam, including the Theme Editor, Charts, DAX Templates, and more. You can find the podcast and all training here too!

    Why Community Jam?

    We cannot stress this enough. This is just the beginning of what we see as a central area for Power BI fans to learn and grow within the community.

    Community Jam is for users, for the community, by the community. Empower and grow, network and learn – Community Jam is ready to go!



  • Using the Power BI Scanner API to Manage Your Tenant’s Entire Metadata

    Using the Power BI Scanner API to Manage Your Tenant’s Entire Metadata

    Much thanks must go to both Ferry Bouwman and Rui Romano for the inspiration and for the work they have already done on use cases for the Scanner API. Ferry created the initial Microsoft Flow to automate the request and pull of data, and design ideas were taken from Rui Romano’s existing report. Please give them a shoutout, because this would not be possible without them!

    Recently the Power BI team announced a major advancement in the ability for Power BI admins to extract tenant-level metadata, with the ability to collect information such as tables, columns, measures, and DAX expressions in datasets in the Power BI Service. This feature is a huge step, and any Power BI Champion should strongly consider its use cases and integrate this solution into their catalog.

    Let’s start with the what and the why of using the Scanner API as a Power BI Admin.

    What is the Power BI Scanner API?

    The Power BI Scanner API allows organizations to request and collect the entire metadata of a tenant’s Power BI schema and catalog. Using the Power BI REST API, users can trigger a scan and, more importantly, extract nearly all of a dataset’s information and schema (a minimal request sketch follows the list below). The Scanner API returns tenant-wide metadata such as:

    • Datasets & Workspaces
    • Data Sources
    • Dataflows
    • Dataset
      • Tables
      • Columns
      • Measures, including the actual DAX expressions
    • Table M Expressions
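
    To make the request shape concrete, here is a minimal PowerShell sketch of triggering a scan with the MicrosoftPowerBIMgmt module (installed later in this article). The endpoint paths and query-string options are the documented Scanner API calls; batching to 100 workspaces reflects the per-call limit covered below.

    ## A minimal sketch, assuming the MicrosoftPowerBIMgmt module and a
    ## signed-in Power BI admin account.
    Login-PowerBI
    ## List workspace IDs to scan; with no modifiedSince parameter, the
    ## 'modified' endpoint returns all workspaces in the tenant.
    $workspaces = Invoke-PowerBIRestMethod -Method Get -Url 'admin/workspaces/modified' | ConvertFrom-Json
    ## Trigger a scan for the first (up to) 100 workspaces.
    $body = @{ workspaces = @($workspaces.id | Select-Object -First 100) } | ConvertTo-Json
    $scan = Invoke-PowerBIRestMethod -Method Post -Url 'admin/workspaces/getInfo?lineage=True&datasourceDetails=True&datasetSchema=True&datasetExpressions=True&getArtifactUsers=True' -Body $body | ConvertFrom-Json
    $scan.id  ## the scan id, used later to poll status and fetch the result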

    Why Use the Power BI Scanner API?

    The ability of a Power BI Admin or champion to consume and understand the datasets and information in their tenant is vital from both a governance and an adoption perspective. Firstly, the Scanner API enables admins to discover and easily understand the workspaces, the measures used, and which datasets are active in their tenant. Rather than relying on various methods of manually entering datasets into a system, pulling this information in automatically positions admins to better enforce and manage the organization of datasets.

    Governance

    Along with dataset information, the newly updated Scanner API pulls in dataset metadata, which creates more potential to better govern and unify the business logic used in datasets. A primary use case is to ensure that datasets and the tables being used follow the proper logic (columns, data sources, merges) by easily viewing the M code behind any dataset table. In the same fashion, champions can now ensure that 1) datasets are using Explicit Measures in their reports, and 2) measures which are universal to the company use the correct formulas (think Net New Members appearing in multiple reports, ensuring that the correct relationship for date and Member ID is being used).

    Adoption

    There are many workarounds in the community to best provide discoverability of data for users. Unfortunately, many of these require manual input and do not synchronize with one’s active data. Using the Scanner API, admins can create automated solutions to easily provide datasets that are active for users to discover, and further can be integrated with other platforms to include custom fields.

    One idea is creating a Data Lexicon for an organization, which includes a company’s report catalog and terminology. A Data Lexicon should include helpful content for consumers, such as a report’s purpose, intended audience, and refresh schedule. Using the Scanner API, anytime a dataset is added to the tenant, report authors can easily integrate these custom fields with active datasets.

    Understanding the Goal

    This article is not going to cover the intricate details of the API requests and parameters. Rather, the TL;DR version of the required calls / steps of the API is:

    1. Call the Scanner API to trigger a scan
      1. This call must include a body listing which workspaces are to be scanned
    2. If there are more than 100 workspaces, loop through the request (the limit per call is 100 workspaces)
    3. Wait until the scan is completed (how long depends on how many scans)
    4. Get the scan result and collect the array as JSON
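
    Continuing the sketch above, steps 3 and 4 map directly to the documented scanStatus and scanResult calls; the output path here is a hypothetical local stand-in for the SharePoint destination used later in this article.

    ## Poll until the scan finishes (step 3), then fetch the result (step 4).
    do {
        Start-Sleep -Seconds 5
        $status = Invoke-PowerBIRestMethod -Method Get -Url "admin/workspaces/scanStatus/$($scan.id)" | ConvertFrom-Json
    } until ($status.status -eq 'Succeeded')
    ## The result is the full tenant metadata as JSON.
    Invoke-PowerBIRestMethod -Method Get -Url "admin/workspaces/scanResult/$($scan.id)" |
        Out-File -FilePath 'C:\PBI_Admin\MetaData.json'  ## hypothetical path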

    The goal here is then to try to accomplish the following:

    • Create an easy-to-use process to automate the API workflow
    • Store the scan results as a JSON file in SharePoint
    • Transform the metadata into a structured model (tables, relationships, etc.)
    • Use the structured tables in multiple products (Dataflows, Power BI, Power Apps)

    Building the Solution

    The majority of the credit needs to go to Ferry Bouwman, who initially created a viable solution that can easily be integrated into a report. He created a GitHub repo that includes a Power Automate flow covering the entire process of automating the API call.

    The following builds off Ferry’s solution, including the new metadata schema that is now available. There is more I want to accomplish in this solution, but you can get the Scanner API running and a template connected to the data using the steps below.

    Prerequisites Before Use

    Before starting, you must have already completed the following in order to use the Scanner API at all. Please see the Microsoft documentation for each to set up:

    • An Azure AD app registration (the Flow below authenticates with its Client ID and Client Secret)
    • The Power BI admin tenant settings that allow service principals to use read-only admin APIs and that enable enhanced metadata scanning

    The Solution Bundle

    The solution includes the following components:

    • A Power Automate Flow that handles the entire API request and call
    • A scheduled refresh Flow that runs daily and triggers the Flow above
    • A Power BI Template report to connect to the metadata results

    Download the Solution on GitHub.

    Installing & Using

    Import the API Scanner Flow

    The first step is to import the Flow pbitips_ScannerAPI into your tenant. Once you do this, there are a few variables and actions to update before running.

    • tenant: The tenant of your Active Directory
    • clientId: The Client ID of your registered App
    • clientSecret: The Client Secret value of your registered App
    • SharePoint Library: The SharePoint library where you want to save the files
      • NOTE: Remember this location, as it will be used in Power Query
    • Folder Location: The folder location to save all returned scans
      • NOTE: Remember this location, as it will be used in Power Query
    • Folder Location Trigger: A different folder with a different name, used to trigger the refresh run

    Set up the Automation Flows

    Next, we want to set up the automation of the Flow so that it triggers on a daily basis, or even manually.

    Import the Flow PBI_Scanner_Refresh into Power Automate. Once it is imported, you will need to grab parts of the initial Flow’s HTTP trigger and add them to the variables in the PBI_Scanner_Refresh Flow:

    • Initialize string URI-first-part: The first part of the HTTP Request Received URL – everything from the start up to modifiedsince/.
    • Initialize string URI-last-part: The parameters. Simply copy from the ? part of the URL to the end.
    • Initialize string modifiedSince: Enter all.
    Copy the HTTP Get URL from the initial Flow to grab the variables needed
    Paste the parts of the HTTP GET URL into the variables in the daily refresh Flow

    Additionally, the Power BI Template includes a visual to trigger the Flow from within the report. You can simply copy and paste the variables and the HTTP call into another flow, with all of the Power BI API logic actions, using When a Power BI button was clicked as the trigger.

    Run the Flow: Ensure it is successful and saves the files

    Run the flow manually. Note that the first time you ever call the Scanner API, it will return a subset of the metadata. The more you run it (daily), the more complete the returned metadata will be.

    Once you can confirm that 3 files have been saved to the folder specified above (a MetaData_, a WorkspaceArrary_, and a RequestStatus_ JSON file), you know the Flow works.

    Ensuring that the files were saved to the correct SharePoint Library and Folder

    Once you have verified that the flow runs and saves the files to the correct location, you are ready to start using the Power BI Report.

    Connect to the Data – Power BI Template

    Open the Scanner Tenant Metadata Power BI Template file; it will prompt you to input two parameters.

    • SharePoint Folder: The SharePoint Document Library URL specified in the variable from the Flow
    • FolderFilter: The deepest subfolder in which the files live (for example, if the files live in PBI_Admin/MetaData/MetaFiles/, then enter “MetaFiles“)
    Entering in the Parameters in the Power BI Template

    Once you enter the parameters, click Load, and wait for the magic to happen!

    Using the Scanner API Report

    The Report contains tons of information across the entire organization’s content in Power BI, from datasets all the way to the DAX expressions per table and report. The template and report are meant to be a starting point for authors to further build out additional functionality to meet their needs. Much thanks to Rui Romano’s Scanner Template as well.

    Summary Page

    The Template starts with the Summary Page, providing a high-level overview of the Workspaces and Datasets active in the tenant. Included in the overview are the created date of a particular dataset, the number of tables, the data sources, and the users who have access to it.

    Selecting a dataset will highlight a drill through button to navigate to a detailed dataset page.

    Summary Page in the Tenant Data Catalog

    Dataset Drill through Details

    The drill through page for a dataset provides vital information such as the tables, fields, and even the query and DAX expressions within a dataset. Along with this, an information panel with the ID, storage mode, and even the users is available here.

    Selecting a table will display the M query in its entirety. Expanding Measures & Calculated Columns displays the DAX expressions beneath it. Along with this, the list of data sources by type is available.

    Dataset Drill Through Page, showing expressions and Users who have access

    Datasets Page

    The Datasets page is an overview showing the number of entities (columns, calculated columns, and measures) within a dataset, including what data sources are being used. Tracking datasets by created time is a helpful feature, allowing admins to monitor the creation of new datasets over time.

    Datasets Summary Page

    Tables Summary Page

    Tables allows admins to monitor what tables are being used throughout the tenant’s datasets. This is a powerful feature, allowing admins to monitor tables that may be used across datasets.

    Tables Page allows admins to see the columns, along with what datasets the table may be included in

    Data Sources Page

    Looking at the metadata in another way, admins can monitor the types of data sources used throughout the tenant, including information such as the data source type (SharePoint, SQL, etc.) and even the source itself. Selecting a data source will display what datasets it is included in.

    Datasources Page shows, by type, what datasets use each source, the underlying source, and even Dataflows

    Users Page

    The Users page uses the new ability to append getArtifactUsers=true to the Scanner API call, pulling what users have access to various datasets. Again, the ability to select and display is a powerful feature for Admins.

    Users Page showing access

    Details Page

    For those who need the metadata displayed as a list, the Details page provides all of the underlying information about each artifact in the tenant, such as the IDs used in Power BI, types, and who last configured an entity.

    Details Page showing all of the underlying information

    Conclusion

    The ability for a Power BI champion to have full visibility into the organization’s Power BI content has been and will remain a vital piece of the adoption and governance of Power BI. The amount of information available to act on will allow admins to readily understand the activity happening at all times.

    You can find the full solution here:

    This template is just a starting point. The community should be able to take this and expand on it, and please provide your suggestions to the GitHub Repo here:

    Again, many thanks to Ferry Bouwman and Rui Romano for building the foundation.



  • Standardizing KPIs around a Business Intelligence Team

    Standardizing KPIs around a Business Intelligence Team

    This article follows from Episode 5 of the new Explicit Measures Podcast, a whole new way to talk about Power BI. If this article strikes you as relevant, subscribe to the podcast on Spotify, Apple, or wherever you listen. You can also watch live at 7:30am CST every Tuesday and Thursday morning on YouTube.

    On the latest Explicit Measures Podcast (Episode 5), the team dived into what a BI Team should focus on for their own KPIs. One theme was consistent across each host, however: any KPI for a BI Team starts with the question, how do you evaluate and define success? This idea of success and value for a Power BI pro can fall into many different opinions, depending on the size, team, and current culture at an organization. We wanted to share an initial template of KPIs that any BI Team or Pro should start using and integrating into their own workflow.

    Evaluating Success for Power BI

    How can you properly gauge whether reports and data are fulfilling their role in a company? At least in the opinion of the Explicit Measures Podcast, the basis starts with the ability to provide value, trust, and insights to an organization through its data. Starting with this as the end goal, a BI Team can and must strategize on translating success into measurable targets. Let’s break this out into three distinct elements of success, with examples of KPIs for a BI Team.

    Elements of Success

    Adoption

    Adoption has become a buzzword in our industry over the past few years, and with good reason. One could make the argument that the ability to drive adoption should take higher precedence than some of the reports themselves. For reference, we are defining adoption as the maturity, growth, and reliance an organization has on its data via Power BI.

    Value / Time

    While most BI professionals do not directly create revenue, there is no question that there is a cost. With an ever-increasing workload and requests for our time, the ability to validate and choose to work on impactful, value-added reports is essential. If a pro is working on one report, there are five others being ignored. Further, are the reports being developed and deployed providing the expected insights and information to the organization?

    Data Quality

    Anyone who has worked in Business Intelligence can tell you – once teams lose trust in the data, it is an awfully long and difficult road to gain it back. If users cannot trust the data in Power BI reports, adoption regresses and users will find other means to get their data. BI teams must be able to monitor how up-to-date published reports are, and ensure that the content that is available is current and accurate.

    Examples of Success KPIs

    The following are examples of what a Power BI team or Pro can use to evaluate their own success based on the pillars of Adoption, Value, and Quality. This is by no means an exhaustive list – this is an amazing community that consistently contributes new and innovative ideas – but there is no current standard for BI Team success KPIs.

    An Example BI Team Scorecard using the new Goals in Power BI

    Adoption – KPIs

    Rolling 14 Days / 30 Days Report Views

    Just as with a basic department metric, simply looking at the aggregate does not create a KPI. While Report Views are important, giving context to the current performance transforms how you view them. This KPI not only shows you your top reports over a 2-week and 1-month period, but also compares them with the previous 14 / 30 day period.

    Viewing Report Usage on a 30 Day Rolling Basis
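
    As a minimal sketch of this KPI, assuming the daily activity CSVs exported by the PowerShell script later in this article (ViewReport, CreationTime, ReportName, and UserId are standard activity-event fields, but verify them against your own export), the rolling counts could be computed like this:

    ## A minimal sketch: rolling 30-day report views versus the prior 30 days,
    ## from the activity CSVs exported by the script later in this article.
    $events = Get-ChildItem 'C:\Users\PBIActivity\*.csv' |
        ForEach-Object { Import-Csv $_.FullName } |
        Where-Object { $_.Activity -eq 'ViewReport' }
    $cutoff = (Get-Date).AddDays(-30)
    $current  = $events | Where-Object { [datetime]$_.CreationTime -ge $cutoff }
    $previous = $events | Where-Object { [datetime]$_.CreationTime -lt $cutoff -and
                                         [datetime]$_.CreationTime -ge $cutoff.AddDays(-30) }
    ## Top reports in the current window, plus distinct active users per window.
    $current | Group-Object ReportName | Sort-Object Count -Descending | Select-Object Name, Count -First 10
    "Active users: {0} now vs {1} prior" -f ($current.UserId | Sort-Object -Unique).Count,
                                             ($previous.UserId | Sort-Object -Unique).Count

    The last two lines preview the Active Users KPI discussed next.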

    Active Users (Weekly, Monthly)

    The relationship between the number of Report Views and the number of Users may not be as straightforward as you think. Keeping watch over engaged consumers should occur on a weekly and a monthly timeframe. For this, you can simply use a filter on a minimum of X reports viewed per week or month. Depending on your data, you can gauge the current state.

    User Distribution by Report

    Do not be fooled by high usage numbers in your reports alone! By this, we mean make sure you can identify power users who are hoarding the majority of views for a given report. A great technique for understanding this is the Pareto Principle, or the 80/20 rule, applied to your report views. For example, for your top report, try to track the top 20% of users and how much of the total views they account for across the entire user base.
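
    Continuing the sketch above (same assumed fields, and a hypothetical report name), an 80/20 check for a single report might look like:

    ## Pareto check: what share of views does the top 20% of users hold?
    ## 'Sales Overview' is a hypothetical report name.
    $byUser = $current |
        Where-Object { $_.ReportName -eq 'Sales Overview' } |
        Group-Object UserId | Sort-Object Count -Descending
    $topN     = [math]::Ceiling($byUser.Count * 0.2)
    $topViews = ($byUser | Select-Object -First $topN | Measure-Object Count -Sum).Sum
    $allViews = ($byUser | Measure-Object Count -Sum).Sum
    "Top 20% of users account for {0:P0} of views" -f ($topViews / $allViews)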

    SAT Scores, Feedback

    The majority of the KPIs in this article focus on quantitative metrics. However, there should also be attention given to collecting subjective feedback on Power BI. For example, creating a Power BI Feedback Survey can create high value. In regard to when to send out a survey, the following scenarios are suggested:

    • 45 Days after New Reports Launched (per-Report Feedback)
    • Quarterly Feedback Surveys (Overall experience using Power BI)

    Collecting this data via Power Automate and integrating it into Power BI becomes a powerful tool.

    Using Customer Voice to Send out Report Feedback Surveys using Variables for Report Name

    Value / Time – KPIs

    New Reports Launched

    As in Supply Chain Management, ensure you can track newly published reports. Bear in mind, this is not a growth target. There is a range, depending on the size of the BI Team, that should be aimed for. For example, a consistently small number may show a backlog, while too high a number may saturate the overall experience for users.

    New Report Usage

    In parallel with tracking newly published reports, keep an eye on the immediate interest from consumers in these new reports. As with the New Reports Launched KPI, depending on your team and its size, decide on a sweet spot for the range of views you expect. Likewise, filter this based on the date the report was launched, looking 30 to 45 days forward. The only usage metrics that should be included are ones based on the date the report was published.

    Report Lifespan

    This is a favorite. Too many times has a BI Author worked on what was deemed an urgent report, imperative to the business. These types of projects involve stress, pressure, and most importantly time taken to get right. Despite this, some of these reports seem to lose their luster once completed, not to be heard from again.

    In short, the ability to understand the durability and longevity of reports is essential. This can be viewed either at an individual report level or as an aggregate of newly launched reports. Are the reports being built showing value to consumers – not just once, but giving them a reason to return on a consistent basis?

    Data Quality – KPIs

    Report Refresh Rate

    An obvious choice when referring to Data Quality: if your reports are consistently failing to refresh, that causes multiple problems. For one, consumers are not receiving the most current data. Secondly, this should trigger an alert within the BI Team that a data model may need to be reviewed against best-practice standards.

    What is the target rate? While there is no current industry standard, targeting a success rate near 95% is a reasonable goal.

    An Example of Report Refresh KPIs

    Days Since Report Views

    From a bird’s-eye view of all the reports in an organization, flagging unused reports becomes an actionable KPI. In addition, mapping this to also track duration on a per-user basis provides a holistic scorecard for future decisions. Firstly, reports with consistently low Days Since Views should be treated with extra care if any updates are needed. On the other hand, reports that have not been viewed in over 2 weeks may indicate loss of interest. Depending on the report, a BI Team can decide either to re-promote it or to assess whether it is providing the value it should.

    From the user perspective, tracking Days Since Views by user provides value in multiple ways. For instance, users who are “top customers” (i.e., those who overall and per-report have low Days Since Views) tell authors who to reach out to and who knows best what could enhance reports in the future. By contrast, users with high Days Since Views give authors grounds to push back on requests for new builds. For example, a colleague who requests the most report builds but does not return to their reports gives Project Managers support to question whether the work is worth the value.

    Flagging a Report with 40 Days since being viewed by User

    Reports Retired

    Just as we discussed monitoring how many reports have been launched, what about reports on their way out? That is to say, how many reports have been removed from the service and from “public” view. The importance of keeping track of this KPI is all about quality for the consumer experience.

    Ensuring that any data published for an organization is current, has a clear objective, and provides clarity is paramount. Above all, this grows users' trust in and reliance on Power BI. From a discovery standpoint, there is no confusion about which data is reliable.

    Taking the previous KPI (Days Since Views) into account, a BI Team can create a view to monitor “at-risk” reports. For example, any report with over 45 Days Since Views should be strongly considered for retirement. Any report that meets the threshold should alert users of a pending retirement date. If there are no objections, then these reports should be moved to an Archived workspace.
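
    As a sketch under the same assumptions as the earlier snippets, flagging those at-risk reports could look like the following. Note that the activity API only reaches back 30 days, so a 45-day threshold depends on the accumulated CSV history rather than a single fresh export.

    ## Flag at-risk reports: last view more than 45 days ago.
    $lastViews = $events | Group-Object ReportName | ForEach-Object {
        [pscustomobject]@{
            Report   = $_.Name
            LastView = ($_.Group | ForEach-Object { [datetime]$_.CreationTime } |
                        Measure-Object -Maximum).Maximum
        }
    }
    $lastViews | Where-Object { $_.LastView -lt (Get-Date).AddDays(-45) } |
        Sort-Object LastView  ## candidates for the Archived workspace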

    Getting the Data from Power BI

    This may be obvious, but a prerequisite of creating and using KPIs is having the data. So where is this data coming from? If you are a Power BI Administrator in your tenant, you can import the data via PowerShell. Install the Power BI module in PowerShell using the following command:

    Install-Module -Name MicrosoftPowerBIMgmt

    Once you have installed the cmdlet, you can use the following script to pull usage data (day by day) into a specified folder on your PC.

    Login-PowerBI
    ## $a is the starting day: set it to the day before the first day you want pulled
    $a = 17
    Do {
        "Starting Run $a"
        $a++
        $ab = "{0:00}" -f $a
        "Running Day $a"
        $daytype = "$ab"
        ## Update the month (05) in the start date for the current month
        $startdate = '2021-05-' + $daytype + 'T00:00:00'
        ## Update the month (05) in the end date for the current month
        $enddate = '2021-05-' + $daytype + 'T23:59:59'
        $activities = Get-PowerBIActivityEvent -StartDateTime $startdate -EndDateTime $enddate | ConvertFrom-Json
        ## Update the 05 with the current month
        $FileName = '2021' + '05' + $daytype + 'Export.csv'
        ## Set where you want the files to go
        $FolderLocation = 'C:\Users\PBIActivity\'
        $FullPath = Join-Path $FolderLocation $FileName
        $activities | Export-Csv -Path $FullPath -NoTypeInformation
        ## Set the number below to one less than the last day you want pulled
    } Until ($a -gt 19)

    The script above collects activity data from your tenant and creates a CSV file per day. Note that this can only go back 30 days – make sure you run it on a weekly basis and change the variables. To learn more about what else you can do with the PowerShell cmdlets for Power BI, read the announcement on the Power BI Blog here.

    To collect refresh statistics, Marc Lelijveld (Data-Marc) has a great tutorial here.

    Conclusion

    The KPIs outlined should serve as a starting point for monitoring performance. Power BI Pros without insight into their own performance are stunting their own growth. Not only are metrics for Pros essential for an organization, but they alter the way new reports are built in the future.

    Like the content here? Then you will love the Explicit Measures Podcast! Subscribe to the Podcast on Spotify, Apple, or multiple platforms on Anchor. Want to join us live? We stream every episode on YouTube on Tuesdays and Thursdays at 7:30 am CST. You can also subscribe to new events on the PowerBI.tips LinkedIn Page.



  • Introducing the Explicit Measures Podcast

    Introducing the Explicit Measures Podcast

    Welcome to a new podcast from PowerBI.tips: Explicit Measures. We aim to discuss relevant topics and thoughts around Power BI. Join us Tuesdays and Thursdays at 7:30 am CST (UTC-6) on our YouTube Channel, and subscribe on Spotify Podcasts.

    Answering the Why

    For those of you who visit PowerBI.tips casually or frequently, we deal with many tools, features, languages, and situations in modeling, visualizing, and distributing our data and reports. We are the Power BI Power Users, capable of and responsible for building complex solutions. We have to continually acquire new skills and improve on our processes & standards.

    Additionally, there are so many resources available out in the Power BI Community – so many amazing champions and industry leaders who share their knowledge and expertise on how to build faster, better, and more reliable reports. As developers we are increasingly tasked with questions like: how do you audit your data? How do you create complex calculations? We learn best practices and better workflows to implement in our daily tasks.

    One aspect of what we do is not easily discovered in the community: the why of what we do. The why of building reports is universal across all Power BI Pros. As professional BI developers we know the how. If we don’t know the how, we learn the how. This begs the question of the why. Why does this particular feature, tool, or product fit my organization or my team? Why would my users need this? What does this mean for me?

    The why is the question around every BI pro’s water cooler. You may ask it yourself – possibly sitting at your desk after a meeting, or as you engage with the Power BI community or a User Group. No matter how you get to it, we all face common questions. Eventually all of us will need to ask these types of questions.

    This is why today I am excited to introduce the new Explicit Measures Podcast, available live every Tuesday and Thursday at 7:30 am CST.

    The Background of the Podcast

    Being a Power BI Author

    I have been part of the Power BI world since it was “Power BI Designer”. As soon as I was able to download the first application version of Power BI Designer, I was hooked. I felt it was intuitive, complex enough, and just worked. At the time, our organization was vetting new BI platforms, and I made a strong push that we adopt this new tool.

    I decided to put all my eggs in the Power BI basket, believing this tool could easily be widely adopted since it was part of Office 365 – and everyone has Office 365. The barrier to entry was low, and we already had a strong community: a community of Excel gurus and Power Query and Power Pivot experts.

    Over the coming months, we moved Excel files to Power BI, and eventually became a Power BI shop. I soaked up any and all information and resources on how to create DAX measures, learning what the heck filter and row context were, then figuring out how to create my own functions in Power Query. This process was love at first sight. It felt like I was bringing advanced data solutions to my users. I was able to build models and create relationships that otherwise would not have existed, and tables that finally bridged so many gaps in the data.

    I would jump in my chair when a new Desktop version was released. Any new feature that caught my eye would immediately be something I wanted to integrate into my reports. The Drill Through feature (introduced, I think, at the first Data Summit before it became MBAS?) was a game-changer to me.

    The problem was that what I thought were game changers – and they are – did not generate the same excitement among colleagues who did not focus on data or felt overwhelmed by it. Many of our old reports were built in SSRS, and users liked them for what they provided.

    Excitement vs. Expectations

    Drill through, interactive visuals, and the other complex features in these Power BI reports became overwhelming to users. Not only that, but every data analyst was now also working in Power BI, building their own reports, with their own filters and their own business logic.

    What arose from all this was users losing trust in the reports. They lost trust in the data they needed to rely on. A department with one report would complain about a certain KPI being too low, while the defending department claimed their report provided a more accurate number. Users did not know which reports to use, much less how to use drill through. They wanted whatever would provide them the value they needed.

    This brought me to an epiphany of sorts – or several, over numerous situations. Not only about the importance of governance and adoption in Power BI, but, at the end of the day: why do we do what we do? Why are we in this space, and what are we ultimately judged and measured on to show success and real impact where we work?

    Focusing on the Why

    This brings us back to the Power BI water cooler. How many of us have dealt with these sorts of situations and problems while trying to find a solution? I would put good money on the majority of us who have been working with Power BI for a while having gone through this type of arc.

    In conversations with other User Group Leaders, the community, and other Microsoft MVPs, I have learned time and time again that this is not a siloed story. What can really separate a Power BI Tech from a Power BI Pro is the ability to think of alternative solutions – to ask what the ultimate impact of any feature, product, or visual is on the most important audience: the consumer.

    We must think this way. We must be able to process all of the new capabilities that come out at rapid speed, then understand who our consumers are, and finally understand not just Power BI but the data, and where we can further drive more and more trust into the data.

    Being Explicitly Measured

    I want all of you for whom this article hits close to home to join us in this ongoing discussion. Having the pleasure of knowing Mike and Seth of this site has shown me how prevalent this need is. We want to bring these questions, topics, and discussions to the surface. There is more than one way to define a measure, but what matters is that you start with some definition.

    That is truly where the name of the podcast, Explicit Measures, comes from: being able to start with a use case (take the technical situation of defining a measure), understand what is available to you (the functions), and decide what is best to apply (FILTER inside a CALCULATE? ALL or ALLSELECTED?).

    The Explicit Measures Podcast is meant first to be entertaining. We all need an outlet for some of the frustrations end users cause us, and being with fellow users who understand the pain helps you feel that you are not alone!

    The heart of it is our ability to debate, argue, and most of all inspire – to find solutions, and to figure out what impacts us and where we can go from here.

    Join us live every Tuesday and Thursday at 7:30 am CST, follow along on the playlist on YouTube, or subscribe on Spotify. You can also subscribe to new events on the PowerBI.tips LinkedIn Page.
