As a report author, it is important to build the necessary context for your end users. The areas most often needing context relate either to the data or to the functionality contained within the report. The competing demands of space, usability, and design all impact the choices we have, and luckily Power BI gives us plenty of options. This video will walk through the following options and explain how to use each of them.
Text on Page
Visual Header Tooltip
Button or Image for:
The Overlay
A Different Page
A Different Report
If you like the content from PowerBI.Tips, please follow us on all the social outlets to stay up to date on all the latest features and free tutorials. Subscribe to our YouTube Channel, and follow us on Twitter where we will post all the announcements for new tutorials and content. Alternatively, you can catch us on LinkedIn (Seth) LinkedIn (Mike) where we will post all the announcements for new tutorials and content.
As always, you’ll find the coolest PowerBI.tips SWAG in our store. Check out all the fun PowerBI.tips clothing and products:
KPIs are a key visualization type used to convey high-level metrics to end users. They provide an at-a-glance metric that lets business users know whether they are on track or off track. Over time that single number has been enhanced with trend lines, date stamps, variance, and other supporting values to convey key metrics in a compact and concise way. There are a number of different takes on what a KPI can or should look like. If you would like to look at all the different KPI visuals you can download from AppSource, you can find those here.
I’ve used several of the visuals from AppSource before, and what I would recommend is that you always verify that the visual has the blue “Certified” label if you plan on just using it. That label means the custom visual has met certain code and verification criteria and is not accessing or using external resources. That is not to say that you can’t use the other visuals, but I would recommend looking at them more closely, and potentially reaching out to the owner of the custom visual, to see if there are any data extraction or third-party interactions that would not be an acceptable risk for your company.
That being said, one of the great things about having custom visuals is that they typically offer enhanced options or settings that you cannot find in the default visual set in Power BI. The KPI has been one of those visuals that needed a bit more enhancement, and I’m happy to say that in the December release of Power BI Desktop it received it. If you’ve been looking elsewhere for your KPI visual, it might be high time to give the latest default version a test run, as it now offers key features that are plenty enough for me.
Basic Updates
Prior to the December update the KPI visual was fairly basic. An example of what that looked like is shown here.
However, if you look at the example of the new one below, several key updates now make this KPI pop by default and extend it to look more pleasing. The first change is obvious: the font is DIN, which brings a bold pop to the overall number. We’re off to a good start! We can also now change the font family of the larger metric, which allows us to mirror the fonts we’ve selected within our other visuals.
The Indicator and Goal properties have received the most prominent updates, but there are also some key changes that allow us to now set the font colors both statically and conditionally.
Indicator Properties
Let’s see how these changes impact things overall. First, let’s take a look at the Indicator. The options went from this:
To this:
What does that do for our KPI? Aside from what we outlined above, it gives us a simple two-click option to adjust where we want the number (Top & Left), which makes our trend seem a bit more trendy.
Another option we now have is to increase or decrease the transparency of the trend.
Goal Properties
The second property area that received a bunch of changes is the Goals area. The old properties looked like this:
While the new properties pane looks like this:
This change is almost like the difference between the Edison bamboo filament light bulb and the latest LED that lets you pick between millions of colors… Ok, maybe that’s a bit extreme, but you get the point. This property is now extremely useful. Note: the Goal label name can be changed and the metric returned for the distance value can be updated to show the percentage, value or both.
The color and font family for both Goal and Distance can now be updated as well, along with a new property for the Distance direction.
Conditional Formatting
One of the “hidden” things to be aware of is that the above screenshots show the default view of all properties. To see whether you can use a conditional setting, you need to hover the mouse over the Font color area. Doing so will illuminate the “…”
Clicking that gives us:
And one more click opens the dialog where we can set our conditions.
One final key property is the addition of the top level “Date” property. This is a great addition in that it takes away any doubt about what the current KPI context is without having to look at filters.
Showcase
Utilizing all the capabilities now available, we can create a series of KPIs that cover the vast majority of our needs with the out-of-the-box visual. Below is just an example of a bunch of different ways to format the KPI visual. The goal here is to show how many different ways we can see these metrics using the same KPI visual now that we have all these new properties at our disposal.
Sometimes the features we want added to visuals in Power BI aren’t being worked on. It isn’t for lack of effort on the Power BI team, believe me; there are only a finite number of resources and a backlog a mile long. Check out ideas.powerbi.com to see all the ideas being requested. I believe that is why they opened up a program for Power BI experts to engage with the Power BI team directly. These KPI features are a result of that program. The individual we can all shower with our thanks is James Dales. You can check out what James is up to on his blog – https://powerbi.jamesdales.com/, and be sure to hit him up on Twitter and offer up a bit of thanks – @jamesdales
PowerBI.tips is excited to announce our new tool to help you build the best-looking reports: Scrims.
We’ve built a fast and easy solution for snapping visualizations into place while giving your reports the extra-special look and feel that impresses your audiences without losing focus on the important stuff. We developed Scrims to give you this shortcut to amazing-looking reports.
What are Scrims?
A Scrim is a background image that you would use on your Power BI Report pages. Why the name Scrim? The term scrim is used in theater productions. It is a backdrop that is placed on the stage behind the actors. It adds context & engages the audience with the production. We thought this idea crosses over well with Power BI.
A scrim can change the mood of a theater just like a well designed background image in your report. Scrims were developed to be easily adaptable to different color themes, and in each set of offerings we release we’ll be providing you with as many color options as possible.
Why Do I Need One?
Scrims solve several problems.
The most important problem they solve is time. As BI practitioners ourselves, we know the demands that are placed on you. More often than not the visual look and feel gets the least amount of attention due to deadlines. We want all the time you invest in the data and building visuals to impress your audiences by being presented in a beautiful way.
Scrims are designed the same way we developed Layouts: by emphasizing proper design focused on the Gestalt design principles, to ensure end users experience enjoyable and non-distracting reports.
Using a Scrim you will reduce the number of objects on the page. Fewer elements means faster-rendering reports. Here is a blog that tests this by adding more visuals to a page: “More Visuals Mo Problems”. Scrims add the illusion of a very large number of objects without a negative impact.
Scrims come in a Bundle
A Scrim bundle contains a series of images that you can use in any way you want in your Power BI file. Each Scrim will have different page sizes to best suit your report needs. Every Scrim will contain a default 16:9 (1280 x 720) aspect ratio. Most Scrim bundles will have additional ratios such as 8:9 (1280 x 1440) or 4:3 (960 x 720). You will see the sizes prominently displayed for each bundle with the red tags.
Each bundle contains 6 pages minimum for each size, which means on average you will receive at least 12 pages in each bundle. Each Scrim bundle also includes the color theme that corresponds with it in JSON format to easily upload into your Power BI Report.
Over the course of time Power BI has come to encompass a wide variety of technologies and tools. One such product that has been integrated into the Power BI suite is paginated reports. Before Power BI, paginated reports were served up in SQL Server Reporting Services, the major reporting tool used by businesses on the Microsoft stack. Now, Power BI brings us the best in visualization, analysis, and insight, while paginated reports give you pixel-perfect outputs that can be easily printed, emailed, and output to Excel. It strikes me as a bit peculiar that, while the focus over the years has been a heavy push into Power BI visualizations, almost every client I’ve ever worked with has asked for outputs of Power BI reports that are best served up in a paginated way.
What is really exciting is now we have an ecosystem in which we can bring all these report types together. You can create “perfect world” reports where we can analyze and export in fashions that meet the needs of a wider audience. Today I want to talk specifically about the release of Paginated reports in external embedded scenarios in the November feature release cycle. The integration of paginated reports has been on a steady cadence of releasing new features. I want to serve up an overview of current state for embedding these report types, and where there is room for improvement. I’m also interested in the community feedback to determine if my experiences or blockers are similar or different. Be sure to comment below.
What you need
First off, in order to use paginated reports in Power BI, you will likely need to upgrade your license. It is a widely known request that something should be done about removing this feature from Premium pricing. If you want to change that, throw your weight behind the request via the community ideas section for Power BI to make this feature available to Pro users -> https://ideas.powerbi.com/forums/265200-power-bi-ideas/suggestions/35959420-paginated-reports-please-make-it-available-in-pr
Importantly, having a Pro license won’t completely fix the cost for an embedded scenario, but one can hope that if it were available to Pro, then the A1 – A3 SKU levels would work as well for future embedding.
Let’s get back on track! You will need an A4 SKU or a P1 in order to use paginated reports. A full walk-through of how you can develop this solution can be found here. The great thing about an A SKU is that you can turn it on and off. This means that if you want to kick the tires and showcase the capabilities before purchasing, you still have the ability to do so.
What Works
Aw yea! Look at this beauty, you see that in the embedded sample report? PDF, Excel, Word, PowerPoint! All there, and all exporting the pixel-perfect greatness of paginated reports. I don’t have to assume how much businesses will salivate over this one, because I’ve had all these conversations. With paginated reports in our external-facing applications, we have the ability to merge all these report types into seamless products for our business users.
Want to test this out for yourself? You can! Check out the embedded playground and you can see just how these features serve up all your export dreams.
Here are the current data sources that are supported.
Not surprisingly, all the Azure sources serve up the data and security access that we would want to use. This will support the existing data extraction methods used for paginated reports in other reporting tools. There is great news from a Power BI model perspective as well! If you want your analytical and paginated reports to serve off the same model, that is really easy to do now because you can connect to AAS or a Power BI dataset.
What is Needed
There are a few things you should be aware of that aren’t yet baked into the feature release. This is where I’m wondering whether they are as much of a blocker for others as they are for me. There are likely many companies that have hundreds or thousands of paginated reports, and the majority of those reports are likely served up via stored procedures with parameters in the SSRS report. These provide data context prior to execution of the procedure, thereby limiting the results. This WILL work if you are using Azure SQL DB or Azure SQL Managed Instance. It WILL NOT work for on-premises SQL Server, or for an Infrastructure as a Service (IaaS) setup where you are hosting your own SQL Server in an Azure VM.
Export to Excel – dump my data!… Personally, I don’t mind that this feature isn’t in the export options. I actually hope it doesn’t get added in the way it works in legacy tools. One challenge I’ve always had is trying to get business users out of using reports as export dumping tools. With the advent of Power BI, and all the ways we can surface curated data sources, I want to see this type of practice stop. We need to get better at training people where and how to ingest data and make it easier for them, rather than letting them dump out loads of data and create siloed processes of their own.
Close
I’m really excited that paginated reports have been added as a supported report type for embedding. Paginated reports will certainly hold a place in my future deployments. Additionally, I would love full support for on-premises data and IaaS solutions so that major migrations of paginated reports into Power BI can occur. For me, I will need to weigh the cost of refactoring all those stored procedures if I want to stay on-premises against making a case for moving all of our reporting over to Azure SQL.
We do a ton of Layouts here at powerbi.tips, and with the introduction of the new visual grouping feature I was looking for different ways I could utilize that functionality to make templates for different reporting scenarios. First, if you aren’t familiar with how to build a visual group, be sure to check out the blog Mike wrote on the subject a little while ago.
While I was exploring ideas for how I might leverage this new feature, a challenge I’d seen recurring in the Power BI Community forums popped to the forefront of my mind. This new feature provides a perfect solution to the question of report page consolidation.
Time and again I’ve seen people talk about navigation issues related to the number of pages or tabs they have in their reports. Sometimes, particularly in embedded scenarios, you can’t reduce the pages and break them up into smaller reports. This could already be accomplished in the past, but man oh man, good luck finding what you were looking for in the Selection pane. Let me show you how easy this is to accomplish with the addition of the new visual grouping feature. By walking through these few examples, you can extend this to however many pages you need by following the same patterns I describe below.
Setup:
On your main
page, make some room for buttons that you can create to toggle your pages
on/off. You can see I did that in the
image below.
Next, we’re going to click on the top object in the Selection pane and SHIFT + Click the last object; this will highlight all the objects.
Right-click and select Group from the drop-down menu.
This will create a “Group 1”.
Let’s rename this to Page 1. You can do that by Double Clicking on the group name and typing in the new name, Page 1.
Next, we’ll create a button called “Page 1”.
Select the Home ribbon, then click the icon called Buttons.
Choose a Blank button from the drop-down. In the formatting pane under the visualization area, choose the section titled Button Text, toggle it on, and type in Page 1.
Place the button on the right side of the screen.
Great, now that we’ve created this button, let’s create a second one. This one will be for our “Page 2”. Follow the same steps above, or select the Page 1 button and do a CTRL + C (copy) / CTRL + V (paste) to create a new button. Rename it to Page 2 and you should end up with this:
Prep Additional Page
Now, here is where this gets cool. Imagine you have N pages, each with roughly 10 objects on it. We are going to go to each page and condense all the objects down to a single group. Go to your second page and, in the same fashion that we created our Page 1 group, select all the objects in the Selection pane.
An alternative method is to just click on your report canvas and press CTRL + A to select all objects, then right-click and create a group. We now see our new group in the Selection pane and we can double-click to rename it Page 2.
Bring It Together
Now click on the Page 2 group and press CTRL + C to copy the group.
Navigate to Page 1 and press CTRL + V to paste the group.
This will bring all the objects from your Page 2 to Page 1, and it will look like a giant mess similar to this:
But you know what isn’t a mess? The Selection pane! Because we created the group, all we need to do is toggle the Page 2 visual icon to off.
And our page
looks normal again. Now let’s hook up the buttons and bookmarks.
Go to the View ribbon and open the Bookmarks pane.
Click the Add button at the top of the window. This will create a snapshot of the current state of the report page. Double Click the bookmark that is created and change the name to Page 1.
Now, toggle the Page 1 visual group off from the Selection Window. Then turn the visibility of Page 2 visual group to on.
Create a second bookmark and change its name to Page 2. You’ll likely notice that things just aren’t aligned right (at least in my case that is true because I’m using a background).
But since the visual grouping is its own object, all I need to do is select and resize it, and all the visuals resize for me! I don’t have to individually mess around with each one! Super cool.
The last thing we need to do is connect our buttons with our bookmarks. Select the Page 1 button, then in the formatting pane choose Action and toggle it on.
Select Bookmark from the Type drop-down, then select the Page 1 bookmark we created.
Repeat the same thing for Page 2. You have just created navigation buttons to the two views of your report pages.
CTRL + Click in Power BI Desktop will activate the actions on the buttons. Finally, you can toggle between the report pages on the single page. WHEW!
Closing
This was possible before visual grouping, but now it is an EXTREMELY clean and efficient way to consolidate your report pages and add a ton of objects to a single page. You do have a lot of objects on the page, but don’t worry, there isn’t a performance impact because they don’t render until you click the button to make them appear. You can test this out by enabling the Performance Analyzer in the View ribbon.
Visual grouping coupled with bookmark grouping has made these features fun to use and easily manageable for all. This is just a simple use case where extending visual grouping to help us manage our reports can make a world of difference as we build. I hope you enjoyed this post and that it helps you clean up any reports you may have that got a little unwieldy.
Power BI requires a gateway for refreshing on-premises data sources. There are a myriad of different data sources that you can create, and two different ways you can set up the gateway. The first way to install a gateway is in “personal mode”. The second method uses the “On-premises data gateway (recommended)” (OPDG), which used to be called the “Enterprise Gateway”. The second method is the one you need to set up and configure appropriately in order to manage permissions, and it is the recommended gateway when deploying reports to a wider audience.
Over the course of time, enhancements to the gateway have increased its usability and functionality, such as being able to combine cloud and on-premises data in the same report, allowing custom connectors, and the ability to distribute workloads across multiple gateways that you have clustered together. You can toggle all these settings for the gateway in the Power BI Service.
By setting up a gateway cluster you ensure that your reports are going to refresh even if there is some maintenance activity or a server goes offline unexpectedly. This article will explain how to configure this setup and outline a few gotchas related to managing it that aren’t so intuitive.
Setup
For DR (Disaster Recovery)
The name of the game here is to choose servers that are not near each other. While this may not be ideal from a refresh performance perspective, it provides a better recovery option if something catastrophic happens to one of the servers.
For Load Balancing
The goal in this gateway setup would be to choose several servers nearest the data sources and choose different servers to distribute the refresh load for improved performance.
In a perfect world we could combine DR with load balancing. At this point in time you can either distribute the load across all gateways, or choose not to distribute load and it will default to DR. There is no combination of both methods.
What I’ve found with installing the gateway is that if it isn’t easy, it’s hard. When it is hard, you will likely need to do a bunch of troubleshooting to resolve the issues. A great helper guide for troubleshooting a bunch of common and not-so-common things can be found here – https://docs.microsoft.com/en-us/power-bi/service-gateway-onprem-tshoot . Install the gateway on the initial server; once you get past the first couple of screens you will hit this one:
You want to register a new gateway, and be sure to set and SAVE your recovery key. The key is the most important part of how you both recover a gateway and create a cluster.
(Note: you can always add a new gateway to an existing gateway cluster; this doesn’t have to be a net new process.)
Adding the Clustered Gateway
After you have the first gateway installed, log in to the second server, where you will follow the exact same process, except this time, when you hit the next screen, you will want to toggle “Add to an existing gateway cluster” and enter the same recovery key.
By adding the recovery key and checking the box, you have declared to Power BI that this gateway should be linked to the other gateway, thus creating a cluster. Like the first gateway, it now shows up in the Service and we can set up the data sources to match our other gateway. At this point we can configure the gateway settings to distribute load, or by default it will just be for DR.
Be aware that it does require appropriate permissions to see and manage the gateways, so check out the documentation here – https://docs.microsoft.com/en-us/power-platform/admin/onpremises-data-gateway-management . Since gateways can be used for other products, this is where you can see what is installed and where, and figure out who has access.
I’m hoping that, now that we have an admin center, in the future we’ll be able to manage upgrades, clusters, and all “admin” related activities from this interface. As of this writing, we are still relegated to using APIs and cmdlets to manage these objects. Just recently, the Power BI team released a whole slew of PowerShell cmdlets to help administrators manage gateways; be sure to check out the preview release notes here: https://powerbi.microsoft.com/en-us/blog/on-premises-data-gateway-management-via-powershell-public-preview/
Removal
If you are like me, when you install a gateway and then need to remove it, you receive all kinds of errors after the fact in the Service related to the defunct gateway. Truth be told, you will need to manually remove the gateway from the cluster, and you can do this with PowerShell. This does not mean uninstalling the gateway; it means you need to run the PowerShell cmdlet Remove-OnPremisesDataGateway to detach the gateway from the Power BI Service. If you don’t, it will still remain there. I recently uncovered this while upgrading a gateway: I was still getting error messages saying that gateways I had previously deleted after testing out clustered gateways were out of date… which of course they were, they were gone! But alas, to remove these errors you need to follow the instructions here to fully remove the gateway. https://docs.microsoft.com/en-us/data-integration/gateway/service-gateway-powershell-support
Summary
All in all, it is fantastic that we have the ability to provide more stability to our data refreshes. There is a ton more information that could be provided and outlined for data gateways, but I hope this post gave you some good tips and tricks around gateway clustering and how to clean up after yourself when deploying clustered gateways.
As a user who builds Power BI reports, do you know the different technologies that come into play when you interact with the tool? This is one of the first questions I ask new audiences or people I’m training on Power BI. Why is this important? Because it is crucial to understand what part of the tool you are in, so that you can separate out the different tasks you should be doing or trying to accomplish in each area. The other big reason is that you need to know what to search for in your favorite web browser when you are looking for answers to your current problem.
With that said, there are two main components to Power BI Desktop: Power Query (a.k.a. Edit Queries) and the Tabular model that you have access to in the main part of the tool. I’ll mention the Tabular model first, but we won’t be diving into it in this article. It is responsible for compressing our dataset and gives us the speed we see over all the data we want to slice and dice. We can create relationships in the model window, and we can create additional calculations to extend our original dataset by using measures or calculated columns. The underlying language in the Tabular model is “DAX”.
Power Query is our ingestion engine: it connects to our data sources and allows us to perform the ETL (Extract, Transform, Load) activities on our raw datasets. This is extremely helpful, and an absolute must for creating clean data and shaping it into the best form for loading the models we want to work with in our reports. The underlying language in Power Query is called “M”.
When you toggle open Edit Queries, you are presented with a new window that makes a clean separation between the two interfaces. Connecting to or creating data creates a new query, and clicking the “Advanced Editor” opens another popup window where we can see the “M” code for all of the steps we have taken in that query. Most of what you do in Power Query will be done through the interface as you get started, but getting to the point of understanding how to manipulate the code in the Advanced Editor will change the way you build reports immensely.
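To make that concrete, here is a minimal sketch of the kind of “M” you would see in the Advanced Editor. The file path, column names, and filter are made up for illustration; the point is simply that every action you take in the UI becomes a named step in the query.

```
let
    // Each UI action becomes one named step; later steps reference earlier ones
    Source = Csv.Document(File.Contents("C:\Data\Sales.csv"), [Delimiter = ",", Encoding = 65001]),
    PromotedHeaders = Table.PromoteHeaders(Source, [PromoteAllScalars = true]),
    ChangedTypes = Table.TransformColumnTypes(PromotedHeaders, {{"OrderDate", type date}, {"Amount", type number}}),
    FilteredRows = Table.SelectRows(ChangedTypes, each [Amount] > 0)
in
    FilteredRows
```

Once you can read a query in this form, hand-editing a step (a filter value, a renamed column, a changed file path) becomes far less intimidating than it sounds.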
The vast majority of Power BI users are extremely new to
Power Query, so today I wanted to spend a little time talking about a book that
has helped me immensely in understanding how to get the most out of that aspect
of the tool.
One of the best things you can do when trying to understand
something is to get an expert to show you how to do it, whether that is in a
class setting, a presentation or a book. They have knowledge around the area
and can streamline your learning process. There are people who learn in
different ways, but I would argue that each type has different levels of
retention. For instance, when I go to a session that is heavy in coding and
techniques, I take a bunch of notes because I know that while it all makes
sense in the session, I’ll forget the specifics and have to refer to my notes
when I need to apply what I learned. The same goes with the books I read: I grab snippets of techniques and write down a bunch of reference notes for later. Whereas,
if I go to a class and have to walk through the steps on my own or take them
away as homework, it forces me to practically walk through an exercise and the
steps in order to complete it. Doing this locks the technique in, and I’m able
to recall how to do it when I need it instead of having to look things up
again.
If you are serious about getting better at the ETL portion
of your Power BI report building there are numerous resources out there, but
today I’d like to spend some time talking about one in particular that I would
highly recommend, authored by Gil Raviv.
(Disclaimer: Gil Raviv is a friend, and his book was
gifted to me)
Suffice to say Gil is one of the best to learn from since he
was part of the MSFT team that created Power Query. To read more about Gil,
check out his bio on his website here -> https://datachant.com/about/
The main reason I’m recommending this method of learning is that it isn’t JUST a book. Think of it as an instructional class, where you have a bunch of material for both preparation and homework. What Gil does here is amazing from a sharing perspective. Along with the book, you are given access to almost 200 files that include the data sources and the Power BI Desktop files with the solutions built into them… Take a moment to digest how much content is in here; it is quite astounding.
The layout of the book is designed with the business user in mind, and focuses on the most often used steps and how you perform them. As you advance through the book, you will have the opportunity to build along with each lesson; if you get stuck or don’t quite understand a lesson, you have the solution file as a reference. I loved the interaction here: in most of the lessons I just created my own queries right alongside the solution queries, and if I ever got jammed up I could just click into the solution query for the correct step. This made things really conducive to staying engaged in the lesson and moving through a lot of material.
Like our journey with Power BI, it starts simple and moves towards advanced concepts. About midway to three-quarters of the way through, you will be introduced to the Advanced Editor and M; here you will start to learn how you can manipulate the queries themselves without the UI… Getting here puts you in such a powerful position with your ETL. Working through these lessons has empowered me to easily understand how to manipulate my backend sources, modify the steps I built in the UI with additional filters or actions, and troubleshoot potential issues in Power Query. If all that isn’t enough, Gil includes a section on his top pitfalls and things to avoid, which is an invaluable resource.
I really enjoyed this book, and it sets a high bar for me moving forward: I will be looking for other materials from other authors that provide this level of engagement with something I want to learn. The structure of the book, with chapters that read through the step-by-step way to perform each action, and the source material and solution files all in one spot, makes for a fantastic way to learn. Whether this book is your first foray into Power Query, or you choose to go it alone, I highly recommend that you get started on the journey.
As report authors, we sometimes get caught up in how easy it is to create a report and provide value to the business. Each report is an opportunity to make a big contribution to the organization, and Power BI makes it easier than ever to turn many of those reports around quickly. This is a good thing, of course. But sometimes we can get caught up in the madness of turning out another report with only a flash of recall that we could have used the same or a similar model we already built for a different report. The internal monologue kicks in.
“Pffft! Re-use a model, that is way too much work! Why do that, when we can just create a copy of the PBIX (Desktop file) and have two models to manage with the two reports for the same business area! That sounds like fun!”
Hmm… Or does this actually sound awfully similar to the challenges one might face with a sprawling Excel solution, where variants of logic are buried in different files and only the composer knows how to bring order from the chaos? Might I suggest that we spare ourselves that sort of pain, and just learn how to easily leverage our already hard-fought model work and avoid tedious updates, without over-complicating our nirvana of sticking to the business world. But how would we do that?
The Answer
Power BI datasets. Power BI datasets allow us to re-use our model across multiple reports. This simplifies and speeds up future report authoring, and it gives us the building blocks for sharing that dataset with a wider audience.
How do we create one?
Technically, we’ve likely already created many. You see, when we publish a Power BI report, we publish a dataset along with it. This dataset is stored in the Power BI Service, and our deployed report now relies on it. The “Power BI datasets” connector allows us to connect directly to any of these datasets that we have permission to edit. This means we can extend a single model across multiple reports without needing to stand up a separate Analysis Services server anywhere. This is a big deal: it allows the everyday business user to leverage a reusable model. A single change or update to a calculation can update multiple reports at the same time, and a new measure or calculation can be added in one place instead of many.
All we need to do to create a dataset that we can connect to is publish a PBIX file that contains data. I’ve adopted a practice recently: rather than generating my first report and reusing that model, I now upload a PBIX file that ONLY contains the model and name it something like “Sales-Model”. Now I have an object that I know serves the purpose of just being a model instead of a report. This makes it easier from a traceability standpoint when looking at the related objects in the Service, or when selecting it from my list of options when choosing my dataset.
How do we use one?
Using a Power BI dataset is one of the most straightforward connections in Power BI: select Get Data -> “Power BI datasets”.
This brings up the menu of all the datasets in the Power BI Service. The list shown contains the datasets that our user account has access to use. The great thing about these datasets is that we now have the ability to connect to and use a dataset from a different workspace, provided we have permission to edit it. This feature is called a Shared Dataset. Select the dataset and your report is automatically connected to it.
Now, what we’ll notice here is that using this feature automatically pulls in a model for us and we can start building our report. This data source connection behaves exactly the same as a “live” connection to an Analysis Services instance we set up ourselves. That is probably not shocking to any reader here, but that is exactly what is happening in this case as well. We get the benefit of Microsoft handling all that painful work for us while we reap the benefits of a streamlined process.
As with any “easy button” solution, there are pros and cons. In our new reports we have easy access to the model and can start building reports immediately, but we don’t have the ability to modify the model or the ETL processes. If we want to edit those, we need to go back to the original dataset to make the changes.
But the minor inconvenience of having two PBIX files open when we need to make updates to the model is trivial compared to being able to connect many reports to that single model. The live connection does still allow the report author to create measures, so if there are measures that are only suited to our report and not the overall model, we still have the ability to add them.
Once we’ve completed
our report, we just publish as we normally would, only this time the dataset is
already out in the Service and only our report is published. There are so many
things we can now do to share that dataset, but we’ll leave that to another
article.
If you’ve never used this method before, I would highly encourage you to try it out. Any time you save by reducing the number of models you maintain means you can produce more reports faster, and you’ll spend less time maintaining all the reports you are publishing.
Happy report building!
I recently encountered a really frustrating experience related to a set of reports seemingly not updating after some data source changes. I’d done this kind of change before, and had a co-worker take on the task of updating some of the other reports in the same fashion. After a bit he reached out and explained that a few of the reports wouldn’t update while the others had… This was odd, since we were making the same type of change. Cue hours of testing, changing, fixing one report without any idea how I did it, and then hitting a brick wall with the remaining two.
The Setup: The data sources for the original reports were Excel spreadsheets stored in a SharePoint Online folder. The PBIX file connected to the Excel files in SharePoint Online using a Web connection. Once the reports were created, we stored the PBIX files in a separate SharePoint Online folder. Next, we opened the Power BI Service and connected to the reports directly from Power BI to a SharePoint team site; if you aren’t familiar with that process, you can read about it in depth in an article that Mike Carlo wrote -> here. (The TLDR -> Get data -> Files -> SharePoint Folder -> navigate to folder -> Click on PBIX -> Connect)
By connecting to the PBIX files from the PBI Service, the dataset, report, and dashboard for the report are loaded and synced. Where this comes in very handy is that by syncing the SharePoint folder to your Windows Explorer, you can access the file, make changes, and save the PBIX in that “local” location without re-publishing the report from the PBIX; the changes make their way to the Service without any further work. I had made several changes to the reports from an aesthetics perspective, and maybe added a new measure here and there, but hadn’t done anything drastic until just recently when we updated the sources. Due to some changes in the backend processes that were generating the Excel files, it became apparent that it would be easier for the Dev team if we used CSV files instead of xlsx. The team went through and changed the reports, and 3 of the 5 reports we were working on broke in the Service, giving this error.
At first, I thought it was the use of parameters in the report, but determining that wasn’t the issue actually helped me figure out that the report in the Service wasn’t updating because the parameters were still showing the “xlsx” file type from the files I had already removed from the SharePoint location. After repeated attempts to change the data sources manually in the original file, delete and replace, and every other combination of things to try a refresh, I was stumped. What was more confusing is that connecting to the file again from the PBI Service created another dataset & report of the exact same name!
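For context, the source queries were parameterized along these lines. This is only a sketch; the site URL, file name, sheet name, and parameter values are made up, but it shows why a parameter still holding “xlsx” in the Service would keep pointing the query at a file that no longer existed.

```
let
    // In the real report these were parameters; here they are inlined for illustration
    FileName = "SalesExtract",
    FileType = "csv",            // this value was still "xlsx" on the Service side
    FileUrl  = "https://contoso.sharepoint.com/sites/Reporting/Shared%20Documents/" & FileName & "." & FileType,
    Source   = if FileType = "csv"
               then Csv.Document(Web.Contents(FileUrl), [Delimiter = ",", Encoding = 65001])
               else Excel.Workbook(Web.Contents(FileUrl), null, true){[Item = "Sales", Kind = "Sheet"]}[Data]
in
    Source
```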
In my mind, that wasn’t
supposed to happen, and I was getting more frustrated because things were not
behaving as I would expect. My initial assumption here was that the PBI Service
would look at the new PBIX of the same name and recognize the changes and
replace the existing dataset and report, the same way that it does if I
manually push a PBIX via the “Publish” action. But using this direct connect
method, refreshing the dataset did nothing…(cue the “grrrr”).
The solution:
Thanks to the fantastic Power BI team, who were able to determine my issue and share with me the root cause of some of the datasets not updating. The issue is that when you connect to PBIX files in this manner, the PBI Service looks at the root ID of the PBIX file in the OneDrive location, and that file ID cannot be deleted and replaced with another one. If you do that, you get the above error on the dataset source because it cannot find the file ID. If you remove the file from SharePoint, then when you insert it back into the folder the ID will be different. This is what breaks the dataset in Power BI, and it is also the reason it allows what appears to be a dataset and report with the same name. However, there is good news: because you are in SharePoint, you can go back in the version history of the folder, pull the previous versions of the reports, and resolve the issue in the datasets. I did end up having to replace one of the reports, but lucky for us it was a single-user customer report, and re-sending a shared link was really low impact compared to what it could have been.
Here are the guidelines I can offer after going through this.
Do:
Sync the SharePoint folder to your local storage
Open the report directly from the synced folder and update it in place
Leave the original file in the folder and save over it with the new version
Do try to resurrect your previous version of the report PBIX in the folder (if you are hitting this issue now)
Don’t:
Move the file to your local machine, then back to the SharePoint folder
Remove the file from the folder (you can do this if you move it to another SharePoint folder, update it, and move it back, but if the dataset refreshes during that time in the Service it will break, and you will need to fix the credentials once you have put the file back)
I hope you found this post worthwhile. Error messages are hard to track down to their root causes, and because this was a tricky one for me to solve, I thought I would share the experience and hopefully help others avoid it, or solve the errors that match the above.
When testing out this issue again, I did uncover this error message, which is much more specific. It came up after I made the breaking change, but prior to an automated refresh.
There are different ways you can connect to a multitude of different data sources. I’ve written about the different connection types before and you can find those articles here if you are unfamiliar with what I’m talking about.
When you import data and publish a Power BI report to the Power BI Service you need to schedule a dataset refresh in order for the report data to be updated. This can be done manually by refreshing the PBIX and re-publishing the file, but for any production report that you want to set a schedule for, you need to ensure that you set up things to automatically update without further intervention.
The first thing you need to check is whether or not you need a gateway. Any data source that is located within your company’s servers, on a local machine, or within certain types of cloud infrastructure is considered “on-premises”. When a data source is on-premises, you require a gateway in order to refresh the data. If your data is in a cloud source such as OneDrive or SharePoint Online, then you will not need a gateway to refresh.
Once you determine whether or not you require a gateway, you need to understand that there are two different types. The first is called a personal gateway; it will allow you to use the majority of Power BI data sources with no configuration. After you install this gateway and publish your report using a data source that it supports, you can set up a dataset refresh in the Power BI Service without any additional setup. The second type used to be called the “Enterprise Gateway”, but now it is simply the on-premises data gateway that is not in personal mode.
This gateway is designed for use with your enterprise-wide reports. It supports a lot more data sources for refresh (that list can be found here). It also requires some additional setup, configuration, and management from an administrator. Let’s walk through how you set one of these up so that you understand how to do it, or how you can request to have it done for your organization. It is highly recommended that you install the “Enterprise Gateway” on a server that will never shut down or be on a scheduled maintenance cycle. You can download this gateway to your local computer, but if the computer is off, so is your gateway, and the reports will not refresh.
Installing the gateway:
Installation is very similar to the personal gateway, except you have options to add the gateway to a cluster or incorporate a recovery key. ALWAYS set a recovery key and SAVE IT! This key will allow you to move the gateway at a later date, if you need to, without impacting any of your reports. This key is GOLD; treat it like it is and you will save yourself a bunch of headaches in the future. Outside of that, the install of the gateway should be really straightforward; if you do run into issues, there is a comprehensive guide to troubleshooting the problems that you can find here. I recommend using an administrative account to set up the gateway, because the ownership of the gateway in the Service will begin with this user. If you have one account that manages all the Enterprise Gateways, then you’ll save yourself a ton of pain down the road from an administration and discoverability standpoint. If you aren’t the admin, be sure to either have an admin set up an account for you, or let them know that you have an Enterprise Gateway set up using your account and have them securely store the gateway recovery key. Alternatively, if an admin account is not used, be sure to add an IT admin as a gateway administrator in the next step as you configure the gateway.
Configuring the gateway:
After the installation of a gateway, you need to log in to the Power BI Service (app.powerbi.com) with the user that you configured the gateway with. Go to Settings > Manage gateways and you will see several different configuration options that affect how end users can see and interact with data sources on this gateway. The main call-out I have in these options is the first one: more often than not, you will want to make sure that you allow users to “mix” cloud and on-premises data sources.
Unlike with the personal gateway, if gateway administrators want other people to be able to publish reports that use the gateway, every on-premises data source must be set up and configured on it. This means that a data source can use a single user/permission set, and the report and its end users will inherit the read permissions of that user. (A caveat to that would be the application of row-level security, or a live connection, but that is beyond this overview.)
After the data source has been configured, the administrator needs to add users to the data source so that they can deploy reports using the gateway. This is an ideal use case for Active Directory security groups: it allows administrators to apply a single group and then add and remove users from that group, versus managing all the data sources in Power BI. If a user is not added to the data source on the gateway, the report publisher will not be able to set up a refresh using that gateway.
Scheduling a Report dataset to refresh:
Now that you have completed the setup of the gateway and added your users (including yourself) to the gateway, you can publish a Power BI Desktop report and schedule a refresh.
First, publish the report from the Power BI Desktop to the workspace that you want the report to live in.
Next, navigate to the workspace and select Datasets, follow the dataset ribbon to the right, and click on the ellipsis:
Navigate to Datasets and expand the Gateway connection section. Enable the use of the gateway if it is not enabled already. All your sources will be listed here, and if you have any data source connection strings that do not match exactly what is in your PBIX file, it will become apparent very quickly that you have a broken dataset. All data sources in the gateway need to match exactly what is in your PBIX file.
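As a sketch of what “match exactly” means (the server and database names here are invented): if the PBIX source is defined like the query below, the gateway data source must be created with that same server and database string, character for character, or the dataset will not bind to the gateway.

```
let
    // The gateway data source definition must use exactly this server and database name
    Source = Sql.Database("sql-prod-01.contoso.local", "SalesDW")
in
    Source
```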
Once the gateway is enabled and all your data sources are working, you can schedule a refresh.
One thing in particular to note here: when you deploy your report to the Power BI Service, all times are stored in UTC. So, if you are doing time-specific calculations, you may need to take time zones into consideration. Reza Rad has a good article outlining some methods to sort this problem out; you can find that here.
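One common pattern (just a sketch, not the only approach; the fixed -6 offset is an example and does not account for daylight saving time) is to shift the refresh timestamp in Power Query before it lands in the model:

```
let
    // Capture the refresh time in UTC, then shift it to a local offset (example: UTC-6)
    UtcNow   = DateTimeZone.UtcNow(),
    LocalNow = DateTimeZone.SwitchZone(UtcNow, -6, 0),
    Result   = #table(
        type table [RefreshedUtc = datetimezone, RefreshedLocal = datetimezone],
        {{UtcNow, LocalNow}}
    )
in
    Result
```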
A recent update to the dataset refresh settings also allows us to add email-enabled groups or other specific users to the refresh failure notification section. This is extremely helpful in ensuring a wider audience of users can be notified when a refresh fails.
I hope you found this overview helpful. Stay tuned for more blogs that walk through specific data source connection and refresh setups, which will help ease the process of connecting to and refreshing all your various data sources in Power BI.