In October of 2019, Power BI released a new file type: PBIDS. The Power BI Desktop Source (PBIDS) file is a JSON file that helps users connect to data sources. In true PowerBI.Tips fashion we have, of course, made a tool for that.
Introducing Connections
Today we release the new tool called Connections. It can be found at https://connections.powerbi.tips/ . With this tool you can use our predefined templates or customize one of your own. To learn more about this sweet, sweet JSON editing tool, check out the following YouTube Video:
Technical Details
For more information on the Power BI Desktop Source file check out these links:
If you like the content from PowerBI.Tips please follow us on all the social outlets. Stay up to date on all the latest features and free tutorials. Subscribe to our YouTube Channel. Or follow us on the social channels, Twitter and LinkedIn where we will post all the announcements for new tutorials and content.
Introducing our PowerBI.tips SWAG store. Check out all the fun PowerBI.tips clothing and products:
This post will walk through how to pull daily stock price from Yahoo! Finance, then transform the data using a technique called a query branch. It will be completed all in the Power Query Editor. We will convert this to a function to reuse on any stock we want.
There are many APIs that pull historical stock prices, but many come with a cost to get the information in a decent format. The technique described here is free but will require some data transformations to get the data into a usable format. The purpose is to explore parameters, web URLs and query branches to design a usable function. If you’re just interested in pulling stock information, skip to the end to grab the M code – just make sure you read the performance considerations.
Note: The content in this blog was first presented at the Power Platform Summit North America on October 18th, 2019.
Getting Started
This blog will use parameters to create functions in Power Query. Some experience using the Power Query editor may be helpful, specifically:
- Knowledge of tools such as merge and append queries
- Familiarity with query steps and the formula bar
For a detailed look at parameters or if you need to brush up, check out this post on parameters.
Before starting, you will need to ensure the formula bar in the query editor is open.
Open the Power Query Editor by Clicking the Edit Queries on the Home ribbon of Power BI desktop. Then, go to the View ribbon in the Query Editor and make sure the check box for Formula Bar is turned on.
Create the Parameter
First, Create a Parameter. This is a value that we can change and feed into our query, in this case the stock symbol.
In the power query window, under the Home ribbon, Click the bottom half of the Manage Parameters button. From the drop down Select the option New Parameter.
In the Name section, enter the text StockSymbol (without spaces – this makes it much easier to reference later). Give it a description if you like. If you share this report other people can read the description to understand what the parameter is being used for. Change the Type field to Text. Enter MSFT in the Current Value input box. By the way, MSFT is the stock symbol for Microsoft.
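For reference, a parameter created this way is stored as a small piece of M. Opening the Advanced Editor on the StockSymbol parameter shows something roughly like this (the exact metadata Power BI writes may vary):

```
"MSFT" meta [IsParameterQuery = true, Type = "Text", IsParameterQueryRequired = true]
```

The meta record is what tells Power BI to treat this query as a parameter rather than a regular query.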
Making the Query
Now that we have set up a parameter, we can use it to pull in some data. The data source is going to be a web URL. In the Power Query editor window, Click the Home ribbon and the button Get Data. Select the item Web in the drop down. In the popup dialogue, Click on the button labeled Advanced.
You’ll notice this brings up a dialog called URL Parts. This allows us to break down the URL into parts. We can easily change individual parts of the URL using this dialog. It will then concatenate it all back together in the order it is entered. Our URL to connect to Yahoo! for a single stock will be split into three parts.
1. The base URL, which points us to the Yahoo! Finance website and the correct data
2. The stock symbol, which in this case will be our parameter
3. Our other settings (range and interval). These could also be changed in Power BI with a parameter, but we do not want to for this example.
In the open dialogue box, first Click the button Add part. This will add a new box. Locate the first window and Enter part 1 of the URL. In the second box, Change the abc symbol to a parameter. Make sure StockSymbol is selected. In the third box, enter part 3 of the URL. We’re setting the range to 5y (5 years of data) and the interval to 1d (daily). You can change these at a later time if you want.
Note: It is important to remember that StockSymbol is a parameter – change the symbol to parameter and select it from the drop down. Do not type StockSymbol into the box.
Now Hit the button labeled OK. The request will be sent and returned to us in a JSON format.
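Putting the three URL parts together, the generated Source step would look roughly like this (a sketch; the query-string part is an assumption based on the range and interval settings described above):

```
= Json.Document(
    Web.Contents(
        "https://query1.finance.yahoo.com/v8/finance/chart/"
            & StockSymbol
            & "?range=5y&interval=1d"
    )
)
```

Because StockSymbol is a parameter, changing its current value re-points the whole query at a different stock.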
Rename the query Stock Value. You can edit the name above the Applied Steps section on the right.
Making the Query Branch
The data returned is a little messy and not in the best format. We need to drill down and pull out the appropriate bits of information. Start by drilling down to the correct information. To drill down, Click the underlined result part in the following order:
1. Chart: Record
2. Result: List
3. 1: Record
Your screen should look like the image below. If it doesn’t, simply delete the navigation step and start again.
Here, we are presented with three options of paths to drill down further:
- Meta: holds some info about the stock, as well as the timeframe and granularity we chose
- Timestamp: a list of the dates in the range we selected
- Indicators: holds the price information of the stock
Right now, the dates and the price are in two different lists. The price information is another layer down from the dates, which makes this more complicated. Extracting these individually would result in a list of random prices and a big list of dates – not helpful if these two pieces of information are not together.
To solve, we will create a Query Branch. The branch will split our query at this step into two paths. One will retrieve the dates, the other the prices. Then we will merge these branches back together to get the dates and prices in the same table.
To start this branch Right Click on the Navigation Step, then Select the option in the drop-down menu Insert Step After. This will reference the previous step and show the same data. Our newly created set is the start of the branch. Rename this new step StartBranch.
Note: the reason for this reference is that the “Navigation” step is not really a step at all. It is actually a collection of steps that the Power Query editor groups together. You cannot reference “Navigation”, which will be needed later. You’ll see you cannot rename the Navigation step, and if you open the Advanced Editor you can see the breakdown of individual steps. Another option is to perform any action after the Source step, before you drill down. This will cause Power Query to list each drill-down step individually.
Branch 1: Dates
In our first branch, we will pull the dates.
Click on timestamp: List. This will drill down to a list of dates, but they are stored in a UNIX format. UNIX date format is the number of seconds past January 1, 1970 (midnight UTC/GMT), not counting leap seconds. Converting this is quite easy but will take a couple of steps.
First convert the list to a table so we can perform transformations. Click on Transform ribbon. Select the button To Table. Next, under the Add Column ribbon Click the button Custom Column. Change the name to Date and use the following formula in the formula window:
25569 + ( [Column1]/60/60/24 )
Then Select the Date column. Click the Transform ribbon. Under the Data Type drop down, Select Date. Note: do not select Date/Time.
Now we have the date but need to preserve its order. This can be solved by adding an index. Go to the Add Column ribbon, Click the little Drop down on the right half of the Index Column button. Select the option From 0 from the drop down menu. Remove the column labeled Column1, as it is not needed anymore. To do this, Right Click on Column1 and select the option Remove from the drop down menu.
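Written out in M, the date branch above would look roughly like this (step names are illustrative, not the exact generated code):

```
// timestamp holds UNIX timestamps: seconds since January 1, 1970 (UTC)
ToTable  = Table.FromList(timestamp, Splitter.SplitByNothing()),
// 25569 is the number of days between the 1899-12-30 date-serial origin and
// 1970-01-01, so seconds / 60 / 60 / 24 converts to days, and adding 25569
// gives a date serial number that Power Query can type as a date
AddDate  = Table.AddColumn(ToTable, "Date", each 25569 + ([Column1] / 60 / 60 / 24)),
SetType  = Table.TransformColumnTypes(AddDate, {{"Date", type date}}),
AddIndex = Table.AddIndexColumn(SetType, "Index", 0, 1),
EndBranchDate = Table.RemoveColumns(AddIndex, {"Column1"})
```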
This finishes the branch for the dates. Rename this step EndBranchDate by Right Clicking the step in the APPLIED STEPS and Clicking rename.
Branch 2: Prices
Now we need to get the information for the stock prices. Right Click EndBranchDate and Click the option Insert Step After to add the start of the branch. By checking the formula, we can see it says
=EndBranchDate
This step is referencing the step before it, EndBranchDate, and duplicating its output. We need to get back to the StartBranch step in order to start the second branch. Change the value in the formula bar from = EndBranchDate to = StartBranch.
This now loads us back to this step to drill down to the stock prices. We will use the adjusted close – this is the stock price at the end of the day after adjusting for dividends. Here we need to drill down to this information, in the following order:
1. Indicators: Record
2. adjclose: List
3. 1: Record
4. adjclose: List
Next, Convert our list to a Table (see above for this step). Here we have the list of prices and again need to preserve the order with an index column. Go to the ribbon labeled Add Column. Click the Index Column and select From 0 in the drop down.
This is the end of this step, so Rename it EndBranchPrice.
To summarize the query so far:
1. Pulled the information for the MSFT stock for 5 years on a daily basis.
2. Drilled down to the dates, converted them to a better format and added an index to preserve order.
3. Reverted to an earlier step.
4. Drilled down to the daily prices and added an index column.
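In the Advanced Editor, the branching pattern summarized above can be sketched like this (the record and field names are assumptions based on the drill-down steps described earlier, and the conversion steps are abbreviated):

```
let
    Source = Json.Document(
        Web.Contents("https://query1.finance.yahoo.com/v8/finance/chart/"
            & StockSymbol & "?range=5y&interval=1d")),
    StartBranch = Source[chart][result]{0},

    // Branch 1: dates, ending in EndBranchDate
    DateList      = Table.FromList(StartBranch[timestamp], Splitter.SplitByNothing()),
    EndBranchDate = Table.AddIndexColumn(DateList, "Index", 0, 1),

    // Branch 2: prices -- note it references StartBranch, not EndBranchDate
    PriceList      = Table.FromList(StartBranch[indicators][adjclose]{0}[adjclose], Splitter.SplitByNothing()),
    EndBranchPrice = Table.AddIndexColumn(PriceList, "Index", 0, 1),

    // Merge the two branches back together on the Index column
    Merged = Table.NestedJoin(EndBranchPrice, {"Index"}, EndBranchDate, {"Index"}, "Dates", JoinKind.Inner)
in
    Merged
```

Both branches hang off StartBranch, which is why that step needed a referenceable name.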
Merging the Branches
This leaves two separate tables, but it is only possible to output one of these results. We will need to add a final step to merge these two branches into one table.
On the Home Ribbon, Click the drop down arrow on the Merge Queries button. Then Select the option Merge Queries. This brings up the merge screen. Merge the query with itself. On the bottom half of the merge, Select StockValue (current). Click on the Index column for both top and bottom.
Clicking OK will merge the data to itself.
This step uses the Table.NestedJoin formula, which has 6 arguments filled in:
Table.NestedJoin(table1 as table, key1 as any, table2 as any, key2 as any, newColumnName as text, optional joinKind )
In our example, table1 and table2 are the same (EndBranchPrice). This makes sense, as we joined it to itself. You will notice that when joining from the same query, the table argument references a step in that query (EndBranchPrice). We really want to join EndBranchPrice to EndBranchDate. We can simply change the second table in the formula bar to EndBranchDate.
Now, we are joining the EndBranchPrice to the step EndBranchDate. These both have a column named index that we added, which will join our data in the correct order.
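Assuming default names, the edit is a single argument change in the formula bar (the new-column name and join kind here are assumptions; only the second table argument matters for this edit):

```
// Before: the query is joined to itself
= Table.NestedJoin(EndBranchPrice, {"Index"}, EndBranchPrice, {"Index"}, "Merged", JoinKind.LeftOuter)

// After: the second table argument now points at the date branch
= Table.NestedJoin(EndBranchPrice, {"Index"}, EndBranchDate, {"Index"}, "Merged", JoinKind.LeftOuter)
```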
Expand the merged table by Clicking the Expand button on the column name. In the settings, Select only the Date column and Deselect the option Use original column name as prefix.
Remove the index column, as it is not needed anymore. That completes our query with the branch.
Enabling Publish to the Service
If we want to publish this to the service (app.powerbi.com), we will need to make a small edit to our URL. The service will not allow parameters in the base URL. To get around this, we can split our URL using an option in Web.Contents called RelativePath. After Clicking on the Source in the applied steps window, Edit the URL as follows:
Make sure the brackets are correct. Here is the code you can copy and paste into the formula bar:
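A sketch of the edited Source step, assuming the base URL and range settings used earlier (the exact split between the base URL and RelativePath is an assumption):

```
= Json.Document(
    Web.Contents(
        "https://query1.finance.yahoo.com/v8/finance/chart",
        [RelativePath = StockSymbol & "?range=5y&interval=1d"]
    )
)
```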
Now that we have changed the URL, we need to make a change to the settings. This is because Power BI will try to check the base URL (https://query1.finance.yahoo.com/v8/finance/chart) before it runs the query and adds the second part in RelativePath. This isn’t a valid URL on its own, so it will fail.
To do this, publish the report to the service, and Navigate to the dataset settings. This is found in the service by Clicking the ellipsis in the top right, then the option called Settings in the drop down.
You should be in the tab called Datasets. Then Navigate to the published dataset. Under the option titled Data source credentials, next to Web, Click the option to Edit Credentials. Make sure to check the option to Skip connection test.
This query uses a parameter, which enables us to convert it to a function. To do this, Right Click on the query in the Queries pane on the left and Select Create Function.
Now we have a function where we can input any stock symbol and return a list of daily prices. To check multiple stocks, you can apply your function to any list of stock symbols. This can be found in the Add Column ribbon, by Clicking the button Invoke Custom Function. This will return a table for each row. Before expanding, it is important to handle errors, otherwise they could break the query. One option is to Right Click the column header, select the Replace Errors option, and Type the text null.
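As a sketch, assuming the generated function is named GetStockValue (the name will follow your query name), invoking it over a small symbol table with error handling might look like:

```
let
    Symbols = #table({"Symbol"}, {{"MSFT"}, {"AAPL"}}),
    // try ... otherwise traps a failed web call so one bad symbol
    // returns null instead of breaking the whole query
    Invoked = Table.AddColumn(Symbols, "Prices",
        each try GetStockValue([Symbol]) otherwise null)
in
    Invoked
```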
Performance Considerations
While this query will quickly return single stocks, adding multiple stocks will send a separate query for each stock. Make sure you design the correct solution for what you are trying to achieve, and check out this article on API considerations.
Final Result
For those who like M code, here is the final function. You can copy and paste this directly into the advanced editor (See this article on how to do this).
Visually splitting the steps, the query can be visualized like this:
As a user that builds Power BI reports, did you know the different technologies that come into play when you interact with the tool? This is one of the first questions I ask new audiences or people I’m training on Power BI. Why is this important? The reason is that it is crucial to understand what part of the tool you are in, so that you can separate out the different tasks that you should be doing or trying to accomplish in each area. The other big reason is that you need to know what you should be searching for in your favorite web browser when you are looking for answers to your current problem.
With that said, there are two main components to the Power BI Desktop: Power Query (aka Edit Queries) and the Tabular Model that you have access to in the main part of the tool. I’ll mention the Tabular Model first, but we won’t be diving into that in this article. It is responsible for compressing our dataset and gives us the speed we see over all the data we want to slice and dice. We can create relationships in our model window, and we can create additional calculations to extend our original dataset by using measures or calculated columns. The underlying language in the Tabular Model is DAX.
Power Query is our ingestion engine. It connects to our data sources and allows us to perform the ETL (Extract, Transform, Load) activities on our raw datasets. This is extremely helpful and an absolute must to create clean data and shape it into the best form for loading the models we want to work with in our reports. The underlying language in Power Query is called M.
When you toggle open Edit Queries, you are presented with a new window that makes a clean separation between the two interfaces. Connecting to or creating data creates a new query; clicking the Advanced Editor will open another popup window where we can see the M code for all of the steps we have taken in that query. Most of everything you do in Power Query will be in the interface as you get started, but getting to the point of understanding how to manipulate the code in the Advanced Editor will change the way you can build reports immensely.
The vast majority of Power BI users are extremely new to Power Query, so today I wanted to spend a little time talking about a book that has helped me immensely in understanding how to get the most out of that aspect of the tool.
One of the best things you can do when trying to understand something is to get an expert to show you how to do it, whether that is in a class setting, a presentation or a book. They have knowledge around the area and can streamline your learning process. People learn in different ways, but I would argue that each type has different levels of retention. For instance, when I go to a session that is heavy in coding and techniques, I take a bunch of notes because I know that while it all makes sense in the session, I’ll forget the specifics and have to refer to my notes when I need to apply what I learned. The same goes for books I read: I grab snippets of techniques and write down a bunch of reference things for later. Whereas, if I go to a class and have to walk through the steps on my own or take them away as homework, it forces me to practically walk through an exercise and the steps in order to complete it. Doing this locks the technique in, and I’m able to recall how to do it when I need it instead of having to look things up again.
If you are serious about getting better at the ETL portion of your Power BI report building, there are numerous resources out there, but today I’d like to spend some time talking about one in particular that I would highly recommend, authored by Gil Raviv.
(Disclaimer: Gil Raviv is a friend, and his book was gifted to me)
Suffice it to say, Gil is one of the best to learn from, since he was part of the MSFT team that created Power Query. To read more about Gil, check out his bio on his website here -> https://datachant.com/about/
The main reason I’m recommending this method to learn is that it isn’t JUST a book. Think of this as an instructional class, where you have a bunch of material for both preparation and homework. What Gil does here is amazing from a sharing perspective. Along with the book, you are given access to almost 200 files that include the data sources and the Power BI Desktop files with the solutions built into them… Take a moment to digest how much content is in here; it is quite astounding.
The layout of the book is designed with the business user in mind, and focuses on the most often used steps and how you perform them. As you advance through the book, you will have the opportunity to build along with each lesson; if you get stuck or don’t quite understand a lesson, then you have the solution file as a reference. I loved the interaction here: in most of the lessons I just created my own queries right alongside the solution queries, and if I ever got jammed up I could just click into the solution query for the correct step. This made things really conducive to staying engaged in the lesson and moving through a lot of material.
Like our journey with Power BI, it starts simple and moves towards advanced concepts. About mid-way to ¾ of the way through, you will be introduced to the Advanced Editor and M. Here you will start to learn how you can manipulate the queries themselves without the UI… Getting here puts you in such a powerful position with your ETL. Working through these lessons has empowered me to easily understand how to manipulate my backend sources, modify the steps I built in the UI with additional filters or actions, and troubleshoot potential issues in Power Query. If all that isn’t enough, Gil includes a section on his top pitfalls and things to avoid, which is an invaluable resource.
I really enjoyed this book, and it sets a high bar for me moving forward: I will be looking for other materials from other authors that provide this level of engagement with something I want to learn. The structure of the book, using the chapters to read through the step-by-step way to perform each action and having the source material and the solution files all in one spot, makes for a fantastic way to learn. Whether this book is your first foray into Power Query, or you choose to go it alone, I highly recommend that you get started on the journey.
We are starting today off with a fun chart: a filled donut chart. Typically, I don’t use donut charts, but in this case I think we have a good reason, and it’s delicious…
The data being used in this visual varies from 0 to 100%. This could be something like a test score or a compliance number. Thus, we will be using the donut chart to represent a completion of 100% or some variant below.
Video on how to build this chart.
During this video we used a couple measures. They are the following:
Measures
Average Product Score = AVERAGE( 'Product Scores'[Score] ) / 100
Average Product Score Inverse = 1 - [Average Product Score]
Where the value of the Score comes from the Product Scores table. The Score column ranges from 0 to 100%. If you like this visual and want to download a sample file visit my GitHub page.
As report authors, we sometimes get caught up in how easy it is to create a report and provide value to the business. Each report is an opportunity to make a big contribution to the organization, and Power BI makes it easier than ever to turn many of those reports around quickly. This is a good thing, of course. But sometimes we can get caught up in the madness of turning out another report, with only a flash of recall that we could have reused the same or a similar model from a different report. The internal monologue kicks in.
“Pffft! Re-use a model? That is way too much work! Why do that, when we can just create a copy of the PBIX (Desktop file) and have two models to manage with the two reports for the same business area! That sounds like fun!”
Hmm… Or does this actually sound awfully similar to the challenges one might face with a sprawling Excel solution, where variants of logic are buried in different files and only the composer knows how to bring order from the chaos? Might I suggest that we spare ourselves that sort of pain and just learn how we can easily leverage our already hard-fought model work, avoiding tedious updates without having to overcomplicate our nirvana of sticking to the business world. But how would we do that?
The Answer
Power BI datasets. Power BI datasets allow us to re-use our model across multiple reports. This simplifies and speeds up future report authoring. Also this gives us the building blocks for sharing that dataset to a wider audience.
How do we create one?
Technically, we’ve likely already created many. You see, when we publish a Power BI report, we publish a dataset along with it. This dataset is stored in the Power BI Service, and our deployed report relies on it. The connector “Power BI datasets” allows us to connect directly to any of these datasets that we have permission to edit. This means that we have the ability to extend a single model across multiple reports without the need of standing up a separate Analysis Services server anywhere. This is a big deal: it allows the everyday business user to leverage a reusable model. A single change or update to a calculation can update multiple reports at the same time. One measure or calculation addition can be done in one place instead of many.
All we need to do to create a dataset that we can connect to is publish a PBIX file that contains data. I’ve adopted a practice recently: rather than generating my first report and reusing that model, I now upload a PBIX file that ONLY contains the model and I name it something like “Sales-Model”. Now I have an object that I’ll know serves the purpose of just being a model instead of a report. This makes it easier from a traceability standpoint when looking at the related objects in the Service or selecting it from my list of options when choosing my dataset.
How do we use one?
Using a Power BI dataset is one of the most straightforward connections in Power BI: simply Select Get Data -> “Power BI datasets”.
This brings up the menu of all the datasets in the Power BI Service. The list that is shown contains the datasets that our user account has access to use. The great thing about these datasets is that we now have the ability to connect to and use a dataset from a different workspace, provided we have permission to edit them. This feature is called a Shared Dataset. Select the dataset and your report is automatically connected to the dataset.
Now, what we’ll notice here is that using this feature automatically pulls in a model for us and we can start building our report. This data source connection behaves exactly the same as if we created our own “live” connection to an Analysis Services instance we would set up. Probably not shocking to any reader here. But, that is exactly what is happening in this case as well. We get the benefit of Microsoft handling all that painful work for us while we reap the benefits of a streamlined process.
As with any “easy button” solution, there are pros and cons. What I mean is that in our new reports we do have easy access to the model, and you can start building reports immediately. However, we don’t have the ability to modify the model or the ETL processes. If we want to edit, then we need to go back to the original dataset to make those changes.
But the minor inconvenience of having 2 PBIX files open if we need to in order to make updates to the model is trivial compared to being able to connect many reports to that single model. The live connection does still allow the report author the ability to create measures. So, if there are measures that are only suited to our report and not the overall model we still have the ability to add them.
Once we’ve completed our report, we just publish as we normally would, only this time the dataset is already out in the Service and only our report is published. There are so many things we can now do to share that dataset, but we’ll leave that to another article.
If you’ve never used this method before, I would highly encourage you to try it out. Any time you save by reducing the number of models you maintain means you can produce more reports faster, and spend less time maintaining all the reports you are publishing.
Happy report building!
This post will answer how to sort a measure that returns text values to a custom order, without affecting other columns. It will utilize the DAX functions of REPT() and UNICHAR(8203) – a Zero width space.
The requirements
I’ve been working at a florist! In this example, I have been in charge of looking after four plants, named A, B, C and D. The florist owner is a big Power BI fan, and asked me to measure how much water I have been giving them a day to put in a report. They need at least 20ml to survive, but over 50ml will stop them growing as well.
Create a table with the following: when the flowers get under 20 ml, label it Bad. When the flowers get 20 – 50 ml, label it Good. Finally, if the flowers receive over 50 ml, label it Warning. I’ve been asked to show them in the order of Bad, Warning, then Good. This is vital so the plants needing attention are at the top of the table.
Creating the table
Here is the measure I create:
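A minimal sketch of such a measure, assuming a [WaterIntake] measure that returns the millilitres given:

```
Water Target =
SWITCH(
    TRUE(),
    [WaterIntake] < 20, "Bad",    // under 20 ml
    [WaterIntake] <= 50, "Good",  // 20 - 50 ml
    "Warning"                     // over 50 ml
)
```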
Adding this to a table:
Now comes the question: how can I order this to put Bad and Warning together? If I order by the Water Target measure, this will be alphabetical. Sorting by WaterIntake cannot give me the correct order either. One option would be to make a conditional column and use “Sort by Column”. However, this may be a complicated calculation, especially on more complex measures. In addition, it will sort every visual by this column, when I only want to sort this one table.
Creating the custom sort
My solution? Make use of the UNICHAR() function. For those unaware of this function, UNICHAR() can return characters based on their UNICODE number. This can include more text characters not included on the standard keyboard.
A character that can help is UNICHAR(8203). This is a “Zero width space”: a space that has no width, so it is essentially invisible in the measure output. The Zero width space is still recognized as a character by DAX. Spaces come before any letter in the alphabet, two spaces come before one, and so on.
The second function I will utilize is REPT(). REPT(), or replicate, simply repeats text multiple times. It takes two arguments: the text and the number of times to repeat it.
For example: REPT( "Hi", 3 ) will return the text "HiHiHi"
To change the sort order, I will repeat the Zero width space in front of the text. The text I want to appear first will have the space repeated the most amount of times. This will put it first in an alphabetical list. I will use the & symbol to concatenate the Zero width spaces and the text.
Now, “Bad” has the Zero width space repeated three times in front of it. This now puts it first in an alphabetical list. Warning has the Zero width space repeated twice, putting it second. “Good” has it once putting it third.
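Putting it together, the sorted version of the measure prefixes each label with the right number of Zero width spaces (again assuming the [WaterIntake] measure):

```
Water Target =
SWITCH(
    TRUE(),
    [WaterIntake] < 20, REPT( UNICHAR(8203), 3 ) & "Bad",      // sorts first
    [WaterIntake] > 50, REPT( UNICHAR(8203), 2 ) & "Warning",  // sorts second
    REPT( UNICHAR(8203), 1 ) & "Good"                          // sorts third
)
```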
Applying the sort
Now I can arrange my table by Water Target (alphabetical), in an ascending order:
And success! I’ve added a custom sort to my text measure, without making any other measures or columns.
The release of grouping visuals was an extremely welcomed feature. As one who builds lots of reports, grouping elements together is essential to stay organized and to increase report building speed. Since I’ve been using this great new feature, I found an interesting design element to style groupings for reporting impact. The grouped visuals feature enables a new property, background color, which can be applied to the entire group of visuals.
See the following example of setting a background around two visuals.
In this example the intent is to show the user that these two visuals are related. The graph on the left shows the number of units sold for a selected time period. The bar chart on the right shows the relative sales over time represented as a percent change. This illustrates the principle of position and direction. The number of units sold is what happened right now; it is my place in time with respect to sales. However, this does not show any context to performance. The percent change provides the directional context. Since the position and direction are an important insight as a paired visual, we use the grouping to visually bind the two.
For those who have done some research around design principles, inevitably you will stumble across the Gestalt Principles of design. Grouping visuals with a common background falls under the Law of Common Region or the Law of Proximity.
Alright, let's walk through how to use grouping with background colors.
Once you have created the visuals that will be grouped together, select each one by holding CTRL and clicking each visual.
Right-click on one of the visuals, select the menu item labeled Group, and in the flyout menu select the option called Group.
A grouped element will be created in the Selection Pane.
Note: If you don't see the Selection Pane, you will need to turn it on. The setting to turn on the Selection Pane is found in the View ribbon, with the check box for Selection Pane. See below for reference.
With the newly created group selected, click on the Paint Roller (Format) icon in the Visualizations Pane.
Expand the property section called Background.
Toggle the background to On and select a Color from the drop-down menu. For this example, I selected the very first shade of grey in the first column of colors.
The final product will be a grouped arrangement of visuals with a shaded background.
To extend this idea further, we can take the same approach when working with text boxes and visuals. Often, I find I need more style for applying a text box or header to a visual. In these cases, I will use two visual elements to create one visual. See this example of two visuals with custom titles created with a textbox.
Note: Backgrounds are colored differently to illustrate that each background for the grouped visuals is different.
While this meets the need, the boxes are not identical in size. This violates yet another Gestalt principle: symmetry. The bounding regions of the elements inside the grouping define the outer perimeter of the background shading. Knowing this, we can modify the visuals within the groups to provide a symmetrical background shape.
Here are the same before and after images, with each visual object colored to show the adjustments in size for each visual type. This creates the proper background sizes.
Before:
After:
The visual on the left required an increase in the width of the text box at the top to get the desired width of the background shape. By contrast, the visual on the right required an extension of the bar chart in length to acquire the desired length of the background. The result provides a symmetric view of both visual groups.
I recently encountered a really frustrating experience related to a set of reports seemingly not updating after some data source changes. I'd done this change before, and had a co-worker take on the task of updating some of the other reports in the same fashion. After a bit he reached out and explained that a few of the reports wouldn't update while the others had. This was odd, since we were making the same type of change. Cue hours of testing, changing, fixing one report without any idea how I did it, and then hitting a brick wall with the remaining two.
The Setup: The data sources for the original reports were Excel spreadsheets stored in a SharePoint Online folder. The PBIX file connected to the Excel files in SharePoint Online using a Web connection. Once the reports were created, we stored the PBIX files in a separate SharePoint Online folder. Next, we opened the Power BI Service and connected directly from Power BI to a SharePoint team site; if you aren't familiar with that process, you can read about it in depth in an article that Mike Carlo wrote -> here. (The TLDR -> Get data -> Files -> SharePoint Folder -> navigate to folder -> Click on PBIX -> Connect)
By connecting to the PBIX files from the PBI Service, the dataset, report, and dashboard for the report are loaded and synced. Where this comes in very handy is linking the SharePoint folder to your Windows Explorer and accessing the file there: making changes and saving the PBIX in that "local" location no longer requires you to re-publish the report from the PBIX, and the changes make their way to the Service without any further work. I had made several changes to the reports from an aesthetics perspective, and maybe added a new measure here and there, but hadn't done anything drastic until just recently when we updated the sources. Due to some changes in the backend processes that were generating the Excel files, it became apparent that it would be easier for the Dev team if we used CSV files instead of xlsx. The team went through and changed the reports, and 3 of the 5 reports we were working on broke in the Service, giving this error.
At first, I thought it was the use of parameters in the report, but ruling that out actually helped me figure out why the report in the Service wasn't updating: the parameters were still showing the "xlsx" file type from the files that I had already removed from the SharePoint location. After repeated attempts to change the data sources manually in the original file, delete and replace, and every other combination of things to try a refresh, I was stumped. What was more confusing is that connecting to the file again from the PBI Service created another dataset & report of the exact same name!
In my mind, that wasn't supposed to happen, and I was getting more frustrated because things were not behaving as I would expect. My initial assumption was that the PBI Service would look at the new PBIX of the same name, recognize the changes, and replace the existing dataset and report, the same way that it does if I manually push a PBIX via the "Publish" action. But using this direct connect method, refreshing the dataset did nothing… (cue the "grrrr").
The solution:
Thanks to the fantastic Power BI team, they were able to determine my issue and share with me the root cause of some of the datasets not updating. When you connect to PBIX files in this manner, the PBI Service looks at the root id of the PBIX file in the OneDrive location, and that file id cannot be deleted and replaced with another one. If you do that, you get the above error on the dataset source because it cannot find the file id. If you remove the file from SharePoint, then when you insert it back into the folder the ID will be different. This is what breaks the dataset in Power BI, and it is also the reason it allows what appears to be the same named dataset and report. However, there is good news: because you are in SharePoint, you can go back in the version history of the folder, pull the previous versions of the reports, and resolve the issue in the datasets. I did end up having to replace one of the reports, but lucky for us it was only a single user for the customer report, and re-sending a shared link was really low impact compared to what it could have been.
Here are the guidelines I can offer after going through this.
Do:
Sync the SharePoint folder to your local storage
Open the report directly from the synced folder and update it in location
Leave the original file in the folder and overwrite it with the new version (saving over the existing file preserves its ID)
Do try to resurrect your previous version of the report PBIX in the folder (if you are hitting this issue now)
Don’t:
Move the file to your local machine, then back to the SharePoint folder
Remove the file from the folder (you can do this if you move it to a SharePoint folder, update it, and move it back, but if the dataset refreshes during that time in the Service it will break, and you will need to fix the credentials when you have put the file back)
I hope you found this post worthwhile. Error messages are hard to track down to their root causes, and since this was a tricky one for me to solve, I thought I would share the experience and hopefully help others avoid it or solve the errors that match above.
When testing this issue again, I did uncover this error message, which is much more specific. It came up after I made the breaking change, but prior to an automated refresh.
Ever need two different scales on the Y-axis of a line chart? If so, then this tutorial is for you. While creating a dual Y-axis line chart is pretty common in Excel, it is not as easy in Power BI. The only standard charts that come with Power BI Desktop that enable a dual Y-axis are the column and line combo chart types.
For this particular visual I needed to show correlation between two time series with different Y-axis scales. The Y-axis on the left of the chart had data elements in the thousands, but the right side needed percentages. The tutorial below illustrates how to accomplish this by building a custom visual using the Charts.PowerBI.Tips tool.
Video Tutorial
Note: there are a bunch of really good custom visuals that can be downloaded from the Microsoft App Source store. However, this article will not review all the third-party visuals that can produce a dual Y-axis line chart.
Source files
All files used to create this visual are located here on GitHub.
Layout file
The file used in this tutorial was a derivation of the Sunset layout from PowerBI.Tips. If you like this file, you can download it here:
There are different ways you can connect to a multitude of different data sources. I’ve written about the different connection types before and you can find those articles here if you are unfamiliar with what I’m talking about.
When you import data and publish a Power BI report to the Power BI Service you need to schedule a dataset refresh in order for the report data to be updated. This can be done manually by refreshing the PBIX and re-publishing the file, but for any production report that you want to set a schedule for, you need to ensure that you set up things to automatically update without further intervention.
The first thing you need to check is whether or not you need a gateway. Any data source that is located on your company's servers, on a local machine, or within certain types of cloud infrastructure is considered "on-premises". When a data source is on-premises, you require a gateway in order to refresh the data. If your data is in a cloud source such as OneDrive or SharePoint Online, then you will not need a gateway to refresh.
Once you determine whether or not you require a gateway, you need to understand that there are two different types. The first is called a personal gateway; this gateway will allow you to use the majority of Power BI data sources with no configuration. After you install this gateway and publish your report using a data source that it supports, you can set up a dataset refresh in the Power BI Service without any additional setup. The second type of gateway used to be called the "Enterprise Gateway", but now it is simply the gateway that is not in personal mode.
This gateway is designed for use in your enterprise-wide reports. It supports many more data sources for refresh (that list can be found here), and it also requires some additional setup, configuration, and management from an administrator. Let's walk through how you set one of these up, so that you understand how to do it or how you can request to have it done for your organization. It is highly recommended that you install the "Enterprise Gateway" on a server that will never shut down or be on a scheduled maintenance cycle. You can download this gateway to your local computer, but if the computer is off, so is your gateway, and the reports will not refresh.
Installing the gateway:
Installation is very similar to the personal gateway, except you have options to add the gateway to a cluster or incorporate a recovery key. ALWAYS set a recovery key and SAVE IT! This key will allow you to move the gateway at a later date, if you need to, without impacting any of your reports. This key is GOLD; treat it like it and you will save yourself a bunch of headaches in the future. Outside of that, the install of the gateway should be really straightforward; if you do run into issues, there is a comprehensive guide to troubleshooting the problems that you can find here. I recommend using an administrative account to set up the gateway, because the ownership of the gateway in the Service will begin with this user. If you have one account that manages all the Enterprise Gateways, you'll save yourself a ton of pain down the road from an administration and discoverability standpoint. If you aren't the admin, be sure to either have an admin set up an account for you or let them know that you have an Enterprise Gateway set up using your account, and have them securely store the gateway recovery key. Alternatively, if an admin account is not used, be sure to add an IT admin as a gateway administrator in the next step as you configure the gateway.
Configuring the gateway:
After the installation of a gateway, you need to log in to the Power BI Service (app.powerbi.com) with the user that you configured the gateway with. Go to Settings > Manage gateways and you will see several different configuration options that affect how end users can see and interact with data sources on this gateway. The main call-out I have in these options is the first one: more often than not, you will want to make sure that you allow users to "mix" cloud and on-premises data sources.
If gateway administrators want other people to be able to access and create new data sources on the gateway for others to use, every on-premises data source must be set up and configured (unlike with the personal gateway). This means that the data source can use a single user/permission set, and the report and end users will inherit the read permissions of that user. (A caveat to that would be the application of row-level security, or a live connection, but that is beyond this overview.)
After the data source has been configured, the administrator needs to add any users to the data source so that they can deploy reports using the gateway. This is an ideal use case for Active Directory security groups: administrators can apply a single group and add and remove users from that group, versus managing all the data sources in Power BI. If a user is not added to the data source on the gateway, the report publisher will not be able to set up a refresh using that gateway.
Scheduling a Report dataset to refresh:
Now that you have completed the set-up of the gateway and added your users (including yourselves) to the gateway, you can publish a Power BI Desktop report and schedule a refresh.
First, publish the report from the Power BI Desktop to the workspace that you want the report to live in.
Next navigate to the workspace and select datasets, follow the dataset ribbon to the right and click on the ellipses:
Navigate to Datasets and expand the Gateway connection section. Enable the use of the gateway if it is not enabled already. All your sources will be listed here, and if you have any data source connection strings that do not match exactly what is in your PBIX file, it will become apparent very quickly that you have a broken dataset. All data sources in the gateway need to match exactly what is in your PBIX file.
Once the gateway is enabled and all your data sources are working, you can schedule a refresh.
One thing in particular to note here: when you deploy your report from your local machine to the Power BI Service, all times are stored in UTC. So, if you are doing time-specific calculations, you may need to take time zones into consideration. Reza Rad has a good article outlining some methods to sort this problem out; you can find that here.
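As a minimal sketch of one such adjustment (the offset value here is an illustrative assumption, and a fixed offset ignores daylight saving time, which would need extra handling), a DAX measure could shift the Service's UTC clock into local time:

```dax
Last Refresh (Local) =
VAR OffsetHours = -6  -- assumed fixed offset from UTC, e.g. US Central Standard Time
RETURN
    -- UTCNOW() returns the current UTC datetime; adding a fraction of a day shifts it
    UTCNOW () + ( OffsetHours / 24 )
```

Because DAX datetimes are stored as day counts, dividing the offset hours by 24 converts it into the datetime's native units before adding.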
A recent update to the dataset refresh now also allows us to add email-enabled groups or other specific users to the refresh failure notification section. This is extremely helpful in ensuring a wider audience of users can be notified if any refresh fails.
I hope you found this overview helpful, stay tuned for more blogs that walk through specific data source connection and refresh setups that will help ease the process of connecting to, and refreshing all your various data sources in Power BI.