There are often questions surrounding Publish to Web. What is it? How can I use it to share my reports? This video walks through the proper usage for Publish to Web and how to manage the Power BI tenant settings.
Check out this video explaining the full details surrounding this topic:
Of all the connection types, I’ve always gravitated towards this one. I imagine it is because I come from the database developer side of things. I’m a big fan of doing things one time, and having one version that controls many reports is extremely appealing. In fact, this topic was my first blog on Power BI so long ago (here for those feeling nostalgic). The live connection is the most powerful of all the connections that Power BI has to offer in my opinion.
Before we dive into the deep stuff, are you aware that you can use this connection type without your own instance of Analysis Services? Let me explain. Anyone who uses the “Power BI Service” connector, first made available in April 2017 and released to GA in August 2017, is using a live connection to an Analysis Services instance hosted in your Power BI tenant. In fact, each time you build a Power BI report in the Desktop, you are building a Tabular model that is then created in the cloud upon publish! This live connection method allows you to gain a bit more control. You can deploy a single dataset to the Service and re-use it to build multiple reports! Having your own instance of Analysis Services on-premises or in Azure lets you maximize your development and deployment efforts and truly create a sustainable reporting solution.
The evolution of a Power BI solution “should” typically land in a space where one or several centralized models are being used as the backbone for the vast majority of Power BI reports. This centralized approach is imperative in order for large scale BI initiatives to be successful. SQL Server Analysis Services Tabular is the implementation that I see most often employed due to its relational nature, compression, in-memory storage and speed. That being said, let’s dive into the details of what a centralized model gives us, and the pros & cons of the Power BI Live Connection.
Cons:
The most limiting of all the connection types in terms of disabled Power BI features.
Notice: the Data and Relationships icons are not visible after making a Live Connection.
This is without a doubt the most intimidating connection for the end user who isn’t familiar with the live connection. As with Direct Query, there are features and capabilities in the Power BI Desktop that are just flat-out turned off or completely gone. ETL / M / Query Editor? Gone. Data pane, DAX tables, calculated columns? Gone. Power BI becomes only the front end of the process. The expectation is that you are doing all the data mashup / ETL and modeling behind the scenes, and as such, these features are all removed.
However, a really great addition in the May 2017 Desktop release was the addition of allowing measures to be created on top of the live connection. This means that if you have a couple measures you need to add to a single report, you can easily add those in the Power BI Desktop without the need to have them added to the model.
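A report-level measure added on top of a live connection is just ordinary DAX. As a sketch, here is a hypothetical year-over-year measure; the [Total Sales] measure and 'Date'[Date] column are assumptions about the remote model, not part of this example:

```dax
// Hypothetical report-level measure created in the Desktop over a live connection.
// [Total Sales] and 'Date'[Date] are assumed to exist in the underlying SSAS model.
Sales YoY % =
VAR PriorYear =
    CALCULATE ( [Total Sales], SAMEPERIODLASTYEAR ( 'Date'[Date] ) )
RETURN
    DIVIDE ( [Total Sales] - PriorYear, PriorYear )
```

The measure lives only in the report file, so it is a good fit for one-off calculations that don’t belong in the shared model.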
Cost
Without a doubt one of the most appealing aspects of Power BI is the price. It is amazing the amount of power and value you get for $10/month. (Desktop is free, but let’s just call it $10 because you need a Pro license to share.) When you scale up and start to use enterprise-level tools, you need to look at the costs those include. This post won’t go into details because there is a myriad of options out there, and the number of options increases exponentially when you start comparing Azure to on-premises. Suffice to say, you’re looking at a much heftier investment regardless.
Different Tools
Another drawback to this connection is that we are now pulled out of the Desktop as a standalone solution and thrown into development areas. At bare minimum we’ll most likely be using Visual Studio, Team Foundation Server for version control, possibly SSIS, SQL Databases and SSAS. Throw in Azure and you might be using Azure SQL Data Warehouse with Azure Data Factory and Azure Analysis Services…
Pros:
Change Control
This feature, which can be implemented with Team Foundation Server, allows a developer to manage their code. In the context of the model, this means that I can check in / check out and historically track all the changes to the model. This in turn allows me to roll back to a previous version, control who has access to the model, and secure access to the model to a known group.
Central model that supports many reports
Hands down, the biggest benefit of the model / live connection is that I can build a central model that supports a vast number of reports. This streamlines development, lowers the time for IT to implement changes across all reports, and centralizes calculations so that all parties are using the same metrics.
No memory or size constraints in Power BI
Another great feature is that a dedicated server / Azure implementation can scale up to whatever RAM is necessary to support the model. The limitations of the Desktop are gone, and Power BI is capable of handling insanely high volumes of data, because the heavy lifting is happening behind the scenes. A prime example of this is the new MSFT demonstration that uses 10 billion rows of data related to NY taxis. (Did you catch that? That is a “B” for “BILLION”.) I saw it for the first time at PASS Summit 2017, but you can see a quick demo of that below, or here in the first portion of this scale up & diagnostic video.
Now, the underlying hardware in Azure must be immense for this to contain the 9TB of data, but I still think it is amazing that Power BI can provide the same drag and drop experience with quick interaction on a dataset that large. Simply amazing.
Better security
Along with the security of being permissioned to access the model, there is an extremely valid argument that an SSAS model is simply more secure. While the functionality exists in the Power BI Desktop to enable row-level security, the vast majority of the time the report author shouldn’t control access to sensitive information. Having that security live in a file that others can access and modify isn’t something that passes muster in most orgs. With a model that has limited access, change control and tracking, and a process for deployment, the idea of a DAX function controlling security access to information becomes much more palatable.
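To make that concrete: a row-level security role in an SSAS Tabular model ultimately boils down to a DAX filter expression on a table. Here is a minimal sketch; the 'Sales' and 'SecurityMapping' tables and their columns are hypothetical names for illustration:

```dax
// Hypothetical row filter expression for a security role on the 'Sales' table.
// 'SecurityMapping' is assumed to map user logins to the region each user may see.
'Sales'[Region]
    = LOOKUPVALUE (
        'SecurityMapping'[Region],
        'SecurityMapping'[UserLogin], USERNAME ()
    )
```

Because this expression lives in the model, it benefits from the same change control, access restrictions, and deployment process as everything else in the model.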
Partitions
This feature enters stage left and it is just “Awesome”. Partitions in a model allow you to process, or NOT process, certain parts of the model independently from one another. This gives an immense amount of flexibility in a large-scale solution and makes the overall processing more efficient. Using partitions lets you process only the information that changes, thus reducing resource usage, reducing processing time, and creating an efficient model.
All in all, a lot of this article was about model options behind the scenes, but effectively this is the core of the Live Connection. It is all the underlying Enterprise level tools that are required to effectively use the live connection against a SSAS instance. In some respects, I hope that this gives you some understanding of the complexity and toolsets that are actively being used when you are using Power BI in general. All these technologies are coupled together and streamlined to a clean user-friendly tool that provides its users with immense power and flexibility. I hope you enjoyed this series and that it brought some clarity around the different connection types within Power BI.
Thanks for reading! Be sure to follow me on Twitter and LinkedIn to keep up to date with Power BI related content.
If you have spent any time working in Power BI, your very first step is to, wait for it… Get Data. Using Get Data will start loading your data into the Query Editor for Extracting, Transforming and Loading (ETL). When you start out in Power BI it is likely that you don’t spend much time in the Query Editor. However, the longer you use Power BI Desktop, and the more you learn, the more you find that the Query Editor is highly under-rated. There are so many amazing transformations that you can perform on your data. After some time getting comfortable you’ll be building larger queries with many, many steps. Eventually, it may look something like this:
Perhaps your queries are already this long, or maybe even longer. Wouldn’t it be nice to reduce the number of steps? It would make them easier to read. In this tutorial we are going to talk through how to combine several steps when you create a new column. This is achieved by slightly modifying the M scripts, or equations, when new columns are created.
Doing this won’t cut every query in half, but it will remove a couple of steps per query. This makes your queries easier to read and maintain. Also, using this best practice will save you headaches in the future. At some point you will run into a data type error: for example, when you try to join multiple tables on columns with different data types, or when you need a measure to create a SUM but the column data type is still text.
Let’s get to the tutorial.
Open up your Power BI Desktop program and on the Home ribbon click Enter Data. Using the dialog box for entering data enter the following table of data:
Sales
100
120
94
20
80
Once you’ve entered your data the Create Table screen should look like the following. Be sure to name your table, in this case I named my data table Data…. yea, feeling a lack of creativity today. Next, click Edit to modify the query before loading the data into Power BI.
This will open the query editor. Click on the Add Column ribbon, then select Custom Column. The Custom Column dialog box will open.
Note: When you bring in the created table, notice that the Sales column already has its data transformed into a whole number. Also note on the right, under Applied Steps, that we have two steps: one for the Source and one for Changed Type. Power BI adds this separate step because not every M equation (M is the language used to perform the ETL in the Query Editor) handles data types on its own.
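For reference, that auto-generated Changed Type step is a call to Table.TransformColumnTypes. It looks roughly like this; the Source step itself is generated by the Enter Data dialog and is not reproduced here:

```m
// Auto-generated step that casts the Sales column to a whole number.
// Source is the step produced by the Enter Data dialog (not shown).
#"Changed Type" = Table.TransformColumnTypes(Source, {{"Sales", Int64.Type}})
```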
In the Custom Column dialog box enter the column name and the equation below. Click OK to add the column.
Note: It is good practice to name the new column something meaningful. This helps when you are working in the query editor. When you come back to your query months later and wonder what you were creating, the column names will help! Trust me, I learned this lesson the hard way…
Great, now we have our new column. Notice the icon in front of our column named Increase Sales. This means Power BI thinks that the data type of this column could be text or a number. Let’s change it. Click on the ABC123 icon and select Whole Number. Now the column data type has changed to numbers only.
If we glance at the Query Setting under the Applied Steps, we now have 4 steps. Two were added, one for the added column and the second for the data type of the column. This is not what we want. Instead we would like the column to be added with the appropriate data type right from the beginning.
Let’s remove the very last step, labeled Changed Type1. To do this, click on the little X next to the step. This will remove the step. While highlighting the Added Custom step, click in the formula bar and modify the equation to include the Int64.Type argument shown below. Press Enter to execute the new formula.
= Table.AddColumn(#"Changed Type", "Increase Sales", each [Sales] * 1.1, Int64.Type)
Note: if you don’t see the formula bar it can be toggled on or off in the View ribbon in the check box titled Formula Bar.
The query editor should now look like the following:
Without adding an extra step on the Query Settings, we have changed the data type. I know this might seem trivial, but when you are creating large queries, they can get difficult to read. For me, I find this technique quite useful, and it doesn’t only support whole numbers. This technique also supports the following data types:
Data Type – Syntax
Whole Number – Int64.Type
Decimal Number – Number.Type
Dates – Date.Type
Text – Text.Type
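The same optional fourth argument of Table.AddColumn works for each of these types. For example (the “Discounted Sales” and “Sales Label” column names here are just illustrations, not part of the tutorial):

```m
// A decimal column typed as a decimal number in the same step it is created
= Table.AddColumn(#"Changed Type", "Discounted Sales", each [Sales] * 0.9, Number.Type)

// A text column typed as text in the same step it is created
= Table.AddColumn(#"Changed Type", "Sales Label", each "Sold " & Number.ToText([Sales]), Text.Type)
```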
Thanks for following along. If you liked this tutorial, please share it with someone else who might find this valuable. Be sure to follow me on LinkedIn and Twitter for posts about new tutorials and great content from PowerBI.Tips.
First off, let me say WOW! The announcement of Layouts was well received by the Power BI Community. Thank you so much for the positive feedback. So much so, that I got excited and developed another Layout, Splash o Red.
This new layout, Splash o Red, contains two pages. It uses the drill-through feature to navigate from the main page to the sub-page. The demo below shows you how it works. I hope you enjoy this new layout; you can download it here.
To see how to load data into the Layouts check out the following video below. It will guide you through the process. Additionally, each layout download will contain an instructions sheet just in case.
PowerBI.tips is extremely happy to be part of, and contribute to, the Power BI Community. We’re constantly trying to think of new and interesting ways we can enhance the Power BI experience for everyone by providing helpful tips & tools. Some examples of what we’ve done so far are Power BI Desktop Theme generator, and the Power BI toolbox:
Today we’re happy to announce another tool that we hope you will get an immense amount of use from.
Introducing:
Layouts are Power BI Template (PBIT) files that contain layouts with visualizations already in place and only require your data to light up. We have been, and will continue to be, using as many visualization techniques as we can, taking some of the best layouts we’ve seen, and those we’ve developed, to create these files for you. This means that you don’t have to spend any time worrying about the vast number of design techniques. Additionally, it will save you time placing or moving things around on the report page. All that is required of you is to download the PBIT file, load your data, and start selecting the pre-placed visualizations. With each layout template there will be a sample file (demo) that will show you the look and feel of each layout so you can easily choose the layout you want on each report page. You can always change the visual type with the click of a button.
Today, we’re releasing the first of our efforts with a 3 tab layout focused on the business analyst. These layouts are designed with maximum flexibility in mind, to let you alter color themes, easily change the visualization type, and provide enough visualizations to give you a huge initial benefit. One of the best parts about the Layout is that you are not limited by our designs, they are just the starting point, you can fully customize them however you would like. We just provide you with a solid foundation to build from.
Demo of Layouts:
In addition to the first analyst layout, we’re releasing an Info-graphic style layout that contains some deeper interactions using Bookmarks. However, these Layouts will be a bit more restrictive in terms of how much you can change visually. This is due to the need to rely more heavily on other tools to create the look & feel. Thus, you will have limits in just how much you can change. Our hope with these is to build more stunning report layouts that will maximize presentation, or help wow an audience.
We are releasing layouts for analysts and executives, while still having some fun with highly stylized files. We hope you get as much use out of these new Layouts as we know we will! Over time, we will continue to develop and produce an entire library of selections. Thanks to all of you who make this such a fun and great Community to be a part of.
There are cases when working with Power BI files, that you would want to transfer a visual from one report to another report. While this feature is not available within the Power BI desktop program as of today, this is available within the PowerBI.com service.
To copy a visual from one report to another, both reports need to be located within the PowerBI.com service.
Navigate to the report where you want to copy the visual from. In this example I have created two reports, Sample 1 and Sample 2. I will copy a table visual from Report Sample 1 and paste it into Report Sample 2. Once you have opened the report Click the Edit report button on the top navigation bar. This will change the report into edit mode and provide additional options.
Select the visual you wish to copy and press CTRL + C, to copy the visual. Next click on the new report Sample 2, in our example, where you want to paste the visual.
Again, click the Edit report on the report (Sample 2) where the visual is being placed.
Finally, click anywhere on the white space within the report page and press CTRL + V to paste the visual into the new report.
That is it. You have successfully moved a visual from one report to another.
Points to Consider:
Both the Copy From report and the Copy To reports need to be in the PowerBI.com service.
Once you have copied the visual over to the new report you can download the report by clicking File, then selecting Download report (Preview). You can only download reports from PowerBI.com if the report was originally authored in the Power BI Desktop program. If the report was made using the PowerBI.com service, the download report option will be grayed out.
When copying from one report to another, you have to use the same tab in either Internet Explorer or Chrome. For this demo I used Google Chrome and was able to copy and paste the visual between report pages.
I hope you enjoyed this little trick. Please share it with someone else that might find this valuable.
Now that we’ve solidly gotten through the basics in terms of what connection types are in the opening blog, found here, and detailed out what is included in the default connection type of Import found here, let’s get on with some of the more interesting connections.
Direct Query is the first connection type that we will discuss that extends, but at the same time limits functionality in the tool itself. In terms of data access this connection type allows us to access our data in the source system. The supported data sources for direct query can be found here. This is distinctly different than what we observed in the import method. When using Import, the data is a snapshot and refreshed on a periodic basis, but with Direct Query it is live. “Live” means that with Direct Query the data stays in the source system and Power BI sends queries to the source system to return only the data it needs in order to display the visualizations properly. There are some pros and cons in using this connection so it is important to understand when you might use it, and when you should probably avoid this connection.
Cons:
When Direct Query is used you can no longer do many of the data mashup actions in the “Edit Queries” section of Power BI. It is assumed that you will have already done this in the backend. You can do simple actions such as removing columns, but don’t expect to be able to manipulate the data much. The Query Editor will allow you to make transformations, but when you try to load to the model you will most likely get an error that looks something like this
The data tab is also disabled in the model layer and thus you need to make sure that all the formatting and data transformations are completed in the source.
You can do some minor adjustments to format, but this could be a heavy restriction if you don’t have access to the data source.
There are performance impacts to the report that need to be taken into consideration. How large is the audience that will be interacting with the report? How busy is the source system, are there other processes that could be impacted?
Troubleshooting skills in source system language
Multiple applications required to adjust data ingestion and formatting
Pros:
The Direct Query connection does not store any data. It constantly sends queries to the source to display the visuals with the appropriate filter contexts selected.
In the November 2017 release there is a new capability in Power BI that allows you to reduce the traffic and enhance this connection method exponentially. The feature is called Query Reduction, and it allows you to enable an “Apply” button on a slicer or filter. The benefit of this option is that you can make all your filter selections before Power BI executes the query. Before this feature was available, every selection you made would fire off a query to the source database. To enable this feature, go to File -> Options and Settings -> Options -> Query Reduction, where you will find these options to help with Direct Query performance.
Note: This enhancement greatly increases the performance of a Power BI report against the data source, but be aware that there could be poor query performance, or aspects of the solution that would require troubleshooting in the data source depending on what queries are being passed. This would require understanding of how to performance tune the source.
Deployment of the Direct Query connection requires the use of the On-premises Data Gateway (previously called the Enterprise Gateway). Note that this gateway is different than the Personal Gateway.
No data is ingested into the model using Direct Query; thus, there is no need to schedule a refresh. Once the dataset is connected to the Gateway, the data source feeds information to the report as the user interacts with it.
It will always show the latest information when you are interacting with the report.
Direct Query is a powerful connection type in that it produces the most up to date data. However, as we have seen, it does come with some considerations that need to be taken into account. The Pros and Cons of the connection mostly revolve around whether or not the end user can understand and deal with potential performance issues, updating data retrieval processes, and understand the downstream implications of a wider audience. Typically, Direct Query is used in extremely large datasets, or in reports that require the most up to date information. It will most likely always perform slower than an import connection and requires an understanding of tuning and troubleshooting of the data source to alleviate any performance issues.
While on a recent project I needed to build a variation of the DAX date table. In my previous post, Creating DAX Date Tables (found here), I built a date table with every day of the month for multiple years. I’ve only ever needed to consume a fully populated date calendar, but in this instance, because the data I was collecting was already aggregated to the first of the month, I only needed a date calendar with each month’s start date. After some playing around with my previous DAX functions I think I was able to come up with an elegant solution.
Let’s get into it.
Let’s begin by making a basic table. Open Power BI Desktop, on the Modeling ribbon click New Table.
Enter the following code:
Dates =
GENERATE (
CALENDAR ( DATE ( 2017, 1, 1 ), DATE ( 2017, 12, 31 ) ),
VAR currentDay = [Date]
VAR day = DAY( currentDay )
VAR month = MONTH ( currentDay )
VAR year = YEAR ( currentDay )
RETURN ROW (
"day", day,
"month", month,
"year", year )
)
This will produce our standard table from January 1st to December 31st, 2017, with every day populated in between. In addition, we get numerical values for Day, Month and Year.
Note: This table is producing a list of dates using the Calendar DAX function then iterating row by row to calculate each numerical value for the Day, Month and Year.
Add the Table visual with the following Values from our created DAX table.
Note: When you drag over the field labeled Date the field will be naturally added as a Date Hierarchy. To remove the Date Hierarchy, you have to click the little drop down arrow next to the word DATE and select the word Date in the drop down menu. This will remove the Year, Quarter, Month and Day date context from the visual.
The date calendar we made has every date, but we want only the first of each month.
Let’s build a new table by following the previous steps and adding the following:
Start of Month Dates =
GENERATE (
GENERATESERIES(1,12),
VAR inc = [Value]
RETURN ROW (
"date", DATE(2017,inc,1)
)
)
Add the Table visual to the report page and add the following fields:
Note: I already removed the Date Hierarchy using the instructions listed above in the previous note.
This new DAX date table first generates a list of numbers, 1 to 12, one for each month. Then it iterates through the list and produces a date using the DATE function, where we manually provide the year and the day. You can see the GENERATE function produces a column of data called [Value]. The variable denoted by VAR inc captures the number for each month. So, what if we want to produce more than one year of dates? Super simple: just change the GENERATESERIES from 1 to 12 to 1 to 48. Because the DATE function rolls month numbers greater than 12 into subsequent years, this will produce four years of dates.
Change your Start of Month Dates to the following:
Start of Month Dates =
GENERATE (
GENERATESERIES(1,48),
VAR inc = [Value]
RETURN ROW (
"date", DATE(2017,inc,1)
)
)
With one number change we can produce 4 years of dates.
Cool, let’s go a little further. Just in case we need it, we can also produce a list of dates that contain the end of the month. Extend your Start of Month Dates table with the following DAX (don’t forget the comma on the first line in the ROW function):
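The exact code from the original isn’t reproduced in this text, but a sketch of what the extended table could look like, using the EOMONTH function (the “end of month” column name is my assumption):

```dax
// Sketch: start-of-month table extended with a matching end-of-month column.
Start of Month Dates =
GENERATE (
    GENERATESERIES ( 1, 48 ),
    VAR inc = [Value]
    RETURN ROW (
        "date", DATE ( 2017, inc, 1 ),
        "end of month", EOMONTH ( DATE ( 2017, inc, 1 ), 0 )
    )
)
```

EOMONTH with a second argument of 0 returns the last day of the same month as the supplied date.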
We have added a new column to note the end of each month.
Well, thanks for following along. In my use case this start of month date table was exactly what I needed. I thought this was a handy little DAX table, and I hope you have found this helpful as well. Be sure to share this post if you found this helpful.
One of the really cool features contained within the PowerBI.com service is the ability to monitor how often your dashboard or report is being viewed. You’ll find this feature by opening up either a Dashboard or a Report, then clicking the button called Usage Metrics. Clicking this button will generate a custom usage report. For more details on Report Usage Metrics see the following article from Microsoft.
Note: In order to see the usage metrics report you must be able to edit the report and have a minimum of a Power BI Pro license. Also, the usage metric report only captures the last 90 days of report usage.
The sad thing is that this report contains usage metrics for only the Dashboard or Report that you opened. The report is also read only and cannot be modified. However, there is a way to fix this and see all the usage for all the Dashboards or Reports within a workspace.
Let us begin.
You will first need to log into PowerBI.com, once you have logged in navigate to either a Dashboard or Report. To open a dashboard or report start by opening a workspace in the left navigation bar. If you are already in a workspace you can open a Report or Dashboard by clicking on the Dashboards or Reports headers in the main selection area of the workspace.
For this tutorial I will use a report, but the same steps will work for both the Dashboard and Report usage metric reports. Open up the report that you’re interested in viewing metrics for. I am using my report called Home for this example; this report is used on PowerBI.Tips, and you can view the report here if you’re interested. Now that we have opened the report, click on the Usage Metrics link in the top navigation bar. This will open up a usage metrics report. This report is read only and does not allow changes.
Here is where we get sneaky… If you observe the URL for the usage metrics report it looks like the following:
Notice the middle of the URL, where it states “?filter=Reports”. This means the Usage Metrics report has been filtered to only one report in the workspace. The report filter context is passing the Globally Unique Identifier (GUID) of the selected report down to the Report Metrics. This is good news, because knowing this we can modify the report and remove the filter, thus allowing visibility to all the reports in a workspace.
First we will need to save a copy of the report so we can make changes. With the Usage Metrics report open click File then in the drop down click Save As.
This will save a copy of the report into the workspace. Notice we now see in the Power BI header bar that the report has been named Report Usage Metrics Report – Copy. Also we can now see an Edit report button.
Click on Edit report to start changing the report. As soon as you open the report you can see in the visualizations pane that there is a Report filter applied. Remove this filter by Clicking the little Grey X for the ReportGUID in the Report level filters section of the visualizations pane.
After removing the report filters, we can see all the data from all reports.
Let the modifications begin.
On the left of the report we have a report page slicer. This allows you to see the activity on one page of a report. Now that we can see all the reports across the workspace filtering only pages of each report doesn’t make sense. We need to add an additional slicer to select reports we are interested in.
Select the Report Page slicer, copy it using Ctrl+C, then paste it using Ctrl+V. We should now have two slicers on the page. Select the top slicer, named Report page. Change the field of the top slicer from ReportPage to DisplayName. The DisplayName is found under the Reports table in the Fields pane.
Notice even though we changed the slicer information that the slicer title did not change. We have to manually change the title description in the display settings. Click the Paint Roller to open the display settings. Expand the Title section and change the Title Text to Report Name.
The title of the slicer visuals is now changed. Sweet!
Let’s move on to modifying some of the standard visuals. Select the visual titled Views per day and change it to the following settings:
Note: We changed the Axis Date column field. We removed the date field from the views table and added the date from the Dates table.
This allows us to see the number of views per report over time. Let’s clean this up a bit. Change the settings of this visual by clicking on the Paint Roller. Start by turning Off the Legend, then open up the Data Colors and click Revert to default to return the visual to its normal colors.
Now, lets modify the Unique viewers per day.
Again, the formatting of this visual isn’t great, so let’s modify it. Click on the Paint Roller again, turn Off the Legend, then open up the Data Colors and click Revert to default to return the visual to its normal colors. Finally, change the visual type from Stacked Column Chart to Ribbon Chart. Your visual should look like the following:
This visual will show you which report has the largest viewing audience.
You might have noticed that in both of these visuals I’ve been removing the legends. Which means, you don’t know which report is represented by each color. We will fix that by adding a final visual. Add the Bar Chart visual from the visualization pane. Add the following field names to the visual, as shown below:
Next, we will format the visual to clean it up. Make the following changes: toggle the Legend to Off, toggle the X-Axis to Off, toggle the Data labels to On, toggle the Title to On and change the Title Text to Report Views; finally, change the Font color to Black and the Text Size to 14.
We want to sort the reports not by name but by how often they are viewed. To do this, Click on the ellipsis and from the drop down Select sort by ViewsCount.
Now we have a custom Usage Metrics Report.
Be sure to save the file. Click on File then in the drop down Click the Save button.
You will notice that our modified Usage Metrics Report will be saved in the Workspace.
Thanks for following along. If you found this tutorial helpful please share it with someone who will find this valuable.
Power BI’s default connection type is Import. In fact, if you have never dealt with a data source that supports multiple loading methods, you may never realize other methods exist, because Power BI automatically connects via Import. However, if you’ve ever sourced data from databases or models, you have seen the option to choose Import vs. Direct Query or Live Connection.
Note: This is a continuation of the Power BI Connections series. If you would like to read the overview of all the Power BI Connection types you can do so here.
Below is a quick chart to outline some of the considerations to help you decide whether import is right for you.
Import is the only connection type that brings to bear the full capabilities of the Power BI Desktop. As you move from Import to Direct Query to Live Connection, you trade off ease of use for solutions that will scale.
Import will pull in the data from the data sources that you have connected to and store & compress the data within the PBIX file. The eventual publishing of the PBIX file will push the data to Azure services supported in the Power BI Backend. For more information on data movement and storage see the Power BI Security Whitepaper.
When using import, the full Edit Queries suite is available to mash up any data source, transform data-sets and manipulate the data in any way you see fit.
Once you click Close & Apply, the data is loaded into Power BI’s in-memory Vertipaq engine.
Note: The Vertipaq engine is used in both Excel and SQL Server Analysis Services Tabular models. In simple terms, it is the backbone that compresses all your data to make it perform extremely fast when visualizing, and slicing & dicing. For more detailed information on the engine see an excerpt from Marco Russo & Alberto Ferrari’s book “The Definitive Guide to DAX: Business intelligence with Microsoft Excel, SQL Server Analysis Services, and Power BI” found here.
At this point the model is ready for you to extend by building out the relationships between your objects in the model section. After the model is set up, you can add any additional calculations in the DAX (Data Analysis Expressions) formula language. There are two types of expressions that you can create: measures and calculated columns. To create these, go to the Modeling tab and select the appropriate option; the formula bar will then display. You can also right-click on any column or field and select “New measure” or “New column” from the drop-down list.
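To make the distinction concrete, here is a minimal sketch of each expression type; the Sales table and its columns are hypothetical names used only for illustration:

```
-- Measure: aggregates over whatever filter context the visual provides
Total Sales = SUM ( Sales[SalesAmount] )

-- Calculated column: evaluated row by row and stored in the Sales table
Gross Margin = Sales[SalesAmount] - Sales[Cost]
```

The measure is recalculated dynamically as users slice and filter, while the calculated column is computed once at refresh time and consumes memory in the model, which is why measures are generally preferred for aggregations.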
Other than the formula bar with IntelliSense, there are several built-in tools that can help you build those calculations.
The first method is to Right Click on the desired field and select an implicit calculation from the drop down:
The second is Quick Measures. This can be accessed by using right click as described above.
Here is an example of the Quick Measure dialog box:
Quick Measures allows you to choose from a wide variety of calculations to generate a measure. Once the measure is created, you can interrogate the measure and see the code that was generated. Click on the measure (denoted by a little calculator next to the text) created by the Quick Measure dialog box to see the DAX code.
Here is an example of what that looks like:
This is a great method to get your feet wet while you’re learning DAX.
Note: there are a number of safety features added to these Quick Measures, such as an “IF” statement wrapped around an “ISFILTERED” check. You might have to remove these bits of code in order to experiment with the measure.
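As a rough illustration of that pattern, a time-intelligence Quick Measure tends to generate code shaped like the sketch below; the exact output varies by Power BI Desktop version, and the table and column names here are hypothetical:

```
-- Sketch of Quick Measure output: the IF/ISFILTERED wrapper is the
-- safety feature mentioned above, guarding against unsupported filters
Sales YoY =
IF (
    ISFILTERED ( 'Date'[Date] ),
    ERROR ( "Time intelligence quick measures can only be grouped or filtered by the primary date column." ),
    CALCULATE ( SUM ( Sales[SalesAmount] ), DATEADD ( 'Date'[Date], -1, YEAR ) )
)
```

Stripping the outer IF and ISFILTERED leaves just the CALCULATE expression, which is the part you would typically keep when adapting the measure by hand.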
When you have completed your report and published it along with its corresponding dataset to the Power BI Service, you will need to schedule a refresh. This is required for any report that relies on the Import connection. There are numerous use cases that determine whether or not you need a gateway, but a simple rule applies: if the data comes from an on-premises source, you will need one; for cloud sources you usually do not. You can find in-depth refresh documentation here.
The Import connection has the fewest restrictions of the three methods: Import, Direct Query, and Live Connection. However, there are a few Import restrictions you should be aware of.
First, depending on your data source and the size of the dataset, processing the model can take a while.
Second, since all the data is loaded into the model, there is a limit on how big the file can get and still publish successfully to the Power BI Service. That limit is 1 GB for free and Power BI Pro users, 2 GB for Report Server reports, and for Premium users the size is bound only by the amount of memory you have purchased.
Note: The PBIX file itself can grow as large as you want; however, the Service won’t let you publish it once it exceeds the limit.
Using Import is good when:
You can schedule your data to refresh
Data only needs to be refreshed periodically
Data can be refreshed on a schedule up to 8 times per day (a Power BI Service restriction)
The amount of data you’re importing is relatively small (doesn’t need to scale)
You need to mash up multiple sources, such as an Azure SQL Database and Google Analytics
In summary, the Import method is the most flexible: it provides all the tools to connect, mash up, extend, and visualize your datasets within the Power BI Desktop. It is likely the most used connection type and is the default for all connections. The data you connect to is drawn in, and a copy is created and used in both the Desktop and the Service. Scheduled refresh is a requirement for almost all scenarios, and a gateway is likely required as well if your data is not located in the cloud.