
Too Many Details? – Ep. 233


Report consumers almost always want one more click of detail—until that “detail” turns your dataset into a wide, high-cardinality monster that refreshes slowly and feels sluggish in every interaction.

In Episode 233, Mike, Tommy, and Seth respond to a listener question about the right level of detail in Power BI: when to model and import detail, when to push it to drillthrough/detail pages, and when to route users to a more appropriate system (without breaking their investigative workflow).

Main Discussion

The question isn’t “should Power BI ever show detail?”—it’s what kind of detail users really need and what that implies for your semantic model. Sometimes “detail” is a breakdown (dimension), sometimes it’s a definition (why did the number change?), and sometimes it’s truly record-level rows. Each one has a different best home.

Episode 233 is a good reminder that VertiPaq is phenomenal at filtering and aggregating, but you still have to design the model with intent. If the dataset becomes the dumping ground for every possible attribute and note field, you’ll feel it in size, refresh time, and a report that gets slower the more it’s used.

Key takeaways:

  • Start with the decision you’re supporting and define the grain of the model; don’t let ad-hoc “just in case” columns set the architecture.
  • When a stakeholder asks for detail, ask “detail for what purpose?”: reconciliation, exception investigation, auditability, or just curiosity.
  • Keep the main model lean (dimensions + measures) and use drillthrough/detail pages to provide context-preserving investigation.
  • For high-cardinality “reason” fields or notes, consider keeping them out of import models and surfacing them via a dedicated detail experience (DirectQuery/composite, paginated, or the source app).
  • Use performance as feedback: wide tables, duplicate attributes, and ambiguous relationships usually show up as both model bloat and complicated DAX (see the size-inspection sketch after this list).
  • If you must show rows, design for it (limited columns, proper indexing upstream, and careful visuals) so “table detail” doesn’t become the default user experience.
  • Don’t break the user’s flow—carry filters and context into the detail path so the narrative stays intact.
  • Treat licensing/capacity constraints as part of the design conversation (size drives cost, and cost drives sustainability).
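
One practical way to act on that "performance as feedback" point is to measure where the space in the model is actually going. Here is a minimal sketch, assuming a recent engine version that supports the DAX INFO functions (otherwise VertiPaq Analyzer in DAX Studio or Bravo reports the same numbers):

```
// Run as a DAX query (DAX query view in Power BI Desktop, or DAX Studio).
// Returns one row per column with its storage footprint; sort the output by
// DICTIONARY_SIZE to see which columns (often free-text notes) dominate the model.
EVALUATE INFO.STORAGETABLECOLUMNS ()
```

If a single notes or reason column accounts for most of the model, that is a strong signal it belongs in a drillthrough or detail experience rather than in the import tables.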

Looking Forward

A little discipline on what goes into the model—and a great drillthrough experience for what doesn’t—goes a long way toward keeping reports fast and users happy.

Episode Transcript

0:09 [Music] Good morning everyone, welcome back to the Explicit Measures Podcast with Tommy, Seth, and Mike. Hello everybody. Hello Mike. Hello. I'm a bad tagline. Hello Seth. Hello Mike. Happy Tuesday, gentlemen.

0:40 It is jumping in today. For those of you who are listening, this is a recorded episode, so this won't be live. What we do want to talk about here, something we haven't done recently, is drop in a mailbag: someone has asked a question, a pretty interesting one, so we'd like to address it. I think this question comes from Stuart, and it's a bit of a longer question, so maybe you want to give some high-level bullet points on its key points.

1:10 Yeah. First, Stuart, thank you for the question; love the detail, love the nuance. Sometimes we get questions that are just one-liners, which, hey, we can take anywhere. In the cases where you specifically want us to talk about a particular topic, Stuart, I think we're dialed in here. In an abbreviated form, Stuart says: I'd like to hear your thoughts on what level of detail to include in Power BI reports and datasets.

1:42 I think the consensus is that Power BI is a tool used for aggregation, so not the place for record-level detail; VertiPaq columnar databases are basically designed for that type of work. However, we get requests all the time from users asking to show details in reports. He gives a specific use case where he needs a reason field behind something like a price or modification change to a value produced in the report.

2:12 He talks about going in and out of other applications or tools to get detail, instead of getting that level in Power BI, if we were going to remove it from there, and he mentions SSRS. SSRS, he says, takes people out of their flow. So he has a concern: if you're going to use something outside of Power BI to show detail, is it too disruptive to the end-user experience, where they're interacting with the Power BI report and then have to go somewhere else?

2:44 So ultimately he asks: do you think it makes sense to just show low-level, high-cardinality type detail in the report? What are your alternative solutions for bigger models? And for reference ("large model" is subjective), he says small models are less than three gigabytes, which he thinks covers most models in the real world, and I would agree.

3:17 But we can talk about the large model stuff as well, because where do you put details, and do they belong in a Power BI report? Yeah, thanks Stuart, great question. So, a couple of thoughts off the cuff, just reading through this and thinking through some items. The first thing that sticks out to me is that I think I understand some of the modeling challenges you're facing.

3:47 The balance of what the users are asking for versus what you're trying to produce in the report is a constant challenge, and you're right: the data models inside Power BI are great at two things, filtering and aggregating. If you have a lot of extra stuff in your model that's causing bloat, and your DAX measures are getting really complicated, maybe it's time to go look at the design of your model; maybe there's something you want to fix there.

4:17 My second observation was that I wouldn't necessarily consider a three-gigabyte model a large model. I think that's getting up there; it's definitely making you make some decisions, because you're now having to get out of the Power BI Pro license, so there's a pricing decision point around that level. But in general, I think models run a lot larger.

4:47 The interesting thing in this question is that they have, and I'm not sure if it's this one data model, around 2,000 users a month. So maybe not super large data, but lots of users, and that creates its own unique challenge for datasets. So first off, as you say, whether that's a large model could be its own internal debate. I would love to see a bell curve of who thinks three gigabytes is a large model; I honestly think the majority of people would say it's on the larger side, based on what people are building.

5:18 Is it large in the scheme of things? Probably not, but I think the majority of people are dealing with models of that type. What's interesting too is the bloat. A lot of the things that slow models down are datetime columns, email addresses, or lots of transaction IDs on a sales table: high-cardinality columns, right?

5:49 Yeah, right. This has been a conversation we've had multiple times, but I don't think we've really dived into it as a topic itself. We're always talking about what we give the users, and I always think, well, we don't need to give them the granular detail. But as you were going through this, Seth, I was thinking about why they came out with Power Apps and why they came out with Power Automate. Those are non-aggregated applications to be built into Power BI.

6:20 The Power Apps visual and the Power Automate visual in Power BI: the purpose of those isn't aggregated numbers; it's usually something granular that you can send back to the source. I'd agree with that. Those are visuals that typically interact with row-level details of data; that's the purpose.

6:51 Yeah, so I think there are two points. One: in his final remarks on small models, he's not saying three gigabytes is large; less than three gig, he believes, and we agree, covers the vast majority of models in the world. Do you think it makes sense to just show low-level, high-cardinality type detail in the report? I want to answer that in two ways. One is about cardinality: a free-notes reason column on every row of data is certainly going to be that kind of column, right?

7:23 Sure. And the challenge with columns like that is that they consume the vast majority of space in the tabular model. Even if it's under three gig and the rest of the model is well designed, with all your values compressing highly in the columnar store, I've seen models (and for reasons, some I own) where 80% of the model is that one column, because it can't be compressed.

7:53 So I think that's part of it. But the second part here, and I'd be interested in your thoughts too: I don't think I've ever taken a stance that says you can't do detail in a Power BI report, never. The reason you typically avoid it is that it takes a while for that data to render and be returned.

8:24 So you optimize it as much as possible where there's a real user requirement. I'm not saying never do it. Where I lean more toward "never do it," or "don't," is where people want to dump out massive amounts of row-level data and use Power BI like a paginated report.

8:45 That is where I draw the line. It's going to fall apart, especially when you throw 2,000 users at it. If everybody's trying to do that type of reporting, it's not going to go well, and that's just a performance thing. So those are the things that pop out to me initially from his request: where we're at as far as cardinality, and whether I can or can't show details in a report.

9:17 You bring up a really interesting point; I want to tag onto your last comment. I think it really depends on the use cases of the information in the report, and I would agree with you. I would never push someone away by saying, ah, you have too much detail, you need to summarize more; I'd let the model grow to its correct size. But I would really want to ask what happens when users see the information that is there. I think there are a couple of outcomes from interacting with the data.

9:47 If I need to go change a single record, or update it, like a data quality report, I'll need links to go to a system that can change the information. So if the detail is there so I can clean, correct, and look at information, then to Stuart's point that sometimes it's helpful to go to source systems: yeah, I agree, and you should probably think about whether there's a way to deep link into that source system.

10:18 Here's the URL you'd go to: click this link and it takes you to the product, or the detail of the thing you're looking at, and you can make your changes there. Granted, the data has to be refreshed for that change to show up, but at least the actions that come from seeing the detail are known.
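
For reference, a deep link like the one just described can be built as a simple measure whose data category is set to Web URL, so table visuals render it as a clickable link. A minimal sketch; Sales[TransactionID] and the CRM URL pattern are invented for illustration:

```
// Hypothetical names: Sales[TransactionID] and the URL pattern are placeholders.
// Set the measure's data category to "Web URL" so table visuals render it as a link.
Source Record Link =
VAR CurrentId = SELECTEDVALUE ( Sales[TransactionID] )
RETURN
    IF (
        NOT ISBLANK ( CurrentId ),
        "https://crm.example.com/records/" & CurrentId
    )
```

SELECTEDVALUE returns a value only when the visual's filter context isolates a single transaction, so the link shows up on detail rows and stays blank on aggregated ones.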

10:49 Another thing you point out there, Seth, which I think is a great point: what if you need to data-dump a bunch of things? What if the detail is in the model just so people can get a table of data and hit export? I'd also agree there: the model might need to have the records in it, but revealing that table of data inside the Power BI report is probably not ideal. In the question it was noted there's SSRS, or paginated reports; I would point to those as more relevant for getting tables of data that you need to export. It's easier, it's better on the model, and it's more efficient on the back end; dumping an Excel file from there is much easier.

11:19 So for me, if I were in this situation, I would ask: what do we do with the detail, and why is it there? There are two very good points there, and I'll speak to the first thing you said, Mike. We actually did a lot with Power Apps and Power BI report integration, both within the report itself using the visual, which worked incredibly well, and linking out to a Power App, and that's where they did a lot of the source changes.

11:49 Granted, in both of those situations we weren't dealing with a three-gigabyte model; it was much smaller, but it worked very well. And I think you're speaking to something: not just what they do when they get there, but, for me, how they get there, to the rows and rows of data. Is it a report where the first thing they land on is a table of records, or, in a lot of cases what I think solves it better, is it a drillthrough where some selective filtering has already occurred before they get to the detail pages?

12:20 Usually any page we've had that holds records of detail, whether they're going back to a product page or, for trust, just making sure of what they're seeing (we've had an audit page), is always reached from a drillthrough page, and that has served a lot of purposes.

12:50 Including from a trust point of view. But for me, it starts with how they're getting to that drillthrough page. There are a lot of reasons and a lot of things they can do from there, but it's the how; I wouldn't personally open it up as one of the main pages someone lands on. Oh man, two more great points there, Tommy. That's a key point I was going to make later too: how would you do this if you needed to? And I think Stuart outlines the fact that he has this setup.

13:21 I'd have to reread parts, but I thought it read such that: I have a table of information showing the prices, or whatever, related to a particular row, and somebody can drill through to the detail, which is the notes on that particular price. Well, that is the best way to get at the detail, because you're using the columnar indexing to drill down into a very minute area.

13:52 You're using those big filter indexes to get to the detail, as opposed to your previous point, which is where I do see a lot of performance problems: anybody opening an up-front page where the whole table is shown. All of that detail is going to take a huge amount of resources. And if you're on the Power BI service and you have 2,000 users handed a report like this, you're going to kill it, man.

14:22 Because it's got to render everything on those tables, for everyone, every time the page gets opened. So a great way to control that usage of the model is drillthrough, because you're only expanding down to the detail in specific slices, with all of that pre-filtered indexing on it.

14:53 you’re speaking about this with the exporting data or the general use case I think if you don’t have a call to action on those detailed Pages you’re doing yourself an immediate disservice and and I’ll speak to this a bit where usually people will look at those audit tables if there is no General action that they need to do there’s a product they’ll need to check or someone they need to contact they’re going to find something eventually that’s not right and I think a lot of times they’re going to blame the report they’re going to blame the model but I we what we’ve done

15:26 blame the model but I we what we’ve done in a few use cases is we actually took those pages and said no go go look through it and if you find something wrong well speak to this operations or speak to X Y and Z T team because that’s where the information is coming from we didn’t know that column X you from we didn’t know that column X should only have four allowed know should only have four allowed values that’s not our role but if they just go into a page with no no call to action then generally they’re going to blame bi and go we found this error or what they’ll call an error or

15:56 error or what they’ll call an error or this bug right right sorry everyone will always plan B yeah the reporting team first yeah about dating everybody have you been working in this industry like yes yes I’m just saying preventative things that’s why I’m bringing up where there is a lease on the page itself like see any bugs contact cure or go to this ticket basically and at least a lot of them know like we know too it’s not perfect well it’s also making sure that you have data stewards right so yeah the the data so the data engineering there’s

16:28 the data so the data engineering there’s definitely multiple teams involved when you get to this level right we technically don’t own the data coming from the source system so yeah if the data is coming in bad the data is going to render bad in the reports and I’ve had a number of times where we’ve had to go through and they’re like oh this number doesn’t look right yeah it’s not right because if you look Upstream like let’s Trace down the problem like let’s go through the measures okay here we are data at the source system is actually wrong so that actually brings up another point we talk about what are the use cases of why we need a lot of

16:58 the use cases of why we need a lot of data and typically they are I’m taking data or the actions are not clearly defined right I just need it like so there’s there’s Two Worlds I think we play in I need insights on data which is talking about aggregations and rolling up data and then there are I need access to the data so insights and access if I want access that typically means let me select a couple options a filter or two and let me dump out the data to a flat

17:24 and let me dump out the data to a flat table table right so the access to that data is a different request because one people don’t trust it maybe maybe there’s they want to do their own reporting offline they’re trying to get access to the information and putting it somewhere else so they can like Leverage and use that information one of the main challenges I see when you talk about use cases of why we need lots of data or lots of tables of data one of them has recently come on my radar is around people doesn’t trust the data there’s a trust issue and so by having more information in the

17:55 and so by having more information in the report they could go down to the detail and they could match what was in the operational system against to the report so they could actually say yes this record transaction is in fact pulling the right data from The Source system it is showing me the information that I need and so we were keeping too much data in the model just because a team needed a quality check something so I think it’s another use case that you’ll find here of why people are asking for all the information is so they can trust the data they can get down to the detail and they can say yes this does make sense

18:25 Well, allow me to ask my deep-cut question, because you raise a good point: is it inherently a good thing, does it inherently build trust, when you show the rows of data, or is it a hindrance when you think about the long term? I don't think it's ever bad to show more data; I don't think you hinder anything by showing more information. I think you have to let people get comfortable. It's about trust: if you tell them you're going to aggregate the information, they have to be able to trust that information at some point.

18:55 Let me rephrase that real quick: is it always a good thing to show all data to all people? I think that's an okay question to ask. If it's in the model, I would say you should be able to access it. So I think the answer is that you have to determine, and again this is where I lean on the output of the report, what the desired outputs from the report are. The actions drive the level of granularity that you need.

19:25 I was listing situations where I've seen lots of data records or a lot of detail being shown. One was: I have data from many, many customers, and I need a model that rolls all customers' data up into a bigger model; that can get really large. Another version is financial: I have invoices, with headers and line-item details, and I need all of that in a model. That can get really expensive, because every single line item becomes a row of data, and with lots of invoices, or lots of small amounts, in your organization, you could be generating millions of rows a month.

19:55 In those cases, the question usually comes down to: what is the need, what is the real output of the information? Are we talking about genuine summarization, or are we actually trying to quality-check or look at the details of things? So again, I think the answer is: it depends.

20:27 Okay. When you say trust issues, and you're pulling additional data in: is that a case where it's in a model and you display it for data validation in a different report, or are you burying it in the report? Where I've seen it used recently in this scenario: we have aggregated numbers, people look at those numbers, and they want to go down to the level of detail, saying, hey, I'm looking at this product on this date for this team. You're going all the way into the weeds, because someone is trying to check the numbers.

20:58 And again, I wouldn't put it on the main page; like you said, Seth, there's a whole methodology to making the report run efficiently. But when you have another team reviewing the model, as they get deeper into it, they're trying to get to that detail level, because they have to match it to something they can actually see.

21:33 Yeah, I guess my question is: when would it be a good idea not to have to do that in your model? Do you pre-aggregate numbers in the source system, and those become the numbers you check against, the ones you build upon in Power BI? Because, a hundred percent, in a lot of the use cases we're building DAX calculations and rolling things up; we need raw data. One of the challenges in there, to your point, is: I want to validate the aggregated number you're producing, so I have to provide the detail.

22:05 Of course, the alternative is removing that from the model, and this is probably dependent on model size or data volumes: push it back into the source and have a different layer. Okay, here's the pre-aggregate; all I have to show you is that my additional calculations on top of the pre-aggregated numbers tie out, as opposed to pulling all the details forward into the model to prove it out.

22:37 And I'm not saying it's not a worthwhile exercise, because it absolutely is, especially in finance; you have to tie things out, and you can only do so much with documentation without a validation check somewhere in there: this is the calculation we're doing, here's how it rolls up, and so on. So I'm just interested in where the break line is. Do you always do that in the front end, in detail, in the report model? Or is it a volume issue at some point, where you say, hey, we need to pre-aggregate some of this, and that's where I'll point you for a lot of your data validation checks?
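
A tie-out check like the one being described can be as small as two measures, assuming the upstream layer exposes a pre-aggregated table you can import alongside the facts. A sketch with invented names (SourceAggregates[PreAggAmount] is the hypothetical upstream total):

```
// Compare the model's own rollup against an imported, upstream pre-aggregate.
Tie-Out Delta =
SUM ( Sales[Amount] ) - SUM ( SourceAggregates[PreAggAmount] )

// Tolerance guard for rounding; flag anything material for investigation.
Tie-Out Status =
IF ( ABS ( [Tie-Out Delta] ) < 0.01, "OK", "CHECK" )
```

Put these on a hidden or drillthrough validation page and you get the "my calculations tie out" evidence without carrying every detail row in the model.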

23:07 and and that’s where I’m going to point you for a lot of your data validation checks let me give you a very common scenario that occurs too when you go into those rows of details when you don’t have the data stewards if if they find a role that one of the columns is wrong like let’s say the sales rep or the country or the account number well the immediate thought without anything usually set up like a maturity from a Governor’s point of view is well if this is wrong how can I trust anything else is wrong too or how do I know something you started wroting trust right away

23:37 you started wroting trust right away this is the immediate action from there without a good setup system is well I can’t I can’t trust this report now because I’m like I’m responsible for the numbers in here or I’m my performance is tied to this and now it’s showing the wrong thing so and I have thousands of rows in here so I don’t want to be judged on this so therefore that there’s a there’s a there’s a pretty big windfall from that and that honestly it leans towards the other way well let’s let’s go back

24:07 the other way well let’s let’s go back to Seth’s question there so I’m going to go too far off the route off a path here but but your your question was what’s the right break point of quality checking where does that live and I think it depends Seth I think that’s a great question I think that’s a many if I think about that question that is a place where organizations need to identify what data this is is this something that we’re going to certify is it something that’s going to be from the central bi team or is it something that a team or Department level is doing who’s who needs to own this looking forward and to some degree

24:39 this looking forward and to some degree right if you have team members that are able to build again if we think about the the data comes from a source system it’s usually not in the right form so there’s a whole bunch of engineering or a black box of engineering for shaping data we’re making fact scheme or Star schema type stuff right you’re trying to simplify the model a little bit so that it becomes easy to consume as a report consumer consumer if you have enough time or capability in your team you can do data quality checking Upstream inside that data transformation stage right and

25:10 that data transformation stage right and so I think when I when I look at this going okay if I think about a certified data set typically my M queries or power query in desktop are very simple select this table here’s my incremental refresh policy and that’s it the the state so if I’m doing the data engineering outside of power bi and that’s happening Upstream somewhere then the quality checks could start Upstream as well and you can do some quality checks there initially to get people’s confidence and maybe you even produce a separate data model with a

25:42 produce a separate data model with a certain amount of information that is detail related so people can trust it to gain that trust and then you in the final report do the aggregation but now you have some compare again so I have like a detail view with some aggregations that I can apply to my final view which my final data model that has existing aggregations in it so that you can say yeah these numbers match we’re good to go however I think the story changes when you start having a lot more data

26:05 you start having a lot more data engineering inside power query inside the models because now there’s a lot more chance for risk of introducing errors into that data engineering that should not be there and then in that case you probably have to keep the detail because you had to vet those power query or those transformations in m that should not I think the transformation shouldn’t be available to all everyone should see that I think

26:35 I don't think the transformations should be visible to everyone, though; that's more for the engineers. If you're doing a lot of the transformations in Power BI or dataflows, those transformations and the choices you're making should be more for the data stewards, because now the BI team is responsible for them. So let me back up a bit. No, no, okay, go ahead. There's a really good situation we had where we introduced a sales report, and it was available to the sales team.

27:05 Immediately there was this backlash: this country's wrong, this account's wrong, and no one could trust the report. The immediate finding was that it was actually the source systems it was coming from; it had just now been revealed how many errors there were. This led to a data contract, a contract on this report from the vice president, saying: if you see something, say something, or change it, please. This is what you're actually getting, and we're going to be using this report; we can't help it if you've input the information wrong.

27:37 So this report became almost an audit report for the entire sales department. This speaks to the point about quality checking: it was a quality check, but it had to be, in a sense, called a contract, an agreement. There was an actual handshake, so to speak, between BI and sales: we're going to show you the information, but you are responsible for changing it. If you do not change it, you will still be reported based on what it says.

28:07 Yeah, I think this brings up an important point that I don't think we've called out very often: protecting yourself as the report author, as a BI team, as whatever. We typically traverse many different areas within the organization.

28:38 And I think it's very important that, unless you're given the time to understand the processes and standards within the organization, if you don't have time, or you're not being asked to test, analyze, and validate the data in those source systems and understand it, but you're just building a report on it, then it's your job to ensure that report consumers understand that it is not validated. It's not certified, it's not anything, until you put it through that rigor.

29:08 This is where you get into "oh well, people don't trust it," and you're right; that just means you're plugging into a source system and throwing a report at people without communicating that. Especially if you're not doing a bunch of transformations. Because it can easily flip the other way, where you start doing a bunch of transformations in Power Query and all of a sudden you're the one screwing up the data.

29:39 Absolutely, that can happen. But validating and analyzing the actual data that's coming in and going into a report, and communicating who the owners of those things are, should absolutely be part of the conversation before you throw something out the door. And we noticed with that data contract that within a month the majority of the issues were handled; the report morphed back into a report rather than an audit report, because people had a fire lit under them.

30:11 where I thought to say backtracking about if you’re doing a lot of ETL and power query the reason why you wouldn’t necessarily show all those records there because to the user they’re not going to make a lot of sense the reason why someone wants to see the detail I disagree hold on so the reason why they want to see the detailed level data is because they can always usually relate it back to a source system they can relate it back to the CRM they can relate back to Salesforce they can relate it back to Something in marketing okay that column to that value now when I am doing an incredible amount of ETL

30:42 I am doing an incredible amount of ETL in power query or an Incredible amount of manipulation like first-time customer second time customers it’s going to be very hard for them to validate that against anything the normal consumer if so I’ll let me reiterate that if I’m looking at detailed level information I want to compare that against the source system if the Transformations are all coming from Power query there’s no way for that user to validate that disagree if you’re if you’re abstracting the data so far away from what’s in the source system that no one can understand what the

31:12 that no one can understand what the information is someone needs to document that and there needs to be a quality process around that to verify what you’re doing fine so it’s not all data to all people though that’s still part of the model yeah no I agree but what do you mean all day to all people I don’t even understand your question your statement there like all day to all people if you build a model and give it to people in a report anyone who’s able to see the report has like like I don’t understand the question I’m losing context here yeah okay so the question I asked earlier about should

31:42 question I asked earlier about should you show if you have a model should you have an audit tab that’s available to everybody you said yes I don’t think we ever asked that question directly so I’m saying one of the use cases of where reports I think I think you’re Miss reading my interpretation of my statement earlier I think my my question is when I find people asking for lots of details of data what they’re trying to do in that experience is to audit the data okay does every report need an audit page maybe does that every does that page need to be exposed to every external

32:13 need to be exposed to every external user they may not understand that maybe not hit maybe it’s hidden somewhere or maybe there’s a separate report hanging off of the data model that’s doing more of that quality checking because that is part of your certified process to be able to make that report page or to make that report certified I think in both cases here we’re still talking about there has again we’re talking about trust in the report we’re talking about getting people to verify that the information that is in the report is accurate and what they say so they can make decisions off of it sure so but let

32:45 make decisions off of it sure so but let me so let’s go back to the question then if you’re doing an incredible amount of transformations in power query what does the average consumer not the steward or not the validation side but the average consumer need to see from from the details that they can make sense of okay one you’re never going to have a Details page in a report that describes in detail all of the transformation everything you’re doing an audit page in

33:17 everything you’re doing an audit page in power bi would in my in my view would be to audit the calculations and things that you’re doing within the model if somebody needs to understand the difference between Source data that they’re going to dump out from a source system and the data that gets aggregated or put in the report that is where documenting the business logic in business terms not in what I’m doing in power query is important for them to understand because if they can’t yes they can’t make the leap from Source system to what I’m showing them in a report without understanding that I’m

33:47 report without understanding that I’m applying all of these business loaded conditions and transformations to get it to a shape where I’m building calculations on top of right and I meant to use the word audit not necessarily details where I’m explaining the Transformations but so things where I feel data communication needs to be very clear where business logic I feel like has been applied that makes us challenging whenever you’re trying to join two tables of data together those joints or how those joints are being handled is a potential weakness right did your join

34:19 potential weakness right did your join actually make sense based on what the the business understands that the relationships of those datas I see a lot of trip UPS there I see a lot of trip UPS there the other one another another element that I see where potentially is challenging is whenever you have an allocation of data right we have like hey there’s a there’s a dollar amount and we need to do to reallocate that money right the source system checks it one way but the business looks at it another way so you take this like large amount of dollars and you have to distribute it

34:48 dollars and you have to distribute it across either multiple people or multiple categories whatever things like that right when you have things that are being like reallocations when I hear when I hear words of oh yeah we’re just going to allocate oh my head just goes nuts because it what is happening is the source system isn’t capturing the data the right way the way the business wants to look at it and someone in a process has said no we’re going to redistribute this data so it makes sense financially or that’s how we look at our business whatever that is whenever people are saying the word

35:19 whenever people are saying the word allocations I get real nervous because I’m like that’s that’s a very usually heavy weighted business logic piece that UPS to your point Tommy that obfuscates what the users see from what in the report to what they see in the source system and there’s very there’s less transferable knowledge between the two systems so I would agree with you in that way right when we’re talking about joining things together when we’re talking about allocations on data those are areas that I feel because more challenges for

35:49 I feel because more challenges for organizations however what I would say though is though is when we’re trying to check against Source system if we’re doing so many Transformations that I can’t get back to a record that’s in source then I would argue we need a better process around what the report is doing because we’re now at a place where you really can’t quality check the report and that that would be a team of people there’s like there’s a quality assurance team you’re doing some QA work on a report and showing people okay how do we get to this number let’s spot check some

36:20 get to this number let’s spot check some things and then as you make changes the model you have a regular way I’m going to use a term here that I think is becoming a bit more popular you have a little bit of a data Ops process how do we how do we operationalize the data how do we regularly show you the data is changing but it’s changing consistent way every single time but this is where I I think and that’s a really good point because this is where I think you start extending outside of the report where we talked about how you start to elevate a data set into a different realm yes where you have

36:51 different realm yes where you have proven out that within power bi you’ve connected these systems you’ve gotten to the point where there’s a lot of a lot of transformation happening and it doesn’t belong there anymore we need to make sure that it’s in core systems that this is how things operate this is how danger danger will bro allocations work yeah right that are not buried in a report correct because they shouldn’t be yes so I think that to me all of a sudden raises that flag of like okay now we need to start talking to a a team to

37:23 we need to start talking to a a team to push this into a a much more formal process where we must build even more rigor around checks and validation to ensure that the data coming through is as accurate as possible what I have found though is when you have the trust built with the bi team teams are more willing to take more raw data and let bi do a lot of those Transformations but Mike yeah I would totally agree with that yeah and that that’s usually where those more com convoluted models come from where it’s not so much the two raw

37:53 from where it’s not so much the two raw data is set to tables where it’s like well we have this API that we’d love to see and also this and but you raise a very good topic that I think is a good question when we talk about the audit tables do you put your Dax in the audit tables or any type of Dax in those audit tables or do you remove all the summarization so because and I’ll say for myself I never put any of the actual Dax calculation if you want a raw audit

38:23 calculation if you want a raw audit table that’s what you get you get the values that are nothing’s summarized usually usually I I’m gonna go my answer this question is going to probably be not a good one okay but I’ll give you one anyways when when the Dax calculations get very complex in those allocation scenarios you you need to like when you’re doing complex filtering and you’re doing you complex filtering and you’re doing other things inside Dax again know other things inside Dax again where I’ve seen this happen again this is a general rule of thumb this is not

38:53 is a general rule of thumb this is not in every every use case when the Dax gets more complicated it’s because the tables and the shape of the data that you have in the table elements are not as easy to use you’re trying to do some extra business logic inside those stack statements and I’ve seen some very complex Stacks things I typically prefer my Dax to become simpler right it’s I’d like I’d rather have the Dax become simple and I’d rather do more engineering on the data to shape the data for what I need inside the report so so that being said right when you’re talking about a quality page

39:24 when you’re talking about a quality page your measures are hopefully doing simple things and there may be a handful of pages or a page or two that is doing some checks around what the measure calculations are doing because you should be able to go to a column drop in a couple categorical values and see the details rolled up and know that that’s the same value as the measure so you may do some things there but again I’ve had very good success if the measures become simpler they’re much easier to check so they’re not really required as much to go into the quality pages

39:55 go into the quality pages and again if these quality pages are meant to go back and help you determine what’s in the source system and match those numbers thinking like invoice detail or something along those lines right here’s the invoice I can literally see the invoice in the source system and I need to verify the total dollar amount on this record or for this customer is the same amount then yes you go there is no measures there it’s just simple right show the values you’re also speaking like rainbows and unicorns and all those dreams that never occur in most

40:26 those dreams that never occur in most business logic or models but no I I agree with your saying we’re an alternative we’ve done is not so much an audit table but where it’s more advanced stacks and people are trying to check that it’s basically Summers I summarize tables of categories countries or months of that Dax calculation just so they can say hey Karen had this Joe had this for this number of a quick and easy summarized view where they can kind easy summarized view where they can check the high level values but it’s

40:56 But it's almost impossible to put the DAX itself into those raw audit tables; it doesn't work that way, especially in the more advanced scenarios. I disagree: you've got to check the DAX just like you do all the raw data. It's part of the data model; it will calculate things. And as you do more complex filtering, as you try to blank out values that should be blanked when they don't have the right filter context, the filter context matters in the measures. A lot of times I've seen measures created without any level of protection, where the measure only works in one very specific scenario.

41:26 Therefore you have to add other conditions to that filter, so that you don't put the measure on a table, or anywhere else it doesn't belong, your hidden filter on a card, and so on. Yes, I know what you mean.

41:57 So we've successfully, once again, diverted from details into trust and audit. Bringing it back to the conversation around details, and specifically this concept of going from a report to somewhere else: when do you kick people out, where possible? Sometimes you can't push them into a different application, but when possible, when do you choose to kick people into a different application as opposed to using details within the report? What's your threshold, or your use case? I don't think it's a number. I think it goes back to Mike's point at the very beginning: the why, what someone is trying to do on a details page.

42:27 The threshold isn't a gigabyte or megabyte issue; it could be a hundred rows, but if the reason they want to see it is that they want to act on it, then give them that access. I would echo that; I think you're saying the same thing I would, just slightly differently. It boils down to the actions I need to take on that data. If I say I need this table of data with this set of columns, and I'm going to export it and do something else with it, then I would maybe push you to a different tool, or push you somewhere else.

42:58 If I need to see this row of data because I need to go change the record, well, the action is changing a record of data, and that's going to happen somewhere else, so I'm going to launch the person into another application. Maybe I'm doing something fancy where I'm embedding a Power App inside Power BI; fine: I can click a row of data, get the record in Power Apps, and change it right there. Great, super cool.

43:29 But when I'm expecting people to use the insights and information from the data that's in the report, I'm less inclined to route them to another application. I don't agree with that, because even as Stuart's question lays out, what I think is being glossed over is that the end users are already getting value: you've created an aggregation or some insight that is driving them to look at something more specific.

43:59 And that's where a lot of the action is. The default for me, too, would be: yeah, kick them into a system where they can do something. We've shown them, hey, here's your percentage of failure. Great, now that I know my three percent of errors, what are those errors, and why did this happen? Drill through, or, if I need to go fix something because it's broken, link to the application to go fix it.

44:30 The only other use case I can think of where I'd probably kick people into a different application, one that isn't necessarily "I want to take an action," is where the volume of data is so large, or there are different types. Great point. There are attachments, Word docs, or long descriptions, where I'm not going to consume everything out of it. This might be a poor example, but take a Salesforce opportunity: all the fields within that opportunity, its full history, whatever. Just kick them into Salesforce to look at the opportunity and everything around it, because that's going to give them the full picture.

45:01 Especially if there are certain kinds of columns. A reason code hopefully has a character limit, but if it's paragraphs of information, I'm not pulling that in. Power BI is not great at representing full paragraphs like that right now; that's where you get into trouble, with the model really exploding. But those would be the two use cases.

45:31 I like that. There's so much ancillary data there that I would never suggest pulling it all in, because you're going to have storage, let alone access, problems even on a drillthrough; having it in the model just doesn't make sense. So keep that detail-level data limited, and maybe have a link through to an application for the additional information. You can get a long way toward solving those problems if, rather than a reason code, it's just a link.

46:03 A fabricated link to an internal system. If people aren't familiar with this: typically there's a common URL string you can construct with one or two identifiers that are in the dataset, injected into the URL, and it just becomes a link in the report that lets people navigate easily between the two systems.

46:35 if somebody is going from insight to analysis and action like the report has served its purpose right it you’re they’re they’re doing multiple different types of activities from a report consumer perspective right you’ve highlighted a particular area on an aggregate level in the power bi report or at least the hypothesis is here that that is happening first before somebody’s diving into detail because if they’re straight diving into detail then we just go back to SSRS why aren’t you just dumping all

47:06 to SSRS why aren’t you just dumping all the detail out for them in some way yes right because that’s all they care about and in in some cases then then fine and this makes a lot of sense too where I don’t know why powerapps default view if you connect to power bi is the phone view where you can do a lot of those characters or those records in a system that makes more sense to showcasing that where if I I have a table and I have a app the size of half the screen power apps where I can show those paragraphs I

47:36 apps where I can show those paragraphs I can’t show that detail and if I can act on it if I need to because generally speaking I don’t want I’m only going to want to see those paragraphs or those fields or those media items if per record not for multiple records and that’s where like powerapps makes a lot of sense even embedded on a screen but yeah we’re in the year 2023 your system by now except if you’re CRM or dynamic sometimes has a direct link that makes a lot of sense but and again

48:07 makes a lot of sense but and again that’s only the last thing I’ll say to that is that’s half the people that need to do that that either have access to those Source systems the other half goes to Mike’s Point two word they’re going to need they’re going to want to explore and do an analysis I’m surprised Mike one thing we haven’t talked about with the analysis roles like I want to I want to act on this from a more granular point of view I want to do more you point of view I want to do more detail in Excel not talking about know detail in Excel not talking about the analyze in Excel as one of the features either which we can say for another day yeah I think that’s another

48:38 another day yeah I think that’s another another again I think it’s all action based right right if I want wide tables of data I’m not probably going to push it to power bi if you want exports again this I think it really boils down to access on data or insights of data right if I want the date if I want the report to tell me something to do that’s an insightful thing if I just need access to the information yes the model is key but then I’m accessing it potentially through a paging report and to export my data Maybe I’m going directly to excel I get the data Maybe I’m I have some detail Pages inside the report and also

49:09 detail Pages inside the report and also to answer the final questions here from Stuart there was two main questions at the end do you think it makes sense to show low level cardinality data types details in the report and the second question was was do you have any salute or alternative solutions to larger models and so the first question to very directly answer that is yes I do think it’s irrelevant to have that information but only when it’s action based when you only when it’s action based when the output of that information is know the output of that information is going to be used and how what actions are going to be taken by seeing

49:40 actions are going to be taken by seeing that lower level of data and then you can really evaluate what is the right level of data and again I would argue aggregate as high as you can because it just makes your model smaller and makes it easier to work with but you need to have back-end engineering for that and then the other thought here was around what are alternative solutions for bigger models well then you start using mixed models and start you’re doing you mixed models and start you’re doing direct query on a larger table so know direct query on a larger table so direct query works good when you have a small number of Records you’re

50:10 small number of Records you’re interested in but you can have imported tables that have a much higher granularity so when models start getting really large you don’t bring all the data to power bi you only import a portion of it that is aggregated so it runs nice and fast but then on the details you push people to Pages or you design things on the report that are doing import level things and you have this real time looking at the information that is actually occurring inside the data set so there’s definitely other techniques you could probably use for larger Solutions when you have bigger data models that would could further tune those down
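
To make the shape of that concrete: the aggregated import table typically sits at a much coarser grain than the DirectQuery fact. The sketch below writes it as a calculated table purely to illustrate the grain; in a real composite model you'd materialize it upstream (a view or warehouse table) and wire it up with Manage aggregations. All names are invented:

```
// One row per day/product instead of one per transaction.
SalesAgg =
SUMMARIZECOLUMNS (
    Sales[OrderDate],
    Sales[ProductKey],
    "SalesAmount", SUM ( Sales[Amount] ),
    "OrderCount", COUNTROWS ( Sales )
)
```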

50:42 Do you guys have any other thoughts on those two questions? With both those points, you may not have an action, because it almost sounds like, with the reason code, people are just doing a final check or validation of why something changed, which is why you'd want that detail. That would be my only addition to point one. And on point two, I really like that idea: the larger you go, the bigger the problems become, and it's quickly going to be performance, so it might require a restructure, and you might get those data points returned faster by doing something like DirectQuery to access them, so you don't have the model-size problems. It's a really good point.

51:13 All right, with that, I think we've burned through a perfectly good hour of your time. Thank you very much for listening to this conversation. Stuart, thanks for the great question; we talked about a lot of things here, and hopefully some of them helped you out, if not confused you more. Our only ask: if you found some value in this conversation, if you found some points you hadn't thought about before around large datasets, high-cardinality columns, and what you can do with them, share the podcast with somebody else who might find it valuable.

51:44 Tommy, where else can you find the podcast? You'll find it anywhere it's available, Apple and Spotify. Make sure to subscribe, and leave a message. If you want your question or topic discussed on the podcast, go to PowerBI.tips slash the mailbag. And finally, join us live every Tuesday and Thursday at 7:30 a.m. Central. Thank you all very much, and we'll see you next time.

Thank You

Thanks for listening to the Explicit Measures Podcast. If you enjoyed this episode, share it with a coworker and drop a question or topic suggestion for a future show.
