DAX Complexity & Power Query ETL Tips - Ep. 515
In Episode 515 of Explicit Measures, Mike Carlo and Tommy Puglia unpack the latest Power BI and Microsoft Fabric topics from the show. You’ll get a quick read on the episode’s biggest ideas, why they matter, and where to dig deeper in the full conversation.
News & Announcements
- No linked announcements were available in the episode description for this post.
Main Discussion
This episode covers the major themes, opinions, and practical lessons Mike and Tommy surfaced during the conversation. The transcript below, lightly edited for readability, captures the full discussion if you want the exact phrasing and context.
- Mike and Tommy react to the episode’s biggest Power BI and Fabric developments and explain what stood out to them.
- They connect product announcements to day-to-day practitioner decisions instead of treating the news as abstract roadmap chatter.
- The conversation highlights where teams can move quickly, where they should slow down, and what tradeoffs deserve attention.
- They share candid perspective from real project work, which gives the discussion more practical value than a headline recap alone.
- The episode mixes tactical advice, opinionated takes, and a few forward-looking predictions about what listeners should watch next.
Looking Forward
If this episode’s topics affect your current Power BI or Fabric plans, use the transcript and linked resources to identify one concrete change you can test with your team this week.
Episode Transcript
0:02 [Intro music] Tommy and Mike lighting up the sky. Dance to the day, the laughs in the mix. Fabric and AI, get your fix. Explicit Measures, drop the beat now, pump it, feel the crowd. Explicit Measures. Good morning, Tommy, and welcome back to the Explicit Measures podcast, everyone. Hello and good morning. Good morning, Mike. How you doing? I'm doing well.
0:33 Things have been just clipping along. I'm really enjoying these agent things: building agents, making agents do repetitive tasks, and working with them to build very specific task-related things. It's been fun. But before we get into our main or side conversation, or any kind of news conversation here, let's talk about DAX complexity and Power Query ETL tips. This is our main topic today. So how do we handle these things? How are Power Query and DAX related?
1:04 Do we have any extra extract, transform, and load (ETL) tips that help us work with Power Query and DAX as a unit together here? I think this is going to be interesting. A lot of what I find in DAX: if the DAX gets more difficult, it's because the Power Query or the upstream information isn't simple enough. So we'll unpack that a little bit today. That being said, Tommy, any news or other fun topical items for you? No, I think we're ready to go. I'm excited anytime we get to talk about DAX
1:34 and Power Query, because those are just our favorite topics. I'm going to go on a little bit of a riff here before we get into the main topic. I've really been trying to understand how this new software complexity comes into place with the AI space. Tommy, I just want to quickly touch on something. Okay. For you, when was the aha moment where AI really went from something that you play with, that you write code with on the side,
2:04 to a moment in time, Tommy, where you felt like you could fully take your hands off the reins, and an agent or some coding system was capable enough to actually give you real results without you really needing to check the code? When was that moment for you? Was it recent? Yeah. I'm trying to think of the first aha moment, because I feel like there were so many. I think it was when I really first started using Claude
2:37 Code and realized the depths of it, because a lot of my projects are in GitHub, and I was like, oh wow, I just gave it some instructions. For example, I use a lot of open source projects, and some of them are pretty complex. One of them was having issues with some of the extensions that people had made with the Python packaging, so the dependencies and things wouldn't install, and then it wouldn't work. And I'm like, hey, just go through; there are all these version conflicts,
3:08 hundreds and hundreds of packages. Just go through and help me understand, or just do it for me: install, remove, so it can use torch, because it was an AI thing with PyTorch, which requires all this stuff. And it basically said, yep, I'm going to reduce the redundancies here, make sure these are not dependent on each other, and then just ran and installed everything for me. I used to do this myself, and it was a tedious process to go through. So it's
3:38 incredible. And you said this started happening when you began using Claude Code. How long ago would you say that was for you? Oh my goodness, was it months ago? Middle of the summer? I probably started using Claude Code when it wasn't officially available for Windows yet, so there was a hack. Geez, it was probably when it first came out, when they even had a WSL version of it. All right. I wasn't really that far
4:08 into Claude Code myself; I've been keeping Claude Code at bay. I've been using it slightly. I've been using some of the models Anthropic has come up with inside GitHub Copilot. For me, Tommy, I feel like something really changed in January of '26. I think that was when I had some aha moments. Things really started to change for me, and I've fundamentally shifted how I do my work and what I can build.
4:38 There's been a revelation I've been unpacking here a little bit. One part of it is that the barrier to getting code built is almost a commodity now. I'm reinventing almost everything. I look at my company, my business, and I'm looking at a single app that can handle all the things I need to do. I'm actually looking through my business: what apps do I pay for, and what apps should I be rebuilding,
5:10 and where does the data get stored? Let me give you one pain point I have, Tommy, in general with software as a service. Software as a service is great: you buy an app, someone's figured some problems out. But maybe I have two problems. My first problem is that every time I use some application that I did not build, the data that it creates, I have a harder time accessing in general. HubSpot, Salesforce, whatever the thing is,
5:40 it doesn't matter what it is, there always seems to be a little bit of friction between me using the application and generating data in it, and getting that data out. Now, my pain point here has been a lot around YouTube and Reddit and LinkedIn and X. We do a lot of content in those places, and it's very difficult to get that content back out. Yeah. Even with an API that they maybe give you, it's metered, and they're like, okay, it's three cents a post
6:11 returned or something. So they're actually getting paid for that API access. Tommy, that's difficult, and I don't really like it. It's my data. I'm giving it to the platform. Why do I have to pay to get the data back out? That seems like a double hit: I'm getting this platform for free, but I need the API to get the data back out. This doesn't seem very efficient to me. So while I can't just replace Twitter, or X, with my own app, everything else in my company
6:43 I'm now looking at from a lens of: okay, what internal apps do I use today? What do I pay for right now? And can I full-on replace that with a vibe-coded, agentically built app, one that I can then integrate across multiple processes? Does this make sense, what I'm describing? Yeah, I know what you're saying. So right now I've been working on a side project we're calling Content Nudge, and we're taking this idea of: okay, what internet information is out there on YouTube, Twitter and
7:14 X. We're collecting website data and Reddit information, bringing all of it centrally to a single place. So I've got a central hub of knowledge about what's going on around the internet on Power BI and Fabric. Awesome. Very helpful. From there, I'm able to use AI to collect a number of ideas together and say: okay, now summarize all these articles down to topics and phrases, then attach numbers to them. Which topics or phrases are most popular? So now I can absorb
7:45 all this rich information, summarize it down, and then build other things in this product, Content Nudge, where I can generate video transcripts. I can make a kanban board for planning things. I'm now looking at doing video edits, making shorts, adding words, making thumbnails. All the things I would normally have used multiple programs for, I'm just deleting them one by one. What do I use this program for? What do I want it to do? What does it
8:16 not do that I wish it would do? It's all these little things; I can really build something custom. This was an aha moment for me in January. And I'm thinking to myself: if I'm figuring this out now as a small business owner, imagine the amount of waste and application purchasing happening at large enterprises, on things they don't need to buy anymore. They could run their own stuff. It's just insane to me, Tommy. All right, wrapping up,
8:47 let me summarize my concept. Yeah. The two pain points I have are: first, the apps that I currently buy do about 80% of the work I want. They're almost there; they almost do what I want. And second, the data they create is so difficult to get to and get access to. It's like, I don't want to use the app anymore. I'd rather delete the app and replace it with my own, because I can put the data right in Fabric, right where I need it, then merge it with any other data I have, and it's one single platform of how I do work, with all the data
9:17 in the same spot. My reporting comes from there, my analysis comes from there, the financials come from there; everything is now in the same spot, and I'm finding that very useful. Tommy, what are you telling me, you're going to build your own Power BI too? I don't know, Tommy. Well, let me throw out a couple of ideas; let me give you some other experiences I've had. Okay.
9:50 I think, when I say I'm going to build my own Power BI, let's talk about two parts: the semantic model and the report layer. Tommy, I find a lot of friction around the report layer. It's not very agentic. It's not easy to build a report layer with agentic stuff. I am just now starting to see companies come out with agentic report-building experiences, and, myself included, I'm trying to build an agent experience around report building. But is the right solution to build a Power BI
10:21 report, or is the right solution to use Analysis Services and some agentic HTML building experience on top of it that actually gives you what you want? So maybe my point here, Tommy, is: do we really need to lock ourselves down to the way Power BI designed visuals, or should we leave it open and say any framework that builds visualizations could be used? What do I need to bridge the gap, Tommy, between the semantic
10:53 model and Vega and Vega-Lite? I've already got this done; this is already a product I'm working on to release inside Power Designer, where you can build agentic Vega and Vega-Lite visuals on top of existing semantic models. This is something we're really close
11:10 to. Well, we're close, but here's the issue, Mike. The barrier that we're going to have to cross at some point is the fact that you start building everything in instructions, in code that you're not sure how it works, right? And again, here comes the developer pushback. Go ahead. Okay, hold on. There are two parts here. So I'm going to push back, and this is not the morality issue, even though that's a conversation for another day. Are you, right now, building AI visuals or
11:42 applications for clients? We'll start there. I'm building, and what I'm doing is deciding: do I go from Fabric SQL into visuals that are built in apps, or do I go from Fabric semantic models into a query that runs a visual in an app? And on the visual side, think of it this way, Tommy: I don't need embedded anymore. I can just tell it, here's the
12:12 query I need you to run, literally: this is the DAX query I want you to execute. I see what you're saying. And here are the filters you're going to use, and here's how. I can explain to it what it needs to do. The Analysis Services engine is still there; that is the horsepower behind the Power BI side. But do I really need embedded? Do I really need the visuals Microsoft provides me? Do I really need the limitations it gives me, Tommy? I don't think I need that anymore. I can build my own. Another example of this: I was in
12:42 YouTube Analytics, looking at a visual. There was this really pretty visual in there, and I thought, oh, that's interesting. On the x-axis it had dates, on the y-axis it had number of views, and little dots on the screen, where each dot was represented by a group of colors, things that you had or didn't have. So I literally thought, that's a pretty visual, that looks impactful to me. I took a screenshot, gave the screenshot to my agent, and said: in my app, go build this. The x-axis is defined by this data, the y-axis is defined by the
13:13 number of views, and I want the color bubbles to behave a certain way based on my preferences. Boom. It wrote it and said done, in less than a minute, committed the change, pushed it into my app, and I had a working new visual that was being filtered by other things on the page automatically. In a minute, Tommy. You couldn't build this visual in a minute and then integrate it with embedded that fast. So to me, I'm looking at this going: I'm
13:43 fundamentally shifting how I want to build things on top of Fabric. And where I think Fabric is strong right now is the data collection, Fabric SQL, the Analysis Services engine. Where I'm finding friction: we're not too far away from replacing just the front-end side of Fabric. It's not doing it for me; the report side is not quite hitting the mark. So when I look at it,
14:14 I'm going to this world of building internal apps and embedded solutions. I'm still going to use the semantic model, but I'm now very heavily looking at how to apply an agentic experience for building the visual or visuals that you would want. Can you just describe it? Could I send it an image? Can it just build it for me automatically? And letting the agent say: you're going to use these kinds of React frameworks, or Vega and Vega-Lite frameworks. You could give an agent three or four frameworks, and it could
14:44 just pick which one it thinks is best to build the visual, and I don't have to worry about it. It just builds it, and if I want a change, I go back and say: no, that's not right, adjust this and the other thing. We're really close to a new world, Tommy, of building the collection of visuals on a page. Have you seen Kurt Buhler's thing he just did? Oh, yeah. I just saw the tweet. I haven't dived into it, but it's pretty... So Kurt Buhler did the exact same thing. He took some data from his GitHub, the
15:15 commits he's doing on Git and such, and built a visualization around it. Very cool. He vibe-coded the entire thing. The whole thing is built with an agent; he just described what data he had and how he wanted it, and it made a local database and built the visuals for him really well. The barrier to doing this, Tommy, is now just communicating what the visual needs to do, and it just works. And this is a tipping point that occurred, for me, in
15:46 January. I think it started really happening in December with these new premier models: Opus 4.5, Opus 4.6, GPT 5.3 Codex. These things are really good, and they're building code that I couldn't build on my own. They're faster, and I'm able to trust that they're building code that doesn't require a lot of extra hands on it. So, up until December, January of this year, it was a lot of: the AI would build code, and I
16:17 had to go fix it. It would build something good, and then it would go off the rails very quickly. We've hit a point now where I don't need to adjust that as much. To me it's like, we were clipping along and then we'd fall off a cliff, and it wasn't good, and I'd struggle; I had to go back and re-fix things and keep a lot of hands on it. We've gotten to a point where the agent is good enough to build its own code and maintain its own code. It's flattened that out; we're not really falling into big
16:47 pitfalls. It's doing a good job of maintaining its existing code. At some point, we're going to get to a place where the agent is not only building our code but improving our code as it goes along. In general software, as you build a bigger software package, you introduce more bugs, more complexity, more code. At some point, you're going to need the agent to continue to optimize as you build. And we're going to reach a threshold where the software is not only sustaining
17:18 itself with agents, but the agents are coming in and saying: these are recommendations of things that are inefficient, do you want me to rebuild them? And I think in the next couple of months, Tommy, we're getting to the point where you will be able to rebuild your entire piece of software every quarter. You're going to start fresh and go. It's going to be that fast. You think so? Really? What about collaboration? Is that deprecated now, the ability to collaborate
17:50 on the visuals? Let's say you're doing a visual in whatever language you want to do it in, I don't care what you call it. Yeah, sure. But someone wants an edit, or you give that report to someone else. Do they just talk to the agent and say, hey, update this visual, without even knowing what the code is? Yeah, why not? I didn't write the visual knowing what the code was; they shouldn't have to either, I'd argue. Why can't we go to that level? To me, agents are really good at
18:21 creating systems that run efficiently, right? I don't want someone to go talk to the agent and say, tell me the insights on my data. I don't think that's an efficient use of tokens, because they're expensive. I think the efficient use of agents is to do things at this point. You need to bring the token cost way down before people can just ask the agent a question and get legit results. However, I can put some thought into: I want these visuals on this page. I want these visuals in a certain
18:51 fashion. When I click this interaction, I want something else to happen on the page. Those are the things I want to describe to the agent, and I want the agent to pick up on those interactions and build exactly what I want. And I want it built in a framework that is not agentic at runtime, something reusable that runs at an efficient rate. Build it in React, build an HTML page, build something that's going to be really cheap and easy to use and run, so
19:21 that way I'm not spending a bunch of tokens every time someone opens a report. Does that make sense? I see it, but then there's the amount of tokens, too. That's the other problem: we're going to have these token limits. I think we're definitely getting to the point... like, for example, right now, as we're talking, on my Surface, Claude's just doing a bunch of Power Query edits for me in Power BI, and I'm not really doing anything here. Yep. So,
19:51 pretty cool. But I don't know. I feel like I'm missing something, because I do like what you're saying. Oh boy. That's all I'm going to say. What I think you're dealing with is what I'm seeing a lot of engineers and developers struggle with. It's this release of control. At a bigger, higher level here, Tommy, there's a release of control occurring, and developers need to really realign their
20:22 thinking, from 'I write code 75% of the time and 25% of my time is management,' to '98, 99% of my time is organizing, planning, describing, writing documents, roughing out images or wireframes.' It's more of: let me focus my attention on the process and how we need to get work done, and then work with the agent to refine the ideas. Mhm.
20:53 And I'm finding great success in spending more time there. And then I can step away. So let me give you another example, Tommy. We have over 300 podcast episodes fully transcribed and landed on our website, powerbi.tips/podcast. Go there. Everything's searchable. If you want to see how many times Tommy said 'I disagree' or 'game-changer' or any of these phrases, you can now full-text
21:24 search over 300 episodes of the Explicit Measures podcast on powerbi.tips right now. That's a lot of work, Tommy. There's a ton of process to do this. You've got to go get the transcript. You've got to write the summary. You've got to get the video together, get the embedded links; there are all these things that need to happen. If I had gone back in time to build all this manually, I would have had to write some crazy scripts. And again,
21:55 you know, Python scripts... it's very difficult to get something to summarize stuff correctly. That's where some of the agentic stuff makes sense: it does a good job of summarizing large chunks of text. For me, I would never have been able to do this in eight hours or less. Just impossible. So what I've been doing is working with my agent to define what the process is, how I do this; run it a few
22:22 times, see where the agent fails, refine the process, change the process, build new skills, create different skills, refine. I've been refining this stuff for days, and now I've gotten to a place where I can say 'batch process 100 episodes,' and it goes out, grabs the videos, makes the posts, everything the way I want. It adds shortcuts now: every minute or so in the transcript, it adds a little note, and you can actually click the YouTube link and go right to YouTube. Or if you
22:52 want to reference a link in one of the things, you can go to the website and click a link, and it shortcuts a deep link to that section. All of this is now 100% searchable. This is Tommy and Mike's five years' worth of knowledge dumping, and Seth's, because he's in there in the earlier ones. You now have all this information at your disposal. It's all searchable knowledge, and I never would have gotten there without an agent. This is the stuff I'm saying we can start doing. We can start
23:24 taking long-form video and distilling it down into information and knowledge that people can actually get their hands on. To me, this is an aha moment; it was impossible before agents. And so now I'm looking at everything I do and asking: what other things have I been dreaming about accomplishing but haven't had the time to build? That's weighty, man. A lot of weight on that. But all right, let's move on to the main topic. I think it's time to talk about
23:55 actually doing the work. I've been doing work; I just don't do the work anymore. I make the agent do all the work. Oh, we know that. Let's jump into DAX complexity and Power Query ETL. Tommy, this is another mailbag, so you want to do us the honors and give us a quick read-through of the DAX and Power Query questions? Absolutely. Here we go. 'Hey, how can we structure complex DAX measures to balance readability and performance when modeling business scenarios that involve nested time intelligence, dynamic user filters, and multi-grain
24:28 data? Are there best practices for debugging these in larger semantic models?' And for Power Query: 'What advanced Power Query techniques are most effective for reducing data refresh times in complex ETL pipelines, especially when working with Fabric Dataflows Gen2 or integrating SharePoint sources? How can we optimize step loading and reuse logic across queries?'
24:58 So, both these questions feel like they're coming out of a pure Power BI context, not a Fabric one. So I would recommend, Tommy, that we try to answer both the DAX and Power Query questions through the Power BI-only lens first, and then at the end expand into: okay, if we're going to add Fabric to this, is there anything we would change? Would we adjust the answer slightly? What do you think? How does that sound? I like that. I did mention a few things with Gen2, but I think you're spot
25:29 on. Well, you're right, Fabric Dataflows Gen2 is in there at the end. But yeah, I'd agree that part feels a bit more Fabric, though it's still in the same vein, the same spirit, with Gen2. So I agree with that one. So what's your initial reaction here, Tommy, to the DAX and Power Query questions? I have some thoughts around the DAX stuff; maybe I'll start there. Yeah, let's go. Go ahead. So, some of the more complex things
26:00 this user is addressing: nested time intelligence, which would maybe be calculation groups or something along those lines. Dynamic user filters, this is interesting. And then multi-grain data. I can't tell you how many times I've struggled with multi-grain data. That's a very difficult challenge to solve, I think, and one that really requires careful thought about how your business needs to look at its information. It's much
26:30 easier if you can think about what the user wants to see in the report, and then roll your data up to a higher grain, so you get the same level of granularity across all your tables. That's not always possible, right? There are definitely situations where you're going to need lower grain and higher grain. But when you start comparing high-grain things to low-grain things, that's where your DAX, I think, gets really gnarly. Honestly, you end up doing a lot of repetitive, recursive-type things, and it just
27:00 doesn't make it easy. So, I guess my first thoughts here are around time intelligence, dynamic user filters, and then multi-grain data; I'm going to focus my notes there. Time intelligence is interesting to me. I think there are a couple of patterns you can use to simplify it. It also depends on how well the consumer user understands your data model. Calculation groups I think are really interesting, in that
it makes it really easy for you to pick different metrics, sum of this, total of that, average of this, and then apply general time scales to all of those base calculations, I guess I'll call them. Right. Yeah. To me, calculation groups are a great solution to this. The challenge is that when you use calculation groups on a report page, it gets a little bit more tricky. You actually have to understand what you're trying to do and how you're trying to manage it. I would completely agree. When you're
28:01 using calculation groups, you need to be a pro developer there, because, as we've already talked about, that is not for the faint of heart. It's a really cool feature, but you have to be very aware of what you're doing there. It makes your model really efficient. And what I feel like most people do is not that. Instead, most people just build a whole bunch of measures: hey, here's the sum of sales year to date, here's the sum of sales this month, here's the sum of sales over the last seven days. And we actually get physical measures built in the model.
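For readers following along, here is a minimal sketch of the two patterns being contrasted. All table, column, and measure names ('Date', Sales[Amount], [Total Sales]) are illustrative, and the calculation items assume a date table marked as such:

```dax
-- Pattern 1: a "Time Intelligence" calculation group. Each item rewrites
-- whatever base measure is on the visual, so one definition covers them all.
-- Item "YTD":
CALCULATE ( SELECTEDMEASURE (), DATESYTD ( 'Date'[Date] ) )
-- Item "Prior Year":
CALCULATE ( SELECTEDMEASURE (), SAMEPERIODLASTYEAR ( 'Date'[Date] ) )

-- Pattern 2: explicit measures, one per time frame. Friendlier in the field
-- list, but the measure count multiplies with every new base measure.
Sales YTD = TOTALYTD ( [Total Sales], 'Date'[Date] )
Sales Last 7 Days =
    CALCULATE (
        [Total Sales],
        DATESINPERIOD ( 'Date'[Date], MAX ( 'Date'[Date] ), -7, DAY )
    )
```

Which pattern wins is exactly the tradeoff discussed next: one generic definition versus many self-explanatory measures.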
It works. It makes it easy for users to understand what they're selecting and what they're putting on the report page. But I also think it makes things difficult, because you now have a lot of measures to manage. And I think that's the trick of this: you want to build something that's sustainable for the developer to use and create, but you also want something that's easy for the user to use. What are your thoughts on that, Tommy? I think this is the easiest one here, because this is where, like we've talked about, calculated fields, or, excuse me,
field parameters come in, just as in Excel training. Field parameters are user friendly and can do a lot of what calculation groups do. Now, in enterprise settings, very much so, Mike, calculation groups come in more than handy. But I'm finding more and more that calculation groups are an enterprise thing; gigabytes of semantic model data is really when they're the best use case. It's a
cool feature, but if you only have your 15 calc measures and you just want to optimize for time intelligence, I feel like it costs more in bytes than it returns in value. And then, to your point, there's the user-friendly side of it: trying to manage that and collaborate becomes very difficult. One area I would point out here a little bit, Tommy: when you have lots of measures that you're creating for all your different chunks of time for time intelligence, it's
better to make folders. Actually, I've been using a lot of folders recently. I love doing that: making measures and putting them in folders. So if your team needs that information, I would highly recommend using it. Also, you have tables that have a mix of dimensional features, like address and location-based information. When you're talking about models that are getting more complex, where you're adding a lot of measures or have a lot of columns, it really helps to make folders that are logical in nature, capturing which
column is in which bucket or group. Mhm. And when I look at models now, there are a lot of cases where these dimensions or measures or columns all fit the same vein of information, so it makes sense to bucket them, put them in folders. The only downside, one note here, is that you really want to be mindful of your model design. If we're talking purely Power BI, there are not a lot of good ways to create standard tables of information and reuse
31:10 them across multiple models. The best way in a pure Power BI environment is Dataflows Gen1; that's primarily your main option there. Which is: make a table for your date fields, make a table for your product master, make a table for your customer master, and focus on making sure that table is available to you. That way you can pull those various tables into multiple semantic models. The challenge then becomes: how do you verify that the DAX, when you're doing
31:40 time intelligence, is the same across all models? If I'm making multiple models from this, how do we have a central library? And I'll argue, Tommy, Microsoft doesn't have a really good answer to how you centralize business logic. There's no generalized layer of 'these are our data tables, and these are the measures that can be applied to these tables.' It's almost all schema-based; everything seems to be hardened down to the semantic model level. And this is an approach I'm maybe understanding
32:10 more now when I look at how Databricks is doing things. They have this thing called the Unity Catalog, with tables defined in the Unity Catalog, and then they have a metrics layer: this is the metric we're talking about, and that metric can be applied to things in this way. And they can selectively grab: okay, I want this metric; well, it knows this metric has dependencies on these tables, so if I want this metric, it knows how to execute that query against this larger, call it a pool, of tables from the
32:41 enterprise. Semantic models aren't really doing that. Semantic models are more like a data library: a designed mini data set that's for a particular group of reports, or a team, or a department, domain-based. Does that make sense, what I'm describing? Yeah. And I completely agree. One thing I've been doing with display folders: nested display folders are a really neat feature. Descriptions are huge. And
33:13 where I'm using AI right now: I actually have a really cool Tabular Editor script. I know you think it's on the way out, but it actually uses Claude Opus and provides an awesome way to add descriptions to measures. Wait, wait, hold on. You wrote a Tabular Editor script
33:32 to go call Claude to write descriptions and then put them back into the model? Yeah. Why would you do that? Why wouldn't you just do that with the Fabric MCP server and skip all that? Well, I still like Tabular Editor 3, and it also follows instructions better. The MCP server is looking at the entire model. What I can do with mine is select a single measure, so I don't have to say 'all the measures.' Oh, so you almost have a selection tool: pick this, run this; pick these three things, run these. Okay.
34:02 Right. I could see that. Yeah. I like that a lot. And so those descriptions are super helpful. In the way I ask it, it looks at dependencies, if there are any, so it produces a really good description, and it provides the formula as well. And I started using a DAX calculated table, with INFO.VIEW.MEASURES, that creates a table with all the measures: the name, the table, the
34:32 display folder, and the description. And what I've been doing is giving all of my reports this description page to help people understand what's available, which is cool. I think it's actually super cool.
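As a rough sketch of the documentation table Tommy describes (the function is real, but the exact columns kept here are our assumption, and it requires a recent engine version, since the INFO.VIEW functions are relatively new):

```dax
Measure Catalog =
SELECTCOLUMNS (
    INFO.VIEW.MEASURES (),
    "Measure", [Name],
    "Home Table", [Table],
    "Display Folder", [DisplayFolder],
    "Description", [Description],
    "Expression", [Expression]
)
```

Dropped into a table visual, a calculated table like this becomes the kind of self-updating description page Tommy is talking about.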
35:02 And, for example, a really personal one, but one I've been using more: my wife, at her job, was using that API and this hour-logging thing, and they were doing all this stuff in Excel. And I said, let me take a look at this. And Claude did write a bunch of measures for me, quite a few, and I had all the descriptions written: billable hours, this is the estimate-used percent, and there's a description right there. So for her, she has Power BI Desktop, because I did Dashboard in a Day
35:33 with her. She's going in, looking at that description page, and going, oh, I see. And she can filter based on the display folder: okay, here are all the financial things, here's all this. To your point, like the domains a bit. Yep. It's really a feature she's been using quite a bit, and now we're just building off of that. So from an organization point of view, that's something I've really been relying on. Good. I would agree with you, Tommy. I really like these INFO.VIEW functions that run in DAX. You can run them in certain places: you can run them in Desktop, you can run them inside Power
36:04 Query inside Desktop, but you can't run them using the API calls. There is a DAX API call where they have blocked it, so you can no longer directly call the INFO.VIEW measures using the API call to Fabric. But I would agree with you, Tommy, that's a great way of documenting things. I'll also note something here about these topics: dynamic user filters, Tommy, that's one I want to pick on. What does that mean to you?
36:34 When you hear the words 'dynamic user filters,' is this slicers, or is it something different? Like, 'I want this table, and I want to select these two out of 10 columns'? How do you read that statement? I'm not entirely sure. One way I thought about it is: based on the user who is logged in, they have a set of filters. Or maybe it's row-level security, too. So to me, there's some use of
37:04 row-level security here. Huh, it's interesting. I was reading that phrase as: how do I let people pick fields of data and supply them to the report, or, in the report that I publish, give them more dynamic experiences for filtering? You can always throw slicers on the page. You can always use the filter pane; I prefer the filter pane just because I think it's more capable. But some of these really nuanced areas,
37:34 like if you're talking about 'I want a table, and I want users to pick the columns in the table,' I think you're moving away from Power BI reports, and you really should start looking at a data exploration experience. Interesting you say data explorations. Yeah. And I'd also argue, if you're doing dynamic filters on a table where you want people to pick different columns, go look at paginated reports. That's
38:05 another area that's still connected to the same semantic model, you can still get information in, but it's more designed for picking the columns you want. Maybe it's a pairing between a paginated report and a regular Power BI report; you can embed paginated reports into Power BI reports, that's something you can also do. So when I look at this 'dynamic user filters' question, I take the view that you're enhancing the UI of the report page to make it more application-like. That's what I think you're trying to do. And there's a
38:35 limit on what you can physically do inside Power BI reports, because at some point you have to maintain that complex visual report page: UI, bookmarks, all the things that go with it. So I'm going to err on the side of keeping my reports kind of simple, and I'm going to look for other experiences, like data explorations and paginated reports, to get more of these refined or dynamic
39:06 experiences for users to get data out. So that's where I land: I think a lot of people come into Power BI and think, oh, I need a report; everyone says 'I want the report.' But Microsoft is providing other areas that are also really useful for people to access data with. I want people to open their minds a little, broaden their horizons a bit more than 'I'm going to build only a report.' What about the debugging, then? Because I think that's one of the biggest things: is there a
39:36 best practice there? That's one of the questions in the mailbag. We have all these things we're talking about creating, but if I have to go back and error-proof the dynamic filters, or the nested time intelligence we talked about... there's one thing about whether we create them, but once we have them, Mike, how do you audit, go back, and verify whether the numbers are right or wrong? I think that's a big part here. This is one where I'm going to lean a little more on John Kerski. John Kerski has done some really interesting
40:06 blogs and posts around: when you make a change to a report, how do you check it? How do you make sure you didn't break something? How do you make sure a bookmark didn't just fail and will no longer work? And what I believe John Kerski is doing is using Playwright. Playwright is a web testing service that allows you to control a web page with an agent or some code. What that allows us to do is go to a web page, the Power BI report, and run a series of tests on it, verifying the data is as you expect.
40:38 I also think there's another area of Fabric, or Power BI, that's missing, Tommy. Imagine DAX query view. Okay, what does DAX query view do? You write DAX, you get a table back. Yes. I think there needs to be a solution around DAX testing. There's nothing in Fabric today that lets you run a DAX test. And by this I mean something very simple: write some DAX statement, get a table of data back, and then compare that data table over time
41:11 across executions or runs. For example, sum of sales by year. You've got a model with data coming into it; sales in prior years, 2025 and earlier, are locked in place. They shouldn't be changing, right? When you make changes to a semantic model, how do you confirm that the new DAX you added to the model didn't break something else you were already using? Again, go back to calculation groups: this stuff can get complex, and you might be making changes to a
41:42 calculation group that influence or change how the data comes out, because of the way you defined the new calculation. So we really do need a testing bed: here's the standard DAX I'm going to run; every time I make a change to a model, run this DAX test and tell me if there's any difference between the last run and this run. That feels a bit more robust when we make changes to things.
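A test along the lines Mike sketches could be as small as a saved DAX query whose result gets snapshotted and diffed between model changes; the names here are hypothetical:

```dax
// Totals for closed years should never change between deployments.
EVALUATE
SUMMARIZECOLUMNS (
    'Date'[Year],
    TREATAS ( { 2023, 2024, 2025 }, 'Date'[Year] ),
    "Total Sales", [Total Sales]
)
ORDER BY 'Date'[Year]
```

Store the rows from each run; any difference on a closed year means a model change had side effects.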
42:14 Now, all this to say: none of this stuff is easy, and none of it is set up for you. A lot of these are just people in the community building off-the-shelf tools or side projects that help you get through this. I don't see anything coming directly from Microsoft Power BI that has a robust testing framework in it to make sure that when you build a report, it's the same every single time. And this is one of the reasons I feel like we want to go after Git and GitHub so much, because then I can see every time someone makes changes. All right, what are your thoughts on that? No, honestly, the more and more you can do
42:44 with PBIR, like when you talk about debugging: I think we had previous tooling in the past, but it was difficult to use, versus what you can do now with PBIR and TMDL to verify things. There's one thing, data validation, which I think is a separate conversation; I know John Kerski's done a lot there, but we're not there yet. I think it's nice to know that it exists,
43:15 that John's doing these things and you can do it. It's possible. It's documented. It's known. I would just argue it's not easy. It's not going to be a 'Hey, Tommy, go set up a test for this report' where you go into a tool and knock one out in an hour. It's going to take you some thought and effort and setup; it's not smooth yet. And if it's not smooth and easy to do, it's not going to be adopted, and it's not going to be widely used. 100%. So I think GitHub is huge. For DAX especially, you've got to use TMDL, you've got to use DAX query view;
43:46 it's the best way to verify and check formulas, including the things that are more complex. And also, if you're not relying on Copilot or an agent with TMDL, I don't know what you're doing at this point. I would agree with that, Tommy. The agentic space of modifying the TMDL format, or even the PBIR format, is getting really good.
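Part of why TMDL pairs so well with Git, as Tommy suggests, is that each model object serializes as a few readable lines; a hypothetical measure looks roughly like this:

```tmdl
/// Total sales amount in USD
measure 'Total Sales' = SUM ( Sales[Amount] )
    formatString: #,0.00
    displayFolder: Finance
```

An edited expression, format string, or description shows up as a one- or two-line diff, which is what makes agent-made changes reviewable.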
44:16 And we're not far away from heavy automation of report creation and model manipulation with agents. If you're not exploring these technologies now, you'd better start. I'm doing a full eight-hour precon around agentic development experiences in Power BI, Fabric, and the VS Code extension. Now, I understand not everyone loves VS Code. This is where it's going to start: I believe it starts in VS Code, and it's going to get simpler in other tools. Other people will build tooling
44:45 out there. I'm seeing a lot of other Microsoft MVPs coming up with some very creative solutions that are really interesting. I'm excited to see where people are going with this stuff, and it's going to really change how you build things in the future, very soon. Interesting. Also, this is a pre-recorded episode, and we're at FabCon right now, or coming out of FabCon by the time this airs. So there may have also been some announcements; we'll have to come back and revisit in April and really unpack what we're seeing at FabCon Atlanta. I'm guessing
45:17 there are a lot of agentic experiences that are going to be built into a lot of things in Fabric and Power BI, just because of the rate of what Microsoft's building. Today, as we record, it's a Monday, and I'm already seeing announcements from Satya talking about Copilot co-work. We're going to talk about more of this in the future, probably. Oh yeah, there's a lot of emphasis from the leadership at Microsoft to make sure these agentic experiences are built
45:47 into your daily workflow. So let's change to Power Query, though. I think it also has a different path for debugging. Would you agree? Yeah. The user here asks what advanced Power Query techniques exist, and I'm going to argue the advanced techniques aren't as important as the data concepts, in my mind, for how you reduce data refresh times in
46:17 complex ETL. So my initial reaction to this question is that the techniques are not advanced. In Power Query, and again I'm talking purely about powerbi.com, you're going to want to land the data in a raw step. When you use Power Query in the Power BI space, Dataflows Gen1 will allow you to connect to any source; you can run the Power Query language, and then it creates CSV files and puts them down somewhere
46:48 where it can use them. CSV files are not very efficient. They're not super fast. There are a lot of challenges with CSV files, but they're more efficient than going back to your original data center or your SQL Server, right? So a pattern that I hear, and use myself personally: when you use Dataflows Gen1, the first dataflow you should build just gets the data in. No transformations, no complex
47:19 aggregations. Just go to the source and land the table as simply as you can. Once you have the table landed, start a second dataflow that relies on the first one, and do your transformations there. That's a much more efficient pattern: land the raw data, then run behind it a second, attached dataflow that transforms the data into what you need. It's almost the same pattern as you'd have in SQL Server, where you have staging tables and the actual tables.
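Sketched in M, the landing pattern is two queries where the second references the first. Server, database, and column names are made up here, and in Dataflows Gen1 the second query would typically be a separate, linked dataflow:

```m
// Query 1 - RawSales: land the source as-is, no transformations
let
    Source   = Sql.Database("sql.example.com", "SalesDb"),
    RawSales = Source{[Schema = "dbo", Item = "Sales"]}[Data]
in
    RawSales

// Query 2 - SalesClean: pick up the landed entity, then transform
let
    Filtered = Table.SelectRows(RawSales, each [OrderDate] >= #date(2024, 1, 1)),
    Grouped  = Table.Group(
        Filtered,
        {"CustomerId"},
        {{"TotalAmount", each List.Sum([Amount]), type number}}
    )
in
    Grouped
```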
47:49 That's the same concept I'm seeing here. I'd also argue: don't do a lot of complex merges, and don't do a lot of repeated row calculations. If you're looking for advanced Power Query techniques here, and I wouldn't even call it advanced: use query folding. That is probably, by far, your fastest way to make Power Query more performant: figure out where your query folds, and how you can rearrange the steps in your Power Query to keep query folding going as long as possible in your query. I'll just pause right there. What are your thoughts, Tommy?
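To make the step-ordering point concrete, a hedged example with hypothetical names: keep the foldable filter and column selection ahead of the step that breaks folding, so the reduction happens at the source:

```m
let
    Source   = Sql.Database("sql.example.com", "SalesDb"),
    Sales    = Source{[Schema = "dbo", Item = "Sales"]}[Data],
    // These steps fold: they are translated into the source's native SQL
    Filtered = Table.SelectRows(Sales, each [OrderDate] >= #date(2025, 1, 1)),
    Trimmed  = Table.SelectColumns(Filtered, {"OrderId", "OrderDate", "Amount"}),
    // Folding stops at this step, but only the reduced rows cross the wire
    Indexed  = Table.AddIndexColumn(Trimmed, "RowNum", 1, 1, Int64.Type)
in
    Indexed
```

In the Power Query editor, right-clicking a step and checking whether View Native Query is enabled is the quickest way to see where folding stops.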
48:19 I think query folding is a big part here, but I'm always going to go back to Roche's maxim when it comes to decreasing your refresh times, and also just having better success. Because regardless of whether it's in Desktop or in dataflows, the more transformations and merges you do, the more likely you're
48:50 going to end up with large, slow refreshes. For example, what I've really found, Mike, and I know we're trying to talk about this through a Power BI lens here, but it's very hard for me once I realized that a lot of the more advanced transformations I used to do, when I moved them to notebooks in Python, it's amazing: things that were taking 35 minutes in
49:22 Power Query take two minutes now in a notebook in Spark. So it's very hard for me not to connect the two. Even if we're just talking about debugging Power Query or optimizing Power Query: if I'm doing advanced transformations and trying to reduce refresh times, I'm hoping that my company at some point is going to buy Fabric, so I can use a lakehouse and do those iterative things in a notebook, storing them in a lakehouse
49:54 first. Yeah, it's hard for me to separate the two. I want to agree with you there, Tommy, because right now I'm trying to answer the question purely as a Power BI question. I would argue, as soon as you get to use Fabric and have Fabric at your fingertips, the first thing you're going to do is understand what your Power Query was doing and translate it to notebooks. You'll use a pipeline to load the initial data, and then, once the data is there, use all of the rich notebook experiences to transform
50:25 and manipulate your data. Two things, like you said, Tommy, and I agree with you: notebooks are faster; they run faster. And notebooks can be pure Python, or Python and Spark, so if you need hyperscale, you can write the same language, Python, to do the transformations, and the same language to do really big, multi-billion-row table manipulation, very fast and efficient. I've done really crazy stuff there. So I would really argue those two main points:
50:56 once you get to Fabric, the game changes for you. The other thing I will note here: I've had a lot of friction, Tommy, pulling things from SharePoint and getting them into Fabric in general. Let me point out two items under this. If you're using Dataflows Gen2, it works pretty well for connecting to SharePoint and getting data in; I don't mind it. There's now also a SharePoint synchronization
51:26 feature for lakehouses, where you can auto-sync SharePoint data to your Fabric lakehouse. That's one I haven't explored as much. But I would argue getting data from SharePoint into Fabric lakehouses, or Fabric experiences, is quite hard. It's not easy. I've traditionally been using Dataflows Gen2 to lift data from SharePoint over to the lakehouse, and it's slow, not very performant; I don't really love it. So I would look at exploring these new
51:57 SharePoint and Fabric lakehouse synchronization techniques, and see if they actually add any value for you. That's interesting, because SharePoint has always been a tough one. Honestly, though, Mike, what about OneLake? How come we're not talking about OneLake and the OneLake Windows connecting tool? I'm surprised you haven't brought that up yet. I don't really use it, honestly, Tommy; I don't really like the OneLake
52:29 desktop tool. So, I think what you're speaking to here, Tommy... again, let me be very clear: in this conversation we've now heavily shifted from Power BI-only to talking about Fabric. So let me circle back and answer the question first. 'What advanced Power Query techniques are most effective?' Your most effective moves would be: think about raw versus processed data. Load the data initially, with as few transformations as possible, into a single
53:00 data flow and then pick up that data flow and make a second data flow that does those merging transforming features that you need in your data flows. I think that’ll be efficient for you. Yes. Yes. make sure you use and understand query folding in your power query. Right. So if you can query fold anything, select tables, filter things, use query folding to fold that upstream to your data source. If you’re coming from SQL or databases, make sure you’re using that. That will also make your Power Query much faster and more efficient. The last thing I’ll note here is think about your incremental loading
53:30 The last thing I’ll note here is to think about your incremental loading in Power Query. How can you load as little data as possible? What data is actually changing and what data is not? I fully realize some tables upstream of you don’t have a changed date or an updated date on the record, or you can’t trust those dates because, for whatever reason, the source system isn’t updating them correctly. But if you can trust the information from the source system about changed or updated dates, use that to help you only bring in the records that are different, and then update your records with those differences. So incremental loading is another major speed-up and performance tuner inside Power Query. Starting with those three things, I think you’d be able to get a lot more performance out of your Power Query.
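The incremental pattern Mike describes maps naturally to a Delta MERGE in a Fabric notebook. Here is a minimal sketch, assuming the source exposes a trustworthy updated_at column and the target table already has data; all names are hypothetical.

```python
# Hedged sketch of incremental loading with a Delta MERGE in a Fabric notebook.
# Assumes a trustworthy updated_at column on the source and a non-empty target;
# table and column names are hypothetical.
from delta.tables import DeltaTable
from pyspark.sql import functions as F

target = DeltaTable.forName(spark, "processed_orders")

# Only bring in records changed since the last load (the high-water mark).
last_load = target.toDF().agg(F.max("updated_at")).collect()[0][0]
changes = spark.read.table("raw_orders").filter(F.col("updated_at") > F.lit(last_load))

# Update matching rows and insert new ones, instead of reloading everything.
(
    target.alias("t")
    .merge(changes.alias("s"), "t.order_id = s.order_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```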
54:01 Now I want to transition over to what you’re saying, Tommy, which is the lakehouse and Fabric. So, back to your question around the OneLake connector. Do you use this, Tommy? Do you have it? I don’t even have it. It’s not even running on my PC right now.
54:33 It was on my machine, but then I rebooted. It was nice for Excel sheets, though, because, again, when we think about SharePoint, and I’m trying to bridge this with the SharePoint issue, what are most people trying to get out of SharePoint when it comes to using that connector? They want the Excel document, or they want the files inside the lakehouse so they can pick them up and do things with them. Right. That’s usually it. Or: I have a table in an Excel document, and I
55:03 just need that table copied over into the lakehouse so I can use it in a notebook, use it in a report, or load it into Power BI. When I’m using Power BI and one of my sources is SharePoint, I’m hoping not to use Excel sheets. I’m trying to use lists in SharePoint as much as I can, because they provide a very lightweight interface for people to edit things or make a list of stuff. I like lists. They work fairly well. We can do Excel documents and tables in Excel documents, but I feel like they’re just a little bit more finicky.
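Tommy’s copy-the-Excel-table-into-the-lakehouse scenario is a few lines in a Fabric notebook. A minimal sketch with a hypothetical file path and sheet name (pandas needs the openpyxl package for .xlsx files):

```python
# Hedged sketch: copying a table out of an Excel document into a lakehouse
# table so a notebook or report can use it; path and sheet are hypothetical.
import pandas as pd

# Read the workbook from the lakehouse Files area (requires openpyxl for .xlsx).
pdf = pd.read_excel("/lakehouse/default/Files/budget.xlsx", sheet_name="Budget")

# Promote it to a Delta table so it behaves like any other lakehouse table.
spark.createDataFrame(pdf).write.mode("overwrite").format("delta") \
    .saveAsTable("budget")
```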
55:35 One of my least favorite data sources for anything in Power BI is Excel, just because people can do so many creative things in Excel. It usually breaks what I’m trying to get into any semantic model. So I’m always opting for a more stringent storage system that constrains what data can go in or out. Does
55:54 that make sense, what I’m describing there? No, 100%. SharePoint lists are fine for some data, but I think about the Excel sheets where people say, “Hey, we’re updating this Excel sheet, or we get this data, and we want to basically use this as the storage.” And again, I should use it more, but the OneLake application can be a great transition to using OneLake. It’s like, hey, guess what, guys, we are finally moving away from all of the issues we’ve had with SharePoint:
56:26 now just put your files in this OneLake catalog. This is the preferred choice. That’s interesting that you’re noting that, Tommy. I’m not sure I would have gone there first on hot-swapping out SharePoint and storing your files in the OneLake drive instead. But I also see your point, Tommy, and I heavily agree with you: most of my SharePoint is just storage of files. I don’t use SharePoint as it’s usually intended, which is like a landing page
56:57 and a news article and here’s where you summarize your videos. I don’t use SharePoint the proper way, I think. It usually becomes a dumping ground for all of the work files, by customer, by topic, by area, by team. I just create a lot of libraries in there. Tons of libraries. And all of them just become: put the PDFs, put the videos, everything goes there, because then it’s stored in the cloud, and it’s like a fancy file storage
57:27 system. Sometimes I’ll build a homepage. Sometimes I’ll build lists. But usually it just becomes a landing zone for files. Yeah. And I think that’s maybe why you’re saying you should look at OneLake, because then it could just be your folder storage system for files. Oh, 100%. You’re spot on here. And again, when I have OneLake, the ability for me to use notebooks on this or store it in a lakehouse is dead simple.
57:57 It’s insane how easy they make that. So it’s hard not to call that a major feature of why I want to transition a lot of what I do to it. So, let me go into this. Yes, I see that, but I feel like there’s a little bit of friction there, Tommy, for people to move away from the SharePoint thing to the OneLake drive, because you can do it, but then you need to have
58:27 access to the lakehouse, and then you need to go look at it. So there are two lenses to this for me, Tommy. There’s the desktop application that has the folder with the items in it. I understand that; it makes sense. SharePoint has the same capability to synchronize a folder from SharePoint, and you can see the same stuff. Not a problem. It’s the online experience that I don’t really like. It’s when I go to powerbi.com. If I go to SharePoint, I’m familiar with the UI, how those files are stored, and what I see there. It’s a
58:57 very focused view of these are the files in the folders in that area. What I don’t feel like I get is that same simple UI in Fabric. If I go to Fabric and try to look at files, see things, and manipulate stuff, I don’t have the same experience. Does that make sense, Tommy? I have to go to the lakehouse, and then I’m in this folder structure with tables and files, and it doesn’t really feel like the same thing, even though it is. Right.
59:29 This is the problem. This is where, hopefully, they can do some integration between SharePoint and the OneLake drive, because yes, it’s great to add files to the OneLake drive with the syncing, but users are also editing those files and going through those files. Yeah. I’m going to go to SharePoint, right? I click on the PowerPoint slide, it opens up, I can edit it right there, save it, and it goes back down. I have the Excel document, it’s in SharePoint, I can open it up, work on the document directly in that thing, and then close it out. We don’t
60:00 have that same capability, Tommy, in the OneLake drive. You can see the file, you can edit the file, but I’m not getting a full embedded Excel, PowerPoint, and Word experience right inside lakehouses. That doesn’t seem like it fits. So, me personally, Tommy, I’m not feeling confident enough to give my team the OneLake connector application for their desktops and just say, now use this, you’ll have everything you need. I don’t feel like it’s quite there
60:30 yet, because I want this synonymous experience between working on files on my machine and working on files in the cloud. It should feel similar, and SharePoint feels similar; the OneLake drive experience doesn’t feel similar to me. I would 100% agree with this. As we move forward, though, honestly, I thought Fabric was going to change our lives just because of the lakehouse, but the way we’re talking about it, I’m realizing, even from
61:00 the debugging and how we optimize DAX, and especially Power Query, it changes how we think about semantic models. We’ve said this before. But it also changes the way we debug and go through everything, more than just simply doing things in the lakehouse. That almost isn’t enough to say. Does that make sense? I’m not sure. Can you rephrase it? Yeah. What I’m saying is, when Fabric first came out, it was like, this is neat. This will be cool for building
61:32 lakehouses. But it’s also changed how much work we’re doing in the semantic model itself. Oh, 100%. When I was originally using powerbi.com, I was doing a lot more data engineering in the semantic model itself, a lot more Power Query. As soon as you introduce Fabric, I do almost no data engineering in Power Query in the semantic model, and I do all my data engineering outside of it, 100%. It’s full stop for me. Power Query inside Fabric
62:04 is almost just: connect to this table that I built from the lakehouse or the data warehouse or the SQL server. Connect to this table, and here are the relationships and the measures. So the model to me now is purely table connections, relationships, measures, and anything that’s visually thin-report-related. Those are the things I’m putting in the model now. What do you think? Is that how you see it too? That’s basically the same place I’m at. So, as we go through, I’m making sure we answered all the mailbag questions here.
62:34 This was a pretty technical one, but reusing logic across queries: again, lakehouse and notebooks. It’s always been one of the hardest things for us to do in semantic models in Power BI. You can do it, but you have to use Dataflows Gen 1, and you have to do this whole load-the-raw-table, then load-the-transformations, and then you can pull that dataflow into multiple queries. Sometimes in Dataflows Gen 1 I would have five tables I would create
63:04 in that one dataflow. These days, I would not create a lot of tables in one dataflow. I’d keep it very single-table focused, because I want to pull many dataflows into a semantic model, and I may want to refresh them at different speeds or different cadences. It makes more items inside your workspace and it’s more difficult to work with, but that’s how you have to work with it in Power BI.
63:34 When I move to Fabric, everything changes, and to me it feels a lot more fluid. It’s consistent. I get a lakehouse. I have tables. I can do asynchronous loading. I get Direct Lake. There are a lot of advantages inside Fabric that make this much easier for me.
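The logic reuse they are describing, which takes chained Dataflows Gen 1 in Power BI, becomes a plain function in a notebook. A minimal sketch with hypothetical table names:

```python
# Hedged sketch of logic reuse in a Fabric notebook: one transformation
# function applied across many tables; table names are hypothetical.
from pyspark.sql import DataFrame, functions as F

def standardize(df: DataFrame) -> DataFrame:
    """Shared cleanup applied to every table before it lands as 'processed'."""
    return (
        df.dropDuplicates()
          .withColumn("loaded_at", F.current_timestamp())
    )

# The same logic, reused across queries, with no dataflow chaining required.
for name in ["raw_orders", "raw_customers", "raw_products"]:
    standardize(spark.read.table(name)) \
        .write.mode("overwrite").format("delta") \
        .saveAsTable(name.replace("raw_", "processed_"))
```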
64:04 Dude, I love it. And honestly, the semantic model is still the gold source for the business, but it’s amazing, as much as I’m using Power BI and Fabric, how much I’m not building semantic models anymore, or not in the same fashion. There are some legacy things where I’m like, oh yeah, I remember this, that I just haven’t had to do. Yeah, once Microsoft brought me to Fabric, I had a lot more work to do on the data engineering side of things. The semantic models are getting much simpler for me to execute against. And also, Tommy, I’d argue I’m using MCP servers for the models a lot more now, and that’s making my model development go much faster.
64:34 It’s a lot of the cleanup stuff that I don’t need to do. A lot of the lightweight work that I used to spend time on is now being lifted over to these agents and MCP servers, and it’s saving me time, so I can go back and do more data engineering. And I think this is going to be the same pattern. We’re seeing MCP servers today help us out with Power BI model building. I think we’re going to see Fabric MCP servers help us out more with the Fabric building, and I think we’re going to see more extensions into this MCP space for the different tools we’re going to be building.
65:04 Tommy, I like it. I think it’s helpful. I think we’re going to see the world rocked here in the next six months with data engineering. I think data engineering is ripe for AI, and for using AI to help you design workflows for shaping and manipulating data. Dude, I love it. I couldn’t say it better myself, my friend. Awesome. Well, that being said, this was a very full episode of Power Query and DAX conversations: how to optimize DAX for models, reusability of DAX, and we’ve also tried to touch a little on Power Query and how we are leveraging
65:35 Power Query to be more efficient, and how you reuse parts of Power Query as well. That being said, thank you very much for listening to this episode. Tommy, where else can you find the podcast? You can find us on Apple, Spotify, wherever you get your podcasts. Make sure to subscribe and leave a rating; it helps us out a ton. Do you have a question, idea, or topic that you want us to talk about on a future episode? Head over to powerbi.tips/empodcast and leave your name and a great question. And finally, join us live every Tuesday and Thursday, 7:30 a.m. Central, and join the conversation on all of the PowerBI.tips
66:05 social media channels. And if you ever want to reference what we said in the past, check out our website. We’ve done a major update to the podcast area. We’ve now listed over 300 episodes, with full transcriptions, links, and shortcuts to everything you’d need from our content. So if you want to search our content and find the things we’re talking about, or reference it, like, hey, these experts are talking about a topic I’m interested in, go search for the words you’re looking for. You’ll likely find them
66:35 directly inside our website now. So go check it out: powerbi.tips/empodcast. You can search through all of our podcasts and find all of our content and text there. Thank you all so much, and we’ll see you next time. Explicit measures. Pump it up. Be it high. Tommy and Mike lighting up the sky. Dance to the day. The laughs in the mix. Fabric and AI, get your fix. Explicit measures. Drop the beat. Now pumpkins feel the crowd. Explicit measures.
Thank You
Want to catch us live? Join every Tuesday and Thursday at 7:30 AM Central on YouTube and LinkedIn.
Got a question? Head to powerbi.tips/empodcast and submit your topic ideas.
Listen on Spotify, Apple Podcasts, or wherever you get your podcasts.
