PowerBI.tips

Optimal Power BI Architecture in Fabric – Ep. 444

July 25, 2025 By Mike Carlo, Tommy Puglia

Mike and Tommy tackle a mailbag question about the optimal Power BI architecture in the Fabric era versus the pre-Fabric world of dataflows and shared datasets. They explore how the lakehouse, notebooks, and AI agents are reshaping what “better” really means for data teams.

Main Discussion: The Optimal Architecture in Fabric

A listener named Balaz wrote in asking about the ideal architecture for dataset and report development in the Fabric era. In the pre-Fabric world, the standard approach was: collect data into dataflows, create shared datasets from those dataflows, then build reports on top. Does that still hold?

Defining “Better”

Mike and Tommy start by unpacking what “better” even means. In the pre-Fabric era, Tommy identifies three core metrics: speed, accuracy, and flexibility. But in Fabric, there are additional layers to consider — the complexity of your process, the number of artifacts involved, and most critically, your capacity consumption.

Tommy introduces the concept of universal betters (speed, accuracy, complexity — always relevant) versus cultural betters (adoption, training, organizational readiness). You can have the most technically optimal architecture, but if your users can’t adopt it, it doesn’t matter.

The Pre-Fabric Pattern

Mike explains why the dataflow-based architecture emerged: you couldn’t easily share tables across semantic models, and organizations wanted some semblance of a data warehouse. Dataflows let you independently refresh tables at different speeds and share them across multiple datasets. It was the right architecture for its time.

The Fabric Architecture: Lakehouse at the Center

Mike’s recommendation for the ideal Fabric architecture is clear: it doesn’t matter how you get the data in — it all lands in the lakehouse. The lakehouse stored in Delta format gives you the most flexibility for whatever comes next. From there, it flows into semantic models for reporting and increasingly for AI agents.

The key shift: dataflows gen 2 are being replaced by notebooks for data transformation. Mike has seen multiple organizations start with dataflows gen 2, learn some Python with the help of AI agents, and within months delete their dataflows in favor of more efficient notebooks. The CU consumption drops significantly.
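The migration Mike describes, moving a Dataflow Gen 2 transformation into a notebook, usually boils down to a few lines of cleanup code. Here is a minimal sketch of that kind of step. In Fabric this would typically be PySpark reading from and writing Delta tables in the lakehouse; plain Python is used below so the example stands alone, and the column names are invented for illustration:

```python
# Illustrative only: the kind of row-level cleanup often migrated from a
# Dataflow Gen 2 into a notebook. In Fabric you would typically do this
# with PySpark and write the result to a lakehouse Delta table; the
# "region"/"amount" columns here are made up for the example.

def clean_sales_rows(rows):
    """Trim text columns, coerce amounts to float, drop unusable rows."""
    cleaned = []
    for row in rows:
        region = (row.get("region") or "").strip()
        try:
            amount = float(row.get("amount", ""))
        except ValueError:
            continue  # skip rows whose amount cannot be parsed
        if region:  # skip rows with no region at all
            cleaned.append({"region": region, "amount": amount})
    return cleaned

raw = [
    {"region": " West ", "amount": "100.5"},
    {"region": "East", "amount": "oops"},  # dropped: unparseable amount
    {"region": "", "amount": "42"},        # dropped: missing region
]
print(clean_sales_rows(raw))  # [{'region': 'West', 'amount': 100.5}]
```

In a real notebook the same shape would be read, transform, then write back to a Delta table, and running it as code rather than as a dataflow is where the CU savings the episode mentions typically come from.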

The Semantic Model Remains King

Both Mike and Tommy agree: the semantic model is still the core of serving data. Even Databricks and Snowflake are building their own semantic model layers — arriving at the same conclusion Microsoft has held for 20 years. The in-memory columnar storage model is simply faster than querying raw SQL for end-user interactions.
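The speed claim has a simple intuition behind it. Here is a toy sketch, not how VertiPaq-style engines are actually implemented, of why aggregating one column is cheaper when the data is stored by column rather than by row:

```python
# Toy illustration of row-oriented vs column-oriented access.
# Real semantic model engines add compression and encoding on top,
# but the access pattern below is the basic intuition.

# Row-oriented: one dict per record (like fetching full rows from SQL).
rows = [{"region": "West", "amount": i} for i in range(1000)]

# Column-oriented: the same data pivoted into one list per column.
columns = {
    "region": [r["region"] for r in rows],
    "amount": [r["amount"] for r in rows],
}

row_total = sum(r["amount"] for r in rows)  # visits every row object
col_total = sum(columns["amount"])          # scans one contiguous list
assert row_total == col_total == 499500
```

Summing the column list touches only the values you need, while the row-oriented scan drags every field of every record through memory, which is the gap that widens dramatically at real data volumes.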

But the semantic model now powers more than just reports: agents, metric sets, goals, explorations, and paginated reports all connect to it.

The Conductor’s Era

Tommy coins the concept of the Conductor’s Era — rather than writing every line of code yourself, you’re orchestrating AI agents that handle the technical execution. A SQL agent that only does SQL. A Python agent for notebooks. You become the conductor, directing these specialized agents to build and optimize your data pipelines.

Mike doubles down: he’s building throwaway apps in 30-45 minutes by conducting AI agents rather than hand-coding everything. The technical barrier is dropping fast.

Desktop vs. Service

On the question of whether Power BI Desktop remains the primary tool: Mike sees the service gaining ground rapidly. Almost everything can be done in the browser now. The remaining gaps (the TMDL editor, new visual types in edit mode) are closing. Desktop will remain a premium tool for professional developers, but the center of gravity is shifting to the service, especially with agents and notebooks being service-only experiences.

Looking Forward

Mike and Tommy agree this question deserves an annual revisit. The answer will keep changing as Microsoft’s internal teams compete to deliver the most efficient tooling at the lowest cost. For now: land data in a lakehouse, serve it through semantic models, and start learning notebooks if you haven’t already. The Conductor’s Era is here.

Episode Transcript

Full verbatim transcript:

0:25 Get out. Good morning and welcome back to the Explicit Measures podcast with Tommy and Mike. We’re back again with another, this is a pre-recorded episode, so let’s be careful there. This is a pre-recorded episode, but today’s topic is quite a mouthful. We’re doing a mailbag today. We’re going to be talking about the optimal PowerBI architecture in the fabric era. Is this an evolution or a revolution? It’ll be interesting. So,

0:58 A big topic. Great question. , yeah. Let’s any any other news related announcements, Tommy, that you’d want to throw out there or anything like that? Well, just for you who get bumped when we do the pre-recorded, , some of us I I think all of us have parents or at one point in time and mine are in town. And the best thing, Mike, about having Italian parents is there there’s a mecca. There’s basically a pilgrimage they have to do every time they come to Chicago. what we have for the bakery and I’m not gonna give numbers out because it’s a a public space

1:31 Here but we spent one-third of what we’ve had before and my parents were very worried would that be enough thing. Well it’s been two days and we’ve already gone through it three times and there’s still a ton. So this is like the best part of being adult with Italian parents because you deal with the discipline and the the the way of living now but now it’s just like well we got to get a Chicago hot dog. what about us fully? Well when are we going to get the cannoli and it’s it’s just a beautiful very full time. So, it’s funny. , I have a friend of mine

2:03 Who just went over to Italy. , and he spent some time in Italy and he and he recommended he said, “Look, this has been probably by far the best food I’ve ever had. You go there for the food. It’s so good. You have to go there.” , and I was like, “Oh, man. I just haven’t haven’t made make it to Italy yet.” So, I’m still waiting for the Italian PowerBI and fabric conference to show up. So, if someone wants to be there with you put up a conference there, I will be more than happy to help do multiple sessions, a pre-day conference and everything else I can whatever I can do

2:35 To stay longer inside Italy would sounds like a wonderful idea. I I will I can brush up on my Italian very quickly for that, too, because especially if it’s in Rome. , dude, if we do it in Rome where Marco and Alberto are, they have the cut pizza. So it’s called al taglio, not tagliatelle, that’s the type of pasta, but al taglio, which is how you cut it. So you say hey that’s like the place we went to in Chicago. Yeah we went to a place Tommy so for those for reference Tommy when we were in Chicago Tommy’s like there’s a there’s a great pizza place you have to go to this pizza place to get some pizza

3:06 And he’s like pizza by the inch. And with Tommy being around a bunch of engineers they were like well these pies shaped inches because are we doing like circumference inch? Are we doing like square linear inch? , it’s like, , there’s there’s two dimensions on pizza by the inch. You there’s a width and a length of this. , are we are we specifying both width and length of inches or just so it was it was just funny. It was yeah, pizza by the inch was a That’s right. a conundrum for a lot of engineers. This doesn’t work. All the engineers they were having they were having multiple headaches and issues with this.

3:39 But yes, I do remember that and it was actually very fun and super good pizza. So, very very much enjoyed that. in in Chicago. What’s the name of the place? Let’s give them some shout outs. Bonci. Oh, yeah. I don’t think they need a shout out. They’re doing so well. But it’s called Bonci. B O N C I. So that’s that’s actually even on our podcast, all the people who really care about pizza on our podcast are probably the the audience probably intersects very slightly here. I I Okay, Mike, this is actually great. Who doesn’t like pizza? Like and I’m not saying like they love it. I’m saying who who has a visceral reaction to pizza?

4:13 Okay, you’re probably a great point on there, Tommy, because I would agree with you. It’s it’s probably not necessarily anyone doesn’t like pizza. It’s probably more the idea of like what’s on your pizza that people have visceral reactions to, right? So, 100%. Right. Is it anchovies or not that people have a very strong reaction to that? Yes or no? like and then I’ll also add, , people pineapple seems to be like very visceral rea gut reaction to yourself. It really it’s really for a lot of people I don’t think they realize Mike when they put pineapple on their

4:46 Pizza they’re embarrassing themselves like like that’s really what they’re doing. They just don’t know and that I just feel bad for them or not even mad at them. it’s it’s very equivalent to like you walking outside downtown Chicago without any pants is that’s what you’re doing to yourself and it’s like they’re just unaware, . So it’s our job to help enlighten them. Oh my goodness, Tommy. Yeah. I think for that reason alone, there’ll be people that’ll be in the comments dividing about what they like or do not like on their pizza. So, if you have a favorite pizza topping that you enjoy,

5:19 Definitely throw it down. I’m I’m probably more of a meat lovers pizza person. So, anything that has like a lot of meat things on it, that’s what I what I enjoy. But again, I only know I think for lack of better terms, only American I know American pizza. Like that’s the only I know I have only had pizza here in the US. Never gone over to Italy to have pizza and I’ve heard it’s very different there. It’s completely different here for me. I any new pizza place, any new pizza joint, I got to try it naked. Cheese only because that’s a good sign of how good that pizza is thing. Then we can add topping with sauce. There’s a sauce on it.

Yeah, it’s just cheese. , your your normal standard crust, sauce, cheese, and just let me try that and then we can start adding toppings. So, I like this is why I enjoy talking to you because we said, yeah, let’s just go right in the mailbag. Here we are talking about pizzas for a while. Well, last question I have on your tongue. Have you ever heard of the gentleman by the name of Mike Portnoy? Mike Portnoy? I have not. He runs Barstool Sports. I think he he runs. Yeah. Yeah. He’s from Boston. He’s from Boston. But he goes around and everywhere he travels apparently he goes to get a pizza from a pizza place and he gives like scores out in front of the

6:25 Restaurant. He’s like goes in, buys a pizza, comes out, and he’ll have a pizza and he’ll give it a a rating or something like that. Have you seen his r Have you seen some of these videos? Is it is it wrong that I’m I don’t watch it because his name’s not Mike Okawani basically or or Mike Antonio? like if you’re gonna go around New York pizza joints and you’re from Boston that’s I don’t know man. I that my my my red flags go up pretty quickly. I I I’ve seen what he’s done pretty I don’t know if he’s like claims like I’m from the Boston heritage but I think he’s he’s from Boston. That’s where he

6:57 Is. But I think he’s more pizza and Italian I think is what his background is. I don’t know. I don’t know. I All I’ve seen is him going around and rating a number of these pizza places and he’s had some very interesting interactions with a lot of the store owners cuz some of them come out like very angry like don’t do this here. Get out of here. And he was like whoa time out dude. I’m just trying to like rank your rank your pizza here. And he’s been to some other places and he’s been like this has been the best pizza ever. And he’s he ranks the pizzas. I think one time someone said you have to try our try our chocolate cake. and he’s

7:29 Like, ”, I’m not sure about this.” And they gave him the chocolate cake and it like knocked his socks off and he was like livid. Like he was like think it was so amazing. Anyways, I was thinking maybe Tommy whenever we do traveling, we should do a little like mini Tommy Puglia segment, the Puglia pizza rating and we should whenever we go somewhere we should get a pizza and you should eat a slice of it and see if you like it or not. I I I’m completely down. Here’s your rule of thumb. This is your rule of thumb. This is all you need to know if you want to know if you’re going to go

8:00 To a good pizza place wherever you are. The more the pizza joint looks like a laundromat outside, the better the pizza is inside. Oh boy. Because if you have to advertise so much, that means you’re not focusing on your pizza. So I want a place where you’re like that can’t be the place because that is the place and that’s the best pizza you’re going to have. So that’s interesting. And is this is this a Chicago observation or is this a general observation? just to me it’s a city thing like it’s observation my favorite place in Brooklyn shout out to Nino’s you’re like I’m not sure about

8:34 Where like I don’t even think there’s a pizza place here I see the colors I see red white and green I don’t see a pizza I don’t even see a sign and you walk in there’s two tables we’re like is this a waiting for the subway is the best it will knock your socks off oh my goodness that’s funny well I want to see these places at some point so at some point we’ll have to do some travels in he’s in he was it said Brooklyn is where this was. That one’s in Bay Ridge and Brooklyn. Yeah. So awesome. We we will do. Yeah. Okay. So, hey Microsoft when you’re not listening

9:08 Conference New York City, please make that two weeks because I’m going to need two weeks for that. So, and then somewhere in Rome and we will be just set. So, we’ll be there. We’ll be done. Awesome. Well, then with that then we’ve done enough rambling around the beginning part of this thing. Let’s actually jump into our actual main topic. Tommy, do you want to read off our mailbag today and just get into this new fabric era question? And we’ve been in general, we’ve been taking a lot more having a lot more topics around what fabric is, how fabric works in in addition to PowerBI, even though this has been started as a PowerBI podcast.

9:40 So anyways, take away Tommy, where we at with the fabric question. In the pre-Fabric era, as far as I know, the most optimal architecture for data set and report development was the following. One, collect data sources into data flows. Two, create shared data sets with load data from the data flows. And three, create reports built upon the shared data sets. In the fabric era, what does the optimal architecture look like for data set

10:13 Report development? Is the pre-Fabric architecture described above still the most optimal? If not, what’s your idea on it? Will PowerBI desktop remain the primary tool for data set development or will the service take its place? Thanks for your time. Regards, and I’m going to botch the name, but thank you. Balaz. Balaz. That sounds good enough for me. But I love this question, Mike. No, I think Tommy, you made this all up.

I think you went to ChatGPT and you said I’m going to ask this question and I’m going to ask Mike do you think desktop will stay around or will it be the service and then you said make up a funny name at the end and and a different name so it doesn’t sound like me Tommy. So I think I think this is a question you submitted Tommy like and you just put it in with a different name. I use my cousin’s name. Yeah. So but I just but but Mike here here’s the thing though. You asked this six months this is going to be really strange. If you asked this six months ago, yeah, I would go, , Mike, you’re

11:17 Right. But even what they’ve done with the ability to edit your Direct Lake model in desktop, Mike, I think this this deserves a whole other conversation here. There I’m going to start right off the bat and I’m going to just put out the statement that I don’t think there is a universal standard way that right now fabric’s working. There are a lot of good ways it does, but try can you tell me there is one always the standard way that makes fabric better than not in terms of a

11:50 Process. I I want to I want to take another note here. I want to say what do you define as better, right? I think I think how you measure certain things also impacts a better when you say the word better, right? So I want to just unpack that word just slightly and give you my definition of what maybe that means. , you could, let me just give you some options, right? You could measure better by PowerBI in the pre fabric. Could you do that in that realm? , I think there was less options in pre fabric, right? So, we didn’t

12:22 Really have as many options to go with. I think the overall pattern, the pattern is the same, but what individual tools that you use throughout that pattern are probably now different. And now you have different ways of storing information and data inside fabric. And so this is where I think things get a little bit more designed I guess custom catered let’s call the word it’s more catered to what your company needs and what your company is actually comfortable with as far as skills. So okay let’s let’s define better. All

Right. What what is the better way to define this? I think you could define better as in better as in number of compute units, right? So before I would buy a pro or premium per user license and you were given everything all access to everything under the sun for PowerBI, right? It didn’t really matter what it was. Microsoft, , if you were premium per user for $24 a month, you can build as many data flows gen one as you wanted. You could have them interlinked as much as you needed to. You could run them as often as you wanted. It it would just be handled and Microsoft would just like make

13:28 It work for you. So that was that was pretty much how how it operated in the in the old world. In the Fabric F-SKU land you’re playing this game of balance between I need enough capacity to run the system but I don’t need so much capacity that I’m trying to overpay or I have a lot of extra capacity left over cuz you’re you’re basically prepaying for capacity. So I think to me when I look at this I think there’s a there’s there’s the better is it’s all graphical user interfaces. It’s clicky clicky draggy droppy. It makes things easy, right? So

14:01 There’s there’s a ease of working with the data and then there’s like an optimization path that says is it the most efficient cost effective way to run the data right so I think when you say better you have to think about what in your company are you trying to evaluate on or what is the return on the investment that you’re trying to look for does that make sense like what I’m trying to describe there. Completely, completely. And to me, what if you were to boil down better in the prefabric and just the PowerBI, there’s really three components that I

Would always think about. You’re you’re dealing with your speed, your accuracy, and your flexibility. And and whatever, if you’re doing data flows or just doing that all in semantic models, those are really the three kind

15:05 Of components is how fast is going to run how accurate can we in a sense trust it? Obviously we want as accurate possible and can it scale up and can it scale down from a flexibility point of view. Can I modify this after it’s already created? In fabric, man, I think we have those three, but I think that we have additional layers that have to be considered. You’re dealing with how how complex is my process because you may be dealing with your lakehouse and a

Notebook and a pipeline. , you’re dealing with how many artifacts are part of the process? How many workflows and domains am I dealing with? And I think you’re dealing a you have to be so conscious of your capacity and your CUs more than I think in the PowerBI era because we’ve seen to your point you only had so many tools to play with with power with PowerBI but we know now well if I want to use a data flow gen 2 I know it’s going to really eat up what I’m doing that might be the best path

16:09 Right now but it has to be consideration. So to me with with fabric better is how well does it do complex or flexibility speed and accuracy on top of those additional things the number of artifacts how complex is my process and what am I eating up but regardless you’re still trying to get to the end goal and I think that’s the biggest thing the end goal Mike would be how quick can I refresh this how sure can I be of the numbers after a refresh or it being automated

16:41 At the end of the day, that’s the bare bones of what a better is is when it’s automated. If I go if we if we go to Sicily and start eating pizza, do our our data sets going to reflect the right numbers and is it going to refresh on time or whatever? I use refresh because that’s a PowerBI thing. But is the number going to get there on time? Those are I think those if you want to boil down better, it’s those two things. How fast does it run and does it or and how accurate is it?

Yeah, I think I think you’re also comparing a little bit of apples and oranges in this realm as well because pro and premium per user was pretty much built on this model of another great point per per user pricing and then fabric is not built on per user pricing. It’s built on capacity pricing. Right? So you you can get more out of the Fabric SKUs, but there’s also more flexibility to to optimize things as well in the Fabric SKU. So I I I don’t know if they’re they’re directly comparable. So let me just talk about like to let me answer to the

First part of the question. The first part of the question was collect all your data sources using a dataflow. A dataflow would then create basically CSV files somewhere inside either in a blob storage account if you connected it or it would connect it collect those directly inside a Fabric environment, or sorry, a PowerBI environment and then from those CSV files you could then pick them up again do more transformations and pull them in the pattern of using data flows to build tables the reason why you did this is because you’re able to build reusable tables across multiple semantic models

18:17 And you’re able to lift out of the semantic model the ability like when the semantic model had the M code in it, it would take longer for the semantic model to refresh. So you could independently refresh different tables at different speeds based on your needs in the data flows and then you could also share them across multiple data sets. So the the whole reason of like data flows being part of your process why don’t why not the simplest architecture is just make a semantic model use m in the semantic model load the data you want publish it

18:49 And done like that’s e that’s the easiest method here so I think the reason that architecture evolved out of the powerbi pro and premium per user was we had challenges with I couldn’t take one table in a semantic model and share it with a separate semantic model it just didn’t work very well and organizations want some some semblance of a data warehouse. I need to put all the tables in some place and pluck and pull from them to distribute them out to my team.

So I will say this I think that was the right architecture in the PowerBI pro and premium per user era. I think now with the F-SKU what I want to bring to the thought here is the F-SKU is not bringing necessarily more or simpler experiences to the PowerBI report developer. I think what the F-SKU brings is it’s actually bringing in a brand new persona or individuals that we never had in playing in our data space. Right? So all this is doing this

19:54 Is instead of thinking of it as like we had a I think of it this way is like there was a there was a bubble that there was a circle or bubble around reporting and it was basically PowerBI what we’ve done is we’ve added additional bubbles for other things and now our entire surface area of what we can do has increased in size. So how I look at fabric is fabric is now another portion of our business that can work right next to our analysts and business teams and we can have the lakehouse, we can have pipelines, we can have full SQL databases in there. And I’m to the

20:29 Point right I’m I’m actually changing my tune here a little bit. When SQL came out for fabric I was like okay I don’t understand why I don’t understand why it’s here. But now looking at it as a developer and a report builder of things, I’m now seriously considering like maybe fabric should be partly considered for some of our operational systems. If I’m building a web application or if I’m doing something where data needs to be moving in and out, fabric supports the ability for me to put in that operational SQL database. So I get all the SQL richness goodness and I just it’s a button click. I need a

SQL database. Click done. the the database shows up and then I can start connecting that to my applications and talking to it or if I need user-defined functions those can also exist. So what I’m seeing is there’s a lot the fabric is bringing a lot of other data products to the world. And so I still think the pattern exists, right? We’re still going to have the pattern of centralize all your data. That does not change. Get it into semantic models for reporting and even now agents, right? So now you’re going to chat with your data. I don’t I think you can do chatting with

21:33 Your data directly to a lakehouse. I think that would maybe be okay. But honestly, the most effective part of chat with your data would be point the chat bots or the agents at the semantic model that has all the enrichment like the relationships, the descriptions, the everything else. , that’s still going to be key. So, I think at the end of the day, when you’re serving data, you’re going to be focusing on serving data from semantic models to your organization through reports and other experiences. And I think the other thing I I think I had an aha moment around here is in the PowerBI space, you’re always building reports. It’s pretty much reports and

Maybe a metric set. That’s maybe what you were doing. But now think about all the things you get for fabric. You get a semantic model. You get explore-your-data experiences. You get reporting, paginated reporting. You get agents which chat with your data. so there’s there’s and you get metrics and metric sets like all these other things. Goals are now there. So I think the idea here is we are thinking too small about our data at at this point right we’re talking a lot about just reports and I think fabric extends

22:39 The number of ways you can distribute your data out to other experiences. So you’re you’re everything you said here I imagine is where people have the biggest frustrations I think right now with fabric. One of my favorite episodes we ever did was almost about this mental misconceptions and we brought into the BI space. And one of the ones that always stuck out to me was this idea that they did this study where there’s like a jam company and they had three jams to pick from and people would pick the jams. But as soon

23:12 As they made it like 11 jams, the sales went down because people had more choices which made it harder to pick a single thing. And I think what people’s frustrations are now I agree. I agree there’s a new persona, but I think for a lot of people, they’re jumping into this into the fabric world, not as the new persona. They’re the same person, but now they have all these additional choices. And we we’ll try to boil it down here. You you mentioned a ton of things. We have the SQL databases, data agents, we have metric sets. , even things that are outside just the

23:44 PowerBI, , your definition of just PowerBI, but even in just even in just PowerBI, Mike, there’s a lot of choices. And my frustration is it it goes back to the the core thing that I I live by any product or thing that we deliver is if you’re going to do something, if you’re going to produce something, if you’re going to create something, it has to be better at what it’s doing than other things otherwise no reason to exist. is the whole Steve Jobs mentality on why he built the iPad. Well, it’s better than browsing the web

Than the phone and it’s better than watching movies than the computer. So, that’s why we have it. But you’re mentioning all these other features here. Metrics, metric sets, scorecards, reports, dashboards, paginated. So, we’re introducing all these things for people and now we’re still trying to define the word better, right? So you’ve add you have just added a whole other layer of complexity on this better thing. When we initial when we both did our definitions of better, we were talking about the process to build a

Report. But when neither of us said report, we didn’t say a report that has to go through this process. We assume that. So now Mike let me ask you and to just a to really just add complexity here when we say better in the fabric era I think the probably the better way to ask this question is can we define an outcome or output that’s better than a report is there something better than a report or is this always in the eye of the beholder

25:23 I think it depends on what your needs are right I think so I I’m I’m just taking some notes here. I’m like talking about the optimal solution right we so let me give you another example here that and this is where the evolution of things have gone right so Microsoft is taking products that have existed for number of years longer than fabric has been around pipelines have been around since pipelines are an evolution of Azure data factory and those have been around forever a long time those been I’ve since I’ve been using Azure those have been some some semblance of those have been around for a period of time so that’s

25:55 Been out for quite a while it’s a very robust technology, efficient, it has a lot of UI based on top of it. It’s not the easiest to use. It’s a little bit of getting used to as far as a developer standpoint, but you can do a lot of creative things with pipelines. So, when I look at the optimal solution for fabric, you’re now adding all these other rich Azure- like experiences directly into the fabric space. So again back to your point Tommy like if you are comfortable in these more advanced solutions you have more capabilities and

26:27 I’ll go back to another example here that I’ll use around real-time intelligence or real-time data right PowerBI pro and premium per user has some real-time stuff you could do a streaming data set you can send data to powerbi.com and it would update reports and dashboards with information but the experience was very limited so yes you could do it yes there was some real-time experience there, but it was , , very narrow. You had to do a lot of work upstream of the streaming experience to then be able to send data in correctly so you could actually read it and use it and leverage

26:59 That data inside these real-time streaming models. And you’d be very technically there. I think in the fabric world, if we look at like real time and streaming, there’s a lot more in the ecosystem to make it more graphical. It’s easier to use. is we’re doing a lot more event driven things inside the fabric world that is being designed with a technical side. It’s still there, but we’re making it easier for you to use it. So, we’re we’re getting more capabilities is is what I’m trying to describe here. So, when you look at the optimal solution,

27:31 You not only have to say, I'm just getting data from a SQL server, or from Excel, or from SharePoint; we have more options for where to pull data from. Is it real time? Is it a webhook? Is it now a SQL database? Is it something else? All these other experiences can now be brought to us, and Fabric is just becoming a wider trough for us to put more things in. So if I think about it, there are four main aspects. Loading data, which would be something like a

28:04 User data function, a dataflow, a pipeline, real-time analytics, something like that; there's compute you need to use to load the data in. Then you have store and transform. Storing and transforming is your lakehouse, your SQL database, a KQL database, a SQL data warehouse, and then notebooks for doing transforms. These are all experiences that existed in Azure and are now being brought to Fabric. I still think the best way to serve data is

28:36 Going to be the semantic model. It may be a semantic model with an agent attached to it, but when I look at the rest of the landscape, at Databricks and Snowflake, both are coming out with semantic models. They're arriving at the same conclusion Microsoft reached twenty years ago: the semantic model needs to be there. It needs to be an in-memory, cached version of the data for the user to interact with, because going back to the original

29:08 Server and grabbing all the data with SQL queries is just slow. Yeah. It doesn't work as fast as in-memory column storage. We're already there; we've got that, and the model covers more than just a report now. Correct. And then the final stage of this is the interactions: how do you interact with the data? Is it an agent? Is it a metric set? Is it a goal? Is it explorations? Is it paginated reports? I think a lot of times we default

29:42 Talking about building a report, and I think that's a very narrow-minded way of looking at what I need to do to get the data out. There's a lot more. The number of organizations that still just sit on top of SSRS and paginated reports is phenomenally large, and we're underestimating how much people need to get away from that and into Power BI reports or some other online experience. I think people are just used to going to their inbox. That's the context of where I'm at with my email. I go to my

30:16 Email; there's an Excel file or something there, and that's where the data shows up. You don't know what you don't know, and a lot of companies don't either. So I completely agree with that. So again, we're talking about what the best, ideal architecture is. I'm going to give you my recommendation, but that's usually a starting point. There's an ideal architecture, then an evolution of it, and a lot of other patterns you can do. So I think what you were speaking to earlier, Tommy,

30:49 Was the three V's of data. Maybe there are four; they always seem to change. Velocity, the speed of it; volume, how much of it; and variety, or variability, how different it is. I always forget the fourth. I'll give you a solid three, though. We'll go with the solid three; there are more out there. The three V's of data matter now because at some point you would tap out. I would argue that in Power BI Pro and Premium Per User, if your data got to a certain size,

31:23 Dataflows just didn't seem to be very efficient; they would take a long time and they would time out. We were constantly fighting timeouts of dataflows gen 1 in Pro. If they ran longer than two hours, stuff started failing, so you'd better have your data load be somewhat quick. Then we started doing incremental refreshes and working around that to get bigger data into our models, because it was just taking too long for all the information to load. All we're doing now with Fabric is giving you more variety and the ability to handle larger volumes of data at scale.
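[Editor's note: the incremental-refresh workaround Mike describes boils down to one idea: only reload the partitions newer than a watermark instead of the whole table. A minimal pure-Python sketch of that idea, with an illustrative 30-day window; real Power BI incremental refresh is configured through RangeStart/RangeEnd policies, not code like this.]

```python
from datetime import date, timedelta

def partitions_to_refresh(all_partitions, today, window_days=30):
    """Return only the date partitions newer than the refresh watermark.

    Older partitions are assumed already loaded and are skipped, which
    is what keeps each refresh short enough to dodge the timeout.
    """
    watermark = today - timedelta(days=window_days)
    return [p for p in all_partitions if p >= watermark]

# A year of daily partitions, but only the trailing window gets reloaded.
partitions = [date(2025, 1, 1) + timedelta(days=i) for i in range(365)]
recent = partitions_to_refresh(partitions, today=date(2025, 12, 31))
print(len(recent))  # 31: December 1 through December 31
```

The full-reload cost grows with history; the watermark cost stays constant, which is the same trade-off whether the tool is a dataflow, a notebook, or a pipeline.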

31:54 There's an irony there. Let me pause there. Yeah, I've got a few things, because I love a lot of what you said. Premium Per User was almost like the CC's of data, because you could do a lot and you didn't have to worry about the whole capacity falling over. Sure. But now, even the semantic model size: you had to be somewhat conscious, but not really; it wasn't the same compared to now, where you have all these options yet you are so much more limited, with a much bigger budget to manage. Now, Mike, I want to agree with you, then I want to take a sharp left and

32:27 Vehemently disagree with you. Okay, go ahead. So that's where we're going to go: start with the good news, then the bad news. I agree with you on a big statement. I've come to the mountain and come down from the mountain, Mike, and I realize that no person really wants a report to see their data. It's just, right now, the best avenue to do so, the best mechanism or platform. If you were to ask everyone what they really wanted, and break everything down, how to see their numbers right now, the report just makes the most sense. But that's not what anyone

33:01 Really wants; it's just the best way to do it. That's the part where I agree with you, because we have all these other options. We have metrics, we have datasets, we have real time. There are so many ways I can deliver your data to you. But that being said, Mike, when we say better now, I think we have to split it into two groups. When we talk about better, there are universal betters and cultural betters. To me, this is the only thing that makes sense in Fabric, because of the additional choices we have to

33:34 Choose from. Universal betters are the ones we've talked about: speed, accuracy, complexity. We always have to take those into account; those are always going to be the case. But when it comes to the output, yes, the semantic model is the core, but for how consumers, how people, are going to view their data, I can give them real time, I can give them a metric set, but I have to train them. I can't just deliver it out in a report or an org app and say, learn it, figure it

34:08 Out. We know that if I change my report too drastically, people are going to have an issue right there. You can't expect them to just figure things out. Any time, Mike, especially when I worked internally as an FTE, when we would change a report drastically, we did a webinar before those releases, because we got backlash before that. And this is not because people are idiotic or not intelligent enough. It's because you're

34:40 Drastically changing the user interface of something they rely on. It's just our job. Now I have all these other things, like you said: metric sets, real-time scorecards, paginated reports. All of those require getting everyone on board; even the least equipped, least skilled person, we have to get them on board for it to be successful. So I have all these options, but that doesn't mean it makes the best sense for that company or team just because I have them. So for me, I have to split better, and

35:13 This whole conversation on better, into the universal, technical better and the cultural better, because without those two, this doesn't work. And that's a big part of it for me, Mike, where I still look at reports, and if I want to introduce something, I still want to go through that flow. For anything else, I have to have a plan for how I'm going to get it adopted. So I'm going to pause on that statement, and I have a feeling you're going to disagree with

35:45 Me, which I cannot wait for. But what's your take, then, on the idea that all these other products we have require a cultural better as well, besides just our technical better? I think this is where, as we were talking about at the very beginning, you have to define what better means to you. Right now, reports feel like the most prominent medium for answering people's questions. So let me come back to your comment from earlier, Tommy.

36:17 You said no one really wants a report. Okay, I would agree with that statement, but then I want to expand the idea a little more. If no one really wants a report, what do they really want? My question back would be: unpack that statement. At the end of the day, it goes back to what this whole business intelligence thing has been about anyway. I need information to make decisions to run the business. We have business objectives or goals. We've talked

36:49 This a lot about KPIs or OKRs, corporate objectives we want to accomplish as a business. So what are those identified objectives? All we should be doing is supplying information to make the decisions that get you closer to those larger objectives. That's what the data is for. The whole reason we do business intelligence is that there's a collection of information and we're trying to make decisions with it. So our goals on the business intelligence side should

37:24 Be: whatever the medium is, how can we use it to help people answer questions so they walk away and command action? I'm going to use this data to decide whether or not to act. I'm going to use my data to talk to my boss about hiring another sales representative because we're underperforming on sales. Yeah. Right. I'm going to use data to identify the weak salespeople on our team and then go work with them or

37:56 Provide training to them to hopefully increase their sales performance. That's something we're doing, and all of this is human interaction stuff. Yeah. And the way I look at the world today, reports seem to be the main medium: here's a bunch of visuals, I can click on them, I can filter them, and they give me different information. That's just what has been created as of today. Who's to say that's going to live on moving

38:28 Forward in this new era of AI? So I want to expand our thinking here. A couple of episodes ago, and this is actually very recent, I was saying, look, this is very Star Trek-esque. We're going to talk to the computer, the computer is going to build an app that serves a very specific purpose, you'll use that app for a period of time, and then you'll throw it away. You'll be done and move on to the next thing, whatever that is. And I think this is very relevant here, because what if the next new experience is an agent-like

39:02 Experience, where the agent is building a mini data application for you to interact with the data? What could the agent produce? You have a semantic model; you have information that's documented. What if you talk to the agent and say, I'm looking for an answer to this question, and the agent scans through existing reports, looks at other information, and potentially produces a brand-new graphic, brand-new information? And this whole agent space where it starts deep

39:35 Thinking on stuff. What if, and I'm just spitballing here, I don't know if this is ever going to happen, but what if I could send it a table or some SQL? You talk to the agent, it builds some SQL queries against data that's out there, runs them, and produces results in a table. It takes the output of that table, thinks on that data, and then, based on the question that was asked and the data it's given, runs it through four or five different

40:07 Agents: Claude, Grok, OpenAI, ChatGPT, whatever. It could source that information through multiple agents and say, okay agents, here's the problem statement, here's the data, think about this: what's the best next course of action? So I think we're at a place right now where it's not designed yet, but how close are we to actually doing this? We'll talk to the agent, we'll start framing some of the data story, and then we'll give it

40:40 Back to a number of other agents, where it can think through things and then present to us: okay, you asked this question, here's the data I found, here's the SQL statement I ran. Oh, and by the way, I thought about this data a couple of different ways, and here are some actions you could take to reach a desired outcome. What desired outcome do you want? Let's think about this. It could literally be this helper-thinker that goes back and forth. And honestly, that's what we're doing as analysts anyway; we're always trying to drive for some outcome, some result. It just happens to be

41:12 Today the best medium to do this is a report, and we let the user do that. But who's to say that in the future that's still the medium or the way to do this? I'm going to pause there, because I said a lot of different things. There are a lot of great things there. First off, I've got to touch on the agent thing real quick, and then your bigger point. This idea of creating a SQL agent in Copilot: I wonder if I could do that, where it's only good at SQL. It's just a SQL agent, and then we have other things it talks with. I wonder if there's something there, Mike.
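[Editor's note: the "SQL-only agent" Tommy is musing about, plus Mike's fan-out idea above, is essentially a routing pattern: a conductor sequences narrow specialists instead of one agent doing everything. A toy sketch in plain Python; the agent functions here are stand-ins with canned behavior, not any real Copilot, Fabric, or LLM API.]

```python
def sql_writer_agent(question):
    """Stand-in for a narrow agent whose only skill is turning a question
    into SQL. A real one would call a model; this just templates a query."""
    return f"SELECT region, SUM(sales) FROM orders GROUP BY region -- for: {question}"

def sql_runner_agent(sql):
    """Stand-in for the agent that executes the SQL and returns rows."""
    return [("East", 120), ("West", 95)]  # canned result for the sketch

def summarizer_agent(question, rows):
    """Stand-in for the agent that thinks about the result."""
    top = max(rows, key=lambda r: r[1])
    return f"{question}: {top[0]} leads with {top[1]}"

def conductor(question):
    """The 'conductor': each specialist does one job; the conductor only
    sequences them, which is the skill Mike and Tommy describe."""
    sql = sql_writer_agent(question)
    rows = sql_runner_agent(sql)
    return summarizer_agent(question, rows)

print(conductor("Which region has the most sales?"))
```

The design point is that each stage can be swapped (a different SQL dialect writer, a different runner) without touching the others, which is why specialized agents tend to beat one generalist.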

41:44 Rather than having a single agent trying to do everything. So there's an idea. We know this is true; we already know that agents are going to be specific to what you're trying to produce. That's when they're at their best: when they're tuned for specific jobs. You give one the job to interpret the question and say, hey, you're the SQL converter; your skill, your tool, is this: take the statement, make the SQL. Another one runs it. So first off, we're not there yet, but when we get

42:17 There, oh, we're going to be running. But I want to pause for just a moment on the agents; just one little footnote on your comment. So write down what you're going to say next, because I don't want you to lose it. All right. Oh, I'm not going to forget. Okay. Footnote on that point about the agents. If you talk about the syntax of a language, the syntax of writing SQL: T-SQL, Oracle SQL, Postgres SQL, all the different SQL dialects that are

42:48 Out there. The agent can be trained on every single piece of documentation ever known to man about those, and it can also be trained on open repositories that have millions of lines of written SQL, some good, some bad, some optimized, some not. The agent itself can have the most knowledge about every single syntactical thing. And to bring in anyone and claim they're going to know more

43:19 Than an agent will about SQL, I think, is a misnomer. Honestly, on the mechanics of the language, the mechanics of how to write the SQL statement, the agents will absolutely know more. And look, if you want to use something like ROLLUP, you could talk to an agent and say, hey, I'm trying to produce this rollup on this, this, and this. It's a more advanced SQL function; in my experience, not everyone learns what ROLLUP does. That was something I learned in my grad classes, and I thought, wow, this is a really powerful feature, but no one really asks you to write it. So this is where you can leverage the agent

43:54 To do these kinds of things. The same goes for Python; this goes for any other language out there. One agent can know all the languages far better than I ever will. This is what we said: welcome to the conductor. You want to know the skill to learn? Learn how to be a conductor, because that's the era we're in, the conductor's era. How do I dictate and run these different agents? Even optimizing: you can write your SQL once and say, optimize this, make it faster, make it better. So I think that's the part where I

44:28 Like to look at leveraging AI: let's look at what AI is doing and what it's really good at. It knows everything about everything; that's where you need to leverage it right now. At some point it'll get better at thinking. It doesn't think the best yet, I think, but if you ask it very directed questions about the language and how to accomplish things with code, it's really good at that. And the reason I'm saying this is that I'm now building throwaway apps. I'm building an app to do a very specific thing in 30 to 45 minutes that does

45:00 Exactly what I want. And to your point, you're conducting it, as opposed to writing the code, the technical side of it. You're right; I've never heard the word conducting used the way you're using it, but that's actually a really good analogy: we want to be conductors on top of the data, conductors on top of the agents that run on top of it. That's literally what they're calling it: the conductor's era. Yeah, I haven't heard this before, but I really like that term. That's a great term; I'm going to use it more. Sorry. Okay, enough of my footnote. I just want to

45:31 Put that in. Everyone knows by this point, Mike, that no matter what our topic is, at some point there's going to be something about AI. It's changing, though; it's physically changing how I do things. So if it's impacting us as pro developers, we have to be comfortable leveraging these tools ourselves, because we can do more of this high-level conductor thinking and let the agents jump in and actually write the code. And to your point, Tommy, all these things I

46:04 Named, like store, transform, load, all these different experiences: there are a lot of technical aspects to them, but those technical aspects are being removed because the agents are able to come in and fill the gap. And also, when the Matrix comes, they're going to treat us a little nicer, I think, because we talk about them so much. So, hopefully. I want to bring it back a little. I think the bigger point you're making, and the AI stuff has a big part to play in this, but at the end of the day, when you boil everything down, you made the point too about

46:37 Information. And my core tenet, especially when we're talking about better, and I guarantee you Balaz, when he asked this question, wasn't thinking about all these other things, he was probably just thinking about report development. But at the end of the day, one metric I always watched, beyond the non-measurables: I looked at not just how many times a report was being viewed, but at what hour of the day it was being viewed. And the reason why was that I wanted

47:12 Reports, to me, to be just another source of information, as you said originally, that dictates what you're going to do that day or that week. Really, it's like your Outlook for data, for numbers instead of text. When text is in a certain order, it tells you what your boss wants you to do today; all it is is letters in a certain order. So honestly, I try to think of a report the same way: numbers in a certain order, in a certain format, in a certain

47:44 Structure, that should help that person. The best report to me, or the best way I'm proving value wherever I'm at, whether I'm an analyst or a director or a consultant, is that you check your email in the morning and then you check those reports, because they're telling you what you're going to do that day. Right now the report is just the best medium, just like Google used to be the best medium. And that's the point I was going to make about the AI stuff: I guarantee you're not using Google as much anymore. And I bet you

48:18 Yeah. Just like you're probably not using Outlook as much now, because you're probably using Teams. And to boil it down to where we are here in 2025, in the middle of the summer: reports are still a great medium. And when you talk about better, about the most optimal way, I think you boil it down and it's still semantic models. It might not necessarily be your dataflows gen 1, as Balaz said, especially because, like I said, there are a lot of other technical decisions: are you

48:51 Going to do a lakehouse, or are you going to do a SQL database? But the core, Mike, is that just like you, my solutions are centered around a semantic model, because of how powerful it is and because it has a lot more outputs besides just a report, though the report itself is still a huge part. So when I say better now, I'm considering the best way to create a semantic model, the most flexibility for a semantic model, with the budget that I

49:24 Have, with the capacity. If I were to boil it down, that's where I'm at. Yeah, I would agree with you on that one as well, Tommy. Again, it's a balancing act. With Fabric we've been given a lot more capability; there's a lot more at our disposal, and today we're talking about the medium of reports. But agents are changing things. How we interact with our data: we're on the elbow of something brand new here

49:56 That's going to change again very shortly. And one thing you'll notice coming out of this is that all the agent stuff you're building is in your browser, not on the desktop. So agent stuff is going to require you to stay in the cloud, and I think we're also accelerating this move: we need to be able to build more in the cloud; everything has to live in the cloud. And now, with the PBIP format and Git synchronization, you can easily build in the service and not worry about

50:28 Losing your changes. You can always download the report, even if it's just a thin report, and edit it in Desktop if you want. One of the main tenets from Rui on the PBIP format is that everything you can do in Desktop is 100% compatible with the PBIP format, and it will all be available and built for you in the powerbi.com service. So when I look at this question, and if we go back to answering these questions in rapid succession: what does the optimal architecture look like for a

51:00 Dataset and report development in Fabric? I think you stay very close to the same thing. You could still use dataflows gen 2; the principle is the same: land the data somewhere. You can land it from a dataflow gen 2 into a lakehouse. The only thing that really shifts a little is that in the previous world, the dataflow gen 2 or gen 1 was both the processing and the storage of data. We've replaced that with the lakehouse. The lakehouse is becoming key to whatever you do: the semantic models and the SQL servers can talk to the

51:34 Lakehouse; everything you write with data transformations can write to the lakehouse: notebooks, pipelines, dataflows gen 2, everything writes to the lakehouse. So I think the ideal architecture is: it doesn't matter how you get the data in, it all lands in the lakehouse. That's where you're going to put it, stored in Delta format. That gives you the most flexibility for the next thing you want to do with it. We're still going to stick with semantic models, so however you get it from the lakehouse into the semantic model, that's still the same. Now, there are different

52:06 Patterns as it gets more evolved, like landing raw data and then processing it. It's still the same pattern you would use for dataflows, just now on top of the lakehouse. So my preference here is: load the data in with some tool, and you could still use dataflows gen 2. You're going to have to store it somewhere, and I think the ideal place is a lakehouse. I would even argue now it's a lakehouse and maybe some SQL databases. I prefer the lakehouse. The SQL database, I've been finding some really good things about it, but it's still a little buggy in some

52:38 Situations. I've had times where I land data into the SQL database and it doesn't turn on, and my reports all go blank. So I've had some weird bugs on the SQL side of things, which has deterred me a bit. But regardless, you're storing the data somewhere, and then your Power BI semantic model. And I really like Direct Lake; the Direct Lake experience is really interesting to me. So it's always going to be a semantic model. Now, the output: the question here is, what is the optimal architecture

53:13 For dataset and report development? Yes, it's still semantic models and reports, but I don't think it will stay that way for very long. I think there will be a lot of other new experiences around agents, agents talking to your semantic models to get information out for you, and that will be very useful for us as well. So those are my ideas on it. Look, it doesn't change much; we just have different tools. The core principle is: load it in, store it, semantic model it, and get it to a report. That's still the same; that does not change. Some of

53:46 The tooling is slightly different, and you can pick different flavors of things depending on your department and your organization. If you're a SQL-heavy organization, fine, let them use the SQL databases inside Fabric; no problem, you're catering to your team's skills on the SQL side of things. If your team understands Python, or you feel your team can clear the threshold of learning notebooks, push them toward notebooks. Or if you're cost sensitive, consider the gap going from dataflows gen 2 to a

54:19 Notebook loading data. You have Data Wrangler, a very similar experience; it doesn't do everything as well as dataflows gen 2, but you've got Copilot now, and you can ask Copilot what you want to do and it will help you get through it. I can't tell you the number of organizations I've worked with that started with dataflows gen 2; we've worked together, learned some simple things in Python, and learned how to leverage the agents to teach us what we need to build. And here we are a couple of months later: we're now deleting

54:52 Our dataflows gen 2 and replacing them with notebooks, and we're seeing the CU consumption drop and more efficient use of our data. So I really do think the mentality is: just build it so it works; once you understand how it works and how it's built, then come back and start optimizing. And Mike, I don't think you could be more on point here. I know saying this is the equivalent of Frodo putting on the ring, and Alex Powers is going to hear us, but I think there's a move away from dataflows here, because the biggest things that changed

55:25 To me are what you said about the lakehouse, and notebooks. They are so powerful, and if you are intimidated by them because they sound like the data engineering world, take some time, because they are so immense, so powerful in the Fabric process that it's really hard to argue against them. Honestly, I don't know if there is any good argument against incorporating them at some point into your workflow, if not the base workflow. There's nothing you said that

55:59 I disagree with, and on the other side of the coin, I am doubling down on what you said. I think the core part is your semantic model and the lakehouse, and you're probably introducing notebooks here rather than the multiple things you did with dataflows gen 1. It's had a great life. Not saying in all cases, but again, if you're on that cusp of, do I really need to start learning notebooks? I think you

56:31 Do. I think this is the time, whether you're somewhat there or not there at all. I don't know if there's a good argument against it. But at the end of the day, the process stays the same: we need to get our data in, and there's a better way to do that now with lakehouses. We need to do some transformations; maybe you do that in the semantic model. But what the lakehouse is, and the legs, or the tentacles, that a lakehouse

57:03 Has: it can go to a semantic model, it can do Direct Lake, it can touch my applications, it can touch other systems because it becomes a SQL endpoint. That makes it so powerful that the default is a lakehouse to a semantic model, but then you have so many other options, going back to when your company defines better for how you want to see your data. So I'm in complete agreement with you here. The only thing I would add to what you said, which I think is really the thing people need to

57:35 Incorporate into their process for better, is that you're no longer in the Premium Per User playground, your PPU playground. You have to be very conscious of how many things you're creating and how complex they are. I've been burnt big time, where I had to increase my capacity; I couldn't even access Power BI because of a single dataflow. That tells you how good I am at what I do, I guess. But the biggest thing is, this just goes back to... Yeah, I'm with

58:08 You. Yeah, but this goes back to optimization, right? You were doing something initially just to get it done. You knew dataflows gen 1 or gen 2, and you just wanted some output, and you would get some output. And again, you were also limiting yourself because you were on an F2. If you were on a larger F SKU, you would have had no problem; it would have just run and you would have been none the wiser, so you would have kept it and never changed it. So it's when you're trying to

58:40 Tune and optimize that there's now better tooling. So I compare Dataflows Gen2 against the internal Fabric tools. You could compare Dataflows Gen2 against Talend or other visual-based data engineering tools, and yes, it's a very comparable product to those tools, and those graphical-interface tools are very expensive. My opinion here is: look, the game that we're playing has changed. I'm no longer comparing whether I use Dataflows Gen2 versus Talend. I'm actually comparing whether

59:13 I use Dataflows Gen2 versus a Python notebook, or a user-defined data function, or something else. So when I look at the information I'm given, the set of comparisons is totally different. And I think the pricing of Dataflows Gen2 is just wrong. It's too high. Get away from it, move out of it; there are better, more efficient ways. As a user, I want to be able to move gigabytes of data as cheaply as I can. That's what I want to do. And these new tools put in our hands, Python notebooks, Spark notebooks, they're much

59:47 Better designed. Even dataflows and all these pipeline things are much more efficient now than they used to be. So to me the whole game has changed, and I love this game. I was studying this in college in my master's degree, so it's been really nice, because I love the BI space. I understood semantic models, and all of a sudden all this big data technology shows up to work with Fabric, and I'm like, man, this is my two worlds colliding: my love of Power BI and my love of big data

1:00:21 And technology, and boom, now we're here in the same place and I can do these really incredible projects for a fraction of the cost I could before. I think a lot of people are in this space too because they love it. Can it be stressful learning a new language, a new tool, a new platform? Yes, but I think that's the fun; we all get some sick joy out of it at the same time, Mike. Because look back at what Power BI was even four years ago, or what Fabric was four years ago. We're constantly navigating what the best practices are, and I think Balaz's

1:00:53 Question here, and this will be my wrap-up: let's ask this question next year. Let's do the same thing in July of 2026 and give Balaz our annual review of what better means. I have a feeling, I'd put money on this, that this is always going to change. This answer is never going to be the same, because our technology is going to be different. Microsoft may come out with something in October and we'll go, notebooks are way too expensive now. It could happen. It could totally happen. It

1:01:26 Probably will happen in a few years, but at this point we have to play the cards we were dealt. This is a question that's always going to change, but I think it's a race to the bottom at this point. Yeah, seriously, when you look at Google, AWS, and Microsoft, it's a race to the bottom. It's going to be who can do the most data processing at the least cost. That's what we're going to get to, and we're already seeing the race occurring. And now the race is also occurring inside of Microsoft itself, right? Each internal team, like the Dataflows team, is a different team than

1:01:59 The notebook team. The notebook team is a different team than the SQL team. They're internally competing against each other to give you the right experience at the lowest cost. So it's going to change. Good for us, the consumers, because we now have options, whereas before we had very few. But it changes things. So yeah, I like this question; it's a great question. I think you're right, Tommy, we'll have to revisit this later and ask what better looks like then. If I had to answer these questions directly:

1:02:31 What does the ideal or optimal architecture look like today? Land it in the lakehouse, get it into a semantic model, and get it into reports. That's what it is right now. I don't think the pre-Fabric design pattern using Dataflows Gen2 is optimal anymore, and I'm looking at it purely from a CU, compute, and storage standpoint. There are better solutions now. So you need to change your architecture to get lower-cost, more optimized solutions, so you can buy a smaller F SKU and serve more people. I think that's where we're at right now.
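The "land it in the lakehouse, shape it, then feed the semantic model" pattern Mike describes can be sketched with a tiny example. This is not Fabric-specific code: it uses pandas locally as a stand-in for the kind of cleanup step a Fabric Python notebook might run before data lands in a Delta table, and the column names and values here are hypothetical.

```python
import pandas as pd

# Hypothetical raw extract as it might land in the lakehouse "files" area:
# duplicate keys and untyped text columns are typical at this stage.
raw = pd.DataFrame({
    "order_id": [1, 2, 2, 3],
    "amount": ["10.5", "20.0", "20.0", "abc"],
})

# Shape the data BEFORE it reaches the semantic model:
# deduplicate on the key, coerce types, and drop unparseable rows.
clean = raw.drop_duplicates(subset="order_id").copy()
clean["amount"] = pd.to_numeric(clean["amount"], errors="coerce")
clean = clean.dropna(subset=["amount"])

print(len(clean))  # prints 2 (order_id 1 and 2 survive; "abc" is dropped)
```

In a real Fabric notebook the last step would write `clean` to a Delta table in the lakehouse, which the semantic model then reads (via Direct Lake or import), rather than pushing this transformation work into the model itself.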

1:03:04 And those are basically my ideas. The last question I'll note here is: does Power BI Desktop remain the primary tool for dataset development, or will the Power BI service displace it? I think the answer is no. I think Power BI Desktop will always be a premium tool for professional developers; it will always be there. But I see the service getting a lot more capabilities. You don't build notebooks in Desktop, you don't build pipelines in Desktop. There are a lot of other Fabric experiences where you do data shaping and manipulation before you even get to the semantic model, and

1:03:38 I would argue there are only a couple of feature gaps between what Desktop does and what the service can already do, and we're not far away. If you notice what they're building, everything Microsoft is building is service-based. There are so many more teams building service-based pieces. There are only a couple of feature gaps, I think, in Desktop that you physically can't do in the service. And once those gaps are closed, you're potentially going to see Desktop updating slower. You're potentially going to see less development around

1:04:11 Desktop. And I think you're going to get a better agent-integrated experience inside the service. So I'm not going to answer that last question directly. All I'm going to say is: Desktop team, if you're listening, I'm rooting hard for you. But honestly, it's a good point, Mike. We have to be comfortable in both spaces. They're still putting a lot of work into Desktop, because you can still edit a Direct Lake model using Desktop, which is surprising if you were 100% right, because if it was

1:04:44 All service, then why would they put that feature out there? But I agree, the biggest point is: if you're comfortable with Desktop, it's still going to be around. I don't think it's going to go as fast as dataflows have. But where is the primary tool? I spend more of my time in the service; that's probably the truth. I do a lot of report development in Desktop, but the first place I go for modifications is usually the service, because that's where everything lives. It's a good point.

1:05:16 But Mike, I really want to revisit this in maybe six months. Things change. We already have DAX query view in the service. We already have a lot of the other features. I can make models; now wait till TMDL gets there, right? Once there's a TMDL editor in the service, and I think that's probably part of the roadmap since it makes sense to get it there, one of the major reasons I go back to Desktop, getting into the TMDL view to see what's going on, goes away. We're getting really, really close to not requiring that anymore. I don't need to build reports in Desktop

1:05:49 Anymore; almost all those features exist in the service. Now, I will argue this: in service editing of reports, you don't get any of the new visuals. The new lightning-bolt card and all those visuals do not exist in the service when you edit. So, back to your point, Tommy, there are some very distinct limiting features that the service does not do that Desktop does. But I'm also at the point where I don't want to be downloading an 800-megabyte Desktop installer every time I want to run Desktop. It's getting absurd; we're almost at

1:06:22 A gig now. I remember when Desktop was like 50 megabytes. I still have versions of Power BI Designer on my desktop, very early versions from back in 2015 when it came out. We should do a video on that: just open and install it and see what it does. I don't even know if we can; we'd have to run it on Silverlight, and I don't even know if that even exists anymore. Type a command into MS-DOS. I don't know. So, we'll see. That would be a fun episode to do as well,

1:06:53 Just go back and reminisce around that one. But that being said, it'll be interesting to see where we go from here. Regardless, I think if you attach yourself to Power BI and Fabric, you're going to have a good career; there's a lot of new development. And as a general rule of thumb, you're probably going to want to move out of the Pro and premium-per-user spaces into the Fabric space, because it provides so many more capabilities. The challenge for you as users will be to get your leadership to buy in, like, this is something that we should invest in. And

1:07:25 I think we're getting closer; the feature richness is getting there. And you can use your existing tools: you can use your existing Snowflake, you can use your existing Databricks, to support data coming into the Fabric world. But when you talk about access and control, you're going to want to use a single system, which will be Fabric. It's becoming very robust; there are a lot of good features coming out. Anyways, awesome. Thank you so much for listening. This was a long episode; thank you so much for spending your time with us today. I hope you enjoyed this conversation around the optimal architecture in Fabric. I hope it gave

1:07:58 You a couple of things to think about or unpack with your team. That being said, please share this with somebody else if you found this conversation valuable. Tommy, where else can you find the podcast? You can find us on Apple, Spotify, or wherever you get your podcasts. Make sure to subscribe and leave a rating; it helps us out a ton. And please share with a friend, since we do this for free. Do you have a question, idea, or topic that you want us to talk about in a future episode? Head over to powerbi.tips/mpodcast. Leave your name and a great question. And finally, join us live every Tuesday and Thursday, 7:30 a.m. Central, and

1:08:31 Join the conversation on all the PowerBI.tips social media channels. Awesome. Thank you all so much, and we'll see you next time.

Thank You

Thanks for listening to this deep dive on Power BI architecture in the Fabric era!

Want to catch us live? Join every Tuesday and Thursday at 7:30 AM Central on YouTube and LinkedIn.

Got a question? Head to powerbi.tips/empodcast and submit your topic ideas.

Listen on Spotify, Apple Podcasts, or wherever you get your podcasts.
