Microsoft Fabric Feature Pyramid – Ep. 462
If you had to rank Fabric features into a pyramid, what sits at the top? Mike and Tommy build their feature pyramid live, debating which capabilities are most essential and which are nice-to-haves. A great framework for prioritizing what to learn and adopt.
News & Announcements
- Calendar-Based Time Intelligence (Preview) — Tailored time intelligence that works with custom calendars, addressing one of the biggest pain points in DAX development.
- LLM-Powered Data Agent from Mirrored Databases — Data Agents can now work with mirrored databases, expanding the reach of AI-powered data access.
- TMDL View Generally Available — TMDL View in Power BI Desktop hits GA, making text-based model editing a first-class experience.
- DAX UDFs Preview: Code Once, Reuse Everywhere — Official announcement of DAX User Defined Functions in preview.
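For context on the calendar item above: here is the kind of conventional, hand-built Gregorian date table in plain DAX (an illustrative sketch only, not the new preview syntax) that calendar-based time intelligence aims to generalize beyond:

```dax
// Illustrative sketch of a conventional hand-built date table in plain DAX.
// Calendar-based time intelligence (preview) is meant to remove the Gregorian
// assumptions baked into tables like this one, so 4-4-5, fiscal, or lunar
// calendars can be declared as time attributes instead of hand-encoded.
Date =
ADDCOLUMNS (
    CALENDAR ( DATE ( 2020, 1, 1 ), DATE ( 2025, 12, 31 ) ),
    "Year", YEAR ( [Date] ),
    "Month Number", MONTH ( [Date] ),
    "Month", FORMAT ( [Date], "mmm" ),
    "Start of Month", EOMONTH ( [Date], -1 ) + 1
)
```

The column names here are arbitrary; the point is that every structural assumption (contiguous dates, 12 months, month boundaries) is hard-coded, which is exactly what the preview feature is designed to make declarative.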
Main Discussion: The Feature Pyramid
Mike and Tommy build a pyramid with the most important Fabric feature at the peak:
🔺 Level 1: The Most Essential (1 Feature)
The single feature they consider most critical to the entire Fabric platform. This is the foundation everything else builds on.
🔺 Level 2: The Intangibles (2 Features)
Two capabilities that may not be flashy but make everything else work better—the glue features.
🔺 Level 3: The Core Features (3 Features)
The everyday workhorses that most Fabric users interact with regularly.
🔺 Level 4: The Supporting Cast (4 Features)
Important capabilities that round out the platform but aren’t the first thing you’d adopt.
The Value of Ranking
The pyramid exercise forces prioritization:
- What should teams adopt first?
- Where should training budgets go?
- Which features are prerequisites for others?
- What can wait until later phases?
This is a useful framework for any organization building their Fabric adoption roadmap.
Looking Forward
The feature pyramid will look different in a year as capabilities mature and new ones emerge. But the exercise of ranking and prioritizing remains valuable—it prevents the “boil the ocean” approach that derails many platform adoptions.
Episode Transcript
Full verbatim transcript — click any timestamp to jump to that moment:
0:00 Good Morning and welcome back to the Explicit Measures podcast with Tommy and
0:32 Mike. Welcome back everyone. Happy to have you back. Hello. Happy Thursday, Mike. How you doing? I’m doing well. And it feels like I can’t believe what like this has just been a crazy week. I’m just getting back from the Fabric Conference in Vienna. Lots of news and announcements. I am overwhelmed with all the new toys, the features, we get to play with. I’m just there’s just so much stuff going on. And it’s also interesting and fun seeing the community gravitate to which features they’re liking or they’re reacting to a little bit here right now. So, which has been really
1:04 Fun. I’m I’m totally enjoying that. All right, Tommy. Today’s topic is going to be a Microsoft fabric feature pyramid. I think I said that right. I didn’t I didn’t stumble on my words there. This was a creation out of Tommy’s mind. So, if you love this episode, it was from Tommy. If you hate this episode, it was from Tommy. So, way to put it right off the bat. Go. No, I think it’s going to be a good episode. I’m actually excited about this one. I hope I can come up with good good things that we can talk about today. I don’t think we’re going to get as far as you think we will, but go ahead, Tommy.
1:36 What is this? Like what are we doing here? What are we doing today? I’m excited about this. So, we’ll we’ll dive into it too, but the idea simply is this. There’s levels to a pyramid: level one, level two, level three, and what’s most essential for us. And we are taking the gamut of features from DAX to semantic models to bookmarks to drill-through to pipelines. What is most essential, most important to us? What do we love the most? So it’s a feature pyramid. Level one is the pinnacle because there’s only one. Obviously, if you look at a pyramid, then there’s two supporting that and so forth and so
2:08 Forth. So we’re basically ranking all these features, but again, in the shape of a pyramid. So really, I I can’t wait to get into this. I think we’re gonna have fun with this. All right. Yeah, I agree. That’s going to be our main topic today. But before we get to the main topic, we have a number of news items, again coming out of Vienna. There’s still a lot of news coming out. There’s blogs still being posted on the Microsoft blog. Things were announced, and now the blogs are, like, the Microsoft team has been able to get back to the office and, like, finish the blog and complete the announcement and get everything else. So, we’re seeing a lot of new posts, I guess, coming out here recently. So,
2:42 Tommy, give us our first one. Yeah, we have a lot to run through, Mike. So the first one is something called calendar-based time intelligence in PowerBI. This was on the PowerBI blog, which we don’t see as many updates on. Yeah, a lot of you may be thinking, Tommy, Mike, we have time intelligence. It’s called DAX and it’s called a calendar, or having a date table. But this is a different approach. It’s not just a date table, and it’s simply allowing you to signify and define what columns represent specific time attributes.
3:14 Again, something that was available in PowerBI. This doesn’t sound new, however, this works with any type of calendar. Before, we were stuck with your normal Gregorian calendar, and this also works with 13 months, lunar, and more, and there’s no assumptions. PowerBI doesn’t impose any structural rules. It can be as custom as you need. Sparse dates are supported. So continuous dates, which were always required, are now just recommended. So take that as you will. Weekly based calculations like total
3:48 Week-to-date is actually now a function, and again some performance gains as well. It looks very similar to the feature before. There’s not a ton here. There’s some documentation, there’s some validation rules, and there’s some more documentation as well. Also works with TMDL, which we’ll get into as well. And yeah, just this mapping, it’s a lot more integrated than the feature before, which was like, hey, what calendar’s your date table? What column’s the date? There’s a lot more to this. So hopefully making date and especially
4:21 Time intelligence, which we all do, easier. So I I again, at the end of the day, Tom, have you ever worked on a project that had a 4-4-5 or 5-4-5 or whatever the heck numbers they make up these days? Have you have you worked on one of those projects? Dude, I’ve worked on those and the fiscal year ones, which are a pain, and those are the ones you go, okay, I’m like I’m gonna do it because they asked me, but I don’t want Yes. What are we doing? Okay, so there’s also this concept. Again, let me jump into this
4:52 One. This all looks like a lot of these calendars are being built in DAX, right? So, this is a DAX view of a calendar, right? So, you can do different calendar types. You have you can pick the ones that you want to design. This is one of these areas that I’m going to be like, I’m just going to wait till Marco Russo tells me the ones I need to go use and I’m just going to let him build all of it and then I’m just going to go pick the one that I need. I I don’t want to spend a lot of time effort learning this stuff, but the fact that it’s here, this makes it way more easy for us to work with these like kind
5:25 Of oddball calendar things. And it’s it always goes back to like when you’re looking at a business, right? You want these really comparable year-over-year things and you’re trying to get the right structure of a calendar and months don’t always align. Random thought, Tommy, this is totally random. I had heard at one point that if you made 13 months, every month would have like what 27 days, 28 days. Okay, that would be
5:56 That’d be ideal: have 13 months and then everything like is nice and easy and it all fits together very well. Oh, and compared to having days that are like 31 days and then 28 days and then 28 days and like Yes, exactly. All these different day counts, like we have 12 months for for whatever reason. Had they just done 13 months, it would have been like a nice even every month has the same amount of Right. Does that boggle your mind? It boggles my mind a little bit. No, I don’t know if it would bother me. I don’t think it bothers me now. I don’t think I’ve ever looked like, “Oh my gosh, September, I thought it was 31. Now it’s 30. My whole week’s off.” kind
6:29 Of thing. So, I see what you’re saying, but I don’t think it’s been too much of a pain point in my life. I think you just got used to it. But I think there’s like a there’s potentially a better way. Anyways, this is one of these things. It’s like there’s a better way. We didn’t even know it existed. You just dealt with it. I think this is, for me, one of the things that I do in designs for models: date calendars are not too egregiously large or extremely big. Maybe they are a little bit, but shouldn’t be. I I me personally, so so let’s think about like the model design, right? When
7:02 You build calendars in things, you can build calendars with your backend, right? So you can build it in your lakehouse, you can build it in your Power Query M, you can build tables in in upstream, and then when you get to the semantic model you can relate those date tables to other columns and things that are inside the model. I’m of the opinion, like, the date calendar, yes, it doesn’t compress as well, because it’s a date calendar, and if it’s written in DAX, maybe there’s a little bit more performance you get out of it. But at the end of the day, I’m like, I don’t mind the date calendar just being written in DAX. I do. I do. I I I I
7:39 Actually have a problem with that. Not a problem. It’s a problem actually, because with what we have available to us today, I would rather everyone use the same date calendar. And yes, they’re more or less the same continuous dates. You probably have start of month, but it’s just like one of those things you want to give to the organization, and probably some custom things that you have as well for your organization. So, it’s fine for ad hoc, but if I’m building out there, like how many people in your organization are building like okay, look, if you’re building like a
8:12 Weird calendar like a 4-4-5 or 5-4 something, right? Maybe, but like at the end of the day, when you’re doing reporting, calendars are pretty known quantities. Yeah. I I don’t really care to do a whole lot with that. And I I will say for beginner users, turning on auto date/time intelligence is super helpful. No, what? No, for beginner users, absolutely. It makes a bunch of extra junk calendars in the model, which is not good. But like I will I will argue the fe like the
8:44 Feature is good in the fact that auto date/time calendars are nice to work with. I like me personally I like having a date calendar that just has basic year, month, day information in it. I like that. I think that’s helpful. What I don’t like is when you turn on auto date/time calendar, it makes like a million of them and they’re all over the place and your model can get really large. And if you have a date that’s not continuous, or it has like this 9999 date or 1900 date, like your your date calendar blows up and it causes other problems. So the in the
9:17 Intent, what I’m trying to say is, the intent of the calendar is the right intent. How it got implemented, and how all the date calendars have to be like a consistent list of dates, is not a good idea. And that’s why you push more towards like just build the date calendar, do it somewhere else. I like building it in DAX. I think that seems like a fair compromise. I don’t like to use auto date/time, but like the the concept of wherever there’s a date column, you could just easily pick the the parsing of that date field, is
9:50 Nice. The problem I have with the auto date time outside of the bloat that it provides your model is when you add it to a visual. I hate how it looks in the visual because it just shows you every month but it’s not the continuous dates. So it shows you 2016 2015 or 2015 2016. You go drill down you want to see the month January February March April. Doesn’t show you like 1 2018 or 11 20 25 and then the next month month year it’s month or quarter. I hate that view so
10:23 Much that doesn’t do anything for anyone. So it’s the visual side of it that I have the problem with. So you you err on the side of making additional columns in your date calendar, you have like start-of-month type dates in there, that way you can you can format it the way you want and put it in there correctly. The ability to create a date table should be available to any PowerBI user, just like you give a mouse and mints to someone on their first day of work; it just should be on their desk when they get there. It’s too easy to do. It’s a simple gesture and it just should be part of it. When I start a
10:57 New job, I have a date calendar and I have a card saying welcome to the company. It should be one and the same. That’s interesting, Tommy, because I I didn’t really think about the implications of using those auto date times on the the like you can get around it a different couple ways, but you got to be smart about how you handle those things. Again, back to your point, Tommy, like the issue the issue really isn’t like the auto date time. The issue is you can’t have more enhanced information about multiple columns. Like I can’t have the same column formatted three different ways in the same visual
11:31 Is an auto, it’s got to be a new column inside the date calendar. Yeah, I can see your dates are just weird. I I would agree with you. It adds bloat where it doesn’t necessarily need to add bloat to the model. It feels like this should be a a solution that can be solved slightly differently. Anyways, yeah, I think we’ve said enough about dates, but yeah, I agree. Regardless, at the end of the day, defining different calendars I think is useful. I do think, from what I understand, this is defining things. Again, I’ll have to go into the feature a bit more. So, again, don’t
12:03 Quote me on this one, but using this new item, this enhanced DAX date calendar intelligence, there is some more work to be done in DAX to make these calendars work. I think so. It’ll be interesting. I’ll see where it goes. I like the idea of it. Marco Russo seems to be very excited about it. I’m okay with it. All right. So, let’s touch on the last two here. So, one I’m just going to touch on. I just I love that this is something that they wrote about, something they’re doing. The blog article title: unlocking LLM (large language model) powered data
12:36 Agents from your mirrored databases. Which basically, what we’re doing here is rather than I can only use things in my lakehouse, things I’ve generated in Fabric, I can use any database in Azure, Oracle, Snowflake, SQL Server, Databricks, Databricks catalog, SQL database, Cosmos, and I can mirror those databases like I can now, but I can use that with Data Agents. So my data is coming from somewhere else. I can use that with the Data Agent. This is just Mike. Again, we we’ve talked about LLMs. We’ve talked about Data Agents and
13:09 Again, they run on data. The reason why, if it’s not working for your company right now, it’s because of your data: you either don’t have enough or it’s not structured the right way. So, give me your data. Give give the data to the agent and let’s feed it so we can tune it. So, this is a very important feature. So, this feature just came out. Actually, just yesterday, for those of you who are not members (if you want to become a member, we’d love you to become a member of the YouTube channel here as well), we just did a full building-a-data-agent session online. We did we did
13:42 A whole how to build the agent, how to create one, how to deploy it, how to consume it. So, we have a whole hour-long session around building you the data agents and creating one and all the the UI that goes along with this. So this is just adding a lot more data sources. I believe the the original data sources were you could use a lakehouse, you could use a semantic model, and you could use Kusto. Those were the initial three sources. And now with this you get a whole bunch of other stuff. Everything mirrored. All the mirrored things you can go get, which is going to be powerful here as well.
14:14 Thousand percent. So happy to see that. And Mike, let’s end on a pretty good one. Pretty good one. We’re going to I think I don’t know if we have your sound available, but TMDL View is now you may you got to wait for me to hit it just TMDL just TMDL. Mike loves TMDL so much, just anyone who says TMDL, we hear it, so exactly. Well, let’s get two of them, because TMDL View is now generally available. Yes, this is definitely very worthy here. It has changed how I build things honestly
14:46 I was just going through a model the other day, having to write some documentation, and it was way faster just going right to the TMDL, dragging the tables I wanted, and within 30 minutes, I did something that would have normally taken me longer than that, an hour, maybe more, just to write in some comments, adjust some things, format some stuff. Much much better. Really, really like it. And there’s an interesting concept here, Mike. I I don’t know how much you dove into the article, but one of the things they talked about is PowerBI desktop is now hardened to support opening and editing any semantic model. Mhm.
15:17 Which for me, I haven’t heard the word hardened before in PowerBI desktop. So, good thing. Not in a long time. So Rui, who obviously understood that most people would know what that means. What does it actually mean for PowerBI desktop? And again, it’s not just TMDL, it’s desktop being hardened. And it simply means tools like PowerBI desktop no longer assume that they are the only tools editing a semantic model. So they now support collaborative editing scenarios with external tools or code file editors or AI. So this is interesting, Mike, that
15:50 We’re taking the concept, and what TMDL has done is not just a new coding language for PowerBI desktop, but it’s also changed the way that PowerBI behaves with the semantic model. Mhm. Feels like we’re entering some middle philosophy here. But it’s simply, again: before, PowerBI supported external tools, but it still assumed, and it still worked and behaved, as if it was the primary source for any semantic model. Now PowerBI desktop is treating any semantic model that you load as if it could have come from anywhere. For
16:21 Example, VS Code. I could theoretically generate my entire semantic model in VS Code using an agent, or copying and pasting some files and getting everything set up that way. So when PowerBI desktop opens it, right, it’s not like, hey, this is not from desktop. Before, this was a binary file that we’d done something to, to make it look nice. Now it expects, or assumes, that it can come from other sources. So you talk about the developer role here, and this is exactly on point, completely on point, with what you
16:53 Talked about around PowerBI becoming more of a developer tool. The other two things that I think I want to note here, to your point, Tom: the hardening of desktop, why this is so important. The last two items in the list here I think are very important. It says beyond these capabilities, the hardening also unlocks two major PowerBI features: the ability to download an XMLA-altered semantic model as a PBIX file. Again, now that there’s a consistent standard across all these files, you don’t need to download the data anymore. You can download just the PBIP version bundled as a PBIX file. So what I think you’re going to
17:28 Start seeing here is the PBIX, the standard version of a single file that represents a model and a report, will now just be, under the hood, the PBIP format, which is incredibly important. So that’s another thing that’s really useful, right? If you’re using third-party tools to manipulate things in powerbi.com, notebooks, changing things, now there’s no blockers. This is just unblocking you for a lot of other pro tooling that can help you build and edit things, which I think is extremely useful here.
18:00 So now I want to point out no so I want to ask you a question. Now that this is generally available, and this is what I always like to do with you, Mike: the scenario of, since this is now GA, this is now part of the lexicon, as common as DAX. When would you expect someone you’re hiring, someone you work with at a company, they are going to be a data analyst, they say they’re specialists in PowerBI: should they know TMDL? Yes. Okay. I they they this is like a this is maybe on my pyramid somewhere
18:36 Just so, this is something that I think is going to be very useful for people just in general, just to make sure they know how it works. Again, there’s this idea of like just build it so it works, and then there’s this idea of optimizing and getting faster and more efficient on things. To me, TMDL is one of these optimization pieces, right? Yes, you know how to build a semantic model on desktop. Fine. When you’re building smaller models or going quickly, not a big deal. When you’re managing a large amount of models, or models that have to be broken apart, excuse me, it becomes much more
19:09 Challenging. And so I think as you become more of a professional developer around these these experiences, you don’t want to have clicky clicky buttons all the time. It just slows you down. And I think there’s there’s something coming here, Tommy, that I don’t I don’t think we all see yet. And I’m going to make another call here just like I did when when I said, remember how in episode one? Yes. Yes, man. We are going to ride that to the sun. Ride that to I’m going to ride that forever. Yeah. Goodness. That was like vision like no other. Like I had I dialed it. You’re like someone who got three
19:40 Numbers right on the lotto, like remember that one time on the lotto. The one time I won the lottery. Yeah. 25 years ago. Yeah. Four years ago. Yeah. That was pretty good. Not bad. So I will say this. I think another monumental shift in how things are going to be built here, and this is not just PowerBI. This is this is a monumental shift across everything: the advent, or the invention, of the MCP. I think the MCP, we’re talking Model Context Protocol, it’s a term people are starting to use. Models are getting so much better at writing
20:13 Code. Mhm. I was working with my developers this last week and my developers were saying, “Look, it’s not that we can’t write code without a copilot or something, but we’re so much faster now.” Like things that used to take weeks are taking days now. Like it’s it’s that’s crazy. Isn’t that awesome? It’s awesome, but it also means there’s a level of knowledge, of like I need to be able to understand the infrastructure very intently. Like what what libraries am I using? How do I want the code to be written? Like what are the standards I
20:45 Need? So there’s a lot more that you have to do, like more instructions you have to give the AI to make sure that it does exactly what you want. Give it boundaries. But this whole concept of the MCP I think is going to be really interesting here. And this is going to change how we build things. It’s going to change how we communicate with computers in general. It’s going to substantially change what we do. And so again, going back to Jevons paradox again, right? When you make things very easy to build,
21:17 Create, establish, you’re just going to get a lot more of them, right? And so I think the more Microsoft pushes into, hey MCP, make me a Gregorian calendar, or hey MCP, make me a 4-4-5 for my, just build it, and it knows what we’re talking about and can build all the DAX for you automatically. When I was doing the data agents, Microsoft is investing in natural language to SQL, natural language to KQL, natural language to DAX, and as long as there was data inside the model, it was doing a pretty
21:50 Decent job of writing some DAX and SQL. That was on our our build-a-data-agent session yesterday that we did. Super impactful. I think it’s going to be useful. So I think you’re going to continue to see investment in this. Instead of programming things with like DAX and SQL code, you’re going to start talking to things and describing what you want, and it’ll just spit out code that you can review and then use. So I think this is going to be another major game changer here, and we’re going to start seeing a lot of tooling or tools show up where you’re going to communicate to it
22:21 And it’s going to just build stuff that it thinks you need. Not only do I agree with you 110%, but I don’t think that’s outlandish at all, what you’re saying. I think that’s just the start of this. This is just the start. So, no, I I think this is a lot easier win than your episode one that you got so perfectly. This is I think you’re right on point. Right on point. So, love to see it. So, I think it’s time to get into it, Mike. I am I feel like a school kid on their first or last day of school, depending on how you did in school. So, excellent.
22:54 Can you guess which one? Me. It was the last day. So, last day. Last day. I I was probably a last-day-of-school person as well. So, we’re going to do, Mike, this thing called the feature pyramid. And if you’re listening, follow along. Again, the simple concept: you look at a pyramid, there are levels to it. And we can only pick so many items. And again, we’re taking the gamut, any feature in in Fabric or PowerBI, and what’s essential, the most essential to us. Level one, the most important thing, but there’s only one feature. It’s the top of the pyramid. So, this is the echelon, the
23:26 Echelon of everything. Level two, we have two features. They are, in a sense, the second most out of what we can pick. The pyramid continues to go down. Again, even if something’s level five or level four, doesn’t mean it’s not important, because obviously we’re still picking from high up. But this is really from you and me, Mike, and this is our opinion, right? This is our way of looking at the Fabric and PowerBI world on what’s most essential. You can’t take this away from us. And again, we’re going to start at the top of the pyramid. Mike, do you have any
23:58 Questions about the pyramid and this idea before we begin? I think I’m pretty good. I’m pretty sure our pyramids are going to be way different and we’re going to have a lot of arguments and friction about what you’re picking on which level of what things here. But yeah, I I think this will be interesting to see where we land on on what features you’re finding are the most important features and how we get through these things. So, do you want to start off? Do you want to kick it off? How do you want to do this? And then we’ll jump from there. I’ll go first. I’ll give you I’ll just lob one up here at the beginning. This is level one. This This is number
24:30 One. This is the This is the most important thing you must know right here. This is this is the this is the clincher. Listen, come in come in close. Come in close. We’re going to talk about the most important thing that you must understand around Fabric and PowerBI. I’m going to lean on it’s going to be the semantic model. Okay, I think that I think that is by far the most important thing that you need to understand, get your head around, understand how it works. And I’ll give you some reasoning here. I’ll give you just my reasoning. I’ve been
25:02 Hearing some language around, especially from Databricks, and Databricks, boo on you. I like you, Databricks, but like you’re definitely, I think, throwing some misinformation here that I do not agree with. Databricks is basically saying, well, you don’t really need to throw your AI at the semantic model. You need to throw your AI at like tables. Go throw your AI at tables and things. I’m like, okay, great. I was just doing a data agents experience and I had a whole bunch of lakehouse tables. The AI had no clue on how to do the joins between different
25:34 Columns. So in in the data agent, I had to go in and describe: this table is related to that column, these two columns are related. That way, when it needed to build a join between two things in SQL, it understood the relationships between the tables. It understood how to join these things. So when I look at the semantic model, it is becoming the linchpin for everything you want to build, right? You have this whole world of data engineering. You have lakehouses, you have power query, you have data flows, you have
26:07 Dataflow Gen2, you have the warehouse, like all this data engineering stuff. Everything you do on that side of the world, the data engineering world, it all gets shaped and designed, and consistently it gets conformed into the thing that is the semantic model. Like that’s that is the way you serve the data, right? And in the semantic model, I capture the relationships between tables, I capture the calculations that I want with additional measures. I describe the columns and the names of the tables and where they came from. There’s this whole
26:39 World of like complex stuff that we do to shape data around, to go from a transactional system to a reporting system to the semantic model. And then on the other side of the semantic model, we then again proliferate back out to a lot of tools, right? So the semantic model becomes the source of data for the agent, the PowerBI report, the paginated report, the exploration. Like there’s all these other tools on the other side of the semantic model that use that in-memory cache to build everything. When I look at this world, I
27:11 Think of like there’s all this data engineering world and there’s all this reporting world. Both of those worlds have to exist: one feeds the semantic model and the other consumes the semantic model. There’s not much in between, in my opinion. So to me, when I when I step really far back and take a big-picture look at the ecosystem, one of the things Microsoft has done really really well is make the semantic model very pivotal for most if not all of your reporting you need in the
27:42 Organization. Now, let me just give a little bit of a caveat here. The caveat is there will be experts in your organization. There will be people who need to discover and sort through and and make flat, really wide reports and export data. Fine. I understand that that’s going to exist. I would rather teach them SQL or build something in the system. Here’s some shortcuts to some tables. Here’s a lakehouse. Go build what you want in Spark Notebooks. Like go do things. Like there’s there’s going to be some use cases where there’s going to be like advanced users that are going to not
28:14 Need exactly the semantic model, but if you’re talking about getting data out quickly, that’s where it’s at. And I think also, the more we push into this MCP, AI understanding what’s going on, you’re really going to want to push AI towards the model side of things, because it’s just going to get better results initially. Now, the AI may get better, and it may be better upstream into those raw tables as well, but you’re still going to have to define all the relationships. So somewhat regardless, at the end of the day, whether you’re doing
28:46 In Databricks or in semantic models or in the lakehouse or in Fabric, someone’s got to define the relationships, whether it’s in the Unity Catalog or it’s in the semantic model. I don’t care what you do. I just don’t love the fact that I have to have compute running in order just to run the SQL queries back to Databricks. I’d rather have a semantic model that’s just on, an I’ve-already-paid-for-it thing. So, I’m just going to pause right there. That’s I think my number one for me is the semantic model. No, there’s no question here, Mike. I wanted to try to take a hot take here
29:17 and choose something different, but all roads lead to the semantic model. I think that's my level one as well, the top of the pyramid. Because to your point, Mike, if you were to look at a map of America, the roads are the semantic model. It is your navigation system in Power BI and in Fabric. We're building data to support it. And this is why, no matter how much I wanted a hot take, a Skip Bayless moment of saying something just for the sake of saying it, I couldn't, because the idea here, Mike, is simply that everything we do around data is to
29:51 support the semantic model, or the semantic model supports something else. So it is the roads. If you were to look at a map, those roads are the semantic model. It is what navigates and drives us; it is how and why we build the tables that we do. Even what I wanted to choose as my level one, I couldn't, because the simpler this is, the simpler everything else is. It is our source of truth. And to your point, even if it were just reports, which it's not anymore, even if the semantic model only
30:22 supported reporting, it would still have to be number one. This is why most people are building lakehouses. And at the end of the day, a lakehouse is going to support a semantic model in some capacity. You may use it for other applications, but I guarantee you that data is going to end up in a semantic model somehow. It is how a company is going to drive its information. It is how they are going to communicate information. If you cannot get a grasp on this, and you do not have this down solid, everything else is going to fail. So it has to be the top of the
30:55 pyramid. I have no argument here. No matter how hard I try to choose something else, Mike, anyone who's working in data, anyone who's working in Microsoft's data platform, and again, this supports all of it, data engineering, data science: you have to have a solid understanding. And actually, I think that's conservative. I think you need to master the semantic model. It is, bar none, what drives us and what drives the data. So no, I think there's no argument there, man.
31:28 And I've been doing semantic model work since it came out, right? So I started at the very beginning, and there were a lot of aha moments along the way as I built semantic models. I've been doing Power BI since it came out in 2015, so we're now at 10 years, both you and I, Tommy, just noodling on models and working with them. Twenty years of experience between us. Yeah, wow, that's a waste of time. So between our 20 years of experience, there's a lot of knowledge around those pieces. And again, I keep coming back to this: I don't like
31:59 writing really difficult DAX anymore. I like making better tables. I like making more tables. I like simplifying the data engineering side. So I lean on all the phrases I've heard from Microsoft, like Matthew Roche's maxim: transform the data as far upstream as possible and as far downstream as necessary. Such a great point; I totally agree with it. So the point I'm trying to make here is that with semantic model learning, you just need to get going. You need to get into it and start working with it. And what you'll find is you'll
32:31 figure out some of the basics, you'll have some really big aha moments, and then over time you'll just become more comfortable with it. So the sooner you can be comfortable building semantic models and understanding them, the more effective you'll be as a data developer. Right, and this may be the smartest thing we've ever said. We've got 20 years of Power BI and Fabric experience, 10 years of MVP experience, and we both came to the same conclusion. So I think that's
33:04 a perfect way to start this off with level one. I'm going to jump into level two, Mike. All right, you do level two, and then we'll see. That means there are two features here. This is probably where we're going to diverge: I think one of them we're going to share, and the other we're going to differ on. So again, level two, two features. My first one is going to be DAX. This is what I wanted to put as my level one, because if the semantic model is the heartbeat, then the brains are DAX. Now, I know you mentioned that you don't want to write a lot of complex DAX anymore.
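[Editor's aside: the property that makes DAX so central to the semantic model discussion here is that a measure is an aggregation defined once and then sliced by whatever dimension the report author drags in. This toy Python sketch is only an analogy for that idea, not how DAX or the VertiPaq engine actually evaluate anything; all the table and column names are made up.]

```python
from collections import defaultdict

# Toy fact table: each row is one sale.
sales = [
    {"region": "East", "year": 2024, "amount": 100},
    {"region": "East", "year": 2025, "amount": 150},
    {"region": "West", "year": 2024, "amount": 200},
]

# The "measure": aggregation logic written exactly once.
def total_sales(rows):
    return sum(r["amount"] for r in rows)

# The "slice": pick any dimension at query time; the measure is reused as-is.
def evaluate(measure, rows, dimension):
    groups = defaultdict(list)
    for r in rows:
        groups[r[dimension]].append(r)
    return {key: measure(group) for key, group in groups.items()}

print(evaluate(total_sales, sales, "region"))  # {'East': 250, 'West': 200}
print(evaluate(total_sales, sales, "year"))    # {2024: 300, 2025: 150}
```

The point of the analogy: without the measure abstraction, each new grouping would be another hand-written SQL statement, which is exactly the repetition the hosts describe escaping.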
33:38 Nobody does. But there are scenarios where what I've been able to achieve with DAX, the solutions I've been able to create, the metrics I've been able to create... and it's not even just metrics at this point. It's what DAX can do: provide the information in the way users and stakeholders need to see it, via the DAX language. And the better I've gotten at it, the more experience and skill I've gained, the more methods open up. This is what everything runs on: metric sets, scorecards, and again your semantic
34:10 model. You're not going to have a successful semantic model if you don't have successful DAX, or if you don't have DAX at all. So I had to put DAX as high as level two just because of how important it is in what we do. And from a skill point of view, you can get away with your SUMs and CALCULATEs and a TOTALMTD, but what else are you going to do? And I've learned this from Greg Baldini, who's been doing DAX for years. I've asked him questions and watched how he solves problems. The dude's way smarter than I am. The things he thinks about in his
34:42 head are beyond me. What I can achieve, I would never have gotten to without some of those formulas and functions. There are other routes and things you can do, but nothing matches how dynamic DAX can be. The idea of filter context, the idea of evaluation context, and how they can solve so many business problems: that is absolutely why I had to put DAX there. And my second one, Mike... unless you want to pause and comment on that, I can jump into feature two.
35:15 Yeah, I don't think I'm going to pick DAX as my level two, though I think it's a good one. There was an aha moment when I was learning DAX. The aha moment was: wow, I could build a bunch of SQL statements that aggregate data a lot of different ways, or I can define the aggregation, the sum of this column, once, and then pick any dimension I want just by dragging the different fields together. Otherwise I'd basically be rewriting SQL over and over again, right? So, you
35:48 know, the aggregations are defined beforehand, and then I can pick whatever dimensions I want and the answers just come out correctly. That's an oversimplification of what's going on, but the power of doing that is incredibly useful. However, I'm going to lump all that DAX stuff into the semantic model at my number one level, because building semantic models is really two things, in my opinion: how do you make the tables, and how do you write the DAX on top of the semantic
36:21 model. So to me, the feature is understanding that whole experience; that's where you need to spend your time. That's why I'm not going to use DAX as my number two. Totally, and like I said, totally fine. I've seen people who are great at semantic models and terrible at DAX, and the other way around. So I had to separate them. And from a feature point of view, for my second one, as I went through this feature pyramid, Mike, I was trying to think about the core things that make data flow and make data work right now in the world we live in. So this
36:53 was an easy one for me, Mike. This was the lakehouse. I'm realizing that I'm building my methodology, my approach to problems and projects, around the lakehouse and how that flows into the semantic model. I want to do everything in the lakehouse first. I don't know, maybe it's just like how you like to clutter your garage, but I want to be able to store my data rather than have it be as dynamic as it can be in a semantic model, where you're at the whims of the data source. I can create a data source, which is what I think is so
37:27 important. I've heard so many times from organizations I've worked with: "We have an Azure data lake and we have these things, but we can't control it, and it gets really hard because things get slow to load." But I can get things from Salesforce, which is a pain like any CRM system, load them into a lakehouse, mirror data. And because the lakehouse is an elevation, in a sense, of the semantic model, I don't have to build it just for reporting. I think that's the big game changer of what the lakehouse, and really Fabric, can do. So
38:02 the lakehouse and DAX are going to be my level two, Mike. So I'm intrigued to see what you have to say about your second level of the pyramid. What was your second one again? Just making sure I've got it. Okay. The reason I'm asking is because I think we're going to align very well on the lakehouse thing. So I'm going to rearrange my two items here. My first one, I'm going to say, after talking with Brad and talking some things through... again, this has
38:33 been one of the advantages of being on the podcast: we can talk to Microsoft PMs who show up every month. So FYI for people who are new to the podcast: we've been bringing in a lot of really top-notch, high-end people, and we're not just interviewing them. They come in and we talk about real features and things they build. And I've got a really special PM coming in December. You're going to really want to watch for December, all of you. I should probably tell you about it at some point, Tommy. It's a great choice. You're going to love it. The person we have...
39:04 I'm on the edge of my seat, Mike, as always with these things. So I'm going to go with... now, this is something that Microsoft did not build, but adopted. Okay. So for me, I agree with you, Tommy: the lakehouse is extremely important. It's a feature that is very well used and adds a lot of flexibility. And there's this whole mentality, from when I started my master's in data science, where I learned about lakehouses and Delta-formatted tables: the idea of splitting the compute and the storage systems apart and using them
39:37 differently. That, to me, was the revolutionary moment, because typically everything was on a SQL Server. The storage and the compute were in the same machine, and you paid for all of it together. Then Spark came out and separated the two. And Microsoft has done a huge amount of improvement around communicating with this lakehouse format. Now, I think the trick, the secret sauce here, is Delta. So Delta is your level two? Delta is on my level two. I like that.
40:08 Let me give you some reasoning why. It pairs well with the lakehouse, because Delta is part of that, right? You're storing tables in the lakehouse format. The Delta format is also evolving: Databricks has updated it to also handle Iceberg. So now you have this Delta format that's like a universal format. Whether it's Hudi, whether it's Iceberg, whether it's Delta, the storage file format is being adapted so that any tool can read these different schemas, which is awesome. But then think about all the
40:42 other things we're doing, Tommy. We have SQL databases, we have data warehouses, we have semantic models, we have Direct Lake. All these features hinge on the fact that Delta exists. So to me, a core functionality shifted when Microsoft said, "We are going to adopt the Delta format as the lakehouse format for tables." That's what allows them to do all these other improvements, like the Velox and Gluten engines, other
41:15 engines Microsoft is building on to make it faster and more efficient to read and write data in this Delta-formatted area. The Delta format itself stays the same. It's consistent, it does not change, and multiple tools can use it even though they implement it slightly differently. Before, Tommy, if I was getting data in from a Dataflow Gen1, I was making CSV files that sat in the lake. Not really the most ideal solution. However, now we've got Dataflow Gen2, notebooks, pipelines, data warehouses: all of
41:48 the tools, all of the different compute engines Microsoft has built, at some level they're all reading and writing the Delta format. So to me, it's about understanding what's in the Delta format and how you use it to your advantage to insert or update. It plays really well on the data engineering side as a core technology they've implemented, but then you need to understand how it works across all the different tools, because now you can leverage it in multiple places. And I will say this: the Delta format is incredibly powerful. The one thing I feel it's a little
42:22 bit weak on, where I would start leaning more toward the warehouse side (and I believe warehouses are actually still running some version of Delta underneath anyway), is quick transactions: updating records quickly when you need an immediate answer, when one row of data needs to change very fast. To me, that feels more like the data warehouse or SQL side of the world inside Fabric. And I've had some projects where we tried to solve that,
42:54 like close-to-real-time logging, inside the Delta format. It just wasn't as efficient, and it was more work to make it seamless. So in those situations it would probably have been better to stand up a SQL database or the SQL data warehouse, do all the transactional work there, and then have that automatically mirror down for reporting. So all this to say, my number two is the integration of the Delta format into all
43:26 of the compute engines across all of Fabric. That's very important to know. All right. Honestly, I completely see that, and from the conversation with Brad, the fact that it's cross-platform, or cross-feature in Fabric, that it's not just the lakehouse: I love that. So what's your second one? So my second one might be a bit controversial. If you think about all of the data engineering and the shaping and the modeling, we're
43:57 talking about some very core pieces here of how to consume things. I'm going to phrase it one way and then come back to the feature. I think the ability to articulate data as information, the ability to turn tables of data into visuals and graphically tell a story or build an interactive app around the data, is so important. Now, I don't love everything that's coming out on the report side, but I'm going to say reports. Okay,
44:28 it's that side of things, but it's more that I'm going to say reports while the most essential skill is how you visually communicate data. We've talked about this at length on the podcast, Tommy: when I'm building reports, I'm speaking to you through visuals. We're communicating through a different medium than words. So the language of how you communicate via data, that's the part I think is important here, and right now the main
45:02 medium for doing that happens to be reports. That may change in the future; there may be other opportunities down the road. I'm actively working on making that medium bigger than just reports, so stay tuned: there are things coming from PowerBI.tips and the Tips Plus ecosystem that I think are going to be really useful for this community. And I'm very invested in this. We have Power Designer, the fastest way to build a styled report, period. There's no other tool on the market
45:34 that can do this as fast or as cheap as we can. So if you're not using workloads today, workloads are going to change how you do things inside Power BI, and our workload, Power Designer, is a game changer. It's super fast, you can stylize a bunch of things, we're adding new features every week, and we're working to make the product super easy to use. I don't have to spend a ton of time building a report anymore. I can spend a day or two stylizing a report and then reuse it over and over again. That is going to be very important for people to
46:08 get consistent, good-looking reports out, because, and I'll say this, a lot of people come from Excel. They understand shaping and manipulating data, but not everyone has the eye for design. And there's a handful of people in the community who are really good at design but may not be great at shaping data. One I will mention is Miguel Myers from Microsoft. He's amazing at making beautiful-looking reports, but if you look at any one of his reports and its data model, it's like, what are we doing
46:40 here? A horrible data model supporting a beautiful report. So there has to be a balance between a good-looking report and a not-nasty-looking semantic model. They're two different skills that need to be brought together to some degree. I don't think that's a hot take, because for what it's worth, reports were my first choice on my level three, Mike. Yeah, close. So it's not too far off. And I'm going to completely go along with this, because how we communicate our data is the same
47:12 way. I had this aha moment years ago as I was building reports, about the approach I took. You say the words, like, "a good-looking report," and I think: good-looking according to whom? That was the aha moment. It's not about whether it's good enough to me. Okay, sure. But I've said this a ton of times on the podcast, and I say it to clients too: there's no such thing as a perfect report, because it's judged by the audience, by whoever's looking at it. And the aha moment I had when it
47:45 came to communicating data, Mike, was that the way I word something in an email is going to shift what your behavior is going to be. I can send you a message on Teams about who our next guest is going to be or what we're going to talk about, and depending on how I word it, it's really going to dictate your reaction. Reports are the same, just with data. How I word a sentence is the same as how I word and visualize the data. This is something, I'll tell you, that I struggled with in the
48:17 beginning with Power BI: how I communicated. I thought I was doing a good job, but I wasn't getting the right point across, or, more importantly, the right information and context across to people. Even if I were the best semantic model builder in the world, with all the crowns and badges and achievements, it doesn't matter diddly if I can't build a report and communicate it. So I wanted to put this in level two. It was a tough choice, but honestly, like I said, I'm seeing lakehouses more. No, for me,
48:51 this is the first thing off the board for level three, because to your point, there is no better way right now for someone to consume, understand, and act on data from an organizational point of view than a report. We have metric sets, we have other things. Actually, that's one of the things I'm talking about in Orlando coming up, so stay tuned. But reports are still, at the end of the day, where people go. So I love that. Let's dive into level three. Speaking of which, since we're already shifting into it, I
49:23 had reports; that was one of the first ones I had. The other two were actually notebooks and apps. So I had notebooks up there, and it's crazy: I don't think notebooks would be as high for me if it weren't for you, because I was like, what, we have dataflows and now I have to get into this new workflow? This is Power BI, what are we doing with Azure Data Factory here? But at the end of the day, this is now the primary tool for my data ETL. And the fact that we also have AI...
49:56 so you could put AI in there, too. But for the personal developer adopting notebooks, there's no excuse. Take it from me, who's not the smartest cookie in the jar: I can simply take what I'm trying to do, feed it to AI, and learn. The only thing I'm good at is learning; I learned how to learn. And notebooks are now just an essential part of what I do. If they took away notebooks today, I would be wrecked, because they're such an
50:29 important part of the data integration side. Especially since I have lakehouses on my level two, I need notebooks on my level three. And the last one I had there was apps, because to your point, the communication side is so important. I can build the best report, and everyone says it looks great, but no one knows how to get to it. Maybe I put this here because of my own experience, and I probably did: I've been burned by not being able to get people to a report, or to the right data, because it can be overwhelming for people. So it's so essential.
51:01 This is also why, Mike, I always focus on adoption. Whenever I talk to anyone, I ask: what does your adoption look like for Power BI? I care that you have good developers, but I care just as much that you have a process and an approach for getting the data to the people the right way. So reports, notebooks, and apps are my level three. Interesting. You went with apps. I hadn't thought about apps that way, but I think that's a really interesting option for level three. So I'm going to agree 100% with you on the
51:33 notebook side. I think notebooks are a very code-centric way of working. When I look at notebooks, there are two things I think about. In one direction, notebooks are good for data engineering, shaping data, doing things. Notebooks are also very good for tooling, right? Notebooks give us the ability to build tools that I can use on top of models. I want to go get a bunch of data from the admin portal; I want to go get a bunch
52:04 of data, scrape it out, and put it somewhere. Notebooks are also good for, say, creating a report from scratch. Now that the PBIP format exists, we just did a project where we built a grammar around report building, and from that we were able to read an input spec file and generate reports. That unblocks the ability to make 30-, 50-, 100-page reports
52:35 with just a click of a button, right? You just program it and it goes. You're removing all the clicks, and building things becomes really consistent. That's massive, a huge win. So the fact that notebooks can build tools that let you operate faster is a huge improvement. I love notebooks. But I'm going to disagree with some of your other ones, Tommy, and pitch some that I think are slightly different. I'm going to say Copilot, but Copilot for code, not
53:11 Copilot for reports, right? I don't really like Copilot telling me how to build reports, or the other things Copilot produces there. However, Copilot's really good at writing SQL. Copilot's really good at writing Python. I have little to no fear building anything in a notebook now, because I have Copilot at my side. I know enough about what the notebook should be doing and what the data engineering should be doing. I can describe the function: I have this data coming in, I need you to build a regex, a regular expression,
53:45 for this thing. I can describe it to the AI, and it spits out a really good answer. There comes a point where I could probably figure it out myself, but I don't want to take the time. It's a balance of value to time invested, right? Instead of working it out by hand, I spend the time asking Copilot to help me build the code: I need a Python function that does something, I ask Copilot, and it just does it. So, is Copilot-as-code deeply integrated into the tools? No, not quite
54:20 yet. Where's Copilot in my DAX window in Desktop, right? There's no Copilot there. I should be writing DAX in there with Copilot helping me, right inside the DAX window in Desktop. That should be a thing. Copilot inside notebooks is clunky; it doesn't work as well. But when I use Copilot inside VS Code, I'm very happy. So a lot of times I'll take code out, go right to VS Code, and use Copilot over there. What I'd say is the VS Code Copilot experience is dialed in. They know what
54:55 they're doing, 100%. I hope the Microsoft Power BI and Fabric experiences teams are leaning on what's coming out of VS Code and just borrowing it, bringing it over. Literally take what's there. Bring in agent mode, bring it all in. There's already a dedicated team making amazing coding experiences in VS Code. Just lift that experience into notebooks; lift it into my SQL analytics endpoint or my SQL data warehouse. That's what I want. So for me, the Copilot code experience is my
55:28 number three. It makes me more effective, and I don't need to spend as much time learning all the little intricacies. I can let it do its thing, then read the code and correct it afterwards. My last one for number three, and this is one area, Tommy, where I think I'm going to be a bit more visionary, is an underrated feature for organizations. I'll preface it with some observations. I am really intrigued to hear this. Embedding your reports. Now, yours was apps, Tommy. I totally agree with you: apps are very
56:00 important, right? But apps only work in the context of your organization. I look at this a little bigger. Let's go back to the first two layers of our pyramid. We've already built these rich semantic models. We already have good-looking reports. We're already using this super cool structure called Delta. How do we maximize the leverage of all those things we've been building? You can embed things in Teams. You can embed things in SharePoint. You can embed things in a custom application.
56:33 Embedding not only gets your reports out to your internal team better; I think it also gets them to your external audience better. And the more I use powerbi.com, the more I think to myself, it's too busy. There are so many things to click on. powerbi.com is wonderful, I really like it, but when you add all of Fabric and all of Power BI, powerbi.com becomes a cluttered mess of a million tools. People don't know where to put
57:04 the data. They don't know how to go get the reports. And there's a huge audience of people who just say, "I just need to get to the report and open it up. I just need to get to the report, copy it, make a slight change, and see that data every single day, every single week. That's what I need." Right? So there's this idea of simplifying the output, and I think embedding does that. And again, I'm heavily invested in this area; I think this is a major win. I'm so invested in it that I've built an entire product
57:37 around helping companies accelerate their path to embedding. There's a lot of capability there. We've built a product called Entexos. You can go get it on the Azure Marketplace and download it. Embedding projects typically take companies anywhere between 6 and 9 months to build. You're like, "Oh my gosh, that's awful, Michael. Why would you recommend that?" Well, because I already have a solution that gets you started in an hour. You can go download the product today. So it's very easy to deliver lots of results quickly with an
58:11 embedded project. And my company is one of about five partners in the Microsoft partner showcase for embedded accelerators. So to me, think about it: okay, my company is a thousand people. You could use embedding there. But how many customers do you serve? What apps do you build? How do you make customers sticky to your organization by giving them data? To me, that's where, if you're really trying to
58:43 talk impact to your organization, embedding is where it's at. And you don't need developers going out and building new interactive visuals from scratch. Even now, I look at Databricks and their AI/BI dashboards for reporting. It's not that great. Yes, they have a much better calendar selector, but none of the visuals interact with each other. They don't cross-filter, and it's just not nearly as good an experience. I've done Looker: nah, not really happy. I've played with Tableau: extremely
59:14 flexible, but the learning curve is super steep. So I just really like this idea of being able to say: how do you make your customers absolutely love you? You give them good data and an easy way to get it, embedded. That's what I've tried to develop, this really rich experience around embedding things. So that would be my last one for level three: I'm not only focusing internal to my company, I'm focusing on the impact external to my company, which I can really drive with embedding.
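[Editor's aside: at its core, "app owns data" embedding means your backend calls the Power BI REST API to mint an embed token, then hands that token plus the report's embed URL to the JavaScript client on the page. This minimal Python sketch only builds the token request; the endpoint shape follows the public `GenerateToken` API, but the IDs are placeholders, and obtaining the Azure AD bearer token and making the actual HTTP call are left out.]

```python
def build_embed_token_request(workspace_id: str, report_id: str):
    """Return the URL and JSON body for a Power BI GenerateToken call.

    The caller is expected to POST this with an Azure AD bearer token;
    auth and the HTTP call itself are omitted from this sketch.
    """
    url = (
        "https://api.powerbi.com/v1.0/myorg"
        f"/groups/{workspace_id}/reports/{report_id}/GenerateToken"
    )
    body = {"accessLevel": "View"}  # read-only embed for end users
    return url, body

# Hypothetical IDs for illustration only.
url, body = build_embed_token_request("ws-123", "rpt-456")
print(url)
print(body)
```

The returned token is short-lived and scoped to one report, which is what makes it safe to hand to an external customer's browser instead of a full user login.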
59:46 And most people when thinking betting too because I honestly because of the barrier to entry it has been it is not something for the faint of heart to try to get into but no the fact I I I messaged you a few weeks ago because I was talking to someone and they’re like it’s such a process like no you can just go to the app source app store and you skip and they’re like wait what and the concept still weird too. So no I I really like it is very common when we’re talking about apps is getting the data to the user. So, all right. So, you had notebooks co-pilot, which I’m
60:19 I I’m going to count it. I know. I know. Code copilot. You’re also talking only the code. I don’t know if it’s a fabric feature. It’s something I use in fabric. So, but we’ll count it thing because and then we have embedding. All right. I know we’re getting near time. So, what I would like to do, Mike, if you have a level four, I want to see if we have any that cross over and then we can call it a day because I don’t have any at level four. I ran out at level three. All right, we let me see if you would agree with any of these and then we’ll go from there. So for my level four and then we’ll call it a day.
60:51 We need to do this again; this is a blast. So I had pipelines. Also, before I got into mine, I’m surprised, after we talked about TMDL, that TMDL didn’t make your first three levels. Maybe it’s just because the other things are so essential, but still. Do I push that up? Yeah, I could probably push PBIP and TMDL; I think those are two game changers. But those are heavy developer features, and I’m trying to think about large-impact things that make a big difference to a wider audience.
61:24 Developers, they love this. Also, I didn’t hear anything about editing a model in the browser in your top three either; it’s a given now, and it’s really some feature. So let’s see if you agree with any of these. I had pipelines, I had TMDL, I had drill-through. Pipelines because, honestly, I’m using pipelines more and more. I love the user interface and what you can do. You don’t have to scale like you did in Data Factory and have all the know-how to use it, and still you can get a lot done. Because, like I said, you’ve got me on board with
61:58 that. I have drill-through because, let’s just go to the reporting side of things, it’s a feature that I apply in every report. If I build a report, I have drill-through in some capacity. And the last one, it was more developer-oriented, and I was like, do I do XMLA? Do I do the API? I actually stuck with VS Code, the integration they have with the extensions. That’s not a feature in Fabric. How is it not? Okay. How is it not? Because VS Code is your favorite. I understand, but VS Code? No, I’m not going to count it. If you’re not
62:29 counting mine, I did Copilot, then I’m not counting yours. Yours is GitHub Copilot that you’re using with Fabric. I’m telling you, the VS Code extensions. Okay, all right. Extensions that Microsoft created. If they write a blog, I’ll let it slide. I’ll let it slide. They’re writing blogs saying, we’ve now released four extensions for Fabric that you can use, and that’s Gerhard’s extension as well. Why on earth do I need four extensions? Just make one good one. Like, stop it. That’s because I have 130 extensions; I want more things. I don’t need more
63:01 extensions; build one good one that I can use with everything. It should work with all of it. It should be the Fabric extension, like Gerhard’s extension; it’s just one that you use. I get it. But I use that, I use the integration. Call it integration, call it an API, I don’t care. But I use those extensions so much, and more and more too. And the number of things I can edit and go through without being in the service is pretty incredible, Mike. So again, Microsoft built it. Oh, and TMDL is also an
63:35 extension as well. So, speaking of all these extensions. But honestly, drill-through and pipelines. I didn’t think we were going to get to my level four, but yeah, that’s a big part. So that’s what I got, Mike. I like your level four things. The only thing I would maybe change in mine, if I had a level four: I would probably drop VS Code because I think it’s cheating, but okay. I would not pick drill-through. I would definitely pick PBIP; the PBIP format is something that’s immensely powerful. But again, the further I go down this pyramid, I start
64:08 looking at more pro-developer-level features, not general-audience ones. I start getting very technical with a lot of the things that make my life easier day-to-day. I would agree with you on pipelines; I think pipelines are very pivotal for what we’re doing. Very useful. There’s a lot of new data engineering stuff coming out. I’m really hopeful for Copy job; it seems really interesting, and I think it could solve a lot of problems. Mirroring right now, for me, is really hot. I’m really liking mirroring; I’m doing a lot of it
64:40 between Databricks and Fabric, and shortcuts to me are becoming a really interesting feature set. So for me, shortcuts would probably be on that fourth level. I’m definitely going to smash two together, because I want five items in my level four. So I’m going to say TMDL and PBIP; those are the things that I would agree with. That’s fair. And then there was one more I had here. It was pipelines. Makes sense. TMDL, PBIP.
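Since TMDL keeps coming up, here is a hedged illustration of why a text-based model format matters: the table and measures below are invented examples in TMDL-style syntax, not from any real model, and the point is that ordinary text tooling (diff, grep, code review) suddenly works on semantic model definitions.

```python
import re

# A made-up TMDL-style fragment for illustration. TMDL is the text-based,
# indentation-sensitive format (now GA in Power BI Desktop's TMDL View)
# for defining semantic model objects. Table and measure names here are
# invented, not from any real model.
TMDL_SNIPPET = """\
table Sales
	measure 'Total Sales' = SUM(Sales[Amount])
		formatString: #,0

	measure 'Sales YoY %' = DIVIDE([Total Sales] - [Sales PY], [Sales PY])
		formatString: 0.0%
"""

# Because the model is plain text, a one-line regex can list every
# measure defined in the file -- something binary model formats never allowed.
measure_names = re.findall(r"measure '([^']+)'", TMDL_SNIPPET)
print(measure_names)  # → ['Total Sales', 'Sales YoY %']
```

That same property is what makes TMDL and PBIP "game changers" for source control: a changed measure shows up as a readable diff in a pull request.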
65:11 Shoot, what was the other one I just said now? I’m losing track of stuff. You said PBIP. TMDL, which is one. Mirroring, I think, or shortcuts/mirroring. So your level four has six. Well, I don’t know. Some mirroring is shortcuts; other mirroring is actually copying data. So to me it’s confusing. Is mirroring actually a mirrored item? If you’re doing a SQL database, it’s actually copying the data. But if I do a mirror to Databricks, it’s not copying the data; it’s like a shortcut. So I don’t really know what it is. It’s confusing.
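The shortcut half of that distinction can be sketched concretely. Below is the rough shape of the request body for creating a OneLake shortcut via the Fabric REST API (`POST /v1/workspaces/{ws}/items/{item}/shortcuts`); the endpoint and field names are my reading of the public docs and should be verified against the current API reference, and all IDs are placeholders. The key point is in the comment: a shortcut is a pointer, while mirroring a SQL database physically replicates data.

```python
import json

# Hedged sketch: the approximate JSON body for a OneLake shortcut creation
# request in the Fabric REST API. Field names follow the public docs as I
# understand them -- verify before relying on this. All GUIDs are placeholders.

def onelake_shortcut_payload(name: str, source_workspace: str,
                             source_item: str, source_path: str) -> dict:
    """Build the body for a shortcut pointing at another OneLake item.
    No data is copied -- unlike mirroring a SQL database, which
    physically replicates the data into OneLake."""
    return {
        "path": "Tables",   # where the shortcut appears in this lakehouse
        "name": name,
        "target": {
            "oneLake": {
                "workspaceId": source_workspace,
                "itemId": source_item,
                "path": source_path,
            }
        },
    }

payload = onelake_shortcut_payload(
    name="dim_customer",
    source_workspace="<workspace-guid>",
    source_item="<lakehouse-guid>",
    source_path="Tables/dim_customer",
)
print(json.dumps(payload, indent=2))
```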
65:43 So I want to say, because I don’t understand the feature, or what the delineation of it is: it should be Databricks shortcuts to Unity Catalog, but they call it mirroring, and it’s not. But really, whatever. So anyways, those are the three that I have there. There was one more that I thought was really impactful, and it’s escaping me. I’ll have to send it out in a tweet if I remember it later. Anyways, I think those are some good things to understand and know at that level four tier. We’re getting very technical by this level; I feel like this is becoming more pro-developer stuff. Oh, I remember
66:17 what it is now. Good thing I stalled for a second. This is going to be weird, Tommy, because it’s going to be a bucket item, and it’s still forming. It’s the MCP land, right? MCP is going to be in your level four. So there are already two MCPs out, I believe: one for semantic models and one for Spark data engineering. So two MCPs already exist today. And if I look at the landscape of what Microsoft is doing, it’s easy for them to build the MCP server
66:50 and let you use that against their existing tooling, because it bridges the gap: pick whatever large language model you want to use, and here’s the list of APIs it can go talk to, and then it can go do things. So the MCP land is bridging the gap between just talking to something and having the agent do things for you. So I think that’s going to be, from a pro-developer side of things, a big win long term. So I have to put that there at level
67:22 four. It’s just something we need to stay mindful of; it’s going to be very impactful at some point. Interesting. So for me that’s trending; I agree it’s going to be there, but it’s not there for me yet. But okay, for level four, you’ve got either three or five you’re picking. So: shortcuts, mirroring, TMDL. Those are interesting. I’ll make a few notes here and then we’ll call it a day. So I went through to level six, and I’m sad to say that Power Query just made it into level six. I never thought I would
67:55 say that a year ago. I love Power Query. I use it extensively, but I couldn’t rank it higher. Things I ranked higher than Power Query here: Data Agents, APIs, XMLA, workspaces, bookmarks, the SQL endpoint, metric sets. Those are all higher to me than Power Query. So, man, are we looking at the end of Power Query as we know it? Well, I think I would have said that last week, Tommy. After going to the Microsoft Fabric Conference in Vienna, I think Power Query has turned a
68:26 corner. I think the team has listened to the community. It’s not going to be as expensive to run; it’s going to be easier to use. I don’t know what the number is now; Microsoft in their March earnings call said there were 30 million monthly active users. That was back in March, so I’m guessing it’s even higher now. Maybe 40 or 45 million, right? I’m just projecting numbers here. So let’s say you’re at 45 million monthly active users in Power BI. Awesome, love it. Well, if that is the audience size we’re talking about, those people are
68:58 going to be comfortable using Desktop and the Power Query experience; Dataflows Gen2 will be right up their alley. So if you want those people to come over, make the cost a bit more effective to run it. And I think, Tommy, after we do some testing, which we should definitely do, we need to revisit this one. We need an episode at some point revisiting and giving Dataflows some love. We’ll use your tenant then, because, yeah, we’ll use your tenant. I think we have another
69:31 opportunity to revisit Dataflows Gen2 now with the parallel processing, the faster running times, and the new compute engines. I think there were a lot of under-the-hood challenges that they needed to fix. And now that they’ve fixed these things, I think we can step back and take another look, because we’ve been harping on it for so long. The team has finally listened and made some changes. We’ve got to do our part and go back and say, “Okay, look, they’ve made what they’re calling really big improvements.
70:03 Let’s go back and take another look at this one.” I think we owe it that. Again, for both you and I, the first thing we did to get into Power BI from Excel was Power Query; that was what brought us over. It was a game changer. So I think it’s finally getting to a place where it’s competing with the scale and size of Spark and notebooks and other things; it’s going to be more competitive there. So I want to revisit that one and look at it again. And moving on, King in the chat here says people are
70:37 coming from Alteryx and desktop, and was talking about data engineering and pipeline-making in the desktop era. I would also argue Alteryx and Talend are the two desktop applications you use for data engineering, and I think those are all just going to be gone in a couple of years. It’s going to be all Fabric and things in the cloud, and I don’t think you’re going to want to pay for licenses to have machines running on your computer any longer. So yeah, 100%. That
71:09 being said, this is a long episode. Thank you all for participating, listening, and hanging out with us today. We had a lot of things to talk about, and we really appreciate the discussion. Chat has been very active today, so thank you very much, chat people. If you liked this episode and you want to watch these with no ads, or you didn’t catch it in real time, feel free to become a member of our channel. It helps support us, helps us do things here, and improves the quality of our episodes and content. We really, really appreciate you, the community. We do this because we love this space, and we really enjoy the people here who make it so fun. That
71:42 being said, Tommy, where else can you find the podcast? You can find us on Apple, Spotify, or wherever you get your podcasts. Make sure to subscribe and leave a rating, please; it really does help us out a ton. And share with a friend, share the news. We do this for free. Do you have a question, idea, or topic that you want us to talk about? Is there something in the feature pyramid you disagreed with, or did we miss one? Head over to powerbi.tips/empodcast. Leave your name, please, so we can mention you; a lot of people don’t do that. And leave a great question. Finally, join us live every Tuesday and Thursday, 7:30 a.m. Central, and join the conversation on all the PowerBI.tips
72:16 social media channels. Awesome. Thank you all so much, and we’ll see you next week.
Thank You
Want to catch us live? Join every Tuesday and Thursday at 7:30 AM Central on YouTube and LinkedIn.
Got a question? Head to powerbi.tips/empodcast and submit your topic ideas.
Listen on Spotify, Apple Podcasts, or wherever you get your podcasts.
