PowerBI.tips

FabCon Vienna – Ep. 461

September 24, 2025 By Mike Carlo, Tommy Puglia

FabCon Vienna brought a wave of Fabric announcements. Mike and Tommy break down the highlights from the European conference, including the September feature summary, the new Fabric Extensibility Toolkit, and the long-awaited calendar-based time intelligence preview.

FabCon Vienna Highlights

September 2025 Fabric Feature Summary

The September feature summary brings the usual round of improvements across all Fabric workloads. Mike and Tommy highlight the most impactful updates for BI professionals.

Fabric Extensibility Toolkit

The Extensibility Toolkit is a significant announcement:

  • Enables building custom Fabric experiences
  • Opens the platform to ISV and partner solutions
  • Standardizes how third-party tools integrate with Fabric
  • Signals Microsoft’s commitment to an open ecosystem

Calendar-Based Time Intelligence (Preview)

The calendar-based time intelligence feature addresses one of the longest-standing pain points in DAX:

  • Custom fiscal calendars supported natively
  • No more complex DATEADD workarounds for non-standard calendars
  • Preview release—expect refinement based on community feedback
  • Combined with UDFs, this dramatically simplifies time intelligence patterns
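
As a language-agnostic illustration (not DAX, and not the new feature's actual API), here is the fiscal-calendar logic that the preview handles natively, sketched in plain Python with an assumed July fiscal-year start:

```python
from datetime import date

def fiscal_year(d: date, start_month: int = 7) -> int:
    """Return the fiscal year a date falls in, labeled by its ending calendar year."""
    return d.year + 1 if d.month >= start_month else d.year

def fiscal_ytd(sales: dict, as_of: date, start_month: int = 7) -> float:
    """Sum every sale in the same fiscal year as `as_of`, up to and including `as_of`."""
    fy = fiscal_year(as_of, start_month)
    return sum(amount for d, amount in sales.items()
               if d <= as_of and fiscal_year(d, start_month) == fy)

sales = {
    date(2025, 6, 30): 100.0,  # last day of FY2025 under a July-start calendar
    date(2025, 7, 1): 50.0,    # first day of FY2026
    date(2025, 8, 15): 25.0,
}
print(fiscal_ytd(sales, date(2025, 8, 31)))  # 75.0 — the June sale is outside FY2026
```

This is exactly the boundary logic that standard DATEADD-style workarounds struggle with on non-standard calendars; the preview feature moves it into the engine.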

Conference Themes

Recurring themes from FabCon Vienna:

  • AI everywhere — Data Agents, Copilot improvements, AI-powered experiences across workloads
  • Developer experience — TMDL GA, UDFs, extensibility toolkit, better tooling
  • Platform maturation — Fabric moving from “new and shiny” to “production-ready”
  • Community — Strong European Fabric community engagement

Looking Forward

FabCon continues to be the primary venue for Fabric announcements. The combination of extensibility, time intelligence, and developer experience improvements signals a platform that’s listening to its practitioners and investing in the right areas.

Episode Transcript

Full verbatim transcript:

0:00 Heat. Heat. Good morning and welcome back to the

0:32 Explicit Measures podcast with Tommy and Mike. Good morning everyone and welcome back. Good morning my friend. Welcome back to the States. I am back. It has been a whirlwind trip. Last week I was out at the Microsoft Fabric Conference in Vienna. Oh boy, were there some announcements. Things were just humming along. Lots of new things, some really substantial, impactful features here that we need to unpack and discuss and really go through. I did a session at the conference

1:04 So that was super fun. I was able to go out and do a session on translytical task flows. We really don't have any news today. It's not really a news day. This is like one large episode around all the news, basically, is what we're going to go after here today. And as always, we do it in a little spirit of the draft. There's so much news, Mike, that we could just go through item one by one, but we like to spice things up. We're going to do a little draft style, pick our favorites, and we'll just pick

1:36 It from a draft point of view. I think there's enough here where there are a few I'm intrigued on, where they're going to go, how high we're going to pick them. And honestly, I think this is a whole episode rightfully devoted to just what happened at FabCon Vienna. This is also a lot from the Fabric September update. So, if you want to follow along or pick your own, there's an article here that aligned very well with those updates. Regardless, Mike, I was blown away. This was an overwhelming

2:08 Amount of updates, a month of updates, which you love to see. And this is with Ignite coming up in just a few months. I guess my feeling is, Tommy, I think the Microsoft Fabric conferences, the two they have throughout the year, one over in Europe, one over in the US, so there's going to be another one in March, I feel like those are the main milestones. They're about six months apart, and then they land a bunch of features at

2:41 Those conferences. So twice a year you get these big, exciting releases with lots of things coming out. I'm not sure Ignite really moves the needle for announcing a whole bunch of new features. Ignite's interesting, but it's much more developer centric. I think the Fabric conference is more for business users, everyone, not just the super technical. There is a technical track to it, but it's not ultra technical all the time. Well, translytical was announced at Ignite last year. We were there together in Chicago when they talked

3:13 About it. Ignite also talks about big picture updates that are going to come out. Yes. And the FabCons are usually like, look under your seat, there's something you can play with right now. So it's a little more tangible, a little more immediate for the person right now. Regardless, Microsoft holds back on features. It was very light last month when they released things. You could definitely tell they were holding off and trying to get everything polished, everything out the door. They don't release features like, hey, I designed a feature

3:45 And two weeks later it's out. They had to do months of testing and getting it going, and the releases are much slower, to have things land in time for a conference like that. Yeah. Well, let me ask you. You did a demo on translytical. Speaking of which, how do you think it went? Doing a demo on something so technical, I feel like that literally is the definition of a developer-centric feature right now. How did you feel it went? Yeah. So, let me say this. We were in the main hall and they moved us to a smaller room, but the smaller room

4:18 Had standing room only. We had people just standing in the room, which I thought was a really good feeling, to have your room entirely packed out. I got tons of feedback. There's actually a number of people on LinkedIn who mentioned that one of the best sessions they had interacted with was the translytical one: really good demo, lots of good in-your-face demo pieces. One of the things I tried to do in our session, which was with Sugata, one of the PMs from Microsoft who was heavily involved in the translytical task flows project, was to really convey

4:51 It's not just writeback. Everyone says, okay, translytical, writeback, yes, I understand what it means. It's more than that. Our whole demo, and what we were doing, was trying to explain it's way beyond just making a table, clicking on a cell, writing the data back to a database, and having that update. It was supposed to be way more than that. I guess we had those examples at the beginning, but towards the end we actually did a real-time demo where I had a Raspberry Pi that was on the stage with me and I

5:26 Plugged it in. I had it turned on and it was listening to messages coming from IoT Hub. And so the report was allowing you to enter text using a text slicer. This is a new slicer that they made just for translytical task flows. You put your text in, you hit the submit button, and it passes that information over to a user data function, a Fabric user data function, which then sends it to IoT Hub, which would then communicate and push the message down to my device. And so I was able to go, in the session,

5:58 Type out my message, "Fabric rocks," and then hit send, and it showed up on the device. And then we were able to do that a couple of times to show people it was actually doing a full round trip, sending the message through the system and returning the messages back to the device itself. With things like this I wanted to expand people's minds: it's not just writeback. You can do actions, you can make data move, you can send messages to a function, the function can run business logic that does something, and then it can trigger real devices to go act or do

6:32 Things as well. So really, let me say it this way, this is kind of a replacement for Power Apps, but not 100%. Yeah. So, Power Apps has some concept of being able to get the context of data in the report, but it was clunky to set up. It wasn't too difficult to set up the translytical task flows. And you're right, Tommy, it's heavily designed for pro developers of Power BI, but it definitely makes the end-user experience extremely easy to use. And now you can

7:06 In real time, with writeback, do things from the report that take direct action on something else, which I thought was really interesting. No, and honestly the only part I'll take offense at is the Power Apps bit. I still have clients who hammer on it, but the point I think you're trying to make here, what I see with the translytical task flows, is that it's also open source. The problem with Power Apps is you're playing in that playground; you can only do the functions that are available with Power Apps. But here you're using Python. So, to your point,

7:40 I can do the Raspberry Pi, and yes, that is possible. I know people can tell me it's possible with Power Apps; a lot more workarounds to do that. So, what you're able to do with translytical task flows. And I love the fact that you are pushing that it's more than writeback, because I don't know why that's the misconception right now, and maybe it's because there are not enough examples like yours out there on what's possible here. I think this is going to be one of those things. I don't want to put it, I don't,

8:13 Did we classify translytical as a game changer? Does it go on the t-shirt? What? I'm going to say yes. It is different, because this is a brand new capability. Yeah, we've never had this before. Again, the t-shirt, maybe it's on the back, but the idea here is this is something we've never been able to do as easily as we can now. And it was fairly simple. I showed the code that I did for the IoT example, right? Writing to the Raspberry Pi and sending messages to it. I had like six lines of code. It

8:46 Wasn't crazy difficult to run. That's it. The library to talk to IoT Hub was already built; someone had already made a Python package to use it. So when there's already an existing framework out there to use, things get easier. It was fairly easy to set up the IoT Hub. It was fairly easy to make the device listen to messages from IoT Hub. All this stuff has been around for years. I just had to tap into it and learn how to use those specific items for our presentation. So yeah,

9:17 For your demo, did you use VS Code or were you all in the browser? So then you could do anything. It's again just Python; it's just using code to run a function. So the user data functions are basically Azure Functions retooled and stuck inside Fabric. There's a little bit of ease here where you can make it easier to connect to things. You can have connection strings that automatically attach; you're not saving usernames and passwords directly in the user data function. But in general, it was pretty rock solid. So anyway, I think

9:50 You're right, Tommy. There are not enough examples to open people's minds to what's possible. One of the aha moments someone gave me was that the creativity of what you can use translytical task flows for hasn't really been explored yet. We're just at the beginning; it's just been announced. There's a gallery for it; you can see there are items out there. But just getting your head around the beginning portion of this, how to get started with it, it needs some time to mature. People need to come up with creative ideas and

10:23 So because of that, someone was asking me at the conference, well, you showed examples of adding, like, a group of things. We had an example of that. You have 10 rows of data, you can filter things down by some categories, filter, filter, filter, and then with these three filter contexts, you can update individual rows of data. And they're like, what if we wanted to update multiple things at once? Really, what people are looking for, and I think there's a really good opportunity here for someone in the marketplace to build one of these things, what everybody really wants is for Excel to appear inside Power BI. That's really what they want.

10:58 They really want to be able to embed the Excel experience inside a visual and be able to say, save this data, and it gets pushed out somewhere. But with user data functions, this is now possible, right? You can actually save notebooks, or you could have embedded things, and you could have these functions pick up this data. So the question came out during the conference: how would I edit multiple items? And we said, well, you can edit multiple items by writing better DAX that would just concatenate multiple

11:30 Items together and get the IDs, and then you could set it for what you want. So there are definitely ways around this. Again, back to what's going on here: the functions, the data you send to them, it's highly flexible. It's going to be limited only by the creativity of people building things. You can do data validation, you can do checking, the function can talk back and forth. And for me, while I was building the demo, the aha moment was: in order for translytical to work, there are multiple full communication cycles that

12:04 Are happening. The report says, here's the data. The user data function has to say, I acknowledge your send of data. It has to complete something, and then the function has to submit back to the report: I'm done, here's the message, I either passed or failed. Oh, and by the way, I also would like you to refresh the report. There's a whole bunch of bidirectional communication happening between functions and all kinds of other things that just make this really exciting. So for that reason, I think I would call it a game changer.
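
The acknowledge-and-respond cycle described here can be sketched as a toy simulation. This is not the real Fabric user data function API; every name below is hypothetical, and the "device" is a local stand-in for the IoT Hub hop:

```python
from dataclasses import dataclass, field

@dataclass
class FunctionResult:
    acknowledged: bool      # the function confirms it received the report's data
    status: str             # "passed" or "failed", reported back to the report
    message: str
    refresh_report: bool    # the function can also ask the report to refresh

@dataclass
class FakeDevice:
    inbox: list = field(default_factory=list)

def user_data_function(payload: str, device: FakeDevice) -> FunctionResult:
    """Toy stand-in for a Fabric user data function: validate, act, report back."""
    if not payload.strip():
        return FunctionResult(True, "failed", "empty message rejected", False)
    device.inbox.append(payload)  # stand-in for sending the message via IoT Hub
    return FunctionResult(True, "passed", f"delivered: {payload}", True)

device = FakeDevice()
result = user_data_function("Fabric rocks", device)
print(result.status, device.inbox)  # passed ['Fabric rocks']
```

The point of the sketch is the shape of the handshake, not the transport: the function acknowledges, does its work, and sends a pass/fail plus an optional refresh request back to the report.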

12:35 And for listeners, he knows what he's talking about, because he just navigated some very tricky waters here. You could have said user-defined functions, you could have said user data functions and DAX, but he knew what he was talking about, because we have three UDFs and one just got announced, too. So, we're not even going to touch that right now. We'll wait for that. But, no, I love that the demo went well. Happy to hear it, and I think more examples are going to really make this take off. I think so, too. That was fun.
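
Earlier, the answer to "how do I edit multiple items" was to concatenate the selected IDs in DAX and hand them to the function. A minimal sketch of the function side of that pattern, in plain Python with hypothetical names:

```python
def update_rows(table: dict, id_csv: str, changes: dict) -> int:
    """Apply `changes` to every row whose id appears in the concatenated id string.

    `id_csv` is the comma-separated id list a DAX measure might concatenate
    from the report's filter context, e.g. "1,3".
    """
    ids = {int(tok) for tok in id_csv.split(",") if tok.strip()}
    updated = 0
    for row_id in ids:
        if row_id in table:
            table[row_id].update(changes)
            updated += 1
    return updated

rows = {1: {"status": "open"}, 2: {"status": "open"}, 3: {"status": "open"}}
count = update_rows(rows, "1,3", {"status": "closed"})
print(count, rows)  # 2 rows closed, row 2 untouched
```

The real write would go to a database rather than an in-memory dict, but the parse-then-apply loop is the whole trick the hosts describe.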

13:08 Good. Let's go into the review. So, we're going to go through the article. Sometimes they do a book of news around all the things that come out at FabCon. For whatever reason, they didn't do a book of news this time; it was just the September 2025 Fabric feature update summary, and we're going to go through that. Basically, that's the list we're going to go through right now, that entire blog post, together. All right, Tommy, do you want to kick it off here for the first

13:40 One? Picking out the first item in the list. Yeah. So, usually I offer to you, but this one I'm actually going to be a little selfish, because I'm really excited about this one and I want to make sure I talk about it here. Sure. We've talked about this for a while. I've complained for a while about the user interface, the experience for us building in Fabric. I like my IDEs. I like my desktop. And I think the browser is a subpar experience. And someone might have been listening, because my first pick I'm going to put

14:13 Off the board is Fabric's getting a multitasking user interface and a developer-friendly upgrade. This is in preview right now. Oh yeah. Okay. All right. I want to hear your opinions on this one, because I do have opinions on this one. Go ahead. Let's hear it. It's the progress here. It's not the results right now. It's the journey. Okay. So, we're in the pursuit of happiness. We're not at happiness right now. So, the idea here is simply that, rather than having actual tabs on my

14:46 Browser, and gosh, I have too many tabs of Fabric in my browser. I have bookmarks and all these things to try to navigate, because I'm usually not just working in a single feature or artifact. I'm working in multiple things, because they all connect. This is no longer the day of a single semantic model doing everything. So they've taken that feedback, and now we can actually have multiple active workspaces with color coding based on those workspaces, these horizontal tabs that allow me to be working in different products. So I don't have to go tab to tab and, oh no, this one needs a refresh in the

15:19 Browser, so it has to refresh, and maybe it saved my work or not. It's going to remember that throughout. The only limit: I have a tab cap of 10 tabs. That's fine. And this hopefully is working towards navigation, context switching, multitasking. And for me, Mike, this is such a huge thing, because I was speaking to someone yesterday and it's like, how would you define your role? And I'm like, it's still that solutions architect life, and that's working in so many different products and artifacts and items. And in

15:53 Order to do that effectively, I need those different tabs. I need those different things open. So that's why that's going to be the first one I'm taking off the board. But you have some opinions here. Yeah. I don't like it. Oh man. Okay. Okay. Let's hear it. I've been playing with it slightly. There are certain parts of this that I think work well, but it was also really difficult to get back to your list of workspaces. The navigation is okay. What it feels like

16:27 They're trying to go after is more like VS Code, where you have multiple tabs across the screen and you have these multiple files open, and you're in a pipeline, you're in a... I would agree with them: at some level I'm constantly switching through different items or artifacts inside the workspaces. But it's very jarring to have the colored groups open, and the icons that they're using just don't resonate with me very well. I was playing with it the other day and it was just an awkward navigation experience. Maybe I'll get

17:00 Used to it over time. I jump around a lot between workspaces, but I'm used to navigating through to the workspace level anyway. I feel like there needs to be a better breadcrumb on everything. A lot of times I'm going between an item in a workspace, and then there's a folder, and maybe the workspace itself, because I'm trying to pick another item. I've got to be honest, I don't think I'm going to use this very much. I'm going to do what I normally do, which is, I don't want to navigate one window with multiple tabs. That's just not how I operate. This

17:32 Computer I have has a very wide screen. I have the ability to open multiple browser windows. I'm going to continue to open up multiple items in multiple different browser windows. To me, that's the most efficient way when I'm trying to work on things. I've got a lot of screen real estate. Now, that's how I operate. I will argue, when I'm at conferences and I'm in training sessions and I'm watching people's machines, the fact that many people do not increase the screen resolution blows my mind. Oh, for demos.

18:04 I don't even know how people build reports, honestly. They're on a laptop screen and they've got the font size or the screen scale so large that I'm like, how do you even do anything on the screen? There's barely any space to put a visual on the page. There's physically not enough real estate on the screen to do anything. Yeah. 100%. You're in a notebook, you can only see one cell or two cells max. Too much. Yeah. So I would rather squint than go through that process. I was at the conference

18:38 And I was working on some demo pieces, and I turned my screen resolution all the way up, and then there's another feature in Windows where you can change the display scaling. Typically the computer picks like 200% on a very small, high-resolution screen so things get big enough to click on. I went all the way down to like 125% and made everything really small so I could have a lot of space to see all the things. I'm with you. So, I feel like this is just one more piece of clutter. And what it doesn't do for me is it doesn't

19:10 Actually get me to what I need, which is, I think, a better way of getting back to the list of items in the workspace. I also was finding some weirdness where, and this is probably just a bug at this point, inside this new object explorer, as they're calling it, I had a folder with items in it, and it would not let me collapse the folder for whatever reason. I don't know if it was a bug or if that was intended. So, there are just some weird things; if I have a lot of items in a workspace, that needs to be a better experience. Here's what I

19:44 Would love to see, and I think my ideal. At the end of the day, Mike, I'm a tab guy, too. I have extensions to manage tabs automatically, because it's just a problem. But I would love to see what we're doing in VS Code with all these extensions. And here's the thing: a lot of these are already files. Pipelines are already a file type. Just embed them. I don't even care if you're embedding a pipeline or a notebook in a tab in VS Code, but let me see the Fabric user interface. But I love

20:17 How I can navigate with the extensions. Right now it's these collapsible accordions, even with the workspaces. Or, I think it's the Fabric toolkit, that extension is organized based on the artifact: you expand the workspace and then it expands by, oh, here are semantic models, here are notebooks, and it just shows me that, and I can open the tabs there. Because at the end of the day, I know we're working in the browser, but you're giving me all these products that I don't think are browser based

20:52 Or align to a lot more things. So, it's a step in the right direction. And I agree. I still have tabs open. I have tabs in tabs, basically, right now. But we're working towards something. For me, it's the idea that we're finally working towards a different user interface. And that's a huge win for me. Even if it's not where we want it right now, it's a huge win for me. Yeah. I would also agree with what we have today. Again, I'll pick on both of these things, right? I'm going to complain about both of them. Even having the workspaces in the left-hand side nav and then opening all

21:25 The icons in the left-hand nav isn't really ideal either. So, there's really no easy way to get all this stuff done and dialed in. The powerbi.com experience is becoming, in my opinion, more pro-developer every time we look at it. It's becoming more pro and more pro, and there are fewer and fewer experiences in powerbi.com that are focused on the consumer, the reader, just getting access to reading reports and

21:56 Building reports. See, for me it's different, because there's nothing I hate more than when I go to a different workspace. I feel like I'm so out of the experience I was in, and I want things to be seamless, or feel that way. I don't know why. Maybe it's a first-world problem here, but when I have to go to a different workspace, or even if I'm in a notebook, right? Know what I do every single time? Rather than getting out of that notebook, I Ctrl-click, because I want to keep that tab open, because every time it feels like I'm out of that workspace or out of that experience. So, it's a big thing. So, all right. So, maybe not the

22:29 Biggest first round we've had. So, what's going to be your first pick? We've got a ton to go through. Oh, we've got a ton to go through. So, let's go through some other features here. One of the ones that I'm extremely excited about, and this is going to sound weird just because the name of it is awkward, it doesn't really look good: notice under Dataflows Gen 2, there's a ton of new features there. The one that I'm most excited about here, there was a slide in the keynote, and I took a screenshot of it and put it out on my Twitter page and posted it around. Dataflows Gen 2 is getting an

23:01 Incredible speed up. They are drastically improving the compute engine on the back end to make it really, really fast. So, this is a couple of features all rolled into one: accelerate your Dataflows Gen 2 with preview-only steps, so the preview experience is getting a little bit better, that's helpful. The modern evaluator, it's .NET-based for Dataflows Gen 2, with CI/CD attached to it. So, finally, the dataflows team is actually building dataflows with continuous integration and continuous deployment as part of the

23:33 Product. Every time you build a new Dataflows Gen 2 experience, it's just going to have CI/CD attached to it. Great. Don't ever build anything with Dataflows Gen 2 without CI/CD already built in, right? If it's not part of it, I don't want it. And then, the one that I'm really excited about is partitioned compute for Dataflows Gen 2. When you click on the partitioned support, what this gives you is going to drastically increase the speed and the performance. So you now have a couple of options inside the menu. You have

24:08 You can use fast copy, you can use partitioned compute, you can use the modern query evaluation engine, and now you can add concurrency, limiting the number of concurrent evaluations. So all these things are going to make Dataflows Gen 2 run extremely fast, and they were showing huge improvements on speeds and numbers, which will bring down the cost and make them much more in line with where they should be. Interesting.
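
Conceptually, partitioned compute with a concurrency cap works like the following local sketch (hypothetical names, not the Dataflows Gen 2 engine): split the source into partitions, evaluate them in parallel, and limit how many evaluations run at once:

```python
from concurrent.futures import ThreadPoolExecutor

def evaluate_partition(files: list) -> int:
    """Stand-in for evaluating one partition of a query (fake 'work' per file)."""
    return sum(len(name) for name in files)

def run_partitioned(files: list, partitions: int, max_concurrency: int) -> int:
    """Split the source files into partitions and evaluate them concurrently,
    capped at `max_concurrency` simultaneous evaluations."""
    chunks = [files[i::partitions] for i in range(partitions)]
    with ThreadPoolExecutor(max_workers=max_concurrency) as pool:
        return sum(pool.map(evaluate_partition, chunks))

files = [f"sales_{i}.csv" for i in range(10)]
total = run_partitioned(files, partitions=4, max_concurrency=2)
print(total)  # 110 — same answer as a serial pass, just evaluated in parallel
```

The engine's version adds the parts that matter in practice, like generating partition keys automatically when combining files, but the shape of the speedup is the same: independent partitions evaluated concurrently under a concurrency limit.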

24:39 One of the pieces of feedback, I was talking with some of the Microsoft team, they were saying, look, we realize we have gotten a lot of feedback that we're too expensive compared to Dataflows Gen 1, we're too expensive compared to copy job. And my argument has always been: Microsoft, yes, you were competing against visual-based engineering tools like Talend or other programs, but now you're competing against Spark, now you're competing against user data functions, because these are things that exist inside Fabric. So if we want

25:13 Dataflows Gen 2 to stay performant, and yes, it's a graphical interface, I'm okay paying a little bit more for the compute that comes from that system, but I'm not going to pay double. I'm only going to pay like 10 or 20% more than if I was going to do it in a notebook. So if you make it so expensive compared to everything else that I'm running, I'm not going to use it. So I think this is a major improvement. And the major notes here: it delivers parallel processing, automatically partitions data sources, and evaluates each partition

25:45 Concurrently. It will now support things like Data Lake Gen 2, Fabric lakehouse folders, Azure Blob Storage. It automatically generates partition keys to optimize performance when you're combining files. These are things that people are just going to do. So there's a huge amount of improvement on the back end that I think is going to make Dataflows Gen 2 much more performant, and this is what we've been complaining about. We've been complaining about this for months, and they finally got around to it. And I'm really surprised you took this as your first pick off the board, Mike,

26:17 Because where I immediately went, since it sounded too good to be true, was the performance considerations in the docs, and just a few things to keep in mind. Because here's the thing: to your point, I've been a huge proponent of dataflows. When my daughter was born, dataflows came out. I'm still upset that I missed the announcement, but I'm very happy my daughter was born. But there are a few things here that I'm just

26:49 Concerned with right off the bat, and we'll say, as it's announced, because it's preview. Both of these things are preview. Not all transformations are supported, and I really am peeved that they didn't list which transformations. Keep in mind, on both of the articles they have, around partitioned compute and around the modern evaluator, they say, hey, just so you know, it only supports a subset of transformations. The other one says complex transformations may not be supported.

27:24 Let's be real here: if I'm doing complex things, I'm bringing it to the lakehouse anyway. No, no, no, no. This is where I'm going to disagree with you, because the point of a data flow: the data flow should be able to do what a notebook does. There are two options; why then have data flows? Because it is your stop gap until you get to actually writing real code and doing notebook data engineering. That's really what we're talking about here. So in situations like that, I would argue, if

27:58 You're going to be doing complex things, you're probably going to get better performance and have more control over what you're trying to do in notebooks anyway. And of course, where I see the majority of business users starting out: they start out with Power BI Desktop, they like Power Query because it's already there. That's the equivalent to Dataflows Gen 2, right? So a lot of people need the tiered pricing. To start with, okay, Dataflows Gen 1 cost me a thousand CUs to run that, right? Dataflows Gen 2 now needs to be more

28:31 Performant than Dataflows Gen 1; it's got to be an improvement, like 600, right? And then you've got to have notebooks that could run it even more efficiently, like 300, right? So now you're tiering down: if you write more code, it becomes more cost effective, potentially, right? So that's fine, but the fact that I can go from Dataflows Gen 1 to Dataflows Gen 2 and get better performance and get the CI/CD that I want, that's already a great message to business users that are just pure Power BI and are willing to go into

29:03 Fabric things. I would also argue a lot of the stuff that's happening is just general shaping, dropping columns, not a lot of complicated things. Like in Gil's book around learning Power Query, learning M, right? 75% of your stuff is just simple. It's just bring data in, drop a couple of things. That's fine. But I know when you get to those more complex transformations, to your point, Tommy, the business should really be paying attention: okay, we're doing more complex transformations, Dataflows Gen 2 may

29:36 Not support it in these accelerated ways, which is fine, and I would agree they should let us know. But then I would argue, well, if you're doing complex transformations, maybe the data flow isn't the best place to do it. Again, this is really that entry-level space where people begin. Yep. I hate that you say it's entry-level, and I know what you're saying, but I've seen what data flows can do. I've seen what Power Query can do. Heck, I don't even have to say data flows. And how much is possible? Like, why support then

30:09 Iterative functions? Why support custom functions in Power Query to do what it can do? That is possible, right? Why have that even as part of the language if it's going to be wonky and slow, and say, no, we're going to push you elsewhere? And maybe you're right, in the scheme of things, Dataflows Gen2 is for "we're adopting Fabric" scenarios: marketing or sales is using Fabric for the first time, hey, here's a nice user interface, you can push your data here, oh look, there are some nice pretty buttons to click so that you

30:42 Can manage your data that way, and anything else, we move on. Maybe that's the role of it. I don't want to limit it that way, because again, given what it can do, that's almost selling it short. But maybe that's where it lies in the world of Fabric. But here's the thing, Mike: if we didn't have these two features announced, we'd still be at a stop gap where I'm not recommending data flows at the end of the day either, right? Because of the cost. So this is actually finally a

31:16 Reason why I would tell a team, you know what, data flows might actually be a good option for you. Up until now, I have not been recommending it. Same. And that's why I'm making this point: we're talking about game changers, and this is changing that. Again, I need to do some internal testing on this. I need to run some of these myself and see what happens here. But I'm looking at this going, okay, if this is going to hit the performance that I think, I need parallel processing, dividing queries a bit smarter, it's got some net stuff behind it, like more

31:47 Efficient to run. I'm seriously going to say, okay, I will try this thing again, I will test it out. And so I'll redo the test I did last time. It was a real-world use case: I had a client doing some stuff and I moved some data through, and that was exciting. Having that data move through that way was incredibly important for me to get my head around, okay, this potentially could start changing how we design again. And when I think about game changers, I think about what recommendations I would give to different businesses, and if

32:19 This changes the recommendations, I'm saying it's going to be a game changer. I love it. And I also have to give the team some credit, right? We've been harping on them for months, saying how bad it is, don't use it, get away from it. I think this is a bit of a redeeming note, to say we should probably re-evaluate it and look at it again. All right. So, this was a tough second pick because I wanted to go one of two directions, but I'm going to go where my heart is. Mike, I've been having a ton of fun with data agents.
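The capacity-unit tiering Mike describes in the data flows discussion above can be sketched in a few lines of Python. The CU figures below are the hosts' illustrative numbers for the same job at each tier, not published pricing; the names are made up for the sketch.

```python
# Illustrative sketch of the CU (capacity unit) tiering argument:
# the same transformation job gets cheaper as you move toward hand-written code.
# Numbers are the hosts' hypothetical figures, not real Fabric pricing.

cu_cost = {
    "dataflow_gen1": 1000,  # baseline: low-code, most expensive to run
    "dataflow_gen2": 600,   # improved engine, cheaper than Gen1
    "notebook": 300,        # hand-written Spark/Python code, cheapest
}

def relative_savings(option: str, baseline: str = "dataflow_gen1") -> float:
    """Fraction of CU cost saved versus the baseline option."""
    return 1 - cu_cost[option] / cu_cost[baseline]

for option in cu_cost:
    print(f"{option}: {relative_savings(option):.0%} cheaper than Gen1")
```

The point of the tiering is the incentive structure: a team can adopt Dataflows Gen2 for the usability win today, and graduate to notebooks when the cost of complex jobs justifies writing real code.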

32:52 Really testing things out, really putting some projects together, but I've still had some frustrations around a really essential part of any agent, or any concept of an agent, which is fine-tuning: going through and understanding how it did what it did, modifying that, changing that up. And we just have not had that ability. So there are a few things with data agents that I'm going to bundle together; we'll call this a blue chip trade. But what we have with data agents now, the main one I'm going to highlight, is

33:24 This idea of discovering the query results used by the data agent. So you tell the data agent, hey, show me this or do this. Before, it would just spit out an answer and you were left saying, well, I hope that's right, I don't know where it came from, but I hope that works. But now what we actually have with data agents is the ability to simply see, hey, these are the query examples that it used. It's going to review the example queries that you provided and that helped guide the reasoning. So I tell the data agent, here are the

33:58 Different queries I want you to use; which one did it use when it actually generated a certain result from a certain question? This is such a big part of having a data agent work effectively, and consistently, maybe, is the better word here. To use it consistently, for trust, is: I want to make sure every single time it's going to use this answer, or this query, or the examples that I provide it. So this is such a big part of what we do. I also have in here the ability to use the SDK with the Python client, which is awesome. So I

34:32 Can use this in VS Code, I can use this in other applications, and I get this in Git now, which makes me very happy. But again, a big part for me is that I'm providing examples, and I know when it's using a certain example, so I can fine-tune that data agent. We have to think about data agents, in the world of developing Copilot models, differently than we do semantic models. It is a generate, fine-tune, generate, fine-tune type of approach, where we need to understand we're not getting it right the first time, and

35:05 Right is even the wrong word to put here. Again, Anthropic, who develops Claude, probably the second biggest AI model, their biggest white paper to date is their best guess at how Claude works. So we don't even understand how this works, but what can we do to help steer it in the direction we want? So for me, this is a huge win for people who are using data agents. I would agree with that one. I haven't spent as much time on data agents; I'm playing around with it. I've got a session coming up this week, so on Wednesday we'll have a live building a data

35:38 Agent session coming through as well. So I'm again getting more into it, trying to figure out where we can use these things. What can we do with them? How do they interact with your data? A lot of the concept of data agents is exposing your semantic models, your lakehouse data, your tables, giving users the ability to ask questions against that experience and basically have it write SQL for you and try to generate simple visuals, to help you get answers out of the data that you may not be thinking about as a user. It's still very non-deterministic yet. It's getting better, but I

36:13 Think this is a way better experience than when we were doing Q&A on our reports. But again, the big point here, Mike, just like translytical, it's more than writeback. This is more than just Q&A-generate-a-visual, because the key here is that I can integrate this with Copilot Studio. And for what I'm doing, where there are a lot of projects, it's such an essential part of it. Goodness, it's amazing. So the fact that we can do that and then fine-tune it, well, I'm saying fine-tune very

36:46 Liberally here, but the ability to do something close to that is pretty great. So yeah, that's going to be my second pick here. Mike, one thing, yeah, one thing I was just shocked about is that there's no mention in this blog post of DAX user-defined

37:19 Functions. Did they talk about that at FabCon? Because I know there was a separate article on that that came out. Multiple people were discussing it, and actually Michael in the chat here is describing that he thinks DAX user-defined functions are going to be a game changer, and I do think so as well. I believe Marco Russo has just built a SQLBI DAX user-defined functions library, or collection, or something; I'm not really sure exactly what it is, but there's a new tool they're coming out with. Again, this is one of those options: it's a great feature, but where's the library? Where's the community part of this? How do we easily get this into our semantic models? I think that's another one where Microsoft is building the structure of things, but there's no deep

37:52 Integration for how to get it in there. So we'll see how this goes; that'll be interesting. I think DAX user-defined functions will be really good. I also think they will probably be a game changer for many people to reuse DAX, but until you get a collection where you can centralize all these DAX functions for an entire organization, this won't be super effective yet. It's interesting you say that, and the reason I haven't mentioned it is because we might be talking about it in the next week. But a big part here,

38:25 Mike, is this is very similar to what Microsoft has done with some of the other Power Platform tooling. For example, I've been able to do this in Power Apps for years, and it's a big part of it. But honestly, the library side is a little different here, right, Mike? There are things you can do from a universal point of view, but this is very different from a macro in Tabular Editor or a script you may have in VS Code that's easily shared, because it's going to be so custom to your

38:57 Organization, team, or department. And this is almost like what we just talked about with user data functions: you're limited by your imagination, or by what's possible. Now, I think the big part here is that it lives in a semantic model. Any user-defined function that you create lives in just that single semantic model. Again, I'm not going to talk about it too much

39:30 Because we may or may not be talking about this for a full episode, and I want to give that its due. I put this as a game changer personally. Seeing the announcement that came out, I think, even a year and a half ago, it was a while ago when they said this was something they were working on. This is going to be big for a lot of use cases. Now, whether or not it goes across semantic models, we'll see what the impact is. But as it stands right now, I think this is a huge feature for us, and it may change how

40:04 We develop and write our DAX. I think it will. I'm of the opinion that there is a small group of people in the world that are extremely smart and will figure a lot of this out, right? The knowledge of those individuals needs to be disseminated as quickly as possible to as many people as possible, where we can go pick things up, reusable. That's the difference here. That's the point: it's reusable DAX. And so I built a tool like four or five

40:38 Years ago now that was basically a DAX library. You could bake DAX, you could make it parameterized, and then you would just inject it into your model. It would then say, here are some variables that you need to inject into this DAX statement from your model, and help you use it. Functions fix this. Functions allow you to have input parameters to your DAX, which lets it be parameterized in a way that lets you move it between models. It's proper code design. Yeah. But it's also complex. It's going to take some effort for

41:09 People to figure out how best to use these things. I think it's a big win. Code reuse is going to be huge, but again, my main issue here is: code reuse is awesome, we just need a place where we can go get all the examples. How do we easily get it from Tommy's brain into a collection my company can use, and then easily inject it into my semantic models? That's the limitation right now, because you create the user-defined function in a semantic model, which is great, but to me this is something that should

41:41 Live a level up, in the tenant, where it can be reused. Because in my initial impressions with user-defined functions, there's almost a redundancy right now, because it only lives in the semantic model. And if I'm creating a measure for, let's say, our sales tax, I'm probably going to need that in more than just a single model. So that means I'm creating the same UDF in each model, which almost misses the point to me,

42:16 Right? So those are my initial thoughts here. And again, this is something that just came out, so there's a lot to be foreseen. And anything that Marco Russo or Kurbeller does, I'm all for: bring it on, take my money, that type of scenario. But yeah, I think right now there's a bit of redundancy when it lives in a single semantic model. Regardless, I'm expecting that two years from now, if I'm interviewing someone for a data

42:48 Analyst role in Power BI, they have to know this. More than just knowing it exists, they have to know the use cases, be able to create it, and have it be part of their process. That's how I see it. Just like a measure table, I think pro developers... well, I would actually argue measure tables are dead; I would not recommend that. But you're right, this is something I think is going to be so pivotal that people are just going to need to understand and know how to use DAX user-defined functions. That's going to be very important. And I think you're going to get away with a lot more things, like, hey, I want to write this sparkline

43:21 SVG, all these other different things. You're going to want to consume those things, you're going to want to make them as functions. NJ Park has already been talking about how to use these things and create these elements as well. So I think that's a key movement here, to get some of that stuff working. I would totally agree with that. Beautiful. All right, so that was your pick. My picks were the multitasking in Fabric and then the data agents. Okay, well, that wasn't my pick.
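The reusable, parameterized DAX idea discussed above, which Mike's old library tool approximated and which DAX user-defined functions now address inside the model, can be sketched as a tiny template library. The template names and DAX patterns below are hypothetical examples, not a real product or SQLBI's collection.

```python
# A minimal sketch of "reusable, parameterized DAX": shared patterns with
# placeholders, filled in with names from a specific semantic model.
# Templates and names are illustrative, not a real library.

DAX_TEMPLATES = {
    "ytd": "CALCULATE({measure}, DATESYTD({date_column}))",
    "pct_of_total": "DIVIDE({measure}, CALCULATE({measure}, ALL({table})))",
}

def render_dax(template_name: str, **params: str) -> str:
    """Fill a shared DAX pattern with model-specific measure/column names."""
    return DAX_TEMPLATES[template_name].format(**params)

ytd_sales = render_dax("ytd", measure="[Total Sales]",
                       date_column="'Date'[Date]")
print(ytd_sales)  # CALCULATE([Total Sales], DATESYTD('Date'[Date]))
```

This is essentially what a UDF gives you natively: named input parameters to a DAX expression, so the pattern travels instead of being copy-pasted; the open question the hosts raise is that the UDF still lives per model rather than tenant-wide.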

43:53 I'll give you my pick next. That was just a trade. Okay, that was a trade; I just traded that one. So let's go after another really interesting item here. Another one that I think is extremely useful, and maybe a little more subtle to people: I deal a lot with organizations that are larger and trying to scale out and build things, and one of the main challenge points has been how you handle multiple environments in Fabric. It's been a challenge for a while. Variable libraries are going GA. I like variable libraries. It makes a

44:27 Lot of sense. It definitely helps out with CI/CD, continuous integration and continuous deployment. And now it supports Dataflows Gen2 and also copy jobs. Yeah. And to me, this is a great win. This solves a lot of the problems we had with deployment pipelines. It allows you to change server names; it allows you to change things in line as you deploy. This is really, really sharp, and it's actually not that bad to implement. They've done a pretty good job here. So I really like variable libraries. And as a general note as well, if you go

45:01 Through and search the blog post for the words "general availability", oh, I was going to mention this, there's a lot of it. And I like the fact that the team is really trying to make features complete and say these are all GA. If you look at every single section, there's almost an announcement in every one. Dataflows Gen2 had four GA items. Pipelines has five GA items. Gateways are getting generally available items. Mirroring stuff is becoming

45:34 Generally available. So, in addition to things like AI-powered code generation in Data Wrangler now being generally available, there's a whole bunch of really interesting generally available features coming out. And this is one that I think is going to be extremely useful for people, and I'm very pleased that the variable library is getting some love. I would like to see it continue to extend to more things; there's a lot more people want to switch per environment. But I think variable libraries are going to solve a lot of challenges, and the fact that it's being integrated

46:05 With Dataflows Gen2, which is another big pain point I've had: I couldn't use Dataflows Gen2 because it didn't integrate with different environments. It's solving some of those. So I'm very, very excited about that one as well. That would probably be my next pick, on the professional view of things: getting that added into GA and having new features come out for it. Really like that one. I think it needs to be said, though, Mike, to really reiterate

46:36 That, and I noticed it too and was going to mention it: just how many GA, generally available, features there are. Mike, you and I have, I think, a biannual episode we should call Kill It or Complete It, and they're knocking a lot out for us. They're solving a lot of our episode here, because honestly, it's very frustrating as a consultant, or as anyone in a company, when you want to recommend something for adoption. And this is very hard for me right now, where a lot of companies ask,

47:08 Should we invest in this? Well, it's in preview. When's it going to be out of preview? We don't know. And a lot of companies are hesitant to move forward with those items. Oh, we just lost Tommy there; hopefully Tommy can get back. No, I'm here. All right, I lost you for a second there. Your internet's been really lagging, so maybe you need to check on some stuff, make sure nothing's running. Oh, no, we're going fast on my end, but I lost you. All right, we're back. We're back. You see

47:40 Me? We're good. I can see you now. Two fingers. All right. The fact, anyways, that we have so many generally available things is essential to what we do and to moving forward with Fabric. We need to see more of that, so that's a huge win. Mike, I'm going to take it from here. I wanted to go two directions, and I'm surprised that you, one of the nine companies who've done it, have not talked about this yet, so I really want to get your thoughts here.
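The variable libraries pick from a few minutes earlier boils down to one idea: a single variable name that resolves to a different value per environment at deployment time. A conceptual sketch of that resolution, with made-up variable names and server values (this is not Fabric's actual storage format), looks like this:

```python
# Conceptual sketch of what a variable library solves in deployment pipelines:
# one variable name, different values per environment, resolved at deploy time.
# Variable names, servers, and IDs below are illustrative.

VARIABLES = {
    "sql_server": {
        "dev": "dev-sql.contoso.com",
        "test": "test-sql.contoso.com",
        "prod": "prod-sql.contoso.com",
    },
    "lakehouse_id": {
        "dev": "lh-dev-001",
        "test": "lh-test-001",
        "prod": "lh-prod-001",
    },
}

def resolve(variable: str, environment: str) -> str:
    """Look up the environment-specific value for a variable, the way a
    deployment pipeline would when promoting an item between stages."""
    values = VARIABLES[variable]
    if environment not in values:
        raise KeyError(f"No value for {variable!r} in {environment!r}")
    return values[environment]

print(resolve("sql_server", "prod"))  # prod-sql.contoso.com
```

The significance of the GA announcement is that Dataflows Gen2 and copy jobs can now participate in this resolution, which is what previously blocked multi-environment data flow deployments.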

48:13 But the Fabric Extensibility Toolkit, well, speaking of Microsoft giveth general availability and taketh away with preview, this is a new preview feature. The Extensibility Toolkit is really the evolution of the workload development kit. This is going to be my final one, if I had a final item to give out. Okay, so why don't I pause for you? I'm going to do a quick one here and then I'll give it to you. The Fabric CLI, command line interface, is now

48:49 Open source, and it's also at version 1.2, which means a lot more command line arguments, autocomplete, folder support, and output formatting. So Mike, I've developed, with Cursor, this Python package that will basically look at any command line executable, find all its arguments and options, and then add those to an autocomplete feature that I have in the terminal. So I can say, hey, autocompleter, feed it the Fabric CLI, and now I have

49:26 All these options available to me. This is a huge thing for me, because I actually really like working in the terminal; especially for the local model testing that I do with the AI stuff, this is a pretty essential feature. But I also obviously do a lot in VS Code. So I love that interface, and I love being more efficient, and you're more efficient if you know how to use the terminal, even the command line interface. So give me that. Give me some AI-assisted contributions. Make life

49:59 Easier for me with AI agents, and give me the CLI: pip install ms-fabric-cli. So that's what I'm going to do for my last pick. Yeah, I think the CLI thing is interesting. I've got to be honest, Tommy, I don't use it very much. I think for admins this is purely an admin-level feature: you just want to write code and know exactly what you're doing. You really have to understand what you want to do once you get to the CLI. So I like it. Am I there? Probably not yet. But anyways, it's

50:34 Interesting. Okay, so I'll go to my last feature here, with a couple of announcements. I want to be very clear: if you're interested in Power BI embedding, I've been doing a lot of work in this space. We have a product we sell to customers called Intellex, an embedded accelerator for organizations to get started with embedding. We also build a lot of software for Microsoft; anyone may have used Power Designer. If you are using a theme generator, we now offer Power Designer, which does even more cool

51:06 Things. You can build your report design inside Fabric, and with one button press publish it directly up to a workspace and even connect it to a semantic model. Super cool there. Mhm. So we're doing some really impressive things with workloads version one. Now workloads version two has just shown up, and we've got three additional workloads on the docket that we're ready to get out the door. Wow. So, we're going to build a data marketplace; that's coming, so stay tuned. You're going to see that in preview fairly shortly, as soon as

51:38 Microsoft can figure out how to get the marketplace working. We have a private preview, and we envision Fabric-to-Fabric tenant sharing: I have data in my tenant and I need to give shortcuts or tables to another organization. We're going to enable that, Fabric-to-Fabric tenant sharing, through a workload. We have another workload we're going to be working on that's going to extend DANB and Charticulator directly into Fabric, giving you another way of interacting with those tools. And we have more workload designs coming as well. So our team is really loving this

52:12 Fabric Extensibility Toolkit. We've got versions out there; we are in preview right now, with a private preview on the data marketplace items. It's really exciting. This is giving extreme capabilities to users who need to get into Fabric and build experiences dedicated around data products, things that are friction-filled that Microsoft hasn't developed. This is a place where you can do all that extra work. So stay tuned: Carlo Consulting and PowerBI.tips are going to be producing a

52:45 Lot more workloads here in the near future. All through the version two that you're doing? All through version two, yeah. We're all going to be using that one moving forward, I think, and we might even take some of our original Power Designer stuff and move it over to version two as well. So, huge improvements from our side. It's extremely powerful for us, and it covers all the gaps that Microsoft doesn't really fill in; we're going to make them easy to use directly inside the Fabric ecosystem. I think it's massive, super useful. Anyways, I can't speak highly enough of the new Fabric

53:18 Workloads. I've been working closely with the dev team to build out this feature and give a lot of feedback on what things are working and which ones are not. I think you're going to see workloads become a huge part of what Fabric is going to be in the future; there are going to be a lot more useful tools coming out. Right now, I think the challenge is that most companies trying to build workloads don't understand the pricing model. I think they're pricing their workloads too high, and they need to think about better pricing models. So we are being very price sensitive with these workloads, making them very efficient to run but

53:52 Also very easy to get out to as many people as possible. That's our goal; that's what we're working on and how we're handling things with workloads. There are a million more things in this update, but we are almost right at time. Yeah. So I don't think we can pick up any more items. There will probably be more of these topics as Tommy and I explore them and build stuff with them. We're going to need to do some workshops around this. Tommy, this should change part of what we do in October at the Chicago user group. Speaking of which, you should probably give us a pitch for that. What's happening in October? Oh, there's a Chicago Fabric user group downtown, in

54:27 Person at 3:00 PM, so actually not at the end of the day. If you want to join us, go to Meetup, our Chicago Power BI Fabric user group link. Make sure to sign up this week, by the way, because we need your full name in order for you to get in the door. What we're doing, Mike, is Chicago's crash course on Fabric. For a lot of you, if you actually want to play around with it and navigate it, you're going to have the ability to. If you want to learn about Fabric and what it can do for your organization, bring your

54:59 Boss. Bring those people along. It's $3 to sign up and you get Jimmy John's. And the only reason we charge is because that's what Meetup recommends to make sure we know you're actually coming. Put your money where your mouth is, so to speak. But yeah, it's going to be from 3:00 to 5:00 p.m., so you can take the train out of there afterwards and do whatever you need to do. We're going to be in person; Mike and I will be there and we're going to have a lot of fun. So yeah, a ton of cool things going on there. Mike, I would totally agree with that as well. I think this is going to be immensely useful. Get your hands on the tool, ask us questions. We're

55:33 Going to try and go through everything: touching on some basic features, but also, what is an agent? Let's build one of those. We're going to create a workspace from the ground up, and whatever you all want to do. Sorry to interrupt. No, it's good. I like it. I think this is great. This is going to be something where you need to get your hands on it. I like learning by doing things, and I think this will be a great experience for learning by doing. Yeah. Awesome. Love it. That being said, thank you all so much for listening to the podcast. I hope you enjoyed this summary of what happened at the Fabric Conference, and I hope this also highlighted some of the

56:04 Features that we think are going to be extremely valuable to you as a user. And we really hope that you enjoy interacting with these brand new features that have just been published. That being said, Tommy, where else can you find the podcast? You can find us on Apple, Spotify, wherever you get your podcasts. Make sure to subscribe and leave a rating; it helps us out a ton. And share with a friend, since we do this for free. Do you have a question, idea, or topic that you want us to talk about in a future episode? Do you have an argument with what we said? Do you want us to expand on something we might have glossed

56:36 Over? Well, you can do so at PowerBI.tips/mpodcast. Leave your name and a great question. We love the mailbags; keep them coming. And finally, join us live every Tuesday and Thursday, 7:30 a.m. Central, and join the conversation on all of the PowerBI.tips social media channels. We thank you so much, we appreciate your time, and we'll see you next time on the podcast. See you on Thursday.

Thank You

Want to catch us live? Join every Tuesday and Thursday at 7:30 AM Central on YouTube and LinkedIn.

Got a question? Head to powerbi.tips/empodcast and submit your topic ideas.

Listen on Spotify, Apple Podcasts, or wherever you get your podcasts.
