PowerBI.tips

Fabric Capacities – Ep. 226

Fabric makes it easy to light up powerful workloads fast—which means it’s equally easy to start spending CUs before you’ve built the habit of watching them. In Ep. 226, Mike, Tommy, and Seth talk through the practical side of capacity management: what to monitor, why preview-era artifacts can skew your metrics, and how to set guardrails so your pilot doesn’t turn into an accidental production bill.

News & Announcements

  • Bernat’s tweet: creating a Dataflow Gen2 spawns several extra workspace items—staging warehouse, staging lakehouse, and auto-generated data sets.
  • The on-premises data gateway has added support for Lakehouse connections.
  • Beginning March 15, Power BI dataflows using an on-premises data gateway version older than April 2021 may fail—keep gateways up to date.

Main Discussion

Topic: Fabric capacities, CU visibility, and avoiding “mystery” spend

Fabric capacity management isn’t just about picking a SKU—it’s about building feedback loops. The conversation starts with the Metrics app (so you can see what’s actually burning CUs), then moves into a tougher question: what counts as ‘expected’ workload versus background activity you didn’t intend to run.

They also hit a theme that’s going to keep coming up during Fabric adoption: preview features can be valuable, but you still need production-grade habits—cost visibility, ownership, naming standards, and clear rules for when compute is allowed to run.

Key takeaways:

  • Install and use the Capacity Metrics app early—capacity decisions without telemetry are just guesswork.
  • Assume background artifacts have real cost until proven otherwise; validate what gets created automatically in your workspaces.
  • Treat ‘preview’ as ‘still needs governance’: define owners, naming standards, and a process for turning experiments off.
  • Use pause/resume strategically to control spend during off-hours, but make sure it won’t break refresh or downstream jobs.
  • Scale capacity based on real workload patterns (refresh windows, concurrency, interactive usage), not on vibes.
  • Licensing/entitlements are part of the operating model—align access with who’s accountable for cost and reliability.
  • Build a pilot playbook: measure CU burn weekly, review anomalies, and only then expand to more domains/workspaces.
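The pause/resume guardrail above can be automated against Azure Resource Manager, since Fabric capacities are ARM resources. A minimal sketch, with assumptions clearly flagged: the `suspend`/`resume` action names and the `2023-11-01` API version should be verified against the current `Microsoft.Fabric/capacities` reference, and the subscription, resource group, and capacity names are placeholders:

```python
# Sketch: build the ARM action URLs used to pause/resume a Fabric capacity.
# The action names and API version are assumptions -- confirm them against
# the current Microsoft.Fabric/capacities ARM documentation before use.

ARM_BASE = "https://management.azure.com"
API_VERSION = "2023-11-01"  # assumed; check the ARM provider reference

def capacity_action_url(subscription_id: str, resource_group: str,
                        capacity_name: str, action: str) -> str:
    """Return the ARM URL for a capacity action ('suspend' or 'resume')."""
    if action not in ("suspend", "resume"):
        raise ValueError(f"unsupported action: {action}")
    return (
        f"{ARM_BASE}/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        f"/providers/Microsoft.Fabric/capacities/{capacity_name}"
        f"/{action}?api-version={API_VERSION}"
    )

# In practice you would POST to this URL with an Azure AD bearer token,
# e.g. requests.post(url, headers={"Authorization": f"Bearer {token}"}).
if __name__ == "__main__":
    print(capacity_action_url("sub-id", "rg-fabric", "cap01", "suspend"))
```

This is the same operation as the portal’s Pause button; before wiring it into a schedule, confirm (per the takeaway above) that no refreshes or downstream jobs run inside the paused window.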

Looking Forward

Build a simple ‘capacity ops’ checklist—Metrics review, anomaly triage, and a pause schedule—before you expand Fabric usage beyond a controlled pilot workspace.

Episode Transcript

0:31 good morning and welcome back to the explicit measures podcast with Tommy the Seth and the Mike this Seth is here Mike ready to go man the mic is here to go Seth the mic the Tommy we’re all here and just in case you three times wasn’t enough let’s do it again guys who’s all here again roll call off one right [Laughter] oh wow oh man is it felt like a long

1:01 week for you Ty you guys I feel like it’s been a long week for me so far yes yes to all days catching up on things been utterly crazy it’s good though so let’s kick off some introductions any news around the web I think we have a tweet from Bernat we’d like to pick on here a little bit who found this tweet was this you Tommy yeah I was going through the Twitter threads and something that’s a bit near

1:32 and dear to me in terms of what’s going on with fabric so we’ll put this in the chat as well yeah here’s the chat here’s the thread the crux of this is when you create a lake house there’s some things being built and we’ve talked about this already but there’s a bunch of these data flow gen 2s sometimes like four extra items that I don’t know necessarily what to do with and it’s one of the more

2:04 frustrating things because I’ve been dealing with trying to create a data set like the default data set in fabric like let me just recreate something just see how it works see how quick everything is well there’s I won’t say problems but a getting used to right where let’s say you’re using Direct Lake query mode and how do you know the last time the data was refreshed even though from Power bi from desktop well it’s the latest you have in

2:35 the lake house there’s a test so that’s a good question so with Direct Lake apparently you just link it together and it’s just supposed to know when there are additional files or when the Delta table changes the power bi report automatically updates I don’t know what the cadence of that is I don’t know how fast it can pick up those new partitions or new pieces of data what I wanted to do is I’ve wanted to have like a report that’s automatically refreshing like every minute and then have data being entered into a table every two

3:05 minutes or something like that so like every two minutes I can see okay the last piece of data that entered was five minutes ago it keeps adding I know the Delta table is different because it keeps adding data to it okay what is changing on the report side and how long does it take for the Delta table update to automatically get the changes into Power bi which should be interesting I feel like I’ve heard someone in some presentation somewhere say it was about 15 minutes between when the Delta table updates itself to when

3:36 the data appears in power bi and frankly that’s not that bad who was I talking to oh well I was also talking to someone have you ever run into a situation where you were presenting or you were refreshing data sets multiple times per day and you now have multiple people looking at the same report and getting different results because their browser cached

4:06 something or the data changed throughout the day and we’re now having disputes about the information I haven’t noticed that but I have noticed recently when making a change to a report that’s published in an app yeah I’ll publish the change and people won’t see it and then they’ll refresh and they still won’t see it and they have to log out and then log back in

4:38 interesting actually considering like heavy caching I have no idea so we just encountered a situation where people were questioning numbers in a report and they were refreshing it too fast and I think the case they were trying to make here was don’t refresh your report multiple times per day if you can’t make a decision between when the report was refreshed like if there’s not a decision being made between the time of when the data was entered in and between the refreshes

5:08 there’s no need to refresh your report more than once a day period to me that’s the decision point I had never really encountered this before and now I’m hearing customers struggling with it they were fighting over numbers because the data set was changing throughout the day like it was refreshing every couple hours and then it was causing problems because someone had a different number than somebody else and they’re like ah this is a pain and well we should probably just refresh only once per day yeah

5:38 I agree right so what is the outline here Bernat is saying that he created a data flow yeah he created the data flow and four other items appeared so if you look at his image data flow one appeared and then he got a data flows staging warehouse a data set of the data flows staging warehouse a lake house a data flows staging lake house with some awful GUID stuck to it which again I hate that and

6:08 then it gets another one right behind it another data set so it gets two data sets and it looks like two warehouse objects but one’s a SQL endpoint one’s a warehouse and there were two data sets just off of that one data flow yeah the curiosity to me is like what’s happening in the data flow though are these objects created as part of that or my assumption was that all that just gets contained within the data flow right like I don’t know after kicking the tires on it a little bit but what

6:40 is curious to me and maybe should have come to my mind earlier is in this case where you have a SQL endpoint you have a warehouse you have a data set it corresponded to a thought I had where I’m looking at fabric preview in power bi desktop and there’s connect to a lake house connect to a warehouse connect to whatever why am I connecting to those instead of just looking in one like

7:12 agreed why is there a separation yeah and it’s like they’re automatically creating data sets for you yeah for what but here’s the premise that I have to go test out now today my premise has been it doesn’t matter which one of these paths you use whether it’s lake house or warehouse or Jupyter notebooks or SQL interface

7:44 it’s Delta tables that we’re dealing with yeah it’s all the same stuff on the lake side so I don’t care where or how the data was manipulated I care what the end object is but if it’s all about the end object why do I have multiple connections to go into these things if they’re not separate so here’s where I think again one thing I can confirm is you are 100 percent correct creating a data flow created a whole bunch of other artifacts that we don’t know what

8:16 they’re there for hey we’re not gonna chat hello so I didn’t do anything in the data flow I just created it well good information and so I also did a similar thing where I created a Delta lake or a lake house I guess is what you would call it I created the lake house and by default I get two more artifacts so just by creating the one object of the lake house you get by default a SQL endpoint and you get by default a

8:48 data set from that lake house so when you make those objects other artifacts are like immediately being created for you and in my mental model I’m thinking here there really is nothing there’s no physical items behind the SQL endpoint because it’s like SQL serverless right it’s an environment that you can build in that you can then go reach into the lake and pull out those tables so to me the SQL endpoint is more of like that

9:19 SQL serverless entity now if I relate this to what I know inside synapse when you create a synapse environment you by default get like a default database that is the SQL serverless portion and maybe that’s what they’re trying to represent there hey there’s a SQL serverless database that’s running now you have the ability to connect to those tables directly so that part exists but then on top of it they’re also giving you an automatic data set which I think is confusing because now you have this entire lake house so does

9:49 every table in my lake house automatically get added to this lake house data set thing maybe so now they’ve got this definition of all these tables inside the lake that are now being immediately applied to this Dev Lake 01 data set and so now you have this BIM essentially there’s a BIM being created on top of all these lake elements in general it just doesn’t seem like it’s very clear when I create one I feel like I would want to create

10:20 or on my own say well do I really want to consume this data from the lake house using the SQL endpoint or data set like I feel like I would want to have the choice to say auto create me these things or turn it on so I know what it is well it feels slightly better Alexander popped in the chat and said the dataflow objects which I assume are the goods

10:39 good ones are internal objects that are supposed to be hidden the lake house stuff is on purpose oh let’s see so we’re down to three [Laughter] then you have one of each object though yes right so the data flow staging warehouse table and yes I guess the home for the warehouse yes yes all right it’s getting pretty difficult as I’m starting

11:10 to build in terms of to the previous point not just what you’re orchestrating whether it’s the data flow or you’re doing a job but trying to understand the refresh schedule and how up to date your data sets are going to be because again the big thing is it creates that default data set which again is a pain in the butt if you can’t do anything in Tabular Editor and trying to edit on the web sometimes it doesn’t sync and display folders don’t work but there’s other

11:40 things I’m trying to understand like okay I just have something very simple a Gen 2 data flow pushing to a lake house and I’m like okay is the data set up to date well it really doesn’t matter about the Direct Lake or import mode unless the data flow is updated so how do I know that and I don’t know if you need a this is probably a bug but the default data set I can see the data in

12:10 the lake house view but trying to create a report off of it nothing never tried it before and there’s measures that can be in a data set there’s measures that I think would be in the warehouse and then you can create another data set so it’s a little getting used to right now all right at some point here we can probably talk about a use case scenario that’s near and dear to my heart which challenges a lot of the lake house where you have

12:41 like multiple tenants but that’s not for today we’ll get onto a tangent and end up 30 minutes in discussing things that aren’t even the topic of today speaking of lake house there is a new gateway connection where apparently they’ve added support for the lake house which is interesting to me why would I need a gateway to connect to my lake house but you can now then there’s another blog yeah there’s also beginning on March 15th just in

13:13 case you didn’t know any power bi data flow using an on-premises data gateway version older than April 2021 might fail that’s just a reminder to keep your gateways up to date periodically but those are the two main big things in June excellent good to know about that one gateways are one of these annoying things I just wish they would go away I know it’s an administrative thing I just want it to work I don’t want to have to like hey what when

13:43 when you’re all in the cloud in the same Azure ecosystem it doesn’t matter because it usually does just work yeah without it that’s true very true point there so one more incentive to get you off of on-prem and into the cloud right just one more compelling argument all right so I guess we can transition into our main topic for today this is one that I’m trying to wrap my head around so I’m not sure if I have any more answers than what’s presented to us in the documentation but the

14:14 conversation for today is around how do capacities work inside fabric what capacities can you get and what do you need to know about the capacities that fabric provides to you what are the feature enhancements you get from using fabric so let’s dive in here Tom you want to give us some introduction pieces here give us a landscape of when fabric started and what has changed now that we’ve got fabric what has changed is a whole new set of

14:45 licensing which is I believe coming up if I’m not mistaken it’s the end of June or the end of July that our trial period ends yes anyone who jumped in early is going to be having this it’s going to start running out of your trial subscriptions yes and once that happens the cost comes into play and there’s been a few articles a few pieces of documentation on people

15:15 testing out what the cost would be but still right now it’s a little ambiguous what you’re going to be paying for and not just that but the big part here is obviously there’s licensing and what you need to purchase but really what do you turn it on for and again this goes back to are you migrating everything to fabric what does fabric do where you’re willing to spend the money

15:45 I think right now once you enable fabric in the tenant settings even on a non-fabric capacity if you actually look at the type of workspace in the settings there’s a premium trial capacity outside of premium per-user or premium capacity so I guess they would get converted but the biggest thing right now is really where do we go in terms of what do we turn on how much do we allow while ensuring that in a few weeks our cost

16:17 doesn’t go through the roof yeah I don’t think this is the same thing that Microsoft has done in the past with premium and pro users so hopefully you don’t have a whole bunch of people using fabric I did have a client recently contact me and say hey look my IT administration reached out to me and we have 30 fabric licenses applied what does that mean so

in their tenant they had allowed the ability for people to have a fabric environment however they turned off the ability for the team members to use fabric right so inside the Microsoft settings they said turn on fabric we’re going to allow our team our company to use it but it’s restricted to a security group so only these people can create a fabric workspace however in their tenant they did not turn off the ability for people to get trials of fabric so people were getting trials of fabric applying the license to

17:17 their identity but they weren’t able to create an actual fabric workspace so we had to do like a quick triage okay let’s go look at all the workspaces are there any fabric workspaces laying around no other than the one that you created as a test user okay great all right let’s go through all the users let’s quickly fix the setting inside your admin portal no one gets trials anymore and of the trials that were out there let’s turn those off for those users and push them back to their regular premium or Pro

17:47 licensing so make sure they only have that and they’re not trying to use the premium licensing or the fabric licensing so anyways I thought that was interesting I have not seen any pricing so let’s start with the first article I don’t know where we should start with this one there is a buy a Microsoft fabric subscription page so let me start with this one I think maybe this is the getting started page we should begin with

18:18 I’ll put the link here in the chat window this is the page where you go to learn about how to buy a fabric license it sounds like they’re taking a lot of the same playbook out of premium so there’s an Azure A SKU and then we have a Microsoft 365 SKU have you guys been able to find the Microsoft 365 SKU pricing yet for fabric I don’t see it in my Microsoft admin portal to buy Microsoft

18:50 fabric does that sound right to you I’d say so there’s two SKUs there’s an Azure SKU so if you go into Azure to buy a subscription right you can type in fabric and a license just shows up here’s the license for fabric when you go into Microsoft 365 or your admin.microsoft.com you can search for fabric but the only thing that comes up is Microsoft fabric free and that’s what I see Microsoft fabric free so I don’t think there’s any

19:20 pricing announced for the Microsoft 365 licensing yet yeah I think Alex is confirming here the Microsoft pricing isn’t announced and they’re not going to be sold in office 365 what all right well that would make sense why would you have them in there that’s how you buy P1 SKUs today why would you have a different purchasing method than that right but the whole point of getting out of a P SKU would be

19:50 getting an F SKU correct you would change from a P SKU to an F SKU so you get more features right so I would assume the P SKU is going to be whatever price it is and an F SKU is going to be more than the P SKU that would be my assumption because you’re getting more features well right but I read something along the way where somebody was doing some analysis right like yes in Azure it appears as

20:20 if an equivalent P SKU in the F is more expensive however the vast majority of organizations have discounted rates in their Azure subscriptions okay right so I think and I’ll have to crunch the numbers but I don’t know the blog or whoever was reading whatever showed that essentially it would be the same price or it should be near the same price as a

20:52 P SKU if you were rolling that way so I have an interest we have a lot of embedded capacities etc like I’d love to manage all that in Azure so if there’s a possibility for me not to deal with 365 and P SKUs yeah I agree with that one and so I would love to pull it out but I have been able to see if you go into Azure and you look to add a new asset or a new artifact to a resource group

21:17 you can see fabric there and you can see the pricing and it does show you the F SKUs going down to F2 all the way up to like F 165 or some ridiculously big number herein lies where capacities start getting a little bit confusing to me well let’s keep going on that one so one of the articles feels a bit misleading because it talks about the Azure A SKU which I can see and then there’s literally one sentence talking about the Microsoft 365 SKU it basically says Microsoft 365 SKUs

21:48 also known as P SKUs are power bi SKUs that also support fabric which is enabled on top of your subscription so maybe it’s not a P SKU it’s still just the P SKU I don’t know a P SKU is a fabric SKU I don’t know it doesn’t make sense to me well it works alongside fabric in some cases like it supports fabric a P SKU or the premium capacity SKU still will support some fabric things okay so fabric content so okay so the

22:19 so what does that mean then does that mean a P SKU doesn’t give you all the features of fabric but just some of them this is where I’m getting confused about the licensing again Microsoft’s licensing is just absolutely nuts yeah and it’s getting worse on the licensing page now well I’m on the buy a Microsoft fabric subscription page that’s the page I put the link to in the chat that’s what I’m trying to read through so this is the super simplified one I think the one that starts to bring a little clarity is the Microsoft fabric licenses

22:51 and I’ll paste this here because there’s a table in here okay let’s go to that one it lays out the capacity license types user capabilities and whether or not it supports fabric okay the tweak in my reading of this was you read through this and it just walks through the fabric trial yep shared capacity meaning PPU yep so there is no support for fabric there in the P SKU there is Azure A SKUs there’s not and then obviously

23:23 the fabric capacity what I found interesting was the star and quote underneath some of the organizational licenses mentioned in this table are legacy licenses inherited from Power bi I’m sorry what does that even mean are you referring to my beloved power bi and its licensing as legacy really put up your dukes wait a minute who wrote this article there are five

23:54 contributors I’m going to be following up with them oh my goodness I’m going to read this one again anyway let’s get into the table but that one irked me and so this is what you’re talking about in the workspace element right so workspaces reside inside capacities and so it’s talking about the workspace feature set that goes along with workspaces right because it’s ultimately about what you’re able to do in the workspaces right that matters

24:24 so how we’re assigning capacities would associate to whether or not content is available or works within a particular workspace how this merges together in different workspaces I don’t know or if we have to cut over all of them at the same time what I will say is and I can’t find it immediately but I swore I saw an article where you can upgrade like or move your

24:56 I think it’s specific to embedded but you can move your embedded capacity I think to a fabric capacity but you can’t go back which would make sense because in one of these cases you move to a fabric capacity so if you went from an A SKU to an F SKU changing over your workspace it will fully block you from going back to the A SKU right it would not let you go backwards

25:27 so Azure A SKUs independent software vendors okay so the thing I don’t understand is can you do embedding with Microsoft fabric SKUs like does an F SKU let you embed content so this is where things start getting very weird to me right where I think yes I have to test this out this has been one thing I’ve been meaning to test is the fabric SKU or Microsoft fabric would you say not so A and EM SKUs do not support Microsoft fabric correct that’s what I understood but well they don’t yeah they don’t

25:58 support fabric but does fabric support embedding that’s right see what I’m saying and also here I think when we talk about capacities we’re talking about capacities at the workspace level in a lot of cases here right so another thing that I was trying to get my head around was Microsoft had this piece of information that said look if you are an F32 which is like the higher end of one of the fabric SKUs if you’re an F32 you don’t get powerbi.com

26:29 like you still need a power bi Pro license to be able to go into powerbi.com to use that F32 which feels a lot like the Azure A SKU to some degree right so Pro users are still required to be there however when you get to the F64 when you move up one level higher which is the equivalent to a P1 SKU you now by default get all of the powerbi.com portal elements right so there’s this really weird thing where even though the F SKU is a consistent SKU number from F2 all the way up to F256 or whatever the

27:01 number is there is a threshold where if you get to a certain level in fabric you no longer need Pro licenses and you can just transition all the way over to free users to consume content from the portal because now you’re officially at the P SKU level there needs to be like a feature list a feature intersection here of all the different SKUs and the purchasing power you have for these different licenses to help you really understand what is included in each of these things I feel like fabric is like they include

27:31 everything but then they do silly things like well you need a pro license to access a workspace that’s on a fabric capacity then it’s not all-inclusive it’s now exclusive so it feels like it’s half-baked or it’s just on a transition path maybe yeah I’m not gonna get into how I think this is all gonna roll but anyway speaking of different capacities

28:02 and SKUs if you’re not familiar there are a whole slew did we paste the one that we’re talking about right now yes I did paste it in there okay so I think it does a pretty good job of at least outlining what sorts of workspaces are supported by the different capacities and then if people aren’t familiar I am happy that at least with the new F SKU they aligned it to the power bi equivalents because people are

28:32 familiar with power bi equivalents if you are working in capacities as admins and applying that to your organization it’s nice to at least have that like oh okay if I wanted to cut over here’s what I would cut over into and I would like to test out the different scenarios of can you just roll into an F64 from a P1 right pretty easily so that’s pretty good the whole capacity units

29:02 thing I’d love to see the white paper on that and what that means but to your point earlier and I don’t see it in any of the other documentation the challenge that I have in my head and if Alex knows about this please if I think about the workloads of a power bi P SKU right it’s specific around end users

29:33 using reports and hitting tabular models like hitting models on the back end compressed memory storage units whatever in which I’m reading a lot of data and I’m doing that as efficiently as possible with fabric I’m now introducing a whole bunch of other stuff yeah I’m introducing loading up data into databases like into tables and cleaning and like keeping

30:04 moving lots of data around and doing data things those types of workloads are significantly different and if you’ve worked in platforms in lake houses or other things you can pick many different versions of machines to streamline your workload onto the appropriate data work that you’re doing is it super memory intensive or is it super CPU intensive or are you doing data sciencey things right

30:39 where like is the F SKU just going to figure that all out for me right like yes there is turn it on or turn it off pause or resume yeah right and I think what I caught and maybe this is getting into the pause and resume thing too is if folks aren’t familiar it certainly sounds like when they say pausing can make content unavailable that just like other environments I need to have a cluster running which

31:10 I need to have a cluster running which is you need you need to keep on different all the time yeah different from people who like may not understand like yeah your SQL database it’s always running like there’s always there’s always an engine that’s that’s on behind it but we get into this false sense of like oh I’m just accessing data in the table it’s not not reality so I think I I wonder if Alex’s yes is to fabric just automatically figures this out which would be pretty amazing simplify simplify the whole experience yes

31:40 whole experience yes yeah that’s my understanding too is fabric is equal to the serverless capacities of Microsoft right it’s a it’s a platform as a service you want to use it you just turn it on and it works I think the Microsoft’s gold sounding yeses for all of our our

31:56 For all of our friends on the podcast platforms: the claim is that Microsoft has figured out how to do this much more efficiently than we could. So all my costs are going to go down when I roll onto Fabric, right? Well, here's the thing: what happens when you want to cost-optimize? I'm going to start throwing out some weirdo scenarios. Step one, I'm looking at this going: okay, what if I want to turn on a Fabric workspace?

32:26 Now that Fabric workspace is able to create into OneLake, which is flat files, and then can I consume those flat files from another workspace? Say, for example, I have a workspace that's all about data engineering, and I'm loading data into that workspace. I turn on my Fabric capacity in the morning and run all my jobs; the pipelines run and the data gets loaded into OneLake. At that point, would I then be able to trigger loads of various data models across my other workspaces?

32:56 So now the other workspaces are accessing Fabric, grabbing the tables, loading their datasets, and basically hydrating all the data, and by 6 or 8 a.m. all my loading is complete, and then I turn off my Fabric SKU. I could literally turn on Fabric for the morning, run all my pipelines for data engineering, and then officially turn Fabric off, because once the models have been loaded I no longer care to have Fabric running.

33:26 Fabric's done its job. I could let the models import their data, and those models can then run all day long without refreshes, serving everything on Pro licenses. So you could stick with Pro licensing and only turn on Fabric when you need to load data. I think there was a presentation recently from Kasper On BI talking about this exact thing; customers are actually looking at this scenario.
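The "run the loads, then pause" math is easy to sketch. Here's a minimal Python illustration of why it matters; the hourly rate below is a placeholder for illustration only, not an actual Fabric price (check the Azure pricing page for real numbers):

```python
def monthly_capacity_cost(hourly_rate: float, hours_per_day: float, days: int = 30) -> float:
    """Rough pay-as-you-go cost for a capacity that runs hours_per_day each day."""
    return hourly_rate * hours_per_day * days

# Hypothetical hourly rate, for illustration only.
RATE = 11.52

always_on = monthly_capacity_cost(RATE, 24)    # capacity never paused
morning_only = monthly_capacity_cost(RATE, 3)  # paused after a 3-hour load window

savings = always_on - morning_only
print(f"always-on: ${always_on:,.2f}, morning-only: ${morning_only:,.2f}, saved: ${savings:,.2f}")
```

Even at a made-up rate, cutting running hours from 24 to 3 is an 87.5% reduction in pay-as-you-go compute, which is the whole argument for pairing the morning pipelines with a pause schedule.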

33:56 What can I build in a Fabric workspace that can be reused or leveraged in other workspaces on different capacities, and how can that continue to cost-optimize what you're trying to build? Does that make sense? I would greatly wait until the documentation is longer than five bullet points, because right now the doc for pausing and resuming is literally how to pause it and how to resume it, without any of the effects it may have.

34:26 I'm going to argue the documentation is never going to answer these types of questions. I think the answers are going to come from the MVP community trying things out, because right now all these new technologies just landed in your lap. There are no patterns, no white paper, no best practices, no guidance. Everyone's trying to figure out the best way to play with these things, so it's like a race: what is the cheapest way to do something, what is the easiest way to generate some ETL that isn't super burdensome on the engineering team?

34:58 Yet still serves all the reporting we want, at a lower cost. I think Microsoft is naive to think everyone's going to turn on an F64 SKU and just let it sit around all the time not being used; it doesn't make sense. People are going to scale that thing up and down based on their needs. I really like the point you made, and it's unfortunate we don't have a white paper yet, especially because I think this dovetails with our previous conversation around workspaces.

35:28 The choice being: does this mean we have a workspace with a capacity that we pause and resume, like Alex is saying, based on an API or some flow, where we do our heavy lifting of all this data? It would be a larger capacity, and I'd probably allow it to scale up and down, if that feature is essentially the same as setting min and max workers when I'm assigning a cluster to certain things.

36:00 Exactly. My problem is: can I set it to not go above a certain threshold? Either way, that's the capacity that would spool up for the major workload that has to get processed within a certain time window: we maximize throughput, everything gets loaded into the tables, done. And then I have a different workspace with a different capacity, in which I could read or interact with that data in an ad hoc fashion.
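Pausing and resuming a capacity programmatically would presumably go through the Azure resource APIs. The sketch below shows the general shape; the `Microsoft.Fabric/capacities` resource path and the `api-version` are assumptions based on preview-era material, so verify both against current Azure documentation before relying on them:

```python
ARM = "https://management.azure.com"
# api-version is a guess from preview-era docs; verify before use.
API_VERSION = "2022-07-01-preview"

def capacity_action_url(sub: str, rg: str, name: str, action: str) -> str:
    """Build the assumed ARM URL for suspending or resuming a Fabric capacity."""
    return (f"{ARM}/subscriptions/{sub}/resourceGroups/{rg}"
            f"/providers/Microsoft.Fabric/capacities/{name}"
            f"/{action}?api-version={API_VERSION}")

def set_capacity_state(sub: str, rg: str, name: str, token: str, running: bool) -> int:
    """POST the suspend/resume action; returns the HTTP status code."""
    import requests  # imported here so the URL helper stays dependency-free
    url = capacity_action_url(sub, rg, name, "resume" if running else "suspend")
    resp = requests.post(url, headers={"Authorization": f"Bearer {token}"})
    return resp.status_code
```

Wired into a scheduler or a Power Automate flow, this is the "turn it on in the morning, turn it off after the loads" pattern discussed above.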

36:30 Or with people needing access to that data throughout the day. Now, the other interesting scenario you point out is: what if I'm not using Direct Lake? What if I'm importing data into a model, and it's only needed on a daily basis? A lot of our analytics, especially with large data volumes, works like that: you're not refreshing data all day long. So could you spool up and then shut down? Sure.

37:00 There's another use case. So how many workspaces am I at already? Exactly; just between the data engineering work alone. This is why I was doing clickbait yesterday with "RIP workspaces": there are going to be hundreds of these things, and now we're talking about design patterns that need three workspaces by themselves to spin up, spool up, and kick off refreshes. You now have the ability to do that flexibly.

37:31 Well, I'm glad we're talking about it, because if I just read through this documentation I'd think: oh okay, great, I can buy a capacity, assign as many workspaces as I want to it, and things are just going to work. But to your point: should they? Maybe if you do have a P SKU you need that, but that should still be reserved for your P-SKU workloads, and if you're going to roll other things onto it, maybe those should go on lower SKUs.

38:02 Or maybe you need a higher-end one that things get consolidated onto. And yes, workspaces are free, but not really. Workspaces are free? They're not free; with Pro licensing it's always 10 bucks a month per user. No, what I'm saying is I wouldn't want to throw all my workspaces doing all these different tasks onto the same capacity, because they don't all need the same capacity.

38:32 And if we know, and we're talking about, these data loads doing different tasks, with all the data engineering things now thrown into workspaces, I think it's important to talk about the segmentation you can do if you need to, as opposed to one capacity with as many workspaces as you want thrown in. So what I'm feeling right now is: think about what Fabric is versus Pro or Premium users inside Power BI. We've had this Power BI ecosystem, and as you say this it's becoming clearer in my head.

39:03 In the Pro and Premium layer of licensing for Power BI, we deal with datasets, reports, and sharing. That's the primary function of what powerbi.com does. What we've just added is two or three new personas: the data engineer and the data scientist. These are purely data personas, and they're different workloads that don't belong in the Power BI ecosystem as we've been using and building it for the last nine years, since 2015 or so.

39:33 So now I'm looking at this and saying: okay, I can still let Power BI serve those users, but I'm not going to go change all my workspaces to Fabric, because Fabric is addressing a specific workload that's only required for those new personas. And when we did the whole diversion around the different personas in Fabric and how they'd be used, they're all wrong. I don't think Microsoft is really acknowledging what people are actually building.

40:03 So I feel like there are going to be a handful of Fabric workspaces, and it's going to be up to people to figure out the combination: how do I load data in those data engineering workspaces, and what does that do to serve the broader organization? Look at Matthew Roche's pyramid, from personal reporting at the bottom all the way up to enterprise reporting, which I think is a great document; I refer to it almost every day now.

40:34 At the lower end of that pyramid, you could have a single user or team do their own data engineering, build their own data science, their own reports, their own datasets, build everything on their own; they don't need anyone's help. But as you move farther up the organization toward enterprise-level reporting, you don't want everyone building random things on top of datasets. You want a more curated, regulated process around governing that information. So in that case you don't want to give Fabric to everyone.

41:05 Because then they'll be building things they shouldn't, in an environment that's not designed for them. At that enterprise level it's: I want to give you the dataset, and here's what you get. So I think it changes based on the level of governance you need. I think we should write a white paper. It'll literally be a bit of paper that's all white: 100 pages, the cover will say "Fabric governance and Power BI governance," and every other page will just be white. It's literally a white paper.

41:36 Yeah, and at the end it says: good luck. I don't know if anyone's used the utilization app, but I think that's part of the cost picture too, and it's going to be a big thing. What do you mean, utilization? So there's the Microsoft Fabric Capacity Metrics app, and I don't know if it's also going to include cost, because I think that's going to be a huge part. I'm still lost; catch me up, what are you talking about?

42:07 I'm talking about when this launches, or once our trial ends: actually seeing which elements are costing the most. Utilization. Okay, you jumped about three steps ahead of where we were; we were talking about something else. There is now a Power BI app in the app store called the Microsoft Fabric Capacity Metrics app. That's what you're referring to.

42:35 Okay, I'm with you now; I was thinking about something totally different. So, have you installed that app yet? Not yet, but I just saw the documentation; we'll make sure to send it into the chat as well. So the Microsoft Fabric Capacity Metrics app is now available. That means you now have the old Premium capacity metrics app, the newer Premium capacity Power BI app...

43:07 ...and now also a Microsoft Fabric capacity app. I'll install it and we'll see how it goes. So this one tracks the capacity usage of Fabric, right? Because the biggest part here is trying to understand not just how much we're spending, but which elements: what's in certain jobs, is it this notebook? This has been a challenge I've had from day one.

43:37 Seth, we've had the same challenge: if you have a single Spark cluster in your Fabric workspace and someone writes a very nasty query to pull a bunch of data or do big data processing, does that kill the capacity for everything else in that workspace? What gets throttled, who's affected? Does my dataset slow down because someone's running a really nasty query? I don't have those details yet, and that's why, when Fabric got introduced, I felt it was going to be a nightmare for admins.
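Whatever telemetry the Metrics app ultimately exposes, the admin question is the same: which item is burning the CUs? Here's a toy Python sketch of that rollup; the record shape is invented for illustration and is not the Metrics app's actual schema:

```python
from collections import defaultdict

# Hypothetical telemetry records: (item_name, item_kind, capacity_units)
usage = [
    ("sales_model",    "dataset",   120.0),
    ("nightly_load",   "pipeline",  340.0),
    ("adhoc_notebook", "notebook", 2150.0),  # the "nasty query" culprit
    ("sales_model",    "dataset",    95.0),
]

def cu_by_item(records):
    """Total CU consumption per item, sorted largest first."""
    totals = defaultdict(float)
    for name, _kind, cu in records:
        totals[name] += cu
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

for name, cu in cu_by_item(usage):
    print(f"{name:15s} {cu:8.1f} CU")
```

A weekly pass over a report like this is usually enough to spot the noisy neighbor before it throttles everyone else on the capacity.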

44:07 Admins are going to have a really hard time figuring out how much of this capacity is being used for what. I like this point, Tommy, I think this is really good. Before organizations jump in head first with "we're just going to keep everything on and let everyone play around," they need to know not just that the money is being spent, but what is costing the money and what the optimizations behind it are.

44:39 And what's challenging to me, going back to what I said: today, all of a P1 capacity goes to reports and report usage, all the interaction, and now we're talking about opening the doors to all of the data and data manipulation on the same capacity. That's where I'm like: guys, there's no way that works seamlessly.

45:12 So what is the recommendation? Say I'm a business unit. Just previously we were talking about engineering, the BI teams, and how we might structure different workspaces for data tasks, for ML workloads, on different capacities, etc., and then report workspaces. So now I have three different kinds of workspaces that maybe we run at an enterprise level, but what's the recommendation for the business? We've opened up a whole new world of data and data manipulation.

45:44 And yes, it would be fantastic if we could all work within the same workspace, but that workspace has a single capacity. So what am I supposed to do? What is the recommendation? "Okay, business unit, just keep where you're at in this workspace; we'll create another set of workspaces for you to do your data engineering in." Which is fine, but is that really the recommendation?

46:15 Because this isn't the same workload, and you don't simply slough it off like "oh, I can easily go from a P1 to an F64," because now I have to worry about all these other things that will start eating my capacity. Okay, I will say this: the Fabric Capacity Metrics app is now working; you can go install it and see in your Microsoft Fabric trial what is using storage in gigabytes.

46:46 It shows performance data, how long queries are running, and durations of things as well. I'm going to dig more into this and figure out what the heck this app is doing. It's a great app, too; it shows you all the certified datasets, and it brings in some of the Purview features, which I'm interested in understanding, including whether that's only in the preview. But the visibility is great, or at least a great first step. To your point earlier about the tweaking:

47:17 Maybe we should start building the white paper. As you implement these things, if you see you're going off the rails on capacity, do you now have to create a different workspace and migrate all of your objects into it? How does that work, exactly? What if everything I built is in one workspace and I now need to split things out? How easy is that experience?

47:47 I also wonder how it works if you create a dataset off of something in Fabric and that dataset lives in another workspace on a normal Power BI Premium SKU. Is it relying, in a sense, on something in Fabric that breaks if you were to turn Fabric off? And from a cost point of view, I don't know. Seth, you made a really good point: before, Premium was about consumption, but now we're really dealing with the engineering side of what's going on.

48:17 If I were to create a dataset off of something shared from Fabric, what usage does that eat? I think you're speaking to one of my main questions here, Tommy, and I'll be honest, I need to do more testing so I can get my head around it. OneLake is this magical thing that is basically blob storage accounts, right? If I'm able to use something like Databricks and write data into OneLake, and then use it in my Power BI datasets...

48:47 ...great, because that's an external engineering source I'm pushing into my OneLake that multiple business teams could potentially use. So now that we have this concept of OneLake, what other licensing would allow me to attach to it? Again, it's just a storage account, right? Can I connect to it from these other things? Will other parts of Power BI Pro, or Premium Per User workspaces, let me get into that OneLake...

49:19 ...without having the Fabric-level subscription? Is the Fabric subscription going to block my access into that OneLake? To me, that's the linchpin on a lot of this. Is OneLake actually paid for separately from Fabric? My understanding is that OneLake is a blob storage account paid for separately from your Fabric subscription, so it's an Azure-based price.

49:49 It comes alongside Fabric; it's not included in the price of Fabric. You'd start to pay for storage. But if I'm paying for storage separately, then it shouldn't matter where I load the data from; it should all just work at that point, right? If I'm paying for the storage account independently of Fabric, then the Fabric paywall shouldn't limit me to only playing with that lake through Fabric. I should be able to create a bunch of artifacts inside that OneLake and then load any of those artifacts anywhere: Pro, Premium, Premium Per User.

50:19 My workspace, all of them, should work and be able to connect to the blob storage, because I'm already paying for it. I would agree with that. Does that make sense? It's logically making sense; that doesn't mean Microsoft's pricing and purchasing department will agree with me, but in my mental model it makes sense. Man, there are just so many scenarios you're talking about now. Alex pipes in here with some context, so thank you, Alex: it's not an actual storage account.

50:50 It seems to be a new storage service that also uses the ADLS APIs and billing. Okay, well, then it still acts like ADLS. What that's saying is it's potentially locked behind the Fabric service; there's a thin wrapper on top, but it gives you all of that. But if I can access it, if I can write data down, I've got to figure this out.

51:22 Because they've said in their documentation and presentations: hey, if you use Databricks, you can write from Databricks down into the lake; or: I created a lakehouse and I can modify it with the SQL warehouse. That's why it's all the same object underneath, and why it was so confusing at the beginning of the conversation: why do I have different data sources, why don't I just have one? Either way, interesting things are developing.
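If OneLake really does expose the ADLS Gen2 APIs, then addressing a file should look like addressing any ADLS path. The endpoint and path layout below are assumptions pieced together from preview-era discussion, not confirmed documentation, so treat this as a sketch:

```python
# Assumed OneLake DFS endpoint and path layout (verify against current docs):
#   https://onelake.dfs.fabric.microsoft.com/<workspace>/<lakehouse>.Lakehouse/Files/<path>
ONELAKE_HOST = "https://onelake.dfs.fabric.microsoft.com"

def onelake_url(workspace: str, lakehouse: str, relative_path: str) -> str:
    """Build an ADLS-Gen2-style URL for a file under a lakehouse's Files area."""
    return f"{ONELAKE_HOST}/{workspace}/{lakehouse}.Lakehouse/Files/{relative_path}"

print(onelake_url("SalesWorkspace", "Sales", "raw/orders.csv"))
```

In principle, any ADLS-capable client (the `azure-storage-file-datalake` SDK, Databricks, and so on) pointed at such a URL and authenticated with AAD could read or write the file, which is exactly the licensing question raised above.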

51:53 Still developing. I still don't feel like... didn't Reza put out an article? Right, RADACAD. I haven't read this one, but it just came across my radar from the RADACAD blog; I think they did a deep dive, "Fabric Licensing: The Ultimate Guide." I'll put that one in the chat as well.

52:24 I think it gives some context on the licensing pieces, and it goes through a lot of the Microsoft documentation as well. Oh yeah, Dan, thanks for posting that one here; that's where I saw it, from Dan. So there's an ultimate guide from RADACAD covering the different structure. There's pricing on the page: Reza has gone through and added hourly and monthly pricing for all the F capacities, and then it goes into a bit more depth about how that looks inside Azure.

52:58 It also does a better job explaining the thresholds for sharing content: F2 through F32 don't include free sharing, and Power BI Premium benefits kick in at the F64 level. And it talks very briefly about pausing and restarting. I still don't feel like I have a good understanding of patterns of moving data yet.

53:14 That's really what I'm most interested in at this point: how can I move data around between the different SKUs or the different Power BI licensed environments? What I find interesting is: if the preview is going to end in July, there certainly are a lot of features that still need to get implemented. What do you mean? OneSecurity. Oh, OneSecurity, the security layers on top of things.

53:45 The other thing is that there are certain use cases this works for. If you're talking about analytics internally inside an organization, it's going to fit that need much more than if you're using Power BI as an embedded solution in applications. There are just so many gaps right now: if an application is using row-level security in different ways and you're not using AAD, there's no auth path for that. All of that is nonexistent with the OneLake direct connection.

54:15 So there are a lot of gaps as it relates to supporting all analytics solutions. OneSecurity, I think, is a very encouraging feature based on what was described, because think about these Delta tables.

54:45 Delta tables don't technically have row-level security, but you can partition them, so you can segment different sections of data out to different customers. Imagine I have a sales table for all my customers: if there's a customer ID column in it, I can partition the data such that each customer's rows live in their own partition. Now, Databricks has this figured out already; there's a layer inside Databricks providing what I'd call row-level security.

55:15 The row-level security in Databricks gives you the ability to slice and dice, and Delta Sharing allows you to share portions of a Delta table, which is what I think OneSecurity should be doing. We'll have to see how this plays out and how Microsoft implements it, but I'd hope they're looking at what Databricks is doing and mimicking it in a similar fashion, because today Direct Lake does not support row-level security.

55:45 And I think that's going to be a major blocker for a lot of organizations rolling onto it, based on your feedback. Yeah, one of the biggest things I was excited about, and still am, is the potential of Direct Lake connecting directly to Delta tables, because that removes the whole processing layer. I really like the feature, but if it's not usable in my use case, it's not usable to me.

56:15 And if you're not supporting row-level security at this point... I know it's coming, and great, I look forward to that roadmap, but if it's not here now, that's the challenge with the preview. If you're going to end your preview in July without some of these core features, you're not giving us enough time to understand whether it will work for our production scenarios. I agree with that. Oh, I see what you're saying: there are missing features that need to be evaluated before you can really light up Fabric workspaces in a production environment.

56:46 Yeah. But think about how you do any big transition: I have a production workload, this is its performance, this is how it's working. If I'm going to make a huge change of platform or of the way I do things, my POC has to be a direct representation of what I have in production, so that I understand cost and I understand how it performs.

57:17 And I can do an apples-to-apples comparison as much as possible, to sell the rest of the business on "this is worthwhile for us to do." Without that, I'm going to eat that cost later, because I'll be out of preview by the time you give me the feature, and only then can I do the apples-to-apples comparison. Yes; at that point you'll have to spin up and pay for it, go buy something from Azure, try it for a weekend, and see whether it meets your needs.

57:47 All right, I think we're at time; we've burned through a perfectly good hour of your day. I hope this was a good walk or run or whatever you do while listening to the podcast. Go, runner; go, biker; if you've made it this far, keep going, you're almost there. I've heard a lot of people say they listen to us while running or on a long walk, so for those of you exercising: congratulations, keep up the good work. That being said...

58:17 Our only request... oh, actually, I forgot: I did some ChatGPT stuff over here. I asked ChatGPT and Bing some questions. One thing I asked ChatGPT was: how do I know which Microsoft Fabric license to purchase for my enterprise? It gave me some gobbledygook about getting a Microsoft Enterprise Agreement, a subscription to Microsoft products and services; a bunch of stuff that was not right. Then it said, to determine...

58:47 ...if a suitable licensing program is relevant for your organization, and gave me five bullet points that were interesting. First: assess your organization's size and needs, evaluate your number of users. I'd agree with that. Second: engage with a Microsoft representative or partner. My guidance here is to start with the partners before you go to your Microsoft representative, because every Microsoft representative I've ever talked to about Premium in Power BI...

59:17 ...says: yeah, you need a P1, that's what you've got to buy. Every single one. And I'm like: dude, no, you don't. Every Microsoft rep I've talked to pushes the most expensive SKU, so I would not recommend that route. Start with a Microsoft partner, because I think they'll give you a better picture of what your business needs and help you buy the appropriate licensing. Then it said: evaluate your licensing costs, yes.

59:47 Consider future growth and scalability: anticipate your organization's growth and determine whether the chosen licensing program can accommodate future expansion. I thought that was pretty relevant. And finally: review your licensing terms and agreements. Good luck finding any of those for Power BI Premium; they're nearly nonexistent, and it's very difficult to nail Microsoft down on whether you can use Power BI A SKUs for embedding.

60:19 that sounds good there I I’m fairly happy with with the comments that came out from Microsoft Bing gave me some other gobbledygook here just talking about it bing was a little bit more current it did talk about the two different licensing models the Azure and the Microsoft 365 licensing models but it didn’t give me any additional features other than what’s in the articles on the website today so not a lot of help to I I do want to make a point of clarification here right like this is this is our conversations around fabric are and we’re having a lot of them

60:49 are happening a lot because Power BI is part of that ecosystem now, and all of this is going to affect us at some point. A lot of the challenge, and the challenging conversation, is because this is in preview. Somebody made a comment that organizations that are government or have big privacy requirements are going to have to wait a while. That's the way Microsoft runs: get it out early, get it into people's hands so we can start playing with it. So that's a good point, Alan. The challenges that we have with the tool right now are not

61:19 going to be the same challenges that we have in six months, and they're certainly not going to be the same in a year. But our motivation is to get engaged with this tool as quickly as possible, and these are some of the things that are part of our conversation all the time. Some of them will be great, some we won't like, and some we'll love. It's the same thing we did with Power BI: when it first came out, it was "I can't believe I can't do this thing," and then it's "Woohoo, game changer." We're going to go through the same

61:49 rigor of ups and downs. This is the conversation that we have, and I think it's valuable, because there are a lot of folks that want to kick the tires, see how this works, and figure out whether it's part of their ecosystem. Joining us along for this ride is going to do nothing but strengthen your understanding of the tool sets. And at least it plants that kernel where you can look for the optimization, or that opportunity where you can potentially start to do the comparisons with your

62:19 existing tool sets. And I think where we're at now is not the final form of the tool; the conversation is going to be completely different, and we're going to have fewer problems a year from now. I would agree with that statement, and I feel a lot of the same way. It feels a lot like Dataflows: initially, when they released it, it was very rough, but they made good improvements, and a year later it was much more solid and much more reliable. I feel like this is another similar one: it feels a bit

62:50 rough now, but it's going to continue to smooth out as they continue adding features. All right, now I'm going to end the podcast. Thank you all very much for your time; we appreciate you. Our only request of anyone listening: please share the podcast with somebody else. If you found this conversation about Power BI licensing confusing and thoroughly frustrating, well, we did too, so share it with somebody else. I'd love to have more community conversation around what this looks like and what organizations are doing. We really appreciate the chat for jumping in and

63:20 giving your challenges and opinions around this as well; it was super amazing. Thank you all very much. And Tommy, where else can you find the podcast? Anywhere podcasts are available: Apple, Spotify. Make sure to leave a rating; it really helps us out a ton. Join us live every Tuesday and Thursday, 7:30 a.m. Central. And thank you guys for putting in some mailbags; we saw a few come in. If you want to provide a topic for us to talk about, maybe a little more Fabric, you can do so at

63:50 powerbi.tips/empodcast. Wonderful. Thank you very much, and we'll see you next time.

Thank You

Want to catch us live? Join every Tuesday and Thursday at 7:30 AM Central on YouTube and LinkedIn.

Got a question? Head to powerbi.tips/empodcast and submit your topic ideas.

Listen on Spotify, Apple Podcasts, or wherever you get your podcasts.
