100 Dollars for Fabric – Ep. 243
Fabric is exciting… and occasionally infuriating in the exact ways ops teams care about: Where did my capacity go? Why can’t I trace lineage all the way back to the source? Why is this experience different from the other Power Query experience?
In Episode 243, Mike, Tommy, and Seth do a simple exercise: pretend you’re funding Fabric’s next improvements with a $100 bill. What gets prioritized first? The picks gravitate toward the unglamorous stuff that unlocks adoption—visibility, consistency, and governance.
News & Announcements
- Power BI vs GPT4 Code interpreter - Amazingly Quick Analysis — A quick demo of AI-assisted analysis workflows—good fuel for thinking about how Fabric + Copilot-style tooling could evolve.
- Submit a topic idea — Send a topic you want covered (or a problem you want the team to argue about).
- Subscribe to the Explicit Measures podcast — Follow the show and browse the complete episode archive.
- Tips+ Theme Generator — Generate Power BI report themes fast—especially handy when standardizing across teams.
- Mike Carlo on LinkedIn — Episode updates, demos, and Fabric/Power BI tips.
- Seth Bauer on LinkedIn — Engineering-focused takes on building analytics systems.
- Tommy Puglia on LinkedIn — Practical leadership and delivery lessons for BI teams.
Main Discussion
The ‘$100’ framing is a proxy for a bigger question: what are the friction points that slow real-world Fabric adoption? The discussion repeatedly comes back to three themes—traceability, cost/usage transparency, and a coherent developer experience.
Key takeaways:
- Lineage has to connect the whole chain (source systems → ingestion → transformations → lakehouse/warehouse → semantic model → reports), not just show isolated Fabric artifacts.
- Cost and capacity tooling should be artifact-level, so you can answer ‘what did that Dataflow/refresh/job cost?’ without reverse-engineering IDs in a template report.
- Congestion-aware scheduling is a missing superpower: if a capacity is overloaded at 7 AM, the platform should make it obvious—and help you shift workloads intelligently.
- Admin ergonomics matter (bulk actions, sane defaults, governance controls) because small friction becomes chaos at tenant scale.
- Power Query / Dataflow consistency is still fragmented—teams feel the differences between Desktop PQ, Dataflows Gen1, and Dataflows Gen2 every day.
- Cross-tenant collaboration needs love: guest user behavior and multi-tenant development edge cases can derail consulting and partner workflows.
- Region support is adoption support: if Fabric isn’t where your production data lives, you’re stuck designing around the platform instead of with it.
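The "congestion-aware scheduling" takeaway is, at heart, a deadline-packing problem: the hosts describe it as "data needs to be done by 6 AM, it can't start any earlier than midnight—let Power BI figure it out." As a thought experiment (this is not how Fabric schedules refreshes today; the job names, slot counts, and API here are invented for illustration), a minimal earliest-deadline-first sketch looks like:

```python
from dataclasses import dataclass

@dataclass
class RefreshJob:
    name: str
    earliest: int   # earliest allowed start, minutes after midnight
    deadline: int   # must finish by, minutes after midnight
    duration: int   # expected run time in minutes

def schedule(jobs: list[RefreshJob], slots: int) -> dict[str, int]:
    """Greedy earliest-deadline-first packing onto `slots` concurrent
    refresh slots. Returns {job name: start minute}; raises if any
    deadline cannot be met with this ordering."""
    free_at = [0] * slots  # minute at which each slot next becomes free
    plan: dict[str, int] = {}
    for job in sorted(jobs, key=lambda j: j.deadline):
        i = min(range(slots), key=lambda k: free_at[k])  # least-loaded slot
        start = max(job.earliest, free_at[i])
        if start + job.duration > job.deadline:
            raise RuntimeError(f"{job.name} cannot meet its deadline")
        plan[job.name] = start
        free_at[i] = start + job.duration
    return plan

# "Data needs to be done by 6 AM, can't start earlier than midnight."
jobs = [
    RefreshJob("sales",   earliest=0,   deadline=360, duration=90),
    RefreshJob("finance", earliest=0,   deadline=360, duration=120),
    RefreshJob("hr",      earliest=120, deadline=480, duration=60),
]
plan = schedule(jobs, slots=1)
```

Even with a single refresh slot, all three jobs land before their deadlines without anyone hand-tuning refresh times—which is exactly the ergonomic the episode is asking the platform to own, presumably with real capacity telemetry instead of fixed durations.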
Looking Forward
Do a quick ‘Fabric ops readiness’ review: map lineage gaps, list your top three cost/visibility questions, and decide who owns capacity governance before your first serious workloads land.
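Mapping lineage gaps can start as nothing fancier than an edge list you maintain by hand (or scrape from inventory/scanner tooling). This toy sketch—the artifact names and the inventory are hypothetical—flags every artifact whose upstream chain never reaches a declared source system, i.e. exactly the "dataflow to lakehouse with no connection" situation described in the episode:

```python
def lineage_gaps(edges: dict[str, list[str]], sources: set[str]) -> set[str]:
    """Return artifacts whose upstream chain never reaches a known source.
    `edges` maps artifact -> list of its upstream dependencies."""
    memo: dict[str, bool] = {}

    def reaches_source(node: str) -> bool:
        if node in sources:
            return True
        if node in memo:
            return memo[node]
        memo[node] = False  # guard against cycles
        memo[node] = any(reaches_source(up) for up in edges.get(node, []))
        return memo[node]

    return {artifact for artifact in edges if not reaches_source(artifact)}

# Hypothetical inventory: the report traces back to SQL via the dataflow,
# but the lakehouse's lineage dead-ends—that's the gap to investigate.
edges = {
    "report":         ["semantic_model"],
    "semantic_model": ["lakehouse", "dataflow"],
    "dataflow":       ["sql_prod"],  # reaches a declared source
    "lakehouse":      [],            # no recorded upstream -> a gap
}
gaps = lineage_gaps(edges, sources={"sql_prod"})
```

Running this over a real inventory gives you a punch list of artifacts to chase down before you trust any impact analysis built on top of the lineage view.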
Episode Transcript
0:00 [Music] Good morning, and welcome back to the Explicit Measures podcast, with the one, the only, Tommy, Mr. Mad Biker Man, Seth
0:33 Bauer, and Mike Carlo. Ooh, the Mad Biker Man. Yeah—we talked on our previous episode, I believe, about how Tommy just got his bike back. It's like Stella Got Her Groove Back, but Tommy got his bike back. So Tommy has a bike, he's now pedaling like mad these days, and we're back on track. Now, I didn't want to say anything—
1:03 You've got to figure out what's going on here. This is a pre-recorded episode, so it's a little harder to figure out what day it is that we're actually recording on, which makes it a bit more challenging. Oh yeah, let's see Seth pull his weight. Tuesdays—it's a hot take episode. I start the episode: is it Tuesday? Yes, it's Tuesday. Sure, let's go, Tuesday. For all of you that are just joining us a couple minutes late: excellent. Today we're going to—we've done some of these in the past—we're going to jump right into our main topic for today.
1:34 In the past we've done some of these: if you have a hundred dollars, what do you invest in? Where do you put your time, effort, and/or money? As a program manager or a lead inside Microsoft, what do you do to improve Power BI, or something like that? In the past we've done it as: you're given a hundred dollars, you distribute that money however you want, and that hundred dollars then gets distributed to build a feature, or improve Desktop, or add base features to Desktop—whatever the
2:05 things you want are. So today we're going to take that same approach: we're going to get $100, but we're going to spend it on things inside Fabric. Where would we spend our development time and/or money inside Fabric? What would we improve? How would we refine the program, or the software as it stands today? Anything else you'd want to add, Tommy, for the setup? That's the setup for today. Yeah, I think we've done these before—we're going to throw a hundred dollars: where would
2:35 we invest if we were running Fabric? Where would we put the money and time for people to work on things? So, pretty open-ended. I'm always confused about how we do this one, or what the best way is: do we do one person at a time and you give us all of your numbers at once, or do we do one item each time and discuss each one as we go? I think we do one at a time. Okay, let's do one at a time. Can we go in order of largest dollar amount to smallest—is that what you want to do? Let's go from smallest to biggest, right—the smallest investments and then
3:05 up to the largest through the end of the episode. Sure, that way maybe we still have one person left by the end of it. We could do that—we could do anything we want to. That's true, it's our podcast, we can do whatever we want. All right, Tommy, kick us off. Give me your lowest dollar amount investment first, and we'll work our way up. So my first one is going to be: ten dollars put in the
3:35 bank for better lineage. The lineage feature is not bad, but as Mike and I have seen, if you're going from a dataflow to a lakehouse, there's, for whatever reason, no connection. Yeah, there are some discrepancies. I'd like a better UI, but I need those connections—not just between what's in my Fabric artifacts, but also that connection from, let's say, a dataflow to a SQL database to the lakehouse to the SQL server and then everything else. Right
4:06 now it's such a huge and important feature, especially with so many components being created—you need a better way to look at it. I like that idea, it's a good one. That's actually on my list as well, so I'll dovetail on that one—I have some other ideas and thoughts around it, but I'll add that one too. My cheapest one, for ten dollars, is: give me dark mode. Oh, man. There is too much white going on. It just blinds me. It's not—
4:38 it's very washed out right now. Yeah, that's true. Get that functionality out the door as a preview implementation—just give me some dark mode. Everything else I work in is dark; let me stay in the dark. Oh, there it is. Well, Seth does work in a basement. No, you used to work in a basement; you no longer do. I work in a basement now. Exactly—I'm the one who works in the basement, so if anything, dark mode would be great for me, because I could actually work here in the dark for once.
5:09 Awesome. So I'll start off here with what I think is my cheapest feature. This is maybe a bit too niche at this point, but I'm going to throw five dollars at making sure they synchronize more of the features between Power Query in Dataflows Gen2 and Gen1. One of the features that I'm really looking forward to is incremental refresh—I think it was announced on the roadmap that it might be coming, but
5:40 incremental refresh for Dataflows Gen2 I think would be really good. I don't know how they'd actually synchronize this effort. Let me say it this way: Desktop Power Query is different from Fabric Power Query Gen2, which is different from Dataflows Gen1. So there are three flavors of Power Query that I can see, and I would love the entire Power Query engine to be more synchronized and have the same
6:11 experience between what's happening in Desktop, what's happening in Gen2, and what's happening in Dataflows Gen1. I don't know how they synchronize all that work together—I like where they're going with things, but I need some more help there; I want to simplify that experience a bit more. So you basically want some version control to say, hey, I'm in Dataflows Gen1, but it's version nine? Not really—the UI looks different in all three of them; there are little differences in each one. Desktop looks very old; it's not as polished as what
6:43 you see inside the service. In Desktop you don't have the blocks—the block diagrams of, okay, do this transformation, then this one, then this one. It's just different, and I want it all to be the same bits, the same code under the hood, because I want it to be a similar experience across all Power Query experiences. So, just syncing up the UI? Yeah—or making sure that if I have a function or feature that's in one of them, it's in all of them. Right, and if I have a connector in Desktop that's not in the service, or
7:14 a connector in the service that's not in Desktop, they should all be there. I want all of it, all the things. How much do you put into that? Five dollars. Okay—it might take a while; we'll put the intern on that one. I do have a few on dataflows, but I am definitely saving those. Okay, so I have four more. I'm going to quickly go through the next one. It's the same ten dollars as the lineage one: it's cost management. Right now we have the template
7:45 file that they create for monitoring your costs and activity. Rather than going into a Power BI report, I want to be able to look at one of my artifacts—like a dataflow run—and much more easily see what the cost of that particular run was, and not based on a Fabric ID or some convoluted lookup in that template report, which I don't even own, because it's a live connection, so I can't really even own that monitoring data. I want to know: hey,
8:16 this lakehouse, because of these runs and these tables, this is where the usage is and this is where the cost is going. Oh, interesting. Well, I want to pick on you there a bit. 'Where the cost is going'—are you just talking about where the consumption is coming from? Because Power BI is a bit different from what you do in Azure: it's not really consumption-based; you buy a capacity and you get what you get. So make a metric out of this. No—it's looking at the
8:46 consumption, just like the Premium capacity metrics. Yeah—something that will show me, okay, not just the volume, but what percentage of my capacity I'm taking up. I feel like that's a really good metric, and I feel like you'd also want something that says: here's my congestion time, here's when these refreshes are occurring, this time in the morning it's congested, you may want to not do that—or have it be smart enough to figure that out. There's AI all over the place—why can't
9:17 AI figure this out? Hey, look, I've got a thousand datasets refreshing on a capacity. Have them all done—set the delivery time. What time does that dataset need to be refreshed by?
9:27 Let Power BI figure it out. Like, hey: data needs to be done by 6 AM, and it can't start any earlier than midnight. Okay, Power BI—it goes into a queue with all these other things that need to be refreshed overnight. Figure it out; I don't care what you do. I may put a little more than ten dollars into that, with the AI. That's a great idea. That's an intern thing, yeah—we're trying to do that one. Just give me the basics, man. I want more than basics, because I'm going to dovetail, and I would throw $20
9:58 at the same thing—well, administration, and processing of the different compute types and the costs associated with those, right? Organizations spend so much time trying to figure out Azure costs: tagging clusters, figuring things out, aligning—hey, where are all these things spooling up?—and now you have an environment where we're mixing everything, with nothing tagging it together, onto the same Fabric capacity. So it's really important for us
10:29 to understand: what's ETL? What's storage—even though that's not much? What consumption is happening because of report usage or report refresh, or whatever the case may be? And the fact of the matter is, you really should have better usage and projection modeling: hey, are you optimizing things? Is your usage appropriate? There are so many things they could do with that ecosystem to
10:59 make it easier to understand costs, and I think it would benefit them as it relates to people being able to project out and understand: hey, man, we want to split up this service, we know it's going to cost X, here's how we plan that through, etc. Whereas on the flip side, because it's so hard to navigate through these spaces, finance departments are still like: what do you mean you don't understand? What do you mean you can't give us an estimate? Well, let me go try to set up
11:29 this environment—and, admittedly, somebody grabs this, a tag gets missed, and now we have to figure out where this went. It's a nightmare. 20 bucks, for sure. The other thing I would like to add here is something related to billing per BU, right? Because all of this is workspace-driven. Oh, I like that. If I have to share a capacity—billing per domain. Yes, per domain, or whatever the domain grouping is: group them all together. Not only how do I understand my usage, but, hell, give me an automated way
12:00 to say: I now have these two businesses sharing a capacity; whatever each one's usage of this capacity is, as a percentage, that is how you bill them. Yep, makes total sense—really good, I like that one. So I think I underspent on this one; my enterprise customers would not be happy with the five dollars I'm allocating towards the admins. I literally called it 'admin stuff,' and I wasn't even thinking in your guys' direction. I like the idea of
12:30 billing per workspace or per domain, so I can back-charge other groups. I was thinking more around admin and administration things—just better integration. We have APIs that give you data today, but it's just an API that dumps information out at you. You have to do a lot of work on your side to get it all together, and even when you get all the data, nothing matches across the parts of the system that is Power BI. There are all kinds of weird new activity events,
13:00 there are new properties—it's very inconsistent what information you get from it. So when you try to make a consistent dataset from all the data that comes out of a single API, it's all over the board: you've got to merge columns, you've got to make up new things that stitch multiple things together. Data that was captured in one column is not captured in a different column; there are two columns for the same thing with different names. It's weird stuff. I get it—there are a lot of different teams working on Power BI and they're all trying to send data somewhere, so
13:31 there's probably some consistency issue on their side. But as I look at this, man, I would like them to spend more time making that a consistent experience and making it easier for me, an administrator of Power BI, to get that data together. So that's why I was thinking about admin stuff. So is it really just the UI—grouping things a little better—or are these actual features? Because I'm talking about the API, the way it is today, as a grab bag of information.
14:01 Yeah. What I would really like here, from the admin stuff I'm talking about, is to be able to say: look, I'm going to turn on a knob or two inside Power BI and say, here's my storage account—just put the data there. I don't want to build dataflows or pipelines; I don't want to build anything. I just want to say: hey, this is where I care about putting my data—whether it's logging data from the actual workspace, or the Analysis Services side, because you can turn on the Azure audit logs, or whatever
14:32 that is—the analytics logs. You can turn those on. Hey, just put it here. I could have a knob that I just turn on, boom, it starts sending data, and it just arrives where I want it. It shows up right there, and I'll figure it out from there—I'll pick it up and build what I want to build on top of it. And probably easier access to your server settings, your actual endpoints, all that stuff. Yeah, it needs to be more of a consolidated experience. In my opinion it does a good job, but in larger organizations I think it gets unwieldy to manage all the different
15:03 things you have going on, and it just makes it harder. So I'd like a little more simplicity around the administration of data collection, as well as an easier way to pull more data together in one spot. And I think to some degree they've started that, right? What I'm thinking of, off the top of my head, is workspace administration: as a global admin you can see all of the workspaces and, finally, select them all and add yourself as an
15:35 admin, right—to understand: are other people using this? What's going on? What problems are we having? Etc. Or: all these people don't realize we have a capacity that I can assign this to, and they're assigning Pro licenses to people when they shouldn't. It could be that too, yeah. I should be able to control it a bit more as an admin. Right—I really do want to not have people use personalized My Workspaces. I don't want My Workspace to be there, because we tell people not to share content from it, but they still do it anyway. So even
16:06 though we have a policy—I think that just gets pulled up by default, though. It does. The only way I'm aware of to disable it is to hack it a little bit: you have to turn on a capacity assignment inside the admin portal saying every new user who gets added as a Pro user gets automatically added to a Premium workspace, and then you have to disable or pause the capacity. Oh wait—so you have to turn on a Premium
16:36 capacity and attach it so that, by default, every new user created is automatically attached to that capacity, and then you pause the capacity, meaning it basically shuts it off for everyone. It's a little bit backwards, but I feel like as an admin—especially for new organizations, ones that are just starting out—that's a feature I would want to hide and get rid of right away, day one. I want it to not be there. That sounds like somebody thought outside the box creatively. And just for five dollars—just for five dollars, that solution was free for you right there.
17:06 You don't have to worry about that one—that was a zero-dollar investment. All right, Tommy, we'll go back over to you. I think you're at 30 so far—you did lineage and cost for... no, sorry, 10 and 10, so we're at 20 right now. We're going to up it to 25, and this is the UI in Fabric. So you're spending 25 on this one feature? Yes. Whoa. Yeah—about a quarter of
17:36 a hundred. Big spender. Unless I'm mistaken—okay. So I'm spending 25 on the UI in Microsoft Fabric: looking at workspaces, and the organization of the amount of artifacts getting created there. I know—you want a folder. You want 25 for a folder? Or my own custom views. I want my things organized, if I want to go from the lakehouse down to all the other artifacts. Right now, Mike, if you
18:07 look at our training workspace—oh, it's a mess. It's already a mess. And there are random things automatically being spilled into it—this whatever dataflows staging lakehouse thing that just gets automatically created, that's supposed to be hidden but isn't really hidden. It's actually there, and I'm like, what is that doing there? I don't really know why that's showing to me. Yeah, I'm with you, Tommy—creating a lakehouse should not require me to have three objects automatically created at once, and then there's the amount of objects you need to create if you want to keep doing anything. So, let's see: I have a
18:38 workspace here, I have three lakehouses, and I have one from doing a demo, which creates nine assets. Yes, which is ridiculous—three lakehouses, three objects each.
18:55 I would also—I feel like, Tommy, you and I have had this conversation, I don't know if it was on the podcast or when we were doing the training stuff we've been doing around Fabric. I agree with you, Tommy, and I think the lakehouse should have three icons and more settings: the lakehouse should be one line item inside the workspace, and if there's a SQL endpoint that goes with it and a dataset that goes with it, just make them different icons on the same item—it's all the same lakehouse.
19:26 I'm not building a SQL Server; I'm not bringing a provisioned SQL Server into that thing—it's literally just borrowing from Microsoft. In my opinion, it's just a SQL endpoint on top of the lakehouse. So give me one item: it's a lakehouse, and I can access it via a dataset, a SQL endpoint, or by going straight to the files and the tables. And for my lakehouse and subfolders, count how many objects are in there, and bring back colors and icons. I can't take it—I don't know what they are when
19:56 you're looking at a lakehouse. Yeah, everything's white. So the UI is a huge part of it, whether it's folders or some organization that way. And then I wish it was an even better experience going in and out of artifacts. If I'm going through a dataflow just to check or test something—to see what query that is—or a pipeline or a notebook, there's still a lot of loading, a lot of tabs opening up again. Something like VS Code. Yeah, I like that,
20:27 that's a good point—really upping the UI, especially if we're just testing things out and we're not doing anything business-critical or enterprise yet. How many artifacts is that going to create? That's a really good point. I like the folders idea, I like being able to group things. Would it also be relevant, Tommy, for you to hide things in there as well? A thousand percent. Yeah, like just older stuff. Maybe hiding makes sense, but give me something—I don't need all these objects I don't want, and everyone just—I don't know,
20:57 maybe that doesn't make sense, I don't know. Give me some custom views on the top ribbon that I can create myself. Yeah, maybe. Interesting—I like that feature. Seth, on over to you. I forgot your first feature, Seth—that was ten dollars for... what was it? Oh, dark mode. Yeah, and then administration for 20. Yeah, I'm hot-taking, switching things up. Okay, but this one is very real, near-term—a really heavy frustration. So I'd
21:27 throw 30 right now at: just give it to me—allow me to use Fabric in the Central US region, because that's where all of my big production data is, and where I'd maximize the use of this in my organization. And I can't. Yeah, because for organizations that have multiple tenants, you're stuck with multiple different Power BI
21:58 instances, right. Which—I don't know how to throw money against this; it's ancillary. I've got to figure out: what if I have OneLake in two different tenants and I need to start interacting with data across these things? I don't know what that's like yet, and I haven't been able to vet or test it, because, once again, the Central region is not supported. So, getting access to your data and all the things—or just let me move my Power BI tenant to one that is supported. That's a true statement too.
22:29 Sucks. Do they say where Central US is based out of? Is that the Iowa area? I know there's North Central US, which is more like Chicago. Yeah, and then Central US is more the Plains area, I think. Yes, and then there's South Central US, which is like Texas—there are some data centers down there or something. Yeah. We have things, tenants, set up in place—just give me Central. You can't
23:00 really kick the tires on things until they support it there. That makes sense, huh. Maybe—and again, I'm trying to give them a little credit here—this is one that they're going to fix when they get to GA. Oh, for sure. Right, because that would be: hey, we're just going to roll it out in some data centers that we actually have space in for this stuff, get the bits working during preview, and as we get closer to GA you'll start seeing it appear in other regions as well. Is that
23:30 what you're—? Absolutely. Okay, yeah, makes sense. I like that one as well—thirty dollars for that one. Well, dude, it's one thing to kick the tires, but actually one of the biggest challenges we've had, even in talking and discussing, is understanding what the costs are going to be on this. How do I throw a workload at something and not be able to test it out? How do I get max usage? Right now, the preview is
24:00 where you want to be bashing it against your production datasets in a dev environment, to see what those trade-offs are going to be. And when you can't do that, because your entire ecosystem is bound to a region, the preview doesn't allow you to do that. So you're playing catch-up behind other organizations that may have data in those regions, and we're not in a position where I could move datasets and incur additional cost—
24:30 no, not at the volumes that we're at. So—and I'm not sure, they're changing things, have you checked recently—the preview of Fabric probably won't let you pick wherever you want, right? So you're going to the Azure portal and saying, spool up an actual F capacity or something? Well, again, that's more cost, but I want to say no, because it's bound to the Power BI tenant. So those Fabric
25:02 features are not available to me. I can't connect to any of the lakehouses or any of the SQL warehouses—those don't even light up in that tenant at all, so they're not available. It shows me that everything's Fabric green now, and I have the general UI, but all the core components of connecting to and utilizing any of the big new things that we're combining—in terms of
25:32 lakehouse, OneLake, notebooks—that whole experience is not available. Gotcha. Awesome, I like that one, it's a good one. I don't think I'd need as heavy a lean on the Central US region, so I think I'm up to spending some more money now. I'm probably dividing it a little bit too many times here, but I'm going to throw ten dollars at syncing the data pipelines experience
26:02 in Fabric. So Fabric has pipelines, which is essentially Azure Data Factory, but there are a lot of data sources, or locations of information, that are missing from the full Azure Data Factory experience. Again, this is one of those things where I feel like they've divided the code between the different environments they've built for Azure Data Factory: there's Azure Data Factory, there's Synapse, and now there's Fabric, and all of them seem similar—like they've taken the root code base of
26:33 Azure Data Factory pipelines and watered it down a little bit for each of the different environments. Maybe there's a lot of investment needed to get parity, but I think I would spend another ten dollars on synchronizing the parity between ADF, Synapse, and the Fabric pipelines experience, so that if I'm developing in ADF or Fabric, the experience is identical: I have the same connectors, I have the same options. There are some really interesting administration features that come out of the Azure Data Factory space that are
27:05 not in Synapse, and also not in pipelines inside Fabric. So I'd like to see those features brought over, so that the Fabric pipeline experience matches the same experience I would get with Azure Data Factory. You're all about syncing everything up. Well, let me put it this way: I'm used to building things in Data Factory, I'm used to building things in Synapse, and so I have this tainted experience—I've already
27:36 used these tools after they've been developed and matured for years in those other platforms. So when I walk into Fabric, my opinion right now is that from a business user's perspective the Fabric experience is great, and it greatly enhances the business user's experience, because I'm giving them two things they've never had before: pipelines and notebooks. Those are two great data engineering tools that help you really build out your ETL. But if I'm coming at Fabric from a data engineering perspective, I'm going,
28:07 "Where are all my features? Where's all the stuff I used to be using?" Why would I come over to Fabric and have fewer features than I already had in Azure Data Factory or Databricks? I already have solid data engineering tools — what's the story for me to move over to Power BI? I
28:22 think that's the gap that needs to be closed. Of the two personas, the business user is getting the better deal, because they're getting a whole bunch of enterprise-grade tools they've never had, whereas the data engineer is getting a watered-down version of the data engineering tools they already had — fewer features. So the story is less compelling as you bring data engineers over to Power BI. I think it'll get better, but right now it's a little less compelling. Does that make sense?
28:52 And that was only ten dollars and five dollars for those — oh, you wait until my thirty dollars, my goodness. All right, Tommy, back over to you. Yep — we're going to do another 25. Whoa, what? Yes. Your last 25 was UI in Fabric, so that would bring us to 70. Yep, and this one is a quick little
29:22 add-on based on something you said, Mike: external tooling and application awareness. Related to that, I think Fabric is going to be more and more about collaboration across tenants — teams, consultants, people like you and I. We've been trying to build a lakehouse together: I can connect to your workspace, but I have my own tenant's domain, and I can't connect to your lakehouse. We've
29:52 seen some very weird behavior with guest users. A guest user inside a Fabric workspace can't quite access all the core elements of that workspace. I don't know if it's a bug or not, but when it pulls things Tommy owns inside a Fabric workspace, it keeps pulling them from Tommy's own tenant — like the lakehouse. I'm trying to connect this dataflow to a lakehouse, and it's not connecting to the lakehouse he's part of in the PowerBI.tips tenant — it's actually
30:22 connecting to the lakehouse in his own tenant when he tries to grab data or land data. Which is very odd, but it makes sense, because he's using the credentials of his other email address — his email address comes from his other tenant. But it's got to have guest user access in AD, right? I used to do everything in the PowerBI.tips tenant, in the workspace, in Fabric. So let's create a lakehouse — but when I create stuff, I can't write data to it. Say I try to connect a dataflow or push
30:53 to a lakehouse: what it shows me is my Puglia BI items. So one of the things that's interesting to me is, what is the experience for that user if somebody shares something with you from that tenant? Right now I don't know — I don't think we've actually tested that exact thing. For example, one of the biggest challenges we've had, because we have multiple tenants, is that originally
31:23 we wanted to share Power BI content in the Power BI service with the same experience for external users. It's all URL-based — you always have to reference that URL; there are no apps or anything. So it's interesting that you're talking about an external user, which you can't really do here — in other parts of Azure you absolutely can, right? People from a different domain can come in, no problem, and do whatever they need to. So if this is now stretching the
31:53 bounds, there's going to be a conflict there, where they can work with all these elements within Power BI but you can't share things with them. It's an interesting one. It's getting there — I think there's going to be such a need for it; that's why I put so much on it. In all the training we've done with cross-tenant work, I've created a lakehouse, I've created a dataflow, I've done all the things, but there are some weird login issues. And then I'll probably want to work on some of the tables, some of the items,
32:23 in an external tool, with some advanced features. Right now you're right, everything's URL-based. This goes back to Mike's admin point too: if I need to get a connection to an artifact, I shouldn't have to hunt for that setting every time, especially if I'm collaborating across tenants. You also mentioned external tools — were there any particular tools? Let me articulate the feature back to you, to make sure I understand what you're saying: better integration with external
32:53 tools, and/or better security around adding guest users to artifacts inside Power BI. There's the cross-tenant collaboration, and with it the external-tools ability — I combine them because you made me think of it. The external-tools side of this is: I'm working with other tenants and other companies. Right now, with a warehouse, you can create views off of it, you can create
33:24 a SQL database with views and tables — easy integration to help manage their dataflows. It would be amazing to set all that up for them. Let me keep you there, because I'm not hearing a tool yet. Are you talking about something like a VS Code plugin you'd use to say, "I'm going to write out this dataflow, write out these things," and that helps you connect to a customer's tenant and push them? Is that what you're thinking? Yeah, that's the external tool we're talking about. We still don't
33:55 even have, in Power BI Desktop, the ability to switch accounts — I still have to sign out and sign back in with another user. But that's not an external tool — I'm driving at what external tool you're looking for. I'm looking for something where I can manage the items and artifacts in Fabric — dataflows, datasets, modeling — and switch across accounts seamlessly. In Fabric there's a new setting that I think Greyskull Analytics in the UK just showed: you can now do read and
34:27 write over XMLA for Fabric items. So we're getting there. But give me something where, rather than having four systems open in four browser windows and making sure I have the right URLs and sign-in information, I can manage all of this in one place. OK, I still don't hear an external tool — I hear access points that external tools could potentially build on. You don't have a specific external tool in mind that you really care about; you're saying, give me more access to these things, the artifacts, the
34:57 things that are in Power BI — right, so that people could build external tools. Exactly — and do more automated movement or manipulation of stuff in Power BI. For example, I would really not want to create another measure in the service on a default dataset — as you've seen, it broke horrendously on a live stream. That was awesome; that was really fun. That is the worst experience. You were right that desktop in the cloud was going to be terrible for creating measures. I was right. Patrick's wasn't so
35:27 hot either when it first came out, so I think we need to give them some time to refine this. We were writing measures and they would disappear; we'd hit enter and get "sorry, error," and then it said things were locked. What is going on right now? We had no clue. Gold datasets are going to have more than three measures in them — there has to be a better way. To that end, editing things in the service was a big boon, but this whole idea of the data model always being immediately updated —
35:57 automatically auto-saving everything — I think somehow creates a locking experience if you have two people trying to edit the same dataset. That would be a problem, because it's a workspace — everybody's working in the workspace; that's the point, right? Multiple people. You could have multiple people editing the same stuff, so I think a better experience would be more like what they do inside Synapse, where you're on a branch and you're committing things, checking things in — and then the
36:27 low-code people say that's too code-centric, I know. I want to check out and use a tool to create a measure where I know it's not going to hide on me. That's true. All right — Tommy, you're up to 70 bucks. Seth, you're up to 60 dollars. Since I know you increment in very small numbers, I'm going to shoot the rest of my shot: 40 bucks. Whoa, 40 bucks — this better be great.
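The locking problem the hosts are circling — auto-save plus two editors on one dataset — is the classic optimistic-concurrency trade-off. A minimal sketch, assuming nothing about any Power BI API (the `Dataset` class and measure names are invented for illustration):

```python
# Optimistic concurrency with a version stamp: instead of locking the
# dataset, each save declares which version it was based on, and a stale
# save is rejected rather than silently overwriting the other editor.

class ConflictError(Exception):
    pass

class Dataset:
    def __init__(self):
        self.version = 0
        self.measures = {}

    def save(self, base_version, measures):
        # Reject the save if someone else published since this editor loaded.
        if base_version != self.version:
            raise ConflictError("dataset changed underneath you; reload and retry")
        self.measures = measures
        self.version += 1

ds = Dataset()
ds.save(0, {"Total Sales": "SUM(Sales[Amount])"})   # editor A publishes first

try:
    # Editor B also loaded at version 0, so their save is now stale.
    ds.save(0, {"Total Sales": "SUMX(Sales, Sales[Qty] * Sales[Price])"})
except ConflictError as err:
    print("blocked:", err)
```

The branch-and-commit workflow Mike prefers is the same idea pushed further: the conflict is surfaced at merge time instead of at save time.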
36:59 It's what I call AI innovations — Copilot. So, one, I would love to see some non-Figma demos come in here pretty soon, across the board. Where my mind goes with AI innovations, above and beyond just the Copilot experience, is the things Wyn Hopkins is exploring with ChatGPT and other AI transform tools, where he's getting very
37:29 verbose in data connecting and cleansing. If you haven't checked out some of his stuff, do it. He's showcasing that AI can essentially connect to data sources and understand the fundamental breakdowns: why something's not clean, why the data isn't coming
37:50 together — the data types, the issues with the data. There's so much there. We've talked a lot about how a fundamental core problem with business intelligence in general is bad data in source systems. So if I can use AI to connect these sources — another great example we're experiencing: you connect to Salesforce. Salesforce is a system that allows business people to create databases, tables of information, that span one, two, three,
38:22 four, five, six hundred columns, because they have no idea how to manage it — they just add another column, add another account. And that's the advantage of that system; that's what they want you to do. It's a crap show, yeah. But think about leveraging systems that could rip through there and say: yep, none of this is used anymore, here are the columns you need, here's what's missing — can I associate that with a business owner? Think about all the ways you
38:53 could automate a lot of this ticky-tacky "I don't understand why this doesn't line up, our numbers are wrong," et cetera — just in that first layer. And then we've already seen demos, or pseudo-demos, of Copilot as a coding experience where somebody's doing transforms. I think that whole experience could get a lot easier for business users: OK, now I have my data, I need to compile and transform it —
39:23 are we to the point where, if you describe things, it could do those transforms, and the output would be a model? Can it conform things to a model you could easily engage with, as opposed to you having to build all of that? And then, pie in the sky, there's the "go build my report" thing in Power BI we've seen pictures of — I'd love to see that eventually. But I think a true end-to-end would be tooling that allows individuals who know how to put in the
39:55 right inputs to build the whole thing, end to end. I don't think that's off the table at all, based on the compartmentalized, componentized ways I've already seen people using AI. The investment there speeds up not only the time to market for solving business problems but also actually builds out infrastructure that can be reused for many other reporting purposes. That's the value
40:25 here: these systems can help facilitate and speed up the whole process of the actual engineering of things. Sky's the limit. I like that one — I think that's a good feature as well. The AI Copilot thing has the potential for lots of really rich capabilities. I already find people using ChatGPT to help them write measures to some degree, but this spans way more than just writing
40:56 measures. I really like your idea of: hey AI, here's a table of data — tell me where the inconsistencies are. It should rip through a whole bunch of columns coming out of these systems and say, here's where we found problems, here's where you should focus your attention. And if those columns don't mean anything to you, fine. You should check out Wyn Hopkins' recent videos and the things he's doing. I've seen him posting that stuff — it's pretty cool. What's his YouTube channel?
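The "rip through the columns" idea can be sketched in plain Python: flag columns in a wide export (think Salesforce) that are entirely empty, constant, or mixed-type. The column names and rows below are made up for illustration — a real system would layer AI on top of exactly this kind of profiling:

```python
# Profile a list of row dicts and report suspicious columns.

def profile_columns(rows):
    """Return {column: list of findings} for a list of row dicts."""
    findings = {}
    columns = rows[0].keys() if rows else []
    for col in columns:
        values = [r.get(col) for r in rows]
        non_empty = [v for v in values if v not in (None, "")]
        issues = []
        if not non_empty:
            issues.append("entirely empty")
        elif len(set(map(str, non_empty))) == 1:
            issues.append("constant value")
        if len({type(v) for v in non_empty}) > 1:
            issues.append("mixed types")   # e.g. 10 stored as both int and str
        if issues:
            findings[col] = issues
    return findings

rows = [
    {"Account": "Contoso",  "Region": "West", "Legacy_Col_14": "", "Score": 10},
    {"Account": "Fabrikam", "Region": "West", "Legacy_Col_14": "", "Score": "10"},
]
print(profile_columns(rows))
```

On this toy data it flags the dead legacy column, the constant region, and the score stored with inconsistent types — the "here's where to focus your attention" list the hosts are asking for.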
41:27 He's got a YouTube channel, but most of it's on LinkedIn — he posts the videos right there. Let me grab that. And it's not just that; there are a few others. What's that command-line tool — dbt? Yes. I just saw an announcement for that one. I'm a little leery on dbt. I've seen it in a couple of places; I've heard some customers asking about it and trying to incorporate it into their workflow. On one hand I like
41:58 what it's doing; on the other hand I wonder, is this just the same thing I would automatically get, or could already build, in Azure with all the other tooling? I don't like the mindset of adding tools just to add tools. I need a really solid purpose behind it: is it solving a large enough technology gap over what I already have? dbt is about composable data architectures, reusable components, defining these things — yeah, I get it, but don't we already have that? Isn't
42:30 there something already there doing this stuff today? So I'm not sure I'm totally sold yet that that's the way to go. I'm cautiously holding my breath, I guess I'll say it like that. Yeah — and they're doing a lot with it. But even just the data modeling, to your point, Seth — and it's not just ChatGPT, like "hey, look at my model." It's the LLMs, the
43:01 large language models — the ability for those to quickly go through a data model. Obviously data models are nowhere near as complicated as some of the things LLMs handle now: try to identify primary keys, try to identify a few other things you'd add in. That's going to be huge from a creation standpoint. Seth, do you see anything from a consumer point of view with the AI, or is it really just the development, the build? That's interesting, because you're focusing a lot on the development experience as opposed to the
43:32 consumer side. But what is the consumer side of reporting? In general, could you utilize AI much better in the features where I can ask a question? Right now we have to custom-build all of the synonyms and everything within a model so the keywords show up appropriately and the model can produce the right results for the end user. Could AI simplify that experience without all of that? I think so,
44:02 because it really comes down to: do I have the data within this model to produce the results the end user is asking about? It's the same thing — it's a dataset, and the AI is just bolted onto a dataset. Whereas ChatGPT isn't working from the most recent data; it's not up to date with the internet. So could that work in the use cases where somebody's asking for things outside the bounds of a report page and a visual? I think
44:32 there are even some tools out there saying we don't need report visuals anymore — just ask your question and we'll give you the answer in visual format. A mix of that could be extremely valuable too. Where it really extends, to me, is on the data side: I'm a sales guy, I'm doing my quarterly business review for this company, here are the metrics I need to show them — and it rips through the
45:03 data — not necessarily a Power BI model, just the OneLake data sources — and says: here's the company, here are the things, here are the metrics, bam, here's your answer. Interesting. Very interesting. I think your feature touched on something I'm also going to throw some money at in a little bit. You'd better hurry up, because we're running out of time and you're out of money. We can keep going! I am out of money. All right.
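The synonym problem Seth describes — Q&A mapping a user's words onto model columns — reduces today to a hand-built lookup table, which is exactly the work he hopes AI could infer. A toy sketch; the synonym map and column names are invented:

```python
# Hand-built synonym map: user vocabulary -> model columns. Today this
# is authored per-model; the hope is an LLM could generate it instead.
synonyms = {
    "revenue":  "Sales[Amount]",
    "sales":    "Sales[Amount]",
    "customer": "Customer[Name]",
    "region":   "Geography[Region]",
}

def resolve(question):
    """Return the model columns mentioned (by synonym) in a question."""
    words = question.lower().replace("?", "").split()
    return [synonyms[w] for w in words if w in synonyms]

print(resolve("What was revenue by region?"))
```

Every word outside the map is simply dropped — which is why unmapped vocabulary makes today's Q&A features produce nothing, and why inferring synonyms automatically would matter.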
45:34 So I'm going to throw down my next two items, because I think I'm behind everyone else on spend at this point. I'm going to put twenty dollars on better debugging and experiences around notebooks inside Fabric. I think notebooks, and the massively parallel processing that comes along with them, are a game changer. I really like writing notebooks — and given the choice, I'd rather work in a notebook than a SQL script. I like using SQL inside notebooks — the
46:04 Spark version of SQL — and I enjoy writing it in the notebook experience much more than in a SQL script. A SQL script is one long text file, whereas in a notebook I can execute one command at a time, evaluate, clean up, and look at the results. So I really like that experience. It's just very frustrating, because I've done a bit of testing and I'm still getting my head around it: sometimes when you're trying to create a temporary view —
46:35 a temp view, a view that lives on the machine — and you try to write that view back down to the lake, or create another table, some of the commands didn't seem to work correctly. I had the wrong data frame and it wouldn't let me write the data back down. I needed better debugging, better indicators of what I did wrong. Help guide me a bit more — what code should I be using here? And to your point, Seth, the AI feature you're describing would also greatly assist inside notebook
47:06 creation: hey, I see you're trying to write the data down — here's a piece of sample code that would help you write the table back to the lake. Give me AI there as well. But that whole debugging experience, whether it's
47:18 AI-driven or not — there needs to be better language, better information inside the notebook experience about what is wrong: why is it taking so long, what are the problems with your code, whether it's Python or SQL? What was the issue, what mistake did I make? Help me understand the problem. Because if we're talking about notebooks, we're clearly talking about developers — that is a developer experience, and you've got to make it top-notch and easy to use.
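The temp-view-then-materialize pattern Mike is wrestling with has the same two-step shape in any SQL engine. A minimal sketch using the stdlib sqlite3 module as a stand-in for Spark SQL (table and view names are invented; in a Fabric notebook the final step would be something like Spark's `CREATE TABLE ... AS SELECT` or `df.write.saveAsTable`):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_sales (region TEXT, amount INTEGER)")
conn.executemany("INSERT INTO raw_sales VALUES (?, ?)",
                 [("West", 100), ("West", 250), ("East", 75)])

# Step 1: a temporary view, scoped to this session/notebook only.
conn.execute("""
    CREATE TEMP VIEW clean_sales AS
    SELECT region, SUM(amount) AS total FROM raw_sales GROUP BY region
""")

# Step 2: evaluate the view interactively before committing to it —
# this is the cell-by-cell inspection Mike likes about notebooks.
preview = conn.execute("SELECT * FROM clean_sales ORDER BY region").fetchall()
print(preview)  # [('East', 75), ('West', 350)]

# Step 3: materialize the view as a persistent table (the write-back-to-
# the-lake step that was failing silently in his testing).
conn.execute("CREATE TABLE sales_by_region AS SELECT * FROM clean_sales")
```

The debugging gap he describes sits between steps 2 and 3: the view previews fine, but the write-back fails without saying which data frame or view was actually in scope.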
47:48 So that's where I'd throw another twenty dollars. And I'll throw in one more feature: I'm putting thirty on something I'm borrowing directly from Databricks — the Unity Catalog, or master data services, or data lineage. Tommy, you already said you were going to throw money at data lineage too. I'm throwing a lot of money at this one because I think it's a very important feature. To your point, Tommy, as the workspace gets more crowded with artifacts, I'm seeing the workspace become
48:19 this rat's nest of random data moving in and out. You can connect anything to anything now: I can use a dataflow to pick data up from a lake, do some transformations, and put it back down in the same lake, a different lake, a SQL server, or wherever else. There are so many possibilities that people are going to have a hard time figuring out where data came from and where in the process the business logic was applied. So I really think we need better documentation, and,
48:50 to your point, Seth, better data stewardship and ownership of data. I'd put that in this Unity-Catalog-style catalog that has every table, with descriptions of columns, with all the measures and where they go. And then I want full lineage, from the raw bronze-level data — if you're talking Medallion architecture — all the way to a visual. Give me the full lineage of where all this data came from, and let me trace it back: from this visual, these measures, this column, back to the dataset, back to these tables,
49:20 all the way through. I should be able to very quickly diagnose where all that information came from, and I think that's a huge confidence booster for the business user — even though I need it as a developer too. We're starting to see things appear — Tommy, what we looked at was called impact analysis — and I want more of that. So those are my next two features.
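The visual-back-to-bronze trace Mike wants is, at heart, a walk over a dependency graph. A toy version — the artifact names and edges are invented; `upstream` maps each artifact to the artifacts it reads from:

```python
# Toy lineage graph: visual -> semantic model -> gold -> silver -> bronze.
upstream = {
    "Sales Visual":         ["Sales Semantic Model"],
    "Sales Semantic Model": ["Gold.SalesSummary"],
    "Gold.SalesSummary":    ["Silver.SalesClean"],
    "Silver.SalesClean":    ["Bronze.SalesRaw"],
    "Bronze.SalesRaw":      [],
}

def trace(artifact, graph):
    """Depth-first walk from an artifact back to its raw sources."""
    path, seen = [], set()

    def visit(node):
        if node in seen:          # guard against cycles in messy graphs
            return
        seen.add(node)
        path.append(node)
        for parent in graph.get(node, []):
            visit(parent)

    visit(artifact)
    return path

print(" <- ".join(trace("Sales Visual", upstream)))
```

Impact analysis is the same walk run in the opposite direction: start at a table and follow the edges downstream to every visual it feeds.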
49:51 I like that a lot, because anybody in the data realm knows that documenting things all along the way is a challenge — it goes stale. But at the same time, with AI you could take the pipeline and interject things like: hey, there are transforms that happen here, here's what happens there — and make it a business-level description. Think about us as developers: you're on the phone with somebody, looking at the code, and they ask, "Well, what is it doing?" As you read the code you say, OK, my interpretation is we're taking this account, then removing all
50:21 of these, adding this, filtering out that, and adding in this to get that thing — and the business person goes, oh no, that's not right, you need this and this and this. That's where AI is great: explain this measure, show me the data sources and all the transformations in a way people can easily digest. Exactly — to me that's low-hanging fruit right now. AI should definitely pick
50:51 that up and just handle it, so I don't have to write documentation — solve my documentation problem. Have it automated: every time I publish a change, it knows what changed and spits out the new definition of whatever that thing is. We're already getting there — GitHub Copilot has a preview of automatic, AI-generated git commit messages. Yeah, it's coming.
51:21 The tool I use for note-taking has this idea with databases — auto-summarized pages. We could do that with tables; put in little prompts. I agree, that's a good one. So you've got 30 left. I'll do my last one. You do your last one — I've got one more after this one, Tommy. What's your next one? I'm throwing the rest of my budget at the Dataflow Gen 2 experience. If they leave Gen 1 in the dust, so be it. I'm with you — I'm all behind this one. Dataflow Gen 2 is, holy smokes,
51:51 really good. It just still feels a bit too slow to me, and they need to make it massively parallel, like the notebooks — if they can do that part, I'll be very impressed. I really like all the capabilities in Gen 2; I agree with you 100% on this one. And please tell me how append versus replace works, because right now I don't think it does — or at least make it clearer. And how can I configure the Power Query? We were creating something, I did an
52:22 append, and suddenly I had 18 duplicates of the same thing — what the heck? So: make it faster, make the UI a little better, give it a little more space. I don't know — a desktop app or something? Oh boy, let's not go there — I've closed my browser three times today; it just crashed. That's because you have too many tabs open. Just one or two tabs: all you need is a Power Query dataflows tab, a data model tab, and a report build tab — that's all you need.
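Tommy's 18 duplicates are the classic append-without-a-key problem: append blindly stacks rows, while an upsert keyed on an ID replaces matching rows instead. A small illustration with made-up rows (this is the general pattern, not a claim about how Dataflow Gen 2 implements either mode):

```python
# Append blindly stacks rows, so re-running a load duplicates them.
existing = [{"id": 1, "amount": 100}, {"id": 2, "amount": 200}]
incoming = [{"id": 2, "amount": 250}, {"id": 3, "amount": 300}]

appended = existing + incoming            # id 2 now appears twice
print(len(appended))                      # 4 rows, one duplicated id

# An upsert keyed on 'id' replaces matching rows instead of duplicating.
merged = {row["id"]: row for row in existing}
merged.update({row["id"]: row for row in incoming})
upserted = sorted(merged.values(), key=lambda r: r["id"])
print(upserted)
```

The upsert keeps one row per id, with the incoming value winning for id 2 — which is usually what "append" is expected to mean when the destination already holds earlier loads.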
52:52 That's all you need anymore — that's it. No way! Or exit out of it, or go back — no, I don't want back buttons; I want something dedicated. Make it faster from the UI, because it's still a better experience to edit and build in — agreed — but from copy-pasting to editing, it needs to move a little faster. So either give me that experience in the web, or just let me do Dataflow Gen 2 in desktop and push it — I don't care. But make it faster, help me
53:22 with pushing to other sources, and keep it going, because I think that is going to be the future of Fabric. So I'll give you my last 30 here, and this is again a very developer-centric thought — not so much UI or front end, but I think Fabric is trying to push toward the data scientist and the developer anyway. I'm throwing 30 down on git, and deeper integration with all the artifacts created inside the Fabric workspace. I've tried the git integration as it stands: I can see that I can commit a dataset and
53:53 a report, and both are being updated. But I want TMDL for everything inside the data model — I want the whole data model in TMDL, so it's easier to edit and to check in and out of git. And then I want a report definition: I can see it now with a PBIP project, but I want a full definition — bring TMDL-level detail to the report side, to all the artifacts in the report, with the ability to manipulate them. Shameless plug: in our Power BI theme generator, you can now not only build a theme,
54:24 you can also build a full report today. I'm already very deeply involved in this PBIP world and very excited to see PBIP evolve into something better, because we're already building PBIP projects for you today: you can build multiple report pages, add your own backgrounds, and do all kinds of other crazy things. Stay tuned — more new features are coming in the near future that will make that experience even better, letting you build your entire report without even touching desktop, which will be an absolute game changer
54:55 for anyone who wants to develop templates or build things repeatedly, over and over. It's going to be huge. Crazy cool. All that being said, that's where I'd throw my last 30 dollars, and I think with that we've all spent our money. We have done it — three hundred dollars, and we have solved all the major Power BI and Fabric issues and problems. Just kidding — good luck, Microsoft. It's
55:25 going to be hard for you to figure out which features to build next. You'll probably take none of our ideas, throw them all in the trash, and build what you think is best — probably something totally different from what we're even imagining. With that, we really appreciate your time — thank you to everyone listening today. I hope you got some ideas out of this; it was a fun conversation for us, rambling about the different features and where we see the strengths and weaknesses of Fabric at this point. It's still in preview, and I'm very confident a lot of these things will get fixed and we'll
55:56 get better experiences around all this stuff as they continue to invest in and develop Fabric. This is so much of an investment on Microsoft's side that I don't think they're going to walk away; they're going to keep investing. This is the future of what's happening with Power BI and where we're going, so I think it's just going to continue to snowball and get better and better. Very pro-Fabric — I like the experience; it's still a little rough around the edges, but it's getting better. With that, Tommy, where else can people find the podcast? You can find us on Apple, Spotify, wherever you get your podcasts.
56:27 Make sure to subscribe and leave a rating — it helps us out a ton. If you have a question, an idea, or a topic you want us to discuss on a future episode, go to powerbi.tips/podcast. Thank you — we'll see you all next time.
Thank You
Thanks for listening to the Explicit Measures Podcast. If you enjoyed this episode, share it with a coworker and drop a question or topic suggestion for a future show.
