PowerBI.tips

OneLake for Fast Adoption - Ep. 370

In this episode, Mike, Tommy, and Seth dig into OneLake and why it matters for faster adoption in Microsoft Fabric. They talk through how a unified data layer changes the day-to-day workflow for teams, where it can reduce friction, and what to watch for as you roll it out.

News & Announcements

  • TMDL - Visual Studio Marketplace — Extension for Visual Studio Code - Language support for the Tabular Model Definition Language (TMDL).
  • PowerBI.tips Podcast — Subscribe and listen to the Explicit Measures podcast episodes and related content.
  • Power BI Theme Generator — PowerBI.tips - The world's best theme generator for Power BI reports. Increase your speed to develop stunning reports using this free theme generator. Themes are essential for any report developer's tool belt. Visit…

Main Discussion: OneLake for Fast Adoption

The discussion focuses on how OneLake can remove a lot of the overhead teams feel when data is duplicated across tools and environments. The crew shares practical ways OneLake helps centralize access, simplify governance, and speed up delivery—especially when multiple teams are collaborating across Fabric workloads.

Looking Forward

If you’re trying to accelerate adoption, start by mapping where your data is duplicated today and which teams are blocked by access and governance friction. Then pilot OneLake patterns in a small, high-value area, document the conventions, and expand once the workflow feels repeatable.

Episode Transcript

0:34 good morning and welcome back to the Explicit Measures podcast with Tommy, Seth, and Mike good morning everyone good morning good morning I'm still traveling so that's why the background looks different today but been having a good time traveling it's been good conversations we just did a YouTube video yesterday that we'll talk about in a moment and quickly let's jump into our main topic for today the topic we're going to talk about today is opening up OneLake for fast adoption let's talk about how OneLake is changing what we

1:06 build and maybe how we build it and are there any other alternatives I think I have some good opinions around some of this now I'm starting to formulate some more ideas around this further testing required but this will be I think a great discussion today looking at how to leverage the lakehouses within your Power BI development let's just jump over real quick any other news items I have a brief news item and I'll jump to that one if anyone else has anything news related it's been quite

1:36 quiet on the blogs Microsoft Ignite is coming up it is happening at the end of November so there's probably not going to be too many announcements they're holding news until Microsoft Ignite happens typically so there's probably not going to be much we're going to see here in the next couple days or weeks up until Ignite shows up but once Ignite hits I'm pretty sure there's going to be a lot of announcements at Ignite one of these conferences I want them to announce it three weeks before and then spend the conference doing workshops and

2:06 letting us actually just play around with it never gonna happen never happen just one of these times so I know like oh that's so cool when is that oh October but then you take away Tommy all of the excitement from attending the event and being in the room when it's announced and yeah to say I was there along with everyone on Twitter right that was one of the moments when I was watching did you guys ever watch the SpaceX landing of their new it was

2:37 the Falcon Heavy rocket where the rocket came down and was caught by the little arm things yeah so I watched that live I got a little like wow this is such a technological marvel and it's one of those things when we go to like the MVP Summit or we go to conferences like Fabric Conference there's usually large announcements being made because Microsoft is trying to put a splash down of what's coming those are extremely exciting to go to and yeah I'm very happy we now have a dedicated data and

3:07 AI conference that is happening which will be fun so Tommy I think what you're proposing is like they should have a follow-up conference yes yeah exactly which would be smaller but it's like hey now that you've digested the stuff for a month right or a couple weeks right like we're going to have another follow-up conference or should it just be at the end of the conference we all know those dirty little secrets with any big

3:38 company doing demos for the first time it's flying by the seat of their pants and that thing is not they've never how many times have they actually been on one of our conferences they're like we're gonna show you something and it's available right now today just log in oh there it is sometimes I get what you're saying some of the future stuff right might be in a video form or something because it's about the idea it's about the thing that you're going to get and is it fully

4:08 baked all the time before they announce it no not always here's what I'd like to propose to you Tommy because I think your idea is a great idea what I think should happen is there should be other groups like user groups or things like that user groups should be that force to drive the let's get our hands on these new features after like a month after the actual event so I would encourage you if you're listening to the podcast and you're part of the community when Microsoft has big events what I think would be wise here is

4:38 listen to those events find what you think is interesting build little mini sessions or workshops around that and then run user groups on top of those events so one thing I've been talking about with Tommy here for a while I just didn't do a good job this year so maybe 2025 we'll do something but after these big events particularly the Las Vegas Fabric Conference I'd love like a little Midwest mini Fabric conference and do one just touring around a couple cities jumping into a couple Microsoft offices and actually sit down and do

5:09 the same thing but rehash the content do a little bit more demos because now the software's out you can play with it you can start seeing how it works I think that would be a wise idea so free idea for anyone if you're a user group leader we recommend you go do that that'd be amazing oh one thing I'll have to admit here for those of you who are listening who are user group leaders Tommy your user group is on Meetup correct oh this is very good yes it's been on Meetup for quite a while

5:40 now so Microsoft has recently changed a pivot in direction on their user groups well let's go through a little bit of user group history real quick yeah we were in this thing called Dynamics Communities they had their own website their portal we used that to get together do groups and events it was awesome it was good and Microsoft said hey we're not going to work with Dynamics anymore we're going to take over user group management and do it our way great they built a new website we were now user group leaders there but it was

6:10 very difficult to get meetings created and added and actually run our groups Tommy talked to me about it look Meetup is the way to go it's the place to be and the Milwaukee user group which is one that we help out with Seth and I lead that one that group we didn't use Meetup because we didn't want to pay for it well if you're a user group leader for Power BI you should definitely reach out to your community leads because Microsoft now is offering a free Meetup subscription in their Pro Network so Microsoft said look we

6:41 know this is the best way to do it we're going to give you a Meetup event so if you're paying for Meetup for your user group or you want to start joining the Microsoft Meetup groups you can transition from wherever you are to the Microsoft Pro Network inside Meetup and that will let you have a full Meetup for free so I'd highly recommend other groups if you are doing this if you're running groups if you have meetups make sure you reach out to your community lead if you actually need extra help with that let us know we'll help you get in touch with the right people but I think it's a really good

7:12 opportunity I like the Meetup platform I think it works well it's easy it's good I'm happy with this move one bit of historical context before that the original user groups were run straight through the community so Microsoft was trying to support them themselves then they engaged Dynamics Communities and okay and yeah let's just say there was a strong recommendation right out of the gate to just use Meetup so I'm

7:43 glad that they're there now you did bring up some really good memories I probably was on that Dynamics site at least three or four times a week putting content up there yep yeah it was such an early-to-mid-2000s type of website so yeah well not to be critical right but ultimately the goal of user groups is Meetup it is to create an event it is to get it to the widest audience possible and get

8:13 people to a live event right and I think along the way the implementation of some of those things created a second community of chatting and things like that which probably took away from the main power of the site and yada yada right like so hey we got a user group we got people you want to show up put it in the place where a lot of people go to look for events and Meetup is that place so yeah and it's nice because if you are traveling if you're going

8:43 places you can actually say where like I go on trips again as MVPs we like to speak wherever we can and me as a business owner I like to expense as much as I can so if I'm traveling somewhere I want to find a user group that I can jump into and like hey I'm going to speak here and then I can expense and stuff so I like doing that and Meetup I think just makes it so much easier from a searching for live events standpoint search for the word Power BI search the word Microsoft Fabric things just seem to show up so anyways just want to get that out there to people who are user group leaders please if you want

9:14 some free Meetup to run your user group let us know and we'll get in contact with the right people that being said jumping into my final news item and then we'll go into the main topic here I have one more news item I just did a YouTube video with Sid from the Microsoft product team the Microsoft TMDL extension that's an extension for the new format that they're using to build models just went through a major update literally yesterday three hours after it was released Sid was like hey let's do a

9:46 video on this and we sat down on YouTube and we did a video a demo a live demo of working with the TMDL editor for VS Code a couple prerequisites you have to have your reports in PBIR format your reports need to be linked or synced into it's the PBIP project format that's basically the format you're looking for those projects need to be synchronized with either Azure DevOps or GitHub one of the two doesn't really matter but once you have those synchronized you can then link your

10:16 workspace directly to a git repo all the artifacts the models the reports anything you build in the workspace will make objects inside that repo and it was amazing there's autocomplete there's

10:29 Copilot GitHub Copilot writing TMDL is amazing you don't know how to write format strings it does a good job of giving you how to write a format string inside a measure or column you don't want to document things you can highlight a whole bunch of code and say document this section of code it just did it if you want to yeah you could say make explicit measures on top of these columns and just list the name of the columns it writes all the code to write all the explicit measures in one prompt

11:00 or one command so I was incredibly impressed with this new plug-in because think of it the TMDL format is files that are text basically with some enhancements but you now have all the richness of a very mature code editing studio it is VS Code I don't know if anyone uses this online our team is very developer-centric and we use a thing called Live Share a lot

11:30 and it allows two people to work in the same VS Code environment together at the same time writing code in the same file together you can see people's cursors move around it's like a game changer for people who do development so if you have big models if you have a team of people working on models together to me this is a no-brainer like yeah you have to have this this will greatly increase your productivity and then the icing on the cake is anywhere you are in the code and I learned this yesterday Ctrl+I allows you to jump right into

12:02 Copilot with whatever you have highlighted or selected and Copilot will just give you a prompt window you shove in your prompt and boom out spits answers and solutions and dude I was sold I was like this is next-level stuff at this point not to say that Copilot isn't good but the GitHub Copilot is excellent dude this goes along with my thinking when we think about dev tools and everything is just going to VS Code you take the TMDL extension
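
To ground what those Copilot prompts are actually producing: TMDL really is just readable, indented text, so an explicit measure generated on top of a column looks something like the sketch below. The table, column, and measure names here are hypothetical.

```tmdl
table Sales

	/// Explicit measure written over the SalesAmount column
	measure 'Total Sales Amount' = SUM ( Sales[SalesAmount] )
		formatString: #,0.00

	column SalesAmount
		dataType: decimal
		summarizeBy: none
```

Because the model is plain text like this, diffing it in git and highlighting a block for Copilot both work the way they would in any other codebase.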

12:32 Gerhard Brueckl's Power BI and Fabric Studio extensions Synapse remote which basically will spawn your Spark server on your own machine this is all within VS Code man so I know you want to say the desktop is the development tool but still gotta go with this you-can-only-edit-the-settings-in-JSON-format type of development tool there continually are more and more personas being

13:02 supported by the Fabric environment right so it seems like we're growing a bit so people say oh you're the business analyst and you're going to use tabular models and that's what your work is I'm not sure I would throw that role all in one person like that I really do think there's like this I don't know how you describe it there's this really gray area for me where you're not quite data engineering but you're not quite the report analyst either I really think there's this persona called the data modeler right someone

13:32 who really intently understands the semantic models someone who knows how to shape the data and get it ready for star schema performant models large-scale models to me that feels like a skill set and I feel like Microsoft is putting that skill inside the data analyst or the analyst level of persona and I disagree I feel like it's more data engineering in nature it's more code writing and so that's why I like this extension for VS Code because that pushes me more towards that data engineer level of things as opposed to

14:04 the analyst level of things I don't know that's just maybe my opinion I could be wrong on this one I just feel there's more happening here with the description of those two roles an analyst is not what you described because what you just described is somebody who is not the BI developer or the data engineer it's the person with both of those experiences yes because you're not going to be like if you have so much work right that

14:36 you have an individual that is creating the models on top of all the engineered objects right you have to know how those ecosystems are going to work yes because you're teeing things up to work effectively in the model related to all the measures you're creating how you're going to build the filter contexts on all the pages where can you put the transforms is that back in engineering or does that have to be up front and forward like so yeah I would say you have to have

15:06 both of those things to be that modeler an analyst is not that like they analyze data right that's my perspective anyway I agree with you anyways really cool extension I'm going to go see if I can find the documentation on the VS Code extension now getting popped out but anyways I'll try and put that in the chat window here for you as well but what I would recommend is if you're going to go use the TMDL GitHub extension or sorry VS Code extension make sure you

15:36 go find it on the VS Code Marketplace they have a little marketplace for all the extensions go there search for TMDL that's what you search for and it's from Microsoft it's just called TMDL and it has no logo oh actually that's changed they just added it so with this release they should now have a new logo for the TMDL editor or the TMDL extension I guess so that's been added too so now there's a nice pretty little TMDL file folder

16:07 thing there and apparently in talking with Sid there's a lot of other demos and use cases that Rui the leader of this is now teeing up and there's going to be a whole bunch more announcements teed up educational pieces so again this is one of these things I think is going to be landing a little bit more towards Ignite there's going to be more information on this there's going to be more demos around how great this is so very excited about that and then also another side note as well I think the reason why they're doing some of this is Power BI Desktop is also going to be getting a TMDL editor

16:38 as well this was announced at Microsoft Fabric Conference in the main session and in one of the sessions done by Rui and his team as well so we got some really great features coming for Desktop and I like the fact that they're adding a whole lot more like code window inside VS Code to help us build stuff and man I'm telling you this is feeling less and less like coming home is it coming home for you it is this is

17:10 feeling less and less like a BI tool this is feeling more and more like a pro developer tool Desktop is really getting some for lack of a better term horsepower here like I think this is going to really let you extend these models quickly it's going to let you automate a lot more things the TMDL editor in Desktop looks amazing you can edit multiple measures at once you're going to have GitHub Copilot you're going to have regular Copilot on top of it it just seems like the right scenario it's awesome

17:41 I'm very excited for where they're going with this the development path here I think is very exciting for me anyways I'm very pleased about that all right enough of our news or other topic items no just the only thing when you said they had a logo you're not talking about that paper icon that just says TMDL across it mhm yeah that's it that's the icon that was no icon to me but okay okay yeah that doesn't get a fancy thing apparently these are engineers Tommy come on yeah marketing doesn't get involved

18:12 for extensions apparently I don't think they get involved for this hey Copilot generate an image for me in two seconds thanks yeah exactly that's probably exactly where it came from awesome so main topic then let's transition over to the main topic so let's talk about this topic here the topic is opening up OneLake for your fast adoption all right Tommy give us some context what are you thinking about going with this one give us some talking points here and we'll unpack this as we go so when Fabric

18:44 really was first getting launched and the idea of OneLake we I think all immediately naturally went to the governance and the administration side of things and I don't know about you guys but the more I talk with clients about this in their different use cases even the more side projects that I'm putting into Fabric I'm finding it's almost too easy to get started with adding data and getting data stored in a lakehouse which then immediately becomes something that I can query in a notebook

19:15 or I can create a report off right away it's never been easier to immediately just get your data not just something that can be in a report but something that's actually tangible in so many other places if you have a workspace and a bit of Power Query you're done if there's just a few tables that you need that comes from SharePoint or maybe it's from some email subscription site that you have in your marketing you now have your data

19:45 that's in a structured in a sense database that you can do a ton with so with that in mind we've talked about this idea of rolling out Fabric this idea of like we're going to identify the right teams who are going to treat it right basically but I think there's a different take on could this easy way to get our data centralized and now it's all in the same place be one of the best selling components for adoption and I think it's
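
To make the "workspace and a bit of Power Query" point concrete, a Dataflow Gen2 query that pulls a workbook out of SharePoint can be as small as the following M sketch; the site URL and file name are hypothetical.

```m
let
    // Connect to the SharePoint document library with your own credentials
    Source = SharePoint.Files("https://contoso.sharepoint.com/sites/Marketing", [ApiVersion = 15]),
    // Pick out one workbook and read its first sheet
    Book = Source{[Name = "CampaignResults.xlsx"]}[Content],
    Sheet = Excel.Workbook(Book){0}[Data],
    Promoted = Table.PromoteHeaders(Sheet)
in
    Promoted
```

With a lakehouse selected as the output destination, Dataflows Gen2 writes this result into OneLake as a Delta table, which is what makes it immediately queryable from a notebook or a report.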

20:17 really talking about this really being open to data and we get everyone playing in our ecosystem rather than if you want to see your stuff you just got to go through this developer and this analyst and then you can see your data nice and pretty but now we can basically let others say hey you now have a lakehouse here's where it is and I think that way is one way that we can even begin to build a governance program

20:49 yeah it will be messy of course it will but I think there's a lot of push here so just wanted to start that way

21:04 well I'm going to make just a general observation at the beginning here and this is I think what's forming a lot of where Microsoft is going on all this technology I think the main shift I see here is I think Synapse was trying to be what Fabric is today it feels like when I talk with customers what resonates with them is a single platform where you can use anything you want whether it's SQL Servers or VMs or it's Functions or all the things like notebooks like

21:35 there's I think users or customers want to have this unified data platform and they can pick what is most comfortable to them all this stuff works together the challenge is none of the tools have the same format of data so that's why we continually lifted and shifted data from SQL Server to blob storage from SQL Server to dataflows and then dataflows to Power BI everything wanted a different way like a different file format basically so I think what's

22:06 happening here is where I think the great unification came from is Microsoft said look we're all going to standardize everything we standardized on the Delta Lake format we're going to do that when they made that decision SQL Server or SQL serverless in Synapse was talking to Delta Lake tables awesome they decided to make the semantic model just read directly from the lake no import mode this is where Direct Lake comes from I think that was a genius

22:36 move honestly I think the open standard of the Delta Lake format is the secret sauce it's the single thread I see in every single tool Microsoft is now building and everything Fabric related it's all doing this on top of this Delta Parquet format so I think once they made the shift and said we're going to do this across all tools all the tools were able to quickly unify themselves and say look we're all going to talk in the same language now we can truly have a

23:07 Fabric environment where there's now a storage layer and now there's multiple computes you can pick from and so I think this is where we talk now is okay OneLake which I think was actually a smart move here it's like blob storage Azure Data Lake Gen2 storage but it's storage as a service it's not storage as a blob item that you can go build does that make sense like yeah it's an API to go store read and use stored information as opposed to I'm not configuring the

23:39 actual storage account and where the data goes Microsoft just handles that it's a service so the service of OneLake in addition to this new format of blob storage this is like yeah this makes sense to me this is the way we want to be going I'll pause there I said a lot of things your thoughts maybe Seth I agree okay moving on I can regurgitate the same things but I think that is one of the

24:10 fundamental transformational things that makes this platform so different than other platforms out there without stating the obvious there are a lot of systems and storage locations within organizations and a lot of those are local desktops right where data lives that is important to the business and if there's a framework and to your point I think you leaned into the most important which is you're transforming

24:40 your data into a digestible format for any of these applications when you load it up there right so it's not just putting it in a central location it's putting your data in a central location that is instantly accessible provided you follow the rules to all the tools yep and while this is a good thing I think this comes with a downside the downside is how do I

25:11 choose which compute engine to use so let me give you an example right if I'm coming from structured data let's say I have a SQL Server somewhere I'm going to connect to that SQL Server and I'm going to land those tables inside OneLake or a lakehouse where do I start do I build a pipeline and do I land that data using a pipeline using a copy activity into the lakehouse or do I create a pipeline and take the data load it into a data warehouse which also uses a

25:43 storage layer under the hood that is OneLake it still does the same thing but in the data warehouse it's now all SQL accessible right it's T-SQL now and you can run stored procedures on top of that data the stored procedures then if they make new tables make more Delta format tables under the hood so now you have two choices well we have a third choice well maybe I want to load the data in using Dataflows Gen2 now I have another one now most of the other tools and Dataflows Gen2
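
For the warehouse route specifically, the stored-procedure flow described here can be sketched in T-SQL; the schema and table names are hypothetical, and the pattern assumes the warehouse's CREATE TABLE AS SELECT support for producing new tables:

```sql
-- Runs against the warehouse SQL endpoint; the table it creates is
-- stored in OneLake as Delta under the hood, so Spark notebooks and
-- Direct Lake models can read it without another copy being made.
CREATE PROCEDURE dbo.load_daily_sales
AS
BEGIN
    CREATE TABLE dbo.daily_sales AS
    SELECT
        CAST(OrderDate AS date) AS order_date,
        SUM(SalesAmount)        AS total_sales
    FROM dbo.orders
    GROUP BY CAST(OrderDate AS date);
END;
```

That is the trade-off being weighed: the lakehouse path lands files and tables directly, while the warehouse path gives you T-SQL and stored procedures over the same OneLake storage.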

26:14 will let you pick where you put the data am I going to put the data inside the data warehouse am I going to put it in a SQL Server am I going to put it down as flat files or am I going to put it back down to OneLake we have options here so I think this is a good problem to have but I think now it becomes more of a technical question of which is the best system for my process and I think this is where things get a little more difficult right now my choice is am I using OneLake or am I using Data

26:44 Warehouse this is like where I'm having some more projects now where we're doing more heavily data warehouse things and I'm really trying to evaluate the difference between the two and figure out how to pick which one yeah they're both really strong it's hard to choose let's back up just quickly before we get straight to the technology because as much as I want to have that conversation sure let's talk about the adoption side too where what type of person are we saying come over here come join us in the Fabric

27:14 community and our organization right they're probably not asking any of the questions that you just asked initially because they've never asked those questions in their lives so I think regardless of there's probably one way that's better to do it than the next all of the choices that you said at least get that person for the first time started with actually getting their data in a centralized place and that's

27:45 kind of where I want to start like what type of people where we can without them having to ask those questions because before Fabric you had to know what an Azure resource group was you had to know all these things to spin up that instance already have the knowledge permissions know where to go and provide those permissions whereas right now it's logging into Power BI and you can do the same thing so yeah I say me go ahead no I guess the question I have for you

28:16 Tommy is if we're talking about the business user right the users for wider adoption of moving files and folders and locations where are they currently storing that stuff everywhere it's specific big places so without even being cliché here because it never fails to amaze me it's Excel it's very very in-depth and organized Excel files and processes that they put together just to try to get some

28:47 semblance of their own database sure a lot of times too if they're not pushing to a data engineer they have some other third-party tool at least they're pushing the data into SharePoint a lot of times they don't already have there are some like mid tools where people can push files and it auto-generates but really for the most part we're dealing with people who are either A storing their data in local files or in some local system or B they're only really

29:17 local system or be they’re only really accessing all the data and they’re getting all their information from either a powerbi report that someone’s directly connected to or the source systems themselves that already have basically a that wall up between what they can actually look at so so what would what would your proposed solution be like if SharePoint is the place to share documents and see version history and have all these locations where it’s been the only place where we could technically like put out an Excel file

29:49 right, and you've got a bunch of browser experiences now with Excel Online, are you suggesting that people throw all of that into OneLake? I'll do one before I answer the Excel one, because I think there are a few directions there. Let's do a simple one that I see all the time, and I saw it all the time in marketing: we have all this data, our email campaign feeds, all the campaigns and the accounts and the users, and it's not in a structured

30:21 way, and the only way they could get it was to either find someone who knew what an API was, or query it. Where this comes in, in a Fabric world, is they go: hey, I just want to get this data in some table format that I can pull a list from. And that's me just sitting with the person for maybe an hour: here's a thing called Power Query, you connect with your credentials, see it here, and you push it to this place. There's a bunch of

30:51 transformations, it's your data, have a great time with it. So all of a sudden, someone who could only query the data, or only see a semblance of a view (because a lot of times the people relying on this source-system data need to do a lot with it besides just reporting) can now actually statically hold it, whether as files or as actual readable tables. And I think that's probably, to me,

31:21 the most dead-straightforward way. I want to unpack what you're saying here. I think your question is relevant, and

31:29 Tommy, I know you're trying to move on to a new point here, but I have some beef with this conversation. I think what you're observing, Tommy, is organizations that didn't have the tools or the skills on their teams to go get the data from the source. When I observe more Access databases, more Excel files, more SharePoint usage, those things exist because someone didn't have

31:59 access to where the data was living in a structured, API-driven way; it was difficult to get it out. I've seen it over and over again: many people don't have access to the underlying source system, or can't query it using a SQL endpoint. Businesses are resource-constrained to some degree; they're not going to give everyone production database access to these tables. And some

32:30 organizations didn't have a good pattern of taking what was in production and moving it into a place where the business teams could connect to it and do stuff. This is where Power BI Desktop came from. The reason Desktop exists is that it made it easier for us to do data engineering, and Power Query is the tool you use to access that data wherever it lives. So instead of this whole monolithic setup, you're not logging into a server, you're not going into something that's production, you're not running

33:01 that’s production you’re not running queries against a machine that you may take down because you can’t take down prod right you can’t take it down so we needed other ways of getting out of there and this is where I think fabric fits is fitting very very well is we’re now in a in a more decentralized way the business is able to go in and automate their loads of their own data yes there will be times where we still have to get flat files out because but the only reason flat files exist is because the Upstream thing we can’t access we don’t have access to it so the

33:32 access we don’t have access to it so the default fallback for people to work with data is well I’ll just export it to something CSV or Excel and then we’ll just shape it from there and we’ll go from there I understand it has to happen it’s not it’s not going away but I’m just saying as we as we move forward I think there’s a trend in general technology that’s like we’re going to give you apis it’s going to be driven with microservices you’re going to be able to get the data out in a not flat file way and I think to me this is the Lynch pin of a lot of what’s changing here and particularly with one L right

34:03 I'll say this another way: OneLake isn't the central place for data. OneLake is just a common way for anyone to access data, an easily securable and sharable system. That's what OneLake is. I can have two different workspaces with totally different data in both, and if I don't have access to a workspace, I can't see it and I can't use it. We're not making a single lake for everything; we're making multiple lakes across everywhere. It's just really easy to manage the permissions between workspace A and

34:34 workspace B. Well, it is a central data repository, so you don't have to move data anymore; you are permissioning it, though. I would agree with that: OneLake is a central data store, yes, but we're carving it out. Call it this: if OneLake is the house, we're making rooms of data, and those rooms are permissioned by doors, with locks on those doors or not, depending on what you're building. So

35:04 to me the analogy holds really well. Our business is now going to use OneLake, which is the structure of the office building, the house, and every room or office becomes a workspace, and that's where we put the data. I like this room analogy, because there are going to be some doors I open and go: nope, don't want to go in there, close that door. The floor in this room is a mess; we don't want to touch any of the data in that room until a cleanup happens.
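The house-and-rooms analogy can be sketched as a toy access-control model. Everything below (the `Workspace` class, the `grant`/`read` methods) is illustrative pseudocode for the idea, not a real Fabric API:

```python
# Toy model of the "house and rooms" analogy: OneLake is the house,
# workspaces are rooms, and permissions are the locks on the doors.
# Class and method names are illustrative, not real Fabric APIs.

class Workspace:
    def __init__(self, name):
        self.name = name
        self.items = {}          # item name -> data
        self.members = set()     # users allowed through this door

    def grant(self, user):
        self.members.add(user)

    def read(self, user, item):
        # One lake, many rooms: no membership, no access.
        if user not in self.members:
            raise PermissionError(f"{user} has no access to {self.name}")
        return self.items[item]

# Two rooms in the same house, with totally different data in each.
sales = Workspace("Sales")
hr = Workspace("HR")
sales.items["orders"] = [{"id": 1, "amount": 250}]
hr.items["salaries"] = [{"id": 7, "salary": 90000}]

sales.grant("tommy")  # Tommy can open the Sales door only
print(sales.read("tommy", "orders"))
```

The point of the sketch: the data never moves between rooms; only the membership sets (the locks) change.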

35:35 But anyway, I just want to throw that point in there. I think the world is changing, and this is why we're seeing the proliferation of OneLake and tools like Spark, Dataflows Gen2, and pipelines: it's helping us move that data from wherever it is to a place where we can use it. So I don't know where I disagreed with you, because everything you said I a thousand percent agree with. So you agree? Yeah, we're good, we're cool. But maybe I

36:05 good we’re cool but and maybe I’m misspoke because I’m not I think a lot of the times at least in my in the scenarios I’m thinking about these are all these Niche situations where different teams have some Rogue software they’re not cering the business data and they’re trying to do no it’s fine I’m just making the point that I think the reason those things show up I think the reason why you see SharePoint and flat files appear is because a lack of access to something else that’s the problem the root cause is not oh Bob and marketing has done all these flat files things this is a problem that’s not the issue

36:35 The issue is that someone hasn't given proper access to the source system, wherever that may be; or it's a third-party tool we're never going to get access to; or the company we're working with doesn't have a good data extract process and all they'll give us is flat files, because that's the easy button: hey, you want data? Make this query, run it, here's the file, we'll email it to you. That's the lazy solution to get data out, the least amount of effort they can spend to get you your

37:05 information. Now we have a lot of new solutions for this. Sorry Seth, go ahead. Yeah, what's interesting to me is that there are two aspects of this conversation I'm digesting. One (and I'm setting cost aside here) is this approach where Tommy's like: hey, let's all jump in the OneLake, it's nice and warm in here. What that's going to do is take all of the data that's already

37:35 that’s already duplicated right in the systems to your point because a lot of business folks don’t have direct access to the source system of information or it’s transformed or it comes through a report or whatever you’re going to dump that all in Wind link what that does is the same thing that we’re doing in SharePoint or on local machines Etc is you have versions on versions and versions of data that may or may not come directly from s system I think what’s interesting Mike in your description of taking Tommy’s question is if there’s one mode of

38:06 adoption that just says "OneLake is the thing, everybody come on in," there is another where maybe it's a pool instead, one we have to clean, with people responsible for making sure the waters are nice. And I think that's where there's a huge opportunity for organizations: you start to roll out access to, or build into Fabric and OneLake, your data sources of certified data that

38:38 maybe parts of the organization never had access to. There are opportunities now, especially with the roles and permissions and how tight you can get on what types of data people can see from those source systems, where you don't need the redundancy anymore. There could be great opportunities where you roll through teams that are building their own things and say: hey, you don't need that file-on-a-file that you generate manually

39:08 through this report. This is just accessible to you now, go build on top of it. You're going to get the latest, best data right out of here, instead of the 500,000 versions of Excel that people are generating and passing around. So I think that's what was unique in the conversation, what I was digesting between the two. There's another feature here that's also very interesting: when you're working with SQL Server, Microsoft has opened up this thing called mirrored SQL, and mirrored SQL

39:40 is the same situation. Mirroring SQL is this idea of, hey, this is the holy grail of all things data analytics, right? There's a system that runs the business, and I need to be able to build reports on it. If I can have the data coming from the system that runs our business, as fast as possible, in my reporting solution, we're done. That's the goal. The reason this isn't done everywhere, the reason we have SQL databases and batch nightly loads and all of it, is that it's

40:11 expensive to do. It's not cheap to stream your data from the operational systems directly into your reporting systems. And the format of the data in the operational system is row-based: it's a single row in a table, where I want to edit that row and not let other people destroy it while I'm editing or creating. On the reporting side, I need to look at everything in a column. So the whole mentality of what we're trying to accomplish is two different objectives.
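The row-store versus column-store tension described here (operational systems edit whole rows, reporting scans single columns) can be illustrated in a few lines of plain Python. This is a sketch of the idea only, not how any real engine is implemented:

```python
# Same table, two physical layouts. Operational (OLTP) systems store rows,
# editing one record at a time; analytic (OLAP) engines store columns,
# scanning one attribute across all records.

rows = [
    {"order_id": 1, "region": "east", "amount": 250},
    {"order_id": 2, "region": "west", "amount": 100},
    {"order_id": 3, "region": "east", "amount": 175},
]

# Columnar layout: one array per attribute.
columns = {key: [r[key] for r in rows] for key in rows[0]}

# OLTP-style access: find and update a single row by key.
def update_amount(order_id, new_amount):
    for r in rows:
        if r["order_id"] == order_id:
            r["amount"] = new_amount

# OLAP-style access: aggregate one column without touching the rest.
total = sum(columns["amount"])
print(total)  # 525
```

The two layouts hold identical data; the access pattern decides which one is cheap, which is why both kinds of systems exist.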

40:41 And because of that, that's why we have these different systems: batch loading, nightly loads, file extracts, shove it into SharePoint, all these other things, because the existing data we have today doesn't support those needs. And this is where I have a lot of heartburn, now, on our talk topic specifically: okay, where do we put it? I would rather put it in OneLake, because I have files for that stuff and I have tables that make it very easy to work with. But on the other hand I also have the data

41:12 warehouse, and I don't want to discredit the data warehouse, because (thinking way back to the early part of the conversation, Tommy, when you asked what user persona we're talking about) this is giving us new capabilities, and I think it's really going to depend on your background and how you're entering Fabric. If you're coming in as an Excel and Power BI user, you're probably not going to be as thrilled about using a bunch of data-warehousing things, because you're not a T-SQL person. You don't know how to write it; it's

41:42 you you don’t know how to write it it’s not as comfortable for you yeah maybe you can get around enough to get a query built into powerbi but that’s about all how to do you’re not comfortable there but if you’re a Seth and you show up and you’re like I am the DBA and now this is a different story The SQL the data warehouse now might be a much better option for you just

42:00 a much better option for you, just because you can land the tables, it's still in Delta format, you still have access to Direct Lake just like everything else, but now you can write views and stored procedures, and it's in the SQL engine. So for me, I'm thinking: okay, how do I make the decision between one tool and the other if they both do roughly the same thing? Is it now just a matter of what you're most comfortable with, or is there some other consideration? Is doing an operation in

42:30 the Lakehouse with a notebook more expensive than running a SQL query in the SQL engine? So now I actually want more data. I think there needs to be a white paper, or some data, or some guidance around: for these kinds of jobs, should I be using the data warehouse, and for those jobs, should I be using the Lakehouse and notebooks? That's the dichotomy I'm bouncing around in my head right now. And a lot of that's data source, though. What

43:01 I can’t get my head around is if I’m if I’m changing the structure of like how how and where I as a business user access and organize my world of documents right like that’s SharePoint or it’s local so are we talking about like SharePoint and the solutions we have to share documents stays but then we’re going to say the files that are actual data sources that a lot of times really relate to the same initiatives or the projects whatever that we’re managing in this singular location are

43:31 now going to be separated? What does that do for the business? What are we saying: hey, move everything and now just operate on OneLake? Or are we telling the business, you have to think about data in a different way, here's where we put all of the data things, because we're going to be able to extend that more? And the reason this is important to me is: what do we try to do? For a large swath, you

44:03 want people to go to a single location, right? Every time we add another piece of software, you tend to lose a few people: they don't use that thing, or they don't know what they don't know. So if they know SharePoint, and that's where it came in, how would they know, unless somebody told them, that all their data is in OneLake? That's my question. And I also see people fall back on what they

44:33 know. In the business world, if something seems hard or difficult, or you're under a time crunch, you fall back to what you know. I was just on a project recently where we were falling back: ah, we're close, we've got this API working, we've got the data coming out, it's almost the way we want it, we should just throw it down. Let's just parse the JSON in Power Query, we can do it there, we know we can do it there. But that was a fallback mechanism: we were just falling back to what we knew, what we were comfortable with.

45:03 I'm like: that's not the right solution. The right solution is to parse it as a table, so we can directly consume it. If we spend a little more work on that data engineering, push ourselves a bit more, we get a better solution on the output and it's more automated. At the end of the day, the best thing I want to be thinking about here is automation: whatever we're building, the less I have people moving physical files, the more things can be automated, the happier I will be.
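The "parse it as a table" point (flatten the nested JSON once, upstream, so consumers get plain rows instead of each re-parsing it in Power Query) might look like this sketch. The payload shape and field names are invented for illustration:

```python
import json

# Hypothetical nested API payload of the kind described in the episode.
payload = json.loads("""
{
  "campaign": "spring-launch",
  "results": [
    {"account": "acme", "clicks": 12},
    {"account": "globex", "clicks": 7}
  ]
}
""")

# Flatten once, upstream, into plain rows that can land as a table,
# instead of every consumer re-parsing the JSON downstream.
table = [
    {"campaign": payload["campaign"], **result}
    for result in payload["results"]
]

for row in table:
    print(row)
```

Doing this step once in the data-engineering layer means every downstream consumer gets a directly consumable table rather than a parsing exercise.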

45:33 And there's nothing wrong with SharePoint, nothing wrong with storing files there, but only if I can 100% automate it and have the reloading process work consistently. The worst data source system I ever connect to is Excel. It's a pain. Not that it's hard to connect to Excel; it's the fact that people can touch that Excel and change the data in ways that break my loading pipeline. Sure, and you have to be really sharp about protecting those pipelines when users adjust things. But at the same time, I think

46:04 that is also something business people could learn, absolutely, by getting into an ecosystem where you start connecting to these sources and using them. But a business person who uses Power BI already knows that as well, yeah. True. But I guess, Tommy, how do you answer my question? If you're pushing, and OneLake is going to be the way we start driving adoption, that, to me, seems like something we need to address. And I

46:37 think, to be frank, that's the trade-off here with just getting everybody on board: you're creating these processes that at some point are going to have to be, in a sense, leveled up to some type of automation. But the biggest "but" here is that at least we've got people in the door, and if we're on the data team, now we're spending time

47:07 team now we’re spending time too with also understanding some of their data but yeah like the tradeoff if you’re going to get the business user just connect to a lak house well they’re gonna have to do one a few ways they’re going to do through pipeline which nice pretty user interface they a little bit of a learning curve it’s a little bit of a learning curve yes yeah it looks easy but I was like what variable and yeah everything’s on settings the like a power like a data flow which again I spent half dashboard in a day

47:37 going through dataflows, and given the number of people I've trained on Dashboard in a Day, most people should be able to at least use a dataflow if we're just connecting to Excel files or a straight source. If they tell me it's a nested JSON record, then okay, they're probably a developer too. But there is going to be some sign-up; they're not just copy-pasting. That part's almost eliminated: it's not like, yeah, just

48:07 log into Fabric and put your content in there. They're going to have to use one of the source systems or one of Fabric's tools to do that, albeit a very simple one. We get them started with that, so they now actually have that access, and they're probably going to want more from there. It's almost like a trial run: we're going to give you 80%, or 50%, of what your data can do, and if you want to sign up for the rest, it goes through the more

48:37 standard process. I don't know if that's answering your question, because there's really no good way to answer it. It's not that it's not doable to just fast-track them to the Medallion, the engineering approach, but that's not going to work in terms of someone, one, taking ownership of their data, and two, being able to at least get started, if that's what we

49:08 want. I see what you're saying, Tommy. It sounds like you're making a case for: yeah, I hear you, but the business is going to do what it's going to do, it's just going to have files all over the place, and we're not going to move forward. Maybe that's not your point, but Enterprise AR has a really good point in the chat: people could learn that, that is a known thing, but the business-side response sometimes is: not my job, not my problem,

49:39 I'm not going to learn it, not my technology. And I think what we're seeing here, particularly with Fabric, is we're blending like never before the tools that IT used to use and what the business can use. So I'd really challenge people in this area. Again, let me put on my thinking cap and my dream hat; I'm going to dream here for a minute, so bear with me while I get a little abstract. What we really need is the ability for people to go into something like Excel and say: I'm going

50:11 to make a table, and inside that table I need hard restrictions saying this table has five columns, you cannot change them, you cannot rename them; you can add more data to the table, fine, and have the option of adding additional columns if you need them or not. One word of advice I'll give people working with data on the side: because of the operational system and the data we bring in, there's no way we have all the information we want, the way

50:41 marketing wants to group things. There's missing information, some team didn't do it upstream, and instead of fixing the problem we need supplemental data added to our normal data extraction to make the reporting work. It happens everywhere; it's so common that you just have to plan for it at this point. So my point here is: if you enhanced the Excel experience, or, let's ignore Excel, if I had a better table experience where I could go in and make formulas, lock columns down, and

51:12 start doing a little bit of hardening around the table structure, but still let people build things in a flexible, Excel-like way, I think that would go a long way. It would make it very easy for us to have these registered tables of information that we can then easily use and manipulate as if they were Excel, with really tight requirements around them. That's what the business wants: to go to a table, edit any cell anywhere, adjust it, modify it, tweak it, arrange it, and have that immediately get pulled into the rest of the data and
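The hardened, Excel-like table imagined here (a fixed set of locked columns, but freedom to append rows) can be sketched as a small validating wrapper. No such Excel or Fabric feature is implied; the class and column names are hypothetical:

```python
# A "registered table": the five columns are locked, the rows are free.
# This is a sketch of the idea only, not a real Excel or Fabric feature.

REQUIRED_COLUMNS = ("region", "segment", "owner", "target", "notes")

class RegisteredTable:
    def __init__(self, columns=REQUIRED_COLUMNS):
        self.columns = tuple(columns)
        self.rows = []

    def append(self, row):
        # Reject renamed or missing columns so downstream loads don't break.
        if set(row) != set(self.columns):
            raise ValueError(f"row must have exactly the columns {self.columns}")
        self.rows.append(dict(row))

table = RegisteredTable()
table.append({"region": "east", "segment": "smb", "owner": "bob",
              "target": 100, "notes": "supplemental grouping"})
print(len(table.rows))  # 1
```

The key design point is that validation happens at write time, so a user renaming a column fails loudly in their own workflow instead of silently breaking the loading pipeline later.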

51:43 information. That's what they want. Yeah, and Mike, I think that's exactly what I'm trying to say: they're just not looking to push files into a different place; what's the value in that? The reason people export random files, and the reason they have these elaborate workflows, is that at certain points they need to extract information that's not available in the way they need it, and they don't have any control over the

52:13 and I don’t have any control with the systems or how it is to easily do that so now we’re again to your point we’re going to go through that with fabric where it actually can be something that they can now be tangible to them where if I just want a view of these five

52:28 if I just want a view of these five columns, because I need that every week for yada yada, you could literally just create a view in a SQL warehouse, or have that as a table in Fabric. But you've got to know how to do it; that's my problem, and it's completely different from where you started. If you're saying the value outcomes of Fabric, and the accessibility of those objects to other teams, are a value-add,
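The "just create a view of those five columns" idea maps directly onto a view in a warehouse's SQL endpoint. Here it's sketched with Python's built-in SQLite so it runs anywhere; the table, column, and view names are invented for illustration:

```python
import sqlite3

# Sketch of "a view of just the five columns I need every week".
# SQLite stands in for a Fabric warehouse; all names are invented.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE orders (
        order_id INTEGER, region TEXT, segment TEXT,
        amount REAL, status TEXT, internal_cost REAL, raw_payload TEXT
    )
""")
conn.execute("INSERT INTO orders VALUES (1, 'east', 'smb', 250.0, 'open', 99.0, '{}')")

# The business user queries the view and never sees internal_cost
# or raw_payload; the weekly extract becomes a SELECT, not a file.
conn.execute("""
    CREATE VIEW weekly_orders AS
    SELECT order_id, region, segment, amount, status FROM orders
""")

for row in conn.execute("SELECT * FROM weekly_orders"):
    print(row)  # (1, 'east', 'smb', 250.0, 'open')
```

The same `CREATE VIEW` shape works in T-SQL against a warehouse; the view replaces the manually exported weekly file while also narrowing what the consumer can see.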

52:58 that is not the same as "hey, can we drive adoption by throwing everything into OneLake?" That's a completely separate topic. Is there value there? 100% there is, and I think that drives back into the other point Mike was making earlier: if you're going to go down this path of putting everything in OneLake, then it should be in that managed space, where you're creating value objects that the business can use, as

53:29 opposed to just dumping everything into OneLake. But that's why I said they're two completely different topics. Whether you bring business users into OneLake, or move how they operate in the business to a different location, is not the same as enterprise teams building value-add for the business so they don't need all those objects.

54:02 Yeah, and obviously I think that's where I'm struggling. But just for the record, I have not opened up my heart and my arms to every consumer or business user who comes to me now to get into Fabric, and Fabric everything. I'm definitely entertaining different use cases, though, and what I'm going to try to do, at the very least, is be more open to it. But there's probably going to have to be some identifying of candidates for this, and I think

54:32 there will be some demand for it. Rather than always going to the business saying "can we get your data cleaner, please," to me it's: does the team or business in this situation have some technical abilities, or basically whatever prerequisites we're looking for? I don't think they need to be as high as a data engineer's, but obviously we're not going to push this to everyone, especially if they're not going to own it or do anything with it. And to

55:03 your point, if someone's reply is "that's just more work, it's not my job, it's not my system," that's not who we'd be trying to reach anyway. We're not trying to push everything to Fabric right now. But there are a lot of situations right now, with a lot of people, where if you told them how easy it would be to really manage and run their team's and business's data, they would be on board. So I

55:33 think this is moving in a different direction, which I'm fine with, but what you just described is one of the challenges: hey, we don't have owners of data, right? That's the whole point of mesh, and of what Fabric is designed to do. If we came to the owners of data and said: hey, you're now a domain owner, you're a domain owner of your domain, the business. There's

56:05 an ecosystem, Fabric, by which we're going to plug into your data; how do you want us to do that? And if the answer is "not my problem, here are the Excel files," fine, that's what you're putting in OneLake. Because if you're going to them and saying: you are responsible for the output of information that we're going to plug into, and you're responsible for updating these files in this shared location,

56:35 because this is what we're going to use, or report on, or whatever, and you own it now, then there are a bunch of other opportunities for you: we can plug you into the source systems, we can make sure your data gets cleaned through these processes, and so on. The whole point is that these teams put skin in the game, because without skin in the game you get the problem you have now, which is: oh my gosh, we have a million different data sources and nobody's putting any thought or

57:06 process behind it. And what is the end outcome? It's the comment in the chat: nobody cares because it's not my problem. But it should be their problem, because they're the subject-matter experts. Yes. So I think that's where this door opens a little bit more: if you're in Fabric, what do you say to the business? We want their data accessible, we want it discoverable, we want it usable. So where do we tell them to put it? Tell them to put it in OneLake, but there has

57:38 to be some thought and planning around why you're asking the business to do this, because you have to give them something back in return. And that's what we're talking about: better, more accessible, streamlined versions of data from systems they can reach, as opposed to, no, we're not going to grant permissions to a SQL Server when nobody on that business team knows how to write SQL; that would be dumb. So I think, as long as organizations are going

58:09 in that mode, 100%, you're bringing them in and you're forcing a change. Albeit maybe my documents are still in SharePoint, but now I have to look through the lens that the information coming out of my area has to be part of this known ecosystem, because it's a mandate. Because an organization can look at things and go: holy cow, there's a lot of value here if we implement a Fabric ecosystem where

58:41 I have owners of data, and then I can sic this team on it, or hire Mike and company to come in and organize it for me, so that all these teams are operating in such a way that I'm getting the most value out of my data. And that's what I think is the most exciting part about everything in Fabric, and what OneLake provides: now everything is in a digestible form, so that whatever

59:11 technical team is going to help implement things, whether we hire people or the domain already has the technical resources, we're all working on the same platform, and it's a permissions thing. Hey, business unit over here, I need access to your data, we're pulling this thing; what are your sharable files? Give me access to them. Boom, they're instantly accessible and I can plug into them. Walk me through it: what are we doing here? Hey, I see you have a million Excel files; I'm

59:41 looking at the data here, and we can do this and this and this. You're improving the business domain's value, because you're giving them accessible things, but you're also helping the wider organization build better data. And that's what I think we've been missing, because you don't have that visibility across all the systems right now; you're just pushing and pulling, pushing and pulling. So, if I'm rounding out where we started,

60:11 that would be my final thought I like that that's a good final thought and I think a lot of what we're talking about here this is a great talking point around the intersection between data culture and what your team can do and handle technology-wise so I think this is a great point to communicate discuss work through I think my final thought here is ask the question why those flat files or SharePoint or other items are existing I love your point there Seth around okay well let's work with them to me this whole conversation reeks of

60:41 me this whole conversation reeks of the concept of ownership who's taking ownership of the data where's the break line between the business and the IT or the BI team who's going to take ownership of getting that data into a central place to be usable any final words from you no I really think there's going to be a lot of different roles that are going to come up with Fabric whether it is that onboarding person or we've talked about that liaison between the business and the technology before

61:12 business and the technology before but I think never more than with Fabric awesome well that being said we really appreciate your time we know your hours are very valuable and we appreciate you spending your time with us through the podcast so if you don't mind please make sure you give us a thumbs up or a like or subscribe so you can hear about us when we're coming out with new videos in the future we do appreciate your ears but we'd love to have more so if you wouldn't mind share this podcast with other people who you think might find it valuable hopefully this gives a couple talking points for you to unpack think about really review

61:43 you to unpack think about really review your data culture and how does it work how would you fit your data culture with what OneLake is doing maybe you need to do some process change around the data things that you're working on and in doing that you can then start really fully leveraging OneLake to make things easier faster and automated Tommy over to you where else can you hear about the podcast you can find us on Apple Spotify wherever you get your podcasts make sure to subscribe and leave a rating it helps us out a ton you have an idea a question or a topic that you want us to talk about in a future episode head over to powerbi.tips podcast leave your name

62:15 to powerbi.tips podcast leave your name and a great question and finally join us live every Tuesday and Thursday a.m. Central and join the conversation on all PowerBI.tips social media channels awesome thank you all so much and we'll see you next time

62:56 [Music]

Thank You

Thanks for listening to the Explicit Measures Podcast. If you enjoyed this episode, please subscribe, leave a review, and share it with a friend or coworker.
