PowerBI.tips

Use Cases for the OneLake File Explorer – Ep. 342

In this episode, the crew walks through practical scenarios for using the OneLake File Explorer and where it fits into day-to-day Fabric and Power BI workflows.

News & Announcements

  • PowerBI.tips Podcast — Subscribe and listen to the Explicit Measures podcast episodes and related content.

  • Power BI Theme Generator — PowerBI.tips, the world's best theme generator for Power BI reports. Increase your speed to develop stunning reports using this free theme generator. Themes are essential for any report developer's tool belt. Visit…

Main Discussion

The OneLake File Explorer makes it easier to browse and work with OneLake content using familiar file system interactions. The team discusses when it’s useful, what problems it solves (and doesn’t), and how it can speed up common Fabric tasks.

Looking Forward

As Fabric keeps expanding, having low-friction ways to navigate and validate files in OneLake becomes increasingly important—especially when you’re moving between notebooks, pipelines, lakehouses, and Power BI.

Episode Transcript

0:36 good morning and welcome back to the Explicit Measures podcast with Tommy, Seth, and Mike. Good morning, everyone. Morning. Good morning. Cranking along; it's been a long week. Feels like we're chugging along here, jumping in, getting things done. Today's episode is brought to you by the letter O, for OneLake. We're going to go through use cases for the OneLake File Explorer, and maybe even talk a little bit more. I don't know how long we'll spend on just the File Explorer

1:07 of OneLake, but there are also some thoughts around what OneLake is. Are there things you've found working with OneLake? What are you putting into it? Just some general OneLake experiences, and maybe what we've experienced so far since Fabric has been out. I think this is going to be a good topic today, and it'll be interesting to see where we land on things. With that, let's jump into some news. Tommy, you found some news from Darren Gosbell. Yeah, he is our favorite friend, the owner and creator of DAX Studio, and the 3.1

1:37 release came out. Some really cool things: a DAX Studio command-line utility, which I like a lot. I like the terminal, I like command-line utilities, especially if you get Windows Terminal. This is the ability to run a subset of operations from the command line, so if you're more of that developer trying to automate things, you can now utilize DAX Studio and automate some of the pipelines you already have. There's a capture diagnostics dialog and

2:08 more from the evaluation and execution metrics features. Again, DAX Studio is that tool; unlike Tabular Editor, which is the pro's tool, DAX Studio is, I'm assuming, Greg Baldini's tool of choice, because this is all the background of what's going on in Power BI. Yeah, this is one of the first tools I think I got exposed to for being able to go and create DAX statements. This is where you

2:39 start learning, oh, DAX is more than just a formula in the formula bar. It's actually an EVALUATE statement and a whole bunch of other things that are occurring; you could do so many things with this. Yeah, it's a really good tool, and it's interesting to see the command-line support. I'm not sure I understand quite exactly where I would push that into a process. I get that you can do things with the command line and sequence a bunch of command-line items together, but I'm not sure why you need this

3:14 yet. I know you like it, Tommy, but why? I'm not going to use it in my every day. But with any command-line utility, it's not just about using the terminal at that moment; you can, in a sense, create local jobs if you want to run things over a series of files. Because right now, if I wanted to evaluate all the metrics in Power BI using DAX Studio, I'm opening up one file at a time. I cannot run that across all files, or all models in a given file or folder. So is it more like, I want to

3:46 automate the extraction of testing from my model, is where you're thinking this is going? Exactly. So run some performance tests, clear the cache every single time, I'm going to grab coffee and come back, and then ten tests have been run, there's a file that has all the data in it, and you can evaluate the performance. Interesting. It's interesting to see how that's going to go. Anyway, I'm looking forward to it. I might play around with it a bit myself, but I don't have any major needs for it at this

4:16 point. Yeah, I would say the big thing for the normal Power BI pro is the execution metrics, adding some more features there to really see more of the numbers, the evaluation log. I love this; it's so crucial, especially if you're wondering why your model's running slow, or why that one measure with that one table is incredibly slow. That's where you go to DAX Studio. With a lot of these features I'm seeing, it feels like the hand of Marco Russo is on this as well,

4:48 right? There are some other things here, like added support for obfuscated model metrics, and that's something that Marco Russo touts a lot when he does his work: hey, you don't need to send us the model, just send us the VPAX output, and it obfuscates things. We now know it's not really important what you name things; it's more important that this column, this relationship, is bidirectional and connecting these two really large tables together, and that could be slow. So there are patterns, I

5:19 think, that are evolving, that they're illuminating for you, which is very fun as well. For me, in this release, the items around being able to save the output have been very helpful. There are more buttons; I think in the release they're talking about the capture diagnostics button, and I thought there was another button around being able to save the output of things. So I like that, I think that's good. Those are things that are going to be more

5:49 useful in the longer term. Yeah, it's a great troubleshooting tool. Even to your point, Tommy, looking at cardinality, the sizes of different things: when you're optimizing your model, DAX Studio is a key part of that. I wonder if people are going to move away from DAX Studio a little bit, because I think you can get the same, maybe not exactly the same, information with Semantic Link as well. And this is where I'm a little bit more torn now, because I'm

6:21 seeing Semantic Link. For example, let me paint a picture here for you. Companies like to use Power BI, but one of the arguments is, hey, if you're not going to use the Microsoft tools, and this is companies talking to employees, we're not going to let you go use third-party tools to do stuff. If Microsoft doesn't support it, it's not a thing we can go do. For example, calculation groups was not supported, creation of

6:51 them in the Desktop, for a long period of time. So they're like, you can't go get DAX Studio or Tabular Editor 2 for free and go use those tools, because it wasn't supported by Microsoft. They feel that if Microsoft builds the feature, you should be able to use it. So for me, having Semantic Link on the side solves a lot of these problems, because you can do a lot of the things you're already doing here: you can get table statistics, you can get measures, you can get relationships out, you can get diagrams out from the model. It's probably not doing the exact same things, albeit,

7:22 but I feel like a lot of the capabilities of DAX Studio are covered inside Semantic Link. Yes and no. You're dealing with cost once you run Semantic Link, because you've got to run that from Fabric, so you're already incurring a cost, and it's slower to get it running. Once you get it started and spun up... you've got a point, right? I would think so. If you and I did a race right now and I said start, you create a Jupyter notebook, I'm going to get the evaluation metrics. Yeah, but once

7:53 you figure it out, this is true. The level of automation, to your point: am I going to run DAX Studio on six models that are in one workspace from my local machine? You're going to your local machine. So what if I want it more automated? This is where I'm going with the command-line things and some of these other pieces: if I'm really going to try to provide true automation, if I really am going to go build these insightful things. To be honest, me personally, I like the experience of the

8:24 Semantic Link notebook experience, and then once I've learned the commands I need to make... I agree with you, Tommy: if I'm going to start from scratch, yeah, you're probably going to be faster in DAX Studio, because all the commands are ready to go, I'm clicking buttons, things are just running, it just works, I get it. But once I understand what I want to analyze out of the model, my opinion here is you're going to want to do that over and over again on many different models, every time you release. Oh no, I agree with that, from a scaling point of view. From a scaling point of view, yeah,

8:54 so right now I would say, look, what is the equivalent feature that I need in DAX Studio? Go build it into a Semantic Link notebook and then run the automation there, because I think you can also create measures, you can manipulate a little bit of the model. Anyway, it's just interesting to see the tools evolve in front of us as we're getting more options to do the same stuff. Yeah, that makes a lot of assumptions about the skill level of a lot of teams, though. For you, yes,
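The "build the DAX Studio-equivalent into a notebook and automate it" idea can be sketched in a few lines of Python with the semantic-link (SemPy) package that ships in Fabric notebooks. This is a hedged sketch, not the show's own code: the `sempy.fabric` list functions are real, but the `"Dataset Name"` column label and the model names are assumptions worth verifying against the SemPy docs.

```python
# Sketch: DAX Studio-style model metadata from a Fabric notebook via
# semantic-link (sempy). Dataset names below are placeholders.

def model_snapshot(dataset: str) -> dict:
    """Collect the metadata the hosts mention for one semantic model:
    tables, measures, and relationships."""
    import sempy.fabric as fabric  # preinstalled in Fabric notebooks
    return {
        "tables": fabric.list_tables(dataset),
        "measures": fabric.list_measures(dataset),
        "relationships": fabric.list_relationships(dataset),
    }

def snapshot_workspace() -> dict:
    """The scaling case from the discussion: run the same snapshot over
    every model in the workspace, instead of opening one file at a time
    in DAX Studio on a local machine."""
    import sempy.fabric as fabric
    datasets = fabric.list_datasets()  # DataFrame of models; the
    # "Dataset Name" column label is an assumption to double-check
    return {name: model_snapshot(name) for name in datasets["Dataset Name"]}
```

Run inside a Fabric notebook against your own workspace; as the hosts note, this incurs capacity cost, which is the trade-off versus a free local DAX Studio session.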

9:24 well, but you can make the same argument with DAX Studio; you're not giving DAX Studio to just anyone. So Semantic Link is part of Fabric, you just set it up, versus a third-party app: download it, figure out how to work it. It's the same thing, I think, the same audience, where you're looking at new tools. I agree with Mike from the standpoint that DAX Studio has been around for a really long time, and if Semantic Link is offering up some of those new capabilities within the Fabric

9:54 ecosystem, then that's a great way to stay in the flow of where you're

10:00 working. Yeah. Well, I am looking forward to that blog, Mike. I've already been doing that. That's great. Yeah, I've already been doing it with Stephanie, that's right. So we've already done it, so look forward to it; the blog's already done. I'm already ahead of you, I already got the blog out there. So Stephanie Bruno and I have done a couple of posts exploring Semantic Link and writing things back into and from the lakehouse, and the considerations there. So we're getting into

10:31 it. We're actually doing some videos and training and demos around this, because we think this is important enough that people should know it. This is really, I think, where more advanced developers will spend their time; this is not for the average business user. Yeah, well, I'm a reader, so I'm looking forward to it. We'll convert that to a blog; I'll get my AI overlords to go ahead... it's already out there, I think I did it like three days ago. That's awesome, okay, I will be checking that out. There you go. It

11:01 probably won't be as informative as the actual video, but excellent. With that, let's jump into our main topic, unless we have any other openers or any beat-from-the-streets things that we're learning or working on. I'll point out one thing here that I'm working on with domains: domains and capacities and workspaces. There's an interesting mix of what features can be delegated from your Power BI

11:31 tenant down to domains, so I'm doing some investigation on it right now. An admin at the tenant level can delegate certain features to a domain admin as well. So think, for example, certifying datasets. You might want an IT-level admin side of this, but if you think about different domains, there's probably a leader inside your sales team, inside your operations

12:02 team. Inside those teams there are, let's call them, experts or SMEs that are going to be creating their own content in that space. Having the ability to delegate to that team, to the leader of that team, what is a certified dataset or not, is actually, I think, very relevant here. So there are some really interesting things. I'm looking into the Power BI settings; it's documented well, but what actually comes out of the APIs... the Scanner API does some weird things, so you

12:32 can't really get the information out of the Scanner API as cleanly as you can through the portal. So we actually have to go through each setting, every single one, and look at what can be delegated and what can't. Some of them are on by default and you can't change them, so I'm not really sure; there are some things out there where I don't quite understand exactly what Microsoft's doing yet. It feels like this is not a fully baked idea; it feels like a half idea that is getting some traction around being able to delegate administrative properties. Anyway, just wanted to bring that up. Have you guys worked with domains at all in that fashion? Yeah, I

13:04 have a very similar experience, where I feel like they're doing a lot of things right, and then there's just a lot where it's almost like you take a left and it's a roadblock: oh, I wasn't expecting this here. Yeah, why was this not... it should be this, even though you're on the right track. It's in the right direction; there are a lot of things in the right direction with domains that make a ton of sense, but yeah, it's almost like what we talked about, where some of these features that

13:34 you feel would need to be required to be there are not there yet. Yeah, and it's just very interesting. It's weird, because you look at some of these features... I'll see if I can pull the notes on this one that I'm working through; I don't seem to have them in front of me right now. But it's just odd. If you think about the interaction between capacities and workspaces and then domains: capacities are attached to a workspace, so that has to link there, but there are settings at the capacity level that

14:06 you may let a domain owner manage or not manage, or the tenant might delegate down to the domain owner. There are certain workloads you may or may not let them choose to turn on or off; you may override some settings. So I'm still trying to get my head around what the hierarchy of the settings is and how they work through each of the different layers. It's not very clear yet. Anyway, something I'm just chewing on, working through domains and figuring out what works there. With that, let's jump into our main topic: let's talk about OneLake. This is

14:36 a blog article that comes directly from Microsoft, from the Fabric blog. Tommy, give us a quick overview: what is OneLake, and why do we care about the OneLake Explorer? So the blog article is specifically referring to the Windows application that you can download; it's basically the SharePoint for OneLake. And again, OneLake is the centralized place where all of our data is stored: lakehouses, pipelines, all the data that you're going to be managing in

15:07 that you’re going to Be Imagined in fabric comes through one Lake it’s the bottom rectangle of your diagram of all your other artifacts yep and one Lake Explorer really allows us for the very first time in any powerbi environment to Simply easily upload any data what we’ve had to do in the past dealing with CSV files Excel files even IM well one images was the non-starter but even tab type of data we had it had it to a SharePoint list directly connect to

15:38 to a SharePoint list directly connect to that it was still not necessarily directly connected to powerbi with one lake or in one Lake file explorer we can say hey just move it to this environment or this workspace and any business user who has that access can easily add that and we can not only connect to that in powerbi connect to in a Pipeline connect to in jupyter Notebook connect to it any of our normal artifacts that environment is available in the fabric ecosystem yeah so it it’s it’s basically

16:08 OneDrive, the application you use for SharePoint, for storing your files into the cloud, but the same thing, just now for lakehouses. And we call it OneLake because it's not just a table that's a Delta table; it's a bit more than that, right? It's a lakehouse, and potentially folder structures where there are individual files showing up. Well, essentially, isn't it just a data lake, like Azure? It's a storage

16:39 mechanism, yes, but I think it's more than that, with hooks into Fabric, right? So it's already plugged into the systems that you're interacting with in terms of data. But at the same time, it strikes me as a much better, more seamless experience that's integrated into Windows Explorer, as opposed to the other

17:10 application, right, where your... whoa, early morning fog in the brain still... your Azure Storage Explorer, right. Well, yes, but that only points at a single blob storage account, right? I think OneLake is trying to be a service that is rendered back with a collection of lakes underneath it, and there's a lot of other API surface. Well, when you're interacting with all of your lakes, you have access to them in your

17:40 Storage Explorer, and you can copy and paste files in there; it's just a little bit of a different experience. I'm saying I like the OneLake one, the built-in one, because it feels like everything else I interact with in Windows Explorer on my PC. Yes, yes, it mirrors the same experience you'd have with Azure Storage Explorer, correct, without some of the other features. And we have this full integration, not just with Windows for any user who uses a Windows

18:11 computer, but also that full integration with Fabric: I can pick and choose any of those new files that have been added. Actually, I remember one of the early demonstrations that Microsoft did wasn't with a CSV file, wasn't with an Excel file; it was with a marketing team using images, and how you can actually utilize those in Fabric too, I think something with generating AI images. But we now have this ability, for I think the very first time, and I think for us we haven't given

18:42 a ton of love to that new ability. Rather than my interaction with the business team being, okay, I'm going to create a SharePoint list for you, or I'm going to create a SharePoint folder for you and give you access to this, and Power BI can only connect to this, but it works, just add your CSV files here... well, now we've elevated that ecosystem to a lakehouse; we've elevated it much higher upstream in what we can do with that data. I think that's really

19:13 powerful. Yes. I would rather... I don't know. To be very honest, I don't use the OneLake Explorer on my desktop very much. If I need to get files up to OneLake quickly, or I'm modifying something as a source, I'm probably doing a little bit of work to get that information into the lakehouse, and I'm doing it through the desktop. There are use cases where you have supplemental data or things

19:44 that are not really required to come from a specific list, or you want to make an Excel file that has a couple of rows in it that you're going to clean some data with. I find that I'm doing a lot of this side, ancillary loading of data, and I'm just putting the file there,

19:58 boom. And then if I want to edit the file, I go back to OneLake File Explorer, open the file, edit it locally, and hit save, and it just gets changed; it gets updated inside the service. See, I don't know if you're the direct use case for OneLake File Explorer. What do you mean? So my initial thought on who's using OneLake File Explorer is the business; it's someone who honestly is not touching lakehouses that often, but they have access
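For reference, the edit-locally-then-save workflow works because OneLake File Explorer syncs a local folder to the same location a notebook or pipeline reaches by URI. The exact folder and URI shapes below are from memory and worth verifying against the Fabric docs; the workspace, lakehouse, and file names are placeholders.

```text
# Local path synced by OneLake File Explorer (Windows):
C:\Users\<user>\OneLake - Microsoft\<WorkspaceName>\<LakehouseName>.Lakehouse\Files\drops\events.csv

# The same file addressed from Fabric (ABFS URI):
abfss://<WorkspaceName>@onelake.dfs.fabric.microsoft.com/<LakehouseName>.Lakehouse/Files/drops/events.csv
```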

20:29 to a workspace because they're the ones who control files. My experience working internally was dealing with unstructured file types all the time, coming from marketing, coming from operations, where these systems and platforms didn't have a direct API, didn't have a direct way to connect to them, but they had CSVs and they had exports. And there are a lot of processes we created that had to be pointed to SharePoint, or had to be

21:00 connected to a cloud location where these local files would be automated upon. It worked incredibly well, but the problem was it was only the analytical solution; it could only be used in Power BI for a report. I couldn't use this in Power Apps, because again, it was a connection through Power Query. I couldn't use it from a database point of view, or for write-back, or any other type of application, because these were all CSVs going to Power BI. The use case now is all these

21:32 files, weekly files, exports, can now become part of a lakehouse, can now go in so many other directions rather than just simply becoming a report, and that's the end of the story. So I can now tell my team: don't put it in SharePoint, put it in this other thing that looks very similar to SharePoint, just drag your files here and we'll take care of the rest. Yeah, but what's the value add here, though? If I can connect to files in SharePoint, and

22:02 this is already a seamless experience, where I can sync my local Windows Explorer to my SharePoint folders, I get a shared space, and I can connect to that SharePoint location, make it a source for Fabric and all the things I want to do as far as data connectivity: does this just make it more seamless? What's the big win here? So for me, it's not just a reporting

22:34 solution. Let's take an example: I have event data, and I get event attendee CSV files, but I want to actually mark them up, add info, their contact information, and they're all CSV files. Well, think of the ease of use now, where people are adding these CSV files to OneLake: I can add them to a lakehouse, and now that lakehouse can be used in any direction outside of reporting. In Power Apps, they now have this place to say,

23:04 okay, update this contact info. That contact information is now living in a lakehouse, coming from a CSV file, from people just dragging and dropping in OneLake File Explorer. That ease of use, that small number of quick steps, was not possible before OneLake, and especially the OneLake File Explorer. We can now actually use this database, so to speak, or this storage, in any other capacity, again, more than just a report, and that's to me where the

23:35 biggest value add is. Okay, yeah, so let me unpack what you're saying. Are you arguing that if you are a team of people working on Power BI files, and you have a whole bunch of PBIX files around, and you are going to do a process that will check in, check out, or you need a place to put them or collect them together: are you recommending putting the PBIX files in a folder inside OneLake? No, take a few steps back. Think of

24:07 the data sources, those raw files. Think of the CSV files people are exporting that are important to them; they're utilizing them for reporting, but they probably have more use cases for them. So okay, let me retool my question then, or my statement. Not thinking about saving the actual PBIX files into OneLake, but more: we have supplemental data, or we have a whole bunch of CSV files being generated from a system that we're not able to automate, right? The goal here would be,

24:38 look, I've got Fabric; if I can automate it, I will automate it. If these are the things that we just physically cannot automate, and someone has to go get the data, the table, and land it in a Files area: would your expectation be, then, that once you have it in the Files area, it's easier for you to go access it via a pipeline, a dataflow, those things? Right. So now you can just say, okay, go here and find the content, as opposed to having to reauthenticate, go to SharePoint, go to a OneDrive

25:08 location. Which doesn't mean you can't do that, right; it just means this is an easier method, inside OneLake. You're right on point, because I'm putting those suckers into a lakehouse. And again, if that marketing team or sales team, or whatever the team is, which I found too many times, needs to do additional things, or would like additional columns or markers on that raw data, before it was like, well, you'd have
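The hand-off described here, where business users drop files into the Files area and a pipeline picks them up, usually benefits from a quick sanity check on what landed. Below is a minimal, hedged sketch of that idea; the drop-folder path and expected header are illustrative assumptions, not anything from the episode.

```python
# Sketch: validate business-dropped CSV files in a lakehouse Files area
# (or its locally synced folder) before a pipeline or dataflow ingests
# them. The expected header below is a placeholder for illustration.
import csv
from pathlib import Path

EXPECTED_HEADER = ["event", "attendee", "email"]

def validate_drops(drop_folder: str) -> dict:
    """Return {filename: True/False} for each CSV in the drop folder,
    True when its header matches what the downstream pipeline expects."""
    results = {}
    for path in sorted(Path(drop_folder).glob("*.csv")):
        with path.open(newline="", encoding="utf-8") as f:
            header = next(csv.reader(f), [])  # first row only
        results[path.name] = header == EXPECTED_HEADER
    return results
```

A step like this, run before ingestion, keeps a mistyped export from silently breaking the lakehouse tables downstream.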

25:38 to update those Excel files yourself. But I can now put this in a lakehouse, and then I can connect to it in Power Apps or another application, or wherever else I would want to; it doesn't have to be Power Apps. That data now lives in that lakehouse, and it can be updated, structured, etc., whatever I want to do. The marketing team, or whichever team, can now comb through it, modify it, and then we can report on it. But again, the reporting is one aspect of it, and I don't think it's the only or the primary solution here. It's a great one,

26:10 it's going to be something we utilize, but our ability now to take this data that is important to teams... and again, this is still all too common: APIs are everywhere, but they're not available for every solution or every team. This now becomes a centralized repository for a team's data that's important to them, so that we can build other solutions, more than just the pretty bar chart or the nice report, because now they can modify it, now they can manage

26:40 it, and now they can own it, and it's no longer something they're owning in CSV. Let me ask you another question, then. When you are looking at Delta tables that have been created through the OneLake interface, are you able to access those same Delta tables, or manipulate them, or edit them, locally on your computer? Meaning, let's imagine I've made a Delta table, there's a pipeline loading some data in there, and I want to go look at it with my VS

27:11 Code notebook locally, or go write some Python against it just to test something. Can we do that? Can I point at a Delta table locally on my computer? I would have to check the latest update; from my last experience implementing this for someone, no, but that was quite a while ago. But the files exist, right? The files exist, and the table's there; there is a Delta Parquet structure that already has

27:44 the Delta log, it has all the other items that are there. Why wouldn't you be able to access that table, at least read it, locally? So let me ask you: if that's possible, are you getting somewhere with that? I feel like there's something... well, what I'm trying to do here is sniff out use cases, like where can we use this one? One use case could potentially be: I have a lot of data coming in, there are potentially bronze, silver, gold layers, and I want to do some work locally, but I don't want to turn on my capacity, right? So my understanding is, and again, I don't know if this is

28:15 right or not, we'll have to do some testing, maybe after the fact: what if I do the work, I do a data load, those Delta tables are created and built, great, and then I just go locally on my computer and run some machine learning scripts? I'm going to bring in a data scientist and say, hey, here's where all of our tables live, you can experiment with things offline, because my computer has more horsepower than what I would want to be putting into a Spark notebook, or whatever. I'm just going to do something with the data; here's the

28:45 output of all the final tables. Or maybe I want to manipulate those tables locally, or change those Delta tables, or add another version to them; could I do that? Based on what I understand so far, I think you could. I think you could connect a Python notebook locally to one of those tables coming from a OneLake sync; it would then load the files that it needs. Now granted, if your files are large, you could be pulling a couple of gigabytes of files from a Delta table down to your local

29:15 machine, and you may not need all of that. So that might be a challenge you'd face with this, because you don't need all the files from every version of the table, you just need a couple of them. Potentially, that could be another use case. So what does this do for business users? I'm going to say not much, Seth, honestly. Looking at this, it's not super valuable, I think, at this point, other than adding a couple of
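The "why wouldn't you be able to read it locally" question hinges on the `_delta_log` folder: each JSON commit lists `add` and `remove` actions, and the live table is whichever Parquet files have been added and not later removed. Here is a hedged stdlib sketch of that replay; real Delta readers (delta-rs, Spark) also handle checkpoint files and partition metadata, which this toy version ignores.

```python
# Sketch: list the Parquet files that make up the *current* version of a
# Delta table by replaying the JSON commits in _delta_log. Checkpoints
# and other log details are deliberately ignored here.
import json
from pathlib import Path

def active_parquet_files(table_path: str) -> set:
    """Replay _delta_log/*.json in commit order and return the set of
    data files that are still part of the table."""
    live = set()
    log_dir = Path(table_path) / "_delta_log"
    for commit in sorted(log_dir.glob("*.json")):
        # Each commit file is newline-delimited JSON, one action per line.
        for line in commit.read_text().splitlines():
            action = json.loads(line)
            if "add" in action:
                live.add(action["add"]["path"])
            elif "remove" in action:
                live.discard(action["remove"]["path"])
    return live
```

This also illustrates the download-size concern from the discussion: the folder holds files from every version, but the current table only needs the subset this function returns.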

29:45 point other than editing adding a couple files here there and then being able to edit like there’s a Json file that maybe defines something or having some parameters that are going to be running a a pipeline yeah I maybe I’ll edit those slightly locally on my machine but

29:57 slightly locally on my machine but I don’t think there’s a huge value opportunity I’m seeing here I honestly don’t think I could disagree more and it’s been a while since I’ve said those words and it feels nice to and he says this with a very nice smile on his face dude I got you here it’s nice to wholeheartedly disagree coach you into conflict this this is why because again business users are dealing with a

30:27 again business users are dealing with a lot of unstructured data and they’re not comfortable say go to fabric saying unstructured you mean like random file sources CSVs whatever plug in any word you want Json files yeah unstructured would be like an image or video yeah a lot of jumble and random files thing plug in any word you want to but we’re all on the same page matter words matter a lot so let’s use random

30:57 matter a lot so let’s use random thing to to ease our our English overlords and AI overlords too they’re listening so this random files that they’re constantly dealing with and I think I think being not compassionate but being sympathetic to what they have to deal with on a daily basis with all these platforms that they’re not that how they’re trying to actually make sense of this and manage these files well yeah we could do the SharePoint but again the problem with SharePoint is for us from the bi point

31:28 SharePoint is for us from the bi point of view is those are only one story and that story is a powerbi report with one Lake this changes that whole story because now those files it’s like just point it here it’s not going to wow them to the point where they’re like I love the one Lake file explorer I can’t go without it but the impact that occurs where we’re asking them please here’s your this one workspace or environment that you see in one lake add it to

31:58 that you see in one lake add it to this folder and now you can manage this in this application we built for you or you can connect to it this way or we can build any type of report for you off of this yeah as a business user though right yeah like SharePoint is where my business lives right all my file all all my data all my files for review and Analysis Etc like they’re all in SharePoint so what you’re you’re saying I have to use a different storage

32:28 I have to use a different storage mechanism for some of my data we’re talking about we’re going to create create when you’re you’re putting the same argument out there to me where someone’s like so you’re telling me I can’t ride my horse anymore no I’m I’m asking I’m asking what data what data do I have to put in one Lake and what data can I keep in SharePoint because that’s my team that’s where all my teams data is yeah the reason why they teams data there is because that has been the best solution yeah but SharePoint SharePoint offers a

33:00 yeah but SharePoint SharePoint offers a lot more from team collaboration and flows and all the things that help my business no but that’s where they live today they’re part of the business okay so your argument is use one Lake I’m like okay well I’m the business person so I I move what what part of my data to one Lake why do I have to use two different platforms now I think we’re talking you’re talking about different files or we’re having this conversation about different files PowerPoint yeah sure they’re random

33:30 PowerPoint yeah sure they’re random files to you for reporting because you’re going to do something with them but to me they’re they’re my data for the business that I run I think what what is a business user doing with 18 CSV files on a month or that are all manually putting together into another Excel file well that’s where we come in anyways right or or and the recommendation has been I think to Seth’s point though is you’re already doing work with PowerPoint slides your data of your team into a SharePoint

34:00 your data of your team into a SharePoint site why am I worried so why am I saying okay I’m going to split apart my process and say okay only the business files go here and then all the data files go someplace else there needs to be to your point say I think I think there has to be a compelling story to say why I need to learn something new and put the files in a different place if I if I’m not getting enough value out of that what’s the point why why would I spend all the time and effort to get all the way over this new location plain plain devil’s

34:30 this new location plain plain devil’s advocate of the business user Tom Tommy why do I have to change for you can’t you just go pick up my files from SharePoint if you need them for your data things can’t you just look in SharePoint if we’re simply reporting probably not but I’m making the argument a lot probably not hold on hold on hold on so a lot of these random files if if it’s just simply we’re going to build a bar chart for you from your CSV files fine SharePoint is a great solution it’s the default

35:00 is a great solution it’s the default solution I’m making the argument that a lot of these random files teams would like to do more with and they are doing more because they’re going into these CSV files and Excel files and making edits and asking people to add a column and there’s formulas that we’re still reporting on in powerbi anyways there’s a lot a ton of use cases there what if I move this to one Lake file explorer which again saying you have to learn a lot making a lot of assumptions here

35:30 lot making a lot of assumptions here because I didn’t say you had to learn a lot I’m saying you’re separating out where all of my my stuff is fine if there is a need which I would challenge that a lot of use case there is one to do more than just reporting and more just to see my data visual in a visual sense then one Lake file explorer allows me to put that data in a place that can be touched in multiple areas and interacted with multiple areas

36:01 areas and interacted with multiple areas I’m no longer trying to manage saying please don’t edit column C in this Excel file because that will break everything and I can now touch this in power apps I can now point this to if they’re if they another team or another system wants to actually touch that interact with that data again let’s use that event attendee data to keep a flow going I have all this cont so but here’s here’s where here’s where and they don’t have to touch that they don’t have to worry about no no no no here’s where the value comes in for me I guess okay I think you’re

36:32 in for me I guess okay I think you’re getting where I’m going to go here Seth because because up to this point I keep hearing you as you’re working with this business team you’re the business intelligence team and my push back on you is okay great there’s many ways you can go copy data from a SharePoint location into Fabric and get use out of it so you’re the bi technologist you can figure that out it do it shouldn’t it shouldn’t Force me as the business user to conform to what you have you want where you want my data I

37:02 have you want where you want my data I work in SharePoint this is where all my stuff is if you want it it’s there go get get it that’s the argument I have right like like you can go access the data in SharePoint conversely though I don’t think that’s the use case right what you’re proposing is what is the value for the business user business user you you don’t have to build all these connections you don’t have to like do all the hard work all we have to do is we’re going to move your analytics data into one Lake so that when you’re

37:34 data into one Lake so that when you’re interacting with these in fabric it’s automatically available to you we don’t have to build anything so that to me is compelling it’s like oh well okay so I’ll just keep my business side presentations and things like that in SharePoint but the the compelling argument for me the business user is if I just throw this in one Lake which is just a different part of my file Windows explorer right now it’s available to me when I go out to my workspace and my workspaces are now available to me in this place then now

38:05 available to me in this place then now that is a compelling argument I don’t think you had it though when you’re talking about the bi team connecting with these business groups because it’s like actually pretty fair what’s the value there yeah I’m going to argue with that one as well I think that’s another point that you’re trying to make here Tommy that I I resonate more with there’s less friction getting from SharePoint to tables or lake house items then there is going right through the files area of the lake house and and let me give you an explanation here right I just did a project where I was trying to use a pipeline to go access a table

38:37 to use a pipeline to go access a table of standard data from SharePoint yeah can’t do it without a service principal can’t do it without going into the SharePoint site and making some really there’s some other things you got to set up so to your point Tommy if you’re going to remove some friction of getting data that the business is creating into a place where I can do use the analytical tools on top of it it does make sense I think to bring those files closer to where the lake is living MH

39:07 closer to where the lake is living MH things another thing that I find is difficult here is if you if you generate files or things that you are saving and putting them in the one Lake there’s no easy way to get them out there’s no download button there’s not a lot of other I can’t edit files inside the service yet so like for example I’ve had a Json file that defines some parameters for my pipeline that pipeline needs to consume those parameters and this could be something that’s driven by Dev test production right so this this is a configuration file that may go with the Lakehouse that you actually point to

39:38 Lakehouse that you actually point to it’s the same name in each location so that way you can deploy it correctly but then when you’re running the pipeline it finds the file it reads the information in it and says oh I’m going to go talk to the dev server or the other servers great but you can’t edit the file you can’t change it in the one Lake itself
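The parameter-file pattern described here — one JSON file with the same name in each environment’s lakehouse, edited locally through the OneLake File Explorer and read by the pipeline at run time — might look something like the following. The file name, keys, and server name are all hypothetical:

```python
# Hypothetical per-environment parameter file: the same file name lives
# in the dev, test, and prod lakehouses, and the pipeline reads
# whichever copy sits next to it. All names and values are made up.
import json
import tempfile
from pathlib import Path

params_file = Path(tempfile.gettempdir()) / "pipeline_parameters.json"

# What the dev copy might contain; the prod copy would carry the same
# keys with production values, so deployment just swaps lakehouses.
params_file.write_text(json.dumps({
    "environment": "dev",
    "sql_server": "dev-sql.contoso.example",  # made-up server name
    "batch_size": 500,
}, indent=2))

# At run time the pipeline (or a local script, after editing the file
# through the OneLake File Explorer sync folder) loads the parameters:
params = json.loads(params_file.read_text())
print(params["environment"], params["sql_server"])
```

Until the service grows an in-browser editor, the File Explorer round trip described next — see the file, download it, edit it, let it sync back — is the low-friction way to change a file like this.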

39:55 can’t change it in the one Lake itself there’s there’s no editor for those kinds of things which maybe we’ll get there and maybe we’ll be able to edit those files great but right now if I use the one Lake Explorer I can then see the file download it edit it and then it automatically gets synchronized back so I think that is valuable I also like one thing I I’m missing from this one Lake if you have SharePoint inherently in SharePoint there’s already versions of files I don’t know Tommy are there versions of files in the one Lake

40:27 there versions of files in the one Lake file explorer like if I edit a file does it make a version of it and can I get the old ones I know there’s a log directory on what occurred yeah I know that Version Control I’m pretty sure there’s no Version Control so it is what it is yeah this is this is what it’s always been this is the way so it is what it is interesting I I wonder if but again I’m not putting powerbi files in here no it’s data like know

40:57 here no it’s data like know it’s consolid yet yet yet so the reason I’m pointing this out is I’m actively thinking about or looking at all the lake house is a great collection of files there’s these things called workloads that Microsoft has built Everything You Touch a pipeline a a data Factory all those things are individual workloads why not more workloads the workloads all have access to the Lakehouse right when third party companies come in and start

41:27 third party companies come in and start saying hey we’re going to build a whole bunch of other workloads for your company there’s no reason why those workloads can’t borrow portions of your lake house to then create value on top of your data and spit it back out and put it back into your lake house so I really see an opportunity here that workloads or other things are going to like you already see it today with the ADF and the pipelines right those are two workloads that are leveraging the power of the Lakehouse to easily get files in and out I for sure see a future

41:59 files in and out I for sure see a future where other workloads are going to be able to add more value and so why not store your PBIT files there your templates hey this is where I’m going to put them if you want to go get a template everyone in the workspace now has a common thing again if we’re thinking about centralizing the work that we do in inside power. com now we have all of our powerbi templates and by the way you can shortcut files to other workspaces so I can have another Lakehouse

42:29 so somewhere else in the system that I shortcut myself into so Tommy to your point right let’s imagine I’m I’m the individual who’s in charge of creating all these CSV files in a folder for sales data or Salesforce exports or whatever the thing is right that folder I believe can be shortcutted to another Lakehouse meaning I still maintain the ability to control and add more files into it but I can easily distribute that to any other analytical lake houses anywhere else in the business so now there’s this common place of okay

42:59 so now there’s this common place of okay well we can just share things across different teams and shortcut them in and now we don’t have to worry about any this stuff so there’s potentially another option here that’s like it feels to me like there’s more potential than what we’re seeing here initially at least that’s what I’m looking at that’s that’s not I honestly that’s one way I didn’t consider it for like more for the bi team and I I’m going to mention something Seth talked about to how we changed the conversation or what was more of the value ad if I came to

43:31 was more of the value add if I came to this and said I was the technology team and not business intelligence does that are we still dealing with business intelligence when I’m telling them to move your files to one Lake right because I’m thinking about this if I’m doing more than just reporting at this point and I’m coming to the team going oh yeah you’re gonna interact with your data and you’re going to have more access to this outside of just reporting is that the role of business intelligence you’re going to have more

44:01 intelligence you’re going to have more like well like well if maybe I answer in this way like fabric offers me the opportunity to engage with these business users earlier in the process like allow them to see a lot more of like how things are made right so if if my argument to them is hey we’re going to put your data here the the the value and what I was pushing back on you is that what is the what it if I’m the business user what’s the value of me changing my the way I do things for this

44:34 changing my the way I do things for this and I think I think it it revolves around what we talked about which is hey let me show you I’m going to bring you along for the ride but we’re going to move your analytics data in this location and look here when we go out to the workspace which is what you guys are familiar with for your reporting here’s what we’re going to be building for you now right all of your analytics data now it’s accessible to you here right so like we’re building these models or this ingestion for you and here’s all the business logic and

45:04 and here’s all the business logic and don’t worry we’ll take care of it but it’s here for you to view like interact with and be the subject matter expert and we’re going to walk through these pieces but if you want to do things on your own or you want to spool up and do some different analysis here’s the new tools and capabilities you have and you don’t you don’t have to muck around with like all of the structural configurations and extraction of data and all the things we have to deal with when we’re using independent services and all the like non-fabric right where it’s it’s just there and

45:36 right where it’s it’s just there and that is that is super compelling it’s like oh well of course I’m not going to use SharePoint now I’m gonna move my data into my analytics tooling right now and then I can scale in in that platform right right is there did that answer your question no that was phenomenal and it just it doesn’t necessarily raise more but it triggers a ton of more I think situations so to speak so is there a Coe or governance play here with one Lake file explorer oh boy for sure there

46:09 file explorer oh boy for sure there is or should we close that for another day based on the reaction was like well oh boy I oh boy the Coe should definitely have input as to what we’re trying to do here right and and to this point right there’s more friction going to the SharePoint page than there is going to one Lake file storage right so the center of excellence should be deciding

46:39 center of excellence should be deciding okay on certified data sets should we have any SharePoint data set location connections is that something we want to be able to do do we want only items coming out of the the lake house is there is there a criteria where we’re saying look we are willing to say no SharePoint files but SharePoint lists are acceptable where do we store our PBIX files do we put them somewhere do we put them in a common location does that live in SharePoint so I think the Coe has some input on those things and more so on the items that are certified regular

47:09 on the items that are certified regular we know we have to repeat that that data over and over again I think it’s it’s nice to have options but I think what Microsoft does a lot is they give you so many different options to do the same thing it becomes incredibly confusing to know where’s the best place to put things what’s the best way to organize it because there’s not just one way of doing it you have like five so you have to figure out okay what are the pros and cons of all these items so Microsoft thank you for keeping contractors employed for the last 50 years and looks like we’re going to be going for another 50 just because we don’t know

47:39 another 50 just because we don’t know where to put the files for everything well they they’ viewed no but they like we we talked about this before Microsoft’s an enabler right like they build the core tools and leave a wide swath of opportunity for third parties to enhance or build specific solutions that help companies do things more with those tools sure in the same way like the options thing is good it it’s just a Challen a challenge because when as I think business people we we

48:11 when as I think business people we we go in or you hear this all the time like well just give us the standard way well well how do we implement it well it depends the answer it depends it depends because your organization while very similar to other organizations in terms of data Etc is its own unique Beast right like a process that you may build for your business may not work successfully in another one and data culture could have a large part of that like how data literate people are Etc but in in the

48:42 literate people are Etc but in in the options of like picking and choosing what a Coe would would make in terms of recommendations are based on probably driving efficiency within that organization and each organization has different needs which is why you have the various routes and Technical Solutions that I think Microsoft offers a plethora of to to steer people down but ultimately like we’re like those bodies are are in place to create reliable outcomes and

49:13 place to create reliable outcomes and efficient outcomes within the business right so hopefully everybody starts to like the outcomes of of those decisions are not so people can go oh yeah that’s that’s pretty cool it’s so they start behaving that way right because it’s going to make something better for the business I agree it it it’s all about again a lot of these things I look at these features going well how does this make my life easier than what I’ve been doing can it make my life simpler less complicated less steps less clicks like

49:43 complicated less steps less clicks like if it’s doing those things I feel like that’s a good method to go down if it’s not increasing that level of Simplicity on things I’m I’m more hesitant like don’t give me something that’s more

49:53 don’t give me something that’s more complicated I’m not going to use it as much the only thing I would say to that is I wouldn’t necessarily say that all the time with what you’re saying Mike because there’s also the situation where there’s some learning curve and once you get past that hump so to speak then that’s when everything begins to accelerate not saying this is the exact equivalent here but if I’m if I’m integrating this and this actually goes to my hot take here I do have one which I think I’m going to get higher than a one with Seth on here

50:24 higher than a one with Seth on here that’s that’s the goal but if if there is a slight learning curve where the end of that road or the end of that like okay now we’ve got the process down when everything accelerates that’s makes it worth it so here’s my hot take here and this goes off some of the things I’ve been saying but in the next two years for organizations who have adopted Microsoft fabric fully notice I’m really giving myself some cushion here oh boy next two years for an organization who

50:54 next two years for an organization who has adopted Microsoft fabric fully one Lake File Explorer or an equivalent by Microsoft will be the de facto place your analytical random data will go n it’s a one jeez Louise people people are still going to do what they’re going to do like I don’t think they’re going to it might be a little bit less friction there might be a little bit more ease there but I still think people are going to want to live in where they work all the time it’s going to be in teams it’s going to be in SharePoint it’s going to be in one drive I think that’s going to be where a lot of that business data just

51:24 of that business data just amasses because it’s easy what are you as a consultant what are you gonna recommend or don’t do it don’t do any of this no I I don’t I don’t know who knows who knows what tools will come out from these things right I don’t know like there’s there’s a lot of really rich things that are happening at the one lake level right I think one Lake storage Explorer all alone just by itself minimal impact okay the tools that you can build on top of it high potential so I think the the fact that

51:54 potential so I think the the fact that you can access the one Lake you can build tools that interact with it you can build tools that do advanced things with your data and then put it back into the one Lake but then that’s the value of it well yeah that’s the value of the one Lake file explorer I’m surprised I’m surprised you say that say that actually I I just don’t I don’t use it a lot right now well I and I I agree because the the if you look can I build a look so let me ask a question can I build a power app off of things that live inside

52:25 power app off of things that live inside a one Lake yeah I don’t SQL Warehouse what was that there a SQL if I have a one Lake okay so the one Lake if I have a lake house but you’re going through the SQL endpoint to get to it right okay not not disagreeing with you just saying like there’s there’s other things like the making power apps on top of SharePoint lists it’s pretty seamless and easy you’ve got to know what you’re doing in power apps to get to write to tables inside a lake house yeah

52:55 to write to tables inside a lake house yeah not not like any applications Microsoft is removing the friction from building those things inside other applications like so SharePoint already has a nice easy list editing system that is better than managing a single file inside a one Lake so to me there’s there’s things already that exist that are making that easier now if you build me a workload that does this stuff and then I can have a series of files that exist inside one Lake that I can go make lists and edit and all these other things and it just becomes a data source

53:27 things and it just becomes a dat Source then I think I would recommend it more but I think there needs to be more enhancement on the usability on the front end side again I think to your point said earlier was Microsoft just builds a framework the framework’s there the framework I think is important but the tools that we’re actually able to use to interact with Lake one Lake stuff they’re not there yet people haven’t come up with the ideas so it’s I I’ll I’ll actually give you like a three to four four Tommy like in two years in two years time like you think about the compelling argument of like centralizing your data

53:58 argument of like centralizing your data store right like if if my analytics data currently in one lake is now accessible to me in Fabric and tooling right faster and easier because I’m not copying versions of data right I’m not creating my own version of something it’s not ETL into fabric it’s just the file right I do I think there’s some potential complications of like like not ver not having versioning right like is that a big deal well it is in the

54:28 that a big deal well it is in the business Realms in SharePoint it’s how many times have we been saved by backing some stuff up so like I I vacillate a little bit backwards and forwards but in two years time I would I would argue that yeah the vast majority of like unmanaged data is probably a better way than saying some of the like just the the masses in in one lake is probably where where organizations

54:58 probably where where organizations centralize around and and I would even argue like further than that is one of the the I think bigger things with one Lake and what it offers is the repository of all meaningful data potentially that could be leveraged with AI AI yes yeah I think that would be again going to your point for systems we mentioned SharePoint but we had one drive how many different locations are people storing

55:29 different locations are people storing data in and if this moves more into a centralization of that analytical data then I I yeah there’s a lot of value there I think I think I think I agree with you though it’s taking it it’s going to take a little bit to like fully recognize or realize all of the value that it provides and is it enough right now even to my previous argument to push people into it right and and that’s where as a business

56:01 right and and that’s where as a business user you need I need a I need a reason why you’re going to change my process or the way I I do things and typically that’s when you show there’s more value to them in other other Fashions what if we could just do short shortcuts to one Lake from one Lake into SharePoint yeah what if SharePoint could just we just sh shortcut things right from there then you just leave it where it is you don’t ever have to worry about it it just it just shows up in the one Lake listen I’m with you I am completely frustrated how is there not power automate or power apps

56:31 there not power automate or power apps into fabric why is there not any connection but I have 17 scorecard ones if I want to use it why but these are my points like I don’t think I think the user Community has not quite asked for the things that they want in the right way so that we get what we need out of it and things things that I would like one Lake to be doing more right I would like the ability to be able to turn on versioning I’d like to see the versions of like this is all on top of blob storage blob storage already has version of files

57:01 storage already has version of files already so if I have that feature the same way I have it in one drive just figured out for me Microsoft if I want this folder to have versions tracked that’s what I want turn it on so that way to your point Seth and Tommy like I I would feel more confident moving more of my business workloads out of SharePoint and other things and centralizing them around these are things that are bi Central workloads and we’re going to do all that workload only inside fabric I think that solves that problem the other the other caveat that

57:31 problem the other the other caveat that I’m thinking about here is you have to have all all of the fabric turned on this doesn’t work if you just have half Pro half fabric so this is going to be a fabric enabled everything so yet to your point Tommy earlier these are for organizations that are going to fully adopt fabric everywhere across all workspaces across all team members and I’m not sure companies are willing to pay for that yet or have Justified in their minds at this point we might get there but right now I’m not sure if all companies are like yeah we want to pay

58:01 companies are like yeah we want to pay for pro licenses for our developers and give all this other access to the parts of the business I think they’re going to want to have a mix of pro licenses especially the ones that have e5s because they get Pro by default right so the the story for them to move over to fabric is like let’s pay more money so we can put files not in SharePoint that we’re already paying for like I don’t know good point we’ll have to tease it out go build your pipeline to SharePoint oh hell no thank you I’m not doing

58:31 SharePoint oh hell no thank you I’m not doing that We’ve literally just said screw it we’re just gonna use data flows so it was not worth the time or the effort to figure it out anyways that being said thank you very much for this interesting topic I hope you found some value from this one though thank you jumping on the comments would be awesome so thank you for jumping in the comments here a little bit as well telling some things about what you feel about one Lake and how things are going to be integrated as well with that if you liked this conversation if you found this to be valuable please go out and share this conversation let someone else know

59:01 conversation let someone else know you liked it either on social media or you found some insightful information we’d love to hear from you what you what you pulled from this are you using one Lake file explorer how do you use it we don’t know everything we think we do but we don’t for sure we don’t so we’d love to hear from you get some feedback tell us what what is working for you and what’s not that being said Tommy where else can you find the podcast you can find us on Apple Spotify or wherever you get your podcast make sure to subscribe and leave a rating it helps us out a ton do you have a question an idea or a topic that you want us to talk

59:31 idea or a topic that you want us to talk about in a future episode head over to powerbi.tips podcast leave your name and a great question join us live every Tuesday and Thursday a. m. Central and join the conversation on all PowerBI.tips social media channels awesome thank you all so much and we’ll see you next time

Thank You

Thanks for listening to the Explicit Measures Podcast.
