PowerBI.tips

SQL or OneLake – Ep. 236

Fabric is pushing Power BI teams into a new (and slightly uncomfortable) reality: storage and compute are converging, and the labels we’re used to (‘warehouse’, ‘lakehouse’, ‘SQL endpoint’) don’t always map cleanly to what’s happening under the hood.

In Episode 236, Mike, Tommy, and Seth start with a few solid reads from the community (including a Data Goblins bar chart gallery and a Tabular Editor deep dive) plus Microsoft’s move toward low-code paginated report authoring. Then they tackle the real question: when do you choose the SQL surface area in Fabric, and when do you just land data in OneLake and move on?

News & Announcements

Main Discussion

The core tension in this episode is convergence vs clarity. Fabric is making it easier than ever to access the same data through multiple experiences — notebooks, lakehouse tables, and SQL endpoints — but that also makes it easier for teams to accidentally build a messy architecture with unclear ownership.

Mike’s question is one many teams are asking: if everything ultimately lands in OneLake, what’s the practical difference between working through the SQL/warehouse experience versus treating the lake as the default and calling it done? Seth and Tommy push the discussion toward the parts that still differ: who the target persona is, how security/permissions are applied, and whether you’re building an IT-managed data product or enabling business teams to move upstream and do more engineering work themselves.

The punchline: OneLake interoperability is the prize, but it doesn’t remove the need for intentional boundaries — especially as self-service moves from ‘build a report’ into ‘build pipelines and transformations.’
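
The 'interoperability, not architecture-free' point can be made concrete with a sketch: an engineering team curates a Delta table with Spark SQL in a notebook, and downstream consumers read that same table through the lakehouse's SQL analytics endpoint — no copies, but the boundary (who writes, who reads) still has to be an explicit decision. This is a minimal sketch; the table names (`raw_sales`, `curated_sales`) are hypothetical:

```sql
-- Notebook side (Spark SQL): an IT-managed pipeline curates a table.
-- The engine is Spark, but the storage is ordinary Delta files in OneLake.
CREATE TABLE IF NOT EXISTS curated_sales
USING DELTA
AS
SELECT order_id,
       CAST(order_ts AS DATE) AS order_date,
       amount
FROM raw_sales            -- hypothetical landed/staged table
WHERE amount IS NOT NULL;
```

The design choice this illustrates: consumers never touch `raw_sales` directly — the curated table is the supported path, even though OneLake would happily let anyone read either one.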

Key takeaways:

  • Treat ‘warehouse’, ‘lakehouse’, and ‘SQL endpoint’ as interfaces and governance boundaries — not fundamentally different storage (in Fabric, the storage story is increasingly unified).
  • Pick the SQL surface when it reduces friction for your consumers (analysts, BI developers, tools) — adoption is a real requirement, not a nice-to-have.
  • Use OneLake as the interoperability layer, but don’t let ‘it’s all Delta’ become an excuse to skip architecture decisions.
  • Decide (and document) who owns what: IT-managed pipelines/curation vs business-managed enrichment, plus how deployments and environments keep you out of chaos.
  • ‘Give me the data’ use cases aren’t going away — paginated reports and table-first access patterns are still essential in many orgs.
  • Community patterns (like Data Goblins’ PBIP examples) are a reminder: great outcomes often come from disciplined modeling + measures, not a pile of custom visuals.
  • Expect self-service to move upstream; governance needs to evolve from report-sprawl controls to data-product and pipeline controls.
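
To see why 'warehouse', 'lakehouse', and 'SQL endpoint' behave like interfaces rather than separate copies of data, consider how the SQL analytics endpoint addresses a lakehouse: the familiar database.schema.table pattern still works, and separate dev/test/prod lakehouses give you physical security boundaries over what is ultimately the same kind of Delta storage. A hedged sketch — the lakehouse and table names below are hypothetical:

```sql
-- A Fabric lakehouse behaves like a database through its SQL endpoint,
-- so BI consumers keep the database.schema.table habit they already have:
SELECT order_date, SUM(amount) AS total_sales
FROM ProdSalesLakehouse.dbo.curated_sales
GROUP BY order_date;

-- Dev/test/prod segmentation falls out of separate lakehouses
-- (databases), each with its own access control:
SELECT TOP 10 *
FROM DevSalesLakehouse.dbo.curated_sales
ORDER BY order_date DESC;
```

The point of the sketch is governance, not syntax: the table lives once in OneLake, and each experience is just a different interface with its own permission model.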

Looking Forward

If you want Fabric to feel simple for your org, make one decision explicit: which experiences are ‘supported paths’ for building data products — and which are experimental playgrounds — so OneLake enables reuse without enabling confusion.

Episode Transcript

0:29 good morning everyone welcome back to the explicit measures podcast with Tommy Seth and Mike hello everybody good morning Mike hello good morning all right well we got a couple of interesting openers today things that have been coming across the internet some stuff that came out very interesting Tommy I know you picked up on these things it's funny how I read these articles and I see the topics of those articles immediately pop up on the podcast Tommy is always on point with grabbing some interesting topics around here

0:59 Data Goblins as always we absolutely love Data Goblins so good so well designed super well thought out we see Data Goblins coming out with an article around bar charts how to create bullet charts in Power BI this is really cool did you guys see this article at all and check out his list of bar charts there all the different types or ways you can style them it was insane yeah so this was I don't know if you actually opened the desktop

1:30 file because that was the first thing I had to do there are two things that I absolutely love that Kurt did here he really just basically took a single visual and made 28 out of them and obviously it's one of those how did you do that how many measures always incredible because we've talked about this before one of my favorite visuals is a lollipop chart but it's really hard to you just like candy Tommy I just like candy it's actually a better representation

2:01 anyways but I was like there's not a really easy way to do that but he's basically put together 20 different visuals all at once like yeah I don't need custom visuals but the best part the little secret is everything is on his GitHub it's all in PBIP people are starting to become really sneaky with developing things around PBIP now I can say that because our

2:31 theme generator now lets you build PBIP files as well and I really like the formatters yeah slick and unbeknownst to people you can just open it up and start going with a whole lot of things under the covers this reminds me well one Kurt always has a way to make me feel like I never really thought out an idea completely true it's like the components of something that I always had like 30

3:03 and then yeah I had that other 20 but he always pulls it together and this one reminded me of when Power BI first came out and we really should be doing this again as they're rolling out visualization changes like how far you can stretch the different visualization types and this was just I think a fantastic way like Kurt does in all his blogs to represent in a concise way like hey have you ever wanted to and here's like

3:34 the look and feel of all the different ways you can use a bar chart it's fantastic I love it if you guys haven't seen this it's a fantastic blog yes it is and I recommend it it shows really how far out of the box Power BI has come in terms of some of the visualizations you can make so I'm excited to see even the ones that have been out now like the line chart now that they've updated and all the cards which isn't represented

4:04 cards which isn’t represented obviously in this particular blog but yes like I said we’ll have to revisit and go through and and stretch the the bounds of what pretty visualizations we can put together maybe we just have curbular now yeah yeah yes Kurt there’s a new line chart Kurt can you yeah thank you no I really like this one and I love the fact that he’s using a lot of standard visuals on the bar chart now to your point Tommy that’s where my mind always goes I need to quickly download that

4:34 file because I need to figure out what the heck did they do here how did they style that thing so I could get it to look exactly like that one this is one of my pet peeves with Power BI though a lot of times I see other people's work on Power BI reports and think oh that's amazing I really like that I need to learn what that is and it just feels at some level daunting because there's so many different things you have to know to get these extra options like you don't get an easy button that says oh make a lollipop

5:04 chart and boop it just makes it that way like it just starts styling it so there's work that needs to be done in order to figure out what to do and how to build this stuff so to me that's always a barrier to entry on a lot of things around Power BI it's just a little bit more work to get it to style the way you want if you want just a standard bar chart out of the box Microsoft builds it great no problem it's when you want to customize it you've got to figure out how to make it work yeah another article that came out this week I guess it was this

5:34 week actually it's a little bit older but we're picking up on it now the tale of Tabular Editor from Daniel Otykier over in I believe the Denmark area very cool article Tommy why did you pick this one this is a little bit older but what was your thought here it's a nice story to see like how in the world there's really only the one and only Tabular Editor how much we rely on that and what we do with it and it's really like

6:04 the origin story of how did it actually come to be well we all remember using Tabular Editor 2 clunky I guess but that was my first moment of being intimidated by a piece of software going I don't know if I want to touch this but just to see that growth story of something that's so integral to what we do now and the different things that ignited him to go well what if we could do that and the next thing it is what it is today yeah

6:34 yeah and it's interesting to see where Daniel has gone with this so he started Tabular Editor did a version two of it Tabular Editor 2 and now he has the actual Tabular Editor that you pay for which is Tabular Editor 3 so Tabular Editor 2 is like a free version you just get some basic editing things you get a lot of what Tabular Editor 3 does minus some more advanced features

7:05 and it’s just interesting to see where Daniel started he started as a consultant he built a tool it helped him out immensely and then he turned into a full-time developer like he literally this is his job now his job is to run however the editor now which is interesting to see people talk about their career a little bit and how they got to where they are just always kind got to where they are just always fun yeah the things that struck me of fun yeah the things that struck me out of this were you guys shared this this lazy engineer mentality where where his story revolves around like somebody was just going to let him work on something for five days and he

7:35 said like that just wasted my time so much manual work I don't want to do that and that was the inception of Tabular Editor so that I found funny yeah if the community or people listening don't know about Tabular Editor go look it up that alone is going to change your experience and how you interact with your models it speeds you up immensely just the speed alone so huge contribution from Daniel and then what I love about the

8:05 article is his separation of Tabular Editor 3 and that that is sustaining him but how Tabular Editor 2 is still its own product and it's the community one that he's gonna maintain and that is very good to hear right and it's the life of how this thing that he came up with as a solution evolved into both a community tool that he constantly

8:35 updates and then also a way he can stay engaged in making both products better all the time so I love the article I'll also put in here in the show notes just because we love the tool so much we did a four part video series basically a four hour series with Daniel building out okay what's the introduction to Tabular Editor how do you use scripts what is the Best Practice Analyzer and how to use that and that alone right there is worth its weight in gold just because you can use a tool to check if you're

9:07 doing things in a weird way in models and it gives you recommendations automatically an automatic recommendation tool to performance tune your model great love that and then the last one was talking about how do you use DevOps and Tabular Editor because Tabular Editor was also giving you the ability to create a data model and then deploy using code that's been like the weak spot I would say of a lot of Power BI at this point it's been a sore point pretty much since the beginning right initially it hasn't

9:38 right it’s initially it hasn’t really ever gotten there but we’re seeing a lot more strides here recently that’s making a lot more parody with an actual deployment pipeline type stuff yeah yeah then our last article today which I think leads very well into our main topic is there is now low code authoring authoring of patching reports right off of the one Lake data hub this has got to have you excited Mike oh

10:02 the first messages I get are people going I told you so yeah true it's not like hi hello how are you doing it's more like haha I was right you were wrong I really do like the fact that there's more features being rounded out in creating paginated reports you can now go to the OneLake data hub you can right click or click on the ellipsis and directly from there create a paginated report which is

10:32 basically again this is my argument I think there's two ways you want to access data you want insights on data or you want access to data paginated reports are a good way of just getting access to the data so show up build a report make your columns build your things add some filters boom done you have a table of data leave me alone you have access to your data you can go where you want and you can go build your yucky Excel stuff all day long and you can leave me alone so oh

11:04 what do you mean no it happens okay there we go oh that's a shame it happens I do recognize you can't do everything inside Power BI so you definitely need some level of analyzing and mucking around with things inside Excel I do recognize that but I really like being Daniel Otykier style lazy and building systems that just do what I want repeatedly all the time so that's actually a lot more of my favorite there so

11:35 there's the Microsoft blog talking more about that so what do you guys think any thoughts on this one honestly this is what I said last time we actually had an entire episode devoted to paginated reports where they've got to update the system we have the technology where we don't have to be in this archaic Report Builder and you can see that now it's like oh wow it's a really easy UI to drag and drop a visual around you guys know how difficult it was but for those who don't

12:06 know if you ever used Report Builder just to drag a darn image across the screen and get it just where you want it was a horrific experience and it's not because you were dealing with data it was just something from 1968 where now they've probably built in oh I don't know Power Apps into this or something also Microsoft owned so the ability and that's what people always wanted with paginated reports I want to be able to do this invoice or grab this data correctly and quickly so

12:37 we're getting there yeah there's a great comment from Nico on LinkedIn that is rebutting Mike's comment about Excel he posts that tweet from Buck Woody and if you don't know who Buck Woody is you've got to follow him he's Microsoft for life but his little meme is like there's a crab holding up a fish and the fish is labeled entire global financial system and the crab underneath is Microsoft Excel 2013 that's true

13:08 that’s true that’s true some Financial like I made me a finance department person that is not so heavily involved Yes except it’s not even funny I’m not gonna disagree with you and I came from a world where the entire company a billion dollar company ran on Excel we were making decisions right out of excel so I’m not saying that it doesn’t happen it definitely does happen and it definitely is needed If you didn’t have Excel what would mean if you didn’t have Excel what would you do like what would you what would you you have to like it’s got to be there but on the other hand I also think

13:38 too like okay it's good for a single one-off thing it's not something you can automate and that's are you making me walk back on my comments you're making me walk back like I do like it a lot and there's nothing wrong with it I'm just saying of the things I prefer now now that I've gotten my Power BI tattoo I have to say Power BI is the best it is that is probably one of the better images you're going to see and probably devoted to its

14:09 own episode to talk about how much of Wall Street and the financial world still relies on it to this day or whatever and that is not a good episode it's like 99 percent except for the ones like Google Google's probably the only one because they have Sheets yeah software before Excel starts with an L Lotus 1-2-3 yeah Lotus Notes I think that was the other one I believe an IBM product I think it was where's Greg Baldini when you need him to fact check everything because he'll

14:39 know like oh it was made on October 3rd 1990 or whatever it was so you want to talk about the need for data governance I'm reading a few historical finance books and what took off everything in terms of the 70s and the 80s was Lotus I played with this very briefly when it first came out I remember initially working on it a little bit Excel hadn't fully taken over everything at that point anyways I'm reminiscing and having fond

15:10 I’m I’m reminiscing and having fond memories of moving around inside Visual Basic trying to build random automation things inside Excel because again I was lazy and wanted to be efficient and press a button and have a whole bunch of things calculate so wow we’re pulling out some of that we’re pulling out some really good ones here Lotus one two three for dos we have visit calc ASCII charts these are things I have never heard of that’s where this is where you start feeling young again like maybe yeah exactly exactly well especially from the engineering

15:40 world too my dad's an engineer and they do a lot of testing and I'm like what do you use and he's like CSV files because I did that I was like yeah the machine outputs a CSV file great now I have to go parse that dumb thing yeah and all the other data now we're gonna have our development team build an application just to do it themselves so it's crazy it's great yeah

16:10 crazy it’s great yeah I was calling came with I think again I was talking with data goblins Kurt and he was talking about how there’s a big Miss in the it’s in the data collection world of PhD in scientists they they do all this work to collect this data and it just gets stuck in like Excel sheets or random places there’s no database there’s no way to like automate it there’s and there would be a lot more Integrity in research data if there was systems that would could immediately absorb data directly from the machines

16:41 so it can serve a purpose there we could see how it would work so anyways I digress let's move on to our main topic today which I think is also going to be extremely controversial and I'm also very disappointed Greg Baldini is not here for this one because I think he would also have some words for us here as well the article for today is from an individual we talk about a lot on the podcast Matthew Roche who came up with Roche's Maxim transform the

17:11 with roach’s Maxim transformed the data as far Upstream as possible and as far Downstream as necessary so so great article we’re gonna we’re gonna go through Microsoft Fabric and then falling in love with SQL all over again so this is the article for today but really our our conversation today is around okay we have SQL we have this thing called one Lake would fight do you do you put everything in SQL now or do you put everything in the one link

17:41 and use SQL to access information in OneLake and where are the decision and or break points around now we have this new Fabric thing there's a lot more SQL than we've ever had before in place how is this going to change how we work what is this going to look like is that a good summary of what we're doing here Tommy anything else I missed that you would like to add into our topic of today I'll put the article here in the chat window no I think that's spot on I think in terms of what this radical shift is

18:12 going to be I'm finding myself in both macro situations and micro situations for my own purposes but then also thinking about teams as a whole and just how much obviously Fabric is huge but just some of those shifts with OneLake and what it can do and the data flows pushing and storing this and making data analysis so accessible really changes the conversation

18:45 I’m not sure if it changes it I it wasn’t a conversation before I here’s what how I would yes and no I I here’s what how I would yes and no this is where Seth and I typically mean this is where Seth and I typically butt heads over things so Michael will Air a little dirty laundry Seth loves sequel sequel guy came out of the sequel World Michael loves lake houses so we always like where we we we’re continually let’s call it debate

19:16 we’re continually let’s call it debate where should the data live and where where does it make sense for the right the cheapest the easiest way for users and or data engineering things to happen inside that landscape I I will not be so naive to say is oh we’re gonna call SQL dead we’re never gonna use it again that’s not what I’m saying here I think instead that I’ve had this conversation with clients SQL runs the world right you can tell me other languages around the world maybe python is a pretty big

19:47 player in this space but at the end of the day at any company there are people writing SQL there are databases being built and this language is so universal in how it's used to access and manipulate data it's just there

20:01 where I think things are getting to me this new Fabric thing is blurring the line a lot more for me and I'm not quite sure where I should put all of my data what is the use case for using the SQL endpoint versus just straight up dropping it into the lake and calling it done right so I'm not saying it's changing the game to me it's just blurring the line a lot more it's making it more difficult to understand where's the right place to put things because it's getting so similar in my mind

20:32 it’s getting so similar to me in my mind thought Seth oh boy he’s cooking on something good my thoughts are if you air Dirty Laundry make it factual and accurate okay okay well yeah I think I think there’s a significant difference between the SQL language and or other languages that we’ve we’ve discussed versus storage engine and where things are at because I don’t think we’re I don’t think we disagree at all around

21:02 platforms that support big data and those being much better in structures that are similar to what Fabric is supporting in OneLake Databricks etc Delta tables are absolutely the method to go about handling large volumes of data in those spectrums so I think maybe it would have been dirty laundry from four or five years ago but true in any case maybe I converted you a little bit sure sure

21:35 you have led me down the path Mike and I can't thank you enough let's just get it out Mike I can't tell you how much I appreciate you holding my hand down this path no it was I don't know let me express how truly grateful I am I can't stop I gotta say that's so much sarcasm slash s yes well and it was interesting because at the time Databricks was coming out with a lot of the

22:05 stuff around the lakehouse Medallion architectures at the same time I was going through my master's in data science and there was a lot of talk around Spark figuring things out and I was like ugh Spark is such a pain I was building Hadoop clusters and trying to run stuff I was like I don't want to write MapReduce jobs this is horrible it was just awful absolutely awful and then they started introducing me to I don't know what it was called Hive or something like that I can't remember the language of the Spark engine but Spark eventually pops out and says hey we'll

22:36 let you write SQL against your data tables oh okay I can figure that stuff out and that made sense to me so I don't have to write all this crazy Scala stuff over and over again or even PySpark I could just write things in SQL and once that turned on everything got a lot easier for me so where I think things for me have changed is I think the emphasis is less on like I need a SQL Server and a machine to run the

23:08 and it’s in the machine to run the sequel as opposed to now where I’m using I’m either borrowing a compute engine to go access data in Delta tables or I’m using like I now can use Python and Spark to go directly access the information I’m still using the language of SQL to do a lot of data manipulation but I think the infrastructure behind the scenes potentially is slightly changing and it’s really allowing us to have larger volumes of data data better multi-threading because Microsoft I think really botched the whole massive

23:39 parallel processing for SQL it was just way overpriced and not really as cheap or as easy to run as a lot of the Spark things right yeah but that's also one of the reasons why this conversation is even happening we're comparing these right the same infrastructure you had to set up that was extremely complex right building your own Hive cluster Spark engine all that

24:10 that was simplified by Fabric like all these things this is why Fabric is fantastic or tools like it they consolidate it all and even what we're going to talk about today it's not oh my gosh this is so difficult to go create a connection and suck in my data right like I want to create a new warehouse new I wanna create a new lakehouse new done yeah like that's an absurd concept even five years ago yeah and I feel like one of

24:40 my major things every time I talk to people particularly consultants they're like oh we have this old SQL thing okay great oh and we have all these SSIS packages every time someone says that I hear groaning because there's just the gnashing and wailing of teeth yes and people are like we got to get off of this stuff it works it gets things done I don't think it's dead by any means but a lot of companies are trying to say we really want to move away from that to something a bit more modern it's just a bit too clunky I

25:10 modern it’s just a bit too clunky I think nowadays and now with everything being in the web it’s it everyone’s expecting more of this UI based web experience and doing a lot of your data engineering there and Tommy and I have been doing this whole fabric learn fabric series and it’s been very good I think for me to really get my head around what is Fabric and how does this relate to everything I do in power bi and if I had to like put my a big a lens on and look at what at the end of the day fabric is just synapse moving to power bi it’s

25:41 literally that's all they did is take some of the best pieces of Synapse rebrand them simplify them a little bit more and just literally verbatim drop them right into Power BI most everything is there with the exception I think of Direct Lake and I think there's one core technology change here where the data tables you're storing are in the Delta format and this to me is the secret sauce to why

26:12 I prefer OneLake type things or OneLake elements over maybe just straight SQL things like a SQL Server right so I want to make a distinction here for me I still love SQL and I really think OneLake is very powerful but I think Microsoft actually named OneLake incorrectly I think the naming schema the way I've been thinking about OneLake and lakehouses versus the way Microsoft is talking about it it's almost a miss based on my understanding of what a lakehouse really is

26:43 so we'll go in there why is it a miss well okay let me make sure we'll get the suggestion cards I don't know what you'd name it but so the OneLake so again I come from my Databricks world so their whole methodology Databricks pretty much coined the whole idea or concept around lakehouses there may be other people talking about it but Databricks has been very pro on this

27:13 idea of a lakehouse the concept of a lakehouse is you can have n number of storage accounts in a lakehouse all those storage accounts store tables or whatever data you want and then you just basically put that data into that lakehouse in various formats okay so in my opinion that is where all the storage accounts live to me the lakehouse is equivalent to the terminology Microsoft is using right now called OneLake OneLake is a collection of multiple it's a service

27:44 basically it's a service for you that lets you have I don't know how many storage accounts right you can co-locate data in regions you can reference them with shortcuts right so OneLake has potentially multiple storage accounts that are region based that you're just putting data in and Microsoft as a service just handles where the data goes for you awesome I think it makes a lot of sense then you put a compute layer on top of that and then in my opinion the objects called lakehouses are actually

28:14 databases they're actually a database in SQL terms right you have the server in SQL the server is all the compute and storage for any database you would have the databases are what Microsoft is calling a lakehouse right it's a way to collect tables together of a similar nature right here's my enterprise warehouse I'm going to put all the tables for that stuff inside a single database so that's equivalent now to the lakehouse right you may have

28:46 dev, test, and prod servers, or dev, test, and prod databases. You make a lakehouse for dev, test, and prod, and that way you can physically segment access control and objects away from each other — so when I click on the lakehouse that is the prod environment, I only see prod tables. To me the equivalent language here is: OneLake is equivalent to the lakehouse (the storage accounts), and the lakehouse objects that we see in

29:17 Fabric are equivalent to databases, and then you get tables and files inside the database. That’s my mental model as I’ve been playing with the system; it seems to translate to what I know today. And that’s why I think they named it incorrectly. The lakehouse term is too descriptive, too open, for a single object that stores tables, whereas what they’re actually building is something much

29:47 they’re actually building something much more narrow it’s it’s literally a smaller thing of tables and files that’s it which is equivalent to a database does that make sense am I or am I making conceptually yes it does so but I don’t like so

30:03 To say they named it the wrong thing — I’m not going to go that far, but I get it, even in the diagrams they have. Is it because they’re stepping people toward this new paradigm of how to think about data within an ecosystem, where the storage of it is all now the same? All their marketing shows warehouse, lakehouse, KQL DB, dataset as separators in OneLake,

30:34 but all those file formats are the same. So it speaks to your point: why are you calling this something different? Maybe it’s just the journey by which you’re leading people to this new paradigm — because it is one. Then the question becomes: well, if everything’s in Delta format, why do I have four different ways to interact with it? Because you’re used to it, right? You’re used to the SQL warehouse, you’re used to the Synapse data science experience, and you’re used to the lakehouse and Jupyter notebooks. Well, why wouldn’t you

31:05 just unify this all into one thing? Well, maybe that’s where we’re going, Mike — maybe that’s the one thing you’re going to need down the road. On top of that, though, there’s security, and a lot that goes on. If you think about the actual technologies underpinning all this, it’s not like all of that just went away. It’s probably the storage method that is the newest thing — having all of that T-SQL, that SQL

31:35 warehouse now being stored in Delta Parquet. That’s the unique thing — we get this interactive experience between all of these different interfaces into the data. I don’t know enough about the back ends; it’s not that I totally disagree with your conceptual model, but I don’t know if I’d go as far as saying that whole thing is the lakehouse and that’s what you should just call it. And honestly, there’s a big part here too — how technical is this going to get? Mike, I

32:07 messaged you the other day about the same idea: how much of that is going to be necessary for the majority of people who are focusing on Fabric? If you’re coming from the data engineering world, obviously you’re not going to go, “well, let me see what this warehouse in Fabric is” — you might transition. But there’s a whole audience of people who are not necessarily going to, as you said, jump into the Hadoop world, but are going to see

32:37 this as a really great solution to finally store and push data without having to go all the way to becoming a full-on data engineer. This is a really good point you bring up, Tommy, because I think there are two parts to how I’m looking at this. I’m coming into Fabric as someone I’d probably quantify as a pretty solid data engineer — maybe a senior data engineer. That’s my world, that’s what I’ve been building in. As I walk into this,

33:07 I understand what you could build in Databricks, I understand the debugging, I understand the Unity Catalog work we’re doing in other systems, I understand other tooling — and Microsoft is still playing catch-up, in my opinion. Purview’s not there; it doesn’t do the same things Unity Catalog does. There are gaps in the system. So looking at what Fabric is doing from a data engineering standpoint, I think it’s a little bit lacking, and I think data engineers are going to show up and say it’s okay, but it’s still watered

33:38 it’s okay but there’s it’s still watered down too much for what I did an engineer would really want to be doing I think my opinion on this one however when I look at this from the perspective of the of the business developer or or business user who’s experimenting more with data engineering activities I think this is giving them a lot more capability than they ever had before and the fact that they can pick a notebook and actually I’ve been seeing Alex Powers sing the praise Alex Powers is huge on power query he loves it it I’ve seen a lot of tweets and posts from him talking a lot more around man do I

34:10 really love notebooks, do I love Data Wrangler.” That’s a pure Spark thing — that’s writing code. And I don’t think Alex is one to shy away from code, because he doesn’t even use the UI for M, just FYI. He literally woke up one day and said, “I’m going to write M,” and now he just goes to the Advanced Editor and writes it all — he uses that exclusively. He’s on another level, I’ll just say that. But he’s enjoying the notebook experience, and speaking as a business user: I really enjoy using all

34:42 the Fabric things. I like how much more integrated it is. I don’t have to worry about spinning up clusters or notebooks — I literally just click on an object, create a notebook, boom, done, let’s get going. I’m going to make a pipeline? Great, two clicks and I’m making a pipeline. I’m not worried about ADF and integrating things and connection strings; there’s a lot we just get by default. So for me, the perspective right now is: of the

35:12 two personas — the business analyst or business engineer (“business data engineer,” I’m going to call that) and the true data engineer from IT data engineering — I think the person getting the better end of the deal is the business user turning toward data engineering. They’re getting tools that are more reliable and bigger; you can do more as that business user. Whereas I think the data engineer is going to walk into Fabric and go, “where’s my debugging? Where are all these extra

35:42 my debugging where’s all this extra features where’s all these other things that I was typically used to doing and I’m getting down a watered down version of data engineering now not to say it won’t get there but for right now I think they’re getting a little bit less than they’re normally used to so there’s always going to be extreme use cases where complex data engineering tasks are going to have to happen that aren’t going to be supported directly in fabric but fabric is an Analytics tool but it’s an it’s a like

36:12 designed to solve the business intelligence problems of the world. Like you’re saying — and apologies for the back and forth in my thinking here — I don’t disagree that it’s designed for the business. In the same way Power BI came out: sure, it didn’t have the same features we have right now. It’s probably going to be the same thing, an evolution. We’re in preview as well, so we’re not even at a point where they’re saying this product is completely put together and solves all

36:43 the things we’d want it to. Correct — it’s going to keep growing. But on that foundation, how deep do you really need to go? You’re enabling teams — the enablement in Fabric is so much higher because you don’t have to deal with the infrastructure. It’s a service you just plug into and do what you need to do as a person dealing with data: connect the data, manipulate it, transform

37:14 it, and build the things you need to support reporting in an efficient and performant way. Of course there are going to be edge cases. Of course you’re going to need some system that deals with massive volumes of data you need to rip through with hardcore ETL tasks — and great, you can do that, then land it in OneLake and make it a data source for analytic needs. I would be cautious about calling Fabric the de facto end-to-end forever tool for all of these things,

37:44 because that’s not what Microsoft is saying either. I think we agree on that. Tool sets still exist outside this architecture. It’s designed for data that’s in systems and needs to be transformed for reporting purposes, and in that case it opens the door completely for business users. And it’s not like you aren’t getting the core functionality you need to do 80–90% of the

38:14 job you would normally do with data cleansing before putting it in a report layer. You made the point that you think it’s lacking in certain areas — correct, specific to hardcore data engineering — but I’m challenging that: is that a role that belongs in Fabric? If you read Microsoft’s (utterly confusing) documentation around “these are our user personas for Fabric,” all

38:45 of them are data engineers. If I listen to what Microsoft is saying, they’re saying yes, 100%, data engineers will be happy here, you’ll love it, it’s the way to go. I’m just not sure I’m convinced of that story yet — I think it needs more time to mature. But what I will say, Seth, to your point: I agree with you, I don’t think we should be pushing Fabric and saying this is the way everyone must play moving forward forever — all engineers, all

39:15 data scientists, everyone plays in this space. I think it definitely removes some pain points for that audience, and it will get better over time. What I will say, though, is the ease of integration between what Fabric is doing and what’s traditionally been done in the data engineering role: the idea that I can still make Delta tables, I can still employ Databricks, I can still build my own lakes somewhere else, and then basically provide a shortcut to the output of that team.

39:46 To me, that is really the neat secret sauce here. A lot of the IT technology we’ve been trying to move forward over the next ten years, whatever this is going to be, is now in place and usable, and it’s going to make the business

40:02 and it’s going to make the business users life a whole lot easier to integrate and connect directly to what the what the IT team is doing right you’ve already got data lakes and blob storage accounts already started up great no problem provide me a shortcut to that storage account boom all these Delta tables can now appear in my business user space as read-only objects that’s what we want go ahead it you own everything above this this step here give this take give me tables of data and I’ll figure out what to do with it Downstream this is I think also talking about data I’m doing a

40:32 talking about data I’m doing a lot of data governance and deployment stuff right now this is creating much headache for for its and business like there is literally a mindset mindset shift that is happening here where traditionally it just gives you reports here you go walk away away we now have the capability of giving them cubes which some teams also do that’s also been a thing and now we’re going all the way back up and saying hey here’s the raw tables that I get out of these systems and can give you even further access Upstream because the tooling now I can build my own notebooks

41:02 I can build my own data engineering processes and pipelines in the business now. To me this is a whole other realm of things that are going to enhance our experience. Okay — but it’s not anything new. What’s new is how quickly we can turn those things around. I agree there. I don’t need a credit card stuck in Azure at this point; I can turn on Fabric and get a lot more. Isn’t that a prerequisite? What was that?

41:34 Isn’t sticking your card in Azure first a prerequisite? Oh yeah, that’s true — it’s not free. But I’m talking about the ease of it. Before, I had to go buy Databricks, I had to go buy these things, so there was infrastructure that needed to be set up. There’s a lot less of that now in Fabric — it’s just there. I buy one SKU and I essentially get Azure Data Factory, I get security with Key Vault, I get OneLake. There are all these other things that come along with it that I think make it

42:04 less of a barrier to entry for teams. So Seth, you had me up until the very end of what you were saying — I agree with almost everything, and I’ll get to where we disagree. As we’ve been going deeper into the Fabric series, I think what’s been going on is: it’s not that

42:34 the business user — every Power BI developer — all of a sudden has to become a full-on data engineer. That’s not going to be the story, though that’s what I initially thought: oh great, now I also need ten years of experience in PySpark and ten years in Databricks just to do my job. That’s not going to be the case. The only thing I didn’t agree with was at the very end, where you said the majority of things are just going to be for

43:04 solved where it’s just going to be for reporting purposes I agree with half that statement where I really think even take away the notebooks and even the lake house and just give me somewhere I can actually store my data and then I can connect to it another pull up so the business users the power bi developers now what have they been doing for the last 10 years they’ve been taking faulty data and raw data and structuring it the only problem is it’s lived in a data set another Power bi

43:35 report. But those cleaned tables can be used in other places. So these business users now have the ability to actually push and store this data in places the business can use outside of just Power BI — because before, all the logic I was doing in dataflows stayed in that realm. Power BI became a sinkhole, basically. Everything fell

44:06 into it and nothing came out — it was just absorbing, black-hole type thing. That’s a good point. So that is the seismic shift: now I can take the same tools and principles I’ve already applied — give me your CSV files, your poor, your hungry, and I will clean them for you — except it no longer just lives there. This is an interesting topic you’re raising, Tommy. I don’t think I’d actually jumped to this level of thinking: you’re now involving the

44:36 you’re and you’re now involving the business and potentially part of the data Engineering Process hey do some stuff stuff engineer some data spit it put it back in a SQL Server put it back and we’ve literally yeah we’ve literally been programmed to clean and structure tables like for the last 10 years we didn’t even know it true but but so now I have the ability to take all this faulty marketing and then be able to give it back to them in their own systems where they need to connect to it where it can live outside of this power bi there’s a

45:07 majority of things where I probably won’t even have to touch a notebook. Maybe there are some use cases, but I’ve already had the ability to do these things. Sure, I’ll learn some pipelines, and there will be other use cases — but if I didn’t need a notebook before, why in the world would that person need one now? What are they suddenly being introduced to? There may be some cases, but the majority of the time I really think it’s going to be twofold: it’s going to be not just for reporting,

45:37 it’s going to be not just for reporting it’s going to just open up so many other doors where yeah I can push it back to the your SQL database so then we can utilize this so I’m creating power of data flows and I’m doing other things I’ve already been doing but now it has such more access from the raw end I agree with that I think I think this is another use case that that is becoming more enabled by not letting power bi be purely a ingest only method right it actually has the ability of now outputting data to

46:08 certain places. So I want to make sure I understand what you’re saying, because in my mind there are two major differences here. We’ve had this conversation too, especially with workspaces or business units: if they own their end-to-end pipeline and clean data, that’s a fantastic idea. In a world where they have a third-party system, they’re ingesting data into Fabric,

46:39 they’re doing ETL, and the output of that is two things: an object of quality data, and reports on top of it, because that’s the source for reporting. We do that every single day — it’s the same methodology. You’re saying push the clean data as far back as you can, so you can reuse that component of clean data as much as possible.

47:09 We’ve talked about that. If that means I’m giving the business access to that table so they can leverage it, that’s fantastic — it’s cleaned data that’s gone through a process of data quality, governance, etc., to give them value in the business. It’s still reporting data — data that came out of systems and ran through pipelines to make sense for them. Is it in a report? No, but that’s also why we provide data to them in these tables, in reports, in whatever.

47:40 To me, 100% agree: you own the whole pipeline, let the company access these clean tables of data to merge with other datasets or whatever — this could be part of the whole ecosystem of how you govern data within Fabric. I agree with you. But if you’re suggesting that in Fabric we should start pushing data back to source systems — no, I’m not on that train right now, because now you’re talking about

48:10 now because now you’re talking about like like how do other applications work within this ecosystem like to me it’s it’s not a it’s it is not that right and right and to to to that point then everything beyond that like in my in my first pipeline Point means everything we’re doing is for reporting it’s for the business to understand what’s going on in in their ecosystems and where they’re going to correct data problems are in the actual Source production systems which could be third

48:40 party, a SQL Server somewhere else, whatever the case may be. Once we’re talking lake, I think we’re already in the analytics realm — you’re ingesting data into a central repository of information, an area you typically wouldn’t use outside of a reporting purpose, whether internal or external. Your comment hits me two ways — yeah, I see your face. On one hand I totally agree: I’m thinking about the lake, and

49:11 I’m thinking about like the lake yes reporting we’re moving away from online analytical transactional processing Ola oltp to online analytical processing so the analytical like the reporting side I I would agree I feel like I feel very strong there but I’ve had some use cases where people have asked me hey if I did something to data if I blended some stuff with information how could I return it back to a team and give them some more refinement around what that would be at

49:42 how it would be used, so it could be utilized back inside the system? So on one hand I agree with you; on the other, I think that’s the delineation. Okay — source systems that collect data, call it a CRM: it’s not a manual thing like what we’re talking about. If you have a business unit that supplements source

50:02 data with business process — which is what you’re describing: hey, we have some supplemental stuff that isn’t going into our third-party system, but we need it as part of our reporting ecosystem — then yeah, of course, it allows them to enhance data they use to run the business. But if it were integral to the source system, they would figure out how to plug it into the source system, not into this process. That’s a good point, I could

50:33 that’s a good point it it yeah I could see that as well yeah if you’re if you’re really seeing the need for the business to refine or adjust things you there’s another process and where my mind slightly goes here a little bit more I say maybe I would challenge slightly here is if I think about what happens inside the The powerapps Experience right powerapps is essentially it’s it’s building apps on top of data that is in potentially production right it’s Dynamics level data you’re building apps on top of that information you’re editing those real tables

51:04 you’re editing those real tables I feel like that that’s PowerApp story is somewhere around we have one source of Records we have one source of truth that is production and we’re able to get it out edit it do some things to it report on it and then get back in there and make immediate edits back to production to your point though Seth I agree there’s like this other challenge around like well what if a data comes from Salesforce what if there’s another third party tool we have to go get it in and we have to bring it down to our Lake and I think I think while I I agree with your thinking here I I do think there’s an idea in the future here where or Microsoft is trying

51:35 to make it so it doesn’t really matter where the data lives — not sure I would recommend it yet, but they’re going to make it so you could pull data out of a system, do some stuff with it, and then try to write it back to that system. And Alex was saying in the comments: if you can’t get Power Query to push data back out using a dataflow, you could for sure make a whole bunch of API calls inside a notebook and start talking to

52:05 the APIs of things. You could get a bunch of data out of a system, use a notebook to connect to the Salesforce API, and then start writing data back into Salesforce, updating things or whatever. But that’s a different thing — that’s creating a process that says, “I have identified data quality problems, go fix them in my source system.” That is different from making Fabric a source system for production workloads in and out, and I don’t agree with that. That was the point Tommy was making. Well no, I’m not saying it’s a source system —
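The notebook write-back idea above can be sketched roughly as follows. Everything here is hypothetical: the base URL, the `/records/` path, and the field names are invented for illustration — a real system like Salesforce has its own API shape, plus authentication, which is omitted.

```python
import json
import urllib.request

def build_update_request(base_url: str, record_id: str,
                         fields: dict) -> urllib.request.Request:
    """Build a PATCH-style update request for a hypothetical REST endpoint.

    The URL structure and payload are assumptions for the sketch, not any
    vendor's actual API. Auth headers are deliberately left out.
    """
    url = f"{base_url}/records/{record_id}"
    body = json.dumps(fields).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        method="PATCH",
        headers={"Content-Type": "application/json"},
    )

# A notebook could loop over rows it identified as fixable and send one
# update per record; urllib.request.urlopen(req) would actually send it.
req = build_update_request("https://example.invalid/api",
                           "rec-001", {"Phone": "555-0100"})
print(req.get_method(), req.full_url)
```

This is also why the hosts hedge: the moment a notebook can PATCH a source system, you have created a write path that needs the same governance and backup story as any integration, which is exactly the risk Tommy raises later.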

52:35 not saying it’s a source system because it’s you did because if all these things are not for reporting what are they for they’re going to push back to somewhere they’re going to go to another okay hold on hold on don’t you’re not gonna do your magic putting words in my mouth again but I’m not putting words in your mouth yeah so but I I was I’m agreeing completely with what Mike’s saying here in regards to there is I don’t think that’s even that much farther down the road Mike either

53:06 That capability is basically here once people realize it. I feel like there’s a rat’s nest of things I could do — I’m not saying I would recommend it, but you really could do this: bring data out, let the business do stuff with the information, merge it, blend it, do whatever they want with other things, and potentially try to push it back up into systems. Not saying it’s right, just that the potential to do it is becoming easier and easier. Right — when you look at where the Power BI platform is

53:36 in terms of the whole integration story, it’s gotten a lot broader — it can now talk back. I don’t think that can be overemphasized. We have the ability to go store it in a lakehouse, which I think is what you’re saying, but if I need to put it into a database and let someone else do something with it, at least the transformations and that quick turnaround are mine now, where before I had to rely on them through some of the archaic ways

54:07 we had connected data before. There’s a big part here that’s not just ownership. Don’t worry, I am very wary of a lot of things in Fabric too — there isn’t necessarily the protection if things do go wrong. Once you start talking about having source data and data that’s going to go to other systems, you need those layers of backups. That’s why you have people whose job is obviously more than just making data clean — it’s about that

54:38 protection if systems go wrong — my goodness, you get one error and everything breaks. But I digress. What I’m saying is there’s now an ability where, with the same skills I’ve already had in Power BI (maybe expanding a little), I now have the ability that someone in Databricks, or someone much higher up the chain, has had:

55:08 to have data in the same place they have it, and also do some of the same transformations. I’m finally on almost that same playing field — I just may not know it. And that’s different from what you were saying earlier, I think. I felt like you were saying earlier: I’m going to take data out of a system, do some stuff to it — maybe add an Excel sheet, some enhancements, some columns or fields — and then try to write that data back up into the system. Well, that’s what I’m saying —

55:39 into well that’s that’s what I’m saying like so yeah maybe I said it wrong but what you’re saying now it sounds like you’re saying there’s there’s capability the business users are now getting of the tooling we have today we now have capabilities of doing things that we would traditionally see being done by a data engineer well and those those skills now are available to us at this at this level now so yeah I see what you say so let me give a solid example I’ll I’ll we’ll go back okay let’s do an example think about all the these critical mass systems that companies have that are still coming from Excel files and CSV

56:10 that we have the responsibility of putting into the report. The data had to come from these old, dirty systems; it was never going to go to the engineering side, and unfortunately the logic lived only in that report or that model — local, tribal knowledge. Correct. But that was mission critical; that’s what was reported to finance. I can now take basically the same methodology I’ve used, and rather than having it only live in

56:40 this little Power BI world, I can now push it out and actually store it as a table in SQL — and now it has all of the bells and whistles that a normal SQL table would have. So are you saying something to the effect of: I had a system, I would export a bunch of data, that export would live in an Excel sheet, I would do data engineering to that sheet, manipulate it, and then email it out to lots of other people, thereby adding value to that table or information I’d developed. And you’re saying now, do the same thing

57:12 but inside the Power BI space, where I’m writing the data out to a SQL Server — a place the entire organization can get value from that same engineered data. Once Power Query touched something, it was stuck — it was going to stay in Power BI and never going back up, no matter what I cleaned. But now I can

57:42 basically create a SQL table that isn’t necessarily any more special, but it can speak to so many systems — you’ve said SQL rules the world, and now I can actually participate in that. I think I agree: it’s a capability addition to what we’ve already been doing. Well, I do think we hit a lull there, or at least we’re out of time — we could probably talk about this for another 30 minutes
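The “clean it once, then land it as a real SQL table” pattern Tommy describes can be sketched with only the standard library — `sqlite3` standing in for the warehouse, and the file contents, column names, and tidy-up rules all invented for the example:

```python
import csv
import io
import sqlite3

# A messy export of the kind that used to live only inside a Power BI model:
# inconsistent casing, stray whitespace, amounts stored as text.
raw_csv = io.StringIO(
    "Region, Amount\n"
    " east ,100\n"
    "WEST, 250 \n"
    "east,50\n"
)

rows = []
for record in csv.DictReader(raw_csv, skipinitialspace=True):
    # The same tidy-up steps a dataflow would do: trim and normalize the
    # text column, coerce the amount to a number.
    rows.append((record["Region"].strip().title(), float(record["Amount"])))

# Instead of the cleaning being trapped in one report, land it as a table
# anyone (or any system) with SQL access can query.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE clean_sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO clean_sales VALUES (?, ?)", rows)

totals = dict(conn.execute(
    "SELECT region, SUM(amount) FROM clean_sales GROUP BY region"))
print(totals)  # {'East': 150.0, 'West': 250.0}
```

The point of the episode’s closing argument is the last step: once the cleaned rows exist as a table rather than inside a dataset, the same logic business users have applied for years stops being a one-report sinkhole and becomes a reusable data product.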

58:12 easily. I do want to say this has gone incredibly fast. Thank you, everyone — the chat has been absolutely lively, with really good things and comments. I want to call out a couple: Kurt, thanks for showing up. Alex, I really appreciate you — apparently your ears were burning when we were talking earlier about how you brute-force all your M code without even looking at the UI. Apparently that’s your beacon, the bat signal for M. Thank you very much

58:42 to everyone chatting and participating in the chat window; we really appreciate you. Our only ask with this episode, or any of our episodes: if you like what you hear, if you like what you're listening to, or if this is challenging your thinking, if this is making you think more about what this new ecosystem is going to be doing, we really would love you to share it with somebody else. So please write a little social post, give us a thumbs up, maybe subscribe. We really like the feedback and engagement from the community. Please share with someone else that you think might find this podcast either boring or valuable.

59:13 Give it to your evil enemies, right? "Here, listen to this great podcast," and let them doze themselves off to sleep. Or if you have friends, let them know it's great and you love it. So we appreciate that too. Tommy, where else can you find the podcast? You can find the podcast anywhere: it's available on Google, Spotify, and Apple. Make sure to subscribe. If you have a topic or an idea that you want us to talk about, you have the ability to do so; talk upstream, like dataflows are doing now. Go to powerbi.tips/podcast and

59:44 just submit a mailbag. And finally, join us live every Tuesday and Thursday at 7:30 a.m. Central. Do we have a dataflow for our mailbag? We do, don't we? I can make it a SQL table. There we go, let's put all the mailbag into SQL, let's go right back to source systems. I love it, this is amazing. Everyone, thank you so much, we appreciate your time. We'll catch you next time, cheers.

Thank You

Thanks for listening to the Explicit Measures Podcast.

If you want to help shape future episodes, submit a topic idea here: https://bit.ly/3i8LdBo

For the full archive (and more Power BI + Fabric resources), visit: https://powerbi.tips/podcast
