PowerBI.tips

Workspace Design in Fabric – Ep. 248

September 8, 2023 By Mike Carlo, Tommy Puglia

This episode breaks down practical Microsoft Fabric workspace design: how to separate environments, clarify ownership, and keep permissions sane as you scale. If your tenant is starting to feel messy, you’ll walk away with a repeatable pattern you can apply to new teams and new data products.

News & Announcements

Main Discussion

Workspace design is one of those ‘invisible’ decisions that determines whether Fabric feels effortless or chaotic. Mike, Tommy, and Seth lay out how to structure workspaces so teams can ship quickly without sacrificing governance or day-two operations.

Topic: Workspace design in Microsoft Fabric

  • Start with a small set of repeatable workspace types (for example: dev/test/prod or build/consume) and only add complexity when you have a clear reason.
  • Make ownership explicit: who is responsible for the workspace, who can deploy, and who is strictly a consumer.
  • Use naming conventions that encode domain + environment so people can reliably discover the right place to publish and the right place to read.
  • Design permissions from the start (groups + least privilege) so you don’t end up with ad-hoc access sprawl.
  • Use workspace boundaries to reduce blast radius: changes, outages, and performance issues should be contained to the smallest sensible area.
  • Build lifecycle into the pattern: deployment pipelines, documentation standards, and an archival/cleanup path for abandoned assets.
  • Treat workspaces as product surfaces—optimize for clarity, maintainability, and supportability, not just where files ‘fit.’
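
The naming bullet above can be made enforceable with something as simple as a pattern check. This is a sketch under assumptions: the `<domain>-<product>-<env>` convention, the regex, and the example names are illustrative, not a Fabric requirement.

```shell
# Hypothetical workspace naming convention: <domain>-<product>-<env>.
# The pattern and names below are illustrative assumptions.
pattern='^[a-z]+-[a-z]+-(dev|test|prod)$'

check_name() {
  if printf '%s\n' "$1" | grep -Eq "$pattern"; then
    echo "OK:   $1"
  else
    echo "FAIL: $1"
  fi
}

check_name "finance-sales-dev"   # matches <domain>-<product>-<env>
check_name "Finance Reports"     # spaces and capitals fail the check
```

A check like this could run in CI against a tenant inventory export, so drift from the convention is caught before it spreads.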

Looking Forward

Pick one high-traffic area of your tenant and refactor its workspace layout into a clear environment + ownership model, then standardize it as your default pattern.

Episode Transcript

0:01 [Music] Good morning everyone, welcome back to the Explicit Measures Podcast.

0:33 We’re here early this morning with Tommy, Seth, and Mike. For a live audience? Yes, but for our normal... that’s true, we’re at 95 percent of the people. It must be an interesting listen, right? If I’m on a podcast and every time we’re saying "hey, we’re live" but we’re pre-recorded, you’re like, you’re always live. Yeah, it’s early.

1:03 There’s a disconnect here. Welcome to all our listeners. We’re trying to be mindful of our whole audience, all three percent... for all three of you: the two who listen live and the one who listens later. There are at least two listeners all the time, right? No, one of us is talking and the other two are the true listeners. That’s true. Awesome, well, let’s jump in.

1:33 I have a couple of ideas I want to run by you guys, because I’ve been having conversations around how to integrate parts of Fabric with this new PBIP format. I had a recent conversation that I thought was very interesting: how do you use the PBIP format when you’re a Power BI Pro user? Your license is literally Power BI Pro and you’re not using Premium. So I literally had to take a step back.

2:04 I had to ask, what are you trying to do, and how do I have this conversation? The customer was saying: I want to use the PBIP format, I want to track multiple changes, I want to build a data model and have people make multiple changes to the data model and then merge those changes throughout the day, or at the end of the day merge the changes back together. I was trying to really understand: okay, what is your use case here, what’s really going on? In my mind I was trying to formulate: okay, if you need to use Git to track your changes.

2:35 You can totally do that with a Power BI Pro license. Rui just mentioned this month that you can now publish a PBIP project file directly to the service, independently of what happens inside powerbi.com. It’s easier if you have Premium because the Git integration just works: you have the Git button, you can publish reports, you can pull down from it and make changes, no big deal. But now there’s this idea that you can have an entirely separate Git repo that doesn’t even touch Power BI Desktop.

3:05 So your workflow would be: go to Git, or go to your VS Code, pull down the latest version of the code from your Git repo, open the PBIP file, work on things locally, done. It’ll automatically change the files for you. Great, you can check them back in, put them back on the main branch or whatever you want to do, and then you can publish right from the PBIP on your desktop. You still have to publish from your desktop to the service, because you can’t use the Git integration directly in the service at this point for a Pro user.
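
The Pro-license workflow just described can be sketched with ordinary Git commands. This is a sketch under assumptions: it uses a throwaway local repo so the commands run end to end, and the project file, branch names, and paths are hypothetical stand-ins.

```shell
# Sketch of the Pro-user PBIP loop: branch, edit locally, merge, then
# publish manually from Desktop. All names here are illustrative.
mkdir -p /tmp/pbip-demo && cd /tmp/pbip-demo
git init -q -b main
git config user.email demo@example.com
git config user.name  Demo

echo '{}' > SalesReport.pbip            # stand-in for the real project file
git add . && git commit -qm "Initial PBIP project"

git switch -q -c feature/add-measure    # branch for local edits
# ...open SalesReport.pbip in Power BI Desktop, make changes, save...
echo '{"changed": true}' > SalesReport.pbip
git commit -qam "Add new measure"

git switch -q main
git merge -q feature/add-measure        # fold the change back into main
# The last step stays manual for a Pro user: open the .pbip in Desktop
# and Publish, since service-side Git integration needs Premium/Fabric.
```

The design point is that version control lives entirely outside the service; the service only ever sees what Desktop publishes.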

3:35 For Premium, or anything else, you could still just use the normal route and link your workspace to a Git repository. What are your thoughts on this? Have you guys explored this at all, is this something you’ve played with a little bit? Well, to your point, Mike, technically speaking, if I just had access to the repo, I could just push to the repo, and if that’s the right branch, it’s going to show up in the service. It doesn’t show who the user is, though. It won’t show.

4:07 That’s the one trick I think is different here between a Power BI Pro licensed user and anything in Premium, because you can’t link your workspace unless you’re that kind of user. The funny thing here, Tommy, is you’re falling into the same trap I did. That’s exactly where I went: I was like, well, this is amazing, use it! And then I was like, wait a minute, you can’t get it from Desktop or from the Git repo into the service without a Premium workspace.

4:38 Yeah, correct. Technically speaking, if you’ve got people outside your organization working on reports for you, just put them in the repo. Correct, yes. Have them work off the repo, build things there, check things in and out on the repo, and then you’re good to go. Anyway, in my mind I was like, oh, there’s an aha moment here. I think there’s almost a session here around how to work with Git as a Pro user versus how to work with Git as a Premium user, because the process of how you would build and develop things is going to be uniquely different between those two spaces.

5:09 Now, there are going to be some similarities: you’re going to have check-ins and checkouts, you’re going to have branching. This is where I was trying to think about what a team of engineers needs to understand, because Git is brand new to a lot of these business developers. They understand a couple of words, like branching, but let me give you one example. I’m getting really excited here, sorry.

5:39 When you have the model, going really deep into the PBIP format, there is this model.bim file, which describes the entirety of the data model. Inside the report object there’s also a single file called report.json. Between these two objects: anything you change on the canvas, the look of the report, adding visuals, adding pages, the report file will just change. It’s not like TMDL, which is a format Microsoft is eventually going to be using for, I think, the PBIP format.

6:10 The TMDL format breaks everything down into smaller files, which is better for checking in and checking out with Git. The way things are right now, yes, you can use the PBIP format, but when you change something you get basically two massive files that change, and you have to sift through this really long model.bim and figure out what part of the JSON actually changed when you add columns or modify things. Same thing for the report: the report isn’t broken out into small files that are easy to check in and check out.

6:40 So if I make a new page, there’s not a file that says "this is the single page I changed" so I can just add that page to my report. It’s one big file; you’re merging the same report.json over and over again. When I was trying to describe this to the customer, they were like, well, this is great, we can use PBIP. And I’m like, yeah, but it doesn’t really do what you want. It works, and you could definitely come down from a repo, but imagine you have two people making changes on two different branches the same day, and I want to merge those changes back together.
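
The single-file merge problem described here is easy to reproduce. This is a toy sketch under assumptions: the file contents are a stand-in for a real model.bim, and the point is only that two logically independent edits collide when they land on the same lines of one big file.

```shell
# Two branches edit the same model.bim-style file: Git reports a conflict
# even though the "logical" changes (new table vs. new measure) differ.
mkdir -p /tmp/bim-demo && cd /tmp/bim-demo
git init -q -b main
git config user.email demo@example.com
git config user.name  Demo

printf '{\n  "tables": ["Sales"]\n}\n' > model.bim
git add . && git commit -qm "Base model"

git switch -q -c add-table
printf '{\n  "tables": ["Sales", "Customers"]\n}\n' > model.bim
git commit -qam "Add Customers table"

git switch -q main
git switch -q -c add-measure
printf '{\n  "tables": ["Sales", "Margin"]\n}\n' > model.bim
git commit -qam "Add Margin"

git switch -q main
git merge -q add-table
git merge add-measure || echo "CONFLICT: both edits hit the same lines of model.bim"
```

With a per-table file layout (the promise of TMDL), these two edits would touch different files and merge cleanly.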

7:12 If someone adds a new data table and someone else adds a new column or measure in a different table, it’s a lot harder to merge those changes, because the merged changes come through a single file. And maybe not the worst part, but it also really relies on someone utilizing VS Code or Visual Studio, because you can’t connect to an existing branch after the first time. Let’s say I connect to an existing branch: if you’re in source control, you have to go back.

7:42 You can’t do it from looking at just the updates, so you don’t know if there’s a new branch. You mean... are you talking about the Power BI service? Correct, yeah. So that’s another thing that’s different. That’s a different part of the process though, right? If I’m working on something, I don’t necessarily want to publish it yet, because then I’m going to see it in the service. So that’s a whole other layer here too. It’s going to be tricky.

8:12 I’m trying to figure out the best way to go about this, even thinking through where Git lives. Let’s imagine you’re in Premium: you have dev, test, and prod, maybe you’re using deployment pipelines. What does that look like? How does that work with your pipeline? I’m trying to figure out how you make sure that each of the environments is doing code branching correctly.

8:43 Do you have three branches, dev, test, and prod, that match the environments? Or do you have just main, which lives in all of them, and then you only push reports through to the environments slowly so that you don’t accidentally push all changes at once? I think there’s still some work here for people to figure out what the best pattern or process would look like to work through this system; there’s just some more exploration there. I would agree, especially with the level of complexity you’re introducing, especially for business users.
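
One of the two patterns being debated, a branch per environment with deliberate promotion, can be sketched in a few commands. This is illustrative only; branch names and the promotion order are assumptions, not a recommendation from the episode.

```shell
# Branch-per-environment sketch: changes land in dev, then get promoted
# dev -> test -> prod by merging, one environment at a time.
mkdir -p /tmp/env-demo && cd /tmp/env-demo
git init -q -b dev
git config user.email demo@example.com
git config user.name  Demo

echo "v1" > report.txt
git add . && git commit -qm "First report version"
git branch test
git branch prod

# A change lands in dev first...
echo "v2" > report.txt
git commit -qam "Update report"

# ...then is promoted deliberately, so prod never gets surprises.
git switch -q test && git merge -q dev
git switch -q prod && git merge -q test
```

The alternative pattern from the discussion, a single main branch with staged deployment, pushes the same promotion logic into the deployment tooling instead of the branch layout.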

9:14 Yes, and that’s how you started this. I thought you said there was a question in here, although... well, I think the question was more like, have you thought about this, have you experienced these questions? Yeah, I guess this falls into multiple buckets. As a team building Power BI reports, do you need some version control? 100 percent, we’d all agree with that. In a simplified form there’s SharePoint, right?

9:45 I can revert back to a previous PBIX file, or whatever, and redeploy that to change back to whatever prod, or that report, looked like before somebody snafued it. Yep. Where that may change is: do you need to know what the specific changes to the report were? That’s where you’re ramping up into this manual Git process. Yes, potentially, but is it worth it?

10:16 I don’t know yet. Well, right, and the other part of that would be: are you in a situation, and I’ve been there before, where you have very tight timelines, you have a lot of reports to build, and you need multiple people working on the same report at the same time? Yes. So it would be great to say, okay, part of our process is Git, we’re going to break this apart.

10:47 Maybe delineating by pages, whatever, but the changes we’re going to resync every four hours or something, to make sure the model merges together and everything’s working. Right, so this is another challenge as I see it: there is significant ramp-up, and we’re talking heavy dev tools and heavy processes and procedures on the back end that most business users are not familiar with. And I think there are two important things right now. Seth, I completely agree.

11:17 One of the points I was going to make is that Git is not more efficient than normal publishing, but it’s not meant to be. It’s supposed to go through the source control process, the steps, and you can see every little thing that changes. But the other thing is, no one’s editing the actual code, and I think that’s where it gets really powerful. You’re not going through the JSON. I just go right to the code and write it. Yeah, good luck with that, without even TMDL. Yeah, I sure do.

11:49 I can just see you right there: let’s see, I need a new visual, I need to write 60 lines of code to get that to work. Yeah, okay. But still, there’s no efficient way to code right now with just the files. You still have to either use something like Tabular Editor or open Power BI Desktop, and I think that’s where the disconnect is with Git being efficient with Power BI. It is cool, I can see all those things, yes.

12:20 But I’m not really going in right now to review it, and I’m not necessarily going to go build five or ten measures, or build another report page, within the Git files, and I don’t think people will do that until... right, so when TMDL shows up, it takes the single massive file that describes the model, model.bim, and breaks it into many different files that are easier to read per table. But there’s still a disconnect: way better, but no IntelliSense, unless somehow they integrate the BIM model there as you’re writing. They’re supposed to be doing that.

12:51 I believe there’s an open project right now from Microsoft to make TMDL an actual readable format and make it auto-detect what’s going on in there. Okay, I know this was a very long intro. I was just having a lot of thoughts around Premium and Pro users and the integration with Git, and I think there’s a lot of value here. I’m not sure it’s for every project, like you said, Seth, and I don’t think it makes sense to have it for all users.

13:22 I think there’s going to be a very focused area of development that needs to be taught and educated on what Git is and how to best use it. And to your point, Seth, I’m just really nervous about people turning this on, because if someone takes a branch out and waits two days, a lot can change in a file in two days. So are you constantly pulling the changes back down locally?

13:53 There are going to be more problems with this, and you’re going to have to figure out what happens when Git will not let you commit back because there’s a conflict. And don’t forget, you can still publish the PBIX and it’ll convert back to PBIP, so you can completely circumvent it too. Yep. So anyway, enough of that, let’s move on to our main topic today. Our main topic relates a little bit here; it’s more of a governance and administration type topic.

14:21 We are now getting this new thing called Fabric. Fabric now shows up in our environments, and we have the ability to add a data scientist or a data engineer into our workflows. There are these new workflows: we have notebooks now, we have lakehouses, there are more artifacts, I would call them, that now live inside Power BI. There’s now Data Activator, Kusto, Synapse, pipelines, all these extra things we’ve never had. Very good from the business user perspective, because now there are more options for you to build things.

14:51 So business users, get ready: you’re going to need to start learning things. That’s part of this problem. But on that thought, now that we’re bringing more data engineering tools into the Fabric ecosystem, how do we govern this? To me, previously, in the Power BI world, there was this whole dashboard, report, and dataset trio. Those were the three main things I was interested in negotiating.

15:22 Where do I give people access? Do they get access to the workspace, or do they get access to a dataset, or do I publish an app so I don’t have to put people in the workspace? Everything in the powerbi.com experience felt like different layers of access I’m going to provide to people: I’m going to give you access to a report so you can see it through an app, or we’re going to go even further down and say, okay, I’m going to let you have access to a dataset and build your own artifacts on top of it. What we’ve just added is three, four, I don’t know how many more new things to govern and control.

15:54 How is this going to affect how many workspaces we build? What does this look like in the context of the medallion architecture, bronze, silver, gold? So that’s the topic for today. I’m open and curious about your thoughts on how this works, Tommy and Seth. What’s your perspective on what Fabric is adding here, and how do we govern or control it? I’ll kick it over to Seth. What do you think, Seth?

16:32 I think many things are relevant to this conversation, I don’t know how many. What workspaces in Fabric are stretching here is this: if you outlined all the different ways in which Power BI itself governed how we share content with end users, it almost seems like we now have a stretch in the other direction. There are many types of content creators within Fabric that are part of these workspace ecosystems.

17:04 Are we driving more towards having to alter the way we created workspaces for end-user sharing, versus workspaces that encompass many parts of the data analytics ecosystem: the creation of data for reporting, yes, as well as that same reporting?

17:35 So to me, we’re not challenging the way we share content, because that’s still done the same way from a push-out perspective. But are you saying that, because of all these new groups of people that are part of an ecosystem, the way Fabric works across that is by expanding workspaces? Yes, because a workspace really was just a container for, to your point, reports and datasets. Right, a product. Yes.

18:07 And now it’s an avenue across the data estate, for all of it to happen. I think we talked about this a little bit in terms of the roles and things, but are you challenging the idea that we would potentially need more workspaces assigned to the same business unit, to separate out the content builders versus what we have for content sharing?

18:37 That’s a great question. Let me give you a bit more of a scenario that might help round it out. Let’s add just one new persona into our ecosystem: the data scientist. We have this concept of datasets and reports; now we’ve added the context of a data scientist. A data scientist is potentially going to use notebooks and build things inside the notebook experience, but they need access to information and data.

19:10 One of the best-practice guidance items prior to Fabric was: if you’re going to do enterprise data, you probably want a workspace of just datasets. You have dev, test, and prod workspaces just for storing the datasets, and then you pair that with another set of dev, test, and prod workspaces for just the reports. Now I can version the datasets independently of the reports, I can control access for the team that builds just the report side, and then I can give very metered access to the datasets. The datasets and the reports basically have their own pipelines to get things matured.

19:40 Now incorporate the concept of the lakehouse. Do I want my lakehouses to live in the same workspace as the datasets? Does that make sense? Does it make sense to have dev, test, and prod versions of those things? If I’m this data science person showing up to this, what data should they touch? Do they need access to the dataset, the workspace, or the entire lakehouse environment?

20:11 Does the lakehouse, or the lakehouses, however you want to build that, as you build data in tables or in that engineering space, now become part of another pipeline? Before, I was managing six workspaces. Do I now manage nine, because now I have this whole thing of lakehouses that I care about? This is where I’m going: I feel like there’s a potential problem area here. I’m trying to design the workspaces in a way that makes sense.
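
The six-versus-nine arithmetic here is just a matrix of content types by environments. A tiny sketch makes the growth visible; the `sales` domain and the exact three-way content split are illustrative assumptions, not guidance from the episode.

```shell
# Workspace matrix sketch: each content type paired with each environment.
# Adding "lakehouses" as a third content type grows 6 workspaces to 9.
for content in datasets reports lakehouses; do
  for env in dev test prod; do
    echo "sales-${content}-${env}"
  done
done
```

Every new content type you split into its own workspace multiplies by the number of environments, which is exactly the manageability tension being discussed.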

20:41 The goal is not too many workspaces, but easy enough to manage, and controllable enough to give direct access to the things that need access to which lakehouse. Does that make sense, does that resonate? Yeah, and there’s also the whole conversation on the organization of the artifacts. Honestly, with all the things being brought up, there’s a complete evaluation that needs to occur, from the ground up, on the design of a workspace, in terms of not just who but what goes in there.

21:13 Even if you took dev, test, and prod out of the equation, not saying you would, think about the number of artifacts going into one workspace: one notebook, one lakehouse, and a dataflow. Correct. And now try to incorporate reports and dashboards in there as well.

21:31 If we took a step back and looked at it from the ground up: when we used to create a workspace, or what we would recommend, it was usually organized to fit maybe 5 to 15 reports, meant for the collaborators building reports and for the app audience. Those were the only two considerations in terms of what would go in a workspace, generally speaking. Yes, I would agree with that. But again, I’m going back to this data science experience.

22:02 Exactly. But now, what data do they need? Do I build a single workspace that is only lakehouse related? The lakehouse in itself has its own concept of raw data, silver data, gold data. Even in the context of a lakehouse you have this idea of maturing, building, and grooming the data down so it’s more enriched. So do you want your data scientist showing up to a workspace where you have datasets that are in production and lakehouses that are in production?

22:33 Do you want them in there? I don’t think you do. I don’t think you want your data scientist in that workspace. What I’m saying is you may want to give them access to that data, but they may have a separate workspace that’s just a test workspace. And again, I don’t know what they need or what they’re going to do in their job, but that person could want to connect to data in dev, data in test, and data in production to do some analysis, or you may want that one workspace to only touch things that are in production.

23:03 And now that you have this notebook built with data science, how do you work that notebook into the production pipeline? Where does that go? There are all these extra stories occurring now in how I want to provide direct access at various points in this pipeline. Does that make sense? Yeah, I think it starts with the core, and the core is: are we utilizing lakehouses, are we utilizing SQL databases throughout this process? Because you may still be doing SQL databases; you may not do lakehouse.

23:33 And that’s going to dictate a lot. If it’s lakehouse, let’s say... let me challenge you a little bit. I think the answer is correct, but if you’re going to be getting involved with a SQL data warehouse or SQL database, I think that’s still an area we’d call very centered inside IT at this point. Again, I’m not questioning whether you would or would not use them; that’s not my question here.

24:03 I’m just thinking about what a business user can create when they get into Fabric. 100 percent there will be SQL databases we’re going to need to link to and get data out of; that makes sense, I totally agree with that. But I don’t think a Fabric user is going to come in and try to turn on a SQL database, because there’s no capability to do that today inside Synapse. Well, yeah, but it is a SQL database. It’s not, though.

24:33 I’m not sure I would want to use SQL serverless. I think of that SQL Server as serverless: you get a table, you can make databases, you can put data there, but I’m not convinced I want to spend a lot of time on that. My hesitation here is that the SQL database is a more expensive, more enterprise-grade solution. The lakehouse is still very enterprise grade, but I think it’s a lower-cost solution when we’re talking about reporting and the reporting of data.

25:04 I think of SQL as more transactional in nature. SQL serverless makes a lot of sense because when you access the data it turns on, you get the information, and then it turns off and goes away; you don’t get billed 24/7 for that machine. I’m thinking of a SQL server in the more traditional sense: you turn it on and it’s on 24 hours a day, 365. I would agree with you, but I think knowing the situations you’re going to run into also matters.

25:40 We’re talking about workspaces for dev, test, and prod, for final reports. But at the same time, is there anything in Fabric that allows me to easily deploy? If I have a test environment and I’m building things in there, can I deploy to a production workspace? When you talk about test and prod, that’s not functionality I’m aware of. It’s like, hey, we have this one massive environment, everything just lives in it, and we’re going to segment things out.

26:11 But if you’re trying to lock people into non-prod environments, how do they deploy to prod? How would they deploy from one workspace to another? I don’t know, it’s something I haven’t tried yet. I haven’t tried making multiple workspaces, and I don’t know what artifacts move through the deployment pipelines at this point. If it doesn’t pick up everything yet, I would imagine it’s eventually going to pick up everything.

26:42 As you’re talking, I’m going to leave test and prod out of this right now. Okay, yep. Let’s just talk about Fabric for a moment. There’s a concept here: can I make an argument that in this environment there is probably a workspace I’d call core engineering, or core data science, that is going to work across all of the data within my ecosystem? I think the answer is yes; I think that will happen.

27:13 These are the people creating the layers, right? The raw, or whatever you want to call it, the three layers: bronze, silver, gold. Yep. And then what happens, in my mind, if I’m going down this path, is these engineers have access to all the data, and then for the business units, ultimately, we’re going to be creating data marts, or segments of some of this data, or data that supplements what a business unit is doing in their workspace.

27:44 But there’s probably got to be some symbiotic relationship, where we’re all sourcing from the same place of data; otherwise you’re going to find yourself in silos again. Yes, correct. And in that environment, if I’m in the business unit workspace, maybe it’s just a matter of either they have their own engineers, or the engineers that have access to core also have access to that workspace.

28:14 They’re doing the business unit work in that full flow, and everybody has access to everything. In that case you’re enabling business users on the very front end, who may not be aware of these things initially, but you would be opening up the door: oh man, something doesn’t look right in this report, but we’ve cataloged this, we know how it all works, and I have access through all of this, so I can trace where the breakdown is happening, because it’s all part of my purview.

28:43 And I think that’s one of the challenges: as independent services right now in Azure, most people don’t have that. They don’t have access to something, or they can’t do something. Now this opens the question of: should they? I don’t know if they were supposed to have access. Not without training or education, I would say no.

29:15 we just open up the workspace to the existing users that we would have, the normal BI developers anyway, that would be generating the reports? And I would probably argue yeah. Well, Mike, I’d love to enter a question here, because I think this dictates a lot of this: how large is then a single lakehouse? How many reports, in a sense, does it encompass? Is it acting like a single dataset for multiple reports, or are we going to build multiple datasets off of it? Is it

29:47 treated as, like, a department-level lakehouse? Are we creating multiple lakehouses? This is a great question. I think it will also dictate how large that workspace is, or what’s in that workspace. Yeah, but wouldn’t that be something that you would stick in a core location and then you would share, that’s what I’m thinking, those datasets to the other workspaces? Well, and this is where we’re sharing the component, like they don’t need to see all the layers, but do all of those other workspaces have access to the gold layer? Probably, probably. I would imagine, at least in my head, that maybe

30:17 you have a larger, I’m not sure of the size yet, but the lakehouse would serve multiple datasets, and we’re utilizing, I would hope so, yeah, we’re utilizing the shortcut feature to the other workspaces. But again, oh man, this is getting good. Like how many, like 20 lakehouses in a workspace with all the artifacts? From a manageable point of view, I’m not sure I agree there yet. So lakehouses are designed to be super big, yeah

30:47 infinite by their design, right? So the idea here is I need to figure out a way to allow people to, so let me say it this way: the way Microsoft has described the lakehouse, in my mental model, they describe things wrong, though. The language they’re using to describe OneLake versus lakehouse versus datasets, in my mind, is different. If I had to take this from an equivalent SQL standpoint, I would say OneLake is technically the

31:19 the OneLake icon, that is technically the lakehouse. The idea of a lakehouse is this massive thing where all your data lives, you can search from it anywhere, and you’re now giving permissions down to table-level details on things, right? So that is the concept of OneLake. When we talk about a lakehouse the way that Microsoft describes it, I think of that as being the equivalent of the database. So OneLake is equivalent to the server, the lakehouse is equivalent to a database. So if you’re a SQL developer, that’s how my mental

31:50 model works here. The server can host many databases. Now, when we go into this concept of the lakehouse, you’ll have these areas of enrichment of data, and then, the same way that I’m doing metered access to production, I’m thinking in the production environment everything’s read-only, right? Production is a read-only environment for reports and datasets. Yes, you can connect to them, but you’re not making changes on that stuff. You’re doing those changes somewhere else, making sure those

32:20 changes are good, and then bringing it to that production workspace. So I feel like the same thing happens around the lakehouses. So with the lakehouse you’d have this medallion architecture, bronze and silver, they’re usually not in a place, those tables are not ready to go for people to use them, so those areas are usually relegated to only developers. And only after we get things through into a final state, the gold layer, that’s when we want to start exposing things to a broader audience. So again, I’m just trying to think, like, you

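The exposure rule just described, bronze and silver for developers only, gold for a broader audience, can be sketched as a simple policy table. This is a hypothetical illustration of the idea in Python, not a real Fabric setting; all persona and layer names are made up:

```python
# Hypothetical policy (not a Fabric feature): which personas may read each
# medallion layer. Bronze/silver are work-in-progress; only gold is broad.
LAYER_AUDIENCE = {
    "bronze": {"developer"},
    "silver": {"developer"},
    "gold":   {"developer", "analyst", "business_user"},
}

def can_read(layer: str, persona: str) -> bool:
    """True when the persona is allowed to read tables in that layer."""
    return persona in LAYER_AUDIENCE.get(layer, set())

print(can_read("gold", "business_user"))    # True
print(can_read("bronze", "business_user"))  # False
```

However your tenant actually enforces this, writing the rule down once, rather than deciding per request, is what keeps the "only gold is public" promise consistent across workspaces.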
32:52 either throw everything in one workspace, and then manage it so nobody has access to the workspace but you manage everything with access via security groups, I’m going to add the security group permissions to this lakehouse in this workspace, or you give some more capability and you add more contributors to the workspace to allow them to actually work inside those Fabric elements. And so this is where I’m having a lot more thought around this, because there’s just

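The two options Mike lays out, lock the workspace down and grant item-level access through security groups, versus adding people as workspace contributors, boil down to where access gets resolved. A rough sketch of that resolution order (a hypothetical model for illustration, not the real Fabric permission engine):

```python
# Conceptual sketch only -- NOT real Fabric APIs. Workspace roles grant
# broad access to every item; item-level grants (e.g. via a security
# group) are scoped to one item and need no workspace role at all.

def effective_access(user, workspace_roles, item_grants, item):
    """Resolve one user's access to one item.

    workspace_roles: {user: role} -- role applies to the whole workspace.
    item_grants: {(user, item): permission} -- scoped, per-item grant.
    """
    role = workspace_roles.get(user)
    if role in {"Admin", "Member", "Contributor"}:
        return "read-write"   # broad: can touch every item in the workspace
    if role == "Viewer":
        return "read"
    # No workspace role: fall back to item-level grants only.
    return item_grants.get((user, item), "none")

# Option A: analyst gets only an item-level grant on the gold lakehouse.
grants = {("analyst", "gold_lakehouse"): "read"}
print(effective_access("analyst", {}, grants, "gold_lakehouse"))    # read
print(effective_access("analyst", {}, grants, "bronze_lakehouse"))  # none
# Option B: engineer added as Contributor sees everything read-write.
print(effective_access("engineer", {"engineer": "Contributor"}, {}, "bronze_lakehouse"))  # read-write
```

The trade-off in the conversation falls out of the model: option A contains access tightly but means more grants to administer; option B is less admin work but every contributor can see every item.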
33:23 so many more options here now. And if I think about, let’s just talk about dataflows, datasets, and Power BI reports: the report-level access is one layer of access. If I want to give more access, I could give access to the dataset, which means now I have a relationship model, I’ve got measures, I’m giving you other data, it’s still enriched, it’s still better than just raw tables, but I’m giving you some more access. You could go even further and say I’m going to give you

33:53 raw table access now as well. But the table access was, like, things that are coming out of a dataflow, and if you’re not using dataflows then that isn’t relevant to you. So if you think about those layers, you have table, dataset, and report. Now we’ve added maybe one layer before that, which is the lakehouse, or maybe those tables no longer live in dataflows and now you have this lakehouse structure that could have many more tables. And then take that table architecture and say, okay, well, there are different quality-level tables, there’s bronze, silver,

34:23 gold data in that area. So what level those tables are at, it just gets so much more involved here, and you really need to have a strategy around what you care about and what you’re allowing people to have access to, because I don’t think you want to give access to everyone right out of the box on day one, because it’s just going to turn into a big mess of stuff, it’s going to get more messy than it already is. Because, clarify for me, security currently is set up on the workspace

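The layering described over the last couple of minutes, report, then dataset, then raw tables, then the lakehouse itself, is essentially an ordered scale of exposure. A minimal sketch (a hypothetical ranking for illustration, not a Fabric API or setting):

```python
from enum import IntEnum

# Hypothetical ordering of the access layers discussed above: each step
# up exposes strictly more of the underlying data estate.
class AccessLayer(IntEnum):
    NONE = 0
    REPORT = 1      # finished visuals only
    DATASET = 2     # relationships and measures -- can build own reports
    TABLE = 3       # raw tables from a dataflow or lakehouse
    LAKEHOUSE = 4   # everything, including files and lower-quality layers

def exposes_more(a: AccessLayer, b: AccessLayer) -> bool:
    """True when layer a reveals strictly more than layer b."""
    return a > b

print(exposes_more(AccessLayer.DATASET, AccessLayer.REPORT))  # True
```

Thinking of the grants as a ladder like this makes the strategy question concrete: for each audience, decide the highest rung they need, and default everyone else to the rung below.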
34:54 level, right? Well, that’s one layer of security, yeah. You have security at the app level, audiences in apps, that’s one layer of security. Then you go down to the workspace security level, and then you now have security individually on the dataset, so I can give you access to the dataset without giving you access to the workspace. Yeah, but if I’m part of a workspace in Fabric and we have a lakehouse

35:24 and that is part of my workspace, is that user-defined at the moment? I know it’s going to be at some point, isn’t it? I believe you can actually manage permissions down to the lakehouse already, in the same way you can manage permissions on the dataset. So, example: prod workspace, I have a dataset, I don’t give anyone access to the workspace, but I can give them access to have Build permissions on the dataset, meaning they can go build their own Excel sheets. The same

35:53 thing happens at the lakehouse. What’s the experience, and you guys have played with this more than I have, if I build a lakehouse in a workspace and I share access, or share a table, to a different workspace, can that other workspace build on top of the lakehouse? Or is it like you create shortcuts, you create your own repository of data that’s, like, virtual? Here, I know, yeah, herein

36:25 lies my predicament. So I believe, I think the answer is yes, Seth. I think the answer is you could build your own thing. Correct. So this is why I think, oh man, this is getting good. Okay, so shortcuts to other datasets are very relevant, because I can have one table of truth and I can give that table to other team members, multiple, right? But when you’re giving access to a lakehouse, you can then give access, and again, I think OneSecurity might address this more fully in the future, where I can actually give access to an individual table object inside. So it won’t matter

36:57 what workspace you’re in, these objects themselves understand who you are and what exactly you should have access to. Exactly right. But right now, again, we don’t have OneSecurity at the table level, or even the file level, right? So imagine folders in the files, because the lakehouse has files and tables. The files area could have a folder, and you can provide access to a folder for people to then read. And if I look right now at the access permissions for Fabric at a lakehouse level, if you add permissions to a lakehouse you can add Read all SQL endpoint data and Read all Spark data, so I can

37:28 provide read-level access to different compute engines for an individual user. So the idea being, I could have a separate workspace that I do build things in, I could make shortcuts to these datasets and have that appear in my other workspace. So this, to me, at some level, could be very much a problem for organizations, because if you start exposing the lakehouse itself and allowing the shortcut to be made everywhere, anytime you change those

37:58 tables, if you delete columns, you now have a wider impact, potentially, of deleting or removing or adjusting other people’s data in other places. It gets more squirrely. Not really, like, you can’t modify unless you have permissions on those core datasets, you wouldn’t be able to modify them. But you’re in a read-only dataset: if I delete a column and you’re using it, then you have a problem. Oh, you’re saying the downstream effect, downstream of that. Yeah, so there’s two parts of this that I’m torn on, right? Okay, one is

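The downstream-impact worry here, delete a column in a core table and everything reached through shortcuts and datasets may break, is really a graph-reachability question. A sketch with entirely made-up workspace and item names:

```python
from collections import defaultdict, deque

# Sketch of the blast-radius concern above: every item reachable through
# shortcuts, datasets, and reports is potentially broken by a schema
# change in core. All names below are illustrative, not a real tenant.
edges = defaultdict(list)  # dependency -> items that depend on it

def depends(child, parent):
    edges[parent].append(child)

depends("sales_shortcut@bu_workspace", "sales@core_lakehouse")
depends("sales_dataset@bu_workspace", "sales_shortcut@bu_workspace")
depends("sales_report@bu_workspace", "sales_dataset@bu_workspace")
depends("finance_dataset@fin_workspace", "sales@core_lakehouse")

def blast_radius(changed):
    """Breadth-first walk of everything downstream of a changed table."""
    seen, queue = set(), deque([changed])
    while queue:
        for dependent in edges[queue.popleft()]:
            if dependent not in seen:
                seen.add(dependent)
                queue.append(dependent)
    return seen

print(sorted(blast_radius("sales@core_lakehouse")))  # all four downstream items
```

Keeping even a rough dependency map like this is what turns "how do you test that?" from a guess into a list of items to check before a core change ships.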
38:28 okay, from a build standpoint, if I was starting from scratch, right? Yeah, all of us are, right? All of us, clean environment, no mess, brand new companies all over the place. Is the, okay, I have a core dataset, I can share that with people, and then you’re integrating your business user, right, like the people with all the domain knowledge, and you’re building out your data mart, and it comes from the same core data. I think that makes sense, right? Then that belongs in the

38:59 workspace, it’s great. But what’s missing here is, how do I certify that, and how do I reuse what they’re creating in other parts of the environment? Like, it’s almost like what should be happening here is, if we’re creating a lakehouse for the organization as a source of data, and then we’re building out different cuts of it, called a data mart, right, yeah, what’s the path to get

39:29 that back into core, right? So that, hey, great, I want these people to manage it, but they shouldn’t manage it in this workspace silo, right? Like, it should be part of an overall thing that everybody can use. Yes. And it shouldn’t be, unless the environment is just going to work so well that it’s like, oh, we share this content with you, now these people who have created this additional content, with dependencies on this, are sharing this with you, you’re going to need more, money, you’re going to create a third

40:00 version of this, yep, and that’s going to be shared with the organization, unless all of this is just, like, here’s shared stuff that you can use. But then there’s the governance question, let’s quickly go into it: who owns what? And maybe that does work, though, right? Maybe it does work that, like, hey, I have core, this business unit owns whatever happened over here, and they’re sharing this content, and they’re saying this is their certified content out of their business unit. Maybe that does work, because then you have the

40:31 clear ownership, it’s just segmented in a way. I think that makes me nervous, specifically from, yeah, where I built things in the past. But if there is a path here where it’s like, okay, I have core, the workspaces for this business unit, they have their own people, or people within there are using the business unit’s knowledge to create these artifacts, and maybe those artifacts do live there, but you would definitely need to know, in a corporate strategy, that this is how

41:01 you’re going to go about implementing it. Exactly, so that when things break, it’s going to be really challenging to modify anything too. To your point, what if somebody removes a column? What if the update happened in core, and every dataset across workspaces uses the same thing? How do you test that? And now you throw all of this into the product. All right, good conversation, guys. Yeah, confused

41:31 Tommy. I think you’re going to make a couple comments and we’ll wrap here, Tommy’s got to run off to something, so let’s wrap it up. What I will say is, if anything, all I take away from this conversation is there is now more importance than ever to have a strategy. If you’re going to use Fabric, you need to think about what governance looks like, and I think, from your comment, Seth, it’s going to be more important for us to identify data owners or data stewards of these different artifacts, and make sure that we don’t just add everyone as admin on everything, that’s

42:01 going to create more problems. All right, with that, this has been a very quick conversation. Thank you all very much for jumping in here and talking to us about Fabric things and things around Fabric. This is a topic that we will likely have to re-explore over and over again, it’s going to evolve as Microsoft adds more things around Fabric. Our only ask to you, from the team here, is can you please share this content with somebody else? If you liked this, if this was thought-provoking, if it helped you think through Fabric a little bit more, around how you govern and administer your various artifacts

42:32 give it a share, let someone else know you enjoyed this one. Tommy, where else can you find the podcast? You can find the podcast anywhere they’re available: Apple, Spotify, YouTube. Join us live if you want, the majority of you are listening, but you can also subscribe, make sure to do that, and leave us a rating. If you have a question, a topic, or something you want us to talk about, you can head over to PowerBI.tips, our beautiful website, leave us a question and we’ll talk about it, maybe. And then, finally, we’ll see you next week. Sounds good, we’ll see you next Tuesday

43:02 yeah, sounds good, bye

Thank You

Want to catch us live? Join every Tuesday and Thursday at 7:30 AM Central on YouTube and LinkedIn.

Got a question? Head to powerbi.tips/empodcast and submit your topic ideas.

Listen on Spotify, Apple Podcasts, or wherever you get your podcasts.

