PowerBI.tips

The Separation of Data & Content – Ep. 324

May 31, 2024 By Mike Carlo

This episode discusses separating data and content in Power BI/Fabric—how to centralize semantic models while enabling other creators to build reports safely (including the messy reality of roles and RLS).

News & Announcements

Main Discussion

The main discussion reviews an article and then expands into the team’s real-world pattern for scaling report creation.

Key points from the conversation:

  • Separate model workspace vs. report workspace: keep semantic models in a controlled workspace, and let creators build thin reports elsewhere.
  • RLS and permissions: how row-level security still applies, and why “build” permission and workspace roles can get nuanced.
  • Enable creators without breaking governance: a practical way to scale report building while maintaining a single source of truth.
  • Security groups and repeatability: using groups and consistent permission patterns to avoid one-off access management.
  • Why this matters: it reduces duplication and prevents every report from inventing its own definitions.
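
The "security groups and repeatability" point above can be sketched in code. The following is a hedged sketch against the Power BI REST API ("Post Dataset User In Group"), where Build permission corresponds to the `ReadExplore` access right; all ids are placeholders, and the actual HTTP call and Azure AD token acquisition are left out.

```python
# Hypothetical sketch: grant a security group Build on a semantic model via
# the Power BI REST API. "ReadExplore" is the access right that shows up as
# "Build" in the UI. The guids below are placeholders, not real ids, and
# this only constructs the request; sending it requires an AAD token.
def build_permission_request(workspace_id: str, dataset_id: str,
                             group_object_id: str) -> tuple[str, dict]:
    url = (f"https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}"
           f"/datasets/{dataset_id}/users")
    body = {
        "identifier": group_object_id,            # AAD security group object id
        "principalType": "Group",
        "datasetUserAccessRight": "ReadExplore",  # i.e. Build permission
    }
    return url, body

url, body = build_permission_request("<workspace-guid>", "<dataset-guid>",
                                     "<group-guid>")
print(url)
```

Because the grant targets the semantic model itself rather than a workspace role, the same payload works whether the group's members publish their thin reports to one workspace or many.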

Looking Forward

If you have many report authors, pilot the “central model + thin reports” pattern with one certified model, documented permissions, and a repeatable onboarding checklist.
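
The onboarding checklist can be made repeatable with a small audit step. This is a hedged sketch, assuming you have already fetched a dataset's user list (the record shape mirrors the Power BI "Get Dataset Users" response); the group ids are invented examples, and the fetch itself is omitted.

```python
# Hypothetical sketch of a repeatable onboarding check for the "central
# model + thin reports" pattern: given the user list fetched for a dataset,
# report which required security groups still lack Build (ReadExplore).
# The group ids below are made-up examples.
REQUIRED_GROUPS = {"sg-report-authors", "sg-marketing-analysts"}

def missing_build_groups(dataset_users: list[dict]) -> set[str]:
    """Return required group ids that do not yet have Build on the dataset."""
    granted = {
        u["identifier"]
        for u in dataset_users
        if u.get("principalType") == "Group"
        and "Explore" in u.get("datasetUserAccessRight", "")
    }
    return REQUIRED_GROUPS - granted

current = [
    {"identifier": "sg-report-authors", "principalType": "Group",
     "datasetUserAccessRight": "ReadExplore"},
    {"identifier": "someone@contoso.com", "principalType": "User",
     "datasetUserAccessRight": "Read"},
]
print(missing_build_groups(current))  # {'sg-marketing-analysts'}
```

Running a check like this per certified model turns one-off access management into an idempotent script: anything reported missing gets granted, anything already granted is skipped.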

Episode Transcript

0:29 good morning and welcome back to the explicit measures podcast with Tommy, Seth, and Mike good morning everybody good morning afternoon evening good time of day whatever time it is for you I don’t know he trips me up Tommy looks like you’re gonna say something no I’m usually used to such an exciting intro I don’t know what to do on Thursdays I feel like we still need some hook for our Thursdays should we have like a somber

1:00 moment on Thursdays oh it’s Thursday almost time for a weekend to when we stopped working on powerbi nailed it you stopped working on powerbi nailed it my dad on Fridays getting out of the elevator after work with the employees he would go the best thing about Fridays and they would go what John he’d go two more days till Monday that’s the best thing might be the worst thing

1:34 all right let’s before we get into our introductions or the news here let’s talk about our main topic here today we get to pull an article from a fellow MVP Tom Martins or Thomas I guess would be the full name Thomas Martins has written a really good article around the roles in the semantic model so the semantic model and workspace roles really figuring out what is the separation of my data versus the content that looks at the data so really good article very well written Tom I

2:05 believe comes from a very large organization that has lots of users of powerbi in it and so he has to really think about this at a large scale how does this work and what is the right pattern to see here because I’m sure he sees a lot of good behaviors and bad behaviors in the space as well so with that that’s our main topic for today let’s go through a couple news articles Tommy what are you finding for news here yeah so a few fabric blog updates the event house is now generally

2:37 available okay what is event house what is this event house thing you’re saying yeah the event house quote unquote is the cutting edge database workspace to manage and store event-based data it can do a few things as real-time analytics as a scalable infrastructure and really it’s all about all the real time events that you might try to manage in Azure there was the streaming API we had in powerbi and so many other places where data is coming at you fast and what

3:09 Microsoft’s really been pushing and really been working on is actually having a centralized workspace or portal to manage all of your real time data both from the ingestion getting the events and then also managing that querying it and then acting on it so what do you think about this I’m a little bit torn about all the real-time data stuff I think it’s interesting I definitely think there are use cases for it and again I can think of some but I

3:40 when I sit down and think about this I’m thinking to myself I’m not sure if I really understand what’s the large push for real time is this adding enough value to the business or are we just trying to throw a lot of compute at something to get it to building this real time thing I’m not quite sure yet yeah I think right before we went live we were talking about this a little bit but one that came to mind that I wonder if this is just not competing with but making easier within the Microsoft ecosystem is the

4:10 real-time dashboard analysis of systems okay yes right like especially in very large distributed systems right where you have many many services in Azure that are handling events etc and depending on how you set things up in microservices etc there’s a lot of activities that you want to track and/or ensure are operating at a nominal level because there are certain things that hey like all of a sudden you have a

5:11 whole slew of customers using a new feature and they hammer the API and start extracting a bunch of data what is that doing to your systems and the real-time events are like there are dashboards that companies use including ours yep to ensure alerts are fired off or people can see what’s going on and that I think is probably the most poignant one as opposed to some of the more analytical facing things we’ve

5:11 talked about in the past yeah I think that again I keep leaning on the decision point of real time makes sense when people are watching and paying attention and getting information when I have to do actions on data very quick I feel like there’s a miss in my mind where on weekends and in the evenings people aren’t really needing that data to be real time so there’s like a window it’s like I need to be alerted when I’m available and the business

5:42 is actively running but then what’s the purpose of having the rest of the 12 hours or 18 hours that you’re not even looking at the dashboards right does that stuff still need to be real time I need to turn it on and turn it off when I need to I don’t know because I keep thinking about till I see the action I need to take yeah once again though Mike I think you’re leaning a bit into your experience as far as being an analytics person

6:13 right one of the greatest moves I ever made was going from a DBA into business intelligence sure because typically we aren’t the critical 24-hours-a-day seven-days-a-week individual true just because you’re not working during the weekend there are people that do monitor and make sure systems are still working true because there’s a lot of data movement that happens over the weekends right so if you think even in terms of

6:43 large data extractions resetting reprocessing a whole bunch of data things like that when do companies want to do that it’s the weekend so if you’re supporting that like those dashboards are still very relevant because there’s probably people monitoring it’s definitely still niche however I think there is going to be a bit of an expansion of the types of scenarios a big part of this too is you can do real time on fabric items and artifacts so loaded into a Lakehouse and

7:14 it’s very seamless and I think Microsoft’s really pushing that real time data has always been very difficult to pull for a non-developer what’s the SDK you’re using what is the platform you’re using how are you actually gathering that data in you’re going to need to know the schema but now we can there’s actually another blog article that fabric released right after this about getting events which really giving a UI where it’s like hey I want

7:44 to know anytime a new row is added to my lakehouse sure while not the like IT security or supply chain or call center real time need we’ve used a lot of that with Power Automate it was like hey anytime a new record’s added we wanted to trigger some type of action that I do see being a use case and I see that more frequently I think of real time and maybe where I’m thinking of I think of real time as like constantly having emitted events

8:16 throughout time I do think there’s a much to your point Tommy and what Seth you were saying I think earlier is there is this use case of hey I’m going to get a handful of files at this time in the morning every time a file shows up I should process that file or go find that file and go do some stuff with it so I have been doing some experimentation with the blobs you have a blob storage event hub so blob storage has event grid that goes along with it and so you can tap into other Azure Services

8:46 where when things show up you have the ability of saying hey I’m able to detect something showed up here an event is now created new file appeared and now I should go do something run this notebook go run this pipeline so now that you mentioned this Seth maybe this is also a bit of a competitive piece around if you look at what happens inside Databricks they have a very compelling story of do things in batches or micro batches or with one or two lines of code changes you can now stream

9:16 data in here’s what streaming data looks like so maybe there’s a competitive edge here of Snowflake and/or Databricks that I’m not seeing as much or as frequently that this use case is attacking and that could be a very valid use case yeah I think ones that I think about immediately are a while ago when Microsoft like plugged into the iot data and they’re very relevant use cases I think for like delivery services or

9:47 something but it didn’t take off quite in terms of like that type of data syncing with some of the analytics data and I do like how and I got to play with this specific

9:59 flavor of events right but there’s always a need in the past to marry up the live stuff with yesterday’s or previous data right like and the system needs to handle both of those to produce some of the full-fledged reporting that really hits home with I think a lot of the use cases that I’m familiar with I think this is a neat idea I’m looking forward to seeing where this goes another theme I feel like I’m hearing a

10:30 lot here is there is this KQL database I’m hearing a lot of people say hey KQL is interesting you should be exploring it however I’m not seeing a lot of other MVPs or other teams or I don’t feel like I hear a lot of blog posts from the community around the KQL database yet doesn’t mean I shouldn’t be knowing about it but it feels like it’s this new thing that people haven’t quite explored yet and I’m looking for more I guess maybe community activity around where the KQL stuff is going as well so this also seems to be

11:01 tied very well with the real time stuff additionally all right maybe we move over to the next thing I think another part of this that is pairing with this real-time data is you now have the ability to connect and stream events into your fabric environments so there is a get events blog article that just came out Tommy yeah so this is going along with making the event hub generally available again the biggest hurdle for a lot of

11:32 people with trying to do streaming or real-time data is the source the metadata the schema and then what application or where you’re actually going to pull it into yeah they’re making that really easy in the same way that you would connect to a data source and a data flow in the service they’re making this pretty darn easy too and I’ve been testing this out with a SQL database because that’s one of the other things that they’re releasing is not just your normal event

12:02 hub and internet of things but you also have SQL databases we also have Cosmos DB Google Pub/Sub yeah so there’s a ton and for the SQL databases the data capture I think it’s the yeah what is it CDC stands for change data capture that needs to be enabled but basically choosing your source just do some basic configuration and you’re really good to

12:32 go which honestly is more impactful more significant than you may think just to put a UI on that so this is one I’ve been complaining about for a while maybe this is part of my solution that I need to start thinking about I always have this challenge of hey Microsoft I’m doing this listening to new data type thing all the time a lot of work I spend is hey I’m going to batch a bunch of I’m going to go get the data from a server from some location hit an

13:02 API call something and the data will just come to me it’ll show up inside my environment what I want Microsoft to make this easy is just make it current here’s the keys that I care about make it work just update the records where it needs to be updated bring the data in and it sounds to me this is a very close feature to what change data capture is doing so if you have a SQL database if you have some information that’s changing somewhere else you can turn on the change data capture and maybe this also dovetails

13:33 very closely with mirroring yeah so data mirroring is this other idea of like hey I’m going to mirror what’s in that other system it will send me only the change data capture elements and I’ll be able to stitch them back together and produce a table that is actually real as to what’s happening in production that seems really interesting to me and I think that maybe will solve or simplify some of my pipelines potentially just need more data sources like not everyone has SQL or Cosmos or MySQL there’s going to be

14:03 other weird stuff going on so it’ll be interesting to see how that works so it’s another good article out there and the last one here Tommy this is probably you’re probably the biggest fan of this one because you love VS Code like you’re a big VS Code guy I love me some VS Code Gerhard Brueckl Seth you are incredible at pronunciation I’ve also met Gerhard so I’ve met a lot of people and my friend I still have a

14:33 terrible time if you are a VS Code Visual Studio Code user he has created an extension called powerbi studio and I believe just two days ago it was released for version two which has some major new features just version one if you don’t even get the update simply allowed you to look at all your workspaces all the semantic models you could actually do apis off

15:03 of a semantic model or a workspace just simply by choosing it now some of the major new features there’s integration with external tools and powerbi desktop huge I like that one a lot that’s crazy yeah bring down your files get your pbix files there and then open with tabular editor right from the UI that’s smart there’s some memory statistics but there’s going to be a lot more support for the fabric apis I’ve

15:33 had some frustrations with the fabric apis from a credential point of view so I’m really excited to see this but again just that additional new features along with what it’s already been I use it quite a bit and it’s a great way to look at the service and what’s going on so definitely take a look at that I will say this I feel I’ve been looking at the fabric apis over the last couple months or so and

16:05 boy have the number of fabric apis really grown here recently there is a ton of apis now showing up and it feels like every time I look the list keeps getting longer and longer and longer there is a whole bunch of functionality coming to us in those apis now so very interesting there really neat to see them building out all these new API pieces item management administration stuff and then there’s like almost apis for every single feature spark event hubs lake house new

16:36 warehouses ml experiments ml models like notebooks all the stuff exists in here so there’s a lot of interesting things that are going on there inside the fabric apis got to figure out how people are going to use them it’ll be interesting to see what people decide as valuable inside those fabric apis excellent let’s get off the news let’s jump into our main topic for today so our main topic for today is from again we said earlier Tom Martins the separation of data and content so I’ll throw the article here

17:06 in the chat window here and let’s jump into some of this Seth maybe give us a quick overview of a little bit of the article where you feel like this is where we should start talking about this one I’m chuckling here because I really hope people have the opportunity to meet Tom and he goes to a few conferences here and there but he’s such an articulate guy he’s like as you said before he works for a larger

17:36 organization it’s almost like he’s the old guy who’s seen it all yeah and he just lets you talk and then he figures out exactly where he wants to give his nugget of deep experience I would agree this is exactly how I perceive him as well and he writes that way as well which is fantastic because in a world where we talk about all of the things in Fabric and all the ways we can do the thing and different features that have been added he’s very specific about the

18:09 areas that he’s talking about and the benefits that it provides so essentially this article that we’re going to review today is a question that he received because he’s still in the fabric community answering questions and the question is how can users create their own reports but with RLS applied and what we’re going to talk about today is how do you do that in a very specific way and what are the reasons why you may want to and we’re going to bash the idea back and forth excellent I like

18:41 it so we’ve talked about this a few times on our podcast and I know Mike you brought this up in a user group I think years ago about this idea of should we have all of our semantic models in a given workspace and then reports in another yep this is just continued to go on that route and I’ve seen this a lot with also managed self-service where all the power users so to speak would be able to grab the semantic models

19:13 from and so this is yeah I think I like how Tom was very to your point Seth Tom is very clear about what this article is meant to answer the question around I have a content creator they need to create some content but the consumer of that content is not allowed to see all of the data so well not the consumer sorry it’s well technically the consumer of the model that has to build reports because they’re also a content creator that’s

true but they can’t so it’s specific around the security of the data within it yes where like a situation we talk about all the time is we’re the BI team we’re

19:56 creating a model with security and there are needs within the organization of other content creators to build reports off the model but they can only see their slice and let me reiterate what you said to make sure I’m hearing you correctly and reading the article right too it’s not the person who’s consuming the content we’re trying to hide the data from it’s the idea that the person making the reports the content creator the person literally building the next thing it’s

20:27 they’re expected to have row-level security applied as they’re doing the build of said report yeah I’m not talking about an app we’re not talking about a different workspace we’re talking about that individual who’s expected to show up to that data model we want their credentials pre-applied with row-level security so that way when they’re generating that new report it’s already pre-filtered away so they can’t see what they shouldn’t see correct okay clear now and in true chat fashion

20:59 we have hey we’re talking about roles yet again yes we’re talking about roles yet again inside powerbi and how important these are I guess apparently this is a very hot topic for us well so Thomas like lays out the backbone for how do you set up something like this because to Tommy’s point right like a centralized we’ve discussed having a single semantic model and creating thin reports off of it right and I think it’s a great model to extend

the same logic the same data sets the same everything and extend like what you’re building in an ecosystem this I think is a great article to talk about because there’s some nuance here but it starts with the workspace and the roles right like what are the things within a workspace that individuals typically have access to right admin can do anything member and contributor can both publish right content and semantic

22:01 models and viewers can just view but within that workspace right we can’t assign any of these roles to a contributor that has to build something and have row-level security applied within that workspace right so how do you set up this scenario where I have a semantic model that people can leverage and extend and build their own reports where RLS is applied but that’s the basis for it is well you

22:31 can’t do that in just the singular workspace in that way without all the content creators building their own stuff and seeing all the data there Tom has a really good diagram you kind of have to really look at his diagram that he’s provided to you in the article and great because this is a podcast you can’t really see it okay so we’ll try to describe it I guess a little bit here so definitely if you are listening or running or doing something activity wise we encourage good job keep going but when you get

23:02 back to a computer you should definitely go click on the link in the description and go hit up the image that is produced here because there are two distinct workspaces and I think something that I find many people stumble on when we’re talking about workspaces and who has access to things you can get access to semantic models without actually having access to the workspace the workspace is intended for people who are building stuff inside that workspace and I think this is a good example of this where a viewer

23:34 of a workspace so if you add a user as a viewer they can see the models they can go in there they would be able to connect to the model but the row level security will apply to their persona but they can’t publish to that workspace they have to publish somewhere else so you could get people viewer access they’ll be able to see things but not create stuff and then there’s the other pattern of okay well that’s potentially one way of getting a content creator to create content but

24:04 what do they do when they have that file like once they’re done with the desktop or whatever they’re building where do they put it and so you have to give them access to another workspace potentially that would let them publish that content somewhere else so this really in my mind I don’t think I really grokked this whole concept of the content viewer has row-level security applied I don’t think I knew that actually I very rarely use the viewer content I think the pattern that I use most

24:34 frequently is giving security groups build permission to a semantic model that lives in a workspace and then giving the users a different workspace to work off of do you guys use this pattern at all today so really the biggest application again went on from individual models rather than giving access to the workspace usually works again with managed self-service a lot of times you have access to build off any semantic model

25:05 in let’s say a certified or gold level workspace that the BI teams created and all the power users and people creating thin reports would do that here though Tom’s also pushing that this is the right way to go regardless if you’re doing row-level security I’ve only really seen this more mainstream with managed self-service at different organizations where they already have managed self-service set up

25:36 so I want to ask you guys is this yeah so it works great with self-service but for other types of organization structures is this also something that you should do regardless I think can we table this question and walk through like the final solution yeah because the nuance to this implementation I think is important because yes he’s using multiple workspaces to deploy but he’s also

26:07 permissioning people directly in the semantic model itself right so this approach I haven’t had to implement this so this is a great article for me to read through because the nuance here is rather than granting roles and permissions at a workspace level it’s done on the semantic model level so that these individuals can access that and see that and he walks through like how do you manage access within the semantic model what users are you going to apply to it etc and that’s

26:38 the key part of like okay great you have a content creator creating a semantic model the semantic model gets deployed to a workspace that then gets permissioned right at the semantic model level and then these content creators can access it with row-level security so wanted to just carry through that full thought of the article because it’s important because I didn’t even like I said I don’t have this use case where I have RLS on content creator right it’s just on consumer and there’s the

27:09 bit of nuance there so back to your question like here’s the big hurdle here I think with this solution is usually in this self-service structure I have a workspace that has some marketing semantic models that the BI teams created and all the power users from marketing can go ahead and take a look at what they want to build well in this context or the argument here in the article is going through like well you have to remove access from the

27:40 workspace but per semantic model give that build permissions in order for the row-level security to follow through I feel like that’s pretty hard to scale right so any new semantic model we’re just individually giving build permissions so I like where your question is going so I think this really depends on the size of the organization so again knowing

28:10 that Tom comes from a very large organization I think the larger your organization becomes the more you want to focus on a centralized team or centralized data models or data sets that are going to be curated by a group of people and you’re setting up patterns that enable that central designed model to be distributed to many people so to your point right we are talking like self-service to some degree but you may have a team that’s large enough and again I would

28:41 assume when you get to organizations that have lots of data are working with lots of customers that have lots of information you might have a team of people that are just content creators but there are either internal policies or regulated policies that require that there is clear separation you can’t see all customers data in a centralized model so you have to think about where that row-level security element is being applied for those content creators so to your point Tommy does it seem clunky to be

29:12 able to have to manage those individual data sets or semantic models and making sure that the security groups are applied correctly yes it is more management but I think what we’re talking about here is we’re talking about an organization that has a lot of people doing a lot of things and we’re trying to build these reusable semantic models for a large scale of people and I think of it this way for every one admin of your tenant or admin of a

29:44 workspace whatever you want to call it you’re going to have five or 10 maybe data modelers and for every five or 10 data modelers you’re going to like 10x that you’re going to have hopefully for five there’s going to be like 100 report consumers right so you

29:56 the surface or the number of people that are going to be consuming things I think gets larger when you get more towards I’m a pure consumer or I’m a pure data model administrator creator that number gets less and less to the left-hand side but then it gets more and more towards the consuming content side so to your point Tommy I don’t know if there’s an easier way maybe using things like VS Code would help us VS Code is there in Gerhard’s

30:28 example and his thing is there the ability to add permissions to things will it be fast easy maybe at some point you actually get to a point here where you’re just automating a lot of this there may be things here where you’re no longer using the UI in powerbi.com because it’s getting to be so big that you have to have automated API calls that are just adding security groups to semantic models where you need them versus on the workspaces I’m finding my opinion here the larger your organization becomes the more you’re

30:59 that, to make sure you're getting a consistent experience deploying the same things over and over again with regular content. And therein lies the problem, though. Let's say again I have one model — I have ten marketing semantic models in a workspace, we're doing self-service, sure, and for one of those models we need to apply row-level security. Okay, so the only solution is that I then have to do one of two things: I'm removing everyone's build permissions from the workspace,

31:30 because again they have access to any marketing model, or I'm creating a separate workspace for the row-level-security-specific semantic models, right? Because I'm already thinking I'm doing that at this point. Right — so I think these are strategic decisions that you're making at your organizational level. This is where the center of excellence really has to weigh in; I think this is what your company policy decides. I don't even think it's company policy, though, right? If we break it down,

32:02 how are you approaching reporting in the organization, right? And Tommy, I think what you're describing — you just said how you're reporting in the organization, so it's not an organization decision? You just made an organization comment. No — okay, wrong words. I got hung up on where things are in general, right? I think it is company size, but the reason I made that comment is because I don't think it requires a

32:32 center of excellence. Okay. What I'm saying is: how you go about approaching the implementation of reporting — willingly, intentionally or unintentionally, right, there is a decision being made — there are two different ways. Yeah. And what I'm seeing here is the first approach is very self-service. It's all the stuff we talk about all the time: getting insights to people, data and reporting they didn't have access to — they now do — we're creating

33:03 workspaces that they can work within, etc., etc. That approach is focused on getting people access to data to bring insights. Sure. What we're ignoring completely in that approach, typically, is what Tom is not ignoring, especially in an extremely large organization where, Mike, you absolutely have bodies of people that are making decisions specifically around data governance. Yeah.

33:35 And what I would argue is this is an example, or an approach, of a well-governed organization. Because just like Fabric has a future promise that it does not matter where you go — essentially we know who you are and what access you should have — in that vein, right, it doesn't matter who you are in your organization, or what reporting tool

34:05 or workspace, or wherever you go: you should never see data that you are not permissioned to see. Correct. And this is a design that allows self-service, right, in an environment where nobody has to worry about it. I'm attaching to the semantic model and I'm building a report, and it's useful for me, it's useful for my team, and it's even more useful because it's all predicated on row-level security, right? So that, I think, is —

34:35 you're going to have a challenge, Tommy, with how things are set up as an organization evolves and says, well, we now have this expectation that everybody can't see everything. How do you go fix that? Because you're going to have to, when that shift happens. But that's how I view the different approaches. I don't see it as a problem per se. Yeah, Tommy, I see it as an inevitable outcome of where an organization is in their data culture process, of

35:06 a CoE understanding that every place is an opportunity for somebody to see something they shouldn't, as opposed to seeing everything. And yeah, there are certain very specific use cases where row-level security is, we know, applied, right? Yes. And that's why I think — speed to getting things out — it's not like we're saying business intelligence folks don't care about security, or about people seeing data that they shouldn't. There are

35:37 absolutely types of data that instantly fall into that bucket, but how do we typically manage it? Probably by the audience: the workspace, who has access to see the data. Yeah, or we're applying light row-level security on top of it, depending. But in systems where the default is security, right — data-governed security — it starts from the central locations and goes downward. It's not an expectation that it's managed way down

36:09 on all the levels of people who are now content creators sitting in the business units. Makes total sense. There's some conversation in the chat here as well. I made the comment of, hey, we should ask the CoE, and Enterprise AR says, yeah — "you should ask the CoE" — and then he goes, "this comment is me looking into the mirror: hey CoE, what should we do here?" Yeah, exactly. So my comment

36:39 around this one is: yes, I understand that most or many organizations may not have an established center of excellence, but you don't know what you don't know unless you're talking to people that are thinking about this stuff, right? What are the implications of this? Maybe Explicit Measures — maybe the podcast here — is some part of an outside voice that's helping you. You're welcome: here's your CoE, now you have three more people to bounce ideas off of.

37:10 To some degree, though, it's helpful to have other people in that administration space to start thinking about things like this. This is why we like consuming content, or listening to people like Tom, who is administrating huge amounts of things for his company, right? It's wonderful to sit down, listen to, and really digest what his company is doing, because this is a pattern that he's finding is working. We need to understand what the pattern is and whether we should apply it to our center of excellence or our team. And I think these are the

37:40 conversations that we're not having on the Microsoft blog; these are not conversations we're having on other sites. So I think this is why this is so relevant for me — it's really helping me tease out, and think through, the implications of: okay, how do I build content with security enabled at the data level? This is great. I'm going back to something. As I'm reading the article and hearing what you guys are saying: we've talked a lot about the structure of the workspaces, but let's talk about the

38:10 persona here — the person who's actually building the content, not the data. Because I think we'll assume that the data builder is probably going to own, to some extent, the row-level security — obviously the core model. But I think there's a gray area, because when we say "the content creator," right, in what type of fashion are we saying this? Is this someone who is

38:41 doing Power BI ad hoc, on the fly? Is this one of those power users? I think that really does change the conversation in terms of what type of access and, more importantly, security. I think there is no one-size-fits-all with a content creator. I feel like I'm going to pick on something here, Tommy, that you're not going to like so much, but I'm going to say it anyways. Tommy, you're a very big proponent of: when there's a central BI team, that team is creating all the content and designing it for

39:13 everyone to consume throughout the organization. So I really feel like this is a scenario where we're pulling that security boundary further up and saying: look, we're going to build a process that's going to allow people to connect to the data, but we are not going to own the report creation element of what you're doing. So here are our standard measures, here are our standard datasets, this dataset is certified, we're going to build this thing. And I think the reason you build

39:44 this model the way we're talking about here is the reason I want to build a model with lots of data that some people can't see. The question, as I'm thinking

39:53 this out loud — I'm literally real-time streaming this out loud — the question could also be asked: why would you build a model that has data in it that some people can and cannot see? What's the advantage there, right? So that's maybe a pivotal question you have to answer first. I think you do that because, for every domain or customer or whatever, you have a model with clear segmentation, where the structure is the same and the data is

40:24 similar for many different audiences. So I think of the idea of a model that contains multi-tenant information, right? It's one big model with many customers' data in the same model, and there are sales representatives, or people, that are only able to see the customers that they belong to — something along those lines. So the idea is the structure of the data doesn't change, and as a strategic decision point, I don't want to manage one model for every single customer. I may have thousands.
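The one-big-multi-tenant-model idea Mike is sketching is exactly what dynamic row-level security enables: in Power BI the role filter would be a DAX expression along the lines of `[SalesRepEmail] = USERPRINCIPALNAME()`. As an illustration only — the table and names below are made up — here is what that predicate does, mimicked in plain Python:

```python
# Illustration of the dynamic RLS predicate: one shared multi-tenant
# table, filtered per signed-in user. Data and names are made up.
from dataclasses import dataclass

@dataclass
class SaleRow:
    customer: str
    sales_rep_email: str   # maps each row to the rep allowed to see it
    amount: float

FACT_SALES = [
    SaleRow("Contoso",   "ana@corp.com", 1200.0),
    SaleRow("Fabrikam",  "bob@corp.com",  800.0),
    SaleRow("Northwind", "ana@corp.com",  450.0),
]

def rows_visible_to(user_principal_name, table):
    """Apply the RLS predicate: each rep sees only their own customers."""
    return [r for r in table if r.sales_rep_email == user_principal_name]
```

One model, thousands of customers, one filter — instead of one model per customer, which is exactly the management burden being described here.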

40:55 Okay — that's just not helpful. So how do you do this? Now that I'm saying this out loud, I feel like Alex Dupler has had a very similar challenge with some data things he's been doing, where they have such large data that he has to think about many models and manage them all separately for each team of people. Right: I have a huge amount of data, I have to send it out to these different sales teams, and these data teams have to

41:25 have their own models because the data is just so large. So when you listen to Alex talk about this problem, he's had to build a whole automated system to create many different models in an automatic way and deploy changes to those models for all these different customers. It gets really complicated if you have to take every single customer and make a single model for each one of them — that's hard. So where I'm going with this is: I'm

41:55 thinking to myself, the reason you build these bigger centralized models with row-level security is because it's easier to manage the one big model, and then all I need to do is carve out the data differently for each team member to build their own stuff. Sorry, there are a lot of thoughts in there that I'm thinking through in real time. Did any of that make sense? No — I think that's great in that particular situation, but I think you're talking about something that, while huge in enterprises, is a little more

42:26 niche. Not every organization is going to have that cross-client, or external-customer, scenario, so to speak, right? So I'm trying to think about this — and this is where I'm coming from — if I'm going to really universally apply this approach, where my content creators know to go to the data hub, the OneLake hub, and I have access to certain certified workspaces or semantic

42:57 models to build off of, and then there are certain ones that I have individual access to with row-level security — that, to me, would be the universal approach for any content creator. Whether it's someone we call a power user, who's building report-level measures and knows all the cool features in the Power BI service, or it's someone who, like I said, is a little more ad hoc and just knows how to drag and drop —

43:27 regardless of whether you're in this huge industrial approach. Because — like Alex Dupler, I know exactly the situation you're talking about, I think it was a series of tweets — that's more data than we can imagine, but that's not the case with what we'll call the typical or common semantic model, the common boilerplate. So I would argue, honestly, different than what you said: I wanted the centralized BI team to own the

43:58 report and the model. I've changed my tune with that, I know. But I will say, when you are creating the model and you're also creating the report — if you're too far removed from those needs, you lose that collaboration around how we're going to build the model, not just from a modeler's point of view, but also what the users are needing. When I'm at that forefront of the

44:29 technical side and understanding, okay, here are the metrics they need, here are the situations I was not aware of that I learned from the stakeholders — if I'm just a semantic model builder, I'm not front-facing with the stakeholders; I'm really only working with the content creators, right? Yeah, but I think this is the difference too, and it goes back to my previous point of where you're at as an organization. Because if your focus is data

44:59 governance and semantic models, and you're applying row-level security so a bunch of content creators can go create things, I'm also making the assumption that you've probably got some pretty solid warehouses or data marts or things that are the standards for the organization. Yes. Right — we're not in this realm of "I don't know what metrics you need and I'm going to build them one-off." I think these are the

45:30 sources of information that are just used throughout the organization, and somebody wants to go build their own stuff. Yes. Does that mean every single semantic model fits this mode? I don't know — I don't think so. You're always going to have business cases where somebody's spinning up their own thing, or you're still in the same realm you're in right now. Because — I guess what I'm hearing from you

46:01 is: do we have to apply this architecture to every single workspace, semantic model, user, etc.? No, I don't think so. I think it is separated out to some degree, and these would be managed datasets — managed semantic models owned by teams, right, that others can use — but the specification is that there's row-level security applied to them. I don't think you roll everything onto

46:32 this, so you still need self-service. This is true — and Tom does speak to this. Actually, in the first paragraph he says there are more reasons to separate the data from the content that are not due to RLS; that's going to be the part-two article that's not out yet. Sure. And you're absolutely right, but I guess my concern, as I'm thinking about this, is: the row-level security situation makes a lot of sense — perfect sense — but does that lead to a

47:02 poorer experience for a content creator? And this is where I say it depends on where the organization is. All it takes is: hey, somebody saw something they shouldn't have, they shared it, and they're fired — and we are now, as an organization, going to take data security as our number one priority. Yep, everything changes, Tommy, right? Every architecture you ever had means

47:32 it goes through this model. What's fantastic about this article is we now have a path to go do that, right? But you do have data security and governance, and things you can and can't do, that will put roadblocks into just your random implementation of Power BI, because you have to follow policies, you have to follow structures. And if one of those things eventually, in an organization, is "anybody who's accessing data in our system should never see something

48:04 they're not specifically assigned to see," then these are the types of solutions that you need — and they are supported in Fabric and the Power BI ecosystem, even without the full OneSecurity thing yet. And this is where — I'm not sure how to articulate this point, but I feel like you're making a lot of solid points, Seth, around the maturity of the organization. To get to this level, there's probably already been some muscle memory built in the organization to do

48:34 this, right? I think there are two approaches to how you can look at this article. One: I'm brand new to Power BI, we're just trying to figure things out, this is interesting, maybe we should apply some row-level security — so there's this new approach. But there's also the approach of: hey, we're moving into Power BI from other systems — it may have been Qlik, it may be Tableau, it may be other reporting tools — and you've already built solutions that use those tools, and now you're transitioning into the Power BI world, but you already have data marts, you

49:05 already know your data, right? Not every organization is trying to figure out what their requirements are for what reporting should look like. This may be a very well-known entity. I'm working with an organization right now that has many, many years of how they want to have the data reported. They know exactly what measures and dimensions they need on every single report, because they have existing reporting that they're using today, which they're migrating off of into Power BI. So in this example we have a very well-known "here's how we think the

49:35 data needs to be presented," and now we have to articulate that into: okay, let's transform our known experience of warehousing and data modeling into Power BI, row-level security, and now thin reports and models. One thing I would point out in

49:52 this article: I like how Tom has listed that there's a data workspace and there's a content workspace. I will say, me personally, I don't use that language with customers. I talk more about the semantic model and the thin report — that's just the language I use. I think we're saying the same things: in this world I have the semantic model workspace and I have thin report workspaces, and I think it's the same concept of what we're trying to get at here as well. Anyways, my opinion here is it seems like
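The model-workspace/thin-report-workspace split described here is also scriptable: a thin report living in one workspace can be pointed at a central semantic model using the Power BI REST API's "Rebind Report In Group" call. A minimal sketch under assumptions — the IDs are placeholders, and token acquisition (e.g. via MSAL) is not shown:

```python
# Sketch: rebind a thin report to a central semantic model via the
# 'Rebind Report In Group' REST endpoint. IDs are placeholders;
# token acquisition (e.g. MSAL) is not shown.
import json
import urllib.request

API_ROOT = "https://api.powerbi.com/v1.0/myorg"

def build_rebind_request(report_workspace_id, report_id, central_dataset_id):
    """Return (url, body) pointing a report at a new underlying model."""
    url = f"{API_ROOT}/groups/{report_workspace_id}/reports/{report_id}/Rebind"
    body = {"datasetId": central_dataset_id}
    return url, body

def rebind(token, report_workspace_id, report_id, central_dataset_id):
    """POST the rebind; raises HTTPError on a non-2xx response."""
    url, body = build_rebind_request(report_workspace_id, report_id,
                                     central_dataset_id)
    req = urllib.request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    urllib.request.urlopen(req)
```

This is one way the "thin reports in their own workspaces, models in a controlled workspace" pattern stays repeatable as the number of reports grows.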

50:25 it's good language — that's just not the language I use when I talk about models and thin reports. Well, we're always talking about how we need to redesign our workspaces in Fabric. True. So I guess, to make one last point about the differences in where organizations are at and where this is relevant: if you were to listen to us today, and you have a small organization that has no

50:55 visibility into data right now, and you were like, "security and governance is the number one important thing, so I'm going to take more time to set up this architecture so that we can create these semantic models with row-level security so that nobody has access to data they shouldn't" — the answer is likely going to be something like: "I'm glad you're thinking about that, but I can't see anything right now. Let's solve that in the future, but right now I really need to make this important decision, and you need to build me a

51:25 report — just share it with me." And this is the evolution, right? Get people the insights they need to make good business decisions, versus: okay, we have access to all the data, we actually have a lot more users that want to get access and start building things, and we need to secure this because we don't know who's going to be accessing all this stuff. Once we say, yep, this user group can now

51:55 go get access to this data — we need to solve that before we open the floodgates on our organization; please set up an architecture that's going to support that. Those are two wildly different points on the roadmap of where you want to be in your organization as it relates to data and reporting. Yeah, and that's also assuming, to Seth's point, that you already do have the CoE and a pretty solid process set up. You would have to — no,

52:25 I think it's something that anyone who listens to the podcast could ask: okay, well, where am I in my organization? I'm the single person — if our organization is going to grow and scale, here's the roadmap of how I'm going to drive reporting forward, right? Are you the individual that's just building reports ad hoc as the business needs them, or are you the driver for making business intelligence better within the organization? So one could argue, "it's not my job," whatever. I'm just saying

52:55 there is a path that you should set up and have, if you are the person — or the leader of a team, no matter how big — saying: here's where I want to go, and here, incrementally, are the things we need to work towards to get there. And this is well before — I could have ten steps before I get to "we need to establish a CoE," right? And that really could just be: our CoE is going to be, as one of our

53:25 listeners put it, me asking myself in the mirror, "hey CoE?" It could be that, but it's the organization recognizing the person, saying: hey, here are the things we're going to run past you, and the processes we're following, based on your advice and what you're building for the organization. It doesn't have to be the 20-person CoE in the 10,000-person organization. I'm just saying there are steps to building the blocks of things you

53:55 need to support your solutions, the broader they get within the organization. There's a really good comment that came up here — I just want to close this up as we wrap up with final thoughts. Adrian says: go Google "Power BI usage scenario diagrams." And this looks to me like either Kurt has done this, or it's initial work that was done by Melissa Coates, talking through advanced scenarios. I put the link in the

54:26 chat window here as well, and I'll make sure I include it. This is really helpful. It talks about: are you doing advanced model preparation, customizable managed self-service, departmental BI, embedded for your customers, embedded for your organization, enterprise BI, managed self-service, on-premises reporting, personal BI — there's a whole bunch of patterns for the things we should be thinking about inside the diagram of Power BI, and there's

54:56 team, departmental, enterprise — it goes through all the different levels, and then it even talks about some of the advanced things you'd be doing. Don't feel like you have to do all these architectures; you kind of have to pick which one will work for your organization. You may start at team BI, work up to departmental, and then eventually move yourself into enterprise as the solution grows. But not every company is going to have the same build or structure, which is nice about Fabric — but it also hurts with Fabric, because it means you now

55:26 need to know the different versions of reporting and what works best for your organization. So anyways, I thought that was a very relevant link to put out there as well. That's a cool link. I'll give my final thoughts here. I think this is a great article — I like where Tom's going with the managed self-service pieces; definitely worth a read. I do think this is more for larger organizations that have needs around larger models for many people to consume, where you're trying to delineate

55:57 between different content creators based on their permissions to data — so very relevant, super helpful. The only change I would probably make is I would call it a model and a thin report, so we have model workspaces and we have thin report workspaces. Very neat article — really liked it, and I'm already using this. I'm actually teaching this to organizations today that are growing into it: they have other systems and they're trying to figure out how to leverage this, because they do realize those bigger, larger

56:28 centralized models do make sense in certain use cases. But this also pairs with team and departmental BI as well, so not every model you're building always fits this. Tom, any final thoughts? I think the biggest thing goes back to training and awareness, however you roll this out. To your point, this is more than just self-service. I had no idea those diagrams existed, and it looks like Kurt

57:00 designed that — a great place to start in putting together the plan. First off, to sum up everyone's point: the training, the awareness, and how you're going to roll it out are going to be pivotal, because the technology alone is not going to solve it, no matter if you separate them or combine them. It's making sure people understand where they need to go and who's in charge of what. But I really do love the model workspace and the thin report workspace for more than just

57:30 self-service. Awesome. Seth, I think you've already given your final thoughts — any other comments as we wrap up? I love the article — it's definitely worth a read. Go learn this security-first pattern with data. And I love these conversations, man — I did not know this link existed, so getting that from the audience is even better. Double win today. All right, there we go. Welcome to your rent-a-CoE. This commentary was free;

58:00 there is no warranty with these recommendations or how this goes, so take it for what you will. Hopefully, if nothing else, this is going to spur some internal conversation at your company about what you should be doing with your organization, and get you thinking about options for how you want to handle and govern your center of excellence. If you don't have a center of excellence person, you're welcome to use the Explicit Measures podcast as your sounding board for some of these CoE topics — apparently we do a lot of this anyways. Maybe we should rename ourselves: it should be the

58:31 rent-a-CoE Explicit Measures podcast. Maybe, maybe not. With that, thank you very much. We appreciate your ears and your listening to what we're talking about today. We hope you found some value from this — if nothing else, there are a couple of links here that may be relevant, that you didn't know about, that you'd like to click into, so definitely check those out as well. With that, we only ask of you as the listener — it's free to you — let someone else know. Let someone else know you found this interesting, and reach out to Tom —

59:01 Tom's got a little comment section at the bottom of his article. If you hit it and you have questions, or you want to know more about it, go talk to Tom. He's a great guy; he would love to have some more feedback there. Please share this podcast with somebody else. Tommy, where else can you find the podcast? You can find us on Apple, Spotify, or wherever you get your podcasts. Make sure to subscribe and leave a rating — it helps us out a ton. Do you have a question, an idea, or a topic that you want us to talk about on a future episode? Head over to powerbi.tips, go to the podcast page, and leave your name and a great question. Finally, join us live every

59:31 Tuesday and Thursday and enjoy the conversation on all the PowerBI.tips social media channels. You put the pause in too long — I thought we were done, sorry. All right, well, we flubbed the whole end of this episode. There you go. Thank you all very much, and we'll see you next time.

Thank You

Thanks for listening to Explicit Measures. If you enjoyed this episode, please subscribe on YouTube and share it with a friend. You can also find more Power BI tips and tutorials at PowerBI.tips.
