Managed Self Service & Power BI – Ep. 266
In Ep. 266, Mike, Tommy, and Seth dig into one of the most practical (and most misunderstood) operating models in Power BI: managed self-service.
The goal isn’t to shut down self-service—it’s to make it sustainable. That means giving teams freedom to build reports while still protecting shared definitions through curated semantic models, clear responsibility boundaries, and an endorsement signal (like certified) that tells the organization what it can trust.
Along the way, the crew also touches on the semantic model naming shift and why Fabric is making programmatic ‘model operations’ (like auditing and testing) more realistic than ever.
News & Announcements
- So what is the BI Semantic Model? — Helpful background on the BISM concept and why ‘semantic model’ is more than just a rebrand for a dataset.
- m-kovalsky/Fabric — A strong collection of Fabric utilities and experiments (great if you like to learn by reading real notebooks/scripts).
- Microsoft Fabric: Writing JSON data into a Lakehouse table — A concrete example of turning semi-structured JSON into a table you can query and model.
- Managed self-service BI (Power BI guidance) — Microsoft’s reference architecture for shared semantic models with distributed report creation.
- Customizable managed self-service BI (Power BI guidance) — A variant pattern that acknowledges reality: business units sometimes need to extend what the central team publishes.
- Submit a topic idea (Explicit Measures Podcast) — Send a scenario or question for a future episode.
- PowerBI.tips Podcast — Subscribe and browse the full back catalog.
- Power BI Theme Generator (Tips+) — Keep report styling consistent with a reusable theme (and stop hand-editing JSON).
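The Lakehouse article above uses Spark in a Fabric notebook; as a dependency-free illustration of the same idea, here is a minimal Python sketch of flattening semi-structured JSON into uniform rows you could hand to any table writer. The payload shape and all field names are invented for the example.

```python
import json

# Invented sample payload: one order with a nested customer and line items.
raw = '''
{"orders": [
  {"id": 1, "customer": {"name": "Contoso", "region": "West"},
   "lines": [{"sku": "A-100", "qty": 2}, {"sku": "B-200", "qty": 1}]}
]}
'''

def flatten_orders(payload: str) -> list[dict]:
    """Turn nested order JSON into one flat row per order line."""
    rows = []
    for order in json.loads(payload)["orders"]:
        for line in order["lines"]:
            rows.append({
                "order_id": order["id"],
                "customer": order["customer"]["name"],
                "region": order["customer"]["region"],
                "sku": line["sku"],
                "qty": line["qty"],
            })
    return rows

rows = flatten_orders(raw)
# Each row now has a fixed schema; in a real Fabric notebook you would pass
# rows to something like spark.createDataFrame(...) and save it as a table.
```

The point is the shape change: once every record has the same keys, writing it to a Lakehouse table is the easy part.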
Main Discussion
Managed self-service is the ‘middle lane’ between two extremes:
- Fully centralized BI (stable, but slow and capacity-constrained)
- Unmanaged self-service (fast, but inconsistent and hard to govern)
In this model, a central team curates shared semantic models (enterprise definitions, measures, relationships, and security), while distributed teams build reports and analysis on top of those shared assets. The success criterion isn’t ‘zero self-service’; it’s repeatable trust: people should know what’s official, what’s exploratory, and who owns what.
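That “repeatable trust” can be made mechanically checkable. The sketch below audits a workspace inventory for the two signals discussed in the episode: an endorsement and a named owner. The inventory shape is invented; in practice you would build it from the Power BI Scanner API or a Fabric notebook.

```python
# Hypothetical workspace inventory (shape invented for this example).
models = [
    {"name": "Sales", "endorsement": "Certified", "owner": "BI Core", "reports": 42},
    {"name": "Marketing Scratch", "endorsement": None, "owner": None, "reports": 7},
    {"name": "Finance", "endorsement": "Promoted", "owner": "FP&A", "reports": 15},
]

def governance_findings(models, audience_threshold=5):
    """Flag models that have a real audience but lack the trust signals
    a managed self-service setup relies on: endorsement and an owner."""
    findings = []
    for m in models:
        if m["reports"] >= audience_threshold:
            if m["endorsement"] is None:
                findings.append(f"{m['name']}: wide audience but not endorsed")
            if m["owner"] is None:
                findings.append(f"{m['name']}: no accountable owner")
    return findings

for finding in governance_findings(models):
    print(finding)
```

Running this over a real inventory turns “who owns what” from a tribal-knowledge question into a report you can review every week.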
Key takeaways:
- Treat certification as a communication tool. It should clearly signal what the organization stands behind (and what it doesn’t).
- Separate model trust from report trust. A certified semantic model does not automatically certify every downstream report built on top of it.
- Define the ownership handoff. Model owners are accountable for model quality and definitions; report authors are accountable for interpretation and presentation.
- Design the request + backlog process. If certified models are the gateway to ‘official’ numbers, you need intake, prioritization, and a way to reduce duplicates.
- Choose access patterns on purpose. Workspace access vs. app access vs. direct Build permissions each change discoverability and governance.
- Have a promotion path. When a self-service artifact gains audience or becomes KPI-critical, promote it into a governed lifecycle (docs, tests, owners, endorsement).
- Plan for automation. Fabric notebooks + semantic-model tooling enable programmatic checks (DMV queries, validations, monitoring) so governance isn’t purely manual.
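The automation bullet can be sketched without any Fabric dependencies. In a real notebook you would fetch model metadata with the semantic-link (sempy) package, for example by running a DMV query such as `SELECT * FROM $SYSTEM.TMSCHEMA_COLUMNS`; whether that exact call fits your environment is an assumption, so the rows below are mocked in the shape such a query might return.

```python
# Mocked rows in the shape a column DMV might return; in Fabric you would
# fetch real rows via semantic-link instead of hard-coding them.
dmv_rows = [
    {"TableID": "Sales", "Name": "Amount", "DataType": "decimal"},
    {"TableID": "Sales", "Name": "OrderDate", "DataType": "dateTime"},
    {"TableID": "Customer", "Name": "Region", "DataType": "string"},
]

def columns_by_table(rows):
    """Group column names by table: a building block for automated model
    audits (naming conventions, type checks, drift detection)."""
    out = {}
    for r in rows:
        out.setdefault(r["TableID"], []).append(r["Name"])
    return out

def audit_naming(rows):
    """Example check (rule invented here): flag columns whose names contain
    spaces or don't start with an uppercase letter."""
    return [r["Name"] for r in rows
            if " " in r["Name"] or not r["Name"][:1].isupper()]

print(columns_by_table(dmv_rows))
print(audit_naming(dmv_rows))  # empty list: the mocked model passes
```

Checks like these are what make governance “not purely manual”: they run on every refresh or deployment rather than when someone remembers to look.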
Looking Forward
Write down one rule set for your tenant (endorsement + ownership + access), then pilot it with a single domain semantic model and one business unit building thin reports on top of it.
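One way to “write it down” is to record the rule set as data that the pilot can be checked against. Everything below (field names, rule values) is an invented example of such a policy, not a Power BI API.

```python
# An invented, minimal tenant rule set: endorsement + ownership + access.
RULES = {
    "official_requires_endorsement": "Certified",
    "official_requires_owner": True,
    "consumer_access_via": "app",  # consumers read through the app;
                                   # Build permission is granted on request
}

def violations(model: dict, rules: dict = RULES) -> list:
    """Check one semantic-model record against the written rule set."""
    problems = []
    if model.get("official"):
        if model.get("endorsement") != rules["official_requires_endorsement"]:
            problems.append("official model is not certified")
        if rules["official_requires_owner"] and not model.get("owner"):
            problems.append("official model has no named owner")
    if model.get("consumer_access") != rules["consumer_access_via"]:
        problems.append("consumers are not reading through the app")
    return problems

pilot = {"name": "Domain Sales", "official": True, "endorsement": "Certified",
         "owner": "BI Core", "consumer_access": "app"}
print(violations(pilot))  # → [] for a compliant pilot model
```

Starting with one domain model and one business unit keeps the rule set small enough that a check like this stays honest instead of becoming paperwork.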
Episode Transcript
0:29 Good morning, everyone, and welcome back to the Explicit Measures podcast with Tommy, Seth, and Mike. In today's episode I'll give you a quick hook right away: we're going to jump in and talk about what managed self-service is and how it works with Power BI. What does a self-service model look like for a Power BI organization, and how will organizations need to build it? That's our main topic for today, but before we get into that, let's do some news. Anything across the internet that we're finding interesting and worth talking about? Tommy, any articles you found?
0:59 Yeah, another one by our best friend Chris Webb on what a semantic model is. He writes so many good things, it's crazy, even when he's just summarizing something. We previously talked about how there's no such thing as a data model in Power BI, or as it's now called, a semantic model, and he talks about the name change. This is something I've realized: all my training has to change. I've literally been in between trainings and realized,
1:30 am I supposed to start using the term semantic model now? Yes. There's a lot of transition, and I've already switched to calling it that, even in training with people: well, you don't know what a data model is yet, so we're just going to call it a semantic model. I think that transition needs to happen now for everyone; that's really the new phrase change.
2:00 What is a semantic model, Tommy? A semantic model is what was previously known as the Power BI data model: it's everything except the report canvas in a Power BI dataset. It sounds like Chris was talking about the BI Semantic Model, the BISM, B-I-S-M. And that's where, I think... who was the other gentleman who made the BISM Normalizer? I'm blanking on the
2:30 name at this moment. Was it Daniel? No, BISM Normalizer is from Christian Wade. His BISM Normalizer would let you look at the BIM file that comes out of Power BI. Essentially it's the definition of the cube, not the cube itself, just the definition, which is interesting. And that's what this whole semantic model is, right? It's the describing features: what calculations, what
3:00 measures, where the tables are, whether a table is partitioned, all the information that makes the model work. It's a definition file; it's not actually the data, which is very interesting. But it's also the phrase we use now: every time you open up Power BI there's a semantic model and a report. It's the new terminology for what was previously known as the Power BI data model. It's going to take some time for people to
3:31 really absorb this one. Does this mean that inside app.powerbi.com we're no longer going to see 'dataset'? Yeah, it's going to say 'semantic model' when you want your filters. Interesting. Sorry, I'm catching up here, because his article is basically breaking it down to the fact that there are still all the underlying components and this is just a marketing thing. That's all it is, just a name
4:01 change; it's not a change to what it was or what it is, it's simply giving it a different name. Now, it's a big name change, because it's one of the core components of everything we do, but that's really all it is: there's no feature update or additional items or artifacts. It's just your normal Power BI data model that no longer has 'data' in front of it: a semantic model. Which makes a lot of sense, because 'dataset' and 'data model' are used
4:33 interchangeably in other tooling and in other parts of data engineering, where there's a lot of mix-up. So now we're being specific: when I say semantic model, that refers only to the Power BI creation. Interesting, Tom, that you mention that, because it's actually a really good point: in the data science world you make these things called models. However, I would argue that the cube has been around longer than the data science
5:03 model thing. Fine, maybe. So now when I make a thin report, I'll have a semantic model and a thin report? Yes, exactly, that's exactly what it is. Time to update your SEO. It's going to take me a bit of time to get my head around this one. I'm going to have a hard time getting away from the word 'dataset'; I think I'm still going to use it quite frequently. It's going to be hard; it's a big change. This is all
5:33 because there's now a new tab inside Desktop, the actual semantic model view, where you can see the calculation groups, the measures, the definitions of all the things inside the semantic model. They needed that to be there so you could actually see, when you make a calculation group, where it lives. Seth seems unconvinced. I guess it's just the name change, right? There's nothing new here, not a new feature, just
6:03 a marketing change. Someone decided that 'dataset' was too confusing or not descriptive enough, I guess, and decided to change it. Do you guys not like it? Because I actually like it. It's not that I don't like it; I understand why it exists and why it's there. I just feel like saying 'dataset' is easier than saying 'semantic model,' and I'm also thinking about writing this
6:34 thing out every time and having to spell 'semantic' correctly all the time; this is going to create a headache for me. It is what it is. Fine, no big deal, I'll get over it and we'll move on. But here's my thing, though: if you didn't call it a semantic model, I don't know what else you'd call it. I have no other term I would use for reference. It's either going to be a dataset or a semantic model.
7:06 So, to be fair, they picked something that is descriptive of what it is, and I wouldn't change it; there's no other term I would use for it. Unless you're going to spell it the European way, 'modell' with two L's. It just drives me bonkers that that happens. What the heck are we doing here? Two L's in the middle of 'model'? That doesn't seem right, but some people spell it like that. Or a random U somewhere, exactly. Or make up a brand-new term, I don't know. Anyway, interesting; let's move on to our next topic.
7:38 Other topics we have here: Chris Webb was talking on his blog about this new thing coming out, and this is particularly around some of the Fabric pieces, so there are two parts here. Microsoft Fabric has opened up a lot more of the data engineering side of things. There are two blog posts, one from Michael Kovalsky, and everything he touches is very technical and super good. Michael Kovalsky comes out and says: hey, look, you can use this package called semantic link, which allows you to put
8:11 this package inside a notebook, and from that notebook you can now interrogate or run queries directly against a Power BI dataset, which is pretty dang cool. You can get all the column names out, and you can go look at the DMVs. What does DMV stand for, Seth? Dynamic management views; they tell you things like what the model is doing. So you can now start grabbing data around it. I think this
8:42 will be important for monitoring on key datasets that you care about: things that are going to your customers, bigger datasets, getting the information out of them. Microsoft has come up with this really interesting scenario of: hey, data scientist, we've done all this hard work to make these cubes and things, and you may want to pull a measure and a column into your data science project. Okay, you and I were just disagreeing about this, and I don't know why you think
9:12 that's so weird; you have this semantic model now. It's not weird, I've just never heard of any data scientist going, man, I'd do so much better data science if I could get it out of my model from Power BI. It just doesn't seem to happen. So it was never possible, but now, with this new semantic link package, what I can do, rather than having to curate the entire dataset, is basically grab
9:43 the columns and tables that I want, grab the data, and already have it packaged, in a sense, as a simple data frame. I can take a Power BI model, take the components I need, and create a data frame from that, which is really powerful. Think of that linkage for trying to do additional analytics outside of Power BI, where it's like: I don't want to use DAX to do my forecasting; I just want to take the data that's already been curated and cleaned in a semantic model, and now I can create a data frame from it easily. Very
10:14 easily. This is a huge story, a huge grow-up story for what the Power BI model is, not just in the realm of business intelligence but in the whole data story. I think it's cool, I do like it. I would ask a question to the chat: for those of you out there, have you had any data scientists directly asking for access to the Power BI datasets that are
10:44 being built? I think that's the wrong
10:48 question. Is it, though? Why? I think these teams have always existed in different spaces, and you never could have this ability, or even close to it. They both went down different paths on what the end goal of the data model was: the reporting solution and data science, which is the forecasting, never existed together in Power BI. But now there's a linkage between the creation of the
11:21 semantic model and data science; now we're building that bridge. So even if the question wasn't asked before, this is building the bridge for what data scientists need, for the data they need. I can see a use case where you would want to interrogate or look at data from a model for some quick POC-type things, but
11:55 you're saying you would recommend that the model be a data source for the activities they build on top of it? Well, think about this: I don't have to pull just the model; I can pull a single table from a Power BI semantic model. You can write full DAX against it. Yeah, I can write full DAX, or I can just grab a single table. So for data scientists, what are they pulling from? Maybe they're pulling from Synapse
12:26 or from some other data source anyway. Now one of those additional data sources is an already cleaned model, and they don't have to pull the measures in; they're already pre-created. This is just an additional source of already cleaned data, so I do think that's good. I think it's helpful, and I think there is a use case, although I'll say I believe the use case is a little bit contrived. Of the companies I've worked for,
12:56 a lot of companies are just trying to get their feet off the ground in data engineering in general. A lot of companies, I think, are not at the level of: wow, now that we have these really well-defined, super groomed, efficient data models, we're ready to give a data science team access to actually utilize that information. I'm not saying there isn't a need for that, but in my experience working with data scientists, they're not looking to connect to cubes and models and things; they're just going to write whatever they need to write and
13:27 go after data at that earlier stage, before you get to the data model, and do predictive things somewhere else in the system. That's just my experience; I don't know if that's going to change, and maybe this feature will enable a change in mentality here. Let's talk for a moment about certified datasets: it's groomed, clean, good data, right? We should be using that data for other things beyond just serving it to reports
13:57 and getting it into paginated reports. I'm not sure organizations are thinking that way yet; they might need some more time to adjust and start finding solid use cases around it. It's a good point about the use cases provided. I think using the DMVs to see what's going on in the dataset is actually going to be more relevant than anything else; for management of large datasets this will be way more helpful. To me, this points
14:29 out that you can now build a test framework around your dataset using this pattern. It gives you the ability to pick and pull data out of your model every time the model is produced, and you can run notebooks that test the validity or accuracy of your data every single time. To me, that is way more impactful than saying: hey, data scientist, you can now access the data in the data model. So from my perspective, the testing and auditing ability of a dataset is going to be way
15:01 more improved with this feature, as opposed to the data science world being super happy, putting their hands up, and singing the praises of Microsoft for giving them access to a data model. And I guess that's for scenarios where you wouldn't already have access to the data sources going into the model. You're talking about the data science story, right? Well, for what you were just talking about, data validation and quality checking, I think there's actually a better way of automating things; this is a more programmatic way of getting at the data
15:32 inside the data model than what you would do building inside Desktop, or when you have incremental refresh, where there are certain things you can't really control. This method also allows you to utilize everything within the model: you can create DAX measures against it, you can do EVALUATE SUMMARIZECOLUMNS, and you can summarize tables out of it. You could literally write: I care about this fact table and these months of data, and I'm
16:02 going to store that. I would agree, that would be a pretty good use case, way more impactful than the data science story. I think Microsoft should have led with: hey, look at this, you can get data into notebooks now and start automating regular loads out of datasets. Great use case; do some testing around it. Where's John Kerski? John, please add this to your DevOps deployment pipeline of things you could use here; I think this would be very helpful as well. Anyway, interesting feature; I need to
16:32 play with it a bit more to have maybe a stronger opinion. Along those lines, another article came out recently from Darren Gosbell on the Random Procrastination blog. He talks about how, now that we have Fabric, everyone's saying: look at all this cool stuff we can do. He's hitting the refreshables API, grabbing data back, and parsing it out from a JSON object into a data frame. Everyone's excited; this is what I was expecting to happen. Everyone's finding these
17:02 very niche use cases, like: oh man, it'd be really cool if I could... oh, I can just do that in a notebook now. All these extra data engineering experiences are now being lit up, and everyone's like, wow, look at this. Dudes, this has been happening for years in other systems that are already out there; it's now just being brought into Fabric. Right, but that is the benefit, and I think I mentioned that glibly in our last conversation, where we were talking about, oh, like sp_rename, right?
17:34 Yep, it's a function in SQL; it's been there since the beginning of time. Exactly, but it's a refresher, right? Folks who didn't know about these things are getting this constant stream of new-to-them features. Yeah, they're out there in different tools, but now people are hearing about them, and it's refreshing the excitement around what you can do in these new environments, or introducing more people to them. So it's a good thing, I think.
18:06 it’s a good thing I think so anyways all good topics and intros let’s get into the main topic I think these these two things really push us now okay let’s let’s talk about what is self-service maybe we should Define that first and then what does managed self-service look like in this ecosystem of powerbi I think we’re going to have some words to say about this one so Tommy give me give a quick introduction let’s let’s talk about Give me the definition for manage self-service and we’ll take it from there and we’ll start talking about it oh yes I there this may be become a
18:37 oh yes I there this may be become a series but managed self-service is the man the Blended approach be between discipline and governance around data or semantic models and the creation curation of data models and powerbi while allowing other teams to still create reporting it’s the mix between rep support creators and data set creators that’s really the difference here typically in a Mana self-service solution you have the semantic model creators I’m already
19:08 semantic model creators I’m already using it or centralized it’s governed those models are only coming from a subset of the organization and really the rest of the organization however split U between analytic teams or departments are only utilizing those semantic models there are use cases of them creating using non or based on the data sources themselves so if there’s Enterprise data it’s coming from the Enterprise team who’s creating the
19:38 the Enterprise team who’s creating the centralized semantic models if you want to create any reporting off of that you’re already using curated semantic models created any business data or something outside of Enterprise data does not live in the same area so there’s this really strong divide between what type of data goes into Enterprise reporting and who can curate semantic models from but it still allows teams to utilize their own business data
20:08 teams to utilize their own business data or create reports off of Enterprise data but utilizing pre-created semantic model data sets however whatever the terminology is yeah exactly right this I think is and you’re talking about this there there’s a lens I think around this that we’re talking about potentially the the data the the semantic model it’s gonna it’s gonna bug me today I’m sorry what a data oh man what a DAT transition now I’m going to say data set a thousand times
20:38 It is, because all the documentation still says 'datasets.' Yes, it does; it's not fully updated. So: the semantic model, and then accessing that semantic model with different patterns, right? You can go after the semantic model by getting access to the workspace where it lives. You can get access to the semantic model through delegated permissions via an app that has a report in it, so by proxy you'll be able to build
21:09 things from that semantic model through the app. And then there's a third option: you can get permissions by being directly added to the security of the dataset itself, without having to go through the app. So the app can say, I'm going to deny you access to build things from the model, but we will give you direct access. To me, those are
21:30 all the parts of this ecosystem that provide access to just the semantic model. But where does self-service come into play now with Fabric? Because with Fabric we also have this lakehouse full of tables, and we have a serverless SQL endpoint that has tables in it as well. Does self-service cover other artifacts inside your, call it, dataset or data engineering workspace? Is more access being granted to users here? Is that part of the story? Tommy, it looks like you're shaking your
22:01 head no; this is not part of the story. I was going to say that's a whole other conversation, the data warehouse and the lakehouses. For the majority of scenarios right now, let's just focus on the semantic model journey. Okay, so you want to focus this conversation for now on just the dataset and the report layer? Yes. Okay. Is it certified or not?
22:32 Well, no. I think this is the story, or the documentation from Microsoft; if it's in the chat, it will be in the podcast episode description. This all sounds great, and it's usually for larger organizations: they already have a centralized development team whose number one job is the creation of semantic models for the business, or for other BI and analytical teams to create reports off of. So their task is not
23:04 reporting but semantic model creation, using what's been deemed enterprise data. So the only way anyone else in the organization can utilize core data from the organization is not by going to SQL; it's by utilizing a pre-created, already cleaned semantic model and connecting to it in Power BI. That's
23:34 where their report creation comes from, again for enterprise data. If I have something from Mailchimp, or I'm in marketing and trying to utilize some email analytics, yeah, I can use that, I can do an import, but one, it lives in a different area, so it's not going to be mixed with the semantic model or the enterprise data, and two, there's also the question of apps and who gets access. But the idea here is
24:05 great, because there's this governance aspect of: hey, we know we have these sources of truth; we know we have these semantic models for our core sales or operations that are already created and cleaned. My question is about the implementation: if an organization is starting out, or wants to move to this scenario, the idea is great, but I think there's a lot more in the weeds to getting it working properly, where there can be a lot of frustration with
24:36 organizations and teams not getting the data when they need it, and I also think there are a lot of bottlenecks that are not covered in the documentation here. I think this is really the conversation. If you look at the two diagrams, they look extremely familiar, because they look like things Melissa Coates has built for her 'here's the Power BI ecosystem' diagrams; I wouldn't be surprised if these came from her. When you look at the diagram they're showing
25:07 here, it's a scenario diagram, and there are about a thousand things on it: there are models, there's DirectQuery, there are datasets all over the place. I really think the idea here, as an organization... and I will point to this too, because there is this concept of what is certified, and I think that's what they're trying to note here: the main dataset they're pulling everything from in this example is a shared semantic model
25:37 that has an endorsement. It becomes discoverable, we understand its lineage, and there are people associated with owning and managing that dataset for you; there's a point of contact. So in my mind there's a series of checkboxes that need to occur, and these are all things that need to be decided at the process level within the center of excellence. The center of excellence should determine how your organization wants to play this game, and what level of freedom you're comfortable giving back to the
26:09 business to build the things they want. There's also this idea, I think, of a maturity inside the organization that needs to happen, and I don't think many organizations are thinking this way yet, but they should: how do we articulate, how do we define, when responsibility moves from one team to another? As I think about this, there's the responsibility for the datasets and the responsibility for report creation. In this example of
26:41 self-service, there is a central team, or some team, it doesn't matter where it sits, that owns the dataset itself, the semantic model; they are responsible for that part. As soon as someone builds a report on top of it, the report owner or report creator is the one who needs to be held responsible for what comes out of that dataset. So there have to be some checks and balances here, because I hear all the time: well, the business doesn't know what they're doing; they're going to build something that's totally
27:11 wrong, and the numbers are going to be represented incorrectly. Well, then you don't want self-service; you want it all to be managed. If you say that to me, you're already implying that you don't trust, or we haven't educated, the business side enough to be able to trust what's coming out of their system. So then what you're telling me is that all of your stuff needs to be in the certified realm, and you can only trust things coming out of certified, and that's why we have that gap; we have the whole certified thing just to separate or
27:42 delineate between things that have been tried and tested, where we stand behind the numbers, and everything else, which is built but not certified. And that is, I think, a center of excellence policy that you need to figure out; it could be different per company. Yeah, the way I used to view this was: for each one of these realms, which one are you going to support in your organization? And I don't think it's all or nothing, because if it was all or
28:13 nothing, you're just like, "yep, we're doing managed self-service." Tommy, to your point, all this is doing is pushing an IT-controlled blocking mechanism into datasets — I'm sorry, semantic models, I know it's hard — as opposed to the data, right? "You don't have access to the data" has always been the problem for the business. Right now we're just saying, "well, you may have access to the data, but you don't have access to the certified semantic model; we have to go update that
28:44 for you." So I'm not saying that's bad, because to Mike's point, rather than this being all or nothing, the story we talk about that is more likely is: there should be self-service available to an organization. In order for self-service to work, people need access to the data sources, right? So they're in their own ecosystem. But I think it's also the responsibility of the business intelligence team, or the data teams, to have curated data
29:15 sources for them to plug into. Where I think there's a natural graduation is in that grow-up story, where they've built something that now needs to be pulled back up into a managed state — because, like we talked about, it has a much wider audience, we need to put data quality and governance around the metrics, we need to ensure that everything being created is vetted and certified. At which point we've taken something out of self-service and said, "this is more
29:46 important for us to manage." But you're not going to take that away from the report author, right? And that's where I think the scenario works well, because you're saying, "hey, this is still a dataset that you can use, or expand, and build and grow on — we're just going to need to put it in this managed realm, where, yeah, things might be a little slower, but we deem it necessary because of the high value of the report, or the audience, or whatever." So I
30:18 think where I've landed now is: it's important that we understand this isn't all or nothing, but it is very important for us to understand the type of implementation, or the type of semantic model, that is out in this ecosystem. And I think you do that, potentially, with the certified label, right? If it is certified, it's part of something that is at least managed self-service, or fully managed,
30:49 where the business may no longer have access to it. Yeah, and I think there are two distinct areas — I want to tackle something for Mike first, but Seth, you make a really good point about the people involved with this, the trusting of the data. I think there are two splits here on how this is perceived: there's trusting the data — the semantic model itself, what's coming from there, and the source of truth there — but then there's also the certification on the reports themselves. And I think the
31:21 big difference there is where there are certified reports, coming probably from the enterprise team — the same people creating the data models — living in certain workspaces, maybe in a certified app. Yes, I would agree with all that, and to me that's table stakes for what we're talking about here. Even if that report doesn't answer everyone's every single question, the idea is we understand that the people who
31:51 build the model will understand it the best, and hopefully there's a business user, or a PM, working between that central data model and the business, saying, "okay, business users, the reason we're making this model is because we're listening to the things you care about. You said you care about these types of data in our organization. Oh, and by the way, this is
32:15 how products and customers and orders stitch together; here's an example of a model that would help you get to the data you care about." Sure, exactly. But I don't want to understate the importance that those semantic models also sit behind the certified reports, and I think there's a lot of adoption and communication that needs to happen: if you're looking for our numbers on X, Y, and Z, from the organization's point of view, we believe it's going to be in the certified app coming from the
32:45 enterprise team — and then, obviously, the data models, and any managed self-service built from those models. Yeah — there may be a certified model, but that does not mean the report is certified; that's for the business use case, that's for an analytical purpose. So there needs to be a very strong distinction, and I think a communication plan around it, because you're trying to drive adoption from the consumer side on the certified models and reports, but then you also need to get the buy-in from these
33:15 analytical teams: that this is going to be the process, and all of your models will come from this data hub. And how do you actually get access? What if I need to configure something, and they provided a different solution? This is where — I agree with most of what you say; there might be a couple of things I would tweak in how I would describe the same thing. And the only reason I'm saying this is because I'm trying to think through a scenario. Let me paint one. The scenario I'm thinking through is: we have
33:46 a central team that's trying to get all the financial data together. We have a financial dataset we're going to be building — this is very common. Once I have a financial semantic model created, I go to the finance team and say, "look, here are all the columns and relationships of the data, the way you've told us we need to link this stuff together. Great — here you go." As we open up that model to the finance team, we as the central BI team are saying: we're going to the operational system, we're maybe doing some things in Fabric, we're transforming data, we're getting it shaped
34:17 out. That is now a dataset. We're saying we've vetted this dataset; this dataset is good, up to the dataset portion. At that point — and I think this is where it's important for companies to understand — there is a transition in ownership occurring. Now let me be clear, let me keep playing out the situation; you're already trying to push back here. At some point I'm going to say the dataset is good, we're going to certify it, we're going to publish it. The team that owns the next step — because there will need to be an owner, someone who's responsible for
34:49 making the financial reports. Now, whether or not you certify the content that comes out of that team is, I think, a decision of the center of excellence. However, I don't want to limit the idea that we can delegate certification to the financial VP that is responsible for taking the dataset we have over here, doing whatever they want to do with it — and any report that comes out of that team, they have a process that they also follow
35:19 that allows them to certify things. So inside that team, they may be building internal reports for themselves that they can't publish to the organization, because those aren't, quote-unquote, certified. The reason I'm pointing this out is because the center of excellence — which the leads of the finance group should be a part of — should be communicating: "hey, look, this is how our company's going to handle certified things; your team needs the ability to certify your
35:49 own stuff, because you are the end-all story around what certification is doing." So how is that team going to go through the same thing? Let me back up: when you're going to take something from that team in the first place to certify it, what are you doing as the central BI team, or the CoE, or whatever, to certify that dataset? What are the steps you're taking? So: certified datasets have documentation; certified datasets have an owner
36:19 attached to them; certified datasets have data quality checks running against them — you're going through source-system elements and verifying that the data is all there, that it's complete and current. So there is a process: you have defined a process that says (and again, this is potentially different for every organization) there are a certain number of metrics — literally phase gates — and when this dataset checks those tick boxes, we can say this thing is now certified. So
36:49 another requirement for certified datasets in some organizations will be: no sources from Excel. You can't have a data source be Excel; it has to come from the enterprise warehouse somewhere — Excel is just a potential point of failure for that dataset running and refreshing every day, and you're trying to remove those things. Can you have a certified dataset that doesn't come from a central store? In many cases today you have many different third-party systems, right? Can you have a certified semantic model, and reporting, that is pulling directly from multiple third-party systems?
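The "phase gates" idea the hosts describe can be sketched programmatically. Everything below — the field names and the specific gates, including the no-Excel-sources rule just mentioned — is an illustrative assumption, not an official Power BI schema; in a real tenant this metadata might come from the Scanner APIs or a CoE's own inventory.

```python
# A minimal sketch of "phase gates" for certification: a dataset's metadata
# must pass every check before it can carry the certified label.
from dataclasses import dataclass, field


@dataclass
class DatasetMetadata:
    name: str
    owner: str = ""              # accountable person or team
    documentation_url: str = ""  # link to definitions / lineage docs
    last_quality_check_passed: bool = False
    source_types: list = field(default_factory=list)  # e.g. ["Warehouse", "Excel"]


def certification_gates(meta: DatasetMetadata) -> dict:
    """Return each gate's pass/fail so failures are actionable, not just a boolean."""
    return {
        "has_owner": bool(meta.owner),
        "has_documentation": bool(meta.documentation_url),
        "quality_checks_pass": meta.last_quality_check_passed,
        "no_excel_sources": "Excel" not in meta.source_types,
    }


def is_certifiable(meta: DatasetMetadata) -> bool:
    return all(certification_gates(meta).values())
```

The point of returning the individual gates, rather than only a yes/no, is that the CoE can tell a team exactly which tick box is missing — which matches the "educate, don't just block" theme of the conversation.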
37:21 I think there's an upgrade story there. Well — the answer, I think, to your question, Seth, is yes. And the epiphany I had yesterday, as I was talking through this whole Fabric story, is that there's potential for even delegating certification to different teams across the organization. Yeah — Tom, you're making your funny faces here. Again, this is a
37:52 data-culture issue; this is an education issue on the company side. There is the need to have multiple people, or multiple teams, able to certify their own stuff, and the only reason you can do that is because everyone — or at least the CoE team — is defining: "here is the process by which we say a report can become certified." As long as the team understands that. And teams are going to hire — there are
38:22 already, 100%, organizations hiring Power BI developers that are really good, into their business units, because they know they want to do better Power BI and they're seeking skills to bring into those units. Those business units need to have the ability to say, "we will certify our own stuff, and this will be a certified dataset," and then the decision can be passed back to the center of excellence: does that certified stuff get shared across the whole organization? How does that work in conjunction with larger efforts around
38:52 the data organization to consolidate the logic from all of those systems together into a source location, like a lakehouse? Like, I have artifacts available to the organization that are curated — many times combinations of multiple systems — that can be used as data sources. How — "you need to bring it into one lakehouse; the OneLake experience is everything." So again,
39:22 this is where, if someone's using OneLake — well, OneLake is all the lakehouses rolled together — if the organization has adopted Fabric, sure. I guess where I'm driving at is: in my mind there's a difference between a semantic model that's created from multiple different systems, where a lot of business logic and ETL lives in that singular
39:54 location — which doesn't have a lot of reuse capability outside of reporting, unless all of a sudden we're turning things on their head and, like, now pulling semantic models into notebooks, like this new thing we talked about earlier, and it becomes this spiderweb of dependencies — versus, like, a true (in my mind) set of quality, governed datasets that are combined in a central location first, and then we aggregate whatever we want from a dataset or semantic-model perspective.
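The "spiderweb of dependencies" worry can be made concrete: once notebooks read semantic models and write new ones, you effectively have a dependency graph, and impact analysis becomes a traversal. A minimal sketch — all the artifact names are made up, and in practice the edge list would come from lineage metadata rather than being hand-typed:

```python
# Sketch: impact analysis over a (hypothetical) artifact dependency graph.
from collections import defaultdict, deque

# artifact -> artifacts that consume it directly
consumers = defaultdict(list)
edges = [
    ("lakehouse.sales", "model.finance"),      # curated source feeds the model
    ("model.finance", "report.exec_summary"),  # reports read the model
    ("model.finance", "notebook.forecast"),    # a notebook also reads the model...
    ("notebook.forecast", "model.forecast"),   # ...and writes a second model
    ("model.forecast", "report.forecast"),
]
for src, dst in edges:
    consumers[src].append(dst)


def downstream_impact(artifact: str) -> set:
    """Breadth-first walk: everything that transitively consumes `artifact`."""
    seen, queue = set(), deque([artifact])
    while queue:
        for nxt in consumers[queue.popleft()]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen
```

With these example edges, changing `model.finance` touches the notebook, the derived model, and both downstream reports — exactly the web the hosts are describing.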
40:25 Well, we need to make the clear distinction between the story for a report to go down that path of promotion or certification, and the semantic model itself. Correct — I agree, there are two separate stories. Yes: there's a certified path for the dataset, and there's a certified path for what a report would look like. There's the process of ownership, too. Let's take the example of: we have all these different curated sources,
40:55 and there's data that exists outside of the enterprise, or outside of what's acceptable, that needs to become part of the certification process. The focus really needs to be on the company's metrics, and if there's now something being introduced that the company is measured on — hold on, hold on — all right, keep going — if it's an enterprise-level KPI or metric, and that's not part of a semantic model, part of the dataset
41:25 hub, or part of the certification, then there needs to be this story of: "hey, enterprise center of excellence, we have this semantic model that we've been using for self-service, but we need to curate this additional source that's coming from this API. That API is not managed by our team, but we need this at the enterprise level." That model then becomes a candidate to be owned by the enterprise team — they need to own it — and then it lives in
41:56 that location. I don't disagree; I'm talking more about the reporting side of things — the reporting is different. But I wanted to tackle that first, because I will transition my ownership of a semantic model to the enterprise team, and they will have to adopt those sources and make them part of their own workflow and engineering. This is my challenge, right: don't we have to be cautious when that happens, though? Because the intent of a lot of these newer
42:29 implementations in Fabric — breaking down walls between these teams, etc. — is that, in order for us to be successful, we keep driving at this conversation around the report author, the data steward from these business areas, really needing to be part of the process. Sure. And what I'm hearing described here is, "nope, we're going to pull it away from them, and then we're going to need a project person in between the two groups." Yeah. Is that the right story, though? Or
43:00 is there an opportunity — should that person go along for the ride with the certified datasets and just understand how we do things? Do we train them up? Do we have them manage their own models in a new, certified way? This is all of a sudden where the idea of Fabric and that ecosystem starts to make sense, because it's like, "hey, listen, we don't have any expectation that you're going to understand this out of the box. We are going to
43:32 assume control of this for the moment, but once we've rebuilt it — once we've put it on this train track — you're now jumping on the car. Here's how we did the things that you already created." And it's a symbiotic relationship where I'm not now in your way. It's going to slow things down, 100%, but I think it
44:03 also reduces the conflict that would sometimes be created if you're just saying the central team's now going to assume responsibility — because that business unit isn't going to stop innovating. They're not going to stop needing changes to that dataset that they created, that they wanted to use. Or — as I'm talking about this — is it thrown over the fence, and they now have their own version, and they start building against their
44:35 next rev of something, and then that gets pushed up? I'm going to challenge — so this was my point. Two points here, Tommy: you were talking about KPIs that are global. KPIs are coming from somewhere in the business. One thing I would argue as well is: where do those KPIs come from? The central BI team is not making the KPIs. No — KPIs come from the different business units, or leadership. The central BI team is only implementing what they've heard the businesses say
45:05 is important. The central BI team is a group of people doing the work because, technically, we don't have that skill set inside the businesses. The central BI team has access to data that other people don't, and they're listening; they're working on the ideas of what is a customer, how do we count sales, when do we count a sale — but they are implementing it, and potentially facilitating the discussion. But who's
45:35 making those decisions? Finance, in a lot of cases. So there are teams of people defining the KPIs and owning them, and that's what then gets broadcast across the organization. That's how this has to work: the central BI team does not define these metrics; they only use the definitions given to them. Mike, where do the metrics live? They live in the semantic model. They live in the definition of what the business says they are, which then gets implemented, in reality, into the semantic model.
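One way to picture "the business owns the definition, BI implements it" is a small registry that records a KPI's business owner and written definition separately from the DAX that implements it in the model. The KPI names, owners, and DAX strings below are purely illustrative assumptions:

```python
# Hypothetical KPI registry: the business owns the definition;
# the central BI team owns the implementation inside the semantic model.
KPI_REGISTRY = {
    "Net Sales": {
        "owner": "Finance",  # team accountable for the definition
        "definition": "Invoiced sales minus returns and discounts",
        "dax": "SUM(Sales[Amount]) - SUM(Sales[Returns])",  # implementation detail
    },
    "Active Customers": {
        "owner": "Sales Ops",
        "definition": "Customers with at least one order in the trailing 90 days",
        "dax": "DISTINCTCOUNT(Orders[CustomerID])",
    },
}


def owner_of(kpi: str) -> str:
    """The escalation path for a metric question is the business owner, not BI."""
    return KPI_REGISTRY[kpi]["owner"]
```

Keeping `owner` and `definition` next to `dax` makes the episode's point auditable: when a number is questioned, the definition and its owner are the authority, and the DAX is merely one implementation of it.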
46:06 They don't just live there; it's an implementation of what we say as a company that's going to be utilized. Right — so, to Seth's point then: if the skills aren't in the business unit, in the finance department, to do these centralized models as we're describing, the center of excellence should be saying, "look, this is what we say is quality; this is how we know you can trust a data source, and you have to use it." There's some political horsepower that needs to be
46:36 used here, and that's why you have a central team: because the central team can report up to those executive sponsors, or VP-level people, and say, "look, this is what we're doing; this is the team that will own this." I fully think — and I'm now a firmer believer in this — that once that model is created, there's a high likelihood it may get pushed into the finance department, and there may be an identified leader in that team, and you say: "for the KPIs that you're responsible for,
47:07 you are required to provide financial reporting out to the organization. Your team needs to own this one. We've worked with you, we've educated you; here's the model that we started with; we all agree it's a great place to start. Here you go, business unit — you are now the owner of that certified thing." And that becomes a collection of potentially other parts of the business — operations, HR, finance, just to name a couple — where those teams themselves are creating, grooming, and certifying
47:37 datasets. And to your point, Seth, a certified thing is: someone taking ownership and responsibility around creating data that is being shared broadly across the organization. That's how I would define it. And because of that, HR may be pulling data from random systems that are not coming from a lakehouse. That's okay, as long as they own it — and if there are any problems or issues with the data, there's a person to go back to to fix the
48:08 information. So there's an identified leader of that report and/or dataset. Yeah, but to me, that's a good use of "promoted": I have clear documentation and ownership. That's a governance thing where I would say, yep, everybody has the ability to say "I put my name behind this dataset." Certified puts it in a different realm for me, where we've got quality checks — to your
48:38 point — monitoring on sources, some other validations, and assurance that the downstream reporting absolutely has checks and balances in place before it gets to that end audience. And I think there's a big distinction here between a data source and, again, what that semantic model coming from the enterprise has to contain. It has to have, in the semantic model — regardless of how the logic gets there — working with
49:09 the business, the core metrics. If this is how we count sales, this is our member count — whatever those core metrics are, they need to live in that enterprise semantic model. Sure, yeah, and I'm not disagreeing with you at all. I'm just asking: who ultimately has the responsibility for the financial semantic model? Sure — but ultimately it's the difference between — I would think the ultimate ownership of those areas, the business goals and strategy, are
49:40 owned by the business teams, right? The BI team is the facilitator. It's the same as saying we're the facilitator of many good things for a business, between that strategy and the business users. And this even dovetails into a comment that James L is making: "enterprise teams need to prove what value they bring to the business unit, since the business unit already has their own solution." Correct — well, I agree with that to a point; they're going to make their own solution regardless. Well, here's the
50:11 thing: rather than being combative — and that is a loaded statement, "oh, you have to prove your value" — a phrase like "think like the business, act like IT" is not just whimsical. Data people in the positions that we're in recognize the value in both areas, IT and the business. The business — yes, you find solutions, and many times it's
50:41 wasteful, it's not efficient. And what are the things we constantly come in and talk about? Automation, efficiencies, timeliness of data. Because more often than not — can you build your own solution? Absolutely, and you do. But when somebody asks you to go build it again, what happens? It takes you another day. What if they want to look at it a different way? That's another four hours. What happens when you need to regenerate the report? Regenerating the report takes you another four hours. We call that waste — that's not a
51:13 valuable use of company time, or of you solving problems; that's just you being a data person, and we have much more efficient ways to do that. And that's where solutions like this — and teams like this, playing advocate for both sides — create huge wins for organizations, because you streamline all of that. "I want to see the data" — you can see the data how you want, when you want, and that allows people to make decisions much, much faster. So there's this
51:45 much much faster so there’s this this huge part of all of that being like the integral Parts between the the organizational units but this is also I think part of the conversation where who owns these types of artifacts that are that are created out of these exercises where we’re solving these problems right and and and and like it it’s there’s there’s so much conflict in me when it’s like yeah I think we I think that makes a ton of sense and but then there’s this scenario
52:16 sense and but then there’s this scenario and then like like this one doesn’t and we can’t we don’t want to go back into a realm of old school bi where there are walls between us but do we have enough of is there enough technology or have we lowered the standard enough to bring everybody onto the same page and I think we’re making like direct Microsoft is making directional shifts yes so that these are actually conversations where we’re like yeah I maybe there is an opportunity where we can like pull the groups here together but we’re like I
52:48 groups here together but we’re like I think the the rough spots in here for me still are it’s not it’s not fully fleshed like this is a new I think realm for all of us in business intelligence and data because so many more people from the business are on the data train right recognize that there’s a bunch of value that can be created in tool sets that they’re now accessible to them and then you butt that up against the Enterprise side where there have been teams who’ve been doing this stuff for a very long time and you’re not just going to shed all of the learnings that we’ve
53:19 to shed all of the learnings that we’ve had along the way but when you start talking about managed and self-service all in the the same conversation these clash and there are some good paths I think back and forth but there still are some walls that feel like we we need to solve like climb or get around well and I I I love that because I think with the introduction there are so many more people are getting into the data
53:46 more people getting into the data space. And really — getting back to the theme of location — this is the big difference: in the Power BI platform, what exists is the ability to ask not just "where does the data, or the semantic model, live?", but also "where do I live when I'm working with the data?" and "where do the consumers live when looking at certain data?" So I have the separation of: I can still access the data, I can do analytical things on it, but there's a distinction between the location of
54:17 this enterprise, walled-in data — the access — and me as an analyst, outside the realm of enterprise, but able to connect to it. If we want to see our source-of-truth, company-level metrics, fine: that is managed, walled-in, that goes through the whole process. But I still have the flexibility, and I have the access, in curated workspaces — managed self-service, distinct workspaces and apps — to do customization and keep some of that flexibility. And I
54:49 think those work streams — because they all live in the same space — mean I can still publish to a workspace that's been created and approved by the organization; I can utilize the reporting solutions; and again, that doesn't have to be, so to speak, vetted. But again, there's that clear distinction of certified, and I think that's where the big difference is: it's not just getting access to the data; it's not just having access to enterprise data; it's the difference of when I
55:19 want to stamp something as certified — be it the model, be it a metric, or be it a set of reports — and being able to, in a sense, share that. That's, to me, where the difference is. But allowing managed self-service paths of content to be shared is where, I think, that flexibility comes into play. There are so many thoughts I have on this one. I think, as long as you're
55:51 working with a common process that is communicated across the organization, and you are aptly thinking through what a data stewardship process looks like, you can decide as an organization — and again, part of the decision here is: what is the skill of your people? If everyone just wants to consume reports, without actually trying to build anything, you're going to have to have a central team that builds stuff for you, because the broader part of the organization does not
56:23 understand how to build a semantic model and own it and keep it fresh and good and clean. On the other hand, as your data culture evolves, more and more organizations will be able to evaluate their teams' skills — and I think this is a critical point: skills in different business units will be very important. Act like the business, think like IT: the business needs knowledge; the business needs capable people that are going to be able to do part of the actions that come out of the central
56:53 BI team. You're going to hire — you're going to bring in your smartest Power BI person who can build models and do data-engineering stuff, and they're going to live in a single team. That may work well for a handful of those really centralized models, but I have to imagine, over time, if you're grooming your people well, if you're grooming the organization well, the knowledge level of everyone in the company will start rising. People will understand how to use powerbi.com more; they're going to get more comfortable building their own models; they may work more in Excel; and you're going
57:24 to find not everyone will grow at the same rate. There are going to be certain departments that get this faster and can do more, more quickly. There are some amazing Excel things being built inside the finance world where I'm like, "what the heck are you doing? You should have been a developer, because this stuff is so stinking complex — you're literally building a database inside an Excel sheet." That's not cool. It is cool, but it's not cool. So I know there's skill capability inside organizations — I've seen it. It's now just a matter of, okay, let's channel that energy into the world of
57:56 let’s channel that energy into the world thinking around Microsoft and powerbi things and then also at this and this is where my mind goes there used to be some very hard barriers to data across the organization like I traditionally when we built data things we’d have a Dev environment separated by a test environment separated by a production environment Microsoft is changing this whole mentality especially with data because now it’s just powerbi. com the service is the same service there’s no
58:26 service is the same service there’s no Dev servers there’s no prod servers it’s all one pile of powerbi and I think this is something that organizations need to Grapple with a lot of organizations still feel like they need full separation of Hardware between different environments that is a very on Prem old not old school that is a very that is a way of a way of thinking what is happening now is Microsoft is trying to meld all of this together so so I now use only one
58:56 together so so I now use only one service called power. com and then inside that now I’m building Dev test prod workspaces that are essentially are all living on the same machine the same Harbor it’s all it’s all in the same spot when you move from on Prem to now software as a service I think our mental model of how easy it is to share data across teams the one Lake service is all is all lake houses all rolled into the same API Group so
59:27 into the same API Group so realistically my Dev data is sitting right next to my production data in this thing called one link and I’m using an a user managed identity to access that data or not between those different environments we need to start thinking about we’re not trying to separate physical things by Hardware anymore we’re all going to put everything into a platform and we’ll have proper controls on top of that that actually segment things out and I I think I think for for another conversation is does
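To make the "dev data sitting next to prod data" point concrete, here's a minimal sketch of how OneLake addressing works out, assuming the documented abfss-style path scheme; the workspace, lakehouse, and table names here are made up:

```python
# Sketch: dev and prod lakehouse tables live under the SAME OneLake endpoint,
# separated only by workspace (and the access controls applied to it).
# Workspace/lakehouse/table names below are hypothetical.

ONELAKE_HOST = "onelake.dfs.fabric.microsoft.com"

def onelake_table_path(workspace: str, lakehouse: str, table: str) -> str:
    """Build an abfss-style OneLake path for a lakehouse table."""
    return f"abfss://{workspace}@{ONELAKE_HOST}/{lakehouse}.Lakehouse/Tables/{table}"

dev_path = onelake_table_path("Sales Dev", "SalesLake", "orders")
prod_path = onelake_table_path("Sales Prod", "SalesLake", "orders")

# Same host, same API surface; only the workspace segment differs.
print(dev_path)
print(prod_path)
```

The point of the sketch: there is no separate "dev server" in the path at all, so the segmentation has to come from workspace-level permissions, not infrastructure.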
59:57 Does that new mentality in Fabric work with embedded scenarios? There may not be separate hardware, but in many cases there are completely different resource groups, a completely different bifurcation of data that does not span environments. That's not just how it used to be; for production applications it's a very real thing today. OneLake is different now, and if you use

60:28 OneLake in production, we can talk about that. But what I'm saying is that I'm observing a trend: Microsoft has already done this with powerbi.com. You don't get a dev powerbi.com and a prod powerbi.com; you get one. Is that by design, or because spooling up multiple of these things would be cost-prohibitive? Economy of scale would tell me

60:58 it's efficient to have one thing and only one thing. How that works is probably another conversation, but to me there's a distinct trend or shift happening here: we're moving away from physical hardware separation and toward a service-based application. And when you move into software as a service, it changes your entire mentality. Donald, you're right on point here: it makes it really easy to move between dev, test, and prod if you have the proper access granted. It's all about your email address.
61:29 And that can be controlled all the way down to the different service areas or control pieces. With that, we're at time; we're a little over. This was a really good conversation, and it's something organizations need to start grappling with and figuring out for themselves: where does it make sense to start educating your business units versus your central teams? You'll probably start with a central team building a lot of the really good certified things, but you should have a

61:59 plan: does it stay that way, or are you willing to delegate some of that responsibility to other teams in your organization? It's definitely worth at least having that conversation and deciding what the plan or strategy is, because that will really inform what you do day to day to make it work. All right, with that, thank you all very much, all five of you who hung around for the entire podcast; we appreciate your listenership. I hope this was a good conversation and it makes you start thinking: what is

62:30 certified? How do we integrate self-service with certified content? It's going to continue to be a discussion point, and as this technology changes, particularly with the addition of Fabric, it's going to open up a whole new world of self-service on top of things beyond the semantic model. For those of you playing the drinking game around "semantic model," you must now take another shot. Anyway, thank you all for listening; we really appreciate it. If you found some value in this podcast, please share it.
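The "what is certified?" question is also something you can start auditing programmatically. The sketch below counts semantic models by endorsement level; the payload is hard-coded sample data shaped roughly like what the Power BI REST APIs expose via `endorsementDetails`, and in practice you'd page through the admin APIs rather than use a literal list:

```python
from collections import Counter

# Sketch: audit endorsement coverage across semantic models. sample_models is
# hypothetical data mimicking the endorsementDetails field returned by the
# Power BI REST APIs; a real audit would fetch this from the admin APIs.

sample_models = [
    {"name": "Sales", "endorsementDetails": {"endorsement": "Certified"}},
    {"name": "Finance", "endorsementDetails": {"endorsement": "Promoted"}},
    {"name": "Scratchpad", "endorsementDetails": None},
]

def endorsement_counts(models: list[dict]) -> Counter:
    """Count models by endorsement level ('None' when unendorsed)."""
    return Counter(
        (m.get("endorsementDetails") or {}).get("endorsement", "None")
        for m in models
    )

counts = endorsement_counts(sample_models)
print(counts)
```

A simple ratio of certified-to-total models is one way to put a number on how far along a managed self-service rollout actually is.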
63:01 Share it with somebody else, put down your thoughts, share the link to the podcast, and tell us what you think about self-service BI and how it works with Power BI. Tommy, where else can people find the podcast? You can find us on Apple, Spotify, or wherever you get your podcasts. Make sure to subscribe and leave a rating; it helps us out a ton. If you have a question, idea, or topic you'd like us to cover in a future episode, head over to the PowerBI.tips podcast page and leave your name and a great question. Join us live every Tuesday and Thursday

63:32 a.m. Central, and join the conversation on all the PowerBI.tips social media channels. And if you really want to listen to us late at night and need to go to sleep, we'll be more than happy to drone you to sleep. I really liked this topic, but to everyone else who listens, like Michael, you're always on at a.m.; I see you all the time on my YouTube and my LinkedIn. I don't know what you're saying, but you're talking a lot, so I must be hitting a very niche group with this topic. Anyway, thank you all for listening; we appreciate it. We'll see you next

64:03 time.

[Music]
Thank You
Thanks for listening to the Explicit Measures Podcast. If you have a topic you’d like us to cover, drop it in the suggestion link above, and we’ll add it to the queue.
