PowerBI.tips

CLM Part 4: Validate – Ep. 348

Validation is where content lifecycle management turns from a plan into a repeatable, trustworthy process. In this episode, Mike, Tommy, and Seth walk through what to validate, who should validate it, and how to make validation part of your release rhythm.

News & Announcements

Main Discussion

This episode continues the Content Lifecycle Management (CLM) mini-series, moving into the “validate” stage — the work that ensures changes are correct, consistent, and safe to ship.

Key themes include:

  • What “validation” means across the stack (data, model, report, security, and performance)
  • How to define acceptance criteria so releases aren’t subjective
  • The role of business users vs technical reviewers in sign-off
  • Building validation into your process so it happens every time (and not only after an incident)
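One way to make "what to validate" concrete is to encode each item as a small, reviewable check so "good to release" stops being subjective. The sketch below is illustrative only — the check names, thresholds, and metadata fields are assumptions, not something from the episode or the Microsoft docs:

```python
# Minimal validation-checklist runner: each check is a named predicate, so
# the release criteria are explicit, visible, and repeatable. All field
# names and thresholds here are hypothetical placeholders.

def run_checklist(content, checks):
    """Run every (name, predicate) pair against content; return failed names."""
    failures = []
    for name, check in checks:
        if not check(content):
            failures.append(name)
    return failures

# Example metadata for a hypothetical report release.
release = {
    "refresh_succeeded": True,
    "rls_roles": ["Sales-EMEA", "Sales-NA"],
    "row_count": 48_210,
    "expected_row_count": 48_210,
    "page_load_seconds": 3.2,
}

checks = [
    ("data refresh completed", lambda c: c["refresh_succeeded"]),
    ("row counts match source", lambda c: c["row_count"] == c["expected_row_count"]),
    ("RLS roles defined", lambda c: len(c["rls_roles"]) > 0),
    ("page loads under 10s", lambda c: c["page_load_seconds"] < 10),
]

failures = run_checklist(release, checks)
print("good to release" if not failures else f"blocked: {failures}")
```

The point is less the code than the shape: once the checklist is data, a second reviewer can see exactly which gates a release passed, instead of relying on "yeah, I checked it."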

Looking Forward

If you’re trying to mature your deployment process, start by writing down the validation checklist you already do in your head, then make it visible and repeatable. Over time, you can automate pieces (tests, performance checks, and guardrails), but the first win is aligning the team on what “good to release” actually means.
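One of the first pieces worth automating is the data-accuracy test the hosts describe: run a query against the deployed model and compare the result to a known-good table. A generic sketch of that comparison, with the query stubbed out (in practice the rows would come from the semantic model, e.g. via a DAX query; the region names and figures below are made up):

```python
# "Run a query, diff against expected" test pattern for data validation.
# fetch_sales_by_region is a stand-in for a real query against the model.

def fetch_sales_by_region():
    """Stand-in for querying the deployed semantic model (hypothetical data)."""
    return [("East", 1200), ("West", 950), ("North", 430)]

def diff_rows(expected, actual):
    """Return (rows missing from actual, rows present but not expected)."""
    exp, act = set(expected), set(actual)
    return sorted(exp - act), sorted(act - exp)

expected_rows = [("East", 1200), ("West", 950), ("North", 430)]
missing, unexpected = diff_rows(expected_rows, fetch_sales_by_region())
assert not missing and not unexpected, (missing, unexpected)
print("sales-by-region validation passed")
```

Because the diff reports both missing and unexpected rows, a failure tells the reviewer what changed rather than just that something did — which is the difference between a checklist item and a useful test.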

Episode Transcript

0:38 good morning everyone and welcome back to the Explicit Measures podcast with Tommy Seth and Mike good morning everybody good morning Mike good morning ready to jump into another end of our of our week it feels like the end of summer is coming near we’re getting close to kids getting back to school things feel like they’re very busy we had yeah we just sent two on the bus today which was very exciting it’s already quieter but the boy doesn’t know

1:08 what to do when he can’t mess with someone yeah right we’re finding out now we starting to pick on Mommy and then that’s not gonna go anywhere because Mom’s twice twice your size size kid he’s I think he weighs as much as a dying Stars already so amazing is awesome our main topic today is we’re going through the content life cycle management this is a documentation that has been produced by Microsoft and clearly has Kurt Beer’s hands all over it so we’re going through

1:39 that again we’re going through step number three or section number three called validating validating your data in the reports what are techniques that we use there and that’ll be our main topic for today but before we do that we do have a couple news items one of the main news items being the Microsoft blog is now out for the August update and do have to give an update on my August version of desktop so I I said hey I caught it right on time I updated my desktop and it totally died I couldn’t

2:09 do anything nothing was working so what I did is I actually downloaded a new version of the installer and uninstalled entirely desktop and then reinstalled it again and that seemed to fix it so I don’t know if I got like a bad version if they quickly updated a package and and they maybe I don’t know something something could have happened where they fixed it but on my end the version number didn’t appear to change but after I reinstalled everything it seemed to work again it just really odd I’ve never had a

2:39 download not work from Microsoft at least in the Power BI desktop it’s been it’s been a while you used to do that all the time really oh my gosh yeah dude in in Visual Studio you’re working in there with all the addins and all the the oh man just uninstall all reinstall BI developer tools that’s open source problems yeah oh that tool was a mess like it was just so I remember just working in tabular editor or the tabular

3:12 models inside Visual Studio it just crashed all the time it was really bad this a little rough a little rough back in the day where this is where the Scars come from yeah exactly right I have a little quick beat from the street I guess in addition to the news actually sorry I didn’t I didn’t mean to go too far there I I went too fast did any major announcements or things we want to pull out of the monthly blog post about that I’ll put that in the chat window as well hey there there is a a cool one yeah

3:43 I want to stress test it but it’s also separate is adding data limits I saw that manage performance issues so number of rows that what is this is this on export or is it on it’s in the visual actually shows so you’re only showing you’re only rendering 100 rows or a thousand or whatever the case may be yeah I didn’t know what that M have yeah but like from a performance standpoint

4:14 that would help out extremely right like it’s it’s it goes back to that question I know we talked about this probably in the first few hundred episodes where there was I shared some frustration when we were talking about tables where it’s like I I just want a top 100 yeah right just give me the top 100 like they don’t need to see everything like give give them examples but instead it renders the whole thing right it’s it’s constantly trying to chew through way too much data

4:44 sometimes a 500,000 row data set in a table and and this I love I think this I think this could have some significant improvements to a page load and I think even if you so this is on the filter pane it’s a data limit card as if it was like an additional filter on the on the visual and it even notes I think there I think I saw I read up on this one a little bit it something about don’t worry when you have the little you’ve exceeded the limit and you’re getting

5:14 limited there’s a little warning symbol that shows up on the visual even though you’re being limited it will still show you that symbol so that way you hopefully pick different things and then whittle down more of the data so you have even less records so it actually would show what you would want you could show easily a thousand data points on a visual no problem but tables are interesting because they have to scroll and they’re off the page and they get really wide and so even though you’re rendering a lot of rows those are

5:44 the ones that seem to just be interesting you’re not really doing aggregations or you’re you’re returning all the text Data of the table which could get I guess very verbose Dax query views in the web but you can’t save them so that’s fine to just I guess if you’re testing you can’t save them that’s got to be a close follow there yeah yeah that’s that’s like we got to get it out we gotta get we gotta get feature out they can say get

6:14 them used to it they can save next month yeah yeah exactly and yeah that was I’ve been playing around with that one I like the DAX query views it does a lot it does a couple things that are relevant for they’re relevant for modeling experiences things that you would typically do in Tabular Editor go grab the definition of this measure go grab this measure and all of its dependencies write them all out right so it it does let you edit multiple measures at the same time which I do think is very

6:45 valuable so I really like that feature of this if anything else it’s really good for that John KY does a lot with I don’t know how he does it but there’s testing that he does I think with this either I don’t I’m not sure if it’s automated but I believe it’s manual he actually writes tests in here and says I want this data run this table of data return the results and then he uses that to validate or check data quality this actually fits very well with our topic today around talking about validation to be

7:15 something you could use to do that with anything else that stood out to you in the in the blog more than half of it’s the actual service I I do miss the days of everything just being all major desktop updates and it’s like oh we did something quick on the service so still a ton of co-pilot again if you use it if you got it so I had a beat on the street I I just had someone asked me they wanted to dive into US

7:45 using co-pilot and they’re like just having that conversation it’s like well it’s a separate co-pilot than Microsoft’s co-pilot but it is so well because if I get the co-pilot for office or for Azure which is coming out I’m not paying per usage what you describe about the licensing of co-pilot is a sore spot in my my mind I pay for the Microsoft Office

8:17 co-pilot and I don’t think I use it to its full potential or enough I don’t I have it I’ve done it a couple times to have me write a couple like starting points for emails or refine what I’m saying in an email I don’t use it much more than that Telly what do you use co-pilot for I use it for Microsoft Word primarily I tried using it for PowerPoint until it ruined my ruined my slides slides so there was no undo button there well there was an

8:48 undo but what they did I was I got it was enough to be a word it was so bad it was it it added weird images it removed slides that were important it was like it’s it’s not look yeah anyway added weird images like yeah I was like now my PowerPoint slides have a whole bunch of vegetables in it now I don’t know what happened like those those images that you would see like in third grade or fourth grade for like a presentation and it was all like what were they called again the they were like a type of logo or a

9:18 type of image generated on Old Computers oh was it like a pic Graphics thing kind of yeah very close but it was all that like you could always choose whatever program it was you could choose from the a collection of those oh clip art maybe yeah yeah very close to that but it was worse than that and not even related to hey we’re gonna talk about data so it showed a bunch of offices so I like okay not helpful not helpful so yeah I’ve also in teams a little bit I I’ve

9:48 used copilot I think of the ones I use copilot the most around I use copilot a little bit in teams especially after a video has been recorded there’s I

9:57 guess there’s the the summary or the recap I use that a little bit but again I I feel like if I pay for co-pilot I should be able to use co-pilot everywhere Power BI Office Outlook it should be if I’m paying for it it should be one package bundle deal okay Microsoft figure out how to cross charge yourself I I would I would attempt to use it more yeah not not a fan of the co-pilots right now I will say for those who do have copilot they’re trying to find use cases one that’s actually great is Outlook not for writing because it

10:27 that’s weird always to say like I I need to write two sentences and it’s not going to it’s going to be so far off my own format that’s like I don’t really need it but you can actually ask teams say like hey give me a summary of what did I miss yes and that will actually go through not just your inbox but all the folders I’m like okay this can be helpful if I was an executive that would have been awesome or that would be awesome that’s a good point I have one more other news article

10:57 or or maybe call it beat from the street as well something I’m learning is right right now if if we’re okay to move forward can we transition a bit you can all right so we’ll get off the the co-pilot gravy train here and go to another beat from the street so I’ve been working with moving git code so you know how you have the workspace and you can attach it to your devops git repo I’ve been trying to move code between two different tenants so I have one tenant that has git code in it and have a

11:27 second tenant that does not have any code in it and so I’m literally checking in my workspace grabbing all the most recent definitions of everything that I’m using and I’m doing everything except data flows because data flows doesn’t go into doesn’t go into the the git repos so I’m not using data flows gen two but what I found is if you want to upload like notebooks and Power BI reports and semantic models all of those items are easily able to migrate between tenants when you try try

11:59 to migrate a pipeline you think about a pipeline if your pipeline calls a second pipeline there’s a reference between the two it basically creates a link between the two pipelines it calls it’s called connection or reference Connection in the Json if you have any references between things it will immediately break and give you a non-descript error that says you’re missing something so if you have a connection missing something yeah that was the error it’s a little bit more than that can’t recall it off the top of my head it’s it’s

12:31 like it can’t find no but it’s not much better it’s like can’t find dependency okay it doesn’t tell you which one it doesn’t tell you what the name of it or the guid of it it just says missing dependency can’t upload it and I’m thinking to myself how do we fix this and it was very mysterious to me initially because I would upload a pipeline one would work and one didn’t work well one had no references to anything else yeah the other one had a bunch of dependencies of like interdependent like on a connection string which was in The Connection Manager and then it had a reference to

13:03 another pipeline which was another referenced external object so when you move between tenants it doesn’t use the same guids or it can’t find those objects and then it’s like I don’t know what to do and I’m like why would you ever build something like that if you can’t find it just say broken and let me come in and fix it after the fact it was it was just I like really I don’t know like your description of things especially when you’re moving through tenants it’s not recreate you’re trying to create

13:35 recreate an environment create or you’re trying to trying to deploy something that already existed because it had the ID already Associated your code yeah but you’re like it’s got to it’s got to generate its own code right so what I would argue is you would need like we would run into that same thing it’d be like hey I want to deploy Dev to test same thing but it’s the same as like saying I want to deploy code from Dev in a SQL Server

14:07 right SQL deploy new code into a new environment yeah like I can’t do that without the objects already created so my deployment has to take into account the base level things before I start building like adding in all of the other things that have the dependencies and those those base level things may have different IDs right so your deployment may require you to go build the first stuff then update

14:38 your second set of deployments to have the references for the existing things this is exactly my point but the challenge is if I’m trying to build continuous integration continuous deployment of these things I should not have to manually do anything they should all be scriptable and it’s not that’s my problem that’s my challenge really right so all the connection like so the main issue was was well there’s connections that are not there okay well how do I so what you can do in the pipeline’s experience is okay which pipeline doesn’t have any dependencies deploy that one first and then deploy the one

15:08 that depends on that one so you have to deploy the pipelines in a certain order to get them to deploy correctly okay fine no big deal the interdependent parts of the pipelines maybe not an issue but the connections to an API a SQL Server none of those you can’t script them out you can’t export them from the the central connection management screen and they don’t live in the workspace so you have to go in and either delete all of the connection IDs in the Json and then

15:41 deploy it and then rebuild it all it’s just it it feels like we have the ability like it feels like we jumped the gun a little bit we jumped the gun on hey look everything’s in git you can check in all your changes cool now we have all these other gaps now to deploy things again and this is the same problem you’re going to have when you go from Dev to test to prod because in that world as well imagine you have a Dev Lakehouse and you’re going to go to test you’re going to have a test Lakehouse which will be a different guid and so the deployment pipeline actually needs to be able to parameterize oh we don’t

16:13 want to touch that old lake house we actually want to touch a new one or a different one so there’s there’s all kinds of new patterns that are appearing now because we can go into git it’s just all I’m saying is it’s exposing a new layer of things that I didn’t know existed or challenges that that I didn’t know existed before I thought it would be much easier to deploy stuff and I’m realizing it’s actually way more difficult than I thought anyways just a just a word of note there or a word of caution there needs to be some better tooling and improving in that area you found some more use cases that require

16:45 some attention oh boy more waiting till things deploy or don’t work as you would expect them but this falls back to like a couple episodes ago right where we were talking about like when you adopt these these certain things and this is why you test that out this is why you test your use cases your scenarios your like your strategies your things because in some cases it will work for you right now and others there are going to be challenges that you have to work through and I think that’s that’s more of a blocker to these new things than

17:16 anything is not necessarily like obvious the the blockers would be you could work your way around this yes right or you could dedicate a bunch of time to figure it out but that’s where I think most people just be like yeah I don’t have the time to go figure out all the the bits and pieces to make it work and I think that’s where MVPs play a bit of a role here do a little bit so people come up with Solutions

17:47 it doesn’t work the first time they like H and then you see a bunch of MVPs who were let’s let’s be real we basically love pain so we’re on the bleeding edge we’re building things that don’t work and then we’re like great this is what we wanted to do the experience isn’t what we wanted and therefore how do we work around that or what are what are best practices in doing this so that it’s less work or easier to manage or all those things Mike rather than griping about it you should have just brought the solution today like what

18:18 you ran into this problem I you just go figure it out for everybody well right now the solution has been delete all the connections out so I can start over again and actually just manually build the connections in the new world so basically the solution has been if you’re doing this movement of things delete all the connections get rid of them then deploy because it now will deploy successfully and then once it’s

18:48 deployed then go back through each activity that needs a connection and rebuild it not ideal but it seems to work for now anyway just FYI about that I just want everyone to know that it is an interesting experience it probably doesn’t work as you would expect be patient I’m hoping Microsoft fixes this and adjusts this later on but for right now continuous integration continuous deployment not not my favorite still not smooth enough yet for me it’s between

19:19 tenants right now it’s the example that I’m using but this could also be between environments I I think the same problem exists in that realm as well Roger Roger Roger Roger Dodger all right moving on ahead Tommy kick us into our next continuous integration oh sorry oh man too many words can kick us into our content life cycle management CLM and what what are we going to talk about for validating and give us some framework here about why this is important

19:49 arguably one of the parts of what we do

19:52 that gets the least amount of Love probably so we gluttons for punishment and do it do it [Laughter] anyway because who doesn’t like to go back and back and check so or we have no idea how to do it but we’re again just in case you missed it we’re going through Microsoft’s documentation on content life cycle management which is really the life cycle of a given piece of artifact probably semantic model in Power BI

20:23 we’ve always focused on building but there’s still a lot more so we’ve done we’ve dived into planning developing and managing and now we’re on to validating the content and we’re going to talk about why it’s so important and methods we can do so based on Microsoft documentation what they recommend and really what other probably alternatives are are there I liked the intro of the article where it talks about the role of the center of excellence in how to what role

20:55 they play in making sure content is validated and I liked their definition here the center of excellence is responsible for overseeing Power BI in the organization But it includes decision makers who manage the life cycle of the Power BI content so they don’t manage the content necessarily maybe there’s people that do review and manage content on there directly but it’s more about building the process the organization’s going to adhere to to say this is how

21:25 we’re going to play games this is how we’re going to play ball we’re going to set standards and then communicate to the team or the company here’s what we think is a good pattern to use with PBI with managing content I really like that use case for the center of excellence I I also like that it didn’t emphasize that the center of excellence is the gatekeeper for all release content because I don’t think that’s the right approach I’m more of a Federated approach your thoughts Seth you’re going to say something I agree with you right like the center the center of excellence

21:55 is is not the it can do everything team for you right it it helps you guide and build the standards for business units and organizations yeah I agree with that one all right jumping in let’s talk about what things should we validate so let’s when I when we mean validation I think this is the step in the process where Tommy’s made a report it’s got some bookmarks in it we’ve got the the data in there is correct there’s a lot of things you potentially could validate I would assume right does

22:26 the do the colors match the company branding that we would expect right that’s maybe a potential validation does does the data actually represent the right numbers when I click on sales by region or whatever the use case is right does the data accurately represent what we would expect it to represent based on the operational system that it came from do we have the most recent data and then there’s another note they point out here is there row-level security has the row-level security been implemented correctly is there a security model on

22:56 this one is it filtering the data out per those requirements as well there’s actually a lot of things to check here I was actually not surprised but I mean you just yeah it looks good we’re done maybe move it out but as you get more rigorous around those reports you need like a checklist make sure you get everything right yeah and I I looking at their checklist here and I think it might be worth us going through each one just to talk through like the different areas of testing right it’s valid validation is is

23:27 testing there isn’t there isn’t any from their bullet list that I disagreed with or could add to really well I I shouldn’t say I can always add something to it like like the first one like we talk about functionality like my my first thought was hey how how many times do users or Builders of reports especially when you start integrating and using bookmarks and all the the filtering or cross filtering or whatever you would want to do on a

23:57 particular visual do you walk through the user Journey right of using the report and trying to pull the insights and it’s surprising or not but the functionality of the report page and the call out here isn’t necessarily on that so that’s where maybe this is added Focus but the functionality of like does everything work as you would expect on the report page is like one of the fundamental things you should always do before you’re deploying a report to an audience like a lot of times I’ve run

24:27 into things where it’s like hey I’m trying to do this thing but when I click this button I keep getting reverted back to my other selection right like because I’m saving the filter context in a in a bookmark or something right so the functionality what they lean like on the report interaction page is is the the first add-on to the point in the article they they speak to functionality after it’s deployed is a semantic model complete to

24:58 refresh right are you monitoring the report from a functional standpoint that you didn’t just jettison it out there and forget to schedule a refresh right or the first refresh was successful so good good starting point in terms of before you launch and just some of the basics of hooking up the the report itself right like after deployment did you connect it to the the data source that it needs to be like a Gateway right do you or is it

25:29 cloud-based and you don’t have to worry about that can’t tell you the number of times I’ve had folks inadvertently forget that what you have in your PBIX file requires a little Plumbing connection when you deploy it to the service I yeah and I like how they broke it down in that first one so there’s six things here first one’s functionality what is the functionality of the report I think you pointed out the one there is schedule refresh does it schedule that first refresh and then

26:00 they said testing functionality which I also agree with in testing functionality there’s actually a link to somewhere that also talks about do the slicers work are your drill down actions working do you have drill through working correctly that was the link in there okay yeah is there any expressions or DAX that you built that needs to work correctly cuz I’m also assuming here too we have like potentially a semantic model and a thin report so you may have measures in the thin report that are doing things making sure all the look at the pages do all the visuals render is there is there any

26:31 broken visuals there is there a default page you should be on right because another when you save the report whatever you save it as that’s the page it renders so sometimes you want people to land only on a single page I can’t tell you the number of times I’ve saved reports with the wrong page highlighted and have to go back out and redeploy the entire report just because I didn’t save the right page yeah so those are the things that are interesting there when you go into reports anything

27:01 else that that we missed so when it comes to the checklist I want to go into the the admin briefly because I so you’re transitioning topics it’s really no no no no no no no no no no it’s definitely related but this is where I go with this when we talk about manual testing okay there’s nothing I disagree with everything okay you’re talking about the buttons and the drill throughs and all those things well not just that it’s the Val so I’m going through and I’m doing a

27:31 manual test right this is nothing at this point is automated what are you testing everything we just said what here’s what I’m trying to say if I’m functionality that’s they’re calling that functionality yes okay so I’m thinking about the side view because we we where how do we verify that someone’s actually gone through and done the checklist itself because I think that’s one of the problems too unlike something in a more coding environment where there’s pull request and there’s verification

28:03 there’s not a lot of in a sense verification besides just hey yeah I checked it where there’s that open communication and I think that’s where a lot of validation falls short with powerbi because either we have a checklist that’s a Word document that we go through and things can be a little gray interesting Tommy that you mentioned that I see where you’re going with this one I don’t know if that’s parking lot for later but no man keep going it’s I I

28:33 know you’re going with this one so you develop a process I would argue with you well I would argue there’s a couple things right one is the center of excellence should Define what this looks like when we say validate like so for example typically the people who are building reports should not be the ones always validating it they can be if your Department’s small enough but the people building it will be like yeah yeah it works I’ve clicked the buttons and things get missed so I think part of this is potentially a second person checking out those changes but you bring up an interesting

29:03 point Tommy with the git repos because as soon as you turn on that for your reporting or your your workspace you can actually see individually what code pieces have changed and so you you stop doing the did I adjust the bookmark or did The Bookmark work correctly you can actually go in and see did the code of the bookmark actually change so what you’re speaking to I think is more of if you think about a developer developers do this all the time you have

29:34 a website you’re building some code there’s a feature request Seth comes in and says look this is not working these bookmarks are all jacked up go fix it Tommy Tommy says okay great I’m going to make a task to go fix the bookmarks so

29:46 you go into the workspace you download the reports you make the updates and then you check it back in and when you do that with Git you have the ability to your point there is a phase gate where Tommy says I’m done the changes have been implemented and there’s literally a record a system record that says okay you’ve made your changes we’re checking it back in and then Seth can go back in and say okay I’m going to review the code and physically see which items you changed okay you did change all
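The phase-gate review described here — pulling a report from a git-backed workspace and seeing exactly which pieces changed before re-testing them — can be sketched in a few lines. This is a minimal illustration assuming a simplified, hypothetical report definition with a top-level `bookmarks` list of name/state objects; it is not the actual PBIP/PBIR file layout:

```python
import json

def diff_bookmarks(old_json: str, new_json: str) -> dict:
    """Compare bookmark definitions between two versions of a report file.

    Each input is the JSON text of a simplified, hypothetical report
    definition containing a top-level "bookmarks" list of objects with
    "name" and "state" keys.
    """
    old = {b["name"]: b["state"] for b in json.loads(old_json)["bookmarks"]}
    new = {b["name"]: b["state"] for b in json.loads(new_json)["bookmarks"]}
    return {
        "added": sorted(set(new) - set(old)),
        "removed": sorted(set(old) - set(new)),
        # same name, different serialized state => needs a re-test
        "changed": sorted(n for n in old.keys() & new.keys() if old[n] != new[n]),
    }
```

A reviewer can then re-test only the `added` and `changed` bookmarks instead of clicking through every one.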

30:16 the bookmarks I’m going to go to the report verify that it does work as expected and then when someone else approves it then you say yeah we’re done and move on is that what you’re speaking to Tommy that’s exactly what I’m saying but that process should be built by the center of excellence and I don’t think that process is going to live the same for a self-service user as it is for a central certified user or content I think depending on the content you have these rules might be

30:50 different I’m a little I’m a I’m a little lost here only from the standpoint that what I think we’re talking about is two different types of validation right one is one is code code tracking code deployment right I I I went and built something and it meets the specs of what we want to push out the door yes sure I like is there is that rigorous yes could we develop test cases yes but it takes a long time and does it fit this scenario at the end of the day somebody still needs to validate or or that the report

31:23 is working as expected that still falls into I think the initial premise of Tommy’s which is the checklist it’s how do we make sure we enforce the checklist with a business group or if the CoE generates that and any report that gets shared to a wider audience goes through a methodology or functionality testing to ensure that we’re putting out a quality product and I think that

31:53 falls I’m not aware of any way in which that doesn’t fall outside of the people process right and I’d be interested if you think there’s some automated way we can do this but Tommy to your point you can lead a horse to water that’s the process it still requires that people follow the process and that’s why within here there’s probably a RACI where there’s owners your job is to ensure you’re following

32:24 the process right and that’s to me how you enforce the checklist whoever’s job it is to walk through the checklist as long as that checklist exists and they’re clear on how to follow it it’s their job to follow it and the CoE shouldn’t this is the other thing too you shouldn’t be just building process for process’ sake and if you have all these processes and they’re there for good reasons because the value to the business is there then people should

32:55 be following them agree right it’s not just a recommendation you need ownership in that yeah and there’s a pushback thing that comes from if the business is asking for these changes and IT says wait a minute we’ve done them we’re done there’s a handshake of responsibility that you’re trying to do with this process well but the difference is I guess in my mind call it the business call it the team that owns what the report is

33:25 supposed to look like right or do right go build me this thing this is what I want it to do you return a product to them and it either does what they want it to do or it doesn’t right and then there’s some back and forth and there’s some bugs and you go fix them and then there’s the product that they have and that’s the value behind somebody doing functional testing on a report is it doing the things we would expect it to do and then further which we haven’t talked about which is probably going to take the rest

33:55 of the podcast is data accuracy right is the data coming through from the report accurate and valid I think at the end of the day I believe you’re right Seth regardless whatever this process looks like to make sure stuff is validated I do know there are and I don’t know how you do this with powerbi again this is something I haven’t tested yet but I know if you’re building an application there are UI tests you can do where you can physically program click this

34:25 button does the screen show up and you can visually check things or objects on the page that verify if the UI still works or not so imagine you build an app and you’re going to add users to the app part of your testing process should be hey if I click this button type in these things and hit save an object appears and then I can see a user right something along those lines that’s very complicated to program there’s not a lot of easy things to do that with and also if you’re using an iframe or you’re using a

34:55 you’re using an iframe or you’re using a powerbi report you need to have your app your testing app go into powerbi Portal find the report there’s just a lot of things that are mechanically there that don’t necessarily all I’m saying is to add more automation to that checking process is work it’s people it’s time and so you either solve it with individuals time and I think you really do have a QA environment when you’re so the there sorry there’s so many thoughts I’m trying to get organized on my thoughts when you’re talking about

35:26 certified content I think the importance of that content is higher therefore it’s worth the spend to make sure to your point Seth the functionality is still right the data accuracy is correct are the visuals still performant does it have the right security those things need to be much clearer and the center of excellence should identify you’re right a checklist that says every time this gets deployed let’s define the process what does this look like what are the things we should be doing the same every single time

35:57 and to your point Seth when we get bigger more important data sets or models or whatever hopefully we don’t have a ton of them define what that looks like so that sounds interesting to me when we start talking about self-service I think you bend the rules a lot more because I don’t think you want to spend so much time or money or effort people are your most expensive asset and so on self-service you’re just going to take their word for it and then yeah you’re going to kick out stuff that’s going to be junk it’s going to be messy

36:27 but as long as you have a mechanism for ownership right Mike’s building something in my department I push out something and publish it to the organization and Tommy goes in and says Mike these numbers are all wrong there’s no way these numbers are valid there needs to be a process for Tommy to push back to me and say prove it or go check these these are not the numbers that I see and then I need to be able to have an expectation to respond with no here’s what I did to validate the

36:57 with no here’s what I did to valid the number yes these numbers are in fact correct and and that needs to be an onus like I need to own that information if I’m publishing out to the organization or Shar with people me the publisher is the owner of it I take responsibility for that information and so I think a lot of what I see right now in organizations and probably it’s been there forever is trust trust becomes very difficult when you start delegating per like trust to different teams and making sure they do their things correctly yeah but I I also think that’s where you can lean into the certification process of the of the

37:28 report where it’s like you put some skin in the game right you can’t get this level of certification without going through yes a more rigorous process and for as much as I push back on testing functionality in an automated fashion I do think of all of the areas data accuracy is one that is probably the easiest to build automated test cases around or at least a page that is your

37:59 checks and balances of calculations right like where you can do the quick hey everything I’m doing in this report the vast majority of it is going to run through these measures so if I’m filtering or slicing and dicing could there be some oddities yes there could be but in terms of deployment or predeployment to models validating data like that is one thing that I think is probably doable without a ton of effort it still

38:31 requires the building of the specific use cases and checks and balances but there’s a lot you can catch when you understand the data sets and what you’re looking for just from a validation perspective I think that one would be the first I’d tackle if I was going to automate something in this validation process I want to give our listeners a very tactical or tangible way of doing this so bear with me here so that’s
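The automated data-accuracy test cases described here can be a simple comparison of the report’s measure totals against expected totals from the source system. A minimal sketch, assuming you can get measure values out of the model somehow (in a Fabric notebook that might be a semantic link measure evaluation, but any query path works) — the measure names below are hypothetical:

```python
import math

def check_measures(actual: dict, expected: dict, rel_tol: float = 1e-6) -> list:
    """Compare model measure values against expected source-of-truth totals.

    Returns a list of (measure, actual, expected) tuples that fail a
    relative-tolerance comparison; an empty list means every check passed.
    """
    failures = []
    for name, exp in expected.items():
        act = actual.get(name)
        if act is None or not math.isclose(act, exp, rel_tol=rel_tol):
            failures.append((name, act, exp))
    return failures
```

Run it pre-deployment: if the list is non-empty, the release gate stays closed.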

39:02 why I’ll be a little pointed around my question and you guys feel free to push back on this if you want what is the process that you use today to validate the functionality of things what does that look like for you right now do you have a process and I’ll start because I’m positing a pointed question here functionality validation is developers take on features developers build a report they get the report completed and because I do a lot of certified content we push it to a separate environment so

39:32 we build the report build the data set publish them to Dev from there we use a deployment pipeline we move them to a test environment and then the

39:41 users of said report the developer is expected to say Tommy I’ve updated this data set I’ve added this visual here’s a bookmark that I’ve made this is a button that I’ve added or modified go test these things I literally send them a list of things that I changed so it’s almost like a change log here’s the things I’ve changed go check those specific items and then I send it to them and then I wait or well wait is a relative term I wait slash nag them to

40:11 get it done because everyone’s busy and sometimes we don’t have a lot of time if it’s urgent it gets done quickly but if not sometimes you have to ping people a couple times say hey did you review it hey we’re waiting for it to get done hey check it out and then once they say I’ve reviewed it I take their word back to me and we document it in the fact that maybe it’s in a Teams chat or maybe they send me an email at best but that communication back to me is like okay it’s good and then I move it to production and we go to prod so that’s how I think my process works today do you have

40:42 is yours similar or do you have a better process so one attempt that we did because to your point that back and forth can drag on especially when you’re getting outside of that world of seeing the actual changes where you’re like yeah yeah I’m still looking at it we actually utilized the Power BI API and we basically created a database of

41:12 all the current reports that were out there before we had to manually do it and had this catalog and we then set up in a SharePoint list basically the items of the checklist and we created a Power App to walk through it so the Power App would say this has been verified here’s the functionality okay so you’re writing it down now you’re taking things and writing them down somewhere yeah and it was not again just a handwritten note you actually went to this app and then everyone can see if

41:42 that’s been tested but now you have a record of hey I sent it to Tommy Tommy said yes I did check them yes yes yes done move on right exactly now you have documented results and that way later on when someone comes back and says hey this doesn’t work you’re like well you signed off on it but again you’re now adding responsibility right this is the point exactly you’re documenting you were responsible you checked it we’re moving on I like that that’s cool I think
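The Power App plus SharePoint-list pattern described here reduces to a simple record type: who verified which checklist item, when, and with what result. A hedged sketch of the underlying data shape — the field and item names are made up for illustration:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class SignOff:
    item: str      # checklist item, e.g. "Bookmarks work"
    tester: str    # who verified it
    passed: bool
    checked_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

def outstanding(checklist: list, signoffs: list) -> list:
    """Checklist items with no passing sign-off yet.

    The release gate stays closed until this list is empty, and the
    SignOff records double as the audit trail of who approved what.
    """
    passed = {s.item for s in signoffs if s.passed}
    return [i for i in checklist if i not in passed]
```

The point of persisting these records (SharePoint list, database table, anything durable) is exactly the accountability discussed above: "you signed off on it" becomes a fact you can look up.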

42:14 that’s a great solution yeah in our environment well in many places I’ve been it depends on the type of reporting that you’re doing whether it’s external for customers or internal for teams oh okay that makes sense in all of it we do code reviews first and foremost especially in the data engineering layer and the model specifically right so it’s not just one person deploying something out into the wild without somebody else reviewing it I want to ask you a

42:45 question on that real quick yeah in code reviewing you’re probably doing model and thin reports right that’s what you’re doing there predominantly are you using TMDL or the BIM to do those code checks like what are you no it’s a show me the solution ALM Toolkit yeah we use ALM Toolkit but it’s not a deploying code review correct it’s just show me like walk me through the business logic yes let’s look at the code let’s

43:16 look at let’s look at the Cod like let’s look at the peer review the yeah it’s the business logic that is generating the dimensions facts show me the relationships in the model show me the show me the like the Dax measures your calculating and it’s not like a hey like click through every button right so it’s more on the it’s more focused on the data accuracy yes than it is functionality because there’s an expectation that you’re You’re Building Things in a functional way I’m gonna give this one’s for freeth do you use this do you use semantic labs to do that

43:50 no you should highly look at it I’m seriously looking at Semantic Labs to do this because you can do all the things you’re saying the things that I love to hear right check this table check this measure check these things you can make a semantic notebook that has all those things prepared for you and I’m just learning this right now I’m so excited about it that would be a great opportunity I love what you’re doing there that’s a great process but if you join that with the Semantic Labs notebook you can literally run the notebook against the model’s XMLA endpoint and say here’s a list of all relationships did anything change

44:21 here’s a list of the table with a couple dimensions in it and it’s the same dimensional table every single time you run your checks and you can start really so that would be on changes of an existing artifact that’s very true if it’s net new you can’t do that a lot of what we’re doing is net new okay never mind so that wouldn’t work as well but yeah I’ll check that out thanks but also documentation right after that we document it depending on like I said the particular use case sometimes
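The "here’s a list of all relationships — did anything change" check described here is, at its core, a set comparison between a committed baseline and the model’s current state. In a Fabric notebook the current snapshot could plausibly come from a semantic link query over the XMLA endpoint — that sourcing is an assumption; the comparison itself is plain Python:

```python
def relationship_drift(baseline: set, current: set) -> dict:
    """Compare two snapshots of a model's relationships.

    Each relationship is represented as a hashable tuple, e.g.
    ("Sales[CustomerKey]", "Customer[CustomerKey]", "single").
    """
    return {
        "missing": sorted(baseline - current),      # in baseline, gone now
        "unexpected": sorted(current - baseline),   # new since baseline
    }
```

Commit the baseline alongside the model; a non-empty result on either key is something a reviewer should explain before the deployment proceeds.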

44:52 Semantic Labs to Confluence or a page in the report is what we found super helpful for business users because you add in like hey this is what you can expect in the report okay right here’s the snapshot or the high-level logic of how we’ve defined the business speak of what you should expect to see in the report that’s cool and that helps I

45:22 report that’s cool and that helps I think drive an understanding Andor any issues right with hey my report it doesn’t show me the right stuff like are you sure about that did you check the page that tells you what what part of the business we’re looking at because it’s not everything yeah and when we deploy like obviously one goes to a QA team right they’re they’re there with their test cases and running things through and making sure it’s operational and performant and all the other the and internally wherever possible we

45:53 typically have one or two teams that are like the SMEs of the business logic and area that we engage in so we try to give them access but to your point where you don’t get feedback from business what we typically try to do is only release to one or two of the audience that understand the data the most yeah because then there’s a push for them to review and give feedback because the

46:23 whole audience is not getting the report right away because when you do that it’s more of a well we’ll get back to it if something’s wrong but it’s not in the timeline that you want you want to make sure that it’s good because you want to move on to something else what they don’t understand is it’s extremely disruptive if they don’t give you the feedback and like three weeks later they’re like hey and I’m like no man I’m way down the road I don’t have time for this anymore you’re out of the queue

46:53 man your window yeah you missed your window you’re back at the back of the line now that’s exactly right and then that’s cool like potentially not deployment but the way I think about it is continual monitoring so especially in our external cases we have a lot of monitoring reports around our raw data the data that generates the models that are front facing and we have an eye

47:25 we’re we’re just we have we have an eye on refresh schedules like data limits data sizes per table that’s coming in to those pipelines to ensure that on a daily basis we catch any breakdowns within the pipeline that would automatically show up in reports I think I think your game has changed a little bit Seth because when you start giving things to customers externally there’s a different level of scrutiny I guess maybe what I would say around the stuff that comes out of there and I really like your idea of like

47:55 having almost like a I’m going to call it what Microsoft does dogfooding you’re dogfooding a little bit of the reporting directly to the customers that you trust because it’s like hey we’re coming out with a revision of this report here’s a beta version it’s early right check it out let me know what you see so you could use some of those trusted customers that understand you’re in the space of yeah it’s not quite there but the advantage is hey you’re getting very early stuff before everyone else cool

48:25 but they also have the understanding that it may not be perfect and they actually provide you feedback the challenge I think is really getting the feedback correctly of like it didn’t work as expected that’s hard to do and Microsoft uses their MVPs to do that a lot for their products hey this experience is not what I expected I find random things all the time I was just talking about this I found something inside the Spark definitions in powerbi.com if you change you can change individual parameters or settings in the Spark cluster when it

48:55 spins up it’s a Spark environment definition the cells to pick the parameter are too short and everything gets cut off it’s like dot dot dot on all these things okay fine but for all these settings I had to hover to see what the different settings are and I’m like this is just not a good experience it needs to be longer it’s just simple things like this it’s little edge-casey things like that occur and you’d be like hey this doesn’t make sense to me can you fix this and it seems fixable but it’s those small or minor things you’re like wait a minute what

49:26 things you’re like wait a minute what are we really building in the UI here so it actually works yeah I I would I would highly recommend the smaller group or Smee of of data especially in

49:36 environments that are more complex in terms of putting together business logic invaluable as far as teams are concerned but anyway that’s what I do I think one of the areas that gets the least love in terms of validation is probably performance just in general like how much time are your folks actually spending on performance tuning or is

50:07 it once they get the right answer that’s what’s going out the door what I’m saying is a lot of that can happen in DAX calculations and I agree with that and can cause problems how would you performance tune something like that my question here is how do you do that that is where having the one person on the team who is very well-versed in all of SQLBI’s talks is very useful on a team I’m

50:38 not kidding not wrong I’m not kidding having someone who can study it and understand that is definitely invaluable on your team 100% agree with that yeah so let me put it two ways then right if tuning models and writing DAX is your jam go become that person this is a great opportunity every company needs one of these people with that skill set if you want to be awesome and you want to be hired immediately for powerbi things no matter what you’re doing if you can be

51:08 what you’re doing if you can be incredible at modeling and performance tuning be I’m going through a lot of resumés for companies right now helping them hire people no one talks about tuning or optimizing or I’ve spent time delegating my time around being a tuner and Optimizer of Dax statements you put that in a resume I’m my ears are perking up I want to talk to you a minimum to see what about that stuff because that’s I think incredibly important and we’re talking about cost and spend things another thing I look for now in resumes if we’re giving like

51:38 tips and tricks here yes be the one to talk about hey I build processes but I build them with optimization and cost management as a focus as a front-and-center issue because lean into that like I save money what is my niche skill set I build you stuff that scales cheaply not just that even in large organizations that’s

52:09 where you keep track of hey we’re on an A4 A5 A6 capacity and running tons of customers on this your value add as one of these performance tuners is hey I made sure we were able to stay on this capacity instead of previous years having to ramp up for every X number of customers or X number of interactions with the system like there

52:41 I’m 100% with you the value there is in longevity in performance tuning existing models right and it’s the same thing I used to do in SQL like DBA management you’d walk in and just be like oh my gosh what is somebody doing with this it’s like this is the 50-nested-view like view on view on view on view and that’s why it takes 20 to 23 hours for this process to run

53:11 and you can walk in and fix the process and it runs in 35 minutes done that right same with DAX and models yes right people can come in and go yeah that works but that’s not the best way to do this what you should do is reshape it and it’s always simplified in DAX right it’s like I’m just going to use this function instead and this is how we call that and now it works a hundred times faster than it did in the past I totally agree that’s a super valuable niche skill right there is

53:44 that understanding right of I know there’s five different ways of doing this the way you’re doing it is not required right now and it’s the slowest way I know the fastest way so I’m going to modify it and make it the fastest agreed massive improvements man massive so sorry I didn’t mean to go on a tangent there about the article around that piece but that was one of my main nuggets there to focus on that because I think that’s extremely relevant in the space yeah and

54:17 so the other points in there I don’t think we talked about a ton we just referenced performance security obviously I think the data accuracy goes along with that when you’re using row-level security and things like that effectiveness right like are you producing the insights that the end user wants and then accessibility as well always something to be cognizant of yeah and I feel like a lot of times accessibility is definitely a needed talking point there’s a lot of people talking about and discussing that I

54:49 think it’s great definitely do it but I feel like it’s one of those things that gets lost in translation sometimes once you check the report for accessibility one time you’re not always checking are all the colors always right are all the things labeled so I think that’s one area where it would be nice if Microsoft actually built us some tooling accessibility should be something that Microsoft should just give us out of the box it should literally be a note on the page that says hey for accessibility you need more contrast between these there should be just a

55:19 couple checks that it can do to just give it back to you I know the community has built some things around this but I really do think the accessibility checker should be something that is just a list of things that you can automate and there needs to be a tool to help you produce some of that stuff which would be interesting I don’t really have any other comments I guess maybe we should wrap here I think the three that I care most about when I’m doing this are for sure does the thing function the way it’s supposed to manual
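The contrast check the hosts wish were built in is actually well-specified: WCAG 2.x defines relative luminance and a contrast ratio that must reach at least 4.5:1 for normal body text. A small, self-contained implementation of that formula:

```python
def _luminance(hex_color: str) -> float:
    """Relative luminance of an #rrggbb color per WCAG 2.x."""
    channels = [int(hex_color.lstrip("#")[i:i + 2], 16) / 255 for i in (0, 2, 4)]
    # Linearize each sRGB channel before weighting
    linear = [
        c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
        for c in channels
    ]
    return 0.2126 * linear[0] + 0.7152 * linear[1] + 0.0722 * linear[2]

def contrast_ratio(fg: str, bg: str) -> float:
    """WCAG contrast ratio between two colors, from 1.0 up to 21.0."""
    lighter, darker = sorted((_luminance(fg), _luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)
```

Black on white scores the maximum 21:1; a report theme whose text/background pairs fall under 4.5 is exactly the kind of thing an automated checker could flag on every deployment.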

55:49 processes building people and process pieces to usually solve that one data accuracy either coming into the model or representing what’s in the model accurately and I think with sempy notebooks and now some of the newer notebook experiences you can start automating some of that and then performance I think for sure you could automate a bit more of that in the notebook experience but again it takes time and effort to do those things one thing I just wanted to note about

56:19 performance have you had those times when you go into a report and one visual is slow the slow visual there’s no reason why you can’t take the DAX of that visual and land it inside a sempy notebook and execute that exact DAX for that one visual on the page that is slow every time you do a deployment or it becomes one of your talking points in the test hey how does

56:49 this visual operate what is the DAX used to generate that visual and can I just use that DAX inside the notebook and then just get a millisecond run time on how long it took to run or execute those things are really interesting to me that’s where I’m going to spend a lot more time figuring out how to build automatic tooling around that stuff would be really neat cool all right with that any other final thoughts Tommy any final thoughts before we wrap on this I feel like we could extend this for an entire month on just either both
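Timing the slow visual’s exact DAX on every deployment boils down to a small harness like the one below. The `run` callable is whatever executes the query — in a Fabric notebook that could wrap a semantic link DAX evaluation, which is an assumption here; the harness itself only measures wall-clock time:

```python
import statistics
import time

def time_query(run, repeats: int = 5) -> dict:
    """Execute a query callable several times and report wall-clock timings in ms.

    `run` is a zero-argument callable, e.g. a lambda wrapping the DAX
    query captured from the slow visual.
    """
    samples = []
    for _ in range(repeats):
        start = time.perf_counter()
        run()
        samples.append((time.perf_counter() - start) * 1000)
    return {"median_ms": statistics.median(samples), "max_ms": max(samples)}
```

Persist the result per deployment and you get exactly the regression signal discussed here: alert when the median run time creeps past the budget you set for that visual.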

57:20 ideas and that might be a bit aggressive but we definitely could talk about it again for sure it gets hard to breeze through even though we’re doing this series it’s hard even on an article like this just to try to talk about it in an hour yeah which is why we put the article in the chat so that folks can go out and hopefully be tickled with some of what we’re talking about on the page and go read the rest of it for themselves if it interests them more it’s true yeah I definitely

57:51 them more it’s true yeah I I definitely do want to point out the documentation put up by Microsoft is 100% topnotch it’s good stuff so definitely go check it out it’s worth your time to go read it and get up to speed on it awesome with that we really appreciate your listenership your ears are important to us both of them if you have both I don’t know if everyone has two ears but if you have an ear we we’d love to bend your ear to our conversation we really appreciate you taking the time out of your day your run your bicycle whatever stay safe out there if you’re running we we appreciate you listening to the podcast if you like this content if this

58:21 was something that was relevant to you we really appreciate it we don’t do any advertising we just do word of mouth so please let someone else know give us some feedback on LinkedIn or Twitter or something that you use give us a heads up there and let other people know that you like this content we really appreciate your feedback and chat thank you very much for putting some really great thoughts in there as well we really appreciate you joining us live Tommy where else can you find the podcast you can find us on Apple Spotify or wherever you get your podcasts make sure to subscribe and leave a rating it helps us out a ton if you

58:52 have an idea a question or a topic that you want us to talk about in a future episode head over to powerbi.tips/empodcast leave your name and a great question finally join us live every Tuesday and Thursday at 7:30 A.M. Central and join the conversation on all of powerbi.tips social media channels awesome thank you all so much and we’ll talk to you again soon


Thank You

Want to catch us live? Join every Tuesday and Thursday at 7:30 AM Central on YouTube and LinkedIn.

Got a question? Head to powerbi.tips/empodcast and submit your topic ideas.

Listen on Spotify, Apple Podcasts, or wherever you get your podcasts.
