PowerBI.tips

CLM Part 3: Develop & Manage Content – Ep. 341

In this episode, the team continues the Content Lifecycle Management (CLM) series with a practical look at developing Power BI content and managing changes. They also cover CI/CD updates in Fabric warehouses and share a quick take on the AI bubble question.

News & Announcements

Main Discussion

This episode continues the CLM series with an emphasis on how teams actually build content, promote changes, and avoid “wild west” report development. The crew discusses practical development workflows, where standards help (and where they don’t), and how you can keep Power BI work moving without breaking downstream users.

Looking Forward

As more Power BI teams adopt Fabric-native tooling and Git-backed workflows, having a clear development and change management strategy becomes the difference between “we ship” and “we scramble.”

Episode Transcript

0:32 good morning and welcome back to the Explicit Measures podcast with Tommy Seth and Mike good morning everyone good morning gentlemen and a happy Tuesday there's that voice I miss feels like Wednesday or Thursday I wish it was same been a long week for two days already yeah all right well jumping in today our main topic will be talking about our content lifecycle management series that we've been doing for a while now that Microsoft has put this out

1:02 recently we're going through part three about developing and managing content we're going to go through that section next all right jumping in let's do some news here what do you got for us Tommy we have an update from the Microsoft Fabric blog what do you got there a pretty major update CI/CD with warehouses in Microsoft Fabric and there's really four major features here that Microsoft is releasing today and one is Fabric Git integration with

1:33 warehouses which is awesome really excited to see that deployment pipelines as well for automating schema changes around different stages but there's also two other ones I think people heavy in the SQL world are going to find fascinating it's going to be SQL project support and also dbt integration and those are more I would say developer SQL based that is a pretty big update especially for those who are pretty dependent on warehousing yeah I
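
As context for the dbt integration mentioned here, dbt models are just templated SELECT statements kept in Git that dbt materializes in the warehouse; a minimal sketch (the model and column names are invented for illustration):

```sql
-- models/marts/fct_sales.sql
-- dbt compiles the Jinja refs and materializes this SELECT as a table
{{ config(materialized='table') }}

select
    s.order_id,
    s.order_date,
    s.amount,
    c.customer_name
from {{ ref('stg_sales') }} as s
join {{ ref('stg_customers') }} as c
    on s.customer_id = c.customer_id
```

Because the models are plain text, they version and review like any other code, which is what makes the Git integration discussed here useful.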

2:03 I see the reason for dbt I mean I just don't love it it just feels like it's a lot of extra work unless you have the right project where you're scaling out a lot of items I've been working with some companies that are using dbt and it's interesting to see how it's being integrated but it feels like it's a very common pattern if you're using Databricks people like to use dbt across Databricks so nice to see the integrations coming I really do like the fact that it's now added into the Git provider

2:35 you're providing which means now we can use Azure DevOps or GitHub which is very nice options there yeah we'll see where this one is going to go I'm very happy that there's better automation integration with the deployment pipelines just feels like deployment pipelines just don't quite do what I want all the time there needs to be some more enhancements I think in the future for how deployment pipelines will work to develop now that we have Git yes I think it's changing my perspective of like what it should be doing yeah I

3:06 feel like it should be doing more now because we have Git integration this is actually pretty funny I've been talking with my wife about training and content this was probably a few weeks ago she's like so you don't have to make any more slides or training sessions I'm like actually I have to recreate everything so I had recordings for deployment pipelines not that it's gone but there's a whole series around Git and obviously everything around here it's like well remember what I did like three years ago I basically have to do that again now

3:37 yeah it's constantly changing yeah the life of a trainer [inaudible] I don't know how people keep up with it honestly there's so much stuff changing the UI changes the interactions change there's so many features coming out now it's hard to keep up with things see this is where I think the strength of someone with ADD is oh another thing oh I need to figure it out works really well Lord oh let me redo it I

4:07 knew I could do it better the first time yeah exactly yeah I don't share that same passion with stuff I've already [Laughter] done I'll put more details around these new database projects in the warehouse with using Git there's actually a nice little article on Microsoft Learn documentation that supplements the blog post so I'll put that also in the chat as well all right Tommy let's do a quick beat from the street here you've got some thoughts on AI yeah so this

4:38 is actually coming from two places that I think made it worthy for a beat on the street from a few client conversations asking a lot more questions than previous and then also honestly just from the industry itself and I'll share the industry one since it's out there not private for those who are baseball fans you may watch some commercials and it seems like every technology company has an AI commercial now but I'm realizing not all of them

5:08 have to do with AI Google has one they're like hey 4% more strikeouts equals 5% more hot dog sales Google AI I'm like that's just statistics and so this trend started happening and then I'm talking to a lot of clients just about that like what can we do for AI I think Seth I did send you the required reading around AI and I'm finding in the forecast that I think we're going to get into an AI bubble and I hate saying this

5:39 but I think we're getting to this point where it's AI everything AI is gonna be stamped on Salesforce stamped on like Yelp like you have to have some AI integration and in the same way my clients and I'm sure your own organizations out there are probably talking about what can we do with AI without understanding what it is two other aspects of this that I'm again I don't want to say concerned about but are just leading up to this Zuckerberg from Meta who's big in AI and

6:10 Google's CEO basically said the same thing they said we would rather heavily overinvest right now there's not really a market for it in terms of what is the product here like everything's open source but there's not really something I'm going to actually sell to someone yet so I'm curious what

6:42 you guys hear because I'm hearing somewhat of the same beat the same rhythm from a few of the projects I'm working on a few of the clients and I want to hear if you're sensing anything similar well what aspect yeah well that's what you guys are here for for the filter okay so let's just go on the aspect of everything AI we need to integrate AI but again without

7:14 the clear picture of what you're actually going to sell or what the effective results are going to be we just need AI I think people are still discovering where to put it honestly I think I read a recent Gartner article that said AI is interesting especially the generative side right summarize this article generate something from something else I think that's useful and it can be used in very specific use cases but I think the article is really pointing at the real value for companies is the collection of data into

7:44 a single area getting more access to your data and the idea that you're going to be able to apply algorithms to your point Tommy it's just statistics it's easier to use it's closer to where you need your data to be there's going to be a lot of algorithm improvements and those will also produce a lot of really good value here too but I think everyone's trying to lump in that statistics or maths or throwing a lot of things at the computer to do the AI based things where I'm very excited and I've definitely talked to

8:14 data scientists about this and the ones that I've talked to totally disagree with me on this one but there's something to say for being able to throw a computer at lower cost and a whole bunch of algorithms at a bunch of data and say figure it out computer test it hyperparameterize the data figure out what is actionable and what is correlated and what is not yeah you need a person there to prepare and get the data ready but to be honest

8:45 it's cheaper to run every single algorithm through a computer overnight than it is to pay a person to find the right algorithm and tune it exactly the way you want so I don't know Tommy maybe this is a solution looking for a problem potentially but when I go to code in VS Code and Databricks and when I'm seeing code and I'm asking for recommendations or patterns or things that show up it's pretty good so I don't know I don't think it's a bubble per

9:16 se but I do think there's a lot of hype around it and if I'm thinking like you're an Adobe a Microsoft a Google if you don't have this built into your product I think people are going to physically leave cuz they're just excited about it mhm what about you well man it's a hype bubble is what it is right because like something was really telling in what you just said right at

9:46 least that I heard which was the sentiment around the challenges and no one really knowing the challenges of finding the right the appropriate use case for AI and then developing on it and just the ambiguity of it all and nothing really being out there and then the converse of that saying consumers are going to leave or not use companies for not using AI do you realize how much of a conflict that

10:16 is like of those two things and that's what is so frustrating to me and we've talked about this a little bit in the past right where I think what's happening in organizations

10:29 a lot of them anyway is you have every company wants it as a differentiator right so instantly to your point Tommy yes you're going to see some things that are legit just statistics or legit just things that you can do in training models and doing some things that have been out there for forever and a day in data science that's not AI but they're going to call it AI right it's the predictor it's predicting my forecast like yeah but that's just

11:00 algorithms right you're not doing anything especially special and fancy so there's going to be tons of stuff to wade through right from a consumer perspective because AI is the buzzword it's the thing that everybody wants to sell is that to say that the pursuit of leveraging large language models or some of the things that AI is accelerating for us in business is not worth it no I think tons of companies

11:30 are spooling up tons of ideas and POCs and trying to leverage where and how to plug AI in for business value and one of those for companies selling something is saying like yeah we leverage this and this is how we do it the challenge with that though and what's so frustrating

12:01 is AI leverages well formed sets of data right it still has to categorize it into meaning before it starts spitting out answers to internal users or customers where those answers have to be 100% accurate and that's where I think some of the challenges are coming in like I'm assuming tons of companies have POCs stood up what I'm also assuming is they're not getting the results that they want and there's no way they can productionize it or put it in a place that is going to shape the direction of their business yet and that's where also

12:34 I would imagine in many cases they're realizing that if you haven't invested in the data infrastructure and consolidating all your data guess what that's step one right like you got to get this stuff into a place where you can start to make meaning out of it or put it in buckets so that if you're going to leverage an LLM it knows how to navigate your ecosystem like if it's going to go look for documentation on something and provide an answer there can't be 10 versions of it

13:04 there's got to be one or it has to know how to grab the one or there's so many different challenges with the output right everybody's focused on the report it's the delivered thing they have absolutely no idea the complexity of everything underneath that has to be there in order for the thing to work and ultimately with AI man those engines everything that's ripping across all this data making meaning out of things

13:36 and in the unstructured realm of a lot of business I don't think people are getting it to work where we've seen success is in very well structured areas like code right coding tools like almost every tool you see has the assistant or the prompt or the help me make my code because there's defined parameters for that right there's only so much you can do within

14:06 the code framework for AI to figure out to deliver you a better and better response every single time and I think the value add for streamlining those processes is probably going to be much more in front of what the market in general is throwing out there from like AI AI AI right so it is a hype bubble right now I think you're hitting the nail on the head and I think to your point the coding is excellent but

14:37 right now this is really dominated by the people who are already doing coding like GitHub Databricks and what they're doing but then you think about all these other companies what Meta is doing what Copilot is doing right now it's like okay how are we actually going to make this and then the companies who are going to dedicate themselves to AI what's going to be where are we going to generate the revenue like I guarantee you all these companies right now are losing hand over what is it

15:07 hand over fist dollars on the amount of money they've invested into the GPUs in Copilot compared to what they're actually getting out of it yeah but at the same time that's a different ball game in my estimation because it's like one of those mega companies lands on the thing and it's billions instantly right like people are clamoring for it they're going to want it yeah and like an interesting scenario in my head you think about the Microsoft ecosystem maybe in the future something

15:39 that is extremely expensive or an add-on like Purview all of a sudden becomes super cheap and free because the AI version of that is hey do you want to click this button I'll tell you your whole ecosystem right yeah like well yeah I'll pay compared to what like you're charging me 5,000 a month and I just hit this button sure as opposed to I hit it but then I have to go build it and configure it and categorize and do all the things

16:09 myself there's tons of opportunity there yeah I think I'm with you Seth on this one I really agree with your point there around if you don't compete in this space if you don't supply the infrastructure there's a barrier to entry to play in this space there's a barrier to entry to make large language models there's a barrier to entry to render and output results and I think if you don't play in the space their point Tommy is well taken they're not going to be able to invest in the infrastructure someone's going to come

16:39 in and build a new chip a more efficient way of doing it a model that takes less time to train and if they can return a prompt at half the cost that you can there's like another price war happening here I think at some level right yeah if you can run a prompt at a dollar and I can run it at 50 cents great that makes me more efficient and so good point I think they're going to continue to keep at it and this is all new like I think to your point also Seth we don't know what people are going to produce with this right the technology is there it's doing some

17:11 incredible things but what does this really mean what does it look like and how is the collective training of these models really going to impact things long term I don't think we know yet and so this is a lot of people saying look if we don't prepare for it no one will be able to innovate and create good ideas one thing I will say all of this stuff takes an inordinate amount of energy it's all very expensive just from an energy consumption standpoint I think in order for the US

17:41 to stay competitive in this space we need better cheaper more renewable energy so we can bring the price of energy down I think if you did that it would be immensely helpful for a lot of these organizations and I think you see really big companies like Microsoft trying to say hey we're buying green credits we're building data centers that are all green and they produce as much as they consume I think that's the right approach because you can't have these large data centers just consuming all the power from everything it's going to take a lot of effort to make these things work well take it from me I have a

18:12 fan in my office now for my PC because I'm running a ton of local things and it gets warm my room gets warm from the PC you don't need a heater now I think you're ready for the winter one of the things I want to stress is I'm not poo-pooing the pursuit of AI in businesses what I'm challenging is the hype like you can't skip the line on this stuff you can't and I think that's

18:42 why you're not seeing companies everywhere just be like oh we don't need to do anything it's AI right like it's so much simpler and faster with AI you're just seeing it in spots we're seeing it where it helps our lives so far right so let's pursue it 100% agree with the pursuit of it it's just temper the hype cycle because this is not like a normal technology thing where once you implement it you can say yeah we're a Power BI shop right like

19:13 congratulations what does that mean like oh we built a report well you're an organization of a thousand people like people think that it's all over your organization now right yeah with AI it's not like oh yeah we turned on AI okay do that like what does that mean so I'm just a little pessimistic about the hype cycle thing not not liking it I guess because it cheapens it or hopefully it whiplashes around to the

19:45 point where your organizations who still don't realize data is important need to really get on the bandwagon and understand that there's a lot of data engineering work that has to happen to set these things up for success my last thought here the semi equivalent example is you look at the dot-com bubble so to speak and I'm not saying it's going to be the same thing the internet didn't go away after the bubble burst we knew the internet was here to stay but there was such a hype around what we can do all these things and a lot of companies weren't

20:15 ready for it one of the biggest I think stocks at the time was Pets.com or like delivering pet food but we were too early in the game in the early 90s the bubble burst but guess what we're still using the internet we're still using a lot of websites so I think there's somewhat of an equivalent I don't think to the same degree but I think we're going to see the same thing where AI is not going away but we're definitely going to see that to your point Seth it's not

20:46 necessarily going to be everyone being AI especially when it becomes a lot more formulated because of the realization that well there's a lot of foundational things we need in order for this to work to be honest I'm really very excited for it I think it's going to be

20:59 really good I think it's going to be very interesting and I think it's already making things a lot harder to discern what's going on because people can easily run a video through these things and have it autogenerate content I think what you're going to find is a lot of the AI is going to generate a lot of new video so video seems to be like the main media that everyone wants to use and consume because now we have little video players on our phones for everything so I think everything's going towards this video media consumption summarized

21:29 trying to deliver distilled information quickly yeah it's impressive what it can do honestly didn't we experiment with taking our podcast and making it international yeah we did we tried to change the language on it right let's do the Spanish version of the podcast yeah you can do it I don't know how well it would work I can't check it I can't see if it's right or not if it's swearing at people the whole time I don't know like ah not sure

22:01 but I don't know it's very interesting anytime we have this conversation in any forum I always conclude my side with since this is a recording and will live on and AI is listening I look forward to the day when our new overlords take over I'm on your side personally machines I am not a fan of

22:36 Neo that's funny yeah that's great that's good all right well with that let's jump into our main topic for today really good conversation that makes me laugh I'm enjoying that so the main topic today is going to go down I'll call it part three we've been talking about content lifecycle management for a while this is an article that has been written by the Microsoft team we believe it's been written by or has the influence of Kurt Buhler on there we love all the articles that come from Kurt so we're really excited about that this is part three developing and managing

23:07 content so Tommy you want to give us maybe a bit of a summary of what we're talking about today and then I guess we'll dissect the article a little bit more absolutely so yes this is our third part in a series around content lifecycle management CLM for short the first one we went over the overview our last part was around planning and designing the content and the infrastructure around it and today is all about developing the content how you're going to develop it file types

23:38 and then where you're actually going to put that pathway this is really focused a lot on Git integration talking about using SharePoint what types of files are you going to make is it binary our old friend PBIX or are we going to go into PBIP and a lot more developer-centric tooling regardless of the path that's chosen here after we've done our planning it's how do we develop it whether it's individually on a team or centralized I like it I think that does

24:09 change depending on how large your team is and what occurs there I also think part of this consideration for me dovetails into how are you going to maintain it what's the maintenance going to look like when do you need changes we've done a number of monolithic I'll call them the monolithic data models we threw everything in it and it just got really difficult to manage understand and work with and so we decided okay is there a logical way to break that model

24:39 into sections or pieces or areas that make sense to those business users but not have to provide this massive really complex model because sometimes weird things happen with DAX and you really have to understand exactly what you're doing and understand if there's any weird relationships or something in there that you want to be able to cleanly adjust so I think a simpler model is easier to maintain but let's get into the rest of the article here all right deciding how to develop your content I

25:10 think we should just walk from the top down would be my approach here so that's kind of how we want to take this article yeah again you'd have to rip through it there's a lot here the one caveat I think I do want to point out here though is in this develop and manage content section and I'm not going to be critical but some of this is still future right like just be aware that some of these things are not GA right so

25:40 I don't like the fact that it's presented as such because it's like oh use this and it's like yeah but that's not GA like there are still limitations to that so I'm supposed to start implementing processes and standards around CI/CD when these tools aren't even where I need them all to be yet and that's my only caveat so as we read it some of this is preview yeah that's a good point and I think there's a big part to your point Mike that decide how to develop content

26:10 which is the first main section here this actually was something that I wanted to bring up right off the bat too one of the things and I'm going to quote here When developing semantic models and reports using Power BI Desktop we recommend using the PBIP files instead of PBIX files okay note warning warning warning this is in preview so this is literally the first note on this is already sorry I didn't mean okay keep going no no yeah and When developing semantic models using XMLA tools we recommend using Tabular Model

26:42 Definition Language TMDL also in preview instead of BIM files this is my first and again I'm not going to say my caution or concern but I want to raise this to you guys if you're working with a team who's saying how should we develop content and I'm going to say regardless of the skill level of the team because at this point would you say your primary recommendation the primary path you'd go is PBIP because right now that's not what I would say well there's an interesting

27:14 feature so the PBIP format for those who don't know is a way of dissecting or taking a report and making a bunch of little tiny files that describe everything so the PBIP format is now the new Microsoft direction on which they're going to head they're going to do all these new things with Power BI it's basically the Power BI project file format so the BIM or the traditional data model gets broken into a whole bunch of these TMDL formatted

27:46 files TMDL being Tabular Model Definition Language I think yeah and that's defined as okay here's a table here's all the properties and columns and measures inside this table here's another definition for all the expressions that are going to make tables so it's a lot easier for us to digest and make changes to the models and things while I like this while this does make sense to me there's a little bit of a missing gap between what that does in the service and how to publish things so
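
The table-and-measure definitions being described live in TMDL files; a rough sketch of what one might look like (hedged: exact syntax and properties vary by tooling version, and the table, column, and measure names here are invented):

```
/// Sales.tmdl -- one table per file inside the semantic model folder
table Sales

	measure 'Total Sales' = SUM(Sales[Amount])
		formatString: #,0

	column Amount
		dataType: double
		summarizeBy: sum
		sourceColumn: Amount
```

Because each table is a small indented text file rather than one binary blob, diffs and code reviews become meaningful.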

28:16 the PBIP format is useful however you now have a collection of a lot of little files that exist somewhere and unless you're using an integrator like Git or something that's going to be able to track all those individual files it actually gets a lot more challenging for the end user where they were typically just using a PBIX file that was typically what people would just use so I agree with you Tommy I think this is the right direction I think we're really
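
The "collection of a lot of little files" being discussed looks roughly like this on disk once a report is saved as a Power BI project (hedged: names and layout vary by Desktop version and preview settings; "Sales" is a placeholder project name):

```
Sales.pbip                      -- pointer file you open in Desktop
Sales.Report/
    definition.pbir             -- report definition
    report.json                 -- pages, visuals, layout
Sales.SemanticModel/
    definition.pbism
    definition/                 -- TMDL folder
        model.tmdl
        tables/
            Sales.tmdl
```

Because each artifact is many small text files, a tracker like Git (Azure DevOps or GitHub) is what makes diffing and merging them practical.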

28:47 building this feels like what they're trying to do with like Power Query or other visual graphical interfaces that live on top of a lot of code stuff it feels the same right hey we're going to build the structure of the code we're going to try and make it as easy as possible for you to just work with a single file or a collection of things it feels natural and seamless so I think that's where they're going here me personally I'm not a beginner developer so since I'm not a beginner developer I love it I think this is great I want to use it all the time I want to have every project be on this and what I think I'm finding from

29:19 talking with customers there's a bit of resistance cuz eh we're not quite comfortable yet we don't really understand Git we don't know how to use it yet the business intelligence groups don't have the maturity that I think Microsoft wants them to have or advanced analytics teams should have and we're getting there there's a lot of business users I keep trying to make this point clear when I talk in MVP meetings hey be aware we're bringing a bunch of business users to highly technical things we still got to think about making it as easy as possible I think there's also a missed

29:50 opportunity with the PBIP format why not in deployment pipelines expose to me the PBIP format for reports the PBIP format for semantic models we can see some of that right now but everything should be TMDL like anywhere in the system if I'm editing reports online I should be able to have a code editor that shows up and shows me here's what the visuals are doing on this page I should be able to see everything on the page as if it was in TMDL format so I have all the UI the GUI there and then I

30:20 can also go into the code side and just modify the code are these visuals aligned on the same XY axis I should be able to go into the code and just look at what the numbers are just type them in if I need to that I think is very easy to do and it's the experience that I get when I use a pipeline because I have code that defines the pipeline I want to see the code that defines the things anyways it's very interesting what are your thoughts Seth there were a lot of them I so there's a couple things one we're
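The edit-the-numbers experience described here can be sketched in a few lines of Python. This is a hedged illustration only: real report definitions carry many more properties than this, so the field names and structure below are simplified assumptions, not the actual PBIP/PBIR schema.

```python
import json

# Hedged sketch of "layout as code": each visual carries a position block
# you can edit directly. The schema here is a simplified assumption.
visuals = [
    {"name": "salesCard",  "position": {"x": 40, "y": 112, "width": 200, "height": 120}},
    {"name": "marginCard", "position": {"x": 40, "y": 260, "width": 200, "height": 120}},
]

def align_left(visuals, x):
    """Snap every visual to the same x coordinate -- 'just type the number in'."""
    for v in visuals:
        v["position"]["x"] = x
    return visuals

aligned = align_left(visuals, 32)
print(json.dumps(aligned[0]["position"]))
```

The design point being argued for: when layout is plain data, aligning visuals on the same axis is a one-line edit instead of pixel-nudging in a GUI.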

30:52 talking future stuff right we're these articles I think are designed for future things TMDL is going to be at some point the de facto standard it's not right now you have to cut over to it like there's a lot in here that isn't ready for what the base core thing for like content life cycle management is supposed to address which is versioning of different files reliability yeah right so when the tool sets or tools or all

31:22 the things you're telling me I need to use are not in a state of full reliability then like okay great we're talking future state we're plagued

31:31 with a problem we talk about things right away when they're released I think this is one of those like technical documents that has got to be out there right you have to say that this is how you're going to manage this ecosystem they're providing information to us to digest but right now in context there are some pieces of this that are not production pushed they're not GA can you use them in the right hands like you Mike yeah you can are you going to roll in business

32:02 intelligence teams to things that they don't understand the nuance they can't like there's a lot of ambiguity no you're not because that's not introducing a reliable deployment process into your company which is what you want to do I think the challenge for it so having said that pushing and saying that all business intelligence teams need to go down this route I don't know if I disagree with because in layman's terms like I think there's some

32:33 very like straightforward deployment process things through using all these tool sets right I think first there's a knee-jerk of like oh my gosh we're adding so much complexity because you guys are talking about ripping apart these files and there's so much nuance and we can do all this code check-in and code check-out like it doesn't have to be like full-on like you're using every aspect of this right to get value out of it however you can go deeper I think the

33:04 only push I have in there or maybe challenge point is one of the value adds to a business for Power BI was speed to insight and everything we're talking about here is reliability but that's it that's technology we've always done that and you have to in Fabric especially when you start building data infrastructure to scale but those are

33:36 uniquely butting up in here because is there some processes that a business intelligence team runs through here in a light version and then there's also the heavy version because those are like the certified production level like the entire company uses these reports versus the business unit or the person because I think there has to be a differentiation here because you can't lose I think one of the largest value adds of business intelligence which

34:07 is speed to turnaround and we have tools that allow you to quickly produce results for people and we can't go the opposite direction with CI/CD or yeah this content life cycle and you're pressing a major button that I completely agree with even if we made the assumption right now that all these things were GA that they were already generally available you would have to look at the Fabric skills matrix right where there's a lot of and I think this is earlier to Mike's point there's a lot of

34:38 assumptions here on the level of experience and skill of that team because it can't be just one person on the team who knows all this this has to be everyone who has to be familiar with looking at code looking at and understanding the process with Git the rest of the article goes into how we actually integrate Git but this is not just something you just turn on and do a quick session an hour session with your team and say hey this is what we're

35:08 going to do start converting your files and just continue business as usual this is a complete process change a complete tooling change the software and what you're actually accessing so you would have to do a full skills assessment Mike to your point or I think to all of us when I see an article like this I'm jogging down the steps to try it out but again we're also more on the developer side or we've been in the game for longer or we're just

35:38 that nerdy even if I'm working with a client I'm not going to start developing in PBIP because I'm gonna eventually have to hand this off and if there are right I'm gonna make sure that they agree and I will show them the values of what that looks like I do think to your point Tommy there's like an easy win and there's a more complicated elaborate version of this the easy one is all right just turn on Git and just have your workspace sync to it great anytime you change something it'll tell you when it looks different just sync it up boom

36:09 backups like simple but that feels like a lot more of what we really wanted when we were trying to do SharePoint files publishing to the service so I like that part I also think that the addition of the Git integration allows for web authoring so even though I may not actually be doing the full PBIP format I'm at least synchronizing my files with some Git I think the web authoring is much less of a risk to me now because I can actually have copies of things I can see

36:39 it's being checked in I can see there's content being created there so the only downside of this is to your point Seth Git and GitHub it's still a little bit more preview and not everything's covered yet you build a data flow still not covered inside Git so yes most of the artifacts you're trying to produce can be saved over to Git but not all of them so you have to choose wisely what you're going to be putting in there because some of it might not stick with the Git repo yeah a lot of this is I feel a bit of a
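The "choose wisely" caveat can be sketched as a simple guard a team might run before relying on workspace Git sync. Everything here is illustrative: the supported-type set below is NOT an authoritative list (supported item types change over time and should be checked against current Fabric documentation), it just captures the situation described in the episode where dataflows did not sync.

```python
# Illustrative guard: only some Fabric item types round-trip through Git
# integration. This set is an assumption for the example, not an official
# list -- at the time of the episode, dataflows were among the unsupported items.
GIT_SUPPORTED = {"Report", "SemanticModel", "Notebook", "Lakehouse"}

workspace_items = [
    {"name": "Sales Report", "type": "Report"},
    {"name": "Sales Model", "type": "SemanticModel"},
    {"name": "Ingest Flow", "type": "Dataflow"},
]

def split_by_git_support(items):
    """'Choose wisely': separate what will stick in the repo from what won't."""
    synced = [i for i in items if i["type"] in GIT_SUPPORTED]
    orphaned = [i for i in items if i["type"] not in GIT_SUPPORTED]
    return synced, orphaned

synced, orphaned = split_by_git_support(workspace_items)
print("not backed by the repo:", [i["name"] for i in orphaned])
```

The takeaway matches the conversation: treat the repo as covering most but not all of the workspace, and plan a separate backup story for anything that falls in the orphaned bucket.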

37:09 copy paste with general Git general Git not necessarily has to do with Fabric in terms of like one of the things they talked about like release history I'm like I don't think we're doing release history we're doing version history we're not doing types of releases with Git integration I think it'd probably be a good point here we could touch on this all day but the next aspect here is talking about setting up workspaces and I like this one another one I'm intrigued to get your guys' thoughts because

37:39 again this goes into probably the technology that you're going to choose yeah I think this is another I think the questions that are asked in this next section are really relevant how many workspaces do you actually need knowing that the more workspaces you build to your point Seth we're now just slowing down the development process if you add more workspaces if I have workspaces for my models I have workspaces for my reports and I'm doing Dev test and prod that's now six workspaces that I must manage so

38:12 we need to make sure that whatever we're trying to choose if we're going to increase the number of workspaces let's make sure we have the right design of that in our pattern so that everything just works smoothly together across those things so I really like the questions will you separate the workspaces by item type reports or models will you have access to each workspace do you need to have control separate yeah do you separate out prod to only the person who needs to deploy or the yeah yeah so there is likely and this
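The arithmetic behind "that's now six workspaces" is simple multiplication, sketched below; the helper name is ours, not from the article.

```python
# Splitting content by item type multiplies with the number of environments.
def workspace_count(item_type_splits, environments):
    """Workspaces to manage = item-type splits x environments."""
    return item_type_splits * environments

# Models and reports in separate workspaces, across Dev/Test/Prod:
print(workspace_count(2, 3))  # the "six workspaces" from the discussion
# A single combined workspace per environment halves that overhead:
print(workspace_count(1, 3))
```

Small numbers, but the point stands: every extra split you design in pays a recurring management cost in every environment.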

38:42 is one thing I think that Microsoft is not doing a great job of communicating but I think it's there Dev test prod yes there are different lakehouses artifacts semantic models but the data itself in each of those environments very much could be different you're pointing at Dev server versus test server versus prod servers if you're doing your design right there's three different places now in the business realm I feel like those are all pointing at prod Dev test prod because Dev is building things test is

39:12 still but it's all the same data each time but when you're in like healthcare or HIPAA or financial you can't be showing all that information in Dev and test you have to have some mock environment that's doing a simulation of data before you get to production so you actually have to limit access to that production environment so I do think that is something that's important here again this is the difference between is this internal reporting or say something you're going to give out to an external client or something more sensitive I think that also changes the pattern of design here so

39:42 we're blurring the line very heavily between IT and business I think in this space I would love to see more here around the workspaces for the type of content and the more I'm thinking about this for whatever reason Mike you hit a nerve and triggered something for me it makes a lot of sense to build the content or the workspace development the structure of what you're going to build solely around the content that you're going to do because to your point there's

40:13 probably a lot more use cases where your lakehouse is a gold standard or your semantic models are maybe a gold standard where they're not going to change rapidly across each one it's going to be the definitions the reports themselves that need that Dev test prod but if you're dealing with a lakehouse if I'm a team dealing maybe the business dealing with a lakehouse maybe I only have two lakehouse workspaces the test and prod maybe I just have one where everything's flowing through because one I don't want to duplicate

40:44 everything I want to make this as easy as possible rather than trying to point my parameters in a pipeline or from the Git point of view but when I'm dealing with semantic models or more rapid development that's where the Dev test prod probably makes more sense I don't know if that's making too much sense but I think the gist of it is if I'm dealing with more let's say overlapping content lakehouses and gold semantic models certified semantic models that's probably staying in the same place a

41:15 little more and not going from workspace to workspace yeah I agree with that one a lot on the flip side though you also have other teams when we're talking about Fabric right yeah oh yeah data engineering teams like these workspaces could be just the straight data yeah like ecosystem and in those cases it makes a ton of sense from I guess getting more involved in Git integration where you're deploying changes from one environment to another especially roll

41:51 back so here's a question at this point in time in everything we've done with CLM my opinion here is how you organize and structure the workspace

41:59 development is the most important right now I feel like this would be the one that would cause the most confusion or set you up for the most success again how are you gonna organize your workspaces is right now probably the most critical juncture of what you decide to do and what you build most critical juncture may be the wrong word but the critical part I'm going to lean into I would say it's a very large

42:29 like if you're talking to this article and saying what of this is very important I think at the beginning of a project it's extremely important to define how many environments and what is your build pattern for those environments build pattern and because even down below they have like here's some patterns that may be of interest to you here's approach one Dev test prod here's approach number two Dev test prod and there's different patterns on okay this

42:59 there’s different patterns on okay this environment or this test and space is only for validation we’re not going to build things here in pattern number one you could build things in the test environment that you then potentially get approval on and then move forward to production so I think I think the idea here is there’s a lot of flexibility in what you can do the the build of these things how you architect the the workspaces highly dictates how much effort it’s going to

43:29 take you to get something out the door later on and what process you're going to follow so maybe I'm trying to say it differently I'm trying to say you need to focus on documenting what is your process and making it very clear and very well known this is how we're going to develop I think that's the number one most important thing I also echo or agree that there's going to be a lot of content that's just in one workspace and it's a combined model and report all-in-one there's going to be a lot of content

43:59 there’s going to be a lot of content like that when you start getting more certified in your levels of content or if you’re trying to be more metered around regular releases consistency of quality like there’s a there’s a push to move away from a single workspace to more of these structures that allow for more checking and development and you’re trying to separate the development away from the checking and the production pieces so what I want there’s a balance there what I want as you’re describing there I I want something in the UI that lets me

44:30 categorize workspaces it's a mess right now and you just opened up this huge realm of like as an admin or if I want to organize what's going on in this ecosystem called Fabric right I can have a workspace for anything any flavor now data only test Dev prod UAT like production it's business it's whatever how do I know what something is in this ecosystem but

45:01 just by naming right give me some way to categorize workspaces even if it's on an admin level right like and it doesn't even have to be like you pick it for me let me assign labels right and potentially you could argue that that's a workspace name or something but I don't think it would need to be I would just like hey give me my icons of or not icons because I want to report on it right I want to monitor like who's

45:31 building workspaces what does this look like what is my ecosystem of data look like in that bucket perspective because there's so much that we can do with workspaces right like we were just saying it can be like the type of workspace is it permission related is it deployment related is it where I have data related is it my models is it my reports is it my buckets of workspaces that I keep monitoring for the business and say this is a business

46:03 owned workspace because the content within there is going to be wildly different I don't know it's just a thought that when you were describing all these things I'm like man I want this thing that lets me report on my ecosystem I want to unpack what you're saying there a little bit more it feels like what you're saying to me Seth is you want domains it sounds like you're describing that a little bit well domains are not I see what he's saying from the categorization point of view domains are according to how Microsoft's setting

46:34 it up right now like departmental based like the type of content for the business you're making an assumption that it's departmental based but it does not have to be departmental based I'm doing the documentation you're right yeah domains are just an owner of a collection of workspaces grouped together so Microsoft is calling them business domains but that doesn't mean like it so to your point Seth right here when you said some of the things there were some words there that triggered thoughts in my mind right these are the collection of

47:04 workspaces that workspaces are associated with domains maybe yeah maybe you manage that through domains so I think there's something there and the reason I'm getting into this because I'm unpacking domains a little bit right now is because I think domains are a delegation of okay so there's a lot of things in Power BI where we're starting to shift responsibility from one team to the next this has been a challenge already from IT like for a million years it's just been around a fight between okay IT you produce this and then you get access to this

47:34 in the business domains feel like that collection of workspaces for me right now where hey we're going to build this bronze silver gold lakehouse data things okay well who owns that who's the domain of what that may be and that may be made up of many different workspaces you might even have Dev test prod in there it doesn't matter it means there is a domain that is IT centric the domain can collect all the workspaces together and you can even how do you want as an admin how do you want to manage the domains or how do you want to manage the workspace as well well

48:05 that's if it's departmental like they're describing in the article around domains you can then associate that team that department with their own Fabric SKU and now you can handle here's what the IT organization will do and own you get an F64 SKU build all the data through gold that's your responsibility you own it you pay for it it's out of your budget it's your space then you can have a whole bunch of these other semantic models or other domains like HR or finance and now what I would call those

48:35 is now you have another delineation too you can delegate some authoritative properties to that collection of workspaces you can allow them to turn on some admin features right those are the things that triggered my mind domain subdomain I just pulled it up real quick it's exactly what I would want so they have it good job thanks but it's not implemented in a tagging like one thing that domains does not do which I think makes that

49:05 does make sense is oh sorry one thing that domains will do is there's no concept of tagging or having multiple workspaces in different domains so you get a domain the workspace can only be associated with one domain or there's a subdomain that lives that you can put in the subdomain which is then part of the broader ecosystem yeah but if you think about parts of the company this makes a ton of sense and what I want to stress here is I think the domains help us enforce or let us hand off

49:36 responsibility between teams and I think there's a very large mistrust in businesses that don't allow okay I'm going to produce these tables that is my responsibility my responsibility ends at the table Finance you pick up the semantic model from there build what you want to build you build it you own it but the idea is we will help you understand our data we'll make sure our data is accurate but there's a contract between us the IT organization and now you the finance team here's what we're going to give you this is our contract it's going

50:06 to be these columns these tables it's going to be correct you do with it what you want after the fact and it's up to the business now to own that process and if they fall apart and it goes to crap they're on the hook for it it's their responsibility so I think there's this delegation of responsibility that has to happen inside these ecosystems and honestly without the concept of subdomains like and because I want to completely agree but without that concept of subdomains it would not be I think what you're trying to reach for but yeah this

50:37 you’re trying to reach for but yeah this could be achieved utilizing the domains and subdomains that’s actually a pretty good point to actually utilize not just business Concepts but yeah what we’re actually actually doing awesome very good point I want to keep going down so I just want to Advance a little bit forward through here the article we’ve been talking a lot about things there is a lot of lot of patterns on how to deploy environments of things most of the article I would say the middle area of the article is all about patterns using git deployment

51:07 Dev test prod so there's a lot of images that are very relevant here but there's a lot of unpacking in the middle section and then the next section I think I really want to emphasize here just very briefly is how do you version control your changes there's many different options depending on how big your team is what they're comfortable with are you a SharePoint are you a OneDrive company do you have comfort in Git or Git integrations do you do this in Teams do you check in check out files these are things that I think I've been doing with teams

51:37 for a long time now this makes a ton of sense and I really like that they've explained all the different patterns here that you could use because some teams will be more comfortable using Git others will just be like nah we're just going to use SharePoint and what I would say if you're going to use SharePoint or OneDrive to store your files I highly recommend you build a library collection that enforces check-in and check-out on all your files it's a huge win to be able to have that detail in

52:07 included inside that file formatting so I really like that and also if you go to SharePoint you get automatic versioning I think depending on what your settings are in SharePoint 100 versions of your file are just automatically saved so you just save the file up with the same name boom you just get 100 versions of it and you can always roll back to previous versions which is really nice so I really like that and depending on your process right like in those versions you

52:30 can add comments right so true almost like if you check in check out you can require that they make comments on things correct love it which I recommend that you do because SharePoint stores versions so what you can get into is like okay somebody made 10 version changes in a single day why what's going on what happened which like in cases where you need to revert or find the old file which one's the right one yeah with the fires

53:02 I've been in I'm happy to download 10 different versions in a single day just because I know I actually have it better to have it than not that's an understatement but the Git part here I don't know how much we want to touch on this because again something I highlighted here to what we talked about before was they talk about all the Git integration and then there's this little yellow bubble that says warning Git is currently in preview and not all

53:32 features are supported yes so I'm still struggling this getting there it's getting there like I said it's future state yeah I think what I do like here though is they're outlining the first one is what the vast majority of people I think use today if you're using Git on this it's so easy to get on to approach one you get the version control going you can do the OneDrive refresh if you want to don't even need to just get some sort

54:02 of version control for what you're deploying right and then it's a gradual like why do you want to do this and then approach two ramps it up a little bit throws in a little repository like then so it's progressive and I think the more progressive you go Microsoft is actively working on ensuring that those advanced scenarios work super reliably for all use cases where in six

54:33 months or three months or whatever the case may be you'd be able to look at this documentation just rip through it and go to the hardest one and be able to implement it in that way does that mean you can't do that in today's world no but it requires somebody like Mike right an expert who understands this whole ecosystem to oversimplify and train you how to use things in a certain way the challenge here is like when you're to your point I think earlier Tommy going to try to bring on a team and everybody's new to

55:04 this you don't understand the nuance and the pitfalls which means you're going to hit a roadblock most likely right and you're not just going to implement something of challenge and complexity in your pipeline to deploy things unless there's value for you doing that which is reliability and that still goes back to the point at which where is this really good to use whereas it potentially still not an area you would implement until your team is so fluent that this is a five

55:35 minute addition it's not adding hours or people figuring out the break in the chain or going back and like it's I can't deploy something problem where we have been in the past right but like the tools and technologies like if you guys were around in the IT technical sector where it's like I'm straight SQL database we're developing Reporting Services reports or God forbid you're using tabular models and Visual Studio business intelligence tools like you're

56:07 triaging and trying to figure out what the problem is just to get something deployed for hours yes that can't happen anymore like and I'm not saying we're there with this but it needs to be a fluid process and I think we'll get there but some of the more advanced scenarios require some of these tools to hit GA and a lot of the standardization of not bopping around to different solutions and just here's how we do things here's a very clear path and some training can be done or

56:38 developed and bring people on to this because the value add long-term especially for these high-vis reports and models is reliability and you do need to bring that as a team to an organization for certain use cases and these would be the ones and why we're spending so much time talking about this when we do become GA with this or this becomes more mature I think this deserves a whole episode or a series around branching and

57:08 Git because you can't just do Git without the concept of branching well this is the diagram or the image that's presented right after that little warning box that you gave Tommy is excellent I really like this one the image is super good it talks about the content creator it talks about the local repo of like you build the this is Kurt's hand all over this I can see this really well this is amazing this is exactly how it works but the contrast to me is so stark between I have a PBIX file and I save it to SharePoint and then I look at the Git

57:39 process that's like three steps save put in SharePoint download work on it save put in SharePoint check in check out like that's simple this process around Azure DevOps and Azure Repos to use all this is way more complicated and you'll know there's 12 or 14 steps in here of different things you can do so the level of control drastically increases the quality can be measured much more in detail one thing I would really like to point out here is I've been touting this concept

58:09 of the release manager for months I've been talking about there's a person there's a persona there's an individual who's in charge of making content go from Dev to test to prod and there's an individual this is the first documentation piece that I've seen where they're talking about the content creator and then they're talking about the release manager and I think we're finally having the right conversation the release manager persona this is someone who knows how to use Git who knows how to move things through environments who understands and can tap
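The promotion step that release-manager persona owns can be sketched against the Power BI deployment pipelines REST API. This is a hedged illustration: the URL shape follows the documented "Deploy All" pipeline call, but verify it against current docs before using it, and note the pipeline ID and token below are placeholders (the request is assembled but deliberately not sent).

```python
import json

# Hedged sketch: promoting content between deployment pipeline stages.
# Endpoint shape follows the Power BI "Pipelines - Deploy All" REST call;
# PIPELINE_ID and the token are placeholders, and nothing is sent here.
PIPELINE_ID = "00000000-0000-0000-0000-000000000000"  # placeholder GUID
url = f"https://api.powerbi.com/v1.0/myorg/pipelines/{PIPELINE_ID}/deployAll"
body = {
    "sourceStageOrder": 0,  # 0 promotes Dev -> Test; 1 promotes Test -> Prod
    "options": {"allowCreateArtifact": True, "allowOverwriteArtifact": True},
}

def build_deploy_request(token):
    """Assemble (but don't send) the promotion request a release manager runs."""
    return {
        "method": "POST",
        "url": url,
        "headers": {
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        "data": json.dumps(body),
    }

req = build_deploy_request("<AAD-token-placeholder>")
print(req["url"])
```

Wrapping the call like this is also where a release manager would hang the gates discussed in the episode, for example refusing to fire the request until pull requests are merged and approved.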

58:39 the team and say we don't have merged pull requests we're not approving these changes this is more of a PM release manager and their job is physically to move content between different environments this is incredibly important to note here because I think Microsoft is finally catching on with the language of how this is supposed to work so I really like this article all right let's do final thoughts I think we've well gone over our time today so any final thoughts I have no final thoughts my

59:11 final thought was before Seth is out of final thoughts Tommy anything for you no I think the biggest thing is and I think right to your point Mike Fabric is introducing a lot more not just skills that we need to learn but probably job roles or that morphing of job roles call it personas cuz I think job roles is a little bit too specific cuz we're not going to hire I'm not going to hire a release manager that's fair ideal but yeah fair I'm gonna say hey you're a technical person I'm gonna add the release management skill to your

59:43 repertoire use that yeah and the last thing is whatever you're doing right now and if you're beginning this journey in Fabric there's a lot of opportunity here for being valuable really being valuable in an organization with these different personas this is going to I think Mike this release manager concept this persona is going to be critical for so many organizations as this becomes the de facto solution I would agree yes so I like that there are now

60:14 options. So one, the develop and manage content article is very comprehensive. I think, Seth, your point in the very beginning of the article is extremely relevant: please note some of this stuff is not fully functional yet and it’s still getting worked out, but we’re heading, I think, toward the right direction. So this article will stand the test of time, I think, purely as a this-is-how-things-potentially-can-work piece. There are, I would argue, rough points around deploying code from repos into dev, test, prod; it’s

60:46 not as smooth as it should be, and there’s not enough examples out there to make it easy to deploy things programmatically in all those environments. So that’s a little bit of a challenge to me right now, but I’m encouraged: the Power BI APIs are getting better, the documentation is getting better, and we have John Kerski, so John can take care of everything else. John can help us figure out all the details and the releasing and the pipelining, and as long as we have those three things I think we’re good to go and we’ll figure it out; we’ll have a good future. All right, that being said, thank you all very
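On the programmatic deploys mentioned here: the Power BI REST API does expose a deployment-pipeline “Deploy All” endpoint that promotes everything from one stage to the next. A minimal sketch that only builds the request follows; actually sending it needs an Azure AD bearer token in the `Authorization` header, and the pipeline ID shown is a placeholder:

```python
import json

API_BASE = "https://api.powerbi.com/v1.0/myorg"


def build_deploy_all_request(pipeline_id: str, source_stage_order: int, note: str = ""):
    """Build the URL and JSON body for the Power BI 'Pipelines - Deploy All' call.

    sourceStageOrder 0 deploys Dev -> Test; 1 deploys Test -> Prod.
    """
    url = f"{API_BASE}/pipelines/{pipeline_id}/deployAll"
    body = {
        "sourceStageOrder": source_stage_order,
        "options": {
            "allowCreateArtifact": True,     # create items missing in the target stage
            "allowOverwriteArtifact": True,  # overwrite items that already exist there
        },
    }
    if note:
        body["note"] = note
    return url, json.dumps(body)


# placeholder pipeline ID; substitute your deployment pipeline's GUID
url, body = build_deploy_all_request("00000000-0000-0000-0000-000000000000", 0, "weekly release")
print(url)
```

The endpoint and body fields match the documented REST API, but treat the two `options` flags as assumptions about what your pipeline should be allowed to do; a stricter setup might leave `allowCreateArtifact` off so new items can only appear through a reviewed change.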

61:16 much for listening to the podcast, we really appreciate your time. Hopefully unpacking this article was good for you. We really want to encourage you to go look at the article, spend some more time in it; it’s highly recommended you check it out and read this one thoroughly, and see how this fits your team or what you’re doing today in your team. Also note that you’re not going to do just one of these patterns; you’ll probably have a mix of these patterns that you’re going to build, and that’s fine. So with that being said, if you liked this episode, if you found some value from it, please go share this with somebody else, or go give it to someone who

61:46 needs to learn how to do this stuff, because they don’t know how to do proper releasing of things and keep breaking crap. So if you keep breaking stuff, maybe this article is the one for you. That being said, Tommy, where else can you find the podcast? You can find us on Apple, Spotify, or wherever you get your podcasts; make sure to subscribe and leave a rating, it helps us out a ton. Do you have a question, an idea, or a topic that you want us to talk about in a future episode? Head over to powerbi.tips/empodcast, leave your name and a great question. Join us live every Tuesday and Thursday at 7:30 a.m. Central, and join the conversation on all of PowerBI.tips

62:18 social media channels. Awesome, you’re getting really good at that outro, Tommy, thank you very much, appreciate it. Yeah, it’s only 200, 300 times down, you finally got it right. Thank you all so much and we’ll see you next time [Music]


Thank You

Want to catch us live? Join every Tuesday and Thursday at 7:30 AM Central on YouTube and LinkedIn.

Got a question? Head to powerbi.tips/empodcast and submit your topic ideas.

Listen on Spotify, Apple Podcasts, or wherever you get your podcasts.
