PowerBI.tips

What’s New in Microsoft Fabric CI/CD – Ep. 431

June 11, 2025 By Mike Carlo, Tommy Puglia

In Episode 431, Mike and Tommy break down what’s new with Microsoft Fabric CI/CD and what it means for teams trying to bring real DevOps discipline to analytics. They also cover fresh Copilot updates and a practical take on goal setting and skills development in the age of Fabric.

News & Announcements

Two new Copilot experiences in Power BI

  • Now available: two new Copilot experiences — Microsoft announced general availability for the standalone “Chat with your data” Copilot experience (off by default via tenant setting) and added Copilot support for securely embedded Power BI reports (portals and websites). The big takeaway: if you turn Copilot on, plan capacity and rollout carefully so experimentation doesn’t accidentally contend with production workloads.
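Since the episode stresses checking the tenant setting before rollout, here is a minimal sketch of auditing Copilot-related tenant settings programmatically. The Fabric admin REST endpoint and field names (`tenantsettings`, `settingName`, `enabled`) are assumptions based on the public Fabric admin API shape — verify them against the official docs before relying on this.

```python
# Hedged sketch: list Copilot-related tenant settings via the Fabric admin
# REST API. Endpoint path and response field names are assumptions to verify.
import json
import urllib.request

FABRIC_ADMIN_SETTINGS = "https://api.fabric.microsoft.com/v1/admin/tenantsettings"

def copilot_settings(settings: list[dict]) -> list[dict]:
    """Filter a tenant-settings payload down to Copilot-related entries."""
    return [s for s in settings if "copilot" in s.get("settingName", "").lower()]

def fetch_tenant_settings(token: str) -> list[dict]:
    """Call the (assumed) admin endpoint with an admin-scoped bearer token."""
    req = urllib.request.Request(
        FABRIC_ADMIN_SETTINGS, headers={"Authorization": f"Bearer {token}"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["tenantSettings"]

# Demo against a hand-written sample payload (no network call):
sample = [
    {"settingName": "CopilotChatWithYourData", "enabled": False},
    {"settingName": "ExportToExcel", "enabled": True},
]
print(copilot_settings(sample))  # only the Copilot entry survives the filter
```

An audit like this is a quick way to confirm the "off by default" state before any pilot group is granted access.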

Main Discussion: What’s new with Fabric CI/CD

Mike and Tommy use Microsoft’s May 2025 update as a jumping-off point to discuss how Fabric CI/CD is evolving from “nice-to-have backups” into a more complete path toward real engineering practices in analytics. A key theme: CI/CD is not DevOps—it’s one component of a broader process that includes people, policy, and workflow.

Start with goals (not features)

They open with a reality check: organizations want the “new Hot Wheels,” but don’t always have the ramp. The recommendation is to ground Fabric adoption in clear pain points (capacity constraints, brittle on-prem bottlenecks, untrusted data, slow delivery) and then select Fabric features that directly address those problems.

Skills and adoption: start small, then scale

A recurring thread is that successful Fabric rollouts require intentional upskilling:

  • For many teams, Git integration as a backup mechanism is a great first step.
  • As collaboration grows (more developers, more changes), process becomes more important—and Git becomes central.
  • They also discuss how “throwing more people at it” doesn’t automatically accelerate delivery without coordination and process (with a nod to The Mythical Man-Month).
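For teams using Git integration as that first backup step, the commit can be automated against Fabric's Git integration API. The sketch below builds the request for the `commitToGit` operation; the endpoint path and body fields follow the published Fabric REST API shape, but treat the exact names here as assumptions to check against the docs, and the workspace ID and token are placeholders.

```python
# Illustrative sketch: commit all pending workspace changes to the connected
# Git branch via Fabric's Git integration API (endpoint names assumed).
import json
import urllib.request

def build_commit_request(workspace_id: str, token: str, comment: str) -> urllib.request.Request:
    """Build (but do not send) the POST request for a full workspace commit."""
    url = f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}/git/commitToGit"
    body = json.dumps({"mode": "All", "comment": comment}).encode()
    return urllib.request.Request(
        url,
        data=body,
        headers={"Authorization": f"Bearer {token}", "Content-Type": "application/json"},
        method="POST",
    )

req = build_commit_request("ws-123", "<token>", "Nightly backup commit")
print(req.full_url)  # shows the commitToGit endpoint being targeted
```

Scheduling a call like this nightly gives the "Git as backup" pattern before any branching or deployment-pipeline discipline is in place.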

CI/CD updates: what Microsoft is shipping

  • What’s new with Fabric CI/CD – May 2025 — Microsoft outlines enhancements to Fabric CI/CD including service principal support for Azure DevOps (preview), Azure DevOps cross-tenant support (preview), and expanded support for the Variable library across more item types (including future notebook support). The episode highlights how these improvements help teams automate workflows and broaden the deployment patterns available for enterprise environments.
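The service principal support means a pipeline can authenticate without a user identity. A minimal sketch of the standard Microsoft Entra client-credentials flow is below; the token URL and `.default` scope follow the usual Entra pattern, while the tenant/client IDs are placeholders you would supply from pipeline secrets.

```python
# Hedged sketch: acquire a Fabric API token as a service principal using the
# OAuth2 client-credentials flow (e.g., from an Azure DevOps pipeline task).
import json
import urllib.parse
import urllib.request

def build_token_request(tenant_id: str, client_id: str, client_secret: str) -> urllib.request.Request:
    """Build the POST request for the Entra client-credentials token endpoint."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    form = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": "https://api.fabric.microsoft.com/.default",
    }).encode()
    return urllib.request.Request(url, data=form, method="POST")

def get_token(req: urllib.request.Request) -> str:
    """Send the request and return the bearer token for Fabric API calls."""
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["access_token"]
```

In practice the secret would come from a variable group or key vault, never from source control, and the resulting token is what the deployment-pipeline API calls carry.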

Copilot + capacity management (practical warning)

They call out a practical admin consideration: Copilot can drive real compute usage. If you’re piloting Copilot, treat it like a rollout:

  • Enable it via tenant settings
  • Scope access with security groups
  • Consider using a dedicated capacity for Copilot scenarios to protect production workloads
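The "protect production" point can be made concrete with a small guard. This is an illustrative helper, not any Fabric API: given recent capacity-utilization samples (percent CU used), it flags when Copilot experimentation should be paused. The threshold and window values are arbitrary examples.

```python
# Illustrative helper (not a Fabric API): decide whether to pause Copilot
# experimentation based on recent capacity-utilization samples.
def should_pause_copilot(samples: list[float], threshold: float = 80.0, window: int = 3) -> bool:
    """Pause if the last `window` samples all exceed `threshold` percent CU."""
    if len(samples) < window:
        return False
    return all(s > threshold for s in samples[-window:])

print(should_pause_copilot([50, 85, 90, 95]))  # → True (sustained pressure)
print(should_pause_copilot([95, 40, 50, 60]))  # → False (one spike, then calm)
```

Wiring a rule like this to capacity metrics (or just eyeballing the same signal in the capacity metrics app) keeps pilot usage from silently contending with production refreshes.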

Looking Forward

Microsoft is clearly investing heavily in CI/CD for Fabric, but the episode emphasizes that tooling isn’t a substitute for operational discipline. The win is when teams use CI/CD as part of a broader DevOps approach—clear ownership, repeatable release patterns, and a workflow that supports collaboration without breaking production.

Episode Transcript

Full transcript, lightly edited for readability (timestamps reference the audio):

0:00 Heat. Heat. [Music] Well, welcome back to the Explicit

0:34 Measures podcast with Tommy and Mike. Good morning, everyone. Welcome back to the show. Jumping back in. Good morning, Mike. We've got some good topics today for you. We've got a lot of news, a lot of things that we're learning this week, a beat on the street. We want to talk about some things we're learning and just playing with inside Fabric. I guess this is the right time to do this. You get Microsoft Build, there's another handful of features that come out, and you've got to start playing with these new features and figuring out what's going on there. So, this is probably par for the course. That being said, our main topic today

1:09 will be just starting off the conversation around Microsoft Fabric and continuous integration and continuous deployment. There have been some changes on the Microsoft side in how Microsoft envisions this working. So, we're going to talk about that: what's new with Fabric CI/CD? What can we do here with automation, Git integration, automatic deployment pipelines, and Fabric CI/CD? This is the article that Microsoft has here for May 2025. So, good timing. That being said, Tommy, what do you got? Any beat from the streets or news items you want to kind

1:40 of cover off? So I think, especially with how rapid development has happened with Fabric and Build and all the things, I'm still dealing with a lot of projects where we're still goal setting: goal setting for the organization, for their projects, what they want to do short term and long term. And I wanted to bring this to you, Mike, to see if you're seeing the same situation, where we're dealing with organizations who want to go fast. They hear about the news coming out, like, oh, we want to do Fabric,

2:13 oh, we want to do Copilot. But obviously, we've talked about this a ton: there's the skill, there's the technology that needs to be in place, there's the data that needs to be in place. So "goal setting in the age of Fabric" is how I'm theming this for you, and I just want to hear your situation, or how you're dealing with that now, because I'm getting a lot of organizations wanting to push for the latest and greatest. They're looking at the new Hot Wheels and they're going, "Oh, the new Hot Wheels came out. Can we get that?" Like, "We don't have a ramp

2:45 yet." So, are you hearing this? What are you hearing from people? Are they excited about the new features? I think people are interested in the new features. A lot of what we're doing right now is migrating a lot of customers from PowerBI and starting to explore more Fabric-related things. I think we're still in this "what is Fabric, how confusing are notebooks" phase, and we're starting to get to this point where I'm seeing a lot of customers transition from, okay, we were just doing PowerBI, we were using just dataflows, and we're now starting to

3:19 use Fabric. And in using Fabric, they're actually finding much value from centralizing their data. Everything's coming to one single place, and I think that's actually hugely valuable for the team to see: there's now a central place where data lives. What's happening now, though, to your point, Tommy, is that the questions coming in are more around: what's translytical, how would I build notebooks, what other tools should I be interested in around this new Fabric ecosystem? Is it something in

3:51 Visual Studio Code? What are user data functions, and how do we work with them? So we're starting off simple with notebooks, pipelines, and lakehouses, and then from there it expands into these other use cases, and then we get more questions around, well, we have other needs of the business, can Fabric do that? Usually the answer is yes. There's really not too many questions the business can ask that Fabric can't do at this point. But at the same time, and this is the thing that I find really

4:22 humorous now: I cannot tell you how many projects I had pre-Fabric where we had pilots, and it was one foot at a time in the water just to get PowerBI for the organization. There was a lot of security, a lot of trepidation, before just diving straight into Fabric or PowerBI. And now, to your point, we're introducing a ton more. That's not the right way to say that, but what I'm trying to say is we're introducing so many more

4:55 features, functionalities, and technology into the space, and a lot of organizations still want to jump in headfirst. So, the way I've been taking this with some of the projects is: it's okay to touch Fabric a little, but it's also okay not to do the whole shebang of all the Fabric features, especially if we're still just touching PowerBI, if we're not comfortable with that. And that's led me almost to a conflict, or a decision point, myself, where I'd rather look at this like a Mario video game where level one's

5:29 PowerBI, and then you have to win PowerBI to get to Fabric. You really have to do that. That being said, I think there's something in me where I'm arguing with myself on that. But you want to know what you're doing with PowerBI, because that is the end goal of what we're doing with the Microsoft data products. I think I see where you're going with this question and the statement here. I agree with you on some levels. I think it really

6:02 depends on the skills of your team. Where is your team comfortable? Right. And so I push a lot of teams. A lot of data and BI teams have a lot of SQL skills. SQL skills seem to be very common across these teams: you're either a DBA, or you've gotten a little access to a database where you could write views on things. And so there's still a lot of what I'm going to call old mentality that's still hanging around, that I observe. And what I mean by old mentality is

6:35 everything has to be a view. We can do these complex view things and it just works. And so when we introduce things like Direct Lake and lakehouses and notebooks, this puts people in their heads a little bit. I'm like, well, time out. You were writing these really complex stored procedures inside SQL. If you can write a stored procedure, you can write a Python notebook. Yes, they're different. Yes, it's different languages. But it's the same level of complexity, I think. And you can define

7:09 variables and parameters and build temporary tables to use. Well, it's the same concepts. It's not that far off. Maybe the syntax is slightly different, but I think there's hesitation there. So, I guess I'll say this: it really depends on the team that you're coming into, how willing they are, and how fast individuals can learn in there. I do think, to your point, Tommy, not every individual inside the business should immediately say, "Yeah, we're going to light up Fabric for everyone and we're just going to go." I do think there needs to be at least some thought from a small group of people, that center

7:43 of excellence, that says: okay, what are we going to do? Are we actually going to centralize things? What are some of our basic policies that we're going to have? And slow-roll it: go out to a couple people, open it up. If a single team is asking for it, work with them, let them build it. But to your point, Tommy, sometimes we're not even capturing the fundamentals, like how to do a decent data model, and yet we're trying to now add all this Fabric stuff on top of it, which shifts what we can do. And we can move

8:15 that work up into Fabric, the data engineering side of things. But if we don't really understand what we're trying to build visually, if we're not really understanding a star schema, if we're not trying to build good data models that are performant, sometimes we hurt ourselves there. We're not actually building a good data model; instead, we're just adding more messy tools to the situation, and we don't have rules around where data should go and where we should put it. And honestly, that's what's keeping me grounded, because I think it's so easy to get caught up with all the technology, wanting to do that. It's easy,

8:47 it's easier for you and I as individuals to say that, because we can test it out. But for an organization, what's keeping me grounded is a phrase that I've been using: the goal is not the technology, the goal is the solutions the technology provides. And rather than being wrapped up in, oh, there's notebooks, oh, there's a lakehouse, we can centralize our data: well, is that the solution that we need right now? What are we really trying to do? Then that goal, that objective, will dictate what technology we're going to use, or how far we go into that. I have a pocket full of projects still that I

9:24 would say the best case for them right now is a focus on their PowerBI development, because it doesn't matter to me if you have a lakehouse or if you're using notebooks if people can't get their data and they can't trust their data. So again, some organizations, they're ready to go. They have the people and they have the need for what Fabric's doing. But again, the solution is what should dictate the technology, not the other way around. I'd agree with that one. So anyway, just interesting thoughts on that

9:55 one. Yeah. So I still think you're right. Goal setting is probably good here. I think you're going to want to have someone study this. You're going to want a lot of proofs of concept. You're going to want to figure out: where does a lakehouse fit? Who should be learning this? We have these things called pipelines; who needs to be working on them? There's definitely a skill-set range where you're going to want to find the right people. And I think right now the challenge for organizations becomes: how do we diagnose or figure out who is the right person to put in that space? One, let them explore and learn what works. You have to just

10:28 understand the technology, but then two, how do you use that technology in concert with whatever data process you're trying to build? It's got to go together, right? You have to have someone with business knowledge of why we're doing this. And even now more than ever. Like, I feel like we're getting to the place where I used to be very, very pro bring-everything-to-the-lakehouse. Everything's got to go to the lakehouse. But now that we can make semantic models with Direct Lake and import models together, I'm a little less motivated to put everything in the

11:01 lakehouse now. And so, yeah, when I think about your security layer: the reason I brought everything to the lakehouse is because otherwise I couldn't put it in a semantic model. I had to bring everything; it didn't matter what it was. If it was a date table, everything needed to come down to the lakehouse. But now that I have highly performant Direct Lake and import mode together in the same semantic model, there's no reason why I can't do both of these things. And so, I never thought I'd hear you say that. Well, I think the technology is changing, right? It was a hard requirement: if I

11:35 wanted it in the semantic model, it had to go to the lakehouse. We're seeing shifts, and this is the nice part. Where we are, I think, is Microsoft really has tons of tools at their disposal. What's the right way to get that into the semantic model? And then you can have other conversations around, okay, what do my users consuming PowerBI reports really need to do? And then you can design a system that adjusts for that, right? You could change it in a way that aligns to your business goals. I got a

12:08 scenario for you, because you brought up a really good point here about the people, and I think this is a big focus for a lot of organizations. One of the critical decision points going ahead is going to be the skill level, and a lot of this, you're going to be hard-pressed, I think, to find someone who does all of it or is aware of all of it. I know there was just the Kentucky Derby, or the Belmont, this week; well, people are still in their lanes on what they do, whether they're in engineering. And

12:40 a lot of people have probably just touched a few of those things. If you hear this from a client, or you're on a project, Mike, here's the scenario: we are pushing ahead, we are going where no person has gone before with Fabric, but we don't have any people who have done it. Are you going to lean closer to recommending one of two options? Either we are going to invest heavily in the people, we are going to skill up everyone, or we are going to push back from that goal. So those are the two scenarios. A company wants to move

13:12 ahead with Fabric. They want to go as far as they can. They don't have the people. So what are you going to recommend? Again, scenario-based here, but are you going to push, okay, we're going to devote X amount of resources and budget and money towards getting the people skilled up and hiring the right people? Or is that to you a red flag, to say, listen, we've got to make some headway here first; I know you want this, but we have no one to do this yet? I'm going to say it's going to really highly depend on what you're trying to solve for, right? Just to say we want to use Fabric and

13:44 just to start with it is, to me, not a compelling enough story to say let's spend time and resources to go do it. What I prefer is to have a business use case, something that's causing pain, some pain point somewhere in the organization. Let me give you an example. We have a SQL Server that's on-prem. The SQL Server on-prem is continually getting data through a gateway and bringing it to semantic models, and there are multiple semantic models hitting that same server, maybe different databases, but that server is

14:16 starting to fall over at certain points in time. We get random failures of queries not executing because of congestion, something slow; we don't know, there's stuff happening. So either we spend a lot of time really digging into and tuning all the semantic models that are hitting that same SQL Server and looking at all the logs, or what if we lift some of that load away from the SQL Server? What if we loaded it once to the lakehouse and repointed the larger semantic models at

14:49 the lakehouse, and make the loading happen one time, or incrementally, or whatever the logic is to get the data from that on-prem SQL Server to the lakehouse? So what that does is that is an opportunity where you would place Fabric such that it solves a real problem. So I would look for those use cases. And the challenge of this is: how do you even get to these points where you say these parts of Fabric are solving these other pain points you have

15:23 inside your PowerBI-only world, right? And so I think part of that's just googling, part of that's going to conferences, part of that's just listening to other people, talking to the community. I think Microsoft is putting out articles that are helping you with this. And I think the idea here is you want to move more of your data system away from on-prem, bring it closer to Fabric, because then you have the ability to scale up and pay for what you need inside the Fabric space. So that's where I'm going to go after it. I'm not going to just tell the business, you need

15:55 Fabric, and then just let them go wild and figure it out. I'm going to sit down and ask them: what's the use case, what are you trying to do? Are you trying to give more access to more people? There are some reasons behind this. What is the reason? Let's unpack that reason, and then once we get there, then we can mobilize people and figure out, okay, who's the right person? What do we need to go learn? Do we need to go hire a consultant? Is this something where someone goes to learn.microsoft.com and gets certified in something and then we use it? So there's knowledge out there. It's just a matter of: does the business

16:26 want to, oh gosh, allow people to have time to learn it? Well, it's a funny thing where, if an organization just says we're going after Fabric, well, I guarantee you no one's going to be happy if you have Fabric, quote unquote, because if it didn't solve anything, then you're selling Fabric and I'm still sitting here, though. But to me it's the same thing. And I think I said something that apparently was groundbreaking, but if you have an organization where they just want Fabric, or they just want PowerBI, but they don't know what it's actually

16:58 going to solve for the organization, no one's going to be happy, even if you also now have the licensing and people can use it. It has to solve something. To your point, it has to be a pain point that people are solving. Yes. What is the reason behind wanting to move or change or adjust something, if behind it there's nothing that's going to make anything better? And this is actually a bad practice: I would argue a business shouldn't just go out and run and grab new technology if it's not solving some real-world problem, right?

17:31 Preach. Look, if I'm using Dataverse or I'm using Dynamics and I need to get data somewhere else for reporting very quickly and easily, great. I'm hearing things about Fabric and PowerBI that are going to make it easier for me to get things from Dynamics to my team to report on. Great. That's a pain point we're trying to solve, right? I've got this SQL Server that's falling over. I want to move it over. Yes, that's a pain point we're trying to solve. We're doing too much data engineering work in Excel files. We want to centralize it. Let's bring it somewhere else. That's a pain point. Let's try and solve that. So, you already have these pain points. You've already identified them.

18:02 Anyone in the business, you could ask these questions, and they'll be able to identify the pain points. Now the trick is: can you take those pain points and redesign them and bring them into Fabric? That's the next question. And so I think that's where I would go with a team trying to get started with Fabric. And a lot of times I feel like the best learning happens when you just have the project and you work side by side with people to build stuff. I have one customer where, when they need time, we schedule a two-hour

18:34 working session. They come with some questions. Hey, I want to do this and this and this. Hey, we've got these dataflows, we think we should migrate them, what do you think? And we just sit down and we hash it out. We work as fast as we can for the two hours. Okay, let's address one or two of these problems on the call: building things, clicking on stuff, making notebooks, using Copilot, teaching the team how to write Python in real time. We do the work. We work together on things. And I think that right there is what should be

19:05 happening. You need to bring in experts who've done this before into your team. Have them help you get through these first initial steps. And then once you have that, now you have the knowledge in your team a little bit more, and at least now you know some of these problems. I find this the same way when I'm googling things. I guess googling is so old school now; I should be just doing everything in ChatGPT. But when you google things, Tommy, you have to know enough to know the words to write to go get the answer you're looking for, right? Like, I just

19:39 can't say "get data" and get the right results. Yeah, "I'm having data problems": there's not enough detail there. You've got to know what you're googling to get the right answer back. And it's very important. Even if you just hit a couple of the keywords, you've got to hit a couple of the keywords to get the answer back. And I think this is the same pattern here, where organizations, if they don't know what to build in Fabric, they're not even hitting the keywords. I couldn't even google the right things to go figure out, like, why, like, Fabric

20:12 pipeline variable sets, like, what? Monty Python and the Fabric notebook. Yeah. Why am I even googling these things? So yeah, I think that's where we're at. We don't even know what to say yet. And you need some level of term knowledge to then start getting your team learning in the right direction. Yeah. The last point I'm going to say here, because I don't want to hammer it, is you just brought up something that I'm finding almost exclusively with Fabric. It's almost impossible for me with a lot

20:44 of these projects for the consultant to just do the work and hand off, without that really intimate relationship of upskilling their team too. I don't know if we live in that world anymore, to be honest. But yeah, really good things there. Yeah. What other topics do you have here, Tommy, for us? We have a few more, Mike. Well, we have some Copilot updates. I know we've been hammering this, but what? Copilot is where it's at, baby. So, what actually just came out, I believe yesterday? Yes, yesterday: now available after Build,

21:20 chat with your data is available now. So this is an independent icon, a left-hand side button, in your PowerBI or Fabric experience, where you can chat with your data. Again, this goes with data that's already been prepped. We actually had a conversation about this already, but this is now available for organizations, and it's off by default. So again, this is something that you have to enable in your admin settings, which I'm really, really happy the Microsoft team decided, and not one of these on-by-default

21:52 features. Oh man. Yeah. Well, I'm waiting. They love to do on by default. Yes, or off by default for like three months and then on by default after that. So I don't know, Tommy, you're exactly right. I really like the idea that they made it off by default, because again, this could be a very large impact to your tenant as well. Oh my gosh. But yeah, I feel like we're going to get to a place where it's off by default for now, and at some point in the future, six

22:24 months, a year from now, they're going to be like, "Oh, by the way, we're changing it now; we're going to just turn on the experience by default." So yeah, we'll see how that goes. And again, that to me goes back to: it's a feature that requires so much internal development just to be able to turn on. There were other things, like the filter pane: yeah, we can turn that on, that's fine with me. This, we can't just turn on and expect it to work. The other part of that is also the enable Copilot for portals and websites, which, I think, is only for the true

22:57 embedded scenario. I don't think this applies to Publish to web, but if I embed a report in portals and websites, I can now also have that enabled with Copilot. So I'm actually surprised how quickly after Build this is already enabled here. So, if that's one of your goals, there's a lot of work to do here. One thing that I think would be interesting here, and again, I'm thinking about this Copilot experience: once you go into the admin settings and turn it on, just a word of, maybe not caution, but

23:30 guidance or something here. When you turn on Copilot, you need some capacity to run it on. It can't just be standalone, right? So, if you're going to use Copilot, you need to be able to say, okay, there's an F SKU here, not the non-F64 trial version. Something has to be there for you to turn it on and then attach it to. I'm almost of the opinion right now... Again, there is an AI metric in the usage metrics. There is an AI

24:03 related activity that shows you when AI is happening. But I wish, if you turned this feature on for a group of users, I wish you as the admin could set it up so that it would automatically go to a single workspace, because there is a feature where you can actually make Copilot use a dedicated capacity: you can do a dedicated Copilot experience. Yeah. Because what you don't want is you don't

24:35 want your production, or other things you're working on, hit by people asking a bunch of questions and all of a sudden running out of compute units very quickly. So let me say it this way: if you're an organization that's large enough where you have a dev version of Fabric and a test and prod version of Fabric, or Fabric SKUs, I guess let's call it that, I feel like I would recommend putting this Copilot experience on the dev capacity. That way, if it runs out of tokens, if you run out of CUs, at least it's not breaking anything in production,

25:08 because you could submit a lot of queries and you could stop semantic models from running, because it's adding contention to that usage on compute. You're right. That's a good point, Mike. You can't have a workspace that has both licensing there, where I have something dedicated for Copilot. Well, yes and no. So, if you wanted to centralize Copilot down to a single capacity, right? What you have to do is use a security group. These

25:42 are the users that will get it. So here's the dedicated Copilot. So you actually have a designation, like, this is a capacity. Yeah. So basically, this capacity is the only one I want to use for Copilot. Just that capacity. So you have to turn that on. Mhm. You have to have users attached to that. So that way, when those users use Copilot, they're only using the dedicated Copilot capacity. And then when you turn on the Copilot for everyone, right? When you turn on the new Copilot experience, that's the

26:13 standalone experience, you open it up to the same user group. That way they can only use that dedicated Copilot capacity. So I think that's the combination you want to do. But again, I'm just pointing this out because it's not centralized. It's not in one single place. You have to do settings like, okay, I've got to go over to the capacity and adjust some settings; I've got to go over to the admin portal and adjust some settings; I need a security group that is in both places. So, I'm thinking about this as a best practice and guidance, right? If you're going to use it, make sure you're

26:46 making a security group and rolling it out slowly to that team. To your point, Tommy, this fits very well with your goals in the age of Fabric. How do you roll this stuff out? That's what you do. You start out small. You make a security group, you put it where you need, and then you go from there. Especially the fact, too, that this is standalone: unlike the report Copilot, this doesn't live in a workspace, even though obviously it's going to be using workspaces and their capacities. If the user says, "show me the sales," that has

27:19 to come from that certain workspace. But I would agree with you on that structure, because honestly, I couldn't imagine doing it a different way without spending or wasting more than you would want to. So that is available, and the embedding is now available. Mike, I've got a problem with our next news feature, and I want to see if you agree with me. This is actually about VS Code: how to debug user data functions locally in VS Code. Oh man. And I'm struggling with this one too. So, this is why I have some

27:53 issues with this: I think it’s failing to see the audience here. Cool feature — love it, I’m going to use it. However, some of the Q&A here, the FAQ at the bottom: “What are the reasons for the squiggly underlines appearing after import statements?” If you’ve ever written any Python, you know what the red squiggly lines are for. They’re obviously errors: a package that’s not available, whatever it may

28:27 be, something misspelled, or an action that’s not available. And they’re talking about the requirements file, they’re talking about local settings — they’re talking about very basic things if you’ve ever done any Python. Again, I’m not a Python developer, but I can tell you what all of those mean just from having some experience. So my question for you — and here’s where I’ve got a bone to pick, so to speak — is: who’s creating user data functions but isn’t aware of any of this? Because this is the struggle I’m having. They’re introducing some awesome

29:00 things — I love the idea of user data functions, so I want to preface it that way: Microsoft is doing great work here. However, to me, Joe Schmo is not just going, “I’m going to create a user data function today. The only thing I’ve ever done is a few calculations in Excel and a DAX statement, and now I’m going to dive head first into the deep end of Python and just figure it out.” I don’t think that’s happening — and if it is, tell me I’m wrong, please. But I don’t think that’s happening. So,

29:32 yeah. No — so, a couple of things. User data functions is Azure Functions brought to Fabric: functions as a service, basically. It’s a software-as-a-service offering of functions that now exists in here, with Python code. But yeah, there are two stories here. The story is: look, the fact that these things are here does give you some helping mechanisms to get started, but you are writing pure Python. You are writing a full function that needs to be deployed into

30:06 Fabric and do a specific thing. So this is not for the faint of heart. This is, I would say, very much senior data engineering territory. This is not a PowerBI user in my mind. This is someone who understands code, who’s done functions, who’s maybe built applications before. Because it is Python, the gap isn’t too far for some people who would be business users — but you’re not giving this to any old business user. You’re finding the ones that have written a lot of code, who are

30:38 writing lots of notebooks, who have very specific use cases for this thing. And there is a really good presentation out there — I did one on YouTube with Shunada where we went through four or five really solid use cases. Like: hey, you have a report and you want to update records. Great. You have a report, you want to insert a new row into a table and have the data write back — that’s one of the use cases. I have a table and I want to take actions right away: I want to send a message, or I want to run a piece of data through some AI and have it come back as an answer. All

31:11 those things are now possible. So the user data functions open up an amazing amount of extra space where you can build custom things inside Fabric. Yeah, go ahead, Tommy. No, I was going to say — and I would be remiss not to play devil’s advocate here — could the argument be that you don’t need to be that expert or experienced in Python if I have Copilot? So I think the answer is there: yes, it will assist you. Yeah, it will assist you, but in my opinion it’s still not enough. Again, even if you are using it,

31:45 do you even know how to write the imports? Do you even know how to use the local.settings.json? So I think the reason they’re putting these things out there is that if you’re an Azure Functions developer, these are all things you’ve already read in the Azure Functions documentation about running them locally. My opinion here, also: I don’t really love the whole VS Code desktop experience for editing functions at this point, just because there’s this whole flow of, okay, I’m in Fabric, I’ve made a function, and I’m getting it down to my local VS

32:19 Code, which is nice because I have a whole other set of code paths I can use down there. But the downside is it’s not very easy to get the code published back into the service. The way the UI is written — maybe you can do it, but it’s difficult to find the button. I should be able to pull down a file, make the edits, hit publish, and it should go right back up. It should just come down and go back up, no problem. And when I bring the code down into my VS Code, it should automatically recognize: hey, I need this local settings file. Hey, I need

32:52 these packages and import libraries to be installed and actually added, right? They’re talking here about why there are squiggly lines, but when I run it, it works — right? Well, the answer is that the first time you run it, all the libraries get installed using pip. Well, why not just do that when I bring the code down? Right. Right. So to me, things just seem out of sequence. Like, if I’m bringing the code down and the pip

33:23 libraries are not there, sure, give me the squiggly line — but then also tell me: hey, this library isn’t installed, right-click here to install it, it runs a command line down below, and you’re done. That’s what you want, right? It shouldn’t be this mysterious. There are knowledge gaps, I feel, in this functions piece that I just don’t really love. It’s similar with notebooks — well, notebooks are almost the opposite, where you can get notebooks running locally, so to speak, but you are very limited. You don’t have all the packages, or some of the major ones — I don’t have mssparkutils, which is a big one. And so it

33:57 makes it tough. Yeah, I agree with you. I love the idea, but if I’m going to use one exclusively — honestly, if you asked me personally which one I’d use exclusively, VS Code or the web — I would want to use VS Code on my desktop if I could. Personally, yes. But right now, where’s the better experience? To me, right now, at this moment, the better experience for developing these things is to just build in the web. And that’s fine — I’m okay, I’m actually very okay with doing

34:31 the function development inside the web. All the libraries are installed, all the things that need to be there are there, I can send in sample data and it can do things to it. Great — that’s what it should be doing, right? So I’m totally fine with the development of the functions staying inside of Fabric in the browser. It’s too painful right now to work in VS Code on the desktop. Well, let me do you one better — and this will be a little segue into our topic today. Right now we have the user data functions

35:05 extension, and we have the VS Code extension for notebooks, but those don’t work with the same Git repository that you may have downloaded. And to me, this is way out of whack — it takes me out of the experience. Let’s say, Mike, I want to edit a notebook that I’m using in Git. So I’ve downloaded that repository, and I want to use the whole VS Code experience with Git — VS Code with a notebook from my Git repository. Well, I can’t use that same file, that actual same

35:38 .ipynb file, to do so. I actually have to create another folder with the VS Code extension for notebooks to then edit it so it connects to my Fabric capacity. You know what I’m trying to say? Am I saying this clearly? I think so. Yeah. Yeah. And to me, that’s such a weird, out-of-whack experience. If I have Git, it’s not just for use on the web — it’s like, oh look, here it is on my computer, now I can open the notebook and use all the functions, yada yada yada. But that’s still out of whack, and it’s just — oh, I

36:12 just want this to be seamless. And I know that it’s a lot of work, but to us, I think, as the users and the pro users: when you see a feature, or the name of a feature, you expect it to work as it should. And I think that speaks to our AI things too. If I have my Git repository with my notebooks and my user data functions as files, well, I should theoretically be able to edit those files and do all the things I was already doing. We’re getting there — and I think we’re seeing that too with what’s new with CI/CD. Mike, do we

36:46 want to talk about the analysis server, or do we want to just head into our topic today? We’re very far in — a lot of news, a lot of talking about things. Again, this is a reaction to things that came out at Build. So yeah, let’s jump over. Let’s move to our main topic; we’ll frame up CI/CD here. We’ve been talking about news items coming out — functions, all these other pieces. Let’s actually move on to our main topic: what’s new in Microsoft Fabric continuous integration and continuous deployment. This is an article I’ll put in the chat window, directly from

37:18 Microsoft, and we’ll unpack this article as we go. Yeah. So there are a few features that came out, but there’s also that greater conversation I want to have with you, for more and more users: if you’ve never used Git, or anything like it, it can still be very intimidating. That being said, I think we’re finding that more pro users are adopting it — or should be adopting it. And the amount of effort Microsoft is putting into CI/CD, and Git specifically, is pretty extensive,

37:51 to the point where they’re making this, to me, the de facto way to really work in a workspace. With that being said, we’ll dive into whether that’s the case or not. But there are some major features here, Mike, that I wanted to cover with you, and like I said, I think we’ll get to that greater conversation of: as a pro now, should this be a prerequisite skill? But let’s dive into some of the features. Mike, automation — maybe one of our top five favorite words in the world. How can we automate things?
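A concrete flavor of that automation: Fabric exposes REST endpoints for Git integration, and one common task is polling a workspace’s Git sync status before acting on it. The sketch below is hedged — the endpoint path, base URL, and header shape follow my reading of the public Fabric REST docs, and the GUID and token are placeholders, so verify against the current reference before relying on it.

```python
# Hypothetical sketch: building (not sending) the request that checks a
# Fabric workspace's Git sync status. Endpoint path and auth header are
# assumptions from the public Fabric REST docs; all values are placeholders.
import urllib.request

FABRIC_BASE = "https://api.fabric.microsoft.com/v1"
workspace_id = "00000000-0000-0000-0000-000000000000"  # placeholder GUID

def git_status_request(workspace_id: str, token: str) -> urllib.request.Request:
    """Build the GET request for the workspace's Git status endpoint."""
    return urllib.request.Request(
        url=f"{FABRIC_BASE}/workspaces/{workspace_id}/git/status",
        headers={"Authorization": f"Bearer {token}"},
        method="GET",
    )

req = git_status_request(workspace_id, token="<entra-id-token>")
# urllib.request.urlopen(req)  # not executed here: needs a real Entra ID token
```

Wrapping a call like this in a scheduled script is the “anything I do three times, automate it” move the guys describe — check the sync state, and alert or auto-sync when a workspace drifts from its repo.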

38:25 You and I were talking about this on the podcast: gosh, everything that’s manual — if I do anything more than three times a week, I need to find a way to automate it. And one of the things they have here is an API: automate Git integration using APIs. Mike, have you seen this? Have you used it at all yet? I have not. We don’t use a lot of the APIs — we’re doing more of this through, let’s call it, the built-in deployment pipeline experience. So there’s an API that kind

38:58 of sits behind the scenes doing things like this: get a connection, update your Git provider. So this can be automated along with a number of other things when you’re talking about moving things in and out. Do we use this? No. We’re working with organizations that are just trying to get their heads around, okay, do we even need dev, test, prod? Let’s get that started, right? So I feel like the very first entry level here is getting started with building deployment pipelines. We’ve done a lot of: okay, we have a workspace that is prod; well, we recognize we need to have dev and test.
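The “get a connection, update your Git provider” calls Mike mentions map to endpoints like the workspace `git/connect` operation. Here’s a hedged sketch of wiring a workspace to an Azure DevOps repo — the payload field names are my reading of the Fabric REST documentation rather than a verified contract, and every value is a placeholder.

```python
# Hypothetical sketch: connecting a workspace to an Azure DevOps repo through
# the Fabric Git-integration API. Field names are assumptions from the public
# docs; the GUID, org, project, repo, and token below are all placeholders.
import json
import urllib.request

workspace_id = "00000000-0000-0000-0000-000000000000"  # placeholder GUID
payload = {
    "gitProviderDetails": {
        "gitProviderType": "AzureDevOps",
        "organizationName": "my-org",
        "projectName": "analytics",
        "repositoryName": "fabric-items",
        "branchName": "main",
        "directoryName": "/",
    }
}
req = urllib.request.Request(
    url=f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}/git/connect",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Authorization": "Bearer <entra-id-token>",
             "Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req)  # not executed here: needs a real token/workspace
```

Scripting this per environment is what makes the branch-per-workspace patterns discussed next repeatable instead of click-through setup.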

39:30 So how do we reverse deploy the code back through the environments so we get everything aligned? Right — there are lots of different branching strategies. There’s trunk-based deployment, there’s the GitHub branching strategy, there’s the GitLab deployment strategy — there are different strategies here. And I think what the APIs do is give you more options for how your business needs to function and how you want to deploy things. For example, one pattern could be: have your main

40:03 branch pointed to dev. Dev is your integration test environment for all the developers. A developer takes the code, creates their own branch, makes their changes, and then does a pull request into main, which updates the dev workspace — and then you use deployment pipelines to go from dev to test to prod. This is the pattern Microsoft uses. It works really well for PowerBI assets, but not so much for lakehouses and notebooks — there are other considerations you have to handle there. So what we’re seeing now is that the CI/CD is getting better to cover those things. A second pattern would be

40:37 that you have a Git repo with three non-destroyable branches: a branch for dev, a branch for test, a branch for prod. Each of these branches has a very specific function, and you can push code into them but you can never delete them. That’s the point — they’re non-deletable branches in your development cycle. So then you can build things in dev, compare what happened in dev against test, and start a cadence of deploying things that way. So again, I think the APIs are nice. They’re

41:11 going to enable us to do a lot more with different patterns for how you want to deploy things. Yeah — slight hot take here, and let’s see where you land on the scale of agree or disagree. If I’m dealing with a team — and to your point about workshopping with other teams, one recurring conversation I’ve had is talking with a BI team about whether Git even makes sense for them. And I’m leaning more and more toward: if I am going to deploy or recommend Git for a team that has not

41:43 used Git, or is basic with Git, I am setting them up with as much automation and as many actions as possible to make this seamless, rather than having them learn all those different little commands in the terminal. I need to make this as easy as possible for teams, especially if I want to get them started, because there is education here. There is a learning curve, and there are going to be mistakes — there’s no question about that. If you’re dealing with a team that is

42:16 just beginning their CI/CD journey, or isn’t aware of all the ins and outs, but I want them to use it — I like Git. I love Git, actually. I think it makes so much more sense with the way I can version control; I’d been trying to do this with a semantic model and a report for years before it was even available. But I can’t expect them to just hear “you have a dev and you have a prod, publish to this branch” and run with it. Mistakes are going to happen, especially if there’s collaboration. So

42:49 with the APIs and the GitHub Actions, I have to set this up if they’re going to be successful — that’s the point I want to make. Yes, I agree with you, Tommy; there’s definitely something there, but I think there are different levels of experience at play in this space, right? If I’m a team that’s just looking for backups, I’m going to use Git very lightly. If we’re just starting out — I’m just coming from PowerBI — then, look: So,

43:21 let’s identify the challenge, and then what we can use to solve it — I always go back to that, right? The challenge is: I deployed something into a workspace and I broke it, and I don’t have that old copy of the PowerBI Desktop file. Something like that. Or maybe I’m in PowerBI and, hey look, I’ve made a report, I have a semantic model, and I’ve made an app. So in the app you get a copy of the report pointed to the same

43:55 semantic model. So the app runs along, no problem — just clips along. But I may have changed the semantic model to update my report in the workspace, and somehow that has broken my app report, because that’s a different report, and maybe I changed something in the semantic model that physically broke what was inside the app. I don’t have full separation between when I’m testing things and when I’m actually deploying things to the app. So what happens is you run into this problem of: oh no, I need to revert, I need to undo

44:28 something, I need to synchronize the changes. So the very small use case here is: look, just keep building things in the workspace, deploy what you want to deploy, do what you’ve got to do — you just need a backup. I’m just looking for backups of things. That’s pretty lightweight, and I would argue every PowerBI user could get their head around what that means. Look: I make a change, the little icon is no longer green, it says it’s not synced, I synchronize it. That’s it. That’s all you’re doing. So I think that’s the very low threshold here: we

45:02 need the ability to back up our things and no longer work as if we’re having to rebuild stuff. Yeah. So, I want to move away from tracking everything in SharePoint — I want to track it in Git, as a backup of the workspace. I have found really good success keeping those things up to date. I think that works fairly well; syncing works really well in that use case. I’m good with it. And it’s a pretty good kiddie pool, with some swim floaties, especially if you’re dipping into the

45:33 pool too, where for a lot of users, you can still just worry about publishing. Your process doesn’t change a lot, but if you do need that version control, it’s all there. It’s a good introduction, and I really like it as an introductory piece. So in that scenario, I’m not going to Azure DevOps or GitHub and downloading the files; I’m not doing other things. What I’m doing is relying on the workspace. When I do a synchronization to a workspace, I feel much more confident now letting the workspace be the source

46:07 of truth for those files, right? What I mean by that is, before, I would say: never, ever deploy something to powerbi.com and then download the report, edit it, and push it back up. I’d always want you to have some other place to store that file. But over the last year, there haven’t been many issues — I’m having fewer and fewer problems downloading something from powerbi.com. Yeah. So the fact that I can download almost everything now without issue, and I’m having fewer and fewer problems

46:38 with that, I’m feeling more confident. Just, okay, fine: I deploy a report — I’ve got some internal reports I do for my business — and I don’t even keep a SharePoint copy anymore. I just let powerbi.com be the source of truth for that file, and when I need it again, if I want to make an edit, I go into the experience on powerbi.com and make my simple edits. If I need more complex edits, I bundle it up and download it. I can download just the semantic model, and I can also download just the report now — it’s giving me options. I just realized I’m doing the same thing: I

47:10 have barely used SharePoint, where it used to be the first thing I opened when I started my computer. And also, let’s not forget: you don’t have to have VS Code or even use the terminal to use GitHub — there’s GitHub Desktop, which is a pretty user-friendly application. That being said, though, that next step when it comes to Git — and tell me if I’m wrong here, if this is a good take — it seems like Microsoft’s focus with the Git integration is on the

47:43 more advanced users, the more advanced teams. We’re not dealing with simple Git anymore, or simple Git push features for your basic user, because we’re talking about service principal support; we’re talking about variable support. Correct. Yes, you are right. Yeah, I would agree with you there. These are the features that pro developers want — just synchronizing things back and forth, yeah, that’s fine. And the other thing that was a huge pain point for me: I couldn’t get a Dataflow Gen2 into a CI/CD pipeline.
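On the service principal support: the usual mechanic is the standard Entra ID client-credentials flow — the service principal requests a token for the Fabric scope, then calls the CI/CD APIs unattended with it. A hedged sketch, using the standard Microsoft identity platform token endpoint; the tenant, client ID, and secret are placeholders, and the Fabric scope string is my assumption from the public docs.

```python
# Hypothetical sketch: building the client-credentials token request a
# service principal would use before calling Fabric CI/CD APIs unattended.
# Standard Entra ID OAuth2 shape; all IDs and secrets below are placeholders.
import urllib.parse
import urllib.request

tenant_id = "00000000-0000-0000-0000-000000000000"   # placeholder tenant GUID
form = urllib.parse.urlencode({
    "grant_type": "client_credentials",
    "client_id": "<app-registration-client-id>",
    "client_secret": "<client-secret>",
    "scope": "https://api.fabric.microsoft.com/.default",
})
token_req = urllib.request.Request(
    url=f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token",
    data=form.encode("utf-8"),
    headers={"Content-Type": "application/x-www-form-urlencoded"},
    method="POST",
)
# urllib.request.urlopen(token_req)  # not executed here: placeholders only
```

This is what separates the “pro developer” tier they’re describing from interactive use: no user signs in, so pipelines and scheduled jobs can sync and deploy on their own.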

48:17 That was a big issue for me. So for a lot of that, I was like: well, don’t use Dataflow Gen2, because I couldn’t push it into the Git experience. I couldn’t synchronize it. I could do the work on it, but I couldn’t get it stored anywhere. That was not good for me. There were a lot of things like that. But now that more of those things are covered by the CI/CD, I’m cool with it. But that first stage is: think of Git as a backup. That’s your first stage. What you’re talking about now, Tommy, is: okay, now I want a more complex deployment strategy. And this is

48:49 where we get a bit more enterprise-centric. And this is where the team starts growing, right? Before, I had maybe one or two developers working in the workspace together, but when your team gets up to five people and you have multiple people pushing things in and out, the size of the team is scaling up. From a business leader’s standpoint, the business leader wants people to show up and say: look, hey, we need to build more reports faster, we need to get data models done quicker. What’s the best way to do that? Let’s add another person to the project. Great. Adding

49:21 more people to the project doesn’t necessarily make it go faster. If you can only edit one semantic model at a time, putting two people on it to edit and tune it isn’t a good thing, right? Because now you’re trying to coordinate between two people: what did you change, what did I change, are your changes interfering with my changes? It doesn’t work as well. But when you start incorporating Git, you’re able to put a process in place — here’s the process — and you can throw more people at the process you’re building, which then makes it go faster. So now the business leaders have a better, more compelling story. Hey, we want to go

49:53 faster and build more things. Great — throw more people at it. They just jump in, learn the process, and now everyone can play with the same files together in the same space. That’s right; that’s the idea of Git. The Git experience lets you extend the space of developers. And that’s also the transition, for a lot of users, in their skills, right? Back to that level one you mentioned — I like how we’re using all these levels; maybe it’s because my kids are asking me about Mario 64, so I’m in this level-one, level-two thinking. Your level one is that backup, where I don’t

50:25 really have to do anything else — I have to be aware of a few things, but critically, my process and my knowledge don’t change. But then there’s this shift, right? If I’m going to start relying on Git more, well, now I have to be aware of not just Git but TMDL too. Now I’m also dealing with some different code, and how I look at those reports is going to shift. Honestly, though, I think a lot of teams can survive on the backup, or even the dev–prod setup, where if the dev

50:58 prod can still be something where I’m just dealing with Desktop and the report. If I know TMDL, great — it’s not a requirement; it can make my life easier, but it’s not a deal breaker. And I think, too, for both the heavy Git users and the heavy Fabric users, the fact that there are now no deal breakers left — or they’re limited — where it used to be: hey, sorry, dataflows can’t be used, it doesn’t have all the features. But there’s a shift, Mike, and I’m going to keep asking this question every three

51:31 months, I think, and we’ll see when the answer changes. But there is a shift, with all these features coming out in CI/CD — even the Fabric command line, which I think is really cool. At what point am I expecting a heavy PowerBI or Fabric pro — someone I might hire, or lease for a project I’m working on — to know Git, and to know TMDL? Because to me, you’re really not doing TMDL unless you’re doing Git. That’s

52:03 kind of how I feel — they’re very intertwined. They’re dancing to the same song. Mhm. Let me ask you this question: if you’re hiring an expert in PowerBI or Fabric, do they have to know Git too? Is that one of the prerequisites — Git now, or later? Are you looking to hire someone with the expectation that they should know Git? Give me your opinion, Tommy, first, and then I’ll jump in with mine. I think if I’m hiring someone who says they’re in the Fabric

52:37 experience and they’ve been doing Fabric — I am looking for that, and I’m going to be a little disappointed if it’s not there. It depends on how big my team is, right? If my team is one person, and I’m hiring that one person to come in and do Fabric — if they’ve experienced it a little bit, I’m okay with that. But if you are a team of six people and I’m hiring a seventh person to come in and use Fabric with us, I’m going to expect that I’m hiring someone who already has

53:10 understood it, knows how it works, and can at least speak to why Git is important and how it fits — give me some patterns, right? So I’m not going to want to bring in someone who’s that green. If my team’s larger, yeah, I’m going to assume we’ve already agreed that we’re using Git on some things. If my team’s smaller and I’m bringing that first person in, I want you to have knowledge of it, but we’ll probably train you and get you up to speed on what Git is doing. So I think it depends on your team size, because I would rather

53:43 hire, for a larger team, someone who’s going to be able to hit the ground running, as opposed to bringing them in and then going through all the Fabric Git integration stuff to get it figured out. No — and the team matters here, because there’s a project I’m about to help launch for Fabric where we’re not talking Git at all, even though it’s going to be pretty extensive. It really depends, I think, on how much that team is already aware of what’s available in Fabric, or that skill. It’s funny how these Fabric skills — data engineering, notebooks, even semantic

54:18 models — where I’m almost equating it to: if you’re not experienced in those, there’s no reason to know Git. Previously, that was never the situation — I didn’t need to know Git to do anything with notebooks; I didn’t need to know Git to do anything with semantic models or report building. But if you’re more experienced, and if you have a larger organization, yeah, they go together as part of the package now, to me. For my team personally, I would want that. But if I’m dealing with a client and we’re just getting

54:51 introduced to a lot of this, I’m not even touching it. I’m not even touching it for the backup yet, because that’s too many concepts. So it’s a curious situation: when does that become just a general prerequisite? Say that again? It’s curious to me: when do Git and source control just become a general prerequisite? Again, I think, however you look at this, there’s going to be some value in it regardless. If you’re just starting out, if you’re just coming from PowerBI, you’re probably

55:22 going to be very familiar with OneDrive and SharePoint — that’s where you’re going to store your stuff. That’s fine. But I think your pattern changes very quickly once you have a workspace that’s in Fabric, or a workspace that’s on Premium. To me, immediately, everyone should be using Git as a way of backing up their stuff. Period. That should just be done; we should just have that working. There’s so little effort involved that it just makes sense. Also, if my team is allowed to edit reports in the powerbi.com service, 100%

55:56 I’m saying you need to turn on Git for that workspace. That’s one thing I’m immediately turning on, because that way, if I edit in the service or I edit locally, I can at least do a diff compare on what changes were made. Someone in the service may make a change and synchronize those changes; I may make a change in Desktop, or download something from SharePoint and make a change. I want my source of truth to be the Git repo. That’s the shift that’s happening here. So I really like that experience, and I feel like Microsoft has done a decent job of making it easy to use, easy to

56:29 get started with. Love it. All right, my last question. I’ve got a scenario for you — pretty basic. You’ve got a person, and you need to upskill them in one of three areas; you tell me the priority. And obviously there are a lot of gotchas here, but more or less: notebooks, AI, Git. The organization, we’ll say, is pretty far ahead with Fabric — they’re doing a lot of things in Fabric already, and this is a person you’re going to rely on a lot. Where are you going to prioritize their upskilling: Copilot, notebooks, or Git?

57:04 Good question. Let me think about that for a minute. You go — say yours first, and then I’ll follow up with mine. I think I’ve got an idea, to be very honest. I’m going to start with Git, because I’m assuming this organization, if it’s beginning to dabble in this, really starts here. As much as the others are maybe more urgent, if you don’t have the Git background — again, that’s where the process is going to begin and end. And I can teach you, or you can upskill in Git, pretty quickly. It

57:36 doesn’t have nearly the same conceptual load, or take the same amount of time, to get hold of the basics — push, commit, stage, merge — because we’re already going to be doing that, and it crosses every platform and product you’re going to touch in Fabric. So for me, out of those three, I would start with Git. Huh, interesting. I think I’m of the opinion that I would actually put them in notebooks. So, Git is interesting — Git is very much a process-building thing, and I’m not

58:09 sure a lot of organizations have the stomach to slow down and build process at this point, especially when they’re just getting going, right? PowerBI is a commodity now. We’re building lots of reports; there are lots of things happening. There’s probably stuff out there that’s really good and very useful, and there’s probably lots of stuff in the organization that’s just mediocre at best — it gets the job done, but it’s not up to our standards, basically. Yeah. So I feel like a lot of the pattern I see right now is: build something that works,

come back, and then optimize on your second build, right? So a lot of what I'm doing is taking these legacy Power BI things, like Dataflows Gen1 and Gen2, and migrating them into notebooks. I do so much work with notebooks, Tommy. And honestly, every time I get to a place where I'm going to build some data engineering, the sooner I can get to a notebook, the happier I am. Every time, I start smiling. I get really happy about

the experience, the places I get to go do things. It's so flexible, it does so much, and it's so much more cost-effective in compute units than the Dataflows Gen1 and Gen2 area. I just like it. I can do anything I want there. I can think about what's really right for the data models and how I shape the data. It just makes sense in my brain. Now, I'm a little bit slanted here, because I come from the data science notebooks arena more than most people. But if I'm looking at it from value back to

the business: there's definitely value in Git, yes, but I think there's more value if I compare that against notebooks and running things more efficiently. There's more value in running your system more efficiently with notebooks and saving compute units so I can do more things. So that's where I would put my people first. And once we've got those notebooks in and running, then I'm probably going to step back and say, "Look, let's start integrating Git in various places." And again, yeah, if we're talking about the very easy Git scenario, just synchronizing a

workspace to Git for backups, it's not hard to learn. It's not difficult to get going, right? But if you want to do full dev/test/prod branching with multiple merges and pull requests, and then pulling code down from a repo and using it in VS Code, there's a whole bunch of other things that could be going on here that can make this really robust from a process standpoint. But process is going to slow people down. Yeah, when you add process, it initially slows things down before it speeds them up. I don't even know if you're aware of this, Mike, but
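The "easy Git" scenario Mike describes, synchronizing a workspace to Git for backups, can also be scripted. Below is a minimal Python sketch that builds the request for a "commit workspace to Git" REST call. The endpoint path and payload fields are assumptions modeled on the public Fabric Git APIs, and the workspace ID is a placeholder; check the official Fabric REST API reference before relying on any of this.

```python
# Sketch: building the request for committing all workspace changes to Git.
# The endpoint path and body shape are assumptions based on the public
# Fabric Git APIs; verify against the official docs before use.
import json
from typing import Any

FABRIC_API = "https://api.fabric.microsoft.com/v1"


def build_commit_to_git_request(workspace_id: str, comment: str) -> tuple[str, dict[str, Any]]:
    """Build the URL and JSON body for committing workspace changes to Git."""
    url = f"{FABRIC_API}/workspaces/{workspace_id}/git/commitToGit"
    body = {"mode": "All", "comment": comment}  # "All": commit every changed item
    return url, body


# Usage (the actual POST would need an Entra ID access token, omitted here):
url, body = build_commit_to_git_request(
    "00000000-0000-0000-0000-000000000000", "Nightly workspace backup"
)
print(url)
print(json.dumps(body))
```

Wrapping a call like this in a scheduled job is one way to get the "backup" benefit of Git sync without anyone clicking through the portal.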

60:51 you've already brought out two of the nineteen chapters of The Mythical Man-Month. So, for those of you listening, a lot of what we're talking about here came from software development. Mike, you talked about throwing more people at it (the myth that it will go faster) and the idea of the second-system effect. So, man, are we nerds? We are. Can we be less nerdy? No, because we're talking about the Git stuff. But that being said, my closing thought for those of you listening: if you think Git and Python and the

61:23 notebooks are nerdy, they're not. They're cool, even if you're not fifteen. They're actually pretty cool. And here's the thing with a lot of this stuff: it's a playground we're living in. People are going to be asking for this; people are asking for this already. I'm finding so much success with Git without having to be a nerd. Personally, I'm dumb, and I can use Git, and there's a lot here where you can find a lot of process and also a lot of CI/CD.

61:56 So, my last thought: love what we're doing with Git, love what we're seeing here with source control, and there's still a ton more. Mike, I'll drop it to you for your closing thoughts. Yeah, I like where we're going here. Microsoft is taking this very seriously and making a lot of improvements. I will argue, though, that Microsoft is focusing a lot on continuous integration and continuous deployment; they're not necessarily focusing on DevOps. DevOps is more about the process you use, which includes continuous integration and continuous deployment. So I just want to be very clear that this is part of a broader

62:30 solution. When we talk DevOps, CI/CD is not DevOps. DevOps is a bigger topic: it's about the process, the team, how people interact with your code. CI/CD is a small part of a bigger story. And one thing I see right now, especially in the Power BI community, is that a lot of people are very excited about CI/CD. Good, I'm happy about it. Microsoft is building a lot of things for CI/CD. Great, I love it. But at the end of the day, CI/CD is not everything you're going to need. It's the

63:04 potatoes of your dinner, right? It's part of the meal, but there's other food: you need some vegetables, you've got to have some meat, some variety of other things in there. You can't just eat one item and call it done. So this is part of a larger meal. With all that said, I do want to say thank you all for hanging out and chatting with us. I know we had a longer news section today; thank you for sticking around for that portion. I hope this story around continuous integration and deployment was interesting to you. We think this is a great article. Definitely check out the link in the description below if

63:36 you want to learn more about what's new in Fabric CI/CD for May 2025. Automation of Git APIs, automated deployment using Fabric APIs, and now more around Fabric CI/CD: these are going to change how your team builds things. So we want you to know about them, unpack them, and understand how this all works. Tommy, where else can people find the podcast? You can find us on Apple, Spotify, or wherever you get your podcasts. Make sure to subscribe and leave a rating; it helps us out a ton. Do you have a topic, an idea, or a question you want to ask us? Well, head over to powerbi.tips/empodcast.

64:10 Leave your name and a great question. Keep them coming. And finally, join us live every Tuesday and Thursday at 7:30 a.m. Central, and join the conversation on all PowerBI.tips social media channels. Thank you all so much, and we'll see you next time. [Music]
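To make the "automated deployment using Fabric APIs" idea from the outro concrete, here is a hedged Python sketch of promoting content between deployment pipeline stages, the dev/test/prod flow discussed above. The endpoint path, body fields, and all IDs shown are assumptions modeled on the public Fabric deployment pipeline APIs; verify names against the official reference before using them.

```python
# Sketch: building the request for deploying items from one deployment
# pipeline stage to the next. Endpoint and field names are assumptions;
# the IDs below are hypothetical placeholders.
from typing import Any

FABRIC_API = "https://api.fabric.microsoft.com/v1"


def build_deploy_request(
    pipeline_id: str, source_stage_id: str, target_stage_id: str, note: str
) -> tuple[str, dict[str, Any]]:
    """Build the URL and JSON body for promoting content between stages."""
    url = f"{FABRIC_API}/deploymentPipelines/{pipeline_id}/deploy"
    body = {
        "sourceStageId": source_stage_id,  # e.g. the Test stage
        "targetStageId": target_stage_id,  # e.g. the Prod stage
        "note": note,                      # appears in deployment history
    }
    return url, body


# Usage (an authenticated POST with this URL and body would trigger the deploy):
url, body = build_deploy_request(
    "pipe-123", "stage-test", "stage-prod", "Promote approved changes to Prod"
)
print(url)
```

Calling something like this from a release workflow, after a pull request merges, is the point where "CI/CD features" start turning into an actual DevOps process.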

Thank You

Want to catch us live? Join every Tuesday and Thursday at 7:30 AM Central on YouTube and LinkedIn.

Got a question? Head to powerbi.tips/empodcast and submit your topic ideas.

Listen on Spotify, Apple Podcasts, or wherever you get your podcasts.
