Adopting Copilot Standalone for Power BI – Ep. 428
Mike and Tommy break down what Copilot Standalone for Power BI is and why it changes the consumer experience for chat-with-your-data. Then they role-play a realistic rollout plan—starting with real problems, measurable wins, and AI-ready semantic models instead of a blanket mandate.
News & Announcements
Mirroring in Microsoft Fabric: benefits, use cases, and pricing
The episode starts with a set of Fabric news items, including a useful explainer on Mirroring. Tommy highlights the difference between standard mirroring (connector-based, for supported sources) and open mirroring (a flexible pattern that lets almost any app write change data into Fabric). Mike calls out a practical governance question teams should test early: inserts and updates are straightforward, but how do deletes get handled end-to-end?
- Mirroring in Microsoft Fabric explained: benefits, use cases, and pricing demystified — A deep dive into what mirroring is in Fabric, where it fits versus other ingestion patterns, and what to consider for operational-to-analytic replication.
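Mike's delete question has a concrete shape in open mirroring, where each change row landed in the landing zone carries a row-marker column tagging it as an insert, update, or delete. The sketch below is illustrative only: the `__rowMarker__` column name and the 0/1/2 codes follow Microsoft's published landing-zone format as we understand it (verify against current docs), and the table schema and file name are invented.

```python
import csv

# Sketch: writing a change-data file for an open-mirroring-style landing zone.
# Per the public docs (roughly): 0 = insert, 1 = update, 2 = delete. The exact
# column name and codes should be checked against the current spec.
ROW_MARKER = "__rowMarker__"
INSERT, UPDATE, DELETE = "0", "1", "2"

def write_change_file(path, key_column, changes):
    """changes: list of (marker, row_dict); delete rows only need the key column."""
    fieldnames = [ROW_MARKER, key_column, "name", "amount"]  # hypothetical schema
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)  # missing fields -> ""
        writer.writeheader()
        for marker, row in changes:
            writer.writerow({ROW_MARKER: marker, **row})

write_change_file(
    "00000000000000000001.csv",  # landing-zone files are sequentially numbered
    key_column="id",
    changes=[
        (INSERT, {"id": 1, "name": "widget", "amount": 10}),
        (UPDATE, {"id": 1, "name": "widget", "amount": 12}),
        (DELETE, {"id": 1}),  # delete carries only the key; mirroring drops the row
    ],
)
```

The point of the sketch is the last row: end-to-end delete handling means the upstream writer must emit an explicit delete record for each removed key, which is exactly the design question Mike raises.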
Updates to database development tools for SQL database in Fabric
Tommy brings up Fabric SQL database tooling improvements—source control, SQL project workflows (DACPAC-style), and better dev experiences in tools like VS Code. Mike shares his shift in perspective: with Fabric’s pay-per-use model, a SQL database is no longer something you have to keep running (and paying for) 24/7, which makes it much more attractive for smaller transactional workloads and app backends.
- Updates to database development tools for SQL database in Fabric — What’s new for SQL database development in Fabric, including project-based workflows, source control, and CI/CD direction.
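To make the "DACPAC-style" workflow concrete, a CI/CD step typically builds the SQL project to a `.dacpac` and publishes it with SqlPackage. The sketch below only assembles the command; `/Action`, `/SourceFile`, and `/TargetConnectionString` are standard SqlPackage flags, but the paths and connection string are illustrative, not a tested Fabric configuration.

```python
# Sketch: assembling a SqlPackage "publish" command such as a pipeline might
# run to deploy a SQL-project-built DACPAC to a SQL database in Fabric.
def build_publish_command(dacpac_path: str, connection_string: str) -> list[str]:
    return [
        "sqlpackage",
        "/Action:Publish",                              # deploy schema changes
        f"/SourceFile:{dacpac_path}",                   # built by the SQL project
        f"/TargetConnectionString:{connection_string}",
    ]

cmd = build_publish_command(
    "bin/Release/MyDatabase.dacpac",
    # Hypothetical connection string; Fabric SQL databases support AAD auth.
    "Server=contoso.example.com;Database=MyDatabase;Authentication=ActiveDirectoryDefault;",
)
# In a real pipeline this would run via subprocess.run(cmd, check=True).
```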
Intelligent Data Cleanup (Smart Purging) for Fabric Warehouses
A smaller article with a big impact: automatic garbage collection for Fabric warehouses. Mike shares a real-world lesson from the lakehouse world—storage costs can balloon when no one runs cleanup/optimization—and argues that “smart purging” style automation is the direction these platforms need to keep storage efficient over time.
- Intelligent Data Cleanup: Smart Purging for Smarter Data Warehouses — Overview of automated cleanup/garbage collection aimed at reducing storage bloat and maintenance overhead.
Main Discussion: Adopting Copilot Standalone for Power BI
What Copilot Standalone is (and why it matters)
Copilot Standalone is a full-screen, left-nav experience in Power BI/Fabric designed for “chat with your data” without forcing users to start from a specific report. Instead of having Copilot scoped to one report at a time, the idea is that users can ask questions, find relevant content, and pivot across the data they’re authorized to access.
- The next era of Copilot in Power BI: Chat with your Data (Preview) — Microsoft’s preview announcement and capability overview.
Don’t start with a mandate—start with a problem
The core of the role-play is a rollout scenario: “You have 12 months to ensure the organization adopts Copilot Standalone—what do you do?” Mike immediately pushes back on the premise: making “everyone use Copilot” is a solution looking for a problem.
Instead, they propose a practical approach:
- Inventory pain points: Where is the biggest skill gap or time sink in your data culture?
- Map capabilities to real work: Search and discovery, summarization, Q&A, and data agents each solve different needs.
- Pilot with a small group: Train prompting, document patterns that work, and refine.
- Measure outcomes: Success is time saved and better decisions—not raw “number of prompts.”
“Time trials”: measure where time is actually spent
Mike borrows an operations concept: measure how long common tasks take (“time trials”), then target the largest time consumers. If Copilot reduces a 2-hour activity to 20 minutes, it’s worth talking about. If it saves 10 seconds here and there, it’s unlikely to matter.
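The time-trial math is simple enough to sketch: annualize the per-task saving and see whether it clears a bar worth acting on. All numbers below are illustrative.

```python
# Sketch: annualized hours saved from a before/after time trial.
def annual_hours_saved(before_min, after_min, runs_per_week, weeks=48):
    return (before_min - after_min) * runs_per_week * weeks / 60

# A 2-hour task cut to 20 minutes, three times a week:
report_drafting = annual_hours_saved(before_min=120, after_min=20, runs_per_week=3)
# A 30-second lookup cut to 20 seconds, twenty times a week:
micro_lookup = annual_hours_saved(before_min=0.5, after_min=0.33, runs_per_week=20)

# report_drafting -> 240.0 hours/year: worth a rollout conversation
# micro_lookup   -> ~2.7 hours/year: probably noise
```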
Data prep is the non-negotiable
Both hosts come back to the same constraint: Copilot is only as good as the semantic models and metadata it can read. If the model lacks clear measures, definitions, relationships, and descriptions, “chat with your data” won’t be trustworthy—and adoption will stall.
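One practical way to act on this is a metadata audit before turning Copilot on. The sketch below assumes a hypothetical export shape (a dict of tables with measures/columns and optional descriptions, such as you might assemble from Tabular Editor or a model metadata dump); it is not an actual Power BI API payload.

```python
# Sketch: flag measures and columns with no description, since undocumented
# metadata is exactly what makes "chat with your data" untrustworthy.
def missing_descriptions(model):
    gaps = []
    for table, items in model.items():
        for kind in ("measures", "columns"):
            for name, description in items.get(kind, {}).items():
                if not (description or "").strip():
                    gaps.append(f"{table}[{name}] ({kind[:-1]})")
    return gaps

model = {  # hypothetical export of one table's metadata
    "Sales": {
        "measures": {
            "Total Revenue": "Sum of invoiced amount, net of returns.",
            "Margin %": "",  # gap: Copilot has nothing to explain this with
        },
        "columns": {"OrderDate": "Date the order was placed."},
    },
}
gaps = missing_descriptions(model)
# -> ["Sales[Margin %] (measure)"]
```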
They also flag a likely next wave: once more teams turn it on, the community will have a real conversation about capacity (CU) cost vs. business value, and where Copilot is actually worth it.
Episode Transcript
Full verbatim transcript — click any timestamp to jump to that moment:
0:34 Explicit Measures podcast with Tommy and Mike. Good morning everyone and welcome back to the show. Good morning Mike. Good morning everyone. Today's main topic, we'll jump right in today on the main topic, but we'll have some news here in a minute. The main topic today is we're going to do a what-if scenario. Imagine if kind of what we're going to go here. Imagine if you are adopting Copilot Standalone for PowerBI. What does this look like for your organization? How are you using it? And what does it look
1:05 like for you to roll this out? So, kind of just think putting yourself in the place of the BI leader or admin and trying to figure out how do you communicate this to your organization? Where do you communicate its value and and how do you roll it out? So, that’s kind of what we’re thinking here for the co-pilot standalone as an option for your organization. With that, Tommy, you want to do any introductions or news items? Yeah, what? Summer’s upon us. Tomorrow is in Chicago. It’s our last day of schools tomorrow. And with an eight, a six, and
1:38 a four, it can be a little hard to think about how to keep them busy, so to speak. Because if you go back to my summers, Yeah. If you go back to my summers, which I would change, it was Sports Center, Sports Center, two bowls of cereal, followed by the same Sports Center, followed by Nintendo. So, I don't want to do that to them kind of thing. So, Mike, let me start off with you. What good tips do you have? We're PowerBI.tips, we're PowerBI tips, fabric tips, but what parent dot tips do you have for us for the summer?
2:10 Well, it’s it’s a lot harder for me because I’m I’m in my basement all summer, right? So I I’m here at my house. I’ve been here for many many like months throughout the summer. So like yeah, like we do a lot of like adventury things kind of I guess nearby adventures. There’s a lot of like museums. There’s like some kids museums that are kind of interactive. Parks seem to be a thing. thankfully my wife is able to stay home and she’s able to kind of like take care of kids and make sure they got to go places. There’s a little bit of friends thing going on.
2:41 there’s an age bracket though where kids start getting old enough to start doing work and things. So last summer we had my son working at a restaurant nearby and that kind of really occupied a lot of his time. It actually kept us kind of close to home honestly when he we’re working around his work schedule. He’d work on weekends and weekdays. So there was like some additional scheduling thing there that that he had to be busy with that it wasn’t something like we were able to like step away for long periods of time from our family or
3:11 area cuz needed to do some work. So I think we're going to start seeing a little bit more of this. I'm I'm in the teenager age group a little bit. I've got a 13 and a 15 and I think you're going to start seeing them do a little bit more of their own friends experience. Maybe more so this summer than last summer. And we're also seeing a little bit more work creep in there as well. So, when I was really little, it was just kind of bop around the house and get bored and try and find friends to play with who were who are also bored. Yeah, that's kind of that was kind of my
3:42 What kind of neighborhood did you grow up in? Did you grow up in a neighborhood where you could go out and have friends or did you grow up in a neighborhood that that didn't have that nearby? There's a lot of games on the street and and so that that was a total thing for us in our first when we first moved from New York, there's a lot of places on the street. Then we moved to a we'll call slightly older neighborhood. So it got a little harder. But then it was like you just had to find people to bike to. But it's interesting with what you're saying with 13 and 15 because yeah, the job's definitely got to be a part of that. You you got to keep you got to keep them
4:13 busy. Well, they want to go get like these fancy clothing things or these new Nintendo Switch stuff. And I’m like, yeah, I I’m okay with you that. But you got to earn that stuff, man. I don’t know. I’ll buy you some stuff, but like I’m not going to go buy you the the most expensive of everything all the time. You got to earn a little bit of that. Put some skin in the game on this. Exactly. So, one thing we’ve we started doing last year is I don’t want to just tell my kids, “Here’s the work you’re going to do. Do it.” Because if that was me, I’m going to go the other direction. So I wrote a contract and the contract
4:47 entails this, we call it the summer contract, and it's if you want to watch TV, Nintendo, yada yada kind of thing, you have to achieve these following things before you do that. Oh I do like that. We have we don't have a contract for this but we do the same thing like there's there's like a list of chores and things that got to get done and signed off on. I've often contemplated building like a power app where my kids could like do their chores and take a picture of it when they're done. That way I know it's actually completed because right now they'll say, "Oh yeah, it's done." I'm like, "I don't think it's done to my standards yet."
5:18 So, but the thing that I’m trying to teach them with this too was come back like like for example like 30 minutes of math, 30 minutes of learning Italian, 30 minutes of reading chapter book, whatever. But what I said is go read it. Come back to me if you disagree with anything before you sign it because once you sign it the expectation. Yeah. You signed it. Once you sign it, you signed it. So my oldest I know I’m going to work I need to work on more on the art of negotiation. My my middle one who signed it or before she signed, she’s like,
5:50 "Hey, there's a lot more I have to do than what I get." I went, "Okay, well then we'll work with this kind of thing because I want them to own it." Yes. Rather than me telling them. So it it's worked well because they signed it. It's not just me telling them to do. Granted, did they have much of a choice? I didn't say that, but so it's a negotiation a little bit. There's negotiation, but they own it kind of thing. So yeah, that's how we kind of keep them busy during the summer. But yeah, our lives don't change that much. We're still in the fabric
6:22 dived into all these new things. It's just more louder kids at the house. Our spouses, I think, take the brunt of this a lot more than we do like in this space because we have to like close doors, come downstairs, focus on things like I feel like that's it doesn't change. And so, I think something too is like when when my my both my parents were teachers, so when I was a kid growing up, we I had both parents like just stop they would do other odds and end jobs maybe around through the summer. My dad did electrician like helped out with an electrician or did some help jobs here and there, but like
6:53 when we were kids like my parents were there all the time cuz they were both off for the summers. It’s different now that I work all year round and there’s no like summer time for me. So I think it’s also a little bit interesting when kids are like what do when do you take a break? I’m like I don’t take a break. It’s you just keep going. That’s how it works. Like when do you stop eating? So yeah, exactly. It doesn’t it doesn’t happen. Like we just keep going through the things. So anyways, very interesting stuff. Good conversation there. All right. If you have any comments of
7:23 things that you are planning to do as a family throughout the summer or you do with your your family units, let us know in the chat. Let us know what you what kind of what kind of activities do you do. Maybe you can give us some ideas. I will say this, this summer we’re doing something a little bit unique. My son is really interested in computers and computer science. He’s like, I want to start learning some of this stuff. very fun and interesting for me cuz this is a space that I’m very interested in. Last night as I was going to sleep, my wife was watching something on like
Instagram or whatever and I'm I'm sitting there going reading about DuckLake or another database system. Yeah. DuckDB, but now there's a DuckLake. Apparently there's there's another thing on it. And I'm like someone sent an article. I'm reading this whole thing. I'm like internalizing. Okay, interesting interesting. And my wife started talking to me like, "I'm reading about databases." She's like, "Oh, of course you are." So, like, okay. All that saying is my son is starting to like some of this technology space. He's liking computers. He loves playing on on programming. So I I think one of the
8:25 things we're going to try and do this summer is we're going to kind of extend school for him. And so, a friend, family, and I are going to have him and another young young gentleman, we're going to start teaching him how to program things. We're going to start teaching how do you use Cursor, what does programming look like, how do you build a UI, what's a website, like a lot of the fundamentals that you kind of like what's a component, how do you build components. So we're going to start trying to teach some of the fundamentals. Maybe there'll be another separate job in there somewhere or something else, but at least getting
8:55 their hands on the technology and helping them learn it. The goal is to spend, between myself and another dad, we're going to spend some time to kind of curate some content and then from that content, we'll have them go through some classes. There's actually a lot of free Stanford and a lot of free Harvard classes that are out there. A lot of introduction to Python and introduction to classes. You can just audit that without actually taking the credits or getting the certificate. It might be a good thing for them to go through. Like you take some of these multiple-hour classes, go through and learn what
9:27 they’re saying and give you some fundamentals to what this I think will be useful. So, while I think the AI will do a lot of really good things for you, I think it’s kind of what we’re talking about today, I don’t think it will give you everything and I think you still need some core fundamentals like if you’re going to build websites, you’ll need to know what like padding looks like and with the frames and like what is a flexbox. I think there’s there’s conceptual things that make sense that you should just understand how to do them. And then I think from there you can then step back and say, “Okay, now that I know understand that concept, I
9:57 don’t need to write the code every time.” Yeah. I can let the AI figure it out a little bit. Mike, I’m a little disappointed because I want to be able to ask your son in August, how many colors belong on a report page? What’s the maximum amount? Where’s the PowerBI? Where’s the fabric in this? I I’ve heard all the developer stuff, but where where’s the data transformations, ETL, all acronyms that are fun? I don’t I don’t know if that’s going to be much of a thing anymore. I don’t know. We’ll see. Hot take. I look looking at the future of things
10:28 here with Copilot and what the simple things copilot can do. I’m not sure spending a ton of time learning all the properties panel inside your PowerBI report for every single visual is is really useful for your time. I I don’t Welcome to prompt engineering. Yeah, I’m thinking so. So, I think it’s going to really shift how people do work in the future. So, I’m already starting to see some trends and shifts and things which we’ll get into and I think more as we talk about this conversation. But, all right, enough of that kind of very large yammering about just life and
10:59 things in general. Let’s move over to some news items. Tommy, what what news things do you have for us today? Yeah, we’ll run through these. We have three good ones here. The first one we’re going to start with is Open Mirroring Explained for Microsoft Fabric. I’m really happy that Microsoft and this is in the podcast this is in the YouTubes and all the things the link but I think there’s been I don’t want to say confusion but lack of clarity for a lot of people unless you really dived into it on mirroring compared to open mirroring what the difference is and the biggest difference really is mirroring
11:31 is something that has standard connections or standard sources that you can connect to. Open mirroring is a flexible approach that allows any application to write changes to fabric. And this is useful for if you're using legacy databases, external data providers, or your own custom-made one. We mentioned this on Tuesday. I I wish I remembered the person's name, but they use an Excel file as their open mirroring source. So it's really what your curiosity is or what you
12:03 can think of can be enabled for open mirroring. I I like I love how they kind of outline the article and talking about when to use it. database mirror and fabric in general and then talk about open mirroring. So this is great. This is a great article. I think it was written. Yeah, there’s a little bit of like confusion here. especially when they the way this article is written even I am under the understanding again I’m just going to kind of throw this down here and we’ll see what happens my understanding of like SQL DB mirroring and maybe
12:36 me even mirroring for Azure Postgres. Mhm. My understanding is those two sources actually make physical copies of things and actually bring data to the lakehouse in this mirroring experience. However, I'm not sure that mirroring does the same effect inside the Databricks catalog. When I look at the Databricks catalog, it's it's not about I'm not I'm not physically making a copy of things and bringing them in. It's like a an elaborate shortcut that goes back to
13:07 Databricks and then you can use it. Now, I don't again I haven't tested Snowflake so I can't speak to that fully, but if Databricks is doing this whole kind of like look Databricks is going to manage the table. There's a bunch of files on disk that are there. And then Snowflake is also doing a very similar thing except not with parquet with Iceberg and Arrow files. So if Snowflake is also in this mix and maybe Snowflake keeps its information inside Snowflake and then you're getting this like shortcutting mirroring experience
13:39 here. I I feel like the mirroring language is a little bit Yes, mirroring makes sense. It is taking tables and bringing them to your lakehouse. Awesome. Do love that. But I think the the language of the mirroring itself is kind of different between different data sources. Am I physically making a copy or am I not? And I don’t know if there’s a clear distinction between those two pieces. Does that make sense, Tommy? What I’m describing. And I think this is the this is part of the bigger pro kind of concern I think a lot of people have
14:11 is there are so many things to choose from and like well why can't I just use shortcuts or I know this is writeback but okay what about what about the legacy things like I would even call Power Apps too like well if I could do writeback here when do I use this how do I not forget the how do you use this it's when do I use this when is this more valuable than x y and z and I think that's kind of the big thing here open mirroring All these things are great features, but all the different use cases are still, I think, very undefined. And I think that's to me
14:43 that's kind of the bigger thing we're seeing with fabric where unless you spend a ton of time reading, researching, and honestly just tinkering around, it's it's still going to be that black box. Yes. And and this other thing here that's interesting here is when you talk about mirroring there is there's also the fact that storage for mirroring in Fabric is free. So with these mirroring experiences yeah this is I this is one thing I just want to point out here the pricing model
15:14 for these is you get during the mirroring experience. So let's say okay for example right you're mirroring a SQL database right SQL's doing things data is going coming in and then you're making sure that the tables in your fabric match the tables that are in the SQL server well there's a certain amount of like writing reading things and so in lieu of this Microsoft is giving away one terabyte of mirroring for free so if you have a small SQL database yeah I don't know how fast that gets eaten up at the one-terabyte level I've done some real time streaming things which can eat up that terabyte very quickly But if
15:46 you're streaming real-time data and one terabyte kind of goes a long way when you're doing copies of smaller tables and getting them in and the fact too Mike that that's not an offer for the Microsoft Fabric databases right you're pay that's a storage cost on its own compared to the normal capacity so yes I find that interesting that they're offering this mirroring terabyte coupon so to speak go buy your Nutella and get a terabyte of data but the fact that Fabric databases
16:16 don't there's no trial with fabric databases anymore or or we're getting to that point where no there's no you get two gigs of a database and then you start paying for storage you're paying for storage immediately outside your normal capacity interesting they do that so okay the only other thing I want to unpack with you here just a little bit is a sticking point it's a challenge that I have that I seem like I face when I look at mirroring things one of the challenges I look at when I look at open mirroring is I look at open mirroring and okay when things are changing on the database side
16:48 right let's imagine I'm bringing new records so the concept here is you have a table it's inside the lakehouse and then when you use open mirroring from my understanding I haven't played with it enough to really get my full head around it that is as you update things like a change a record update something you identify kind of like the keys values of it and then the the mirroring port will then automatically know oh you've updated this record therefore I will update that one record and make all the changes for you. So, it kind of does it in an efficient way, right? You just I'm just going to keep bringing the
17:19 mirroring all the changed records and the mirroring just kind of figures out how to update it like that. That’s really nice. It it simplifies a lot of the loading of data. The one thing I’m not sure I’m very clear on is what happens when your database physically removes files or records. So this is one thing that I’ve always had challenge with in the past has been like whenever you’re doing a database mirroring or updating experience you either copy the whole table but when you have deletes
17:51 that's harder to manage because you have to compare okay for this given period of time this was the list of IDs that made up that table but then tomorrow when I look at the same data here's a new list of IDs that are current and valid and that have that record. So I I don't really see a lot of language and maybe this is just me not understanding mirroring very well. I there could be others again chat if you are also experimenting with mirroring. What are you finding and how
18:22 do you manage deleting things inside open mirroring? I'd like to understand what that means and do I have to go through so something has to go through and compare what were all the IDs that I had what are the IDs that are missing and then at least tell open mirroring and say delete these IDs right here's there's got to be a point in time where you say these things are deleted does that make sense no which again and there was an update at the Microsoft Build about workspace warehouse snapshots too right so that's
18:55 covered there but the fact that yeah there there's these undefined and honestly unless you tinker around Mike like unless you're actually testing this out how could you roll this out and until this because that's a data governance issue that's a compliance issue before you actually start getting out of production you're talking about fundamental things that could make or break a lot of processes here so that that's a concern and if you don't know it it's an unknown unknown and those I
19:28 really stay away from anything production-wise or anything I would recommend to a client because you’re like yeah we could do it sounds great works cool but then that situation occurs. I just bring up that point. I don’t I wouldn’t say I would not develop with it. it’s not to the point where I’d say don’t use it because it’s not ready. I don’t think it’s I don’t think that’s the case. I think really the case is like no I’m not worried about it. That’s not what I’m trying to say. What I’m trying to say is I’m trying to say this is something that we need to design for. So, how like the thing is you’re going to create new records, you’re going to update existing
19:59 records, and you're going to delete records. Some data systems allow what they call soft deletes, right? Just the record still stays, but it's marked as deleted, and then it stays around for a period of time, and then maybe at some point it's purged. I don't know. Sure. But something like that, soft deletes is a great solution to resolve this one. But it also means your database could could get really large and have a lot of extra records in it that are being deleted. Do they need to stay around? Is there a period of time? Do they There's all these other kind of like questions around the deleting side of things
20:30 because you as a database manager, I don't want my database to have a whole bunch of records that I'm never going to use again. When you've said they deleted, like they're gone. So that's where I just need to understand or unpack a bit more of like how open mirroring works with the delete side of things. I feel like there should be a bit more features around, if we're doing open mirroring, there should be two queries. There should be one to go get the data and find what records have changed and then here's the master key list of
21:01 things and it should just kind of figure it out on the back end and join those two pieces of information together and then figure it all out. So like right that's what I think should be happening but I don't see that in the tool or it's not clearly called out in the tool in a way that I understand it. Does that make sense? No. And that's what that's what I'm saying. Not saying don't use it at all but you you have to if you are unsure of this you have to test this out before rolling out like a lot of things in preview. Let's jump in, Mike, to the next one. Really, any article
21:32 that has tools in the title. I'm down. I love it. So I'm going to actually jump into the updates to the database development tools for SQL database. Okay. In fabric. So again, you you had me at tools. You had me at tools because anything like that is cool. So there's four major tools that they're really releasing for fabric databases here or talking about: source control integration, Microsoft SQL build projects, VS Code development updates, and some
22:04 upcoming features on the road map, including improvements to source control, deployment pipelines, an update from source capabilities, a REST API, and further integration with the fabric CI/CD Python module. So, that's a mouthful. Putting a lot of work. Yeah, this is a this is a mouthful. They're putting a lot of work into the fabric SQL databases here. How do you want to dive into this first? I'll just give my initial opinion on this one around SQL and SQL databases inside Microsoft fabric. So the one thing I'll just I'll
22:35 kind of note I believe the SQL community is really good. Let me just start there. I’ll start with the SQL community has a really strong following. People really know how to use it. But I think the SQL community because SQL has been around for so long. The product of SQL databases from Microsoft is very robust. There’s a lot of expectations that come with that team, those databases, like like a lot of things there. And so I got to be honest, I was not a fan of working with SQL databases before fabric SQL. I got to be really frankly honest with
23:05 you. Like that was that was not a thing for me. I don't love it. Not news. I was like, why why would we need this? I have a lakehouse. I have a a data warehouse. Like why do I care about the SQL databases? I do think there are some really good I think there's some really good tooling pieces that exist. I also think this is a really good fit solution for like smaller to medium-sized datasets. And one of my biggest complaints with SQL Server
23:38 was like I didn't want to go buy the SQL server and have it stay running and on all the time. That my my biggest gripe with a SQL database was look, I've already got all these semantic models. Why do I need to turn on a database and just let it sit there eating money when I only load from it like for an hour in the morning? That's it and I'm done. So I one of the things I liked about SQL Synapse or or Synapse in general was the idea that you had a a SQL engine. It could talk to the lakehouse. It had the ability to read parquet and delta tables, but it would only read or be on
24:10 long enough to just run the query and then turn off. This is what I think is happening now with this SQL inside fabric. Like this the SQL is always there. It's like an always-on machine, but I'm not paying for the SQL engine to run constantly. I can start a SQL database and only when I'm running the database does it actually charge me. This makes total sense to me. I I I really like this model of if I use it, I pay for it. So now I'm I'm much less resistant to go turn on a SQL database and just use that to read write
24:41 data from it. I think that makes a lot more sense. Now, does that become your staging area for a bunch of other things in fabric? I don’t know yet. Like it’s it’s very transactional in nature. So, if you, I’m I’m seriously looking at maybe I can add apps and build things like legit apps inside of Fabric now. And the back end of my transactional app now is part of Fabric. And I’m thinking, yep, I’m kind I’m kind of really rethinking like I knew this was like a thing. They made the announcement, but I’m seriously rethinking maybe fabric is
25:14 the place for my transactional data. Maybe Fabric is the place where I want to put everything. It's very seamless. I think you make two really good points here. The first: for people who are diving into SQL, or are well dived in, this has been around for a long time, so there's already a bare-minimum expectation on features, services, and what it should provide. It's not just storing rows of data; there's a lot more to it. They go through that in the article, and
25:45 I think they're really working to get there. The other point, Mike, on the app side: one of my latest holy-cow moments was when I pointed a previously existing Power App at a Fabric SQL database. It was so seamless, so easy, because it smells, looks, and tastes like a SQL database, and now it's in Fabric, too. For me, we'll call it a mini game changer, because we've talked about AI, but Power Apps is still
26:15 incredibly popular, and so are applications like it. Power Automate is still part of Microsoft's budget and what they're doing, and I love that. My only comment about the article here is: know the audience. There are a lot of people who are like, "I want to start using SQL," and everything in this article is for a database administrator. Yes, it's a DBA article. Nothing in this article is like, "I want to be a SQL player, too."
26:46 Dude, there are so many acronyms thrown around this thing: CI/CD, the MSSQL extension, SqlPackage, SQL projects. Anything Power BI? Yeah, come on. These are all things I don't know. But again, this is the reason you see all these complex terms. To your point, Tommy, I think you're spot on: this is DBA language. This is the language a DBA
27:17 would understand and know how to manage and use. So I'm very excited that this is coming, and also excited that you can push everything to Git, and all the items you'd need, from the SQL definition and schema to deploying things, can flow through deployment pipelines into dev, test, and prod workspaces. I think this is a good change. I also feel like I'm really far behind on the DBA side of SQL because I've kind of
27:48 stayed away from it. I've tried to own the lakehouse side of things, and this adds yet another big bundle of stuff that I feel like I need to go learn and get educated on. Mike, you've got another episode topic there: how much should one know about SQL if I'm playing in Fabric? Should I be a DBA, or can I get started right now in three clicks? Love it. So again, a lot of tooling available. Mike, one more here. It's not a super long article, but I think it's
28:21 going to be one of those underrated things. Yeah, I think this is a winner. I do think this is an underrated feature. There's not a lot to it, it's one of the smaller articles, but it's pretty big: Intelligent Data Cleanup, smart purging for smarter data warehouses. Smart AI. This sounds like all my smartphones back in the day. They're basically introducing garbage collection to Microsoft Fabric warehouses "with no knobs." I'm going to admit here, I don't know what the "no knobs" thing is. Maybe
28:52 it's slang for the young kids, or is it something that actually has to do with data warehouses? That one I've never heard before. I think when you're acting dumb, Tommy, they call you a knob. There are no dumb warehouses in Microsoft Fabric. I thought it was "noob." No, knob. Stop being a knob, Tommy. I don't know, maybe that's what it is. No knobs, no dumb people. I don't know what they're saying here. No dumb warehouses. We don't have dumb warehouses in Fabric. But really, what garbage collection gives us is storage optimization,
29:23 which takes expired data that's taking up storage space and intelligently identifies and cleans up those data files automatically. That loops back to our initial article about mirroring, but this is all in Fabric, not mirroring: no maintenance overhead, and it works with your data governance. I think there's a lot more they're going to do with this. The data retention period, usually tied to your compliance, is 30 days, and we can't configure that. So you're stuck with 30
29:53 days for now. Mike, what are your thoughts? Let me give you a quick story about a customer who was using lakehouses. Again, this article is warehouse stuff, warehouse information and warehouse data, so to be very clear, this is all about smart purging for warehouses. But I'm going to relate it to something else I know from experience. When you separate the compute from the storage accounts, there are a lot of little small files being written, and potentially, when you're interacting with
30:23 small files, you're making a lot of tiny files, like JSON files, and compacting them into tables or Delta Parquet tables. There's potentially a lot of movement and a high volume of files you're working with. Storage accounts don't really like lots of very small files; they like somewhat medium or larger files, so you can read and write one big chunk of data at a time. That being said, this garbage collection is very important. My customer had started a lakehouse, they'd been using it for five years, and they said, "Michael, our
30:54 storage costs are really high recently. I don't really know what's going on." I said, "Well, when did you last run your OPTIMIZE and VACUUM on the lakehouse?" And they were like, "How often should you do that?" I'm like, "You knob. Yeah, you should be doing that." So we took storage costs from almost $5,000 a month, $4,000 a month, down to less than $1,000, because no one had cleaned up the lakehouse in the last five years. Now, it's impressive that every transaction,
31:24 multiple edits per day, was creating all these new files, and there was a huge volume of data showing up there, but they didn't need all those old files to keep the record of whatever that table was. So I really think Microsoft needs to get better at keeping things clean and auto-optimizing. Databricks is offering it as well for their lakehouses, so this warehouse feature should exist there too. This should also be a feature of the
31:55 lakehouse, and of Databricks and Spark engines. Materialized views, right? Materialized views should have some sort of auto-cleanup experience inside them. It should be doing these things for us automatically, so that I'm only storing the data I need. And to your point, Tommy, we just need a couple of knobs to adjust, to say, look, I want to keep the last 7 days or 30 days. There should be a couple of small knobs that help us adjust this without having to know all the intricacies, and then we're not noobs anymore. Oh, so they're
32:27 metaphorical knobs. That actually makes a lot more sense. I think that's what it is, actually. Okay, didn't think about that. That's hilarious. I'm going to start using that now: don't be a knob. I don't think you want to use that, Tommy. Don't be a knob. Mike, I think it's time. Let's go. We dive into something I cannot wait to get into with you. So frame us out here. Frame the main topic. What are we talking about? What is this Copilot thing? Maybe we should even do a quick little introduction around, let's
32:58 just talk about what Copilot Standalone even is. This is kind of a new thing, Tommy, a newer feature. So let's unpack that first, and then figure out where we take the topic. Yeah. And for the sake of argument, something we'll maybe introduce more is this idea of role playing. But let's start with the idea of the next era of Copilot in Fabric. If you've ever used Copilot in any other Microsoft product, you already
33:28 know how this works. Microsoft 365 is now dedicated to Copilot; all Office products have a dedicated Copilot. We've dealt with a Copilot that, for consumers, was part of Power BI, but it didn't really flow well, because I could only have it for a report, and only if I had the capacity could I use Copilot to talk to a report or build one. A lot of Copilot features, though, have been more focused
34:01 around the builders: it's built into Power Query, it's available in notebooks, it's available in the data warehouse. But for consumers, there hasn't been a lot of focus yet on chatting with your data, talking with your data, except in an individual report. Microsoft has now introduced, at Microsoft Build 2025, the new standalone Copilot: a full-screen experience, accessible as a new button or action on the left-hand pane, globally available in
34:32 Power BI. The main difference is that this standalone Copilot lets you stop worrying about which report you're in. If you want to talk about your marketing data and then go into your sales data, whatever is important, you simply have the same experience in the same location. And here's the thing that may sound minimal or insignificant, but my gosh, technology changes process, and how a feature looks and where you go can change whether people use it or not. So the things available in this
35:05 Copilot experience include simple asks, kind of like the Q&A feature, but obviously now we're using Copilot; that's really, I think, the best analogy for understanding it. And it understands metadata. It understands a report name, so you can specify a report, like, "Hey, what's my quota from the Sales 2025 report?" But it also understands descriptions, content, visual titles and text boxes, the workspace name, and other properties. So
35:38 it's not just scanning the report and pulling in general information. Whether you understand some of those pieces or not, it's going to attempt to intelligently pull back what you need. Also, it weights other signals more heavily, such as what you recently viewed, what's endorsed, or your favorites. So that customization is completely different from the Q&A feature. If you're thinking this is Q&A Plus, it's not. There are a few summary features here as
36:09 well. There's a whole article here, and you can still ask it to make changes. So it really is the Copilot experience in Power BI. Mike, what do you think? What's your take? Yeah, I haven't been able to really play with it in detail, because it's a preview feature. They call this out in the early part of the article: this is a feature that has just been announced at Build, and it's coming in the next couple of weeks. It's not there yet. I don't see it in my tenant. It may be in your
36:39 tenant; it may show up eventually. But a couple of things I think about this feature. One thing that needed to happen: if Microsoft is going to really double down on Copilot and the experience around it, they need to get it into the hands of more people and make it extremely accessible. You'll already notice Copilots everywhere, all over your computer, right? If you have a Windows computer, there's now a Copilot icon on your menu bar for Microsoft 365. There's now Copilot in your Outlook.
37:10 There's now Copilot in your Excel. There's a Copilot keyboard button; you literally have a button for it now. And go to Paint. Open up Paint, Tommy, for a second. Look at that: there's a Copilot inside Paint as well. Just in case you didn't understand how to make a line or a square or delete something, you now have Copilot inside Paint to help you out with everything there too. I could use that a lot for Paint. Yes. So I think Microsoft is in this moment,
37:44 and again, to their credit, as I interact with apps now, AI is becoming so pervasive so quickly that every company is trying to produce some sort of AI-level experience. So much so that we built an AI feature into our theme generator. In the theme generator, when you're uploading images to your wireframe and then adding visuals on top of that page, we've actually added our own AI on top to help
38:15 put the visuals where they should go in your images, to make it easier for you to place images or visuals on a page. That's a helper mechanism, right? We're trying to build things that make it easier for you to work with the tools we're building. Anywhere I see text entry, any AI tool, whether it's Grok or Twitter or Adobe Express, any product where I'm entering text, I'm now expecting that tool, especially creative tools, to automatically have some sort of Copilot
38:47 to help me create the wording, rewrite it, and all those other things. So this is becoming standard. All this to say, I think this is in line with Microsoft's vision. This also aligns with their ability to take Copilot and let you use it down to an F2 capacity level; I think that's been a big movement here. You couldn't really do Copilot as a native experience unless you had a capacity that users could use to talk to Copilot, right? And let's set it up then, because here's the situation,
39:18 Mike. Microsoft has tried a lot of intelligent, application-like features for Power BI that, let's admit, have not taken the world by storm. All my clients, every company I've worked with, we have tried all the AI-like features in the past: automated insights, key influencers, Q&A, Q&A in dashboards, Q&A as a button. What else was out there? Decomposition
39:48 tree, quick insights. None of those were something that people depended on or really used at all, or wanted to use. There were a lot of methods we tried, but honestly, and I hate saying this because I love all the Microsoft products, it was unreliable. It did not, each time or even the majority of the time, provide something people found they could use or act on. Now we have something here that is much more promising. I'm totally stoked for it. But let's see. The situation in
40:20 this role play we're going to do: you are tasked, in the next 12 months, with making sure your organization adopts the standalone Copilot for Power BI. You have some budget, you've got some people, but say we're still at the point where the data culture is: yeah, we've done AI things with Power BI, and no one's been impressed with the previous tooling. So the role play here is, what are we going to do?
40:52 What's the process you would implement? Obviously we're talking here for the next 20 or 30 minutes, so we're not going to go through everything, and every company is different, midsize or small, with its own data culture to assess, so we'll speak a little generally. But where do you start? What are the milestones you would implement? Let's really dive in: if we were to work together here, what would that project plan look like, so that in 12 months' time we have a pretty good working system of Copilot Standalone for
41:25 Power BI and our consumers. Yeah, I like your question, Tommy, but I think I would rephrase it slightly. So, nothing new, but it's good. We usually beat up the question and the definition. This is how it works, man. This is good; it's refined. With the question you're asking, let me just state some things I think you're assuming, and maybe that'll clarify what we're talking about. Let me react a little bit, right? My reaction to
41:55 your question is that you're deciding there's a mandate that says you must use Copilot, and we have to have our whole team using Copilot to be effective as a team. Yep. I don't necessarily agree with that. I think with Copilots, or AI things in general, that is a solution looking for a problem. I want to think about it as a problem looking for a solution. Right? So I would phrase it more as: okay, as I
42:28 look at this, what part of my organization is struggling the most? What in our data culture is the weakest? After I take a survey of my company, my team, my people, what they know and what they don't know, then I step back and say, okay, what capabilities can Copilot offer me, and how can I use Copilot to simplify things or make them easier? I'm going to identify the skills of the team, essentially, and then look at, okay, for example, when you're talking
43:01 with this article from Amanda, there's the very nice, pretty blog article, right? Hey look, there are key insights, you can do all these things with it. But go read the documentation on the chat-with-your-data experience, actually go into the documentation about what this feature is going to mean and what it can do. There's actually what we're going to call capabilities, right? One capability is just finding stuff: search. As much as people hate using large language models for search, they're like, "Well, we can
43:30 index everything. We don't need it for search." For whatever reason, the first thing most people's minds go to when you talk about AI and Copilot is searching for things, summarizing context that's somewhere else, right? So I would go through and say, look, what are my team's problems? Do we have so much content that it's difficult to find the good content? Do we have so much content that it's hard for people to understand what's going on? Then, as you walk through the other capabilities, another one is summarize a report or a topic. Okay, interesting. That might be a bit more interesting,
44:00 something I want to look at. Ask questions about your data: now, we've done this before with the Q&A models we've had, and have I found a lot of value from that? Probably not. That might be a capability I'll explore, but it may not actually be useful for my team. Using a Fabric data agent: okay, data agents are interesting. This is a new world. I can actually change what the AI focuses on. I can pre-train the AI a little bit. I can give it some instructions on how to
44:32 respond, which could potentially improve the answers. So data agents might be a good opportunity to use a trained model with my language and what I want; that might actually make it a better experience. You can even pick what kind of data it focuses on, or you can prep it. Then there's also this space of prepping data for AI, so it moves away from the capabilities themselves: you can use AI to prep the data for the AI. This is very meta at this
45:05 point. But if you don't have models that are prepared well, with all the instructions and the relationships and the descriptions, the AI has a much harder time reading the semantic model and getting out what it needs to give you the answers. So, all that to say: I'm going to really step back and ask, where are the main pain points of my team? Where are the big skill gaps? What are the things we think we just need to get better at? And then I'm going to look at Copilot in that light and say, can I build specific data agents to address
45:37 those needs? Can I build specific things in Copilot? Can Copilot solve those bigger pieces? Only when I find matches on that do I start. And if I'm looking at this, I'm starting with a small team of people I think are going to be capable. We'll teach them how to prompt. We'll work on it. We'll do little mini-projects together. As we learn things, we'll refine our process, and as the process gets better, we roll it out more broadly to the organization, right? Use Copilot where it's most effective, document that, refine that, and then roll it out to the rest of the
46:09 team. Like, here's where we're finding value with this Copilot thing. I feel like that's been the missing gap in a lot of these other Copilot experiences. They're just like, "Look, cool. It's AI. It's a thing." Yeah, but it's not really solving a problem; it's just something interesting you can do. And for those of you listening at home, rewind two minutes for what Mike said, because you could plug any technology into that approach. This is really important, especially
46:40 with anything that's hyped in the world we live in now. It's AI. People want it all, and they want it all done, but they don't know what it's going to do or what it's really going to solve, right? They think: go get Copilot. And I think that's just the mindset we have from the consulting world, or from dealing with data and technology. What's your old saying? Our job is to help people make money or save money, and everything we're
47:11 doing should be doing one of those two things. If we're not doing one of those two things with our reports, then look at the report: what does this report do? Does it help you make money, or does it help you save money? If it can't tie to one of those two things, does that report even need to exist? And to be very frank, if you are tasked with just making everyone use Copilot Standalone, you're probably setting yourself up for failure. The reason why is, honestly, you have to break it down. This is the first AI tool in Power BI that really
47:45 can do some things better than other tools in Power BI, and I don't think we've really had that before. And to your point, Mike, you start with this as a project rather than "we're just going to create an adoption program." What team? What does this do better? Okay, let's start with search, with getting that quick information. Who could that save the most time for, right? It has to start with that type of project, where you measure first, because again,
48:16 at the end of the day, organizations are not going to care that everyone's asking 100 questions a day in Copilot. That's not your success metric. It's a KPI, but it's not your success; it's not an objective. Your objective is: how is this saving people time or making them better at what they do? That has to be reflected at a larger level, or you're looking at this wrong. So I love how you broke that down. If I'm tasked with this and they come to me, I'm going to, in a sense, talk back, or respond back with: well, let's break
48:46 this down into who needs this the most, and how they need it. Easy wins are always sales teams or operations teams, because sales wants its quick numbers, and sales has a lot of reports: a lot of reps, agents, managers. So what do they do every day, right? Usually it's trying to find accounts. I feel like I've said this a thousand times on the podcast: if people don't want to open a report, this is just the best
49:16 platform to get them the answer. If I can open my computer and see who I need to call, see what I need to close, see where I am against my target, and an application just popped that up, that's all I need. I don't need to look at anything else, right? Just solve what I need: what do I need to do today, and what do I need to know? Okay, so let's break it down there. I think you start with teams and departments who are in critical need of
49:51 those types of questions, because when I look at Copilot Standalone, I think its edge is going to be those kinds of questions. There are advanced analytical things it can do, but I wouldn't focus there right off the bat, right? You can do data agents, but if it can retrieve the right information quicker than going to a report, and it can retrieve it from multiple sources, that's where I'm starting, and I'm starting with the people who need that the most. With that in mind,
50:23 breaking it down, how would you then approach it? As you said: let's find a problem and build the solution. Where would you see the problem? What does Copilot Standalone do, and who, or which services or processes, could benefit the most? Well, let's take a... Okay, I've got a lot of thoughts on this one. Let me try
50:54 to unpack your question here a little bit. When I was working in the retail industry, there was this need to do... I forget the name of it specifically, and someone in the chat is going to know this one; I'm not going to be as good as the chat at knowing things. I'm going to call it a time trial. It's not necessarily an application on your computer. In the retail space, it's like: okay, I have a job to do, a thing. I'm going to go retrieve this product from
51:26 the shelf. I'm going to go service this customer. I'm going to maintain this thing. Whatever the thing is, you basically... I want to call it a time trial, but that's the wrong language for it; I don't remember the actual term. It's the idea that you take a measured amount of time and say, go do this task, and you time how long that task takes, right? So you look at the work that you do throughout the day, and for each major bucket of tasking, you identify which periods of time are
51:57 taking the most. If you're trying to make someone's job more efficient, you look at their whole day across multiple days of the week and say, let's focus on where you're spending most of your time. The part of your day with the most effort or time spent is usually the part I can improve. Yeah. Right. If I can do something in a minute and it's not a lot of effort, well, dropping that from a minute down to 50 seconds is minimal value-add to my workday. I'm only
52:28 saving 10 seconds here or there. Now, in some high-volume manufacturing jobs, you do a lot of these time trials. It came to me: I think it's called a time trial. But the idea is you measure these things over time, and what you want to address is the large task that takes a long time, right? The major thing that really eats up someone's effort. So to answer your question directly, I think about time trials as a way of
52:59 looking at what people are working on in your Power BI environment. What are they spending most of their time on? Are they spending most of their time in the format panel trying to get four visuals formatted? Are we building a whole bunch of bookmarks in reports that are a bear to manage, and we don't know how they work, and they break things? Are we always fixing things in test because we're trying to get things out to production and it's always breaking something? Are we always fixing a data pipeline because it's unreliable? Right? If I look at my team's work, and this will change over time, what you're looking
53:29 for is what part of your users' and your team's time is being used the most, and then you solve for that. So throwing Copilot at "explain this visual to me" might not save me a lot of time, right? Throwing Copilot at "I'm trying to understand the relationship between salespeople and customers, here are some of my assumptions about what's happening in the data, I think these are true, can you show me graphs that would answer
54:02 this kind of question?" To me, that's where Copilot should be used. There should be this concept of letting the Copilot deep-think in front of you. Hey, do a deep think first. What could this mean? Let's understand the question. A lot of that processing, thinking out loud. I've been finding an immense amount of value in just watching AI process the question and think through answers. And I've seen some very creepy things, where the AI is like, "Well, I shouldn't answer it. I shouldn't tell them this, because then they'll think I'm
54:35 a real person. I don't want to let them know I'm a real person." It's just creepy stuff. Why not? Huh? Can you actually explain that? "I'm going to change my answer because I don't want the person on the other end of this to know that I'm thinking." It does weird things like that. Quippy or whatever, it doesn't make a difference. But when I look at this, I'm thinking, wow, this is very interesting, what it can do. It's that kind of stuff, the stuff that can dovetail with
55:05 your existing work. I want to identify the one-hour project, the two-hour thing that I'm doing: can I cut that down by half? Can I take something that normally took me a week and make it a couple of minutes or a couple of hours? Let me, by extension, give you an example. I had a website, and I was trying to add Google Analytics, and I didn't think it was working correctly; it wasn't firing off as fast as I thought it should. As I was working on the code, something in it was definitely having a
55:35 problem. I spent about four hours trying to add Google Analytics correctly to my static web app. It wasn't working very well. Maybe it works, maybe it doesn't; I don't really know. I said, "Look, I'm going to start over." So I bought Cursor. I said, I'm going to learn a little bit of this AI stuff, use a large language model, and just start playing with it. The first thing I did was make a web page. Then I added Google Analytics to that page. It was rendering as needed. It was fast, and it
56:06 seemed to work as expected. Then I started adding details and other options and other pages and other clicks. So in the same period of time where I'd struggled with one single small feature, I was able to rebuild the entire app using an AI agent. It basically took a week's worth of time and condensed it down to four hours to build a full website with all the details I needed. I was impressed. I was like, "This is amazing that it could do this."
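Mike's "time trial" idea, total up where the time actually goes and attack the biggest bucket first, can be sketched in a few lines. The task names and durations below are invented for illustration; in practice you'd collect them from real observation or work logs.

```python
from collections import defaultdict

# Hypothetical log of (task, minutes) samples collected over a week
samples = [
    ("formatting visuals", 40), ("fixing bookmarks", 90),
    ("formatting visuals", 35), ("data pipeline fixes", 120),
    ("fixing bookmarks", 60),   ("data pipeline fixes", 150),
]

def biggest_time_bucket(samples):
    """Total the minutes per task and return the task eating the most time."""
    totals = defaultdict(int)
    for task, minutes in samples:
        totals[task] += minutes
    return max(totals.items(), key=lambda kv: kv[1])

print(biggest_time_bucket(samples))  # ('data pipeline fixes', 270)
```

The point is the same one Mike makes about the website rebuild: shaving ten seconds off a one-minute task is noise, while the biggest bucket is where an AI assistant (or any tooling) can return real time savings.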
56:37 So it’s those kind of efforts, right? It’s where it’s where I can spend $ 20 or $50 per person and take work that was taking me a week to do I can get done in a couple days or a day. That’s the kind of stuff I’m looking for. And I would argue though that’s you’re that’s still thinking in the developer role, not the consumer role of what this co-pilot’s doing. And I perhaps I think we’ll explore that in a future episode. That’s my example of like how I’ve used it. But I I I think the same principle exists like the principle is still there,
57:07 right? Rubers are going to have a much more limited feature set available to them in copilot standalone compared to what you can do in cursor with all those models right because I don’t even care about I don’t even care about the features all I care about is what is the very large thing of time that you’re taking like that analogy still fits so I’m I’m looking at from a developer lens like this thing that I would develop typically was very complicated and I didn’t know how to study the code to write it all I’m taking this very complex thing and then giving it to an AI agent and it’s outputting to me maybe not as clean of a
57:38 code, but it’s outputting something that’s functional and gets me to the answer. So, to your point, Tommy, what is that time-consumption thing for the end consumer? What does that look like? That’s where I want to see these AI agents go, and until I really get my hands on it, I’m not sure I’m there yet. And the biggest thing, which is what I want to dive into another day (I can’t believe we’re already close to time), is that a lot of the article has a pretty elaborate section on prepping your data
58:09 because here’s the thing: we could talk about rollout, and I do want to do that in a future conversation, but I think we really need to talk about the prep here. I don’t care what model it’s using. It could use a Claude 8 that’s not even out yet. The biggest thing is, this is different because it’s using a semantic model, or multiple semantic models, to render whatever the answer is. So you could have ChatGPT 8, or a ChatGPT 25 that’s not going to be out for
58:41 another 10 years; it doesn’t matter. It’s not going to work if we don’t prep our data. That’s going to be my biggest concern here, regardless of the tool I’m using. If I’m connecting to my semantic model, I need to verify two things, and this is going to be my closing thought and, like I said, a lead-in to another conversation another day. Number one: I’m not rolling this out, or really introducing it, unless I have a vetted semantic model, because that’s what’s going to be used. Because here’s the
59:11 thing: I promise you, just like everything else that hasn’t worked, we have never had the ability to customize and weight a model for AI, which is probably why it’s never worked or caught on. So I don’t care if it’s Copilot or ChatGPT or Claude or Anthropic or Google or Grok or whatever: if the data is bad, or not weighted and ready for AI, this is not going to catch on. So that’s number one, and it’s why I’m going to be very selective about who gets access to this, even when it becomes a feature in Power BI.
59:44 Number two: my first focus on rolling this out is working with my team to get a vetted data set. Now, whether it needs to be certified or not, we’ll discuss another day, but the biggest thing is to use something fundamental, and it’s going to have to start with a small group. Because here’s the thing: all the things we’re talking about here are, to me, a non-starter unless I can, in a sense, prep my data comprehensively
60:15 for AI. It’s not something that’s going to catch on otherwise, because Copilot is not going to know what to do. We can’t expect a lot from the consumers; the weight is on us to build this, not on the consumers, more than ever, because I can’t expect them to know a visual property, and I can’t expect them to know the metadata. That’s not their job. And again, if I don’t have that ability
60:45 to do that, this is not something I’d roll out. If I haven’t done that prep, I’m not rolling it out. All that being said, to tie a nice little bow around this: my number-one focus with this tooling is deciding what we’re going to vet, what data we’re going to use that, to your point, gets someone to the answer or the solution they’re after in the quickest amount of time. Those are the models I’m focusing on, I’m spending a lot of time prepping them, and that’s where we start. Yeah, I agree. I
61:16 think the prepping of data is going to be key. Again, I don’t even know what to prep yet, right? How to prep it? What kind of language should I be using? Do the descriptions get a standard description for human-readable things, and do we also add some AI-oriented language in there as well? I don’t know. This is uncharted territory; it’s like a new science. You have to figure out what’s going to work well for your team, and I don’t know what’s going to work well yet; I have some suspicions. But also, if I look at this, I’m going to go back to: is
61:50 this lemon worth the squeeze, right? So I guess my final thought on this one is: I want AI and Copilot to tackle the things my team spends the most time on. I would argue, when I work with organizations, many of the long-running problems are that semantic models aren’t always in the right shape, we don’t really understand the requirements cleanly for the questions we want to answer, and some of the questions we ask of the AI don’t have data in the semantic models yet. So we need to
62:21 identify what that is. So I’m going to focus my attention, in the early stages, on understanding what the AI can do, and then I’d also like to emphasize a lot of these time-trial experiences: let’s make sure we really understand where our team is spending the most time, and on what tasks, and then, from that list of long-running items, which of those things can I use AI on to
62:53 help with that experience: maybe it’s for the end consumer, maybe it’s for the report developer, maybe it’s even for the data engineer. If you think about the more technical parts of this, the things that are more structured and more known: AI is really good with code, because code is very measurable. You write it right and it works; you write it wrong, it won’t. You write it well, it goes fast; you write it badly, it goes slow. I think that’s why AI is doing a good job there. I also think AI does a really good job in the creative
63:24 space, where it doesn’t really have clear boundaries or clear direction: “make me an image that has these things in it.” You can be a bit loose with the requirements, and those kinds of things seem to be a good fit for AI. Being very precise, making a visual or a chart that gives me exactly what I want, is maybe a little less useful to end consumers, right? So, at the end of the day, I think what copilots are going to do is make the complex simple. It’s
63:55 going to make these complex things easier to consume for a general consumer. So who’s to say we can’t talk to a copilot? Who’s to say we can’t click on a visual and have Copilot simplify it, remove things, or change the style of something just by talking to it? Those are the kinds of things I look at. I look at a lot of these YouTube channels that do really rich, awesome, highly stylized visuals: “Hey, I’m writing this custom DAX measure that highlights this one bar when you click on it.” Cool. I don’t want to learn how to do that. Why can’t the
64:27 Copilot learn how to do that? I should be able to talk to the individual visuals and say, “Hey, Copilot, I want you to highlight the selected bar,” and the Copilot says, “Oh, you need a measure. I’m going to set this up. Here’s the conditional formatting,” and it should just rip that out. That’s the kind of stuff where I don’t want to have to be a technical expert; I want the Copilot to figure it out. I want to talk to the AI in concepts, right? I’m a sales manager: which one of
64:59 my customers is most likely to churn? The AI should rip through the data and say, “Here’s a list of customers you should contact today, because these are the ones most likely to not be customers next week.” Okay, I now have my action plan; I then use my personal skills and go talk to all those people. That’s the kind of stuff I think we want AI to do: there’s a lot of data to sift through, and I need to be able to ask very pointed, actionable things, and the AI should be able to guide me or provide some value in one direction or
65:29 another. It’s that kind of stuff I wish I could get the AI to do more of. So, all this to say, Tommy, back to your question: how do we roll this out? I’m going to figure out what my team knows and what they don’t. I’m going to figure out what the AI can do and what it does well. And then I’m going to try to align my team’s long-running tasks against what the AI can do quickly, and see how well that blends. If it blends really well and I can find immense value from Copilot, then yes, I’ll light it up. But this is a value proposition: does the Copilot add
66:03 enough value for the cost I’m going to incur based on the CU usage? What does that balance look like? I’m going to read a little bit into the future here, and I would guarantee you that in a couple of months we’re going to have a lot of people on the internet saying, “This Copilot thing is interesting, but it’s using all my CUs and it’s not adding enough value.” So there’s going to be a pullback that requires people to understand a bit more around where does this Copilot
66:37 thing sit, and is it really valuable for my team? So that’s kind of where I’m thinking about it. Yeah, love it. Love it. All right. Well, dude, I think there’s a ton more here, but unfortunately we’ve got to end. Sounds like another episode coming up. That being said, thank you all so much for listening. We appreciate your ears on this podcast. I hope you found this interesting and the Copilot topic fun to talk about and discuss. There’s a lot of new developments coming, and we’re just trying to unpack where it makes the most sense in our organizations. We hope you’re finding this a valuable thought
67:07 exercise. Maybe you can take some of this away and use it in your organization. Also, let us know in the chat if you are using Copilot, and if you have any additional questions or topics as well. You can also go to powerbi.tips/empodcast and ask your questions there. Please share this with someone else if you liked this conversation. Tommy, where else can people find the podcast? You can find us on Apple, Spotify, wherever you get your podcasts. Make sure to subscribe and leave a rating; it helps us out a ton. Share with a friend, since we do this for free. If you have a question, idea, or topic that you want us to talk about in
67:39 a future episode, head over to powerbi.tips/empodcast and leave your name and a great question. And finally, join us live every Tuesday and Thursday, 7:30 a.m. Central, and join the conversation on all of PowerBI.tips social media channels. Thank you all, and we’ll see you next time.
68:18 Let’s go out.
Thank You
Want to catch us live? Join every Tuesday and Thursday at 7:30 AM Central on YouTube and LinkedIn.
Got a question? Head to powerbi.tips/empodcast and submit your topic ideas.
Listen on Spotify, Apple Podcasts, or wherever you get your podcasts.