Creature Comforts of Data Warehouse – Ep. 453
Fabric Data Warehouse keeps getting more comfortable for SQL professionals. Mike and Tommy walk through the July 2025 warehouse updates including the migration assistant, SQL endpoint improvements, snapshots, and usability refinements that make the experience feel more familiar.
News & Announcements
- Test and Validate Your Functions with Develop Mode in Fabric User Data Functions (Preview) — A new develop mode for testing User Data Functions before deploying to production.
- Useful Community Tools and Resources for Power BI and Fabric — Chris Webb’s curated list of community tools and resources.
Main Discussion: Warehouse Creature Comforts
What’s New in Fabric Warehouse — July 2025
Drawing from the July 2025 warehouse update:
Migration Assistant
A tool to help teams migrate from existing warehouse solutions to Fabric:
- Assessment of current warehouse workloads
- Compatibility analysis
- Guided migration steps
- Reduces the guesswork in migration planning
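Since datamart retirement is the main driver discussed in the episode, the first practical step is inventorying what needs to migrate. A minimal sketch, assuming the Fabric REST List Items endpoint and that datamarts surface with a `"Datamart"` type string (the workspace ID, token, and type value here are all placeholders, so treat this as illustrative rather than a verified call):

```python
import json
import urllib.request

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def items_url(workspace_id: str) -> str:
    """Build the List Items endpoint URL for a workspace."""
    return f"{FABRIC_API}/workspaces/{workspace_id}/items"

def filter_datamarts(items: list[dict]) -> list[dict]:
    """Keep only items whose type marks them as datamarts.

    'Datamart' as the type string is an assumption; check the List Items
    response in your own tenant for the exact value it reports.
    """
    return [i for i in items if i.get("type") == "Datamart"]

def list_migration_candidates(workspace_id: str, token: str) -> list[dict]:
    """Call the (assumed) List Items endpoint and return datamart items."""
    req = urllib.request.Request(
        items_url(workspace_id),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        payload = json.load(resp)
    return filter_datamarts(payload.get("value", []))
```

Once you have the candidate list, the migration assistant handles the assessment and guided steps; this only tells you how much work is queued up before the October deadline.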
SQL Endpoint Refresh Improvements
SQL endpoint refresh is how lakehouse data becomes queryable via SQL:
- Faster refresh times
- More reliable sync between lakehouse tables and the SQL endpoint
- Better handling of schema changes
- Critical for teams using both lakehouse and warehouse patterns
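When the background sync lags, there is also a preview REST call to force a metadata refresh rather than waiting. The route below is a sketch based on my reading of the preview API; the path, the `preview` query flag, and the response shape are assumptions to verify against the current docs before relying on them:

```python
import urllib.request

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def refresh_metadata_url(workspace_id: str, sql_endpoint_id: str) -> str:
    """Build the (assumed) SQL analytics endpoint metadata-refresh URL."""
    return (f"{FABRIC_API}/workspaces/{workspace_id}"
            f"/sqlEndpoints/{sql_endpoint_id}/refreshMetadata?preview=true")

def trigger_refresh(workspace_id: str, sql_endpoint_id: str, token: str) -> int:
    """POST a metadata refresh request and return the HTTP status code."""
    req = urllib.request.Request(
        refresh_metadata_url(workspace_id, sql_endpoint_id),
        method="POST",
        data=b"{}",
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=60) as resp:
        return resp.status
```

In a pipeline, calling this right after a lakehouse load, then polling until the new table is visible, removes the "why isn't my table there yet" class of support question.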
Snapshots
Snapshots enable point-in-time data recovery:
- Capture the state of your warehouse at a specific moment
- Useful for debugging, auditing, and recovery
- Similar to database snapshots in SQL Server
- Important for production governance
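On the T-SQL side, the closest documented relative is the warehouse time-travel clause, `OPTION (FOR TIMESTAMP AS OF ...)`, which queries a table as it existed at a UTC point in time; a snapshot effectively pins such a point for every reader. A small sketch that builds the clause (the table name is a placeholder, and identifier quoting is left to the caller):

```python
from datetime import datetime, timezone

def as_of_query(table: str, point_in_time: datetime) -> str:
    """Build a Fabric warehouse time-travel query for a UTC point in time.

    Fabric expects 'YYYY-MM-DDTHH:MM:SS'-style timestamps; fractional
    seconds are truncated here for simplicity.
    """
    ts = point_in_time.astimezone(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S")
    return f"SELECT * FROM {table} OPTION (FOR TIMESTAMP AS OF '{ts}');"
```

Useful for the debugging and auditing cases above: rerun the same query at the pre-incident timestamp and diff the results against today's.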
Usability Improvements
The “creature comforts” that make daily work smoother:
- Better query editor experience
- Improved object explorer
- Enhanced monitoring and diagnostics
- More familiar SQL Server–like behavior
Looking Forward
Each monthly update brings Fabric Data Warehouse closer to the experience SQL professionals expect. The migration assistant and snapshots are particularly significant—they lower the barrier to entry and increase confidence for production workloads.
Episode Transcript
Full verbatim transcript — click any timestamp to jump to that moment:
0:00 Good morning and welcome back to the Explicit Measures podcast with Tommy and
0:32 Mike. Good morning everyone. Good morning, Mike Carlo. How you doing? I am doing well today. We are just kicking things off. It’s been a bit of time. Last week we had some pre-recorded episodes, and now we are back live again. So yeah, welcome back to the show, everyone. We’ve got an action-packed episode today. But before we begin our news and everything else, let’s just quickly talk about what our main topic today is. Our main topic today, we’ll be talking about some creature comforts, recently
1:06 This is a tongue twister, recently released features of the data warehouse, and making sure we unpack some of these new features that are coming out. These are things such as the migration assistant. We talked about in a prior episode that datamarts are going out of support. It’s going to be done here pretty soon; October 1st is the deadline. So, it’s coming up pretty quick. So, there’s a migration assistant out there, a tool that helps you get from datamarts into the data warehouse. Talking about some SQL endpoint refresh improvements, usability
1:38 Improvements just generally with using the tool. And then let’s talk about snapshots. What are those things? Do we need them? Should we be using them? So, these are creature comforts that are coming to the data warehouse that we should just unpack and talk through. All right, that being said, Tommy, over to you for some news. Dude, we got some good ones. And yeah, anytime we take a week off, all the news comes in. And one that I’m intrigued to hear your thoughts on, Mike: introducing item history in the Microsoft Fabric Capacity Metrics
2:12 App. So, you have Fabric, and there is an app, or report and semantic model, that allows you to track your capacity. Same as what used to be in Premium, but obviously all things Fabric. Now, what they’ve added here, which again goes into my bucket of “why didn’t we have this before?”, is a history page that has slicers. So you can choose the capacity name, workspace, experience, things by the compute consumption. I just had a great conversation with a client about this, so this is something they just
2:45 Made up, like, oh, yes and no. But you can see CUs by workspace, CUs by item, and the item history: how many times something’s been scheduled, the number of operations by date, the percent breakdown by status, how it completed, and the details. So rather than only seeing that current time frame or overall, you can see this by date. Mike, what’s your take, what are your thoughts here? My initial reaction to this one is just, again, we knew all this data was happening. It was all occurring
3:18 Somewhere in Microsoft. We just couldn’t see it, which is disappointing on the tracking side. It probably wasn’t presented before because each system, each artifact, each item that you’re building inside the workspace probably didn’t always report the same way. I’ve heard there have been some challenges, like each team would just report whatever they needed to report. They would only report the bare minimum, like how to bill your team or how many CUs you’re using, right? That was like the only requirement, but everything else was subjective. Well, they’ve standardized that a lot more now, I believe. And because every item is now using
3:50 The same CU accounting, it’s giving you the same kind of consumption numbers no matter what. You can now have this more consistent look at all the items. I think CUs down to the item level is great. One thing I had to dig into here a bit more was they talk about CU per item and then they talk about smoothed CU over time. This is one of the challenges I think I’ve had, because a lot of the report only shows you the smoothed portion, the smoothing of the CU consumption, which I think is actually
4:22 Very difficult, because sometimes you have a large effort of something, let’s say it’s a copy job or something happens, and it spikes your capacity and uses a lot of CUs very quickly, but then you don’t really get to see that very easily, and all of a sudden you’re getting throttled and you don’t know why, or where that throttling comes from. So I think this is going to help you get into the weeds a bit more and say, “Okay, what are my items, and is that report, or is that data warehouse, or is that loading job worth the amount of CUs that we’re spending to run it?” And
4:56 Then to your point, Tommy, like this is another thing where we can pick on dataflows even more now. It’ll be: that dataflow is very inefficient and costly, let’s rebuild it into something else. And this is what should happen, right? A lot of this stuff, I’m feeling like, even notebooks to some degree, is: just build something so it works. Figure out the API calls, the table changes, the data column changes. You just need something that’s fluid enough to get the job done, and then you can always come back after the fact and actually optimize it, and then bring down the usage on things once it can
5:28 Be used a lot. So that’s my perspective on it. And I think this just goes back to whenever you introduce new metrics to users. And here’s the thing: not a lot of people even listening to the podcast are probably going to see this. This is for administration. This is for admins of your Fabric capacity, or owners. And yeah, I like that we can make reports off of the semantic model. It’s available just like any semantic model. But if you look at the back end, you understand that we’re introducing a lot of metrics, measurements that require cognitive load and require context.
6:03 That being said, I think history is a big part of that that’s going to help. So, happy to see that. Excellent. Let’s move on to the next one. What other news have we got going on here? Some of our favorites right now. So we have one on user data functions. Not user-defined functions, but user data functions. You and I had a good conversation about that. And what we have here is: test and validate your functions with something called develop mode, which is strange, because I thought whenever you create a user data function, that’s what you were
6:35 Doing: you were developing it. But there are two different modes beginning in August, where you can now choose develop mode or the run/view mode, very much a similar experience to notebooks, where you can edit it, run it, or view it. You can develop it or quit. Yeah, develop, run, view. Yeah, run/view. So these modes provide a space for users with different permissions to perform these tasks. This is cool because, again, the develop mode is the same concept.
7:07 There’s nothing new in the develop mode. It’s the normal mode. Mike, I don’t know. this is fine, but how many people do you think are going in with permissions to view a user data function that are not the same person who’s going to develop it? And I would also argue so I’m going to step one even further above that right Tommy. Yeah maybe so I like the idea of being able to stay in the browser like I’ve done some user data function testing myself and building some of my own and it was it was challenging just to get it out
7:38 Of the browser and into VS Code. There were some functions had to be there. It wasn’t like one single package I needed. I needed like multiple fabric extensions. I think they’ve simplified that now. It’s gotten better over time. So definitely it’s gotten more consistent, but I just had a lot of difficulty like writing and running code locally and getting it back up to the service. So yeah, it just wasn’t either the way I have my VS Code right now or these extensions I’ve added in there, it just was not as smooth as I would like. , but this makes I think a lot more sense where I can actually develop. I
8:10 Mean, it’s VS Code. VS Code is on the web; you could use it anywhere, basically. So I would rather have VS Code in the browser. Now, I will agree with you, Tommy. I don’t think a lot of people are exploring user data functions, either in our community or generally. I think it’s out there. It’s used by people when they need it. It definitely is a complementary service to other things as well. Is it my first go-to for people to use? Probably not. This is a very developer, very pro-code type experience. And so, you
8:44 Know, again, the Tommys of the world, right? You’re going to want to run it in VS Code anyway. So this is, I think, helpful for people that are newer to the space. Well, my opinion here is you need to have Copilot in this browser experience regardless; there needs to be a Copilot in there to help you write stuff. So I disagree with the first statement, because, Mike, anyone who wants to look at Python wants to edit it too. I have not met a person to date who says, “I don’t know anything about Python, but man, would I love to see that code.” I don’t know that that happens. Now, I want to explore
9:18 Something you said about the VS Code thing and having that issue. Sure. Because for me, Mike, I haven’t done any editing with user data functions on the web. Everything I’ve created and worked on, not just creating the user data function but editing it, has all been in an IDE: Cursor, VS Code. Question. I think we can solve your problem here. I think we can solve your issue. I can sell you the pen. Do you use profiles in VS Code? Yes.
9:49 Okay. Do you have a basic profile? The minimum profile? I don’t know. So, I’ve set up my IDE because I’ve had the same issue. I’m like, I just want to run one thing or edit a file. I don’t need all my 80 extensions to run and slow everything down. So, I default to a basic profile, not my default one, but a basic one that only has like 20 extensions, really minimal things there. That’s interesting. Yeah. And if I open VS Code with a new window, that’s the profile that starts.
10:21 It has Fabric, it has Power BI if I need to do that, and Python. But everything else that I’ve gone, “oh, that’s cool, that’s cool,” I keep in separate profiles. So when I need to do a quick edit, it’s pretty blazing fast. Wow, that’s interesting. So, I’ve not done a good job of switching between different profiles, and my default one is the one that I use. So, I know what you’re talking about. Yeah. I have set up one for like Microsoft MVP stuff and other things. But in general, I don’t switch between them. So, that’s probably a really good idea
10:53 Is just to simplify down to only the extensions you need to run the thing. When you make a new profile, does it keep all the extensions or basically start you from scratch? You can choose. So there’s a default profile that has, basically, if you had never created a profile, any extension you download as part of it. Create a new profile, and you can say start with nothing or use my default. So interesting. When I first started, I pruned, man. I pruned all the extensions. It was hard to do on my default one. But once I did that, now, if I need to do
11:25 Something intensive in Python, I have a profile for that. But if I’m doing Fabric or just want to go through that, my default one can handle that. And it makes a big difference for my user experience and my computer’s user experience. But I have not used Python a lot for user data functions. Even though I’m doing a lot of Python with notebooks, I’m not doing that on the web with user data functions. Yeah. And this is like my rub, right? The rub here is, I can talk to a Copilot on the side of my browser. I can get the answers
11:57 That I need. Hey, write me a function that does XYZ. In the browser, it does have a lot of generate-code things for you. That’s very nice. Like, hey, go generate this code that connects to a lakehouse, uses credentials, accesses a SQL database. It gives you a lot of standard, out-of-the-box code samples, which I think are actually great for starting and learning. I think that’s really helpful here. So, anyway, I’ll say I like the browser experience a little bit more. I like it being integrated. When I was done editing the code in my browser or my VS Code, I did it in the very
12:31 Early days, which was, again, very clunky, and it was just difficult to get that code up into the browser, into the web. So I like this feature. I think it’s going to be useful. I think keeping people inside the browser session and giving them the features they need to be able to test, run, and develop makes a lot of sense to me. I agree. I agree. So that’s a cool one. So, I think we’ve got one more news item, Mike. And this is, oh my gosh, it’s not on a Microsoft domain, which is now an outlier for us, but someone we love
13:04 To talk about. He’s probably number three on who we have to mention or who we talk about. It’s Chris Webb. All right, what’s Chris Webb talking about now? Useful community tools and resources for Power BI and Fabric. Anytime I see a title like that, it’s always a winner for me. If you say tools and Power BI, or tools and Fabric, I’m downloading something. That’s my next step. I’m doing something after that because I can’t get enough of this. And this is what he says: there’s a lot of really cool
13:36 Community-developed tools and resources out there. There’s the Fabric Toolbox; the Fabric Unified Admin Monitoring tool, which is a successor to Rui Romano’s monitoring tool. God bless PowerShell, which, dude, the amount of hours I spent learning PowerShell. Yes, it was fun. I have to admit it was actually very fun. MCP servers for Power BI, again, the Model Context Protocol servers; auditing; semantic model stuff; performance
14:08 Testing. And then there’s obviously the TMDL view and TMDL scripts; Rui Romano has also done a gallery of TMDL scripts; there’s a script here to create a date dimension from a Power Query script; PBI Inspector V2, a version of the VS Code extension to run rules; and Fabric Studio. So, oh my gosh, Mike, this is like having a nice rib dinner for me. I love this. I love this. This is a good thing. And a
14:39 Lot of these tools are free out there. I don’t know why Chris Webb didn’t put our theme generator out there, which is also free, and another really good tool that’s out there as well. So, I’ll do a “yes, and” on Chris Webb’s blog: themes.powerbi.tips is also out there. You can build wireframes, reports, and now, with Power Designer also built into the service at powerbi.com, this is a workload you can use. You can create styles of reports as well. So, great news here, Tommy. This is fun. I really like this one. The only thing that concerns me slightly about this one, so let me give you a little bit of hesitation on my side here: I love these tools.
15:12 These tools are great. Most of the people in the community are using these tools, and a lot of them are adding a huge amount of value over time. At some point, though, I feel like I also get a little bit overwhelmed: there are so many of them. There needs to be a single catalog, a library, of here’s the list of all the tools that we’re finding. Maybe, Tommy, you and I should take this list, go build a GitHub markdown page, and then maintain it with a wiki. Yeah, like an awesome-list doc: here’s a
15:44 Here’s a list of all the tools that are Fabric and Power BI related. Here’s what they do, here’s what they work with. And so I think there’s a need for this one, because this is nice, it’s an article, but he’s just rattling off a bunch of stuff. I need a list, and someone needs to maintain the list. That way, people can just embed it and then we can see this stuff. And my other challenge here too, Tommy, is, okay, well, just because you announced the tool, like FUAM, the Fabric Unified Admin Monitoring tool. Okay, great. It’s a tool. I get that it’s kind
16:18 Of around monitoring, but you have to go read the docs and how to install it and all these other things. So it does take some lessons to learn, and I would say most of these tools are for the pro developer trying to become more efficient on top of what you’re already doing. So that’s how I see the landscape of this tooling article. It’s welcome-to-pro-tools stuff. Here’s a bunch of free things that you can work with right now out in the market. It’s awesome. There’s just so many of them. And that’s my only hesitation. It’s good.
16:50 It’s good that we have this problem. I really like that Microsoft is opening up their world to PBIR and formatting and all these extra things that make it easier for us to build tools on top of them. I’m just a bit hesitant, I guess, because it’s like, is there too much? No, I think, to your point, I don’t think there’s too much, and maybe I’d never say there are too many tools out there. I think the biggest thing, to your point, is the organization of it, right? I would love to go to a single place to see what’s out there. But Mike, this is kind
17:22 Of the life of the developer. You mentioned that the people who are going to find the best value out of this are developers. I will counter you, or my swipe back to you is: I think anyone who’s getting paid to do Microsoft Fabric is that type of person. If your primary income is coming from Microsoft Fabric, not that you have to know these tools, but you will find them useful. I think that’s the game we’re playing now. Yeah. Yes, I would agree with that one.
17:53 There is, but I also think the scope of what you need to do inside Power BI, or Fabric for that matter, has increased. And with increased scope comes the need for additional tooling. What people are finding is there are a lot of areas inside Microsoft, or even these tools, that are more efficient or better ways of working with things. Like, one tool that I think is incredibly powerful here is the Fabric Toolbox. Is it the Fabric Toolbox? Is it Gerhard Brueckl? He’s got this amazing toolbox.
18:27 It’s a VS Code extension. This is incredible. It makes API calls. There are all these administrative-level things that you’d want to be doing, and it just makes it really easy to manage things. But I think a lot of this is where people have actual tooling experience, like: I am a VS Code user, I want to use that, and this tool supplements VS Code really, really well. And I think that’s the idea here: sometimes these tools are being built in other solutions that are helping you manage the Fabric world. So let me ask you this: this awesome-fabric-docs, whatever, this markdown page, let me
19:00 Ask you this: does it get organized based on persona, in terms of warehouse, scientist, engineering, or is it getting categorized based on, you see where I’m going with this, yeah, based on your skill level? I think we’re going to solve a problem here for a lot of things, because I know which way I’m leaning. I think most of these tools are designed for a persona. So when I think about different personas, for almost everyone, I’m looking through these tools, I’m looking
19:31 At these tools and I’m thinking, I can almost for a fact tell you which persona each one of these tools should belong to. So I would probably err on the side of: if you’re this persona, here are some tools that you may care about, that are interesting to you, right? There’s also, to your point, Tommy, inside that area, right, you could have a report developer, or, as they call it, the analyst; the analyst individual could have a lot of technical DAX stuff that they need to be able to do to be that analyst. So even in that area, or I’m the
20:04 Data engineer, or I’m the data scientist, there are a lot of layers of skills. So skills I see being a topic: a particular persona with stacking skills there, right? So it’s like a deep-and-wide thing. Maybe when it comes to report building I’m a mile wide and an inch deep, but when I go to data engineering, oh yeah, I don’t have a lot of experience at a lot of different things, but I’m really good at these specific things, right? So that’s the skill stack
20:37 Of that data engineer; you might be specializing your skills into that space. Yeah, I really like that idea. It’s a 3D matrix, which, the matrix is 3D by itself, never mind, forget that. No, it’s a cube, where you have the skills matrix. We’ve got to do the skills cube. Yes, a cube matrix. This cube matrix, I like that. I like that. We’ll talk. I don’t know how you do that in Markdown, but we’ll figure that out. Well, let’s do one little jab here that I see coming out in this one. So, in the middle of the article, I think
21:09 This is Chris Webb poking a little bit of fun at Phil Seamark. In the middle of the article, he says, “My fellow CAT team member Phil Seamark (why doesn’t he blog anymore?) has been busy building a new Power BI load testing tool, video here.” So, thank you, Phil, for producing a load testing tool, but you must be extremely busy, because Chris Webb is now poking fun at you, saying, “Why are you not blogging more on dax.tips?” Which is funny. Hilarious. That’s really funny. Awesome. Yeah, I found that funny.
21:43 Back over to you. All right. So, we’ll see if we can speed this up. Mike, actually, do you want to do our parent corner, or do we want to dive in? It’s up to you. We’ve gone through a lot of the news. There are a lot of news items. I’m fine hitting our main topic if you’d like. Let’s do the main topic, because I don’t want to take too much time, because not only is it the main topic, but, Mike, I think I hear... Oh, good. Good timing. We have a guest again, Brad from the Microsoft product team. Welcome back,
22:16 Brad. We really appreciate you being here on the show. We are, again, shocker by our title and our introduction: today we’re talking about more data warehouse. What better person to have here than Brad to help us unpack what has been coming out for the data warehouse and what’s coming up? Let’s just unpack these recent features. Tommy, frame us out here on the topic. Welcome, Brad. So, Brad, Mike, what we’re doing here today is: if you’re trying to get used to the Fabric warehouse, you want to be comfortable in the Fabric warehouse. What do you need
22:48 To do, or what are the tools, resources, and features that you need to know about just to get started? It’s more than just creating a data warehouse. It’s also a lot of the features that Microsoft, and Brad especially, has been part of releasing to help make that experience a little more seamless, a little easier, maybe like going to the spa, a warehouse spa, this idea of simply having a better experience developing a warehouse. And I think that’s the lay of the land here. And Brad, all the work that you’ve been doing lately
23:22 And your team’s been doing lately, who have you had in mind? Like, did you start with, hey, we’re working on that person who maybe doesn’t have a lot of experience, or was it like, we’re going for that guy who can drive the Lamborghini, let’s give him all the tooling he needs? Yeah, it’s an interesting balance that we have to strike with that. And I’ll be honest, my background and everything is in building data warehouses, doing the T-SQL code, the hands-on. So, I like getting into
23:55 The internals of the system, what’s going on, and, well, when I create this index or I do these stats, how does that change the optimizer? I like all that level of stuff. And so, when we first built the warehouse, it was very much a “we want completely no knobs; we don’t want to expose any of that information.” And then we got a lot of pushback. We were very much trying to go after the Power BI folks, the citizen developer side, and we got an enormous amount of
24:27 Pushback from the pro-dev side, and everybody on my team of course, and all the people in the MVP community and so on that do SQL Server stuff. And so you’ll see that over the last two years we shifted a little bit, and we started to bring out some of these other pieces of functionality that were a little bit more code-first, stuff that maybe a SQL developer would like to do. But I think the balance here is, now that we’ve
24:59 Got some of those things, we can’t forget the Power BI developer, the report developer, the person that just wants to be successful. And I’m not saying that we’ve necessarily got the right balance of those things, but hopefully you see, through the things that we’ve released the last couple of months and some of the stuff we’ll talk about today, that even though we want to give those experiences to the code-first person, we hopefully aren’t leaving behind the folks who know T-SQL but don’t really want to get into it super duper deep. So, like
25:34 I said, hopefully we’re striking a little bit of balance. You’ll see a lot, like we enable the Copilot experience to help people, but then we also give the ability to do user-defined functions or snapshots or whatever that happens to be. But even for those pro-code people we still want to make it as seamless as possible, so you don’t have to go as deep internally on all that stuff: give you the capability, not necessarily surface everything that’s going on behind the scenes. Whether we’ve hit that mark, I’ll leave up to you.
26:06 I think you’re spot on with this one. And then also, Brad, I’ll echo here a little bit as well: you’re struggling in a very different world than the Power BI world, cuz the Power BI world was, we’re coming from Excel. Things are different, and it was a different enough experience where it was greenfield: what does report building look like? What does semantic modeling look like? How do we transform this Power Pivot type experience and move it into its own product or tool? So I
26:39 Think to some degree Power BI could be very business-facing, front-forward, without a lot of technical pieces. This is why we didn’t have a report format that was easy to use, why we didn’t have PBIR, and it wasn’t really Git-enabled, because the business was like, we don’t care about that, we just need something to work. So I think your team has a very challenging approach to this, because you’re coming from a world where all the bells and whistles and the knobs already exist. They’re all there. So when you step into this new world, everyone’s expecting, oh look, it’s T-SQL in Fabric, they’re
27:13 Thinking, oh, it’s just like my other T-SQL. Lots of SQL Server history. Yeah. Right. And if it all isn’t there day one, we’re going to complain. I think it’s very difficult to do that. And I think you took the right approach. I really think it was: you need to land something that’s usable for the front-end Power BI user, because we’re trying to bring them along, and then listen to the feedback that comes back. And I really think that you’ve done a good job of listening, and the features that have been coming out are the right
27:45 Features that people would complain about the most, from the pro-dev SQL experience space. So, I think you’ve struck, in my opinion, a really good balance between the two, and I see a lot of value being added in both spaces. It’s not like just one persona is getting all the value. It feels like it’s on both a little bit. Tommy, what are your thoughts? Yeah, Brad, I’m going to test you guys’ nerd level here. Oh gosh, we’ll see. We’ll see. Because I think it would be very easy to struggle with the
28:16 Case of the BlackBerrys. And here’s what I mean. If you recall the BlackBerry phone, when they were struggling in the market, they developed a tablet called the PlayBook, and it didn’t have email, because they were like, well, we need consumers and we need these pros, and it didn’t come out with the feature that BlackBerry is known for, which was email. And so I look at all the features coming out here and I look at what you guys are doing, and I don’t want to say it’s impossible. Well, it’s not impossible, but it’s a hard balance to strike, where
28:50 Almost, like in my head, do you create two different experiences? Mike and I joked around on a previous episode: we would love a toggle on the top right for developers, and all of a sudden everything has a different UI. I imagine that’s not going to happen. It just goes to command line. The whole thing. It’s for developers, and the whole UI just disappears. It’s just a CLI. You just type commands the entire time. That’s all it is. Straight bash. Just give me straight bash. That would be very interesting. Although, I know you’re somewhat joking
29:23 About this, at least, but I would hearken back to the days not too long ago when we had the little experience switcher in the bottom left-hand corner, and how much crap everybody gave us about that thing. So before you go asking for toggles, remember the experience switcher. Be careful what you ask for. Be careful what you ask for. That’s a good point. Well, I guess all I’m saying is, now it’s Power BI, it’s Fabric; that’s all the switcher we have today currently. I guess what Tommy is really asking for is a third switcher that’s just command line, and it just takes away all the UI, and it’s just a black screen
29:55 With a little bit of green text at the top there. And we’re jokingly saying that cuz there actually is a command line, a CLI, in the browser already today that you can go play with. So, it’s there. I haven’t actually played with it that much, cuz I like clicking some of the buttons at least. So, I’m doing that for now. Well, yeah. Go ahead. Go ahead. Well, I was gonna say, I think we’re trying to, without making a button for it specifically, I think you’re seeing us try to move that direction a little bit, in that, sure, there’s not a button that
30:27 says, "All right, turn this into command line." But as we release new features, you're seeing a lot more of them arrive not just as a button you have to click, but with a T-SQL command you can run instead. We always want to give you a UI, but over the last twelve months or so we've really started making sure that if there's a button, there's also a code path for it. Sometimes we're even going code-
30:59 first on that. Collation, for instance: when we launched it, there was absolutely no way to set it on a warehouse without making an API call; you couldn't even do it through T-SQL. And with things like snapshots, you can change the snapshot through the UX, but you can also roll it forward with T-SQL commands. So I think the goal in balancing these personas, to your point, Tommy, has got to be to keep the UX as clean and end-user friendly as
31:32 possible. Something we have to make sure we don't over-index on is building a UI for absolutely everything. Maybe that's your point: we don't have to have a UI for everything. Some things can be more for the developer side, put behind a piece of code and maybe a little less accessible, so we don't scare people with too many buttons. That was the problem for me with Azure Data Factory as a novice: there were
32:05 so many options to checkmark. I'll try it, see what happens, could be fun. And there was no explanation of each parameter: turn this on, turn this off, compressed index, and you're just left guessing. I think there's something here, Mike, with this idea of the skill cube. Not only do you have people who are cross-platform across data science, engineering, and warehousing, but there's also that level of interest and skill to account for, where I think,
32:40 from what I'm hearing, at least from the people I talk to, there are a lot of users who are new, and warehousing is where they're lacking the most experience. Again, that's just from the world I live in. Not that they don't want to do it or don't know what it is, but it's: listen, I have a Power BI background, I have a notebook background, Python, the lakehouse is where I'm at. So they're diving into warehouses for the first time. You almost want that choose-your-own-adventure and also that
33:12 learning-as-you-go: I'm doing something and I'm learning the code, learning the right process, as I develop, because there's a need here. As you've been developing new features and trying to make things more accessible for people who understand the need but don't know all the widgets to turn, what have been the features you've been focusing on? Yeah, I think there are a couple of things there. One, I don't work in marketing, of course,
33:46 obviously. Thankfully, because then we'd have all kinds of weird stuff. That's true. But I think a problem we've had, going back to the SQL Data Warehouse days, before Synapse even, when we first brought MPP into the cloud, was the term "data warehouse" in general. A lot of times people look at that and say, well, I don't have a data warehouse, I just have a bunch of reports I need to build. Coming from
34:20 the SQL Server world, we understand OLTP versus data warehousing and OLAP and all these things, but I think the term "data warehouse" sometimes scares people away. If you know a little bit of SQL, you should be perfectly fine in this environment. Don't let the name scare you off because you think you lack the skill set, you've never built a data warehouse, or you don't have big enough data. That's the other thing we hear all the time: do I even have enough data to need a data warehouse? Again, in the spirit of making things easier for people,
34:54 we want those decisions to go away. I think we've talked about this before, but we will build a single-node execution plan if you don't have a lot of data and don't need a bunch of extra compute nodes. Users don't have to worry about that stuff in order to be successful here. But to your question about what we're prioritizing to make people comfortable, I think there are three main things.
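To make those "decisions going away" concrete, here is a hedged sketch (the table and column names are made up for illustration): in a Synapse dedicated SQL pool, the DDL pushed distribution and index choices onto you up front, while in Fabric Warehouse the plain CREATE TABLE is enough and statistics are gathered automatically.

```sql
-- Synapse dedicated SQL pool: distribution and index were explicit,
-- up-front choices the developer had to make.
CREATE TABLE dbo.FactSales
(
    SaleID INT NOT NULL,
    Amount DECIMAL(10, 2) NOT NULL
)
WITH (DISTRIBUTION = HASH(SaleID), CLUSTERED COLUMNSTORE INDEX);

-- Fabric Warehouse: the same table with no distribution, index, or manual
-- statistics to pick; the engine handles those behind the scenes.
CREATE TABLE dbo.FactSales
(
    SaleID INT NOT NULL,
    Amount DECIMAL(10, 2) NOT NULL
);
```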
35:26 The first is making the things that hinder performance go away, like statistics. We build statistics automatically; users never even have to hear the word "statistics" in their day-to-day world. Coming from Power BI, they may not even know what statistics are if they've never built SQL Server tables. There's a whole set of things like that: stats, making sure you don't have to put distributions on tables or do indexing, all of that. And you're going to see more investment in that
35:58 area going forward. Number two, and you mentioned this a moment ago, is making sure the things you're used to doing in SQL are there: the T-SQL surface area. One of the biggest pieces of feedback we got early on was, hey, this piece of T-SQL is missing, I need it in there. That's going to continue to be a big focus, and it includes things like temp tables, which we've landed more recently. And the third piece is making sure people have the
36:32 capability to get from where they are today into this environment. That's where you see investment in areas like the migration assistant, which will continue to improve over time. If you have an environment on-prem, in Azure SQL DB, or in Synapse today, we want to make it as easy as possible to get from where you are to where you want to be inside of Fabric. This is probably one of my bigger friction points. Many organizations, I
37:04 think, struggle with migrating databases from on-prem into cloud spaces, and your point there, Brad, is underrated given how much friction is involved. Send that quote to my boss if you don't mind. It's an underrated statement. Every time I talk to companies who have SQL Server on-prem, they say, well, it's working, but we've reached the point where we need to refresh some hardware. We
37:36 haven't upgraded to the newest hardware, we know we need more memory, the machine is running out of space. They really want to get to the cloud, but it's on their to-do list. It feels like for a lot of companies it stays on the to-do list, and very few manage to move from on-prem to the cloud quickly, without a lot of extra friction, unless there's an intentional project around it. All that said, if you're that business and you're listening right now thinking, hey, we should get ourselves to the cloud, two
38:09 points on this. One, it's very worth your time to figure out how to do it. It's much easier to manage things at scale; if you start with medium-sized data, it just keeps growing, and you have a lot more leeway inside Microsoft or other cloud-hosted systems. So do it. Two, you need to be very intentional about planning your way off of on-prem. If you don't specifically set a deadline and work toward it, it stays stuck. The number of conversations I've had of, we really want to get to the cloud, it's on
38:41 our to-do list: do it, get it done. Put a time frame on it: when can you migrate this thing? Get it over to the cloud, sooner rather than later. I don't want to deal with gateways; I don't want to move data between systems. There are a whole bunch of bottlenecks, a lot of touch points that could fail, when you don't get to the cloud. So my opinion is, the sooner you get to the cloud, the fewer moving pieces. And, like them
39:14 or hate them, I'm going to take an Elon Musk route on this one: less is more. Always simplify. If you can take out systems, machines, hardware, if you can have fewer things, it becomes easier to manage and build. So I'm always of the opinion: simplify, simplify, simplify. What are we building? Can we simplify first and build net-new stuff after that? And the operative word you used, Mike, is "intentional," because outside of the semantic model, I see warehousing as the most
39:48 intentional choice. I can test things out with a lakehouse and quickly get data in; I can create a notebook without even having data, just run a Python script and have a little fun. But outside of the semantic model, to your point, Mike, a lot of people see the warehouse as the most intentional: we need a specific project around this before we start, and if we have that, we need the resources, and if we have the resources, we need to make sure we have all the technical
40:19 configuration. And Brad, it sounds like you're saying that's not just the challenge; it's also where the team has been most intentional with the tooling. When I look at the roadmap, I see a whole smorgasbord of features: pure T-SQL items like COPY INTO, OPENROWSET, and JSON Lines support, plus things like time travel and the Fabric editor, which has a nice user
40:51 experience. So you're trying to make things as intentional as possible for people. But let's talk about those creature comforts. For someone listening today, or for me, maybe I'm just being selfish here, who wants to ease into this and feel comfortable, what do I need to know? Yeah, so I think there's been a
41:26 little bit of a ramp-up to this period where we're able to start building some of these things. Look at the roadmap for the last six months and the upcoming six months: you'll see a lot of T-SQL, and yes, other capabilities that make things easier, like COPY INTO or OPENROWSET from OneLake. But it doesn't do you any good to have a migration assistant if the T-SQL I want to run isn't there. That's part of why now
42:00 is the time you're seeing the migration assistant come into play: we had to land things first that were, if we're being honest, somewhat baseline for running a data warehouse. I have to be able to truncate tables; we've got identity columns coming; I've got to have my temp tables. Because if I run my code through and it says only 40% of it converted, that's just going to make people upset, and we don't want that. So there's been a lot of groundwork over
42:31 the last two years to get where we are today. On the point of intentionality you were just discussing, I think you guys are 100% right. There are two places in the Fabric world, or the analytics world in general, that people go to for data. One is the data lake, and depending on which level you're at, data scientists want to go as raw as they can, but it's also the hub of everything else in your organization. The other is the data warehouse. Your
43:03 semantic models are probably being fed from one of those two places. So you have to be intentional about both. I agree it's easy to just put a lakehouse out there and throw some files in, but I'd argue you have to be just as intentional about what you do with your lakehouse as with your warehouse, to an extent, for that reason. I would agree. I think through this conversation, Brad, I'm becoming more pro data warehouse just by hearing you. That's what I love to hear. And again, I was hesitant because I came from the data science world; that's
43:36 where I came from. So I think you're spot on, and there's actually more opportunity here. Personally, an observation I had about Fabric, related to this final point, is that the billing model has also shifted. Before, in the Synapse world, or really any MPP architecture, I felt like I was paying for all the MPP machines to be up, and you were just describing a single-node plan that can run everything on one node. Again,
44:09 it's about the query planning: is it efficient, do we need a lot of machine or a little? The fact that you're doing a better job of only charging us for what we use matters, because there was a stigma around data warehouses: if I turn it on, it's going to cost me a lot of money. In Fabric, that's different, and I want people to take a refreshed look. Don't think that way anymore; go try it out, do some things, and see how the CU usage looks now, because it's way
44:43 different. Just turning it on doesn't mean it's sitting there billing you CUs; it's a pay-as-you-use mentality, which I think is a huge point. Sorry, I didn't mean to interrupt. Keep going. Well, I think the billing piece is interesting and very important, because the way I would describe Fabric compute is as a dedicated capacity with serverless compute. Obviously you pay for the capacity all the time, but the compute comes and goes
45:16 as you use it. So a lot of times you can start with an F2 or an F4, throw a little bit of data in, see what the CUs are, and figure out where to go from there. And it scales online, so you don't have the situation with Synapse where resizing meant roughly twenty minutes of downtime. From that perspective, it makes things a lot
45:48 easier for people to dip their toes in the water. But the other thing I'd say generally about the migration side is that you're going to see a lot more come into this space to get people into the environment easily. Right now you have to upload a DACPAC. I sometimes have to go check the docs on what's supported from where, but you're going to have the ability to say, hey, connect to this Synapse data warehouse, or
46:20 connect to this SQL Server, so you don't even have to export a DACPAC. You'll just give it the connection endpoint and the database name, and we'll pull the schema directly from there instead of export-and-import. You'll see the same thing on the data-loading side. When you're done loading your schema and you're happy with it, there's a button to create a copy job (or copy pipeline, or copy activity; too many names), a copy
46:54 job, to move the data over. You'll see that expand to other sources too. And who's to say we'll stop at SQL Server? Maybe we bring in other sources, if people come to me and yell, hey, support this particular source; then I can talk to the right people and get that feedback to them. So I think you're going to see that one-click, or few-click, assistant-type experience continue to get better and better over time. So, I want to give you some confidence, Brad.
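For reference, today's DACPAC-based path can be sketched roughly like this; the server, database, and file names are placeholders, and the exact SqlPackage flags are worth verifying against the current docs before relying on them:

```shell
# Extract a schema-only DACPAC from an existing Synapse dedicated SQL pool
# (placeholder names throughout; SqlPackage must be installed separately),
# then upload the resulting file to the Fabric migration assistant.
SqlPackage /Action:Extract \
  /SourceServerName:"myserver.sql.azuresynapse.net" \
  /SourceDatabaseName:"MyDedicatedPool" \
  /TargetFile:"warehouse.dacpac"
```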
47:25 You were spot on. I just pulled the documentation and put it in the chat window in case you want to see the prerequisites for the DACPAC path: have a workspace, have a Fabric capacity, and have a DACPAC (a data-tier application package) extracted from an Azure Synapse dedicated SQL pool. You can also create the DACPAC in Visual Studio and then upload it; that's the migration piece. I love this, and this is our point about friction:
47:59 companies are planning efforts around, should we migrate, we know we should, when is the right time? This is one of those opportunities to remove some of the barrier to entry and say, yes, we can just make the migration happen. Let's remove friction so it's not a long downtime to get something up. And the idea of the DACPAC is that it's like a backup of the SQL database definition, right? Correct. A BACPAC takes schema and data; a DACPAC is just
48:33 the schema, permissions, all that stuff. And the other thing, too: these are never short projects. I don't know if you've ever done a SQL Server upgrade or move; migrations don't just happen overnight. So another area you're going to see get better over time is the ability to say, hey, I already migrated this particular database from this SQL Server, but maybe I don't want to bring everything over. Maybe I only bring some of it, a schema at a time. You're going to start to see the ability to say, well, I already did
49:06 this migration, let me pick up the next piece of it. Or, I did this migration and fixed this one stored procedure, let me rerun it. Because today you have to create a new warehouse every single time, and that doesn't always fit the reality of how projects work. It will work sometimes, because you're going to pick the heaviest workloads first: let's take the two or three heavy ones and get them out of here, right? But there are little side projects too, and we can't assume one server serves only one function in the business; it's potentially serving many
49:39 different databases, which could be backing many different solutions or apps, supporting multiple projects. Sorry, Tommy. Yeah, I was just going to say real quick: there are two sides of that to handle. One is the single database we want to move over incrementally, but we also have to be able to say, I have one database now and I want to split it into multiple databases on the other side, so we can rearchitect a little at the same time. No, I was going to say, one of the things about the
50:12 warehouse that's important to know is that, unlike a lot of other features, this is not a light switch. You can just turn on a warehouse, the ability is there, but in terms of it actually becoming an efficient part of your organization and your process, that doesn't happen overnight. With that, I know there are a few Copilot features, and without you losing your job here: do you see Copilot becoming more of a creature comfort from a project point of view? Because right now
50:43 it's in the SQL analytics endpoint; it's available on the SQL side, but that's more of a per-file kind of thing. It's not in the context of the entire project. To me, if there was ever a case for an integrated Copilot for a single artifact in Fabric, it's the warehouse. Is that a conversation? Is that something we can talk to somebody about? Can we get a number? Do we talk to you? Well, let me ask you this. What do
51:16 you want to see from that? Because you're right, today it's a bit of: here I am inline with my T-SQL code, explain the code to me, fix my code, or generate this code. Are you thinking you want to tell it, hey, here's my database, go help me migrate it? What do you want? Well, let's go back to the steps we talked about: the people coming to a warehouse. If they want creature comforts, they're at the point of, I know what a warehouse is, but I'm not vertically aligned here; I don't have that vertical step. So,
51:48 to your point, where do you start? The majority of people listening to the podcast, I imagine (and bonus points to you if you do), have never heard of a DAC file before, much less a BACPAC file. The people already deep into warehousing are getting what they need. But a lot of users are saying, we need Fabric, and I guess a warehouse isn't as big a leap as we thought, so how do we do this? There's almost that: give me the step-by-steps on where to start. Here's the data. Do we
52:21 generate a plan? Not generate a plan for us, but to that effect: what do we need to turn on here? To your point, let's start with migrating data rather than just generating code for a single query. So, yeah, I'll start there and let you navigate those waters. No, I think, and correct me if I'm wrong, but do you remember inside of Azure we had,
52:54 what's the thing, the Azure recommendations, Azure Advisor? Loved those. Yeah. So what you're describing sounds like a bit of an Azure Advisor, but not just for things I've already built. Also, like you say: hey, I want to go build this, and we would respond, all right, you want one warehouse to do this, and maybe in this case you want to use data clustering or something like that; here are some advanced features for you to use. Or just have it go in and do an
53:27 analysis of your warehouse. Yeah. I don't know that we have anything like that, not that I've heard of, nothing like that in the works, but I think it's a really interesting idea: other ways we can help people get up and running more quickly. We've obviously got the framework for it; we know what the best practices are, and we've got all these different architectures. So I think it would be really interesting if we had an extension to give you that and then drop you into some of these experiences, like
54:00 the migration assistant, or here's the code to build a table, something like that. We can do some of that at a very basic level today: you tell it, here's my business use case, help me generate the schema for it, and it will help you a little. But what you're describing takes that five steps further and makes it more useful for folks getting started, which is very interesting. What I'll do is book you for March; we'll follow up and talk about how great the feature is. No, but I love
54:36 this idea. And honestly, a lot of the features are already available. The migration assistant itself was launched in Q1, correct? Yeah, February or March, I think; we announced it back then, and it came out right after FabCon. And for a lot of users and organizations, and Mike, tell me if I'm wrong here on the guiding principles... I know, you gave me a softball; I'm going to swing every time, Tommy.
55:08 After five years, I really should know this by now. No, but I think with a lot of the features here, the level of comfort for someone who wants T-SQL is getting better and better: the migration assistant, the Fabric web editor, and things people are used to from a traditional SQL background, like snapshots and time travel, where the question of, how do I actually make sure I have a backup of my data, is becoming more and more
55:40 answerable. So I love seeing where it's going. Brad, if I were to ask you, as of today: if you took one thing from all the features that have been released and dropped everything else, to say, this is what to focus on if you want to be the most comfortable in a warehouse and you're mid-level, what feature would you latch onto? Well, in the last
56:13 couple of minutes here, I do want to mention one other thing before I answer your actual question, because I want to get your thoughts on it. You might have seen that we're getting rid of the default semantic model. Oh, we raised the heavens on that one. Okay, I thought so. That was one of the biggest pieces of feedback we had for a while. And the reason it's going away is that the UI we wanted to build around it was limiting; it just didn't do the same things you can do by creating the semantic model separately, and there were some weird syncing issues
56:46 there. So, very happy that it's disappearing. Just give us the lakehouse and at least an auto-create button, a little wizard: okay, I have a lakehouse, here are the things I want in my semantic model or warehouse, and boop, it just works, it links together and we're good. That's all we think was really needed. I bought a cake to celebrate. Okay, it was a good day. But why do you ask? No, I'm just curious, because we
57:20 hear from a lot of folks who say, why did you put this default semantic model in here? We don't ever actually use it. Not that it's necessarily useless, but most folks don't use the default semantic model. I don't know if it's still in the docs, but I think at one point we actually said we don't recommend using the default semantic model; it's there in case you want it, but we always recommend creating your own semantic model separately from the item. That's why I ask. It's one of those things we've heard feedback on since the private preview
57:53 days of Fabric, back when it was still Trident, and one of those things we finally got around to: hey, we're going to go ahead and do this and make people happy. So I just want to make sure people are actually happy about it. It's also one of those changes most folks probably won't even notice we announced. One day you'll wake up, create a new warehouse or lakehouse, and go, oh, I forgot, there's no default semantic model; I've only got the one item in there now. But yeah, that's the only reason I ask. I feel like this is potentially
58:25 like a leadership-level thing. Amir talks a lot about the five-by-five, right? Five minutes to start, five minutes to wow; minimal time to do something impactful. That was the initial mantra of Power BI, and I think they just kept adding fives: five minutes for data to land in my lakehouse, five minutes until the semantic model shows up, five minutes until the report shows up. I think that was part of the idea. Five sub-items behind.
58:57 Exactly. So now we're up to fifteen minutes total, or whatever. I think a lot of this was Microsoft having the right heart and intention: let me auto-create some things for you so that as soon as data appears, it's immediately available and ready to be used. What we're finding, though, to your point, Brad, is that technical people want to get their hands on it a bit more. We're able to get some things done with a default semantic model, but to build even simple reports on top of it, there's more metadata that needs to be
59:28 added to that semantic model: relationships, measures, and so on. So yes, it helps, but it doesn't really get us where we're going. And while it was a novelty when you started out, it's no longer really needed, because everyone says, yeah, okay, we get it now, we understand how it works; we don't need this default thing, we're just going to build our own. Now, Mike, to your point, yes, there are people who turn off auto date/time and people who don't know what they're doing. It's very simple. Okay. Yes. So,
60:00 Exactly. One other thing I want to feature that we called out in the liner notes, and that I do want to pay homage to as well: currently in preview is the ability to create and manage data warehouse snapshots. I think this is actually a pretty underrated feature. It was announced in May of this year; it’s in preview right now, not generally available yet, but you can create a warehouse and immediately start creating snapshots. For those of you who don’t know, very quickly here, since we need to wrap up: Brad, maybe land us a home point on snapshots. Why are they important? You
60:33 know, what was the need here that the community was asking for, and what is this solving? Yeah, I’ll give you both sides really quick. From a developer perspective, when I would go build a data warehouse, you spend a lot of time figuring out how to get all these different pieces to line up. I’ve got my sales data that comes in every 15 minutes. I’ve got another data feed over here that comes in in real time. I’ve got another one that arrives in a batch every night that may or may not show up at midnight. And you have to get all that data loaded in order, because if you load your sales before your customers,
61:05 you end up with orphaned records and all those things. So from an ETL developer perspective, it takes a lot of that orchestration out of the mix. Or, I guess I should back up and say what snapshots are. A snapshot is basically a view of your warehouse at a particular point in time, frozen at that point. It’s read-only, and it carries over the security from your existing data warehouse. So what you can do is create a snapshot and have everybody point to that snapshot for their reports, their semantic models, whatever
61:38 that happens to be, as opposed to pointing at the data warehouse itself. That way, as an ETL developer, I can load the data in whenever I need to. I can trickle-feed that data in, but my users are seeing the copy of the data as it existed at that snapshot. They don’t even know I’m in the process of updating. Love it. And then I can just run a T-SQL script that says, “Update my snapshot to the current point in time,” and all of a sudden my users click a button and say, “Oh hey, my data is all up to date. I’ve now got all my sales and all my customers and all these things.” So I think for
62:10 the ETL developer, it simplifies the orchestration. From the end user perspective, it means you can be more confident in the data you’re looking at. You don’t have to worry that this is only a partial data set, or that maybe something went wrong in the ETL last night and you just don’t know about it. I don’t have to have that little status at the top of my report that says when the tables were last updated, or any of those things we used to do back in the day. So it gives my end users confidence that the data is in fact the gold layer, already curated, already cleaned, all that stuff. And like you
62:43 said, it’s a simple roll-it-forward, roll-it-backward. It’s basically time travel to a specific point in time inside your warehouse. So I think it’s got a lot of benefits for both sides. Tommy, did you want to say something before we wrap? No, honestly, this has been awesome, especially for someone new to this data warehouse thing. Like I said, you’ve been selling me, Brad. I’ll give you props; you’ve been selling me the pen pretty well here. But for a lot of people, again, there’s going to be this mental barrier
63:17 that they have around warehouses: “Well, I can’t do that. I’m not enterprise.” But I think we’re hopefully breaking that down. Look again. Yes, I would agree with that one wholeheartedly. Tommy, I would definitely agree with you there. Take another look at this; this is something where I think people would want to step back and say, “Let’s revisit this one.” And again, I’m really excited about snapshots. I think snapshotting is amazing. The amount of times we’re trying to snap some data, build something on the side, and then hot-swap it at the last minute so it’s all corrected, and
63:49 like, what happens if a data load fails? You just keep the old data a little bit longer before you refresh everything, so someone can bless it and say, “Yeah, we’re good.” I think snapshots are underrated, because this is a huge feature for reliability. And again, this is one of those feature sets, Brad, that speaks to what you were asking about, and what we talked about the whole time: pro devs showing up and saying, “This is something we need. We need the ability to control this.” And you can control it with the Fabric portal and also the REST API, so you can do programmatic things with it as well, which makes
64:21 sense, because now I can run a pipeline or a copy job or whatever, and when that’s done, we can roll the snapshots over at the end of the process, assuming everything succeeded and checks out. Well, the last thing I’ll say about that is that the rollback piece is super important too, because if you do this in a SQL Server environment, you go do all your ETL, and then if you have to roll back, what do you have to do? Well, how long does it take me to restore my database? How long does it take me to then apply all my transaction logs? It’s not a simple thing. You’re like, “Oh crap, the data is messed up. I realize some pipeline didn’t run
64:53 last night, but the snapshot moved forward.” Okay, well, let me go run a T-SQL command, and it’s already rolled back. I love it. That’s so smart. Anyway, awesome, Brad. Thank you so much for your time today. We really appreciate you diving deeper on all these creature comforts of the data warehouse. I am much more of a fan than I was before. I love the structured data sets. This is getting better. Oh boy, Tommy and I have to learn some new things now. So this is exciting; we’ll probably have more discussion about this coming up. Thank you so much for your time. Really appreciate you. Tommy, let’s do the wrap here. Where else can you find
65:24 the Explicit Measures podcast? Oh man, Mike, you can find us on Apple, Spotify, wherever you get your podcasts. Make sure to subscribe and leave a rating; it helps us out a ton. And share with a friend. This is free content on the web. And do you have a question, idea, or topic that you want us to talk about? Maybe something around warehouses, perchance. Head over to powerbi.tips/empodcast and leave your name and a great question. And finally, join us live every Tuesday and Thursday, 7:30 a.m. Central, and join the conversation on all of PowerBI.tips’
65:58 social media channels. Awesome. Well, thank you all so much. Appreciate you. Have a great day.
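The snapshot workflow Brad describes, rolling forward after a successful load and rolling back after a failed one, can be sketched in T-SQL. This is a sketch, not a definitive reference: the snapshot name and timestamp below are placeholders, and the `ALTER DATABASE ... SET TIMESTAMP AS OF` syntax reflects the preview documentation for Fabric warehouse snapshots, so verify it against current docs before relying on it.

```sql
-- Assumes a warehouse snapshot named [SalesWarehouse_Snapshot] already exists.
-- (Snapshot creation happens in the Fabric portal or via the REST API,
-- not in T-SQL; these statements only repoint an existing snapshot.)

-- Roll forward after a successful ETL run, so readers see the freshly
-- loaded data all at once:
ALTER DATABASE [SalesWarehouse_Snapshot]
SET TIMESTAMP AS OF CURRENT_TIMESTAMP;

-- Roll back if last night's pipeline failed: repoint the snapshot at a
-- known-good moment (a UTC timestamp) instead of restoring the warehouse.
-- Per the preview docs, the target time must fall within the retention
-- window (stated as up to 30 days in the past):
ALTER DATABASE [SalesWarehouse_Snapshot]
SET TIMESTAMP AS OF '2025-07-14T06:00:00';
```

In a pipeline, the roll-forward statement would run as the final activity after all loads succeed, which is the orchestration simplification discussed above: readers never see a half-loaded warehouse, and recovery is one statement instead of a restore plus transaction log replay.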
Thank You
Want to catch us live? Join every Tuesday and Thursday at 7:30 AM Central on YouTube and LinkedIn.
Got a question? Head to powerbi.tips/empodcast and submit your topic ideas.
Listen on Spotify, Apple Podcasts, or wherever you get your podcasts.
