Let's Talk SQL and New Features: The Future? – Ep. 451
SQL isn’t going anywhere. Mike and Tommy discuss the latest SQL analytics endpoint updates in Fabric, share a listener’s experience connecting Fabric SQL to Power BI Desktop, and debate SQL’s future role alongside DAX, KQL, and Python in the Fabric ecosystem.
Beat from the Street
A listener shares their experience connecting to Fabric SQL from Power BI Desktop and creating a semantic model on top of it—bridging the traditional SQL world with the Power BI semantic layer.
Main Discussion: SQL in Fabric
What’s New in the SQL Analytics Endpoint
Drawing from "What's New and Coming Soon in SQL Analytics Endpoint":
- Performance improvements for common query patterns
- Better T-SQL compatibility
- Enhanced metadata and schema management
- Improved integration with other Fabric workloads
SQL Analytics Endpoint: What It Is
For teams not yet familiar:
- The SQL analytics endpoint is the SQL-queryable interface to your lakehouse data
- It auto-generates tables and views from your lakehouse Delta tables
- Enables SQL-first teams to query lakehouse data without Spark
- Supports T-SQL syntax for familiarity
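Because the endpoint speaks T-SQL, the query side is just SQL. As a local illustration only, the sketch below uses Python's built-in sqlite3 as a stand-in for the endpoint (in Fabric you would point a T-SQL client at the endpoint's connection string instead), and the `fact_sales` table and its columns are made-up example names:

```python
import sqlite3

# Local stand-in for the SQL analytics endpoint. The table and column
# names (fact_sales, product_id, amount) are hypothetical examples.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE fact_sales (product_id INTEGER, amount REAL)")
cur.executemany(
    "INSERT INTO fact_sales VALUES (?, ?)",
    [(1, 100.0), (1, 50.0), (2, 75.0)],
)

# The kind of aggregate a SQL-first team would run against the endpoint,
# no Spark required.
cur.execute(
    "SELECT product_id, SUM(amount) AS total "
    "FROM fact_sales GROUP BY product_id ORDER BY product_id"
)
rows = cur.fetchall()
print(rows)  # [(1, 150.0), (2, 75.0)]
```

The point is that nothing about the query is lakehouse-specific: a SQL-first team writes the same GROUP BY they always have, and the endpoint maps it onto the auto-generated Delta-backed tables.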
SQL’s Role in the Modern Stack
Mike and Tommy debate where SQL fits:
- SQL for querying — Still the universal data language; everyone knows it
- DAX for measures — The semantic layer language; where business logic lives
- KQL for streaming — Purpose-built for time-series and log data
- Python for data science — Notebooks and ML workflows
- SQL isn’t competing — It’s complementing each of these
Connecting SQL to Semantic Models
The listener workflow:
- Data lands in lakehouse
- SQL analytics endpoint provides T-SQL access
- Power BI Desktop connects via DirectQuery to SQL endpoint
- Semantic model built on top with DAX measures
- Reports published to the service
This hybrid pattern works well for teams with strong SQL skills who want to leverage the semantic layer.
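For scripted access outside Power BI Desktop (which uses its own connector), clients commonly reach the endpoint over ODBC with a connection string copied from the endpoint's settings in Fabric. A minimal sketch, assuming the Microsoft ODBC Driver 18 for SQL Server; the server name below is a hypothetical placeholder, not a real endpoint:

```python
# Sketch: build an ODBC connection string for the SQL analytics endpoint.
# In practice, copy the real server FQDN from the endpoint's settings in
# Fabric and pass the resulting string to a client such as pyodbc.
def endpoint_connection_string(server: str, database: str) -> str:
    return (
        "Driver={ODBC Driver 18 for SQL Server};"
        f"Server={server};"
        f"Database={database};"
        "Authentication=ActiveDirectoryInteractive;"  # Entra ID sign-in
        "Encrypt=yes;"
    )

conn_str = endpoint_connection_string(
    "myworkspace.datawarehouse.fabric.microsoft.com",  # hypothetical
    "my_lakehouse",
)
print(conn_str)
```

This only shows the shape of the connection; authentication options and driver versions vary by environment.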
Looking Forward
SQL’s role in Fabric is solidifying—not as the only query language, but as the universal bridge between data engineering and business intelligence. The SQL analytics endpoint improvements make this bridge more reliable and performant with each update.
Episode Transcript
Full verbatim transcript — click any timestamp to jump to that moment:
0:16 Heat. Heat. Good morning and welcome back to the Explicit Measures podcast with Tommy and Mike. Good morning, everyone, and good morning, Tommy. How you doing? Yeah, good morning, Mike. How you doing? It feels... okay, so just a caveat for people who are listening: this is a
0:50 recorded episode. We're doing this one actually at night, but I'm so used to seeing this screen with Tommy and I on it that it literally doesn't even matter. I just say the words now. It's like, I know, it's a good morning. It's Tommy. We're here. We're talking. We're having fun. We're already talking before the episode. I know. These night episodes, when we record them, Tommy, I think they're spicy. Honestly, okay, the reason I think these are a bit more spicy is because we've had a whole day to think. We're up. We're getting a little slap-happy. It's in the
1:22 evening. Like, it's on, man. This is going to be a fun episode. Man, the life of a consultant. It's late for us. And you're right, we've had a day. 7:45 is late for us. We're ready to go to bed. Yeah, it would be great if it was 7:45, but... oh, listen, we're already just in that mode of, I'm exhausted, but at the same time, I'm still on. So, yeah, I cannot wait for this one. Awesome. Well, let's get into the main topic for today, and then we'll go
1:53 through some news, or beat-from-the-street things, some things we're observing around Fabric. Let's talk about SQL, the new features around SQL. So we have a Fabric SQL database. Let's talk a little bit about that. What does this look like for getting into Fabric? We have this SQL endpoint. There's actually a great article here by, I think it's Anie, Ansie, Anie Phillip, we'll go with that. She writes a wonderful blog around how we're going from preview into GA. What is that going
2:25 to look like? What are some coming-soon features, case-insensitive collation? We're going to talk about some others, VARCHAR(MAX), our favorite, right? So, oh man, this is so exciting. We're gonna get really into the details, but anyways, we're going to talk about this blog post here. What is Fabric and SQL? We'll unpack this a little bit as we go through the episode today. All right, Tommy, with that, any news items that you have, Tommy? Anything that's going on in your world that's interesting? No, I think I heard you got something, my friend. So, there were two things, actually, now that I'm
2:57 thinking about it. Okay, we were talking offline and I'm like, save it, save it. So let me do the fun one first, and then we'll do a little bit more of the real-world technology one. Okay, first off, I got to say congratulations. So, this is a record episode. Congratulations to the Brewers for winning 12 in a row. I had to rub this one in a little bit. I was seeing all kinds of memes today that the Brewers have completed their 12th win in a row, which is putting the Cubs even further behind in the series altogether.
3:31 Again, I know, I know, I know, not your team, but you're in the area. I gotta rib you a little, and it's crazy. So it's fun, actually, let's tie this in. So the Brewers are actually what would be considered a small-town team from a salary point of view. So they're a phenomenon, them and the Rays, because they use analytics like no one. Who, the Rays? The Tampa Rays. Tampa Rays. Okay. They're not even playing in their own stadium. They're playing in a spring training stadium because of the hurricanes. They have no money. And yet,
4:03 and yet, the Rays have been consistently one of the best teams in baseball, competitive-wise, despite having one of the lowest salaries. I think one of the stats is their whole team's payroll adds up to a single player's salary. I forgot which player. And you think about this: data is taking over, and there is something to be said. I think we're in the mindset that data can do everything. Baseball has a little of... I lean towards more of, it's the eye test and the data, like, see, because there
4:37 is this... listen, we're living in an age now where, I think we have said it a ton, more data is going to be created in the next two years than has ever existed. Yeah, it's exponential, growing so fast. And now all this trash AI junk that we're getting thrown at these days. Like, every other day... have you been watching your YouTube Shorts feed? Tommy, have you not been seeing, like, the prolifer... I think I was mixing the two words together. Proliferation and plethora together. The proliferation of all the
5:10 YouTube Shorts from the PowerBI.tips Explicit Measures podcast. Well, thank you, AI. Thank you for helping me make easier shorts faster. Let's go, baby. So, the reason you have a bajillion of these things in your YouTube feed now is because AI is making it easier for me to make content from us talking over here. So all that's doing is taking one hour-long-form piece of content and making like 13 little one-minute pieces of content as well. This is how it's going to go. We're going to have all of the content now. The fact that we are not syndicating in all 300 languages across the world yet
5:43 is on us now. We can do that right now. YouTube's already doing it. YouTube's already added the AI with our cadence. Like, we can literally start... oh, actually, Tommy, we're going to get cancelled. We're going to cancel us. You can't start using other languages on the podcast. We'll get cancelled so fast. AI now, where you can actually sound like us, in our temperament, in our cadence, except we're speaking a different language. All right, at some point, Tom, we should take the whole show and put it into Italian.
6:17 Go find an AI, Tommy. Take one of our episodes and put the whole thing through. I will sit down and popcorn-eat a whole episode with you, Tommy, and we can watch it when we do the whole episode in Italian. That'd be hilarious. We'll take that on the road, southern Italy. That would be perfect. All right. So, well, yes, congratulations to the Brewers. They are doing great for, again, a small-town team. Use the numbers to make it work. Now, Tommy, you are pretty close by. I also know... have you heard of George Webb? George Webb? Yes.
6:51 The restaurant, right? Yeah. Okay. Not the person. I don't know, there's probably a person named George Webb, but no, the restaurant George Webb. Apparently, the Brewers' 12th straight win means George Webb will do a free burger giveaway. So, there's free hamburgers at George Webb. And I guarantee you there's going to be some dingbat who's going to go to every single store, like literally make a trail and hit every single George Webb and get multiple burgers from this thing. Is that just a Wisconsin place? It's in Wisconsin. Free George Webb burgers are coming.
7:25 They already announced it, if you follow their Twitter page: we're buzzing with the excitement, we're working hard to prepare, please note burgers will not be ready right away, a special giveaway date will be announced with all the details, stay tuned. Well, we need the Cubs to go on a streak. Shoot. Well, I'm thinking pillows. That's what I'm saying. Tommy, you might need to make a quick trip up here. So, here's what I will commit to you: if I can get the details on the George Webb thing, Tommy, and we can get things situated here, I would meet you halfway at a
7:57 George Webb somewhere, and we should just have a burger, take a picture, and go home. Like, that's what we're going to... I'm going to do you one better. We're going to do a live show at a George Webb. I don't know. No, we're not doing a live show at a George Webb. What? Free burgers. This content will just have gone downhill way too fast. We're not allowed to do that. Okay. Okay. Anyways, okay, that's enough of the fun Brewers stuff. So, that was my fun thing. The other thing I want to talk to you a little bit about here, so this is a beat from the street. Okay, some experience that I had that surprised me, and I'm just going to ask you: have you done this yet,
8:29 Tommy? Tommy, have you made a Fabric SQL database? Yeah, I'm at the max capacity. You've already maxed it out. Okay, you've made all of them. They're like, stop making them. You should probably just make more schemas in your existing databases instead of making more. Try a warehouse. I got that popup. Okay, good. All right. So, you've maxed out your Fabric capacity. Have you tried to connect through Power BI Desktop to those SQL databases using the OneLake connector? So, not only that, but I've actually
9:02 done that. So, the OneLake connector actually... oh, I see what you're saying. I've used the SQL database connection. Okay, the great SQL, right? And I've done that. Power Apps works great. Yep. Not wrong. Not wrong at all. This is just a different way to connect to it. I'm asking a different question. So normally, Tommy, I'm thinking like you would. I would normally go into the SQL database. I'd go, okay, let's go find the SQL connection. It's a super long connection string. And then I would connect to it, and then I get all this stuff. Well, surprise, surprise. All these SQL
9:35 databases come with automatic mirroring to tables. Did this? Automatic mirroring. Okay. So, when you build a SQL database, a Fabric SQL database... yeah... you have one side of the database that's transactional. You can edit rows, all the things, normal SQL database things. But what happens is there's an automatic setup where this SQL database is automatically mirroring those tables down into a mirrored table structure that is all Delta Lake. So today... this is like, I'm today
10:10 years old when I found out you could do this. You can go into Power BI Desktop. You can click the OneLake catalog button, go down to the preview item that says Fabric, click connect, and then it asks you where you want to put the semantic model in the service. Oh, so you can create that semantic model on Desktop, but you're doing that with a database. I'm doing it with a SQL database. And so the interesting thing here, so two things are interesting to me. This is something I've never seen before in Desktop. This
10:42 is brand new. I was like, ooh, I like this experience. So the first thing was it was easy to connect to the SQL database. Second thing was the Delta table is behind that SQL database and it's mirroring. I learned this from Alex Powers today, which was awesome. He's a really smart dude and I love learning from him. But in addition to that, it made the semantic model, but it didn't make it in my local PBIX file. My PBIX file locally was a thin report. It created the whole semantic model in the service
11:15 from my desktop. Right. What? That was crazy. You're talking about the feature that... oh, we had her on the podcast once. My memory is failing me. Carly. Yes, Carly. We had Carly. Carly or... no, Emily. It was Emily, maybe. Yeah, Emily. She met us when we were doing the Ignite reaction. Yeah. And she announced a few months back about doing this with OneLake, with a lakehouse. Same thing. Semantic model. Yeah. And you don't...
11:47 There's no file to save, which really throws you for a loop. Well, I'm used to, again, I'm used to building in Desktop, and I'm used to having the entire semantic model there in Desktop, but editing locally. In reality, what it was doing, it was actually saying: set the name for the semantic model, pick the workspace that it will live in. And so it makes the connection to the tables. One, it was awesome. Very easy. Here's the SQL database, pick the tables you want, bing bing bing bing bing, pick all the tables, they just appear in the semantic model, very quick and easy. And the best part was it
12:21 wasn't connecting using the SQL endpoint; it was using the Direct Lake tables to get to the semantic model. So with a couple clicks... this was just brilliant, I thought. The experience was very nice, very seamless, super smooth. So hey, SQL team and semantic model team, nice job. You guys dialed this experience in. It felt super smooth to me. It made a ton of sense. Go here, connect to the SQL data warehouse, pick the tables I want, drop the semantic model, boom, I'm
12:54 off to the races. I'm making measures. Like, it was very seamless, I thought. Yes. And even though I was working in Desktop, all the stuff worked in the service. It was all service-based. The only thing I was building locally was a little bit of the semantic model, measures and relationships, but the whole semantic model was already landed and working in the service. Okay, I got a question for you, because this needs to be raised, and let me preface this with: I love the feature, the experience, and I've only done this with
13:26 the lakehouse, with the SQL analytics endpoint, not a database. Yeah. But I think I'm feeling this bone to pick with this, because it's the same exact experience with the lakehouse. What you're describing is the same exact thing. So with all of these things now, with the warehouse, with the SQL analytics endpoint in the lakehouse, and now a database, what's the difference between any of them? That's the point though, right? Is that the point? I think it is the point. I think the point is it doesn't matter whether
13:57 you're connecting to the lakehouse, the warehouse, or the SQL analytics endpoint. So I think the unifying thing for me is there's a Delta table somewhere, right? And so if you want the real-time data directly from your SQL database, you need to go DirectQuery; you can't go Direct Lake, because there's some bit of lag or lead time. But this makes sense.
14:29 Think of it this way. The closer I can put my operational database right next to my Delta table storage, if I can put those two things side by side so they're almost seamless, the transactional system can work as fast as it needs to, and, like, every 5 minutes, every hour, it's just synchronizing the records down to my lakehouse Delta tables, or the tables that support the SQL. Like, this makes a ton of sense to me. I'm on board. If you let me run my operational stuff and just have it sync easily to my semantic models? Done.
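The sync pattern described here, where transactional writes stay fast and a background process periodically copies changed rows to the analytic side, can be sketched as a watermark-based incremental sync. To be clear, this toy Python sketch is only the shape of the idea; it is not Fabric's actual mirroring implementation, and all the names in it are made up:

```python
# Toy illustration of watermark-style incremental sync. Fabric's mirroring
# does this for you automatically; this is just the concept.
transactional = []   # operational side: (row_id, value, version)
analytic = {}        # mirrored analytic copy: row_id -> value
last_synced_version = 0

def write(row_id, value):
    """Transactional write: fast, no analytic work on the hot path."""
    version = (transactional[-1][2] + 1) if transactional else 1
    transactional.append((row_id, value, version))

def sync():
    """Background sync: copy only rows newer than the watermark."""
    global last_synced_version
    for row_id, value, version in transactional:
        if version > last_synced_version:
            analytic[row_id] = value
            last_synced_version = max(last_synced_version, version)

write("a", 1)
write("b", 2)
sync()
write("a", 3)   # update lands on the transactional side first...
sync()          # ...and reaches the analytic copy on the next sync
print(analytic)  # {'a': 3, 'b': 2}
```

The gap between a write and the next sync is exactly the lag being discussed: between syncs, the analytic copy is slightly stale, which is why real-time reads go against the transactional side.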
15:01 So I agree with the feature. But what I'm saying is there's no difference. So then what's the best practice, Mike? Is it a SQL database or is it a lakehouse, if I can do the same exact thing? Oh yeah, it's definitely use Cosmos DB. Don't use any of that stuff. Oh yeah, that's what I meant. KQL. Sorry, KQL all the way. That's all you do. But you're right. It's insane. And now it's becoming less of a technology break. So for me, right now, it's more of: what are you comfortable with? And
15:34 so we had this conversation on Tuesday last week around which do you pick, lakehouse or warehouse? And I thought Brad's comments were spot on. Right. The technology is getting so good, it doesn't really matter which one you pick. There may be some nuanced things, like, hey, if you have semi-structured data you may want to start with a notebook, but ultimately you're trying to get to tables anyway, so you may want to start really considering using the warehouse and using that to give you the performance that you need. So as I think about this ecosystem, yeah, I like this. If
16:09 your team likes notebooks and is in the data science, data engineering realm, great, keep doing notebooks. You don't have to upskill your team to something brand new to keep going. Oh, by the way, Tommy, you guys love SQL databases? No problem. We have this whole warehouse and SQL database experience. Just keep doing that. And if those things are roughly price-comparable for what you're doing... I was going to say, right? So, as long as they're roughly the same price, even if I'm paying a bit more of a premium for one or the other of these things, if your team already knows that knowledge, which is more
16:42 expensive? Is it more expensive to run the infrastructure for slightly more, or is it better just to keep the skills your team already knows, rather than retool and retrain people or hire new people who don't know the skill? I would argue the scaling up of your individual team is the more expensive cost here. Oh yeah. Here's the only concern: now we're getting to a point where all the lines between these products... and I'm almost going to put quotations around this being multiple products. It's almost a single product. But the
17:14 difference is, and my concern is, with all the features acting exactly the same, the infrastructure is still different and the cost is still different, right? Because if you use a database, a warehouse, or a lakehouse, even if you're going to do the same semantic model, I imagine the cost is still different. So that's something to be aware of as of right now. Well, it's a concern. It's a different infrastructure. It's a different architecture. They will have advantages in certain areas. Like, this is, if,
17:47 back to your point, Tommy, if there was only one database that was needed for everything, we'd only have one database for everything. There wouldn't be this semi-structured database. Each one of these database systems is trying to solve a problem that's unique to its area. Like Kusto: real-time, high-volume, lots of little measurements, right? And Cosmos DB is now in preview. You're like, well, what's that for? Like, JSON structure, semi-structured, all this other stuff, right? And so now you have databases, and SQL is
18:19 really good at what it does. And so each of these things has been tuned for what it's good at. Notebooks, lakehouses: high-volume, big batch loads, MPP processing. Each one of them is designed to do its own thing, and right now it would be nice for the tool to just solve all these problems and pick the best tool at the right time. But I know that that's not the case. And so until we get to a world where the AI can just do all this for you, right? And say, hey, look, AI, I'm going to give you an image. Okay, it knows what storage engine to pick to do that. Hey, AI, I'm
18:52 going to give you a structured table out of a SQL database. It's just going to say, oh yeah, we know how to do that. And it's just going to handle it and store it where it needs to be, most efficiently or cost-effectively. Well, we're getting to the point, Mike, and I don't want to speak too far ahead of things, but it sounds to me like Microsoft's even looking at these individual products and saying, "Oh, well, warehouses are for structured data and the lake is for unstructured." They're like, "Well, why can't a warehouse do unstructured if we just implement X, Y, and Z?" We're really blurring the lines with each of these different infrastructures here, man. But why? But
19:25 yes, I agree, and I agree with your sentiment, and I think I also echo this, but on the other hand, I'm like, we should be blurring the lines. I do think this makes sense, right? No, I would agree. They can all have the same features. Yeah. Let's store all the semi- or non-structured data in a files area or in a lakehouse thing, but then let's put pointers inside a database that's really good at what it does. I don't know... do you follow DuckDB at all? Yes. Yeah. So, I follow Mim on Twitter, and Mim talks a lot about... yeah, the legend of Mim. Oh, that would
19:59 be a great shirt. Like, the legend of Mim. Yeah, I like that. But Mim does this thing where he just starts talking about DuckDB, and I think there's DuckLake as well. And so the DuckDB and DuckLake... careful, careful, almost got censored there... databases, or something like that. There's something like that in there as well. But the idea is it's mixing part of the big data side of things with a SQL-server-esque engine, the Duck
20:34 database, as the SQL-side type of thing. So if you put these two pieces together, the Delta Parquet side with all the structure, the files, and the massive parallel processing, but then you join it with, hey, we need a lot of metadata that supports all these different files... if you blend these two really good systems, the updating and inserting of records is really good on the SQL side, but the semi- and unstructured data is really good on the lakehouse side. You put those two together, it's this really powerful combination. And to your point, Tommy, I
21:06 think this is what we're going to get to. We're going to find that these data systems need multiple different kinds of engines to do different things at different points. And instead of just building a fish, we're now able to build an octopus that has all these different legs that do different things for whatever needs to be done. I don't know. Tough analogy. I was like, this has been a long walk to: instead of getting a fork, we get a spork. Yeah. No, but I think this is cool, man. I don't think a lot of us appreciate where we're going with this.
21:38 Like it’s nice to see the features come out, but again when you when if you were to talk about this stuff 5 years ago with a warehouse and a lakehouse and a database, again different skill sets, different services to try to get your data in. It took, , you had to have a certain route. Now I was like, well, you do either one works and you can get in all three almost. So this is this is really cool. This is I I did not know that about the database, but I want to try that out. You should definitely check it out. It
22:09 Was a cool experience. I thought it was really neat. I liked how seamless it was. Again, Microsoft, good job on point. You’re making it easy to get in and out of the SQL database. I I was very skeptical when the SQL database showed up. I was like, I don’t know. I’m not sure if I’m going to like it. , why is why keep your transactional crap out of my data warehouse like side of things. Get it out of here. But now that I’m looking at it, I’m thinking, okay, I’m warming up to it. Like, I’ve always liked SQL. It’s a good language. I didn’t really like how long it took me to write every single stinking statement. But, you
22:42 Know, I I’m getting better at it. And now that Copilot’s there, I can just have it fix all my bad errors when I can’t find a cast. I can’t cast a value correctly or, , I need an additional, , formatting of something like just let co-pilot figure it out. my days of SMS man and doing that Google searches. Hey, let me ask you this and I this will maybe transition as we go into it but from a SQL point of view. How much do you go to Stack Overflow anymore? Zero. Never. Zero. How much did you used to
23:14 A good amount. I would actually enter in the URL and go there. I will say this: I was just doing some comparisons on the website recently, and to your point, Tommy, I go to Copilot now. I don't have ChatGPT, so if ChatGPT was in my browser, I think I would use it more. I just don't have it there. I use Grok a little bit, but I think my go-to right now for most of my coding-type questions, Copilot's good enough. It gets me what I need done. I don't need Copilot to write me an entire app. Now, I will say this...
23:49 well, not that I don't try. But that, to me, feels like VS Code. That's where I'm going to build stuff, right? If I'm just trying to get a Python function out, or I want to get a little SQL statement solved, I'm going to Copilot. That seems to be good enough to get me done, right? I will say this, though: every so often I need an iframe, and I need to test an iframe, just for whatever reason. Today I was playing with Excel. You can embed Excel in a website page. I was like, oh, that's cool. Interesting. You can just embed an
24:21 Excel sheet into anywhere you want. Cool. And then I was looking at it going, hm, it lets me save and share this iframe thing. Okay, well, how do I build that? So, what you need is a little bit of an HTML page. Open up VS Code, start a blank file, just name it test.html, go to the agent, pick Claude, and be like, "All right, Claude, I need a simple web page with an iframe in it." Boom. It writes 300 lines of code, some styling, some simple stuff. I want it to
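The minimal test page being described really is only a few lines of HTML. As a sketch, here is a small Python helper that generates one; the `src` URL is a placeholder, so paste in whatever embed link Excel actually gives you:

```python
# Generate a minimal test.html with an iframe, like the page Mike
# describes asking Claude for. The src URL below is a placeholder.
def iframe_page(src: str, width: int = 800, height: int = 600) -> str:
    return f"""<!DOCTYPE html>
<html>
  <head><title>iframe test</title></head>
  <body>
    <iframe src="{src}" width="{width}" height="{height}"></iframe>
  </body>
</html>
"""

page = iframe_page("https://example.com/embed-placeholder")
print(page)
# To try it, write `page` to test.html and open it in a browser.
```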
24:54 be this width, this height. Boom. Done. I just paste in my code and it works. What? Sorry, this is going to be a little off topic, but... so cool. I have a bunch of local models that I write notes on, and I wanted to modify them, and I have, like, categories. And I was trying to deal with it in something I downloaded from GitHub. It was like a Gradio app, so all Python. And I was like, well, I wonder if AI can just take care of this. So Cursor, the IDE I use, love Cursor, with Claude, Anthropic Max, all that, and I have this full app that
25:26 just starts on my computer, runs, and it's on localhost, whatever, but it's all HTML, some little JavaScript, a little Python, and it has, like, a logo for each of the models, my notes if I want to modify something. It's created these filters, it's created some rich interactive features. We are at a point... listen, as soon as we plug this into data agents and the other things we're trying to do... if you want to build something really deeply embedded with something in Power BI, there's not a lot stopping you.
25:59 What's stopping you? Time. Time. It's the time to spend, the actual time, in these products. Mike, we're at a point where there are no excuses. Like, it really is... we talked about this initially: AI is not going to take your job away. Someone who knows AI will take your job away, and that's where we're getting to. Yes, you are 100% on point with that, because I think that is spot on. It won't be the people who are going to take your job. It's not going
26:30 to be... it's not going to be the AI. It's going to be someone who's learned how to use AI better than you, and knows how to debug it better and use it better and prompt better. That's way more of a threat than the AI itself. You know what I've done some mornings? I've had my GitHub workspace open. I knew I had some bugs on my Tindle, and before I even get my coffee, I said, "Hey, can you debug this? I know there's some syntax errors. And add some descriptions." And then I go get my coffee, I go upstairs, then I come down, and it's done. And it's insane. It's insane. I'm working with our developers right
27:03 now on projects, and we are writing code. We're building things, and, again, we're all using AI agents to help us build some of this code, and it's working. It's great. Not a problem. But at the end of the project... now, the most dreaded part of every consulting project is you hand the code back over to the client, and the client's like, "Great, you handed me a bunch of code. I don't understand any of it. Where are the comments? Comment the code." And I'm like, "Yeah, no problem. Yeah, no problem." So, anytime we hand code back, we rip it through an AI. All right, here's all my
27:35 code. Here's the spec that it came from, or whatever you're using. Here's the language: it's in Python, it's in M, here's a website that talks about M. Go through this and comment every line. Comment what this code is doing. Add sections like this before every function. Describe what the function does in general terms. Describe the inputs. Describe the outputs. This is a prompt that you should be using on everything. Like, duh. All right, I'm going to ask you a question. I know they're tied together, but then we'll get to the thing. If you had to choose one or the other at this point in your life,
28:08 YouTube and the internet, or AI? If you had to pick one. You wouldn't have... I know. I know they're tied. The AIs wouldn't exist without YouTube and the internet. I understand. I understand. I'm going to answer this question purely in terms of how I've gotten to where I am today, right? I don't see a world that could have AI without YouTube and the internet. Just let me start there. Like, that,
28:40 to me... I agree. I agree. It's like, can you build a car without the pieces? That doesn't even make sense. There's no way this would have happened, right? Would we have cars if no one ever invented gasoline? No, we wouldn't. I know. So it's a very tough question to answer. I would say I'm going to have to go with YouTube and the internet. I think those have been pivotal, and they have shaped and formed a lot of what's happening here. AI is interesting, but I don't think AI would have been...
29:13 like, if you took YouTube and the internet away... let's just say we had really smart computers with no internet attached to them. How would they ever learn? Where would we get all this information from? How would we collect it? So yeah, it would be so much slower. You'd have to go through all these books and other sources. So I don't think you could have one without the other. Now, if you give me AI and I have the choice... okay, so let's imagine we're in the world we are right now. I have the internet and YouTube, and I have the AIs. I think I'm going to maybe choose the AI, maybe a little bit. It's interesting
29:47 that the stuff people are creating with it is so creative, and it fills in a lot of gaps that the internet and coding were not good at. At the end of the day, this all feels so Star Trek to me. When you're on a spaceship in Star Trek, all they're doing is saying, computer do this, computer do that. They just ask, and it happens. Yeah.
30:20 Well, in early Star Trek there was a guy, but in the newer ones, yes, they have these full computers now. There's an entire computer built into the ship; the ship basically is the computer, and it can do all these AI things all over the place, generating stuff on demand when you need it, all this information. That's where we're at right now. I feel like we're in that space. The only thing is, right now I just type to it. At some point I'm
30:52 just going to talk to it and it will just work. Dude, it's crazy, Tommy, the world we live in right now. Five years from now, even two years from now, it's going to be vastly different from where we're at. I'm just happy sometimes I have AC in these August months. Whoever made the snowblower and air conditioning should get a Nobel Prize every year, because it's just hot. We got AI. Yeah. Just thank goodness I have a snowblower. All right, I think we need to get to the topic, because we're doing that late-night thing. Oh, we are. We're definitely rambling here much more than we should be, but
31:24 All right, Tommy, let's go through the article here: What's New and Coming Soon in SQL Analytics Endpoint. And I alluded to this already; I was talking a little about the SQL analytics endpoint and how I'm interacting with it. Again, I just learned that stuff today. It's been out since the end of April; I did not know. So, there you go. All right, give us an overview, Tommy. What do you want to talk about in this article? So, this is a pretty minor article. It really talks about what's new and what's coming around not just the database but the endpoint, which again is available in a warehouse, in a lakehouse, and, as we know now, in a database. And to me, Mike, I
31:59 want to have this conversation, because I really think the idea of the endpoint is underutilized by a lot of people right now in Fabric. It's a nice-to-have, or it's just there. But I think, as you mentioned in the beginning with the Beat from the Street, this endpoint does a lot more than just sit there; it can connect anything. That's probably the biggest feature, this ability to query anything in OneLake, but it solves so much more, and one thing we love to see is that it's a huge
32:35 part of what Microsoft's working on. So Mike, let's set the stage here, and let me just ask you the question: why do you think Microsoft has such a big focus right now on SQL integration with Fabric? And again, not just databases here. The ability to query anything in OneLake seems to be a hard drumbeat from Microsoft. Dude, 100%. But why? This is Microsoft's bread and butter,
33:07 honestly. Who runs more SQL than Microsoft at this point? There are other databases out there. Amazon's pushing its NoSQL-esque options; they've got MongoDB, and they've got some SQL things too, like Aurora in the MySQL space. Everyone's got a flavor of this. Let's be honest. I was arguing with a data scientist a while ago and he was like, "Oh yeah, well, we do a ton of Scala. Scala is the way to go. It's on Spark. It's
33:39 going to be the future," all this stuff. I'm like, is it really? I'm not sure I buy into that. You realize most of the world runs on top of SQL. Everything. It doesn't matter if it's MySQL with its slightly different flavor, or Oracle's own flavor, or T-SQL. There's so much out there, and I know for a fact there are entire companies that run off Access databases, the entire company. I've seen it. I know it's out there. They may not
34:12 still be around; they may not be doing great anymore, I don't know, but I've seen it. So I look at this and think, this is why Microsoft is doubling down, right? We have a Power BI ecosystem that is very familiar with Excel. Let's call it a billion users in Excel. If we then flip that card over and say, okay, of those billion Excel users, how many of them, or how many other individuals, are also writing SQL at some level? Is there another billion
34:44 people writing SQL on things? So two-sevenths of the world's population is covered by Excel and SQL languages. That's huge. If you think about the scale of this, it's got to be done. You've got to figure out how to make this stuff work in these ecosystems. And I think all you're doing is bringing the story of the SQL developer closer to where Fabric is and getting them more comfortable building in that space. So I think you're going a little too hard there saying the SQL
35:16 developer, because, Mike, back in the day, SQL before Power BI was something I had to learn just working in analytics, and I wasn't a SQL developer. I had SSMS, but that was just for querying; I wasn't building tables or running jobs or doing the integration in SSIS, for those who know. But to your point, a large group of people are using SQL, and we almost went away from it with Power BI. Not that I didn't use it, but I didn't need to. I didn't have to be an expert in
35:48 it, right? It wasn't something I had to do on a daily basis. Well, I'm going to challenge you there a bit: you used SQL just enough to get it into Power BI, okay? And then you wrote DAX. I would completely agree with that. I used just enough to get it started. Just a little pinch of SQL, a bit of salt, but no more, right? Because I didn't have to. Now, I could, and there were definitely ways you could utilize it. But I would challenge, too, that for a lot of Power BI developers and Power BI users, you didn't
36:21 have to know SQL. Or if you were in the managed self-service space, you didn't have to do SQL at all. And it's funny, because I feel like, and I've got another t-shirt for you, we're coming back to SQL now with Fabric. It's a return to the language. The coding language you and I have talked about most on the podcast over the last year is probably Python; we probably talked about notebooks the most. And dare I say that's a bit misleading, because here's
36:57 the thing, Mike: yes, the integration side of things is huge, but SQL runs the world. Every single product we have talked about now has an endpoint you can query, and Microsoft's pushing this idea of querying anything in OneLake, but it's not through Power BI, it's through SQL. So giving everyone a base SQL skill is almost more powerful than giving someone a base Python skill. Or, more importantly, you can
37:32 do more with a base SQL understanding than you could with a base Power BI understanding. Not saying I agree with that, but there's an argument there right now. I definitely see your points, though, Tommy. I would just argue that there are a lot of people running SQL in the world. Yeah. So much so that they made an entire event called SQL Saturdays. Oh, yeah. Right. And they have Fabric Fridays and SQL Saturdays. And which day
38:06 of the week is Power BI? Crickets. Yes. There's no Power BI Mondays; I haven't heard of those yet. So think about it: data engineering is really large. People are going to get more data. And to your point, Tommy, the world's just not going to stop and say, "Oh, we've completed it. We've made enough data. We're done. We're not going to make any more." You used to manage websites, Tommy, so you'll get a kick out of this one, I think. There's a number of different
38:39 analytics tools you can use on the internet for things in general. Hold on, I've got to sneeze here. Okay, I think I sneezed. Google Analytics is one of them, but it's not the only tool for web analytics. Adobe's got its own product as well, and then Microsoft has its product, called Clarity. So when you look at this volume of web traffic: I was showing one of my developers today, I was like, hey, it would be really cool to have this Clarity thing in this web app we're building, and he was like, whoa,
39:12 wow, that's really neat. Clarity was able to watch the user's mouse move around the website, and then for every page you could have a heat map of where every user clicked, so you could see which areas were the hottest, which ones were the most clicked on that page. Just the fact that people are on a website, and every little click and mouse movement can be tracked and digested into information: this is incredible. There's that
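The heat map Mike describes is, under the hood, a simple grouped aggregation: bucket raw click coordinates into a grid and count per cell. A minimal sketch of that idea in Python; the page size and click log here are made up for illustration, not anything from Clarity's actual API:

```python
from collections import Counter

def click_heatmap(clicks, page_width, page_height, grid=4):
    """Bucket (x, y) click coordinates into a grid x grid heat map.

    Returns a dict mapping (col, row) cells to click counts, so the
    "hottest" areas of the page are the cells with the largest counts.
    """
    cell_w = page_width / grid
    cell_h = page_height / grid
    counts = Counter()
    for x, y in clicks:
        # Clamp so clicks on the far edge land in the last cell
        col = min(int(x // cell_w), grid - 1)
        row = min(int(y // cell_h), grid - 1)
        counts[(col, row)] += 1
    return dict(counts)

# Hypothetical click log for a 1000 x 800 pixel page
clicks = [(10, 20), (15, 25), (990, 790), (500, 400)]
heat = click_heatmap(clicks, 1000, 800)
print(heat)  # cell (0, 0) holds two clicks
```

A real tool streams millions of these events, but the aggregation is the same shape: that is the "digesting clicks into information" step.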
39:46 volume of data. If you're just alive, you're making more data, right? And all these apps you're interacting with are making that much more data. So I look at this thinking, there's so much information here. Something's got to be done. We've got to be able to query this really big pile of data and figure out how to best move it into systems where we can look at it and aggregate it. That's what we're doing here. And part of this is that it's SQL. It's a standard. It's just going to be there. Let me ask you a question, because of the way I'm reading the
40:18 article, and again, a direction we're heading; not saying I agree with this, I'm just saying it's an argument to be made. We've talked about the semantic model being the elevated platform for whatever you're doing in Fabric. But there's an argument here for SQL and the analytics endpoint being your gold standard, with the semantic model just a level below that. Not saying you don't do semantic models. However, this REST API
40:51 they're talking about gives you the ability to keep all your data in sync, and again, each product creates a SQL endpoint. To me, Mike, I'm looking at the future here and at what we're doing with the endpoint, and I'm almost making an argument in my head: well, gosh darn it, shouldn't we focus on giving people the endpoint first, not necessarily the semantic model? Because the semantic model is tabular. It's
41:24 analytical. It's not transactional by nature, right? And with the SQL analytics endpoint, I can do a ton. I can put it in an app for consumption. I could put it in an AI app. The semantic model is great for what it does, but the endpoint almost does more. So let me pose that query to you, pun intended: where is Microsoft going here? When you think of
41:56 Fabric, is it the endpoint or is it the semantic model? So I'm going to lean on, and again, this is not me making this up, this is me pulling information from people who are way smarter than I am. Marco Russo got pretty fired up on social media recently. I don't argue with Marco. No, I'm not arguing with Marco. Marco has seen this stuff longer than I have; I really value his opinion and his impression of things. In one of his recent posts on YouTube, I believe he was talking with, or had listened to, the podcast from
42:28 P3, from Rob Collie, the gentleman who did Excel and Power Pivot and all those things, right? Yep, that's him. So Rob Collie wrote his book around all those things with AI, but he said consultants are now saying, abandon the semantic model, don't use it, go right to these flat, wide tables, because that's what the AIs need to figure things out. And Rob was just like, this is totally wrong and not the right way to think about it. The AIs don't need that. The AIs need context and descriptions and relationships and
43:03 measures and calculations, what is important, usage data. The models need more context about what users will find interesting, so the AI can make better decisions about what data gets presented. And I totally agree with them. So is Microsoft abandoning the semantic model or moving away from it in favor of APIs and endpoints? No, absolutely not. All the stuff I'm reading in this article feels more like a story for the data engineer. Hey, data engineers, we've had problems with you making a lakehouse table and the delay between the lakehouse table getting into the SQL
43:35 endpoint, having it refresh the metadata, right? Pulling in the latest version. You may be changing that lakehouse table every 5 minutes, and the SQL endpoint doesn't recognize that for 15 minutes. So you make a change and expect the SQL side to reflect it immediately, because you would see it immediately in a notebook. Microsoft has bitten itself in the foot here to some degree, because the lakehouse and notebook experience is what I would expect: in cell one, I
44:09 write the data down and save the table. In cell two, I read that table, and the data comes back exactly as I wrote it. What I think this first API call is addressing is this: I write data to the lakehouse, I immediately read it with the SQL endpoint, and wait a minute, that's still the wrong data. That's the old data. How long does it take the metadata to refresh, to go get the new definition, to get the new tables? So there are still some automation mechanics here that aren't working the way you would have
44:42 expected; you would expect the SQL side to immediately read the most recent version of the table, and in fact it does not. It's caching, it's doing something to make it fast, but in doing that, it's delaying the time between when the lakehouse tables are written and when the SQL analytics endpoint can pick them up. So that's how I read this article: look, data engineers, we're going to make your world better for you, just stay tuned. This may be a bit of a hot take, but to me there are then two routes we have to take. We either have to expand our definition of what
45:14 the semantic model is. Let me rephrase that: if I were to ask you the business purpose of the semantic model in the age of Fabric, what it's meant for, compared to the business purpose of having a SQL endpoint in Fabric, what are those definitions? And I'll start, so I don't just put you on the spot. I think when you look at the business purpose of a semantic model pre-Fabric, for Power BI,
45:49 it was for analytical reporting, getting to the truth in the data, your source of truth. That would be my definition, but it lived in Power BI. It lived in a report. It might have lived in a metric set or a scorecard, but for the most part, its final stop was a Power BI report, some bar chart or table. More or less, okay. So let me ask you, and then I'll pause there: pre-Fabric, what was the
46:22 business purpose of a semantic model? Let's talk about what a semantic model is. To me, it is the pinch point. Let me rephrase that; I like that, right? Yeah, it is the pinch point. That's a t-shirt: I build pinch points, that's what I do, right? So think of it this way. On one side of the world, we have all these really awesome tools: high-scale, high-volume Spark, Kusto, SQL database, data warehouse, right? But
46:55 they're all different tools. They can store data in different ways. And to some extent, most of those tools, when you were looking at Azure, or even Synapse for that matter, were very disjointed. None of them worked together. None of them played together. There wasn't a common format under the hood for all of them. It was just a mess. If I wanted to use a SQL database, I could only use a SQL database. I couldn't hand some data over to a lakehouse, and I couldn't easily get data from a lakehouse into a database. It didn't work. So on that side you have all this mess of storing and transforming data. The semantic model says, okay, I've taken
47:28 some of this really weird, unstructured data, or tables and data, and now I've got definitions around it. At the semantic model level, I can say, look, I don't really know what questions the business is going to ask, but I do know that if I aggregate these kinds of columns, they'll provide value. Now we can start talking about what provides real business value. And this is where I think Kimball really comes into play. Let's talk about the cuts of the data. How do we want to filter? How do we want to cut the data? What does the cube look like? And I'm not talking MDX, old cubes. I'm
47:59 talking semantic models. Let's not talk MDX cubes; it's a cube in the sense of a mental model of data. I want to slice and dice this data many different ways, because in my hundreds of billions of rows of table data, I'm looking for the handful that are going to tell me why sales are up or down, why we're not making money, what we should be doing, what the trend is, how we compare now to last year. That may take a lot of data points to aggregate up to the answers I need, but I don't necessarily know what I'm looking for until I really start getting access to the data and seeing what's going on there.
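The slicing and dicing Mike describes is ultimately grouped aggregation over different cuts of a fact table, the kind of result a semantic model measure serves up interactively. A toy sketch, with sqlite3 and made-up sales rows standing in for a real warehouse:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, year INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [
        ("East", 2023, 100.0), ("East", 2024, 150.0),
        ("West", 2023, 200.0), ("West", 2024, 180.0),
    ],
)

# One "cut" of the cube: totals by region and year, the kind of
# aggregate you would inspect to see why sales are up or down.
rows = conn.execute(
    "SELECT region, year, SUM(amount) FROM sales "
    "GROUP BY region, year ORDER BY region, year"
).fetchall()
for region, year, total in rows:
    print(region, year, total)
```

Each different GROUP BY, by region only, by year only, or by both, is another cut of the same cube; the semantic model's job is to make all of those cuts fast and consistently defined.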
48:32 So that, to me, is the pinch point. That's where everything on the data engineering side funnels down to the semantic model. And on the other side, the reporting side, we start expanding again. Back in the Power BI days, it was semantic model to report, semantic model to dashboard, semantic model to, what else? Maybe metric sets or goals, whatever they call them these days; I don't know if they keep changing the name. But there were
49:04 like three things. I only had three things that came out of the semantic model, and I could do Analyze in Excel, right? So maybe four. Well, now look at the number of tools we can use on the other side of the semantic model. Now we have explorations, metric sets, goals, Analyze in Excel, paginated reports, and then the reporting itself. We have all these other tools at our disposal on the other side. But at the end of the day, if you stand back from both of these proliferations of tools on either end of the spectrum,
49:36 the data engineering tools all funnel down to the semantic model. The semantic model becomes your pinch point. And when you expand on the other side, we have a proliferation of tools that all need real-time querying against cached memory that's super fast. So you make the semantic model rock solid, this really solid core. It's like a keystone, right? Think of an archway: this is the keystone of the whole system. If you don't have it, you can't get the fast analytics on the one side, and if you don't
50:09 have the keystone, you don't know what you're building your data engineering toward. So back to your question; let me really unwind here. You asked, what's the point of the SQL analytics endpoint versus the semantic model? I think the semantic model is a tool for building all these other data experiences that are easy to use. Now, we talk about the skills of our users; I'm going to keep going back to this, we do this all the time. I like where you're going. Some users don't want to write SQL. Some users don't even
50:42 want to use explorations. Some users just want to have a report given to them, and that's what they use. They work from that, and that's how they do their job. Fine, we have a tool for that. But there are going to be other users who want more, who want more capabilities. So if we look at the 30 million Power BI users out there right now, a lot of them are going to use just reports and standard stuff, but a portion of them are going to want a bit more. And this is where I think the semantic model, and then going back upstream to the SQL
51:15 side, says: okay, here are all the tables the semantic model lives on. Write your own SQL query. Build your own analysis. Figure out what's in the data that you care about; you should be able to do that. There are advanced users you want to give more access further upstream to build their own things, because it will take too long and be too costly for you to stand back and wait for that user to give you all the requirements, because they don't even know them yet. So it's faster for me to say, look, here's the SQL analytics
51:49 endpoint, here's the SQL database, here's the SQL warehouse, here's the notebook, I've made you a bunch of shortcuts into the lakehouse. Go have fun. Go figure stuff out. Get your job done. Dude, I love all that you said, and I want to highlight some major points, because they deserve to be reiterated and really emphasized here. I was going down that path in my head, that scenario, but then you made a good point. Here's the thing: what consumer applications are there for the endpoint? And there are
52:22 none. And we're living in a world where, okay, maybe Power Apps is one; I'd give you that a little bit, but at the same time, that's still for structuring your data. SQL endpoints, as great as they are, are not going to take the place of the structured data layer. They give you better ways to structure your data, but they are not the structure itself, and that's still where the primary semantic model comes in. The semantic model is the user-friendly way, and I think this is the
52:55 big point you're making here: no matter what we're seeing, as great as the SQL analytics endpoints and the databases are, the endpoint itself is not what I'm going to give a consumer, because there isn't going to be a consumer-friendly application for it. Now, for a data scientist, we talked to Ginger about this, and we were like, why can't we just give them a semantic model, because the data is already clean? And she kept saying they don't want that, which I still don't understand. But so this analytics endpoint and what we're doing
53:30 with SQL is giving more and more abilities, for transactional work, for engineers, for scientists, and for analysts, to do things they've never been able to do before. But that does not take away how important structuring our data through a semantic model is. What I was trying to play out in my head was whether one would take the other's place, but it's never going to be that way, because they really do serve different functions. So I love the idea here: we don't have a
54:02 consumer application for SQL endpoints, and there's not going to be one. There's not going to be a "hey, here's Office 365 SQL," where you open Excel and just connect to the entire data estate. They have that already, and it's the semantic model. So where we're going with this, and I think about this as the future, is: where would a BI team invest? That's what I always like to think about when I ask these questions. Where should I invest? Where should my team invest? Where should my
54:35 organization invest its time, money, and resources? And yeah, there should be a lot on the endpoint. There should be a lot on giving other analysts access to it. Mhm. But that doesn't mean the semantic model goes away. No. And on the semantic model, I was just googling a couple of documents from Databricks. Even Databricks is saying, look, the semantic model is for the business user. I agree. I was building reports and writing SQL statements; I was building tables of
55:09 data using Power BI and then trying to make those same tables using SQL. It's a lot slower to write SQL to build a table than it is to drag and drop a table into Power BI. The exploration feature alone. For that alone, I would say that's why the semantic model exists: I can just drag fields to a table and have it appear. That makes sense to me. So I love your point there, Tommy, and I agree wholeheartedly. There's a layer for this; there are different technical people who need to be addressing these different things. But I think this article in particular,
55:42 it's good that there are improvements coming. I also like the fact that SQL is getting some more love. It's getting better. The analytics endpoint is getting more features. Great. This API surface area they're trying to address with the SQL analytics endpoint API? Great. Dig it. I think this is going to be a big win. It's going to make our lives easier as we do more data engineering. But this is purely a story for the data engineers. Yeah. I know we're getting into your time, but as I think about the
56:15 journey of your own data and organization, I still have in my mind the map of just Power BI: it was data integration, transformation, semantic model, report. And now, once you get to that integration side, you can split them, because now I can provide a ton of teams and applications cleaned data that I prepared through Fabric, where I have that ability. Yeah. And I still get that data fed through to the semantic model. But here's the thing. I think for a lot
56:47 of organizations and a lot of teams, there's a big investment that needs to happen around what you can do at the endpoint, because to me it really is still underutilized: again, for things like Power Apps, for your AI, for feeding other systems or other lakehouses too. It's incredible what you can do, but don't let that take the place of how important the semantic model is for you. It's gold. So really, that's where I'm going to end on this. I love this. I can't wait to see what we're doing with SQL. It's a coming-back-to-SQL moment. Yeah, it's
57:20 always been there; I'm just more aware of it now, I think. All I'll say is I've been having a great experience with SQL: the SQL endpoints, the SQL databases, the SQL data warehouses. I'm finding a lot of value there, and I'm personally going to take a deeper look at a lot of them to figure out where I can use them more in my workloads. With that being said, we do appreciate you listening. This was a full episode with a lot of SQL talk. Hopefully it was a lot of fun for you at the beginning and a lot of fun here toward the end as we wrap up and capture our thoughts around SQL. If you liked this conversation, if you
57:52 enjoyed this, we really would appreciate it if you would share it with somebody else. We know this doesn't get out to everyone. If you find this valuable, likely someone else in your tech space may find it valuable as well. Please share it with them; if nothing else, it opens up a conversation with somebody else about working with data things. That's it, Tommy. Where else can we find the podcast? Oh my gosh. You can find us on Apple, Spotify, wherever you get your podcasts. Make sure to subscribe and leave a rating; it helps us out a ton. And share with a friend, since we do this for free.
58:24 Do you have a question, idea, or topic you want us to talk about in a future episode? Do you disagree with us about SQL? Head over to powerbi.tips/empodcast and leave your name and a great question. Join us live every Tuesday and Thursday at 7:30 a.m., and join the conversation on all PowerBI.tips social media channels. Awesome. Thank you all so much, and we'll see you next time.
Thank You
Want to catch us live? Join every Tuesday and Thursday at 7:30 AM Central on YouTube and LinkedIn.
Got a question? Head to powerbi.tips/empodcast and submit your topic ideas.
Listen on Spotify, Apple Podcasts, or wherever you get your podcasts.
