Fabric SQL Databases - Now What? – Ep. 388
Mike and Tommy break down what Fabric SQL Databases are, where they fit in the Fabric ecosystem, and the scenarios where a relational database is the right tool instead of a Lakehouse. They also cover how this changes your architecture choices and what to watch for as the feature matures.
News & Announcements
This episode is centered on one big theme: Fabric is expanding beyond “lakehouse-first” patterns and adding more first-class options for teams who still need (or prefer) a traditional relational database for certain workloads.
- Decision guide: SQL database in Microsoft Fabric — Microsoft’s decision guide frames when a Fabric SQL database is the right fit versus other Fabric storage/compute options. It’s useful if you’re trying to map requirements like transactional writes, concurrency, familiar T-SQL patterns, and operational app integration to the right Fabric artifact. The key value is clarity: it helps teams avoid forcing every scenario into a Lakehouse just because it’s the default.
- SQL Database in Fabric — SQLBI’s write-up helps translate the announcement into practical implications: what the new SQL database capability means, how it relates to existing Fabric experiences, and what to consider before adopting it. If your team lives in the BI/semantic model world, this is a solid perspective on how the relational option might simplify some architectures (and complicate others).
Main Discussion: Fabric SQL Databases — where they fit and what changes
With SQL databases now showing up as a first-class citizen inside Fabric, the conversation focuses on the “now what?” question: how should teams think about architecture and workload placement when they have both lakehouse-style patterns and a relational database option in the same platform?
When a SQL database is the right move (even in a lakehouse world)
Mike and Tommy talk through the situations where it’s not only reasonable—but smart—to choose a relational database:
- Operational / app-style workloads where writes, updates, and high-concurrency querying are normal.
- Teams with deep T-SQL skillsets that want to move quickly without re-tooling everything immediately.
- Scenarios where strict relational modeling and constraints are a better fit than files + parquet + “schema-on-read” patterns.
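The bullet points above can be sketched as a simple requirements-to-artifact mapping. This is an illustrative helper only — the function name, parameters, and rules are assumptions that compress the decision guide's questions, not an official Fabric API:

```python
# Hypothetical requirements-to-artifact helper. The names and rules here are
# illustrative assumptions, not an official Fabric API or the full decision guide.
def choose_fabric_artifact(
    transactional_writes: bool,
    high_concurrency_oltp: bool,
    team_skillset: str,          # e.g. "t-sql" or "spark"
    needs_enforced_constraints: bool,
) -> str:
    """Map workload requirements to a Fabric storage artifact."""
    # App-style workloads with frequent writes/updates or strict relational
    # constraints favor a relational store rather than files + schema-on-read.
    if transactional_writes or high_concurrency_oltp or needs_enforced_constraints:
        return "sql_database"
    # Analytical, file-based workloads favor the Lakehouse, especially for
    # Spark-first teams.
    if team_skillset == "spark":
        return "lakehouse"
    # T-SQL-first analytical teams may prefer the Warehouse experience.
    return "warehouse"
```

For example, an app-style workload with transactional writes and enforced constraints maps to the SQL database, while a Spark-first, schema-on-read workload stays in the Lakehouse.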
The practical takeaway: Fabric isn’t just “Power BI + Spark.” It’s becoming a place where you can land data in the format that matches the workload instead of bending every workload into the same storage shape.
Architecture implications: don’t confuse convenience with correctness
A recurring theme is that “it’s in Fabric” doesn’t automatically mean “it’s the best choice.” A SQL database might reduce friction for some patterns, but you still have to decide:
- What should be operational vs analytical?
- What needs governed, reusable semantic layers vs what is just an application store?
- Where are you optimizing for simplicity vs cost/performance?
This becomes especially important when you’re trying to keep your org from building three parallel data platforms inside one Fabric tenant.
Beat from the Street: Tabular Editor vs ALM Toolkit vs DAX Query View
The episode also hits a practical tooling question many modelers run into: which tool should you reach for?
- DAX Query View is great for quick exploration and analysis when you’re already in Power BI Desktop.
- Tabular Editor shines when you need serious model development productivity (measure management, formatting, automation patterns, best practices).
- ALM Toolkit is often the go-to for comparing/deploying model changes and managing differences between environments.
The key is choosing the tool that matches the job: exploration, development, or deployment.
Looking Forward
Fabric SQL Databases add a strong option to the platform, but they also raise the bar on governance: teams need clear workload patterns and ownership so “new capability” doesn’t turn into “new sprawl.” If you’re evaluating the feature, start with a small workload, validate your performance and cost expectations, and use the decision guide to align the choice to real requirements—not just familiarity.
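One way to validate cost and performance expectations before committing is a back-of-envelope throttling check: model your expected capacity-unit demand and see whether smoothed usage ever exceeds the SKU you plan to buy. The sketch below is a rough illustration — the rolling-average smoothing and the CU numbers are hypothetical assumptions, not Fabric's actual smoothing or billing algorithm:

```python
# Back-of-envelope throttling check for a fixed-capacity SKU. The CU values and
# the simple rolling-average "smoothing" are illustrative assumptions, not
# Fabric's actual capacity-smoothing or billing model.
def smoothed_usage(demand_cu: list, window: int) -> list:
    """Rolling average of capacity-unit demand over `window` intervals."""
    out = []
    for i in range(len(demand_cu)):
        lo = max(0, i - window + 1)
        out.append(sum(demand_cu[lo:i + 1]) / (i - lo + 1))
    return out

def throttle_risk(demand_cu: list, capacity_cu: float, window: int = 4) -> bool:
    """True if smoothed demand ever exceeds the purchased capacity."""
    return any(u > capacity_cu for u in smoothed_usage(demand_cu, window))
```

Under these toy numbers, a steady 4-CU workload fits comfortably on an 8-CU capacity, while a workload that bursts to 30 CUs flags a throttling risk worth testing against the real service.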
Episode Transcript
0:31 Good morning and welcome back to the Explicit Measures podcast with Tommy and Mike. Good morning everybody, welcome back to our podcast. Good morning Mike, can you believe it’s already the 9th of January? Things are already zooming by very quickly. I’m not ready for this, I’m not ready for the new year yet. Get ready for a fast year. They say the older you get, the faster time goes; that feels like a true statement for me anyways.
1:06 Man, well, we’re back at it, another year that we’re going to put in the books, hopefully. So let’s jump in, we have a couple main topics today. One recent announcement was the addition of Fabric SQL databases, so a SQL database now exists inside Fabric. This is a major move, I think, by Microsoft. They are not only looking to have your reporting data; Microsoft is also looking to land your operational data inside Fabric. So if you’re building solutions around
1:37 SQL databases, I believe the intent here is for people to have an actual SQL database inside their Fabric environment, and with this comes automatic mirroring to your Lakehouse, so you can have reporting right next to where you’re actually creating the data. Very interesting. So we’re going to unpack this today. We’re going to go through some articles here from Microsoft; there’s actually a really good Learn document from Microsoft called the Microsoft Fabric decision guide: choose a SQL database. We’re going to unpack
2:08 this article today and go through that; that’s also in the description below. But before we get into that, I don’t have any news. On the Microsoft front it has been very quiet; there’s really not been a lot going on on the Microsoft side. There have been a couple of things announced on the Fabric blog, but they’re smaller at this point. So I think we’re really not going to get moving on new features and releases until we hit February, where things start really picking up again. I think the developers are coming back from vacation, they’re building things now
2:40 but one thing I did want to talk about here: I had a really engaging conversation yesterday around this topic. It was, okay, given the tools of Tabular Editor, either 2 or 3, ALM Toolkit, and now we’ve added DAX query view, what would you say, Tommy, are your key differences between these three tools? And really the secondary question I have to this is, what are the main feature differences, like what would you say
3:10 feature-wise, why would I pick one over the other? Let’s start there, and I have a secondary question that goes along with this as well that I want to follow up with after you answer that one. So, Tabular Editor, DAX query view... well, my biggest difference is they’re not all Tabular Editor. So here’s the thing, okay, and I’m going to be straight with you. I think I know where you’re going with this. Yeah, so all the tools can do some of the things, but only one
3:40 of the tools can do basically everything the others can, except for ALM Toolkit; that’s more or less one in itself, but not necessarily, because of Git and projects. So let’s focus on Tabular Editor, DAX query view, DAX Studio. Tabular Editor 3 is the number one, the only place to really go if you want to query, build your models,
4:10 automate. I know that it’s already in the desktop, and I use it sometimes to do something really quick rather than opening up Tabular Editor 3, but it’s only because I don’t want to open up Tabular Editor for two seconds. Okay, all right. The reason I’m bringing up this question is that I think about whenever a new tool gets introduced to my library of tools, right? I have a toolbox and I’m adding a new tool to it. One of the new tools that we’ve just recently received was DAX query view. I really like DAX query
4:41 view. I think it’s very useful. To your point, Tommy, it does a lot of really basic things. If you are debugging a model and you’re trying to figure out the dependency tree of all your measures, it does a really good job of just writing them all out and defining them, and then you can play with the measures and adjust them and try to figure out if it’s working correctly. So from a designer-builder experience on top of the DAX layer, I really like DAX query view. The reason I bring this question up is because I was trying to think of what are the distinguishing features of when, like if
5:12 I’m giving guidance to someone, when would I want to use one tool over the other? So let me unpack some ideas I had here. Tabular Editor 2 is a free tool; you can use it straight up, right out of the box, but it has a lot of limits. There are a lot of really nice things: there’s a better editing experience for DAX, there are better user interfaces. Tabular Editor 3 lets you create tables of data and view them, and it has this matrix
5:42 view of data; you can make a matrix view of data and return the data from the semantic model and see what it’s producing. I really think that Tabular Editor could be the entire suite: if you need to build anything semantic model related, Tabular Editor is clearly the premium tool. However, when you look at Tabular Editor 3, I’m getting a little bit of pushback: the business level of Tabular Editor is $35 a month, the Enterprise version is $95 a month. If you are doing modeling every single day and that is your job, no-brainer, this saves
6:14 you that much time easily. It will easily save you an hour of time a month, which should clearly justify the price of the tool. But when I look at the other tools, what difference does ALM Toolkit have compared to Tabular Editor? I think that tool really shines in the idea of, I’m publishing things, or I’m looking to compare two separate files. Tommy, to your point, GitHub or Git solves that problem as well, but I don’t think a lot of people are comfortable yet with Git, and it’s not as seamless as
6:45 you would think, because you need to be able to see different commits, and I need to be able to have code that’s been pulled down; there’s this local branch and the remote branch, and it’s a little bit more confusing for people to figure out all the details. So I’d argue ALM Toolkit is a really nice tool to understand what changes did I make, and specifically which changes am I going to pull through to a published model, right? I’m going to cherry-pick a couple
7:16 measures, I’m going to cherry-pick these changes to this column, but I’m not going to update the partition strategy, right? So there are very nuanced pieces; to me the ALM Toolkit is around deployment of things, that’s its defining feature. And the DAX query view is really a lightweight version of Tabular Editor 2, is what I see, DAX query view the lightweight version of this. The reason I bring all this up is because, as I think about the different tools, I believe Microsoft’s now creating a suite
7:46 of tools that are chipping away at some of these more traditional third-party tools, which I think is the right move. I was just curious your opinion here. Yeah, the DAX query view, it’s great for quick things, and I know some people are building off of it. So you can basically output the same result with the DAX query view and Tabular Editor 3; in terms of what’s possible it’s the same, but from a user interface and just
8:18 usability point of view, yeah, that’s where the differences really lie. And again, DAX query view, that’s literally as if you just took that DAX script piece out of the editor, and that was it. And it’s great; I’m glad it’s already configured and packaged into Power BI Desktop, but if I’m doing my normal model development I’m still opening Tabular Editor, I’m still going to do the queries
8:48 there, because it’s just a better environment. And I really think, too, DAX Studio is the only one that really has one thing that the others don’t, and that’s really the logging of memory: you can actually look line by line to see the memory and time for a DAX query to actually take place. That’s a different scenario, a different design pattern, right? I’m saying, if you’re doing tuning and optimizing of DAX, 100%
9:21 DAX Studio.
9:23 And actually now DAX Studio, the Enterprise Edition, comes with DAX Optimizer at the Enterprise level. DAX Optimizer is the SQLBI tool that helps you figure out what inside your model is not performant, or what inside your model does not adhere to good standards or best practices. There are ways of writing DAX that are not efficient, so it helps you pick those things out. Besides that, then, if you control
9:54 Tommy, if you control more of the data engineering experience, like for example if I am making lakehouses of tables, do you find your DAX becomes simpler, because you actually just shape the data differently and prepare it better for the semantic model? The reason I’m asking this question is, when I didn’t control the data source, when I didn’t control the tables that the data came from, a non-Lakehouse, or I had to write Power Query against it,
10:25 I found that my DAX was more complex. I feel like my DAX now is much more simple in nature. Yes, there are still some edge cases where DAX gets more complicated and we have to do some SUMMARIZE because we want to see some dynamic aggregations of things before we calculate stuff, but I think in general my DAX has gotten simpler over the years, because I’ve had different experiences, I’ve done different modeling designs on the front end that make it easier for me to use the data, and
10:55 I’m becoming more rigid in, we need to do a better job data engineering the data, as opposed to compensating with DAX. So I feel like most of my work could be done in DAX query view. Do you feel like this is the same pattern that you see? No, honestly, the DAX query view is fine, but I think compared to, to your point, where our data is being stored now too, I’m surprised you’re not saying Python out of all this, within notebooks,
11:26 and that you’re not moving to that one. But I’m not, yeah. Python’s interesting, but it maybe helps with the automation of building things; getting things in and out of a data frame, displaying them, there’s no easy UI, and it’s a little bit more difficult to write the DAX inside a Python notebook. I understand your point there, Tommy, but I feel like I’m still leaning more towards DAX query view. Interesting. Will your mind change at all when Tabular
11:58 Editor shows up for Power BI Desktop? Will that change your mind at all on these features? Would that pull you more away from Tabular Editor or ALM Toolkit? I don’t actually think so, because the one thing that I would want, especially with the DAX query view: it’s not as full-fledged as your normal SQL queries. It’s cool, but what I find with DAX queries a lot of times too, I’m not just pulling a
12:30 subset of data and, more to your point, doing actions or building off of it. DAX queries right now for me are a lot of the testing; that’s what I have my macros in Tabular Editor 3 for. Sure, yes, you can do it, and then there’s also the DAX scripts, which are more or less the same thing. But honestly, the biggest thing is, if I were to start using that, I need a library, I need something to go off of, a little something I guess. But again, I’m
13:00 talking about Tabular Editor 3. I’m like, I need something like SSMS but for Power BI, and that’s Tabular Editor. Okay, interesting. So I’m just trying to unpack some of this. Interesting conversation. I think I’m going to disagree with you here, Tommy; you’re a professional, you know how to write DAX. I think DAX query view is going to be very relevant for people moving forward, and in my conversation earlier, outside of this podcast, I really do think that DAX query view is actually taking or eating some of the other
13:32 tools’ capabilities and bringing that indirectly to Desktop, which I think is the right move for Microsoft. If Copilot was able to be used all the way down to an F2 SKU, I think you would actually get a bit more adoption of DAX query view. I think some of the power of DAX query view is being able to have a Copilot with you to help you write DAX; that’s a feature that does not exist in Tabular Editor or ALM Toolkit. You can use OpenAI or
14:02 something else with Tabular Editor if you write some custom scripts, but I don’t know what the threshold for people would be to write a whole bunch of C# scripts to help them integrate or talk to an AI to get that into your Tabular Editor. So because of that, there are some interesting things coming, and I’ll be curious to see how things roll out. I’m really excited about Tabular Editor in Desktop; I think that’s going to actually solve a lot of my problems, and I think I’m going to continue to move away from Desktop. I
14:33 think a lot of these features already exist in the service. DAX query view can be used in the service: if I’m in the service and I’m editing a model, yeah, I’ll hop into DAX query view, I’ll write a couple measures, test some things out; I don’t even need to go to Desktop anymore. So, talk about ease of things. There is no equivalent to Tabular Editor in the web, there is no ALM Toolkit for the web, so those workloads don’t exist there. So I’m going to be interested to see where things are going to go in the future. I’m going to call it now, Tommy:
15:04 we’re going to find more and more tools that are going to appear inside Fabric, or Power BI in Fabric. I think Fabric is the gateway to all these extra tools: you’ll buy Power BI and you’ll have the standard Desktop features, and then I think you’ll buy Fabric and you’ll get a lot of extra tooling that will help support you in building better models, optimizing and tuning things. Mark my words, I think the Fabric environments are going to get much more rich with features that you’re going to want to use. Anyways, all that being said, any
15:36 other topics or questions, Tommy, like beat-from-the-street type notes here for you? We’re good. All right, with that let’s move on over to our main topic. Our main topic today is SQL database in Fabric. We’re going to probably cover two articles here. There’s the decision guide from Microsoft, which I’ll put here in the chat window right now, here’s the decision guide from Microsoft Learn; and then the secondary article: on November 23rd SQLBI wrote a really
16:06 interesting article around SQL database in Fabric. So we’re going to unpack both of these articles together here. Tommy, give us an overview of the decision guide. What are we going to talk about here, what are our main points we’re going to start off with? So really the first thing is this introduction of Fabric databases, SQL databases, and like a lot of what Microsoft did when they were introducing Microsoft Fabric, they’ve put together basically a whole list of decision guides. Now the big difference
16:38 is, unlike some of the other products where they’re really moving from, like, Synapse to Fabric, well, SQL databases are probably, out of all the things that Fabric has introduced, the most longstanding; there are probably more of these that exist outside of Fabric than any other thing in Fabric right now. Databases are how a lot of companies live and breathe. So we have two decision guides: one, where are you going to park your SQL database,
17:09 because if you need one, it’s really just going through whether you’re going to do an Azure SQL database or you’re going to do a SQL database in Fabric. Then we get led into, because we have so many options in Microsoft Fabric for not just storing our data, and that goes into Marco Russo’s article, where it’s like, great, why now, so to speak. Yes, well, we have a few places already in Fabric I can put my
17:39 data, and it doesn’t have to be a database. Again, Mike, my entire life everything went in a database, everything goes in a database, but now there are maybe a few more things that need to go around this. Yeah, so let’s talk about it. So one, this article from Microsoft is clearly comparing the SQL database, so Azure SQL databases, that’s what they’re comparing between, and then the SQL database in Fabric, which again I’ll
18:09 clearly point out here: SQL database in Fabric is in preview, so that is a preview feature. And then it goes through here and calls out a couple things that are clearly different in the two experiences, right? So on elastic pools: yes, you can use elastic pools with Azure SQL databases, but in the Fabric database world you don’t have the ability to have an elastic pool, it’s not available. But most other features are fairly well covered. The
18:40 other option here is that you have different purchasing options for
18:44 databases in Azure, versus in Fabric it’s just provisioned, it’s just a capacity consumption method for the SKU. So where do we see this fitting? At the bottom of the article it has two interesting scenarios, Tommy, and I’d be curious your reaction to these. Scenario one: a gentleman or woman, I’m not sure who this is, but Kirby, I guess it could go either way. Kirby is a solutions architect, and they’re creating an AI application on top of their operational data. They need
19:14 an easy-to-manage operational database that can integrate across different platforms, with queries against some real-time data. So it’s querying data potentially from Parquet files, some master data that maybe lives in the warehouse, or maybe pulling from the Lakehouse, but they’re doing some real-time queries on top of that. So the SQL database is going to be using the operational data, and it’s going to be serverless, so it will auto-scale up as large as it needs to: when the query demand gets higher, the database will automatically scale up to the size it needs
19:45 to run those queries against the data. Scenario number two: Aaron is an architect working with a .NET application and working with developers to build an ISV-type solution. They’re developing a multi-tenant architecture and they need isolated databases for the customers; instead of having to provision things individually, they’re trying to say multi-tenant architecture here. The customer base is worldwide, they need multiple
20:15 databases that all have the same structure, it looks like, and they just need to turn them on, right? Get them up and running as quickly as they can, land that specific customer’s data, and isolate it per customer, not a single database that houses everyone’s data, I guess from a security standpoint. Let me just pause right here. What do you think, Tommy, about these scenarios: the AI operational data solution, and then building a .NET application for customers where each
20:45 customer needs an isolated database? How do these, in your world, or with what you’ve seen of customers currently, what does this feel like to you? So from some of those advanced tooling points of view, I still think the scenarios are showing me that databases are not going away; we’re not going to move away and try to migrate everyone off of them. I think it’s just letting us know that we do have more options, because a lot of these solutions, well, especially trying to analyze data,
21:16 just storing our data somewhere, again, databases are right now, in a sense, the most preferred way of having something that can be transactional. Yeah, but why SQL though? Why not Kusto, why not Cosmos? It’s not universally accepted... so, okay, you can write SQL against a Cosmos DB, you can write SQL against another... okay, so I understand your point, like, with SQL databases you’re going to store data somewhere, but the
21:47 variety of solutions that can store data is very different now. And I would agree with you, Tommy, the Lakehouse is good for collecting lots of data; it’s good for collecting real-time data and either batch processing, micro-batch processing, or streaming that data into new tables. That’s really a good design for a Lakehouse. But my opinion here is, so what? I’ve gotten along for years without having to spin up a bunch of SQL
22:18 databases. My point here, let me finish my point here, is that this is a new persona. If I look at what I’ve done in my traditional business intelligence and data engineering space, the SQL database coming to Fabric is a new user, it’s a new user persona: it’s the developer, it’s an app developer. In both these scenarios, this looks like an app developer to me. So I understand, Tommy, what you’re talking about; you, Tommy, do a lot of app development with Power
22:50 Apps, and you also do that with SQL databases supporting it. So if you build Power Apps and you need a place to put your data, a SQL database in Fabric I think makes a lot of sense, because you’re now building the operational system directly inside Fabric. But this is something that Fabric has never touched before, or Microsoft hasn’t really been pushing until now. And this is true, and honestly the capability is definitely getting there, and it is crazy when you think of the whole breadth and scope of what
23:20 Fabric’s going to be able to do. However, let me push back. So, don’t you want to know why most people have all their data in databases, in SQL databases? I’d love to know why, tell me why. You said, do I want to know; I’m like, yeah, I want to know. Why wouldn’t I just put them in an Excel sheet, Tommy? Why a database? It’s SQL: the universally accepted way of storing and processing
23:52 transactional data, with one of the most popular languages in coding, well, what some people would consider a coding language. To be able, one, to query and extract data, and then on the other side, the database administration side of things. Yes, does Fabric have the security roles, workspaces? Great, awesome, but again, that is only really unique to the structure of how the security
24:23 levels are for Fabric. In every database I have a user with roles, I can read or write, and I can control all of this if I’m a database administrator; and again, we’re dealing with that one person here, in the same language. If I move from any company to any company, it’s all through SSMS or whatever the SQL management tool is, because it is the number one tool to get data. Now the AI
24:54 feature, and the one where we’re building the .NET app: there’s a ton of data engineering here that’s required, we know that. That’s not data engineers, it’s .NET, I know, I’m saying. And the AI one, that’s not that persona though. I would sorely disagree with you: a data engineer may be using or creating a SQL database and loading data into it, they would be comfortable with that, but the scenarios that they published here, those scenarios are not data engineering scenarios; those
25:25 are app developer or developer-centric solutions, and they just happen to be using a SQL database to build out part of those solutions. Maybe the AI one is a little bit more towards the data engineer, because we’re taking a lot of existing data, doing some AI to it, and then popping out an answer or some query that comes back from that. So maybe that’s a scenario here. But again, I just want to be clear here: I feel like
25:55 at this moment I’m seeing a shift in the different personas that need to start using Fabric, or could use Fabric for that matter, right? Traditionally it’s been the data engineer, the data scientist, and the analyst; those are the three personas that Microsoft really resonates with as the people that should be using Power BI and Fabric. This article, and particularly these scenarios, makes me introduce a fourth user: the app developer, or the pro developer, that’s going to be building things. And
26:26 my note here is, does this make sense? My question is, is this something that the developer community is going to embrace, or are they going to say, you gave me something, but I don’t have enough control, there aren’t enough knobs, not enough buttons, right? I worry about that: yes, it’s here, you can do some simple things with it, but will it be a full-fledged database that I’m really going to use for my operational data? Is that going to be a thing that we’re going to use? And
26:56 here’s one of my cautions, as I’m unpacking this idea or concept: what happens when it throttles? So my question here is, I’m paying for a certain amount of compute units, let’s say I’m on an F8 Fabric SKU, and for whatever reason that SQL database gets hammered. What throttling or limitations are there, will the queries just start failing at some point? Do I have the same behavior as with a SQL
27:26 database from Azure, where it auto-scales up to a certain level and I pay for what I use? I think the pricing model shifts here, because now we’re paying for a certain amount of compute units, and we’re expecting the bursting or the throttling of the Fabric capacity to absorb any additional load from the SQL database. So I just have some concerns there. I don’t have any testing that supports this, I don’t know if anyone’s using this; in my companies we’re not using it yet, so I’m still trying
27:57 to understand, is that going to be a risk? Maybe it’s not, maybe I’m overthinking things, maybe the smoothing and the usage of that SQL database won’t cause a problem, but I
28:06 database won’t cause a problem but I I feel like if I’m using operational things I want High guarantees that that system that database will autoscale up to the size that I need regardless of usage because I can’t have production going down because I didn’t pay enough or didn’t buy enough fabric capacity it feels like a risk to me to be honest though and I think it’s very fair but I think with the Lakehouse if you’re using data flu gen to really all the cost is still
28:36 a big thing, but to me that wouldn't be a deal breaker, right? Because odds are you, or the company, or someone you're going to work with, is already relying on this and already has this cost structure planned out. There's nothing new here in terms of the most universally accepted way to store, transact, push, and read data in an organization, and we can't
29:08 get around that. Also, the chat is awesome today; you're very engaging and I love the comments you're bringing. I'm going to call out Enterprise AR, who has a really good point here: the group that traditionally handles the database stuff, the on-prem SQL database administrators, is typically different than the business intelligence group. That's a different set of people, and even in 2024 and 2025, the idea of control and
29:41 security is still going to be an issue; we need fine-grained controls on this one. Right now, when I look at a workspace, the boundary of the workspace is the security boundary between these different objects. We've already talked about the best practice of putting models and reports in two separate workspaces, because then I can control the models independently of the reports. We've also talked about how, if I'm adding a lakehouse, maybe the lakehouse gets its own workspace so I
30:12 can control the lakehouse and the table structure independently of the semantic models, and control those independently of the reports. Depending on how big your reporting organization is, I think you need to consider more workspaces to break out these different things. But now, with the addition of SQL databases, does that mean we get a fourth workspace that controls the operational SQL databases? I think this also ties heavily into licensing, because, back to my point earlier, if things start getting throttled and
30:42 you don't have enough capacity, or maybe I'm running a notebook job in the middle of the day, does that take away so much compute from my SQL database that it can't return queries? That's unacceptable; I can't have that happen. So in order to have clear breakpoints between the different Fabric environments, and I'm just spitballing here as we think about this out loud, maybe the SQL databases get their own workspace, and potentially even their own capacity. So I do think
31:13 there's something here. I'm not sure exactly how it's all going to fit together, but I'm still unpacking the impact this is going to have on the market. Well, let's quickly go back to your scenario about the architect and the application. What I want to make sure gets said is that I don't think the goal with Fabric SQL databases is to move all of your databases to Fabric. That's not the case, because we know about all the different
31:43 scenarios out there. I'm not going to put a number on it, but unless you're in a niche case where you need to store your data, process it, have other applications and APIs read it, and build reports off of it, you probably already have a database that's
32:13 already being used, and it's probably not terribly custom. It's not an elastic database doing all these fancy things where you'd need something custom regardless. And I think the other point here is that we're not saying, hey, company, you have 25 databases with all these tables, so your goal is now to migrate all of that to Microsoft Fabric. I don't think that's how we see this. So let me unpack what you said there.
32:43 You're proposing that for new databases, if I have a new project showing up, if I'm going to do something brand new, that's where I'm probably going to start with a Fabric database. I'm a smaller company, I have some stuff I'm exploring, and I think I need a database now; okay, let's turn it on and start using it as the interaction layer between the Power App and storing the data for reporting later on. Is that what you're thinking? Yeah, that's really right, I think,
33:15 from a getting-started point of view; I agree with that. And teams that already have SQL on-prem may not initially start with Fabric; they're probably going to keep their existing pattern of using SQL. And again, I have no problem using Azure SQL as opposed to Fabric SQL; there's no issue there. There are different features, and the billing methodology is different. I think of it the same way as
33:46 working with Databricks and Fabric. Databricks can engineer the data, and Databricks is more of a pay-as-you-go model, as opposed to Fabric, which is a prepaid model: you prepay for what you want. You can still do all the data engineering, you still have notebooks, you still have Python, and all those rich data engineering experiences exist in both tools. But now you have more options to pick from: do I go down the Databricks route, is that the right solution for us, or do we migrate our solution over to solely using
34:16 Fabric? What is the dividing line between these different tools now, and how do they play well together? And let me do you one better, because it's not only new scenarios where we're going to create a database. I can guarantee you there are a lot of niche cases where your data is all over the place. We've already talked about the Lakehouse, and I think the real competition for the database, from a creation point of view, is our Fabric lakehouses,
34:48 because here's what I probably have in my organization: I'm working on a particular team, we don't control the SQL admin, and the data is coming from all these different systems; it could even be coming from Excel. We just want to have it clean and structured. That's what Enterprise AR was saying earlier in the comments: the team that manages the SQL databases is different than the BI team. So, to your point, Tommy, if you are an established
35:18 company, the team that owns that is somewhere else. So the question I would ask back to you, Tommy, is: do we start enticing that team to bring their workloads over to Fabric? Oh, 100,000%. Really? Yes, absolutely. If I could have had this with some of my teams, to have a little more control over that data and how it's coming in, holy crap, yes. So I feel like there's an incoming article here, either from you, Tommy, or from the community. If you're out there listening, this might be an
35:49 opportunity for you. I'm not a SQL DBA; I work with SQL and I do a lot of things in it, but do I administer a bunch of databases? Nah. I work in Fabric; Fabric is my space. I like working with notebooks and lakehouses; that's the technology stack I've aligned to. If you are a SQL DBA, there's probably a great article out there that says, hey, help us migrate from Azure SQL into Fabric SQL. What does that migration path look like? How can you do it? Can someone provide
36:21 some automated scripts that take this old SQL database, create the backup file, get it into Fabric, and boom, now you're up and running in Fabric? It'll be interesting to see what articles and tools the community produces here, because I believe the SQL data community is extremely large, maybe not as large as the Fabric community, but it is a huge community, and I still haven't heard a lot of news from them or their excitement
36:52 around getting to Fabric with SQL. So you bring up an interesting scenario here. When we say we're going to move our resources over to a Fabric database, did you have in mind that they already have a database or something structured, or that they're starting from scratch? Okay, interesting. So I'm envisioning here that one of the arguments I've heard for Fabric is that it just works: I can just turn it on, it just runs, no big deal. You can
37:23 drop directly into Fabric and, with a couple clicks of a button, you've turned on
37:28 a pipeline, you've turned on a notebook, you've turned on Spark, all these things that we would traditionally have to go to Azure to build, wire up, and pull together. It just works inside the context of Fabric. I love that; I think that's the right approach. So yeah, Tommy, you're right, maybe I'm just exploring things. But the reason I'm bringing this point up is that the individuals who understand the power of this, who have built real solutions on top of SQL databases, they're not going to come to Fabric and say, I've been using lakehouses this whole
38:00 time and now I'm going to try to build something in a SQL database. That's not this user. That user already has established processes and built solutions in other places, whether it's SQL on-prem or SQL in Azure. So I think the potential here is that if you want to get that developer community into Fabric, you're going to need to appeal to the individuals who already have solutions, and you're
38:30 going to want to entice them to migrate into Fabric. What additional cost incentives do you give them? What additional licensing incentives? One of the notes in the Fabric decision guide talks about versioning: you only get serverless inside Fabric, so there's no provisioned version of this one. And then there's a hardware consideration: in Azure SQL Database you have Gen5, Fsv2, or
39:02 DC-series, three different hardware generations of the Azure SQL database. In Fabric you don't get to pick; it's just the latest version. So if you're giving me the latest, most robust version inside Fabric, then yeah, maybe I'm actually more interested in moving over, because say I'm on Gen3 and we know we need to migrate up to Gen5; there's some friction there. Well, if you're going to do the migration anyway, why not migrate right into Fabric and have it always be current? As I'm saying this,
39:35 I'm even unpacking this as we go, Tommy. Another thought: one of the scenarios they spell out at the bottom, the AI scenario, talks about operational data and mixing it with Lakehouse data. Up until this point, I had not been considering mixing those two things. Imagine you have operational data flowing in, and you regularly need access to a master data table, company information,
40:06 things that are processed in batch, but that you need access to while you're doing the operational portion of things. This SQL database in Fabric allows you to query other things; they're talking about the multi-model capabilities of SQL: querying relational tables, a graph database, going after JSON and key-value data structures. So I think what they're also proposing is that not only are you getting SQL,
40:37 you're getting SQL right next to the Lakehouse, the KQL database, and the other database systems that could be feeding your company. Now it's all in one place, so I can run the operational portion in Fabric, but it can also go get information directly from the stationary, non-operational data. To me that's really impactful; it can change the designs I was traditionally doing. It makes it a lot easier to go get
41:07 information: instead of writing a stored procedure to copy the data out and synchronize it one more time between my data warehouse and the SQL database, I can just access it directly with a query. That might be an enticing feature that draws people in. We've got an article coming from you, Tommy. I'll have to figure out how to do this, because there's a portion of this that's operational, but there's also a portion that could be queried off the warehouse, and maybe that's what they're trying to do here: make data access easier.
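As a rough sketch of what that pattern could look like, assuming cross-database queries with three-part names work across items in the same workspace (they do on the warehouse side today; whether the operational SQL database supports the same syntax against a Lakehouse is an assumption here), and with every table and item name below hypothetical:

```sql
-- Runs in a hypothetical Fabric SQL database that holds operational orders.
-- SalesLakehouse is a hypothetical Lakehouse in the same workspace whose
-- tables hold batch-loaded master data.
SELECT o.OrderId,
       o.OrderTotal,
       m.CompanyName,
       m.Region
FROM   dbo.Orders AS o                          -- live operational rows
JOIN   SalesLakehouse.dbo.CompanyMaster AS m    -- lakehouse table, three-part name
       ON o.CompanyId = m.CompanyId
WHERE  o.OrderDate >= DATEADD(DAY, -1, SYSUTCDATETIME());
```

The appeal is the shape of the access: one join instead of a stored procedure that copies the master data over and then has to be kept in sync.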
41:38 In general, I would say Microsoft is always trying to reduce friction and make data more like a commodity that everyone can go get and build with, and I think that's the right approach. But I'm just trying to unpack the idea here. What do you think? I don't know about you, but it's amazing we can do that. Maybe I'm just a simple person, Mike; I live by simple rules. In the world of our old database days, I knew there was a dev server
42:08 and a production server, and that's where our data lived, and if anything happened to it the company would basically fall off a cliff. Well, now we're saying: I'm going to integrate a lakehouse with a database, and I'm also going to integrate mirroring from other databases into my Fabric database, but I also have these standalone lakehouses, and a warehouse that also speaks SQL but isn't a database. I have all these things convoluted together, and here's my worry right
42:39 now: which one is the source of truth? Yeah, we'll have the lineage, but we haven't really been discussing data engineering or those stage-by-stage kinds of jobs; we're just talking about storing the data right now. So I'm going to pose this to you: does this complicate our sources of truth and, in a sense, where the data
43:11 lives? This is a common message you bring up, Tommy: where is the source of truth, where does the truth of something live? I'm a little less worried about that. I think Fabric also provides a high level of capability for me to build what I'm going to call DataOps, and without DataOps as part of your plan, you get into the scenario of: where
43:41 does the source of truth come from? Tommy, we've discussed this many times on the podcast, and this is probably one of the bigger areas where I disagree with you: how much trust am I willing to give to various teams about information? What do I want to own, and what handshake do I want to give to that business unit? I draw this diagram a lot for the class I run on Power BI administration and governance. I draw the line and say, look, we have many
44:11 different control surfaces for how we handle data for users. One of them is the apps: the org apps and the regular workspace apps. That's the app level. Then we can step back behind that and say, okay, I can bundle packages of reports and give them to you using an app. Well, maybe the team is more capable than just consuming a report from an app; maybe they want more than that. Okay, then I peel the onion back a layer and say, here's a workspace with some reports in it. You can look at those reports, build your own reports, and modify the existing reports that
44:42 are in the workspace; now you have access to the report layer, so you can build your own reports, tables, and visuals. That's another layer of control. And you can keep stepping further and further back, to the semantic model, to the Lakehouse tables, and we have to decide: do I give you access to gold data only, or do I give you access to bronze data, or to the raw data coming in in the raw layer? So to me, if I look at this scenario, a lot of this
45:13 is trusting another team to understand what to do with the data, and shifting responsibility from one team to another. I think organizations are already doing this, and I don't really like the idea of having a single team own everything and not letting anyone else touch the data or experiment with it. This is why I really think the concept of certified datasets, certified content in Power BI, is extremely powerful: when you find things that should be
45:44 certified, that's how you control things. Another recent conversation I had around this: let's imagine, Tommy, you're in a position at a company for a year. The number of semantic models and reports you build will be at maybe a minimal or a medium level. You're going to build some semantic models, deploy reports, and get things out in the org applications. There's a certain volume of information that
46:14 you control and that you, Tommy, own as part of the BI team. As the organization drives more adoption and builds more things, let's fast-forward five years from that initial starting point with Fabric. How many more things are you going to have? Double, triple, ten times more? There are going to be a lot more artifacts and items in that workspace. So if it's just you, Tommy, trying to own everything, a one-year span versus a five-year
46:44 development span: I don't think one person can scale themselves well enough
46:48 to maintain all the additional artifacts and items that are going to be created in five years. You're going to have a whole bunch of old things you need to deprecate and a bunch of reports you need to move on from; the semantic models are going to continually need changes, and there's going to be new data to add. Either the team grows in size to handle all the data needs you have, or you start delegating out to other teams. So I think there are really only two scenarios here: you either grow your team, or you give more ownership of those items to
47:18 the broader part of the organization, and you say: Tommy, you control what you can control. You own these data tables, you own the data engineering for this portion of the pipeline, that's your job, and you own the responsibility for it. I don't really know where I was going with all this; I'm literally getting on a soapbox and ranting for ten minutes, and I don't know exactly how this relates to SQL databases. But I feel like you have to think through these scenarios, and we can't assume that Tommy could
47:50 manage everything from a one-year span all the way up to a five-year span. There has to be a strategy to figure out: okay, in five years, some of the content you make is amazing, it's gold, it's what runs the business, totally get it; we need to prioritize that content over all the other things. Michael's building a random report on top of some Google Analytics data that I use for a week and then throw away. Tommy, you shouldn't be managing
48:20 that; that should be on me. But maybe some of the data I need comes from Tommy's master data about our products, and that's something important that can be reused across the organization. So it's those things: identifying what data is relevant, giving ownership of those things, and then allowing the rest of the organization to consume that data. Donald is also in the chat, and I'm going to point out his comment. Donald references Kimball, the Kimball star schema; I
48:50 don't remember Kimball's first name, but the Kimball Group publishes documents around star models and data warehousing, and Kimball recommends that users should only have access to the gold layer. I agree with this. But then I would also argue, Donald, and I'd be curious for your thoughts if you're still in the chat: what about the semantic model? Is the semantic model considered gold layer, or is it gold-plus? Or would
49:23 you change whether the semantic model lives in gold or not depending on the capabilities of the team you're giving those gold tables to? I'd be curious about your thoughts on that one as well. Tommy, up to you: what do you think, did I just soapbox a random thought that's not relevant? We're a little off track; we're talking about gold layers here. Pull me back in, let's get back on track. You need to tag it, you need to tag it. So I think the two biggest things,
49:53 and this is something we've all known with Fabric too, but especially now that we introduce databases, and I'm going to wrap up with my closing thoughts: the ability to have databases in Fabric is not only an awesome, much-needed feature, it's something I really wish I had six years ago. How many projects and teams was I working on where I just wanted their data to come in a cleaner way? We tried SharePoint, we tried all these things, but if I could
50:25 just, with the click of a button, start a database and start pushing their data in through a very friendly UI? Sign me up. Great. However, you have to be mindful of something that goes hand in hand with Microsoft Fabric today: you don't work in a vacuum in Fabric. Unlike a Power BI Desktop report, where I can go crazy and if I mess up, well, it's not published yet, and all
50:56 the artifacts I can go back and delete, in Microsoft Fabric you're playing in a shared playground, not in your backyard. So whatever you test, try, or do is not necessarily always going to be visible, but you're going to add to the clutter. You almost have to have the mindset of the administrator, of governance, even when you're testing and creating in Fabric. So I'm going to put my final thoughts here. I liked your final thoughts there, Tommy, and I think this is good.
51:27 I think we have to be mindful that Fabric is designed for collaboration, bringing multiple teams together, and we have clearly added a new persona to the mix. We are no longer just data scientists, data engineers, and analysts in the Fabric environment; the addition of the SQL database really reaches toward the developer. Now we want developers to come and land their data inside Fabric as
51:57 well. One thing I'll point out as my final thoughts, as I'm reading through the article from SQLBI, and I'll put this one up here as well: really good article, by the way. If you haven't read it, definitely go read the database article from SQLBI; it's worth a read. Marco has been doing this forever and he is an expert in SQL; that's where they came from, since Analysis Services is a solution born out of SQL. He makes a note in the middle of the article which
52:27 I really want to emphasize: is the SQL database something we're now replacing datamarts with? The statement is that the SQL database is a fully-fledged operational database inside Fabric, and the question is whether what we get today with the SQL database is what was aimed for when datamarts were initially released. Marco says definitely yes: when datamarts were released, we really wanted them to be a SQL
52:57 database; the SQL database engine is what we really wanted, and I agree with that statement. The second question he asks, which I think is very relevant, and where he gives you the consultant answer, is: is this new SQL database in Fabric going to replace any of the other data stores in Fabric? He says it depends. So my final thought here is that we really need to understand: where is this
53:27 SQL database good? What is the design goal of that SQL database? If you need real-time data and ACID transactions on row-level information, and you're going to work with row-level details of data, I think the SQL database is a really good solution. If you're going to build apps on top of things, and you need those apps to store their data somewhere and use an operational store, a Fabric SQL database is a good solution; that would be a wise place to put your data.
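A minimal sketch of the row-level, transactional write being described here, with hypothetical table and column names; the point is the ACID shape of the workload, not a production pattern:

```sql
SET XACT_ABORT ON;           -- roll the whole transaction back on any error

BEGIN TRANSACTION;

-- Decrement stock and record the order as one atomic unit.
UPDATE dbo.Inventory
SET    QuantityOnHand = QuantityOnHand - 1
WHERE  ProductId = 42
  AND  QuantityOnHand > 0;   -- guard against overselling

INSERT INTO dbo.Orders (ProductId, Quantity, OrderedAt)
VALUES (42, 1, SYSUTCDATETIME());

COMMIT TRANSACTION;          -- both changes commit together, or neither does
```

A real app would also check @@ROWCOUNT after the guarded UPDATE before committing; the sketch only illustrates why this kind of write belongs in an operational store rather than a batch-oriented Lakehouse.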
54:00 If you're going to look at things and make decisions in batch once a day, load things historically, and you don't need information in under a minute, then I think we need to look at other solutions: the Lakehouse, what KQL can do, how we're getting other data in, whether there's an eventstream we should be looking at. So yes, I do think SQL is a very good option, but there are also a lot of other tools, and what this does for me is make me more critical about all the different computes and tools we have in
54:30 Fabric, and about figuring out which one is the right compute for the right data design. I think this is only going to get harder as we continue to add more and more tools inside Fabric. Anyway, really good discussion here. I really like Marco's article and would definitely recommend you go read it; check it out. I think this is actually a good feature. All right, that being said, thank you all. We're just about at an hour now, so thank you very much for your time. I hope you have found this conversation
55:00 informative and that you enjoyed talking with us about what Fabric SQL databases look like. Let us know in the comments: are you using Fabric SQL databases? Do you have plans to use them in the future? What project would you use them for? We'd love to hear from you and react to some of the projects you're looking to push onto Fabric SQL databases. With that being said, we don't promote the podcast at all; you are the promoters. So if you liked this episode, if you liked what you heard today, we appreciate you reaching out on social media or
55:30 letting other people know in your area that you enjoyed the podcast and the conversation. Tommy, where else can people find the podcast? You can always find us on Apple, Spotify, or wherever you get your podcasts; make sure to subscribe and leave a rating, it helps us out a ton. If you have a question, idea, or topic you want us to talk about on a future episode, head over to powerbi.tips/empodcast and leave your name and a great question. And finally, join us live every Tuesday and Thursday
56:00 at 7:30 a.m. Central and join the conversation on all of PowerBI.tips' social media channels. Thank you all very much and we'll see you next time. [Music]
Thank You
Want to catch us live? Join every Tuesday and Thursday at 7:30 AM Central on YouTube and LinkedIn.
Got a question? Head to powerbi.tips/empodcast and submit your topic ideas.
Listen on Spotify, Apple Podcasts, or wherever you get your podcasts.
