PowerBI.tips

Dataflows Gen 1 vs Fabric SQL for Reference Tables – Ep. 399

February 19, 2025, by Mike Carlo and Tommy Puglia

Mike and Tommy compare two practical ways to manage small-but-critical reference tables: Power BI Dataflows Gen1 versus a Fabric SQL database. They break down tradeoffs around refresh, governance, CI/CD, and downstream consumption so you can pick the simplest option that still scales.

News & Announcements

  • Submit a topic idea / mailbag — Have a question you want covered on the show? Use the submission form to drop your topic ideas and real-world scenarios. The best episodes usually start with a messy, practical problem from the field.

  • Subscribe to the podcast — Catch the live stream every Tuesday and Thursday and find links to listen on Spotify and Apple Podcasts. If you’re sharing episodes internally, this page is the easiest “one link” hub.

  • Tips+ Theme Generator — Mike and Tommy call out Tips+ as a fast way to generate consistent Power BI themes. If your org struggles with brand compliance (fonts, colors, visuals), a repeatable theme workflow saves hours of rework.

Main Discussion: Dataflows Gen 1 vs Fabric SQL for Reference Tables

Reference tables are the unglamorous backbone of good BI: things like department lists, status mappings, scenario definitions, calendar attributes, and “friendly name” tables that drive slicers and relationships. The episode focuses on a common question:

Where should these small tables live so they’re easy to maintain, govern, and reuse across reports?

Mike and Tommy compare two patterns they see constantly:

  1. Power BI Dataflows (Gen1) as the place to land and curate reference tables using Power Query.
  2. Microsoft Fabric SQL (a SQL database) as the system of record, with Power BI consuming those tables.

When Dataflows Gen1 Are a Great Fit

Dataflows Gen1 shine when your reference tables are:

  • Power Query-first (you’re already shaping the data in M)
  • Owned by the BI team and maintained alongside datasets/reports
  • Small and refresh-based (daily/hourly refresh cadence is fine)

The big wins are speed of delivery and familiarity: analysts can build and adjust logic in Power Query, reuse the output across multiple datasets, and keep “BI-owned” mappings close to the semantic model.
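As a concrete, entirely hypothetical illustration of what that curation logic does, here is the shape of a "friendly name" mapping sketched in Python rather than M; the codes, names, and function are invented for the example:

```python
# Hypothetical example: the kind of mapping table a BI team curates in a
# Dataflow Gen1 with Power Query, sketched in Python for illustration.
# The codes and names are invented, not from the episode.

STATUS_MAP = {
    "O": "Open",
    "C": "Closed",
    "H": "On Hold",
}

def curate_status_reference(raw_codes):
    """Build a deduplicated reference table (code -> friendly name),
    flagging unknown codes instead of silently dropping them."""
    rows = []
    for code in sorted(set(raw_codes)):
        rows.append({
            "StatusCode": code,
            "StatusName": STATUS_MAP.get(code, "Unknown"),
        })
    return rows

# Source systems often emit duplicates and stray codes:
table = curate_status_reference(["O", "C", "O", "X"])
```

The point is the reuse: once this output lives in a dataflow, every dataset that needs the mapping consumes the same curated rows instead of copying the transformation steps.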

Mike and Tommy also discuss the reality that many teams already have Gen1 dataflows in production. If it’s working and the tables are truly reference-scale, you don’t automatically need to move.

Where Dataflows Gen1 Start to Hurt

The conversation highlights common friction points when reference tables become more operational:

  • Governance and ownership — Once multiple teams depend on the same mapping table, you need clearer stewardship than “whoever last edited the dataflow.”
  • Change control and CI/CD — It’s harder to treat reference data as a product when updates happen through the UI without a tight deployment story.
  • Refresh dependency chains — Small tables still create outages if downstream models can’t refresh until the upstream flow runs.

In short: Gen1 dataflows are great for BI-managed curation, but they can become a bottleneck when they turn into shared infrastructure.
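The refresh-dependency point is easy to make concrete: a downstream model can only refresh once every upstream flow it depends on has run. A minimal sketch of that ordering, with hypothetical item names, using Python's standard-library topological sort:

```python
from graphlib import TopologicalSorter

# Hypothetical dependency chain: a sales model reads a dataflow,
# which in turn reads an upstream source extract.
deps = {
    "SalesModel": {"RegionMappingFlow"},
    "RegionMappingFlow": {"SourceExtract"},
    "SourceExtract": set(),
}

# Everything upstream must finish before SalesModel can refresh; if
# RegionMappingFlow fails, SalesModel is blocked even though the
# mapping table itself is tiny.
refresh_order = list(TopologicalSorter(deps).static_order())
```

A three-row mapping table sits on the critical path exactly like a billion-row fact table does, which is why "small" does not mean "low risk".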

Why Fabric SQL Is Attractive for Reference Data

Moving reference tables into Fabric SQL (or any SQL store) gives you:

  • A clear system of record — SQL is a natural “contract” for shared tables.
  • Better multi-tool consumption — Not just Power BI: data science notebooks, ETL pipelines, apps, and other BI tools can read the same tables.
  • Stronger operational patterns — You can align with database practices like permissions, schemas, and structured change management.

Tommy frames this as “put the table where it belongs for the organization,” not where it’s easiest to build in the moment.
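The "system of record" idea can be sketched in a few lines: once the table lives in SQL, any client that can run a query gets the same contract. In-memory SQLite stands in for a Fabric SQL database here, and the table and column names are invented:

```python
import sqlite3

# In-memory SQLite stands in for a Fabric SQL database in this sketch.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE dim_department (
        department_key  INTEGER PRIMARY KEY,
        department_name TEXT NOT NULL UNIQUE
    )
""")
conn.executemany(
    "INSERT INTO dim_department VALUES (?, ?)",
    [(1, "Finance"), (2, "Operations")],
)

# Power BI, a notebook, a pipeline, or an app would each issue the
# same query and see the same rows -- that is the shared contract.
rows = conn.execute(
    "SELECT department_key, department_name "
    "FROM dim_department ORDER BY department_key"
).fetchall()
```

The schema, keys, and constraints are the contract; consumers depend on the query surface rather than on a BI-tool-specific artifact.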

Decision Framework: Which One Should You Choose?

Mike and Tommy keep it practical: pick the option that matches the blast radius and reuse level of the table.

Consider Dataflows Gen1 when:

  • The table is BI-only and mostly supports one semantic model or a small set of reports.
  • The transformation logic is easiest to express in Power Query.
  • The updates are infrequent and a refresh-based workflow is acceptable.

Consider Fabric SQL when:

  • Multiple teams/apps need the table and it should be treated as a shared data product.
  • You want a cleaner CI/CD story and separation between data storage and reporting.
  • You expect the “reference” table to stop being small (more rows, more joins, more constraints).
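One way to read the two checklists above is as a single rule of thumb: any "shared infrastructure" signal pushes the table toward SQL, otherwise Gen1 is fine. A tiny sketch encoding that guidance, with invented flag names:

```python
def reference_table_home(shared_beyond_bi: bool,
                         needs_cicd: bool,
                         will_grow: bool) -> str:
    """Rule of thumb from the discussion: any shared-infrastructure
    signal pushes the table toward SQL; otherwise Gen1 is fine."""
    if shared_beyond_bi or needs_cicd or will_grow:
        return "Fabric SQL"
    return "Dataflow Gen1"

# A BI-only mapping table with rare updates:
home = reference_table_home(False, False, False)

# A mapping consumed by pipelines, apps, and several teams:
shared_home = reference_table_home(True, False, False)
```

This is only the episode's guidance compressed into code, not a product recommendation; real decisions also weigh team skills and what is already in production.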

Practical Tips for Either Approach

A few tactics come up that help regardless of where the table lives:

  • Document intent: a reference table should have a defined owner, update process, and downstream consumers.
  • Design for joins: stable keys and consistent datatypes beat pretty names every time.
  • Keep it boring: the more “core” the table, the less clever the transformation logic should be.
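The "design for joins" tip fits in one DDL statement. SQLite syntax stands in for Fabric SQL, and the column names are illustrative: the surrogate key is what relationships bind to, while the pretty display name stays free to change.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Stable integer key for joins; the display name can be renamed later
# without breaking any relationship that uses the key.
conn.execute("""
    CREATE TABLE dim_status (
        status_key  INTEGER PRIMARY KEY,   -- stable join key
        status_code TEXT NOT NULL UNIQUE,  -- source-system code
        status_name TEXT NOT NULL          -- friendly display name
    )
""")
conn.execute("INSERT INTO dim_status VALUES (1, 'O', 'Open')")

# Renaming the friendly label does not touch the join key:
conn.execute(
    "UPDATE dim_status SET status_name = 'Active' WHERE status_key = 1"
)
row = conn.execute(
    "SELECT status_key, status_name FROM dim_status"
).fetchone()
```

Joins keep working through the rename because nothing downstream references the pretty name; that is the "stable keys beat pretty names" tip in practice.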

Looking Forward

The bigger takeaway is that reference tables aren’t just “tiny data.” They’re shared contracts that shape every report experience. As your Fabric adoption matures, Mike and Tommy encourage treating widely reused reference data like a first-class product: give it a home that supports governance, reuse, and predictable change.

Episode Transcript

Full transcript, with timestamps:

0:33 Good morning and welcome back to the Explicit Measures Podcast with Tommy and Mike.

0:38 Good morning, how's it going Tommy? Oh my gosh, good morning, I am so stoked about today. The topic, or just in general? Because it's a Tuesday, and it's always a great day to be a Tuesday when we do the podcast, but especially with our topic. True, this is one of the only highlights of my Tuesdays; after this it's all downhill. No, I'm the same, I'm just teasing, Tuesdays aren't that bad. We don't really have any other news or announcements of things; if

1:09 anyone has any topics or news, send them our way. We looked at the Power BI blog and the Fabric blog, and there wasn't a whole lot of extra things coming out. But Tommy, we have a little bit of an opener here; before we get into that, let's talk about our main topic for today. Our main topic is going to be Dataflows Gen1, comparing that to Fabric SQL for reference tables. Reference tables are these things that are smaller; they're supporting either data loading processes, or

1:40 they're setting configurations up for your pipelines, or they're something that you might use inside your data models: smaller tables that support and supplement your main data model. So where should we put these things now? Do we change where we land our data? Is it different now that we have a proper SQL database inside Fabric? We'll unpack that a little bit, talk a little bit more about our experiences with SQL Server so far inside Fabric, and that'll be our main topic. All right Tommy, let's go do a

2:10 quick little opener here. I've been doing a little bit more training in recent days, and I'm curious where your training is. What do you feel the blend is of companies or customers asking about Fabric versus Power BI training? Where do you feel that's going right now for you, or is it just ramping up in general? I'm curious. That's a great question, because you and I have been doing training for a while. I've been a full-time trainer for the past three years and did it on the side for a while, and you're very similar

2:40 in terms of just doing Power BI training. Honestly, if I look at the trend, for me I'm getting more requests for Power BI. I'm still doing Dashboard in a Day; it's not going out of style. Still really popular is straight Power BI training, general training for people migrating to Power BI. We're still doing advanced training around data modeling or semantic modeling, and still DAX. If you were to name

3:13 the three top things, it's semantic modeling, DAX, and Dashboard in a Day. I would say DAX is probably by far the most engaging part of the topics that we work on. I do a lot of online training as well, and videos on YouTube, and the ones that get the most engagement are all around DAX topics for me. Anything that's DAX related seems to get a lot of hits; people are really interested in learning more about it. I would also argue DAX is easy to start but difficult to

3:43 master, so you need extra training around how to work with it, how to get better at it. I feel like that's very relevant. I would say this for me: a lot of my training still centers around Power BI related things, but I am now adding a lot more flavor of what Fabric is. More often I feel like customers are using Power BI for their general reporting for most of their organization, and they're starting to explore what it means to do Fabric. What should I know

4:14 around Fabric? And I'm finding, instead of doing pure training on Fabric, it's more like: we have a project, we think it might fit Fabric, and we need some guidance to make sure that we're doing it right. So it's less about training and more like on-the-job project building, but small, like we need someone just to advise us: are we doing this right, is this something we can actually do inside Fabric? Very good way to put it. It doesn't feel like a

4:44 pure training, because people are like: yeah, training is interesting, but why would I want to go connect to the random New York Taxi dataset? Interesting, yes; can I use a notebook, fine. But I think more companies are interested in this: if I'm going to spend the money on investing in something to learn, let's learn on a very small project and spend the money on the training for that, or a consultant to help us come do it. And by the way, the people are going to learn this stuff; they're going to learn how to use notebooks and lakehouses and what the structure looks like there. I think that's

5:15 part of the education that's coming along right now. If there are two pools, the Power BI pool and the Fabric pool, right now everyone's still learning how to swim, how to do a backstroke in the Power BI pool. What I'm getting asked to help with in the Fabric pool is: how cold is it if I put a foot in? A lot of companies are putting a foot in the water, but they're not ready to simply dive in and learn how to swim in the Fabric pool. I hear your analogy, and I will raise you one analogy. Oh, I like analogies, let's go.

5:47 I feel like Power BI is the pool and Fabric is the hot tub. Or the ocean; there's a lot less people in it. Very true. This is the selective one; when they change out the water it's like: okay, playtime's over, this is now the exercise pool. Adult Swim. Yeah, Fabric's Adult Swim. You know, the one thing that's actually probably changed the most is not so much the content, interestingly enough, it's

6:18 the persona, the type of people that I'm actually training. Interesting, okay. When I first started training, obviously it was all business intelligence people: that was their full-time job, they got paid to do Power BI or were going to get paid to do Power BI. I'm getting more and more requests for the managed self-service side of training, where we already have a Power BI shop, but our team is going to be building reports and building report-level measures; we're not going to be doing as much of the semantic modeling,

6:51 we're more around the business side, that liaison, that middle hybrid role rather than a dedicated business intelligence role. That's what's changed. I think you're right, I would also agree. With the people that we're talking to, I call it a spectrum. There's definitely people that are pure analysts, doing data analysis or data analytics, and then there's data engineering, and data engineering would be much more on the

7:21 heavy tooling side, a lot more code, looking at tables and producing tables of data. And I think data modeling fits somewhere in the middle between these two personas. As you look at the spectrum of analyst to pure data engineer, the lines for me are continuing to blur, because of the tooling. I guess how I'd say this is: traditionally the tools would define the role, right? It was a SQL Server and you're a DBA, data engineering; that's the role that

7:52 you play. You're using Business Objects and you're taking dimensions and measures and facts and putting them together, making tables or visuals of data, and that means you're clearly on the analyst side. That line was very clear, because what you had access to in the tool was the only thing you had access to; you could only do that part of the job. I think what's happening now is more people are having access to other parts of the tool that

8:24 are less restricted, so you're not only the data engineer or only the analyst; there's more variation. Does that make sense? I'm trying to say it's more of a gradient now than just pure black and white. And if I had to put a prediction on it, I think in the next 12 to 18 months we're going to see Fabric training, because it's kind of being forced on a lot of organizations. We're still at a point right now where Fabric is completely optional for an organization to purchase, from the

8:55 licensing point of view. I believe in April the Premium capacity is going away, the ability to purchase it, and the only thing you're going to be able to purchase is the Fabric capacity. Well, let's be clear; I'm going to be a little bit more refined about your comment. Yes, some Premium SKUs are going away, but not Premium in general. The ability to purchase the P SKUs, the EM SKUs, the A SKUs is going away; they're not going to be as attractive anymore, and you're only going to be able to, and want to, purchase the F SKU moving

9:26 forward. The F SKUs do everything: embedding from F2 all the way up to the highest SKU, it can be paused, started and stopped, it can be ramped up and ramped down in the middle of the month. All these things around the F SKUs make them a lot more flexible than the previous Power BI SKUs that they've been building. So it's not that Premium is going away; you still have Premium Per User. And Tommy, what you were talking about in April, you may have been alluding to the price increase that's coming in April as well: a Pro user goes from $10 to $14, a

9:58 Premium Per User goes from $20 to $24 US. So there's a $4 price increase coming for both licenses, which may have companies really step back and say: do we really need this, are we going to take that price increase, or do we need to figure out another way to use

10:16 our Power BI environment? And I believe, and correct me if I'm wrong, if I want to purchase Premium after April, I'm basically purchasing a Fabric capacity at that point. Yeah, there's some weird edge-case things; if you already own a Premium and you want to extend it a little bit longer, that might let you, but if you were a customer who doesn't have it, I don't think you can purchase it now; I think it's already blocked, you can't even get it today. You can renew up to a certain point, and then they're going to say: nope, you have to move over to Fabric. So I'm actually hearing organizations now

10:47 planning a lot of work around: hey, we need to migrate from one SKU to the other, we need to make sure everything's going to work in the new world that we're going to be building. And also: we're only interested in using Power BI, we're not interested in using Fabric yet, but we need to buy this Fabric SKU, so how do we turn off the Fabric things, because it's going to be more discoverable. You've heard that? Really? Interesting. Well, they're a Power BI shop; if they're paying for Fabric, they're like: we're using Power BI, I don't want to consume extra capacity

11:18 because we're still just Power BI, we're not sure yet if we're going to use Fabric at this point. Or they have other processes; there's companies out there who have already built things in Databricks, a lot of things they don't need in Fabric. And that's okay; they play well together, that's fine. People say that now, but that's not how behavior works: as soon as you open it up and it's available, people are going to start playing around with it once those options are actually on the screen. That's just behavior. But that's my point though; my point is organizations are

11:49 looking to say: look, we have to go to the Fabric SKU, but we need to limit access to who can build things in Fabric officially. And there is the ability to turn that off; you can disallow people from creating Fabric items. It's been a bit sneaky, I would say, for Microsoft, and to their credit, they want you to consume more, they want you to explore more. From their perspective, if I was Microsoft: look, we've built all these great new tools at your disposal, these could make your life easier in a lot of different ways, and we want to make

12:19 sure that people can easily discover them and use them. From their perspective, they're trying to help you out, they're trying to give you better tools. From the organization standpoint, not all organizations are willing to take that big a step, and they want more of a centrally controlled, slower rollout of those Fabric items. I feel the words that you're saying right now were just as true 10 years ago when Power BI Premium came out; organizations were not ready for Power BI in an enterprise setting. I think that's just

12:50 companies in general; they're always going to be more fearful of these advanced technology leaps. I was having conversations, geez, I feel like in 2019, 2020, with organizations going: this Premium, that's a big leap, that's a big jump for us right now, because we like Power BI Pro. And this is always how organizations are going to go. Well, I think again it's this story of balancing the value and the cost, and I think right now, even at $14

13:23 a person, Power BI Pro is still a great deal for a lot of people. And then think about the scenarios of when you're going to use Fabric: if you can build better models, more efficient models, your model stays smaller and you can actually stay on those lower, smaller F SKUs for certain things. If your organization has lots of big data, huge tables, you're probably not going to get away with using a small Fabric SKU to host those models. However, if you're a small to medium sized business, those smaller F SKUs are amazing.

13:54 I'm using an F2 SKU for my organization and my reporting. I have small data, it runs every night, I have a SQL Server that runs when it needs to, I have data loading from APIs every night and multiple times per day, and it does just fine for what I need. And with an F2 I can embed it into other applications. So this is one of my points as I unpack what's happening with the price increases: I think more organizations need to seriously look at whether embedding is something you want to do to bring your average cost per user down. I

14:26 think you really need to take a detailed look internal to your organization: what do people actually do, what do your users need from your reporting? Do you have a large amount of people that are just consuming reports, or do you have a lot of people creating reports, doing their own data modeling, and loading all their information in? Depending on what you have in your organization, if you can characterize what your users are doing, and if you can centralize or limit the amount of people who actually need access to powerbi.com, you actually could

14:57 save a lot of money by moving over to an embedded experience. And this is one of the reasons why I offer an embedded accelerator, to help users and companies save money on a price-per-user experience. The fact too is, to your point, the embedded situation is not a foreign concept anymore; a proof of concept is much easier to try now. Yes, I would agree. Anyway, all that to say: good things, I'm excited about

15:27 where they're going with things. I think Fabric is going to be the way to go moving forward, and I think they've made many entry points at prices that are going to let you do a lot of starting with Fabric. The value I think is there; the amount of stuff you can do even at an F2 level: I can make notebooks, call APIs, load data to a lakehouse. All this stuff really makes a lot of sense to me. So anyway, I think when we revisit this at episode 500, we're going to see a lot more Fabric training on our books. I would agree with you there; I think you're going to see a lot more of it moving forward,

15:58 but again, as we go from report builder down to data engineer, the audience does get smaller and smaller. I think that's one of the reasons why Power BI has had such great adoption: there are so many people who need to build reports, just get data together; that's a huge audience. But the more we get down towards the data engineering team, it's going to be less and less people, so I do think it's going to be a smaller funnel, though I think there will be more requests for it. You're making a point here that I will save for another episode, about the data engineer

16:29 side of it. I don't want to go too far off topic, we'll put this in the parking lot, but I'm leaning more and more toward the idea that Fabric is really going to be for the business, and people won't even need to know that "data engineer" is the title for what they're doing, but that's going to be exactly what they're doing; it's just too easy. But we'll save that. Okay, sounds good. All right, let's go into our main topic for today. Tommy, frame us out here: Dataflows Gen1 versus Fabric SQL databases for reference tables. What do you think we're going to go with

17:00 on this? I cannot wait to talk about this; it's actually something I've been wanting to have a conversation about for a while. So we'll back up a little. Dataflows Gen1, if you remember, came out in 2017, and it was simply the ability to store data in a table that you could reference in a Power BI report. It worked like a SharePoint list; it was just another connector. Now, what a lot of companies, my own organization included, utilized dataflows for were these things called reference

17:31 tables: simply a master list of my dimensions and attributes. I had all my customers listed, I had all the sales territories listed; it was basically the transformations that you had to reuse multiple times. If you recall, before dataflows, if you were a Power BI shop and you had a query or dimension that you had to transform in Power Query, the only way you could reuse it was saving that Power Query M file somewhere and going: oh, just go here if you need that reference table. And especially if it had

18:02 a lot of steps to it, well, then you had a lot of queries; it was a lot of upkeep going from report to report. Large organizations, more data-cultured organizations, usually have these reference tables in SQL databases controlled by data engineers, but the problem with that is any changes you need to make, especially if you're dealing with sales teams and territories, need to happen rapidly. Which is where Dataflows Gen1 came in, because now I could do these transformations, connect to all of my different sources, and create a master

18:33 list of whatever my reference tables were. An easy one was a date dimension: if my organization didn't have a dim date table and I wanted to reuse the same one, I could store it in a Dataflow Gen1. But again, I found myself, and a ton of other organizations, utilizing Gen1 for all of their reference tables, especially if their data engineering team did not have the upkeep, or did not even have those reference tables mastered, in a SQL

19:06 database. This was the way for a lot of organizations. Now we'll fast forward to Fabric SQL databases coming out. Unlike Gen1, where once you created a dataflow the only place you could connect to it was a Power BI model, we now have SQL databases in Fabric, which is arguably the preferred way to keep your reference tables, the preferred way to have your attributes and your dimensions stored. So that leads us to

19:38 the question, and specifically for reference tables, which I think is the context for our conversation today: do we focus on utilizing these SQL databases for reference tables, or do Gen1 dataflows still have a major part to play? Yeah, let's unpack some of this. There's a lot of things you're reviewing here, Tommy. One is, let me unpack a bit of the data

20:09 flows Gen1 scenario, and I'm going to actually maybe give you some alternatives here that may also be interesting; we'll see how this goes. Let's unpack this a bit. Dataflows Gen1, that was really the ripping of Power Query out of the semantic model: the very first time we could separate the data engineering away from the

20:31 actual semantic model and do it somewhere else. And I would also argue the lakehouse, Dataflows Gen2, all the other new data engineering tools we have now in Fabric do the same thing: we're going to prepare a bunch of tables up front before you get to Power BI. And by the way, the semantic model could be a Direct Lake model; just point at the lake tables and that becomes your semantic model. And I will admit, I've got a couple of occasionally used semantic models that are now doing Direct Lake. Those first loads into the Direct Lake model are a little bit

21:02 slower; it does take a little bit of time to load those initial tables on that very first render, but once it's up and rendering, I've seen the speed of the report be very quick. Sometimes it takes a little bit for that first render of the report to get going, and after that it runs very smoothly. I've had really good performance from the small datasets that I'm working with personally. So that's where I want to start, just that point there. And then you went into a topic around dimensions: where do the dimensions stay? And I think

21:33 dimensions are a bit interesting, because those are the potential common elements you would want to reuse across different semantic models, and that's where the dataflows would come into play. 100%. And dataflows were interesting because you could use them for free; they would store data in the cloud for free. This is where all this gets really nice to me, because Dataflows Gen1 would just make CSV files behind the scenes: load the data in, drop it down as a CSV file, and you could pick

22:03 it up and read it with other things after the fact. The other part too, I think you're talking about, Tommy, is if you don't want to beat up your backend SQL Server or your backend systems, if you're pulling from production: the dataflow would go read the data once, pull it in, and once the data is read in, it's done; you don't have to go continually back to the source system over and over again. I think that's another reason why we would use those things. I just wanted to provide some context there before I went into the SQL database

22:34 thing. Let me just say this: I'm pleasantly surprised with how the capacity units, the CUs, are used by the SQL Server. When I started playing with SQL Server, I thought: oh no, this is going to be one of those things where we're turning on a physical server, it's going to be on all the time, we're spinning up hardware, it's going to be this thing that always exists. When I looked into it and actually started running SQL Server inside Fabric, it doesn't appear to be

23:05 inside fabric it doesn’t appear to be that way it appears to act like a spark session to some degree it turns on you run your queries like you need to it scales up and uses more CU if need to be if you’re running a lot of queries so it seems to scale up a little bit and then when it’s done after a period of time of not touching the SQL database it turns off like it goes away it goes to sleep and you’re no longer charged any more compute units for the SQL database so Tommy to your point right I had a lot of pain in my mind around if I was going to spin up a SQL Server I didn’t

23:36 going to spin up a SQL Server I didn’t traditionally in early powerbi I would go find an Azure SQL Server turn it on it would be on all the time and I would load my tables to that SQL server and then I would load from SQL Server directly to powerbi seemed very redundant because I needed like two hours in the morning to have the server on and I was paying for the server for 24 hours a day didn’t I didn’t like that experience on the and I feel like this new fabric experience is not that way at all and let me make a very important note to what you said that’s even if you

24:07 as a Power BI pro at your organization had the ability to create a SQL database, much less manage it. Again, at most organizations, the people dedicating themselves to Power BI are not the ones managing the SQL databases. You didn't have that crossover of permissions or credentials to do both — you were very much one or the other. I agree, and that's

24:37 where things get a bit more interesting. This is what Microsoft is doing here, right? They're bringing these tools that were traditionally outside the hands of the business user and making them something they can easily go touch and use and build with. Again, we're bringing the persona of the analyst and the data engineer into the same working area. The effort to make a SQL Server is literally: search for the word SQL, click the button, and it appears. Friction? No — that friction is gone.

25:08 That's the ease of integration. And oh, by the way, did you want to use a Dataflow Gen2 and write data to the SQL database? You can do that. Or I want to pull data in — I want to use something to read data from somewhere else and bring it to the SQL Server — I could do that in a notebook. All the tools are easily communicating with each other, and there are lower barriers: if you have access to Entra ID and you have access to the resource, you can talk to it, which makes a ton of sense. Yeah, and let me be very clear too:

25:41 I never found great success with Dataflows Gen1 for fact data — the actual transactional data coming in — because here's the important distinction with Dataflows Gen1: when I refresh a Dataflow Gen1, I have to schedule that in the service. If I have a model connected to a Dataflow Gen1 — say three of my tables — refreshing the model is not going to update the dataflow itself. Whatever is stored in the dataflow at the time of the semantic model refresh is what I get.
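That snapshot behavior trips a lot of people up, so here is a tiny Python sketch of the semantics. To be clear, this is purely an illustration — the class and method names are invented and do not correspond to any real Dataflows or Power BI API:

```python
class Dataflow:
    """Stands in for a Dataflow Gen1: it holds a snapshot of the
    source that only changes when the dataflow itself is refreshed."""
    def __init__(self, source):
        self.source = source
        self.snapshot = None

    def refresh(self):
        # Scheduled in the service; copies the source as of right now.
        self.snapshot = list(self.source)


class SemanticModel:
    """A model connected to the dataflow reads whatever snapshot is
    stored at the time of the *model's* refresh."""
    def __init__(self, dataflow):
        self.dataflow = dataflow
        self.tables = None

    def refresh(self):
        self.tables = self.dataflow.snapshot


source = ["East", "West"]
df = Dataflow(source)
df.refresh()              # dataflow snapshot: East, West

source.append("Central")  # the source system changes...
model = SemanticModel(df)
model.refresh()
print(model.tables)       # ['East', 'West'] -- the dataflow hasn't
                          # refreshed, so the model never sees 'Central'
```

The takeaway: the semantic model only ever sees the dataflow's last snapshot, so the dataflow refresh has to land before the model refresh or the model serves stale data.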

26:12 So fact tables never made a ton of sense, nor did we have a lot of success with them. But data that was updated maybe once a day — again, reference data: date dimensions, customers, accounts, sales territories, anything I'm using as a dimension-based table — we found immense success with. And that's where I really want to be clear: obviously a SQL database or lakehouse is going to make a ton of sense for fact data; I don't think there's much of an argument there. But the fact that I can

26:44 store that data in Dataflows Gen1 — and again, my team could run and manage this business-intelligence-based dimension data, even if it was reused across other teams — was integral for any Power BI report or model we had that reused the same dimensions across reports. This was a huge hurdle for a lot of teams before, because again, if you want to get the same sales team based on the same country or zip code, well, again,

27:16 you were either reusing the same M code everywhere — and that's where Dataflows Gen1 came into play — or, I think, a lot of teams relied on their DBA team for this, if they had one. But a lot of teams we found were very dynamic; the data had to be updated on a tighter cycle. So either you had good change management in place to get to your DBAs, get them to approve the request, and make the change in time before a certain deadline, or you fell into a lot of trouble. So this is

27:46 where I still find Dataflows Gen1 have a huge part to play. Yeah, I do think so. There are some things I think are downsides — I'm not really in love with either Gen1 or Gen2. There are just little things, death-by-a-thousand-cuts things, where I'm like, hmm, I'm not sure I'm loving it. And I think that's also because my team and I have really focused on learning

28:16 Python and notebooks — super powerful. And I was actually just playing with something yesterday which is awesome, by the way: you can use VS Code to connect to a notebook. For a notebook in the service there's a button that says Open in VS Code — you can open it on the web or open it locally — and if you open it locally you can use GitHub Copilot for free, and with GitHub Copilot you can just start using a copilot right next to all of

28:46 your code. Hey, write me a function that does this; hey, this is my response, parse this out in this way from the JSON — and it does it. The amount of help the AI can provide you in code assistance is incredible, so no one should really be afraid of Python or of writing code anymore. Learn how to use those tools, and your knowledge gaps in how you syntactically write code can pretty much disappear.
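As a concrete example of the kind of helper you might ask Copilot to draft — note the response shape here (a `results` list of objects with `name` and `status` fields) is entirely hypothetical, just to show the "parse this out of JSON" pattern:

```python
import json

def parse_status_counts(raw):
    """Parse a JSON API response and count items by status.
    The payload shape ('results' list of objects with 'name' and
    'status') is hypothetical -- adjust to your actual response."""
    payload = json.loads(raw)
    counts = {}
    for item in payload.get("results", []):
        status = item.get("status", "unknown")
        counts[status] = counts.get(status, 0) + 1
    return counts

raw = ('{"results": [{"name": "a", "status": "ok"},'
       ' {"name": "b", "status": "failed"},'
       ' {"name": "c", "status": "ok"}]}')
print(parse_status_counts(raw))  # {'ok': 2, 'failed': 1}
```

Nothing fancy — which is exactly the point: this is boilerplate a copilot can write in seconds while you focus on what the data means.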

29:16 You can just talk to the AI agent and it will give you the answers you need. So that one I'm really looking forward to. But let me go back to your comment around Dataflows Gen1. I like the idea of reusing things. When you were talking, I had this thought: back when we had a SQL DBA and SQL Servers, we really needed a good definition of what the data would look like — what was the data coming in as, and what was the output going to look like. Every project I've ever been on, for any

29:47 data engineering or Power BI project, everyone tells me: oh, the data is in good shape, it's clean, it's got everything we need, it's all ready to go. Never have I ever been on a project that had clean data ready to go in the way a semantic model would need it. The data is there, maybe even in a fact-and-dimension shape, but when people start asking "I want to calculate this from the model": one, there aren't enough fact tables; two, there's usually the wrong granularity on two different fact

30:17 tables, which doesn't help me; and then there are people trying to do too much in the fact table, with not enough being pulled out to the dimensions. So every project has been: let's really talk about what patterns you're trying to compute on and what the output of the report is, because we always need to shape the data. My point is, on these older projects, getting a SQL DBA requirements like "the table needs to look like this" took a little bit of time, and a

30:48 lot of times our analysts don't know what the table structure should look like. We don't know yet — we're still trying to figure things out. I don't know the data well enough to say, oh, we need to filter this out, drop these columns, pull this in, pull that in. I just know there's a table, and I'm trying to figure out, on the other side of it, how do I make value from it? What does my leadership need to talk about to get those reports done? Only after I've played with the data a bit can I say, okay, I've now built a Power Query that shapes the data, and now it's ready for reporting. You're giving me tales from the trenches. When I started talking about Power BI, before anything Fabric came out, my whole story was: look, Power Query is this amazing tool that bridges the gap between a business analyst and a DBA. That was the tool, because I could click on things, manipulate the data, shape it around, do all the things. And still, one of my main talking points in my class is: work hard on automating, work hard on learning how to use Power Query. Okay, I'm going to pause there — I know I said a lot. Reactions, Tommy? Yeah, like I said, you're giving me tales from the trenches — you're reminding me of stories of, hey, we need a dimension table. Oh, we have it, but — and this is a major point — at a lot of organizations the DBAs may have it in a format that works for them but not the right format for a semantic model, a tabular model. And, as almost a cruel irony, a lot of our Dataflows Gen1 for reference tables were built off of those SQL databases, off of their dimensions, but the data had to be combed and manipulated to work with the semantic model. We needed a few extra columns, or a few distincts we needed
to get out of there — it needed to be combed and transformed to work with the semantic models, even though it was coming from the team's SQL databases. That's the irony. But the big win of Dataflows Gen1 was who owned it: the business intelligence team. And a lot of the time we worked with the business, because it was stuff they owned too. The sales team is a big one, where the team may change every quarter — who's your manager, what's your quota — and all of this was stored in a Dataflow Gen1. So you needed rapid change management, you needed constant communication, and I needed to be able to change this when I needed to. This is really interesting that you're pointing this out, because that scenario you just gave — there's a new term coming out that I don't really love: "translytical," as in transactional-analytical systems, transactional systems doing analytical things. There's a lot of language around what this system is, and the idea is exactly what you describe: hey, we need a transactional SQL Server right next to our analytical systems, so that, to your point, I can have a portal, some place where multiple people can go in and make real changes to budgets in real time, whenever we want, and immediately — or very quickly — have those changes reflected in my reporting. This scenario gives me the ability to have visuals, or to build a new experience. Microsoft has even talked about this: there are potentially new visuals coming, built by Microsoft, that will enable you to edit data in a report and have that data automatically sent back to the SQL Server, which will then automatically be pushed back into the reporting. There have been a lot of companies solving this —
I'm trying to think — Power On is one of the companies, which I think got bought out by someone at some point; Power On's been around, and there have been a couple of others as well. They're doing a lot of these real-time analytical things, which is great. It serves a need: being able to enter budgets, estimates, things of that nature, and keep them up to date. Yeah — thanks, Greg — the word here is write-back. The write-back experience, that's what we're talking about. And it's funny that we talk about write-back, because write-back is a concept you just get with a SQL Server: the whole thing is writing back, it's just doing its job. Inforiver — yeah, thanks, Greg — Inforiver is another company that does write-back as well. So that experience now exists, where previously we used Dataflows Gen2 and always had to wait for the Dataflow Gen1 or Gen2 to refresh itself before we could use that data somewhere else. And that's a really good point, because I want to say this with all the love in the world for Dataflows Gen1 — I know I've said this before, but my first true love was Dataflows Gen1. I remember the day it came out, because it was the day after my first daughter was born. I was happy because my daughter was born, but I was upset because I was on paternity leave and couldn't work with dataflows. I'll just never forget that moment. But here's the thing about write-back — and again, I say this with all the love in the world — this is where Dataflows Gen1 fall short: you cannot write back to the final result of a Dataflow Gen1. It's stored in Power BI, and the only place I can connect to it is the semantic
model — and Data Marts, if I'm going to be technical — but it's read-only; I'm not talking to it. Now, if we want that dynamic write-back, if we want to be translytical, well, that's a SQL database. And now, for the very first time, Mike, the BI team can own a SQL database. If I need to create an app that the business owns — which I wish I'd had six or seven years ago — it's like: hey, you own which sales team has what quota. Rather than putting it in an Excel table that I then have to transform through a Dataflow Gen1, all you have to do is go to a Power App connected to a Fabric SQL database and manage what the quota is and who the manager is, with some nice pretty dropdowns and a number input. And I don't have to do a ton of transformations, because again, the problem with Dataflows Gen1 is they're read-only to Power BI semantic models; there's nothing else I can do with them, I can only write back to the source, and we fall into a lot of traps that way. I think so, and I think this is again what Microsoft is doing: blurring the line between a lot of these traditional systems we're used to thinking about. Even when they came out with Data Marts — again, I don't really love Data Marts, but a Data Mart had this weird blend of a SQL Server that also did some modeling; you could write measures in Data Marts in the past. If you think about it, that makes a lot of sense: it's a SQL Server running both Analysis Services and SQL tables, an engine that can run both elements. That makes a ton of sense to me. I'll also note, since I mentioned the term translytical: Shada is going to be speaking about this at the Microsoft Fabric Conference in Las Vegas, so I'll put that here on the session.
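To make the quota scenario concrete, here is a sketch of the write-back pattern, using Python's built-in SQLite as a stand-in for a Fabric SQL database — the `sales_team` table and its columns are invented for illustration:

```python
import sqlite3

# In-memory SQLite standing in for the Fabric SQL database.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE sales_team (
        rep     TEXT PRIMARY KEY,
        manager TEXT,
        quota   INTEGER
    )
""")
conn.executemany(
    "INSERT INTO sales_team VALUES (?, ?, ?)",
    [("Alice", "Dana", 100000), ("Bob", "Dana", 90000)],
)

# The business changes a quota -- say through a Power App form --
# and it's a single UPDATE, not an Excel drop plus a dataflow refresh.
conn.execute("UPDATE sales_team SET quota = 120000 WHERE rep = 'Alice'")
conn.commit()

# Anything reading the table (a report, a notebook) sees it immediately.
rows = conn.execute(
    "SELECT rep, quota FROM sales_team ORDER BY rep"
).fetchall()
print(rows)  # [('Alice', 120000), ('Bob', 90000)]
```

Compare that with the Gen1 path — edit the Excel file, refresh the dataflow, then refresh the model. Here the UPDATE is the whole change.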
If you want to learn a little more about it and see the description, there's actually a session coming at the Microsoft conference in Las Vegas around this translytical workload — what it's going to look like, how you automate things. The description of the topic is: these translytical task flows allow you to automate end-user actions, updating your records, adding annotations, or creating powerful workflows that can trigger actions in other systems. So learn about translytical — it's this new experience where you can join real-time stuff with data you already have in other places. Sorry, I didn't mean to — no, you're totally fine. I think that was first mentioned at Ignite; I feel like that's the first time we heard it. There should be a better term for this one — I don't love it, and we both know it's going to stick. Microsoft marketing is great; they do a good job of landing these terms. So let me ask you: are you using Dataflows Gen1 at all right now, with any clients or internally? And I'll say this: when I like it, it's good. I just don't like the fact that it's landing tables in a place where I can't see them or touch them — it doesn't go to the lakehouse. If Dataflows Gen1 just wrote out even CSV files... I'm all lakehouse now, so I've pretty much migrated all the way to Dataflows Gen2 — and I've actually migrated away from Dataflows Gen2 too, because I don't like how expensive it is from a CU consumption standpoint. If that's what you're comfortable using, go ahead and use it, but there is a threshold of model or table size where Dataflows Gen2 starts timing out; it has issues, it can't load stuff. It's not quite as efficient as other compute engines, but it's good.
I like using it. So I don't really use Dataflows Gen1. I will say this: the UI of Dataflows Gen1 feels dated now — it feels like an old version of Power BI these days. It runs well, it does great, you can still connect everything, but it is getting older. And I don't really love that when I run a Dataflow Gen1, every single table in there creates a new table to go grab. So I think when I do Dataflows Gen1,

41:05 if I'm using them, I'm making smaller dataflows. I used to make four, five, six, seven tables in a Dataflow Gen1; I don't really do that anymore. If I need data in a Dataflow Gen1, I'm building fewer tables in each of my dataflows and making them smaller — trying to do the least number of tables I need. In Dataflows Gen2 I don't really care, because the dataflow is there and I can tell it where to write the data at the end, so I could have ten tables and only write out two of them to some place or location. Well,

41:37 tables to some place or location well they the two different purposes right they the two different purposes right again gen one is solely based on so I again gen one is solely based on so I can connect to and powerbi that’s the can connect to and powerbi that’s the main intention of a data flow gen one main intention of a data flow gen one gen two is so I can push the data to SQL gen two is so I can push the data to SQL a lake house or a warehouse those are a lake house or a warehouse those are the three really use cases I’m not the three really use cases I’m not really really can you use both systems for different can you use both systems for different things sure but the main point of them things sure but the main point of them and how they work the most efficiently and how they work the most efficiently is one is storing data for a powerbi

42:07 model, the other is to push the data somewhere. I want to pick your brain here a little bit, Tommy — this is where I want to challenge you a bit. We've been talking about Dataflows Gen1. I've recently been playing with something — again, I don't have a ton of experience with it — but if you have an import model with tables in it, you can turn on the ability for those tables to be automatically exposed into your lakehouse.

42:38 I saw you furrow your eyebrows there for a second — so for those of you not on camera, Tommy gave me a little double take. If Tommy had been drinking his coffee, he would have done a spit take on this one — a slight spit take at least. What are you talking about? I might have managed it. Yeah — so you can take an existing import model, and if you turn on "allow this data to be accessed from the lakehouse," it brings in the data and those tables can be accessed through the lakehouse.

43:10 Why would I do that? Well, think of it this way: the Dataflows Gen1 experience is old and antiquated. Those Power BI models you built previously — think of a single semantic model in import mode, refreshing on a schedule, with a handful of dimension tables in it. Here's my point: I have a semantic model with some dimension tables in it, and I don't want to rebuild all the data engineering to go build a Dataflow Gen1 or Gen2.

43:41 It's already running — that report runs, it does its job, there's already a handful of dimensions there. Why can't I just connect to them and reuse them in other places? That was the point of Dataflows Gen1: rip out the Power Query portion, shove it into a common table you can reuse in multiple places. This is the same experience, except I don't need to do any of that. I can stay in my happy place of Power BI Desktop, build everything I want there, publish my model, have it refresh on its cadence, and now I get all those tables by default in a lakehouse that I can

44:12 then go use in other places. That is so many extra steps. The feature you're talking about — I know exactly what you mean; I'm going to try to Google the name real quick, I just remember it as pushing your data to a lakehouse. But those are so many extra steps to have a master source of data, especially if we're talking reference tables here, because I have to first do the Power BI transformations, if there are

44:42 any — and let's say, for the sake of argument, that my Power BI model at this point is not connecting to anything in Fabric, or else we're getting very complicated with lookbacks. I'm just connecting to the data, doing transformations, loading it into the model, publishing it, and then pushing that to a lakehouse. Why would I do all those additional steps? Because if I have to manage or update something, I have to go to that Power BI file, which to me

45:13 to go to that powerbi file which to me is completely counterintuitive to a lot is completely counterintuitive to a lot of your philosophy around Fabric and RBI of your philosophy around Fabric and RBI why would I this is like the gold model why would I this is like the gold model but the old version of a master data set but the old version of a master data set and again it’s not stored in a TBL form and again it’s not stored in a TBL form is Sy simply stored a static data in a is Sy simply stored a static data in a Lakehouse this honestly then The Logical

45:34 point here is, Mike, I'm leaning more and more toward a Fabric SQL database as the primary place for dimension-based data — especially if I need write-back, especially if I need the business to manage and own their data — because it's stored in the format I already want. If you need to change something from group A to group B, it's incredibly easy: it

46:05 doesn't go through a pipeline, it can just happen — and for those who missed it, I just snapped my fingers to show the quickness of that — and it's managed in a single place. Your scenario there I actually want to explore in another episode, because I don't know what that feature's primary purpose is — taking a semantic model and pushing it into a lakehouse — and I'm struggling with where that is better than

46:36 another feature in Fabric right now. What does it do better, or solve in a more efficient way? I think it's more of a backwards-compatibility step, is how I would look at it. The idea is that you already have semantic models that are running; you've already built the logic into the semantic models themselves. Those tables could be somewhat large — maybe they're using incremental refresh or bigger things. If the models are already large and they're just there, why not reuse the work you've already done? The alternative would be:

47:07 understand how that table's built, then go build something different or new. And maybe the concept here is — let's talk about the money: if you're already spending the money to have one model with some tables in it, why load that table a second time in a different model? Just reference the original table that was loaded from the original semantic model. That's where this is really interesting to me — you wouldn't need to pay for the loading process twice, essentially.

47:39 the loading process twice essentially and I think a lot of times we have a lot and I think a lot of times we have a lot of models that pay multiple times to of models that pay multiple times to load the same data which is why we went load the same data which is why we went to data flows gen one the first time and to data flows gen one the first time and now we have lake houses like spend the now we have lake houses like spend the money to compute the table and put it in money to compute the table and put it in the lake house or put it in the data the lake house or put it in the data flow gen one because you’re going to flow gen one because you’re going to spend money to get there reuse it now spend money to get there reuse it now because now I don’t have to load the because now I don’t have to load the data again it’s already there and ready data again it’s already there and ready to go so like this is another scenario to go so like this is another scenario where you’re you’re potentially you’re where you’re you’re potentially you’re now reusing work that you’ve already

48:09 done. Except any of this dimension reference data is my Indiana Jones gold treasure chest or whatever it's called — shoot, you're the movie guy, what was it called? The golden ark? Indiana Jones and — McDonald's, like the golden arches — the Temple of... no, the one with the ark, the golden chest thing. So that's the Holy Grail — or the Holy Grail is the

48:41 The Holy Grail is the cup, yeah; this was the Ark of the Covenant, but I don't remember what the movie was called. Was that Temple of Doom? I'll Google it right now, keep going. Raiders of the Lost Ark, that's what it's called. Okay, Raiders of the Lost Ark. We are good on movies, we like a lot of movies. Harrison Ford, I could tell you that. So regardless, my dimensions are the Ark in this, where I have to be very careful that things don't just rapidly change without me understanding them.

49:11 This is almost why I understand why Gen1 had to have that takeover feature, the "are you sure you want to take over the dataflow?" prompt, because if you change something, this is going to update a lot of reports. And to your point about the semantic model pushing to a lakehouse, someone could go to an old version of the Power BI model, or someone could just refresh and make a quick change, and guess what, everybody's models that depend on those reference tables are now updated and changed.

49:42 And if I have no version history, which we know SQL databases do have, we can at least go back in time. You have to treat these reference tables as gold. They're gold, Jerry, like from Seinfeld. This is your master reference table: your quotas, your territories, your countries, everything your metrics depend on lives in these reference tables.

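That version-history point can be sketched in code. The following is a minimal, illustrative pattern, using Python's built-in sqlite3 as a stand-in for a real SQL database (table names like `dim_territory` are invented): a trigger copies the old row into a history table before any update lands, so you can always go back in time. SQL Server offers system-versioned temporal tables that do this natively; whether a given Fabric SQL database supports them is worth checking before relying on it.

```python
import sqlite3

# sqlite3 stands in for a real SQL database; all names here are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_territory (
        territory_id   INTEGER PRIMARY KEY,
        territory_name TEXT NOT NULL
    );
    -- History table: every change writes the old row here first.
    CREATE TABLE dim_territory_history (
        territory_id   INTEGER,
        territory_name TEXT,
        changed_at     TEXT DEFAULT CURRENT_TIMESTAMP
    );
    -- Trigger copies the previous values before an UPDATE takes effect.
    CREATE TRIGGER trg_territory_history
    BEFORE UPDATE ON dim_territory
    BEGIN
        INSERT INTO dim_territory_history (territory_id, territory_name)
        VALUES (OLD.territory_id, OLD.territory_name);
    END;
""")

conn.execute("INSERT INTO dim_territory VALUES (1, 'Midwest')")
conn.execute("UPDATE dim_territory SET territory_name = 'Central' WHERE territory_id = 1")

current = conn.execute("SELECT territory_name FROM dim_territory").fetchone()[0]
previous = conn.execute("SELECT territory_name FROM dim_territory_history").fetchone()[0]
print(current, previous)  # prints: Central Midwest
```

The point is simply that a quick "someone refreshed and changed a value" edit is recoverable, which a Dataflow Gen1 on its own does not give you.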
50:09 I agree with you, Tommy, in that regard, but I'm also thinking you're thinking in a very top-down, organized, centralized way. I'm not sure every department is always going to be that organized with all their metrics and the things they're doing; it's going to be a little more ad hoc in certain areas. So I think there's a spectrum, kind of like a scale of things. You're maybe talking more about those really critical, certified-level reporting pieces, and yeah, you're right, you're not going to want to muck around with those dataflows too much.

50:39 They're going to be setting those standards, people are relying on them, people are building things off of those tables. Great, that's what they should be doing, that's what we're doing it for. But when you get to more of this departmental or individual level, sometimes you just want the data there so you can reuse it. So when you're doing really good dimensional modeling practices, yes, I agree with you 100%, Tommy.

51:00 But there's also a world of: I'm still exploring the data, I've already loaded sales data once, I just want to reference it again somewhere else. Or, I've already loaded this once before, how can I easily get that information back out? And maybe it's more of a bubble gum and Band-Aids solution, right, for now.

51:18 But this is the challenge I face a lot, me personally: how much time do I invest in a solution to make it just right, versus investing just enough to get it working, prove that it's valuable, and then invest the time to make it better? A lot of this is, I'm trying to build a lot of things quickly, I need to deliver. And this is one of the stories of Power BI: five by five was their initial story, what was it, five minutes to wow or something like that.

51:54 Yeah, click, click, drop it. Build something for five minutes and say, look, data starts appearing, it's not that difficult. So that has been the whole mentality: build something quickly that adds value. What comes from that is, once you do a lot of that, you have to start really thinking, okay, not everything that's designed quickly is always efficient, and then we have to step back and ask what's the efficient way to do everything. So I think there is some discovery that happens in the process of building.

52:27 There's definitely discovery, but I'm going to have to Empire Strikes Back-level disagree, or push back. Indiana Jones again: same character, same actor, except the ships were in space, yes. After Harrison Ford defeated the Nazis he immediately went up against Darth Vader. No, but this is my argument, my push back, here: exploration, all those things, great, fine, put that in a dev place, put that in development.

53:01 As soon as you're dealing with anything that's dimensional, with my reference tables: the only thing I'm pushing back on is that this is a top-down approach. You could focus simply on one department, that could be your life as an analyst, right. But the thing is, if those dimensions are not right, your numbers are always wrong. Even if it's not a KPI, if you have zip codes wrong, or your marketing campaigns wrong, well, at some point someone's going to figure out that your metrics are wrong.

53:35 And it's because your dimensions are not right. As soon as you have production-level reports, your dimensions have to be defined, because someone is going to spot an error, someone's going to spot that this is not right. More or less what I've found is it's not because your DAX is wrong, it's because your dimensions aren't aligned. And again, I know "perfect" is a terrible word in our business, but by the same token, exploration to me is very different than production.

54:09 different than production exploration should absolutely be part of what we’re should absolutely be part of what we’re doing but once we have something in an doing but once we have something in an app once I have something that consumers app once I have something that consumers and stakeholders are looking at well I and stakeholders are looking at well I have to be able to speak to those have to be able to speak to those Dimensions at the very least if it’s not Dimensions at the very least if it’s not perfect if someone ask me why Dimension perfect if someone ask me why Dimension is the way it is are aligned with is the way it is are aligned with something you better have the ability to something you better have the ability to speak to it why campaign a is tied to speak to it why campaign a is tied to whatever whatever dollar whatever whatever dollar amount and if you can’t that’s where you amount and if you can’t that’s where you lose TR I think Donald is pointing on lose TR I think Donald is pointing on what you’re talking about there is

54:40 You're talking about good conformed dimensions versus non-conformed dimensions. This is a Kimball practice used by organizations around the world, very solid, a good scenario. But my point here is that not everyone understands it. I don't think I would come into an organization fully assuming everyone understands what conformed dimensions are really doing, how important they are. I agree with you, they are extremely important, and we need people talking about this, but this is a data culture problem, so we're moving away from the technology here.

55:12 moving away from the technology here this is all part of process and people this is all part of process and people about do we really understand what a about do we really understand what a conform Dimension is do we know how to conform Dimension is do we know how to use it because once you understand what use it because once you understand what a really wellc conformed Dimension looks a really wellc conformed Dimension looks like we can now talk about multiple fact like we can now talk about multiple fact tables that can be used to calculate tables that can be used to calculate different things so I agree with you different things so I agree with you Tommy but I think my point here is not

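The conformed-dimension idea Mike and Tommy reference can be shown with a tiny sketch, again with sqlite3 as a stand-in and all table and column names invented: one shared department dimension keyed by two different fact tables, so sales and budget roll up under identical labels instead of each report maintaining its own spelling of the same department.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- One conformed dimension shared by every fact table (names illustrative).
    CREATE TABLE dim_department (dept_id INTEGER PRIMARY KEY, dept_name TEXT);
    CREATE TABLE fact_sales    (dept_id INTEGER, amount REAL);
    CREATE TABLE fact_budget   (dept_id INTEGER, amount REAL);
""")
conn.executemany("INSERT INTO dim_department VALUES (?, ?)",
                 [(1, "Marketing"), (2, "Finance")])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?)",
                 [(1, 100.0), (2, 250.0)])
conn.executemany("INSERT INTO fact_budget VALUES (?, ?)",
                 [(1, 120.0), (2, 200.0)])

# Because both facts key to the same dimension, sales and budget line up
# under identical department labels -- no mismatched spellings per report.
rows = conn.execute("""
    SELECT d.dept_name, SUM(s.amount) AS sales, SUM(b.amount) AS budget
    FROM dim_department d
    JOIN fact_sales  s ON s.dept_id = d.dept_id
    JOIN fact_budget b ON b.dept_id = d.dept_id
    GROUP BY d.dept_name
    ORDER BY d.dept_name
""").fetchall()
print(rows)  # [('Finance', 250.0, 200.0), ('Marketing', 100.0, 120.0)]
```

This is the same shape a Power BI model takes when two fact tables relate to one shared dimension table and a single slicer filters both.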
55:33 But I think my point here is that not everyone, not the culture and all departments of an organization, is going to have that really rich understanding that we do of what conformed dimensions are. So I think there's just going to be a bit more flexibility here, and that's why I was thinking maybe Dataflows Gen2 isn't really what you want, maybe Dataflows Gen1 isn't really what you want, or maybe SQL Server isn't what you want. Maybe what you really want is a model that is importing data every night, and you're just touching those tables and using them in other places, because you can reach them through this new experience.

56:06 All this to say, I'm not sure I've done enough testing around this to say yes, go build everything like this. This is not what I'm recommending. I would still recommend bringing everything to the lakehouse; that's where you're going to get a lot of your value. And to be honest, Microsoft is spending big bucks in the lakehouse space anyway. There are lots of teams working on making that experience premium, top-notch, the best experience you can get, and now you have all these other tools like SQL databases.

56:39 Anyway, I know we're getting close on time here, so we should probably wrap so people can get back to their normal daily workflows, but I think this has been a good discussion. I'll just give you my final thoughts: Dataflows Gen1, SQL databases, it doesn't really matter what you choose. Me personally, I'm going to start choosing more Dataflows Gen2, because I am a Fabric person, I think the price is right, and it does so many more things. I'm going to continually push organizations to start thinking about leveraging portions of Fabric to help them build better data structures.

57:14 fabric to help them build better data structures and but I will say data structures and but I will say this I’ve been very pleased with the this I’ve been very pleased with the performance of the SQL Server there are performance of the SQL Server there are some missing functions and features that some missing functions and features that are not there because it’s it’s you a are not there because it’s it’s you a fabric flavor of sequel but overall I’ve fabric flavor of sequel but overall I’ve been very pleased with this compute been very pleased with this compute usage on it it seems to fill the usage on it it seems to fill the Gap what I need I can turn it on on an Gap what I need I can turn it on on an F2 and not worry about it eating all my F2 and not worry about it eating all my capacity all at once so in that regard capacity all at once so in that regard I’m really interested to try and play I’m really interested to try and play more with it I’ve been spinning up a more with it I’ve been spinning up a couple I’ve been playing with it myself couple I’ve been playing with it myself I’m finding good value from it I think

57:46 I think it makes a lot of sense, and as long as they keep it efficient and costs low, I'm happy with it. It seems like a good solution. Tommy, what are your final thoughts here?

57:54 solution Tom what are your final thoughts here thoughts here if your only option is data flows gen

57:59 if your only option is data flows gen one and you need these reference tables one and you need these reference tables you’re reusing it it is a it is a very you’re reusing it it is a it is a very good solution and it still stands up good solution and it still stands up however if you are moving to the fabric however if you are moving to the fabric world for the very first time in your world for the very first time in your powerbi life SQL databases allow you to powerbi life SQL databases allow you to own these reference tables and can have own these reference tables and can have that collaboration with the business that collaboration with the business what I’m imagining here is a perfect what I’m imagining here is a perfect world is I’m creating these SQL world is I’m creating these SQL databases with reference tables and I databases with reference tables and I can have a power app where the business can have a power app where the business can own this they can do easily do this

58:31 That writeback has been the greatest problem. If you only have Dataflows Gen1, great, they still stand up. But if you have the opportunity to utilize a Fabric SQL database and own your data with the business like never before, that to me is now the primary route organizations should go. I'm going to throw down the link here that I found really quick. The feature I was talking about is called OneLake integration for semantic models; the documentation is going to be in the chat window here.

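Tommy's "business owns the reference table" writeback scenario boils down to an upsert against a keyed table. Here is a minimal sketch, once more using sqlite3 as a stand-in for a Fabric SQL database; the table and helper names are invented, and a real Power App would write to the database directly rather than through Python:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE ref_campaign (
        campaign_code TEXT PRIMARY KEY,
        friendly_name TEXT NOT NULL
    )
""")

def upsert_campaign(conn, code, name):
    """Insert a new mapping, or update the friendly name if the code exists."""
    conn.execute("""
        INSERT INTO ref_campaign (campaign_code, friendly_name)
        VALUES (?, ?)
        ON CONFLICT (campaign_code) DO UPDATE SET friendly_name = excluded.friendly_name
    """, (code, name))

upsert_campaign(conn, "CMP-001", "Spring Launch")
upsert_campaign(conn, "CMP-001", "Spring Launch 2025")  # business edits the label

rows = conn.execute("SELECT campaign_code, friendly_name FROM ref_campaign").fetchall()
print(rows)  # [('CMP-001', 'Spring Launch 2025')]
```

Because the reference table lives in a database rather than inside a dataflow, downstream Power BI models just re-read it on refresh, and the business edit never touches the models themselves.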
59:05 If I'm ambitious enough, I'll go back and add it to the description as well so you can find it there too. But here is the OneLake integration for semantic models, the feature I was talking about. Basically, if you have a single table that is imported in your semantic model, you can expose that entire table in the lakehouse for other users to use. So you can bring that Power BI table in using import mode, and then once the table's in the lakehouse, you could have other models in Direct Lake mode accessing that same table.

59:37 models that are direct Lake accessing that same table so you get the that same table so you get the data flows gen one experience data flows gen one experience where you’re bringing tables in to a where you’re bringing tables in to a common place and you can U common place and you can U utilize them other places so really utilize them other places so really interesting on what that would mean interesting on what that would mean that also means your imported models that also means your imported models could be used in like a python notebook could be used in like a python notebook or you could load them somewhere else or or you could load them somewhere else or like it really I’m telling you this like it really I’m telling you this thing turns into like a A Rat’s Nest thing turns into like a A Rat’s Nest potentially of like different potentially of like different connections across all the things that connections across all the things that are touching other stuff like you could are touching other stuff like you could do a lot of crazy things again this is do a lot of crazy things again this is why I think Tommy and I are very Pro why I think Tommy and I are very Pro on being organized and working on

60:12 Because you can do a lot of things, but not every single thing you could do is useful to the organization or easy to maintain, so you have to make choices as an organization about what you want to build. That being said, thank you so much for listening. We really appreciate you being in the podcast chat, you've been amazing, a lot of good ideas and good comments here, so thank you very much for participating in the chat as well. If you like this podcast, if you like what we're talking about, please make sure you subscribe and hit the bell to let us know. We do this every Tuesday and Thursday, so we'd love to have you on and chatting in the conversations as well.

60:44 chatting on the conversations as well if you find this is valuable to you odds you find this is valuable to you odds are somebody else this would are somebody else this would also be valuable to please go out to also be valuable to please go out to somebody else and let them know you somebody else and let them know you found this amazing podcast that you found this amazing podcast that you could want to listen to and learn could want to listen to and learn some more about powerbi and all the some more about powerbi and all the fabric things that are coming out now fabric things that are coming out now Tommy where else can you find the

61:01 Tommy where else can you find the podcast you can find us in apple Spotify podcast you can find us in apple Spotify wherever you at your podcast make sure wherever you at your podcast make sure to subscribe and leave a rating it helps to subscribe and leave a rating it helps us out a ton and please share with a us out a ton and please share with a friend we do this for free do you have a friend we do this for free do you have a question idea or a topic that you want question idea or a topic that you want us to talk about in a future episode us to talk about in a future episode head over to power bi. podcast leave head over to power bi. podcast leave your name and a great question and your name and a great question and finally join us live every Tuesday and finally join us live every Tuesday and Thursday 7:30 a.m. Central and join the Thursday 7:30 a.m. Central and join the conversation on all power tips social conversation on all power tips social media channels nice have a good one and media channels nice have a good one and we’ll see you next time

Thank You

Want to catch us live? Join every Tuesday and Thursday at 7:30 AM Central on YouTube and LinkedIn.

Got a question? Head to powerbi.tips/empodcast and submit your topic ideas.

Listen on Spotify, Apple Podcasts, or wherever you get your podcasts.
